WorldWideScience

Sample records for sampling strategy designed

  1. A proposal of optimal sampling design using a modularity strategy

    Science.gov (United States)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

Real water distribution networks (WDNs) comprise thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for many management tasks. Planning the number and spatial distribution of pressure observations is known as sampling design, and it has traditionally been addressed in the context of model calibration. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., for managing background leakages, detecting anomalies and bursts, and guaranteeing service quality. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and to leakage management, has been addressed through optimal network segmentation and the modularity index, using a multiobjective strategy. Optimal network segmentation identifies network modules by means of optimal conceptual cuts, which are the candidate locations of the closed gates or flow meters that create the DMAs. Starting from the WDN-oriented modularity index as a metric for WDN segmentation, this paper proposes a new way to perform sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system mainly on the basis of network topology and of weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.
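As a rough illustration of the kind of modularity metric underlying this line of work, the sketch below computes the generic Newman-Girvan modularity of a partition of a small weighted network; the paper's WDN-oriented and sampling-oriented indices modify this definition, so the code is only a stand-in:

```python
# Newman-Girvan modularity Q = sum_c (e_c / m - (d_c / 2m)^2) for a weighted,
# undirected graph; a generic stand-in for the WDN-oriented index in the paper.
def modularity(edges, partition):
    """edges: list of (u, v, weight); partition: dict node -> module id."""
    m = sum(w for _, _, w in edges)             # total edge weight
    intra = {}                                  # edge weight inside each module
    degree = {}                                 # summed weighted degree per module
    for u, v, w in edges:
        cu, cv = partition[u], partition[v]
        degree[cu] = degree.get(cu, 0.0) + w
        degree[cv] = degree.get(cv, 0.0) + w
        if cu == cv:
            intra[cu] = intra.get(cu, 0.0) + w
    return sum(intra.get(c, 0.0) / m - (degree[c] / (2 * m)) ** 2
               for c in degree)

# Two well-separated triangles joined by a single weak link.
edges = [(0, 1, 1), (1, 2, 1), (0, 2, 1),
         (3, 4, 1), (4, 5, 1), (3, 5, 1), (2, 3, 0.1)]
part = {0: 'A', 1: 'A', 2: 'A', 3: 'B', 4: 'B', 5: 'B'}
print(round(modularity(edges, part), 3))
```

A high Q for this partition reflects that the weak connecting pipe is the natural conceptual cut.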

  2. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  3. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    Science.gov (United States)

    Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes

    2017-01-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...
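A minimal sketch of why sampling frequency matters for cumulative N2O estimates, using synthetic daily fluxes with a single emission pulse in place of agroecosystem-model output (all numbers are illustrative):

```python
import math

# Synthetic daily N2O fluxes (arbitrary units): low baseline plus a short
# emission pulse, standing in for agroecosystem-model output.
days = list(range(365))
flux = [1.0 + 25.0 * math.exp(-((d - 150) ** 2) / (2 * 5.0 ** 2)) for d in days]
true_cum = sum(flux)  # "true" annual cumulative flux from daily values

def trapezoid_estimate(sample_days, flux):
    """Annual cumulative flux by linear interpolation between sampled days."""
    total = 0.0
    for a, b in zip(sample_days, sample_days[1:]):
        total += 0.5 * (flux[a] + flux[b]) * (b - a)
    return total

weekly = trapezoid_estimate(list(range(0, 365, 7)), flux)
monthly = trapezoid_estimate(list(range(0, 365, 30)), flux)
print(round(weekly / true_cum, 2), round(monthly / true_cum, 2))
```

Weekly chamber visits resolve the pulse; monthly visits that happen to hit (or miss) the pulse day badly over- or under-estimate the annual total, which is the kind of design question the simulations address.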

  4. Sample design and gamma-ray counting strategy of neutron activation system for triton burnup measurements in KSTAR

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Jungmin [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Cheon, Mun Seong [ITER Korea, National Fusion Research Institute, Daejeon (Korea, Republic of); Chung, Kyoung-Jae, E-mail: jkjlsh1@snu.ac.kr [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Hwang, Y.S. [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of)

    2016-11-01

Highlights: • Sample design for triton burnup ratio measurement is carried out. • Samples for 14.1 MeV neutron measurements are selected for KSTAR. • Si and Cu are the most suitable materials for d-t neutron measurements. • Appropriate γ-ray counting strategies for each selected sample are established. - Abstract: For the purpose of triton burnup measurements in Korea Superconducting Tokamak Advanced Research (KSTAR) deuterium plasmas, appropriate neutron activation system (NAS) samples for 14.1 MeV d-t neutron measurements have been designed and a gamma-ray counting strategy has been established. Neutronics calculations were performed with the MCNP5 neutron transport code for the KSTAR neutral-beam-heated deuterium plasma discharges. Based on those calculations and the assumed d-t neutron yield, the activities induced by d-t neutrons were estimated with the inventory code FISPACT-2007 for candidate sample materials: Si, Cu, Al, Fe, Nb, Co, Ti, and Ni. It was found that Si, Cu, Al, and Fe are suitable for the KSTAR NAS in terms of the minimum detectable activity (MDA), calculated from the standard deviation of blank measurements. Considering the background gamma rays emitted by surrounding structures activated by thermalized fusion neutrons, an appropriate gamma-ray counting strategy was established for each selected sample.
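For context on the MDA criterion mentioned in the abstract, the sketch below applies the widely used Currie-style detection-limit formula to hypothetical counting parameters; none of the numbers are KSTAR values:

```python
import math

def minimum_detectable_activity(blank_counts, live_time, efficiency, gamma_yield):
    """Currie-style MDA (Bq): detection limit L_D = 2.71 + 4.65*sqrt(B) counts,
    converted to activity via counting time, detector efficiency and gamma yield."""
    l_d = 2.71 + 4.65 * math.sqrt(blank_counts)
    return l_d / (live_time * efficiency * gamma_yield)

# Hypothetical numbers for illustration only (not KSTAR values):
# 400 blank counts in 600 s, 2% absolute efficiency, 100% gamma emission probability.
mda = minimum_detectable_activity(blank_counts=400, live_time=600,
                                  efficiency=0.02, gamma_yield=1.0)
print(round(mda, 2))
```

Lower blank (background) counts or higher efficiency directly lower the MDA, which is why the sample material and counting strategy are chosen together.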

  5. Evaluation of sampling strategies to estimate crown biomass

    Directory of Open Access Journals (Sweden)

    Krishna P Poudel

    2015-01-01

Full Text Available Background Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass of a tree. Crown biomass estimation is useful for several purposes, including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessment, fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus, the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies for estimating crown biomass and the effect of sample size on those estimates. Methods Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results Compared to all other methods, stratified sampling with the probability-proportional-to-size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced results approximately similar to simple random sampling, but it further decreased RMSE when information on branch diameter was used in the design and estimation phases. Conclusions Use of
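The ratio estimation technique mentioned in the results can be sketched as follows, using a hypothetical tree whose branch biomass is roughly proportional to branch diameter (the auxiliary variable); all numbers are invented for illustration:

```python
import random

random.seed(0)
# Hypothetical tree: per-branch diameter (cm, auxiliary variable) and
# biomass (kg), loosely proportional so a ratio estimator is sensible.
diameters = [2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]
biomass = [0.9 * d + random.uniform(-0.3, 0.3) for d in diameters]
total_diameter = sum(diameters)       # assumed known for every branch
true_total = sum(biomass)

def ratio_estimate(sample_idx, diameters, biomass, total_diameter):
    """Estimate total crown biomass from a branch sample via the ratio of
    sampled biomass to sampled diameter, scaled by the known diameter total."""
    r = sum(biomass[i] for i in sample_idx) / sum(diameters[i] for i in sample_idx)
    return r * total_diameter

sample = random.sample(range(len(diameters)), 6)   # six branches measured
est = ratio_estimate(sample, diameters, biomass, total_diameter)
print(round(est, 1), round(true_total, 1))
```

Because diameter is cheap to measure on every branch while biomass requires destructive sampling, the ratio estimator recovers the total from a small branch sample.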

  6. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    Science.gov (United States)

    Geuna, S

    2000-11-20

Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterizes the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  7. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Directory of Open Access Journals (Sweden)

    Jake M Ferguson

    2014-06-01

    Full Text Available The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.
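A much-simplified stand-in for the host-versus-vector allocation question: under random sampling, the probability of at least one positive is 1 - (1 - p)^n, and the cheaper-but-less-infected population does not always win. The paper's dynamic framework is far richer; the sketch below (with invented prevalences and costs) only conveys the trade-off:

```python
def detection_probability(prevalence, n_samples):
    """P(at least one positive) when testing n randomly drawn individuals."""
    return 1.0 - (1.0 - prevalence) ** n_samples

def best_single_population(budget, prev_host, prev_vector, cost_host, cost_vector):
    """Simplified stand-in for the paper's optimal rule: put the whole budget
    into whichever single population yields the higher detection probability."""
    p_host = detection_probability(prev_host, budget // cost_host)
    p_vec = detection_probability(prev_vector, budget // cost_vector)
    return ('host', p_host) if p_host >= p_vec else ('vector', p_vec)

# Hypothetical: hosts are 10x costlier to sample but far more likely infected.
choice, prob = best_single_population(budget=1000, prev_host=0.05,
                                      prev_vector=0.002, cost_host=50,
                                      cost_vector=5)
print(choice, round(prob, 3))
```

With these invented numbers, 20 expensive host samples beat 200 cheap vector samples; reversing the prevalence gap flips the choice, mirroring the abrupt switch described in the abstract.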

  8. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Science.gov (United States)

    Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W

    2014-06-01

    The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  9. Potential-Decomposition Strategy in Markov Chain Monte Carlo Sampling Algorithms

    International Nuclear Information System (INIS)

    Shangguan Danhua; Bao Jingdong

    2010-01-01

We introduce the potential-decomposition strategy (PDS), which can be used in Markov chain Monte Carlo sampling algorithms. PDS can be designed to make particles move in a modified potential that favors diffusion in phase space; then, by rejecting some trial samples, the target distributions can be sampled in an unbiased manner. Furthermore, if the accepted trial samples are insufficient, they can be recycled as initial states to form more unbiased samples. This strategy can greatly improve efficiency when the original potential has multiple metastable states separated by large barriers. We apply PDS to the 2D Ising model and to a double-well potential model with a large barrier, demonstrating in these two representative examples that convergence is accelerated by orders of magnitude.
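For reference, the baseline that PDS is designed to accelerate is a plain Metropolis sampler, which mixes slowly across large barriers. The sketch below samples a one-dimensional double-well potential with illustrative parameters (not those of the paper, and without the PDS modification):

```python
import math, random

def metropolis_double_well(n_steps, step=0.5, barrier=4.0, seed=1):
    """Plain Metropolis sampling of p(x) ~ exp(-U(x)) with the double-well
    U(x) = barrier * (x^2 - 1)^2; the baseline that PDS aims to accelerate."""
    U = lambda x: barrier * (x * x - 1.0) ** 2
    rng = random.Random(seed)
    x, samples = -1.0, []
    for _ in range(n_steps):
        y = x + rng.uniform(-step, step)
        # Accept with probability min(1, exp(U(x) - U(y))).
        if rng.random() < math.exp(min(0.0, U(x) - U(y))):
            x = y
        samples.append(x)
    return samples

samples = metropolis_double_well(200_000)
left = sum(1 for x in samples if x < 0) / len(samples)
print(round(left, 2))   # symmetry puts about half the mass in each well
```

With a higher barrier the chain gets trapped in one well for very long stretches; PDS-style modified potentials flatten the barrier so trial moves diffuse freely, with rejection restoring the target distribution.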

  10. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), alleles per locus (5 and 10), individuals sampled (10-300), and generations after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that the random, linear, and systematic sampling regimes performed well given high sample sizes (>200), levels of polymorphism (10 alleles per locus), and numbers of molecular markers (20). The cluster and single-study-site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration of sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.

  11. Soil sampling strategies: Evaluation of different approaches

    Energy Technology Data Exchange (ETDEWEB)

    De Zorzi, Paolo [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy)], E-mail: paolo.dezorzi@apat.it; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy); Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia [Agenzia Regionale per la Prevenzione e Protezione dell' Ambiente del Veneto, ARPA Veneto, U.O. Centro Qualita Dati, Via Spalato, 14-36045 Vicenza (Italy)

    2008-11-15

The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  12. Soil sampling strategies: Evaluation of different approaches

    International Nuclear Information System (INIS)

    De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-01-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies

  13. Soil sampling strategies: evaluation of different approaches.

    Science.gov (United States)

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-11-01

The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  14. Adaptive designs for the one-sample log-rank test.

    Science.gov (United States)

    Schmidt, Rene; Faldum, Andreas; Kwiecien, Robert

    2017-09-22

Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows the survival curve of the patients under treatment to be compared with a prespecified reference survival curve. The reference curve usually represents the expected survival under standard of care. In this work, convergence of the one-sample log-rank statistic to Brownian motion is proven using Rebolledo's martingale central limit theorem while accounting for staggered entry times of the patients. On this basis, a confirmatory adaptive one-sample log-rank test is proposed in which provision is made for data-dependent sample size reassessment. The focus is on applying the inverse normal method, in two different directions. The first strategy exploits the independent increments property of the one-sample log-rank statistic. The second strategy is based on the patient-wise separation principle. It is shown by simulation that the proposed adaptive test might help to rescue an underpowered trial and at the same time lowers the average sample number (ASN) under the null hypothesis as compared to a single-stage fixed sample design. © 2017, The International Biometric Society.
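A minimal sketch of the one-sample log-rank statistic against an exponential reference curve, on hypothetical data; the paper's adaptive machinery (inverse normal combination, sample size reassessment) is not shown:

```python
import math

def one_sample_logrank(times, events, ref_rate):
    """One-sample log-rank test against an exponential reference with hazard
    ref_rate: Z = (O - E) / sqrt(E), where O is the observed number of deaths
    and E is the reference cumulative hazard summed over follow-up times."""
    observed = sum(events)
    expected = sum(ref_rate * t for t in times)   # Lambda0(t) = lambda * t
    return (observed - expected) / math.sqrt(expected)

# Hypothetical data: follow-up in months, event 1 = death, 0 = censored.
times = [6, 12, 8, 24, 18, 3, 30, 15]
events = [1, 0, 1, 0, 1, 1, 0, 1]
z = one_sample_logrank(times, events, ref_rate=0.05)   # reference hazard/month
print(round(z, 2))
```

A strongly negative Z would indicate fewer deaths than the reference curve predicts, i.e. a treatment benefit relative to standard of care.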

  15. Mars Sample Return - Launch and Detection Strategies for Orbital Rendezvous

    Science.gov (United States)

    Woolley, Ryan C.; Mattingly, Richard L.; Riedel, Joseph E.; Sturm, Erick J.

    2011-01-01

This study sets forth conceptual mission design strategies for the ascent and rendezvous phase of the proposed NASA/ESA joint Mars Sample Return Campaign. The current notional mission architecture calls for the launch of an acquisition/cache rover in 2018, an orbiter with an Earth return vehicle in 2022, and a fetch rover and ascent vehicle in 2024. Strategies are presented to launch the sample into a coplanar orbit with the Orbiter, which facilitates robust optical detection, orbit determination, and rendezvous. Repeating ground-track orbits exist at 457 and 572 km, which provide multiple launch opportunities with similar geometries for detection and rendezvous.

  16. Mars Sample Return: Launch and Detection Strategies for Orbital Rendezvous

    Science.gov (United States)

    Woolley, Ryan C.; Mattingly, Richard L.; Riedel, Joseph E.; Sturm, Erick J.

    2011-01-01

This study sets forth conceptual mission design strategies for the ascent and rendezvous phase of the proposed NASA/ESA joint Mars Sample Return Campaign. The current notional mission architecture calls for the launch of an acquisition/caching rover in 2018, an Earth return orbiter in 2022, and a fetch rover with ascent vehicle in 2024. Strategies are presented to launch the sample into a nearly coplanar orbit with the Orbiter, which would facilitate robust optical detection, orbit determination, and rendezvous. Repeating ground-track orbits exist at 457 and 572 km, which would provide multiple launch opportunities with similar geometries for detection and rendezvous.

  17. Spent nuclear fuel sampling strategy

    International Nuclear Information System (INIS)

    Bergmann, D.W.

    1995-01-01

This report proposes a strategy for sampling the spent nuclear fuel (SNF) stored in the 105-K Basins (105-K East and 105-K West). This strategy will support path-forward decisions on SNF disposition in the following areas: (1) SNF isolation activities, such as repackaging/overpacking to a newly constructed staging facility; (2) conditioning processes for fuel stabilization; and (3) interim storage options. The strategy was developed without following the Data Quality Objective (DQO) methodology; it is, however, intended to augment the SNF project DQOs. The SNF sampling plan is derived by evaluating the current storage condition of the SNF and the factors that affected SNF corrosion/degradation

  18. Sampling strategies in antimicrobial resistance monitoring: evaluating how precision and sensitivity vary with the number of animals sampled per farm.

    Directory of Open Access Journals (Sweden)

    Takehisa Yamamoto

    Full Text Available Because antimicrobial resistance in food-producing animals is a major public health concern, many countries have implemented antimicrobial monitoring systems at a national level. When designing a sampling scheme for antimicrobial resistance monitoring, it is necessary to consider both cost effectiveness and statistical plausibility. In this study, we examined how sampling scheme precision and sensitivity can vary with the number of animals sampled from each farm, while keeping the overall sample size constant to avoid additional sampling costs. Five sampling strategies were investigated. These employed 1, 2, 3, 4 or 6 animal samples per farm, with a total of 12 animals sampled in each strategy. A total of 1,500 Escherichia coli isolates from 300 fattening pigs on 30 farms were tested for resistance against 12 antimicrobials. The performance of each sampling strategy was evaluated by bootstrap resampling from the observational data. In the bootstrapping procedure, farms, animals, and isolates were selected randomly with replacement, and a total of 10,000 replications were conducted. For each antimicrobial, we observed that the standard deviation and 2.5-97.5 percentile interval of resistance prevalence were smallest in the sampling strategy that employed 1 animal per farm. The proportion of bootstrap samples that included at least 1 isolate with resistance was also evaluated as an indicator of the sensitivity of the sampling strategy to previously unidentified antimicrobial resistance. The proportion was greatest with 1 sample per farm and decreased with larger samples per farm. We concluded that when the total number of samples is pre-specified, the most precise and sensitive sampling strategy involves collecting 1 sample per farm.
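The hierarchical bootstrap described above (resampling farms, then animals, then isolates, each with replacement) can be sketched as follows, on synthetic data with an invented cluster structure standing in for the observational data:

```python
import random

rng = random.Random(42)
# Hypothetical data: 30 farms, 4 animals per farm, 5 isolates per animal;
# farm-level resistance probability varies to mimic between-farm clustering.
farms = []
for _ in range(30):
    p_farm = rng.uniform(0.05, 0.4)
    farms.append([[1 if rng.random() < p_farm else 0 for _ in range(5)]
                  for _ in range(4)])

def bootstrap_prevalence(farms, n_rep, rng):
    """Hierarchical bootstrap: resample farms, then animals, then isolates,
    each with replacement; record resistance prevalence per replicate."""
    out = []
    for _ in range(n_rep):
        isolates = []
        for farm in rng.choices(farms, k=len(farms)):
            for animal in rng.choices(farm, k=len(farm)):
                isolates.extend(rng.choices(animal, k=len(animal)))
        out.append(sum(isolates) / len(isolates))
    return out

reps = bootstrap_prevalence(farms, 2000, rng)
reps.sort()
lo, hi = reps[int(0.025 * len(reps))], reps[int(0.975 * len(reps))]
print(round(lo, 3), round(hi, 3))   # 2.5-97.5 percentile interval
```

The width of the percentile interval is the precision measure the study compares across the 1-to-6-animals-per-farm designs.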

  19. Fault Management Design Strategies

    Science.gov (United States)

    Day, John C.; Johnson, Stephen B.

    2014-01-01

Development of dependable systems relies on the ability of the system to determine and respond to off-nominal system behavior. Specification and development of these fault management capabilities must be done in a structured and principled manner to improve our understanding of these systems and to make significant gains in dependability (safety, reliability, and availability). Prior work has described a fundamental taxonomy and theory of System Health Management (SHM) and of its operational subset, Fault Management (FM). This conceptual foundation provides a basis for developing a framework to design and implement FM strategies that protect mission objectives and account for system design limitations. Selection of an SHM strategy has implications for the functions required to perform the strategy, and it places constraints on the set of possible design solutions. The framework developed in this paper provides a rigorous and principled approach to classifying SHM strategies, as well as methods for their determination and implementation. An illustrative example describes the application of the framework and the resulting benefits to system and FM design and dependability.

  20. User-driven sampling strategies in image exploitation

    Science.gov (United States)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.

  1. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software.

    Science.gov (United States)

    Dols, W Stuart; Persily, Andrew K; Morrow, Jayne B; Matzke, Brett D; Sego, Landon H; Nuffer, Lisa L; Pulsipher, Brent A

    2010-01-01

In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions, in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools: CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority zones within the building, and sampling designs and strategies could then be developed based on those zones.

  2. Spatiotemporally Representative and Cost-Efficient Sampling Design for Validation Activities in Wanglang Experimental Site

    Directory of Open Access Journals (Sweden)

    Gaofei Yin

    2017-11-01

Full Text Available Spatiotemporally representative Elementary Sampling Units (ESUs) are required for capturing the temporal variations in surface spatial heterogeneity through field measurements. Since inaccessibility often coexists with heterogeneity, a cost-efficient sampling design is mandatory. We proposed a sampling strategy to generate spatiotemporally representative and cost-efficient ESUs based on the conditioned Latin hypercube sampling scheme. The proposed strategy was constrained by multi-temporal Normalized Difference Vegetation Index (NDVI) imagery, and the ESUs were limited to a sampling-feasible region established from accessibility criteria. A novel criterion based on the Overlapping Area (OA) between the NDVI frequency distribution histogram of the sampled ESUs and that of the entire study area was used to assess sampling efficiency. A case study in Wanglang National Nature Reserve in China showed that the proposed strategy improves the spatiotemporal representativeness of sampling (mean annual OA = 74.7%) compared to the single-temporally constrained (OA = 68.7%) and random sampling (OA = 63.1%) strategies. The introduction of the feasible-region constraint significantly reduces labour-intensive in-situ characterization, at the expense of about a 9% loss in the spatiotemporal representativeness of the sampling. Our study will support the validation activities at the Wanglang experimental site, providing a benchmark for locating the nodes of automatic observation systems (e.g., LAINet), which need a spatially distributed and temporally fixed sampling design.
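The OA criterion is straightforward to compute: the sum over bins of the minimum of two normalized histograms. A small sketch with toy NDVI values (the bin count and value range here are assumptions, not the paper's settings):

```python
def overlapping_area(values_sampled, values_all, n_bins=10, lo=0.0, hi=1.0):
    """OA criterion: sum of per-bin minima of two normalized histograms;
    1.0 means the sample reproduces the full-area distribution exactly."""
    def hist(vals):
        counts = [0] * n_bins
        for v in vals:
            b = min(int((v - lo) / (hi - lo) * n_bins), n_bins - 1)
            counts[b] += 1
        return [c / len(vals) for c in counts]
    h1, h2 = hist(values_sampled), hist(values_all)
    return sum(min(a, b) for a, b in zip(h1, h2))

# Toy NDVI values: the sample under-represents high-NDVI pixels.
area_ndvi = [i / 100 for i in range(100)]      # uniform over [0, 1)
sample_ndvi = [i / 100 for i in range(60)]     # only low-to-mid values
print(round(overlapping_area(sample_ndvi, area_ndvi), 2))
```

A sample that misses part of the NDVI range loses OA in exactly the bins it fails to cover, which is what makes the criterion a natural representativeness score.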

  3. Mendelian breeding units versus standard sampling strategies: mitochondrial DNA variation in southwest Sardinia

    Directory of Open Access Journals (Sweden)

    Daria Sanna

    2011-01-01

Full Text Available We report a sampling strategy based on Mendelian Breeding Units (MBUs), each representing an interbreeding group of individuals sharing a common gene pool. The identification of MBUs is crucial for case-control experimental design in association studies. The aim of this work was to evaluate the possible existence of bias in terms of genetic variability and haplogroup frequencies in the MBU sample, due to severe sample selection. To this end, the MBU sampling strategy was compared to a standard selection of individuals according to their surname and place of birth. We analysed mitochondrial DNA variation (first hypervariable segment and coding region) in unrelated healthy subjects from two different areas of Sardinia: the area around the town of Cabras and the western Campidano area. No statistically significant differences were observed when the two sampling methods were compared, indicating that the stringent sample selection needed to establish an MBU does not alter the original genetic variability and haplogroup distribution. Therefore, the MBU sampling strategy can be considered a useful tool in association studies of complex traits.

  4. Optimal experiment design in a filtering context with application to sampled network data

    OpenAIRE

    Singhal, Harsh; Michailidis, George

    2010-01-01

    We examine the problem of optimal design in the context of filtering multiple random walks. Specifically, we define the steady state E-optimal design criterion and show that the underlying optimization problem leads to a second order cone program. The developed methodology is applied to tracking network flow volumes using sampled data, where the design variable corresponds to controlling the sampling rate. The optimal design is numerically compared to a myopic and a naive strategy. Finally, w...

  5. A Geostatistical Approach to Indoor Surface Sampling Strategies

    DEFF Research Database (Denmark)

    Schneider, Thomas; Petersen, Ole Holm; Nielsen, Allan Aasbjerg

    1990-01-01

    Particulate surface contamination is of concern in production industries such as food processing, aerospace, electronics and semiconductor manufacturing. There is also an increased awareness that surface contamination should be monitored in industrial hygiene surveys. A conceptual and theoretical...... framework for designing sampling strategies is thus developed. The distribution and spatial correlation of surface contamination can be characterized using concepts from geostatistical science, where spatial applications of statistics is most developed. The theory is summarized and particulate surface...... contamination, sampled from small areas on a table, have been used to illustrate the method. First, the spatial correlation is modelled and the parameters estimated from the data. Next, it is shown how the contamination at positions not measured can be estimated with kriging, a minimum mean square error method...

  6. Validated sampling strategy for assessing contaminants in soil stockpiles

    International Nuclear Information System (INIS)

    Lame, Frank; Honders, Ton; Derksen, Giljam; Gadella, Michiel

    2005-01-01

    Dutch legislation on the reuse of soil requires a sampling strategy to determine the degree of contamination. This sampling strategy was developed in three stages. Its main aim is to obtain a single analytical result, representative of the true mean concentration of the soil stockpile. The development process started with an investigation into how sample pre-treatment could be used to obtain representative results from composite samples of heterogeneous soil stockpiles. Combining a large number of random increments allows stockpile heterogeneity to be fully represented in the sample. The resulting pre-treatment method was then combined with a theoretical approach to determine the necessary number of increments per composite sample. At the second stage, the sampling strategy was evaluated using computerised models of contaminant heterogeneity in soil stockpiles. The now theoretically based sampling strategy was implemented by the Netherlands Centre for Soil Treatment in 1995. It was applied to all types of soil stockpiles, ranging from clean to heavily contaminated, over a period of four years. This resulted in a database containing the analytical results of 2570 soil stockpiles. At the final stage these results were used for a thorough validation of the sampling strategy. It was concluded that the model approach has indeed resulted in a sampling strategy that achieves analytical results representative of the mean concentration of soil stockpiles. - A sampling strategy that ensures analytical results representative of the mean concentration in soil stockpiles is presented and validated
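
    The pre-treatment principle described above — combining a large number of random increments so that stockpile heterogeneity is fully represented in a single composite sample — can be illustrated with a small simulation. This is a sketch under an assumed lognormal heterogeneity model, not the validated strategy itself.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical heterogeneous stockpile: lognormal contaminant concentrations
stockpile = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

def composite_mean(n_increments):
    """Mean concentration of a composite built from random increments."""
    return rng.choice(stockpile, n_increments).mean()

# Spread of composite results around the true mean, for few vs. many increments
few = np.std([composite_mean(5) for _ in range(500)])
many = np.std([composite_mean(50) for _ in range(500)])
```

    More increments per composite shrink the spread of the analytical result around the stockpile's true mean, which is the rationale for the large-increment pre-treatment step.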

  7. Adaptive sampling strategies with high-throughput molecular dynamics

    Science.gov (United States)

    Clementi, Cecilia

    Despite recent significant hardware and software developments, the complete thermodynamic and kinetic characterization of large macromolecular complexes by molecular simulations still presents significant challenges. The high dimensionality of these systems and the complexity of the associated potential energy surfaces (creating multiple metastable regions connected by high free energy barriers) do not usually allow adequate sampling of the relevant regions of their configurational space by means of a single, long Molecular Dynamics (MD) trajectory. Several different approaches have been proposed to tackle this sampling problem. We focus on the development of ensemble simulation strategies, where data from a large number of weakly coupled simulations are integrated to explore the configurational landscape of a complex system more efficiently. Ensemble methods are of increasing interest as the hardware roadmap is now mostly based on increasing core counts, rather than clock speeds. The main challenge in the development of an ensemble approach for efficient sampling is in the design of strategies to adaptively distribute the trajectories over the relevant regions of the systems' configurational space, without using any a priori information on the system global properties. We will discuss the definition of smart adaptive sampling approaches that can redirect computational resources towards unexplored yet relevant regions. Our approaches are based on new developments in dimensionality reduction for high dimensional dynamical systems, and optimal redistribution of resources. NSF CHE-1152344, NSF CHE-1265929, Welch Foundation C-1570.

  8. Sampling strategies for indoor radon investigations

    International Nuclear Information System (INIS)

    Prichard, H.M.

    1983-01-01

    Recent investigations prompted by concern about the environmental effects of residential energy conservation have produced many accounts of indoor radon concentrations far above background levels. In many instances time-normalized annual exposures exceeded the 4 WLM per year standard currently used for uranium mining. Further investigations of indoor radon exposures are necessary to judge the extent of the problem and to estimate the practicality of health effects studies. A number of trends can be discerned as more indoor surveys are reported. It is becoming increasingly clear that local geological factors play a major, if not dominant, role in determining the distribution of indoor radon concentrations in a given area. Within a given locale, indoor radon concentrations tend to be log-normally distributed, and sample means differ markedly from one region to another. The appreciation of geological factors and the general log-normality of radon distributions will improve the accuracy of population dose estimates and facilitate the design of preliminary health effects studies. The relative merits of grab samples, short and long term integrated samples, and more complicated dose assessment strategies are discussed in the context of several types of epidemiological investigations. A new passive radon sampler with a 24 hour integration time is described and evaluated as a tool for pilot investigations
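
    The log-normality noted above matters for dose estimates because the arithmetic mean of a lognormal distribution exceeds its geometric mean, so surveys summarized only by geometric means understate population exposure. A small sketch, in which the geometric mean and geometric standard deviation are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical lognormal indoor radon concentrations (GM and GSD assumed)
gm, gsd = 1.0, 2.5            # geometric mean and geometric std. deviation
mu, sigma = np.log(gm), np.log(gsd)
conc = rng.lognormal(mu, sigma, 5000)

arithmetic_mean = conc.mean()
geometric_mean = np.exp(np.log(conc).mean())
# For a lognormal, the arithmetic mean always exceeds the geometric mean
```
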

  9. An instrument design and sample strategy for measuring soil respiration in the coastal temperate rain forest

    Science.gov (United States)

    Nay, S. M.; D'Amore, D. V.

    2009-12-01

    The coastal temperate rainforest (CTR) along the northwest coast of North America is a large and complex mosaic of forests and wetlands located on an undulating terrain ranging from sea level to thousands of meters in elevation. This biome stores a dynamic portion of the total carbon stock of North America. The fate of the terrestrial carbon stock is of concern due to the potential for mobilization and export of this store to both the atmosphere as carbon respiration flux and ocean as dissolved organic and inorganic carbon flux. Soil respiration is the largest export vector in the system and must be accurately measured to gain any comprehensive understanding of how carbon moves though this system. Suitable monitoring tools capable of measuring carbon fluxes at small spatial scales are essential for our understanding of carbon dynamics at larger spatial scales within this complex assemblage of ecosystems. We have adapted instrumentation and developed a sampling strategy for optimizing replication of soil respiration measurements to quantify differences among spatially complex landscape units of the CTR. We start with the design of the instrument to ease the technological, ergonomic and financial barriers that technicians encounter in monitoring the efflux of CO2 from the soil. Our sampling strategy optimizes the physical efforts of the field work and manages for the high variation of flux measurements encountered in this difficult environment of rough terrain, dense vegetation and wet climate. Our soil respirometer incorporates an infra-red gas analyzer (LiCor Inc. LI-820) and an 8300 cm3 soil respiration chamber; the device is durable, lightweight, easy to operate and can be built for under $5000 per unit. The modest unit price allows for a multiple unit fleet to be deployed and operated in an intensive field monitoring campaign. We use a large 346 cm2 collar to accommodate as much micro-spatial variation as feasible and to facilitate repeated measures for tracking

  10. The SDSS-IV MaNGA Sample: Design, Optimization, and Usage Considerations

    Science.gov (United States)

    Wake, David A.; Bundy, Kevin; Diamond-Stanic, Aleksandar M.; Yan, Renbin; Blanton, Michael R.; Bershady, Matthew A.; Sánchez-Gallego, José R.; Drory, Niv; Jones, Amy; Kauffmann, Guinevere; Law, David R.; Li, Cheng; MacDonald, Nicholas; Masters, Karen; Thomas, Daniel; Tinker, Jeremy; Weijmans, Anne-Marie; Brownstein, Joel R.

    2017-09-01

    We describe the sample design for the SDSS-IV MaNGA survey and present the final properties of the main samples along with important considerations for using these samples for science. Our target selection criteria were developed while simultaneously optimizing the size distribution of the MaNGA integral field units (IFUs), the IFU allocation strategy, and the target density to produce a survey defined in terms of maximizing signal-to-noise ratio, spatial resolution, and sample size. Our selection strategy makes use of redshift limits that only depend on I-band absolute magnitude (M_I), or, for a small subset of our sample, M_I and color (NUV - I). Such a strategy ensures that all galaxies span the same range in angular size irrespective of luminosity and are therefore covered evenly by the adopted range of IFU sizes. We define three samples: the Primary and Secondary samples are selected to have a flat number density with respect to M_I and are targeted to have spectroscopic coverage to 1.5 and 2.5 effective radii (R_e), respectively. The Color-Enhanced supplement increases the number of galaxies in the low-density regions of color-magnitude space by extending the redshift limits of the Primary sample in the appropriate color bins. The samples cover the stellar mass range 5 × 10^8 ≤ M* ≤ 3 × 10^11 M_⊙ h^-2 and are sampled at median physical resolutions of 1.37 and 2.5 kpc for the Primary and Secondary samples, respectively. We provide weights that will statistically correct for our luminosity- and color-dependent selection function and IFU allocation strategy, thus correcting the observed sample to a volume-limited sample.

  11. Sampling strategies for estimating brook trout effective population size

    Science.gov (United States)

    Andrew R. Whiteley; Jason A. Coombs; Mark Hudy; Zachary Robinson; Keith H. Nislow; Benjamin H. Letcher

    2012-01-01

    The influence of sampling strategy on estimates of effective population size (Ne) from single-sample genetic methods has not been rigorously examined, though these methods are increasingly used. For headwater salmonids, spatially close kin association among age-0 individuals suggests that sampling strategy (number of individuals and location from...

  12. Comparison of sampling strategies for tobacco retailer inspections to maximize coverage in vulnerable areas and minimize cost.

    Science.gov (United States)

    Lee, Joseph G L; Shook-Sa, Bonnie E; Bowling, J Michael; Ribisl, Kurt M

    2017-06-23

    In the United States, tens of thousands of inspections of tobacco retailers are conducted each year. Various sampling choices can reduce travel costs, emphasize enforcement in areas with greater non-compliance, and allow for comparability between states and over time. We sought to develop a model sampling strategy for state tobacco retailer inspections. Using a 2014 list of 10,161 North Carolina tobacco retailers, we compared results from simple random sampling; stratified sampling clustered at the ZIP code level; and stratified sampling clustered at the census tract level. We conducted a simulation of repeated sampling and compared approaches for their comparative level of precision, coverage, and retailer dispersion. While maintaining an adequate design effect and statistical precision appropriate for a public health enforcement program, both the stratified, clustered ZIP- and tract-based approaches were feasible. Both ZIP and tract strategies yielded improvements over simple random sampling, with relative improvements, respectively, of average distance between retailers (reduced 5.0% and 1.9%), percent Black residents in sampled neighborhoods (increased 17.2% and 32.6%), percent Hispanic residents in sampled neighborhoods (reduced 2.2% and increased 18.3%), percentage of sampled retailers located near schools (increased 61.3% and 37.5%), and poverty rate in sampled neighborhoods (increased 14.0% and 38.2%). States can make retailer inspections more efficient and targeted with stratified, clustered sampling. Use of statistically appropriate sampling strategies like these should be considered by states, researchers, and the Food and Drug Administration to improve program impact and allow for comparisons over time and across states. The authors present a model tobacco retailer sampling strategy for promoting compliance and reducing costs that could be used by U.S. states and the Food and Drug Administration (FDA). The design is feasible to implement in North Carolina. Use of
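
    The two-stage design discussed above — sample geographic clusters first, then retailers within each sampled cluster — can be sketched as follows. This is an illustrative sketch, not the paper's actual design; the retailer list, cluster counts, and per-cluster sizes are all made up.

```python
import random

def stratified_cluster_sample(retailers, n_clusters, per_cluster, seed=0):
    """Two-stage sample: draw geographic clusters (e.g., census tracts),
    then draw retailers within each sampled cluster."""
    rng = random.Random(seed)
    by_cluster = {}
    for rid, cluster in retailers:
        by_cluster.setdefault(cluster, []).append(rid)
    chosen = rng.sample(sorted(by_cluster), min(n_clusters, len(by_cluster)))
    sample = []
    for c in chosen:
        members = by_cluster[c]
        sample.extend(rng.sample(members, min(per_cluster, len(members))))
    return sample

# 100 hypothetical retailers spread evenly over 10 tracts
retailers = [(f"R{i:03d}", f"tract-{i % 10}") for i in range(100)]
sample = stratified_cluster_sample(retailers, n_clusters=4, per_cluster=5)
```

    Clustering concentrates the sampled retailers geographically, which is what reduces inspector travel relative to a simple random sample of the whole list.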

  13. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    Science.gov (United States)

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To conduct a survey to estimate the distribution of mammographic breast density in Korean women, appropriate sampling strategies for representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times using stratified random sampling to investigate the distribution of breast density of 1,340,362 women. According to the simulation results, using a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, we estimated the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.

  14. Iterative Prototyping of Strategy Implementation Workshop Design

    DEFF Research Database (Denmark)

    Kryger, Anders

    2018-01-01

    Purpose: The purpose of this paper is to demonstrate how a strategy implementation workshop design can be developed and tested while minimizing the time spent on developing the design. Design/methodology/approach: This multiple case study at a diesel engine company shows how iterative prototyping...... can be used to structure the design process of a strategy implementation workshop. Findings: Strategy implementation workshop design can be developed in resource-constrained environments through iterative prototyping of the workshop design. Each workshop iteration can generate value in its own right...... draw on his/her experience as well as add to his/her knowledge base. Originality/value: Introducing iterative prototyping in an organizational context can facilitate fast yet structured development of a rigorous workshop design. Strategy consultants are provided with empirical examples of how...

  15. A census-weighted, spatially-stratified household sampling strategy for urban malaria epidemiology

    Directory of Open Access Journals (Sweden)

    Slutsker Laurence

    2008-02-01

    Full Text Available Abstract Background Urban malaria is likely to become increasingly important as a consequence of the growing proportion of Africans living in cities. A novel sampling strategy was developed for urban areas to generate a sample simultaneously representative of population and inhabited environments. Such a strategy should facilitate analysis of important epidemiological relationships in this ecological context. Methods Census maps and summary data for Kisumu, Kenya, were used to create a pseudo-sampling frame using the geographic coordinates of census-sampled structures. For every enumeration area (EA designated as urban by the census (n = 535, a sample of structures equal to one-tenth the number of households was selected. In EAs designated as rural (n = 32, a geographically random sample totalling one-tenth the number of households was selected from a grid of points at 100 m intervals. The selected samples were cross-referenced to a geographic information system, and coordinates transferred to handheld global positioning units. Interviewers found the closest eligible household to the sampling point and interviewed the caregiver of a child aged Results 4,336 interviews were completed in 473 of the 567 study area EAs from June 2002 through February 2003. EAs without completed interviews were randomly distributed, and non-response was approximately 2%. Mean distance from the assigned sampling point to the completed interview was 74.6 m, and was significantly less in urban than rural EAs, even when controlling for number of households. The selected sample had significantly more children and females of childbearing age than the general population, and fewer older individuals. Conclusion This method selected a sample that was simultaneously population-representative and inclusive of important environmental variation. The use of a pseudo-sampling frame and pre-programmed handheld GPS units is more efficient and may yield a more complete sample than

  16. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem, and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  17. Nature-Inspired Design : Strategies for Sustainable Product Development

    NARCIS (Netherlands)

    De Pauw, I.C.

    2015-01-01

    Product designers can apply different strategies, methods, and tools for sustainable product development. Nature-Inspired Design Strategies (NIDS) offer designers a distinct class of strategies that use ‘nature’ as a guiding source of knowledge and inspiration for addressing sustainability.

  18. The Green Studio Handbook: Environmental Strategies for Schematic Design

    Directory of Open Access Journals (Sweden)

    Alison G. Kwok

    2012-11-01

    Full Text Available In design studio projects we often see schemes with inspired, yet unvalidated, gestural sketches related to wishful green strategies. Yellow and blue magic arrows represent hypotheses about the behavior of daylight and/or air flow in and about buildings. This paper provides an overview of The Green Studio Handbook, recently published as a resource for designers seeking clear guidelines for integrating green design strategies into the conceptual and schematic phases of design. The book contains a discussion of the integration of green strategies and how building form, orientation, and spatial layout are critical to the proper performance of certain green strategies; 40 green design strategies in six broad topic areas, each providing a catalog of information for common strategies that must be implemented at the schematic design phase; and nine case studies that show how various green strategies work together in a finished building. This paper provides excerpts of several design strategies and one case study and suggests a variety of ways that the book may be used. Keywords: green design, case studies, education, schematic design

  19. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influences of the seed node choice and path overlap, respectively. The three random walk samplings are applied to the Erdős-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks, respectively. Then, the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions can be reached with all three random walk strategies. Firstly, networks with small scales and simple structures are conducive to sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within a limited number of steps. Thirdly, all the degree distributions of the subnets are slightly biased to the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, some obvious characteristics, such as the larger clustering coefficient and the fluctuation of the degree distribution, are reproduced well by these random walk strategies.
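
    The NR (no-retracing) strategy differs from a classical random walk only in excluding the node the walker just left. A minimal sketch on a toy graph; the dead-end fallback (allow retracing when no other neighbor exists) is an assumption of this sketch, not necessarily the authors' rule.

```python
import random

def no_retracing_walk(adj, start, steps, seed=0):
    """Random walk that never immediately returns to the node it just
    left (NR strategy); falls back to retracing only at dead ends."""
    rng = random.Random(seed)
    walk, prev = [start], None
    for _ in range(steps):
        current = walk[-1]
        choices = [n for n in adj[current] if n != prev]
        if not choices:          # dead end: retracing is the only move
            choices = adj[current]
        prev = current
        walk.append(rng.choice(choices))
    return walk

# Toy graph: a 6-node cycle
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
walk = no_retracing_walk(adj, start=0, steps=50)
```

    On any graph without dead ends, no node in the walk equals the node two steps before it, which is exactly the path-overlap reduction the NR strategy targets.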

  20. Linking Design to Business Strategy Through Functional Analysis

    DEFF Research Database (Denmark)

    Simonsen, Jesper

    1997-01-01

    The paper discusses how designers, conducting design projects in specific organizations, can ensure that the design of IT is appropriately linked to the organization's overall business strategy. A case study is presented in the form of a design project in a small public organization. Functional...... analysis was used as a means to clarify how a specific needed information system could support the organization's new business strategy. Using functional analysis in the design project had a powerful effect: it seriously challenged the organization's business strategy and revealed that the system...... to the relation between an organization's IT-projects and its business strategy and by suggesting that it is the responsibility of the designers, conducting design projects, to assure that this task is taken proper care of. Practical guidelines for this purpose are given....

  1. Planning schistosomiasis control: investigation of alternative sampling strategies for Schistosoma mansoni to target mass drug administration of praziquantel in East Africa.

    Science.gov (United States)

    Sturrock, Hugh J W; Gething, Pete W; Ashton, Ruth A; Kolaczinski, Jan H; Kabatereine, Narcis B; Brooker, Simon

    2011-09-01

    In schistosomiasis control, there is a need to geographically target treatment to populations at high risk of morbidity. This paper evaluates alternative sampling strategies for surveys of Schistosoma mansoni to target mass drug administration in Kenya and Ethiopia. Two main designs are considered: lot quality assurance sampling (LQAS) of children from all schools; and a geostatistical design that samples a subset of schools and uses semi-variogram analysis and spatial interpolation to predict prevalence in the remaining unsurveyed schools. Computerized simulations are used to investigate the performance of sampling strategies in correctly classifying schools according to treatment needs and their cost-effectiveness in identifying high prevalence schools. LQAS performs better than geostatistical sampling in correctly classifying schools, but at a higher cost per high-prevalence school correctly classified. It is suggested that the optimal surveying strategy for S. mansoni needs to take into account the goals of the control programme and the financial and drug resources available.
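
    The LQAS classification idea can be sketched with a binomial model: sample n children per school and flag the school for mass treatment when the number of positives reaches a decision value d. The n = 15, d = 3 rule below is a hypothetical example, not the paper's actual design.

```python
from math import comb

def prob_classified_high(prev, n, d):
    """Probability that a school with true prevalence `prev` is classified
    as high-prevalence when `n` children are sampled and `d` positives
    trigger classification (binomial model)."""
    return sum(comb(n, k) * prev**k * (1 - prev)**(n - k) for k in range(d, n + 1))

# Hypothetical rule: sample 15 children, classify high if >= 3 are positive
low_prev_school = prob_classified_high(0.05, 15, 3)   # rarely flagged
high_prev_school = prob_classified_high(0.50, 15, 3)  # almost always flagged
```

    Tuning n and d trades off misclassification rates against survey cost, which is the comparison the simulations in the paper make against the geostatistical design.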

  2. Efficiency enhancement of optimized Latin hypercube sampling strategies: Application to Monte Carlo uncertainty analysis and meta-modeling

    Science.gov (United States)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans

    2015-02-01

    The majority of literature regarding optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm, on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has previously been made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then, by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models. The study also illustrates that this
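
    The distinction between random LHS and midpoint LHS initial designs is simply where each point sits inside its Latin hypercube interval. A minimal sketch of that initial-design step only (not the OLHS optimization itself):

```python
import numpy as np

def lhs(n, dims, midpoint=False, seed=0):
    """Latin hypercube sample in [0, 1)^dims: exactly one point per 1/n
    interval in each dimension. midpoint=True centers points in their
    intervals (midpoint LHS); otherwise offsets are random (random LHS)."""
    rng = np.random.default_rng(seed)
    offsets = np.full((n, dims), 0.5) if midpoint else rng.random((n, dims))
    design = np.empty((n, dims))
    for j in range(dims):
        design[:, j] = (rng.permutation(n) + offsets[:, j]) / n
    return design

mid = lhs(10, 2, midpoint=True)
rand = lhs(10, 2, midpoint=False)
```

    Both variants satisfy the LHS property (each column occupies every 1/n interval exactly once); they differ only in the within-interval placement that the optimization then starts from.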

  3. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    Science.gov (United States)

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  4. A Bayesian sampling strategy for hazardous waste site characterization

    International Nuclear Information System (INIS)

    Skalski, J.R.

    1987-12-01

    Prior knowledge based on historical records or physical evidence often suggests the existence of a hazardous waste site. Initial surveys may provide additional or even conflicting evidence of site contamination. This article presents a Bayes sampling strategy that allocates sampling at a site using this prior knowledge. This sampling strategy minimizes the environmental risks of missing chemical or radionuclide hot spots at a waste site. The environmental risk is shown to be proportional to the size of the undetected hot spot or inversely proportional to the probability of hot spot detection. 12 refs., 2 figs
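
    The inverse relationship between environmental risk and hot-spot detection probability can be illustrated with a simple Monte Carlo sketch of grid sampling for a circular hot spot. All parameters are illustrative; this is not the Bayesian allocation scheme itself.

```python
import numpy as np

def detection_probability(hotspot_radius, grid_spacing, trials=20_000, seed=1):
    """Monte Carlo estimate of the chance that a square sampling grid
    intersects a circular hot spot (illustrative parameters throughout)."""
    rng = np.random.default_rng(seed)
    # By symmetry, place random hot-spot centers within a single grid cell
    centers = rng.random((trials, 2)) * grid_spacing
    # Distance from each center to its nearest grid node
    nearest = np.round(centers / grid_spacing) * grid_spacing
    dist = np.linalg.norm(centers - nearest, axis=1)
    return float((dist <= hotspot_radius).mean())

p_small = detection_probability(hotspot_radius=0.2, grid_spacing=1.0)
p_large = detection_probability(hotspot_radius=0.6, grid_spacing=1.0)
```

    Smaller hot spots are detected less often at a fixed grid spacing, so the environmental risk of missing them is correspondingly larger, which is the trade-off the Bayes strategy allocates sampling against.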

  5. Methodology Series Module 5: Sampling Strategies

    OpenAIRE

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice, or a population that is accessible and available. Some of the non-probabilit...
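
    The two families of methods can be contrasted in code: probability sampling gives every unit a known selection chance, which is what licenses statistical inference. A minimal sketch of two common probability designs; the sampling frame and sizes are hypothetical.

```python
import random

def simple_random_sample(frame, n, seed=7):
    """Probability sampling: every unit has an equal, known chance."""
    return random.Random(seed).sample(frame, n)

def systematic_sample(frame, n, seed=7):
    """Probability sampling: every k-th unit after a random start."""
    k = len(frame) // n
    start = random.Random(seed).randrange(k)
    return frame[start::k][:n]

frame = list(range(1, 101))      # hypothetical frame of 100 subjects
srs = simple_random_sample(frame, 10)
sys_sample = systematic_sample(frame, 10)
```
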

  6. Population pharmacokinetic analysis of clopidogrel in healthy Jordanian subjects with emphasis on optimal sampling strategy.

    Science.gov (United States)

    Yousef, A M; Melhem, M; Xue, B; Arafat, T; Reynolds, D K; Van Wart, S A

    2013-05-01

    Clopidogrel is metabolized primarily into an inactive carboxyl metabolite (clopidogrel-IM) or to a lesser extent an active thiol metabolite. A population pharmacokinetic (PK) model was developed using NONMEM(®) to describe the time course of clopidogrel-IM in plasma and to design a sparse-sampling strategy to predict clopidogrel-IM exposures for use in characterizing anti-platelet activity. Serial blood samples from 76 healthy Jordanian subjects administered a single 75 mg oral dose of clopidogrel were collected and assayed for clopidogrel-IM using reverse phase high performance liquid chromatography. A two-compartment (2-CMT) PK model with first-order absorption and elimination plus an absorption lag-time was evaluated, as well as a variation of this model designed to mimic enterohepatic recycling (EHC). Optimal PK sampling strategies (OSS) were determined using WinPOPT based upon collection of 3-12 post-dose samples. A two-compartment model with EHC provided the best fit and reduced bias in C(max) (median prediction error (PE%) of 9.58% versus 12.2%) relative to the basic two-compartment model, AUC(0-24) was similar for both models (median PE% = 1.39%). The OSS for fitting the two-compartment model with EHC required the collection of seven samples (0.25, 1, 2, 4, 5, 6 and 12 h). Reasonably unbiased and precise exposures were obtained when re-fitting this model to a reduced dataset considering only these sampling times. A two-compartment model considering EHC best characterized the time course of clopidogrel-IM in plasma. Use of the suggested OSS will allow for the collection of fewer PK samples when assessing clopidogrel-IM exposures. Copyright © 2013 John Wiley & Sons, Ltd.

  7. Trends in observable passive solar design strategies for existing homes in the U.S

    International Nuclear Information System (INIS)

    Kruzner, Kelly; Cox, Kristin; Machmer, Brian; Klotz, Leidy

    2013-01-01

    Passive design strategies are among the most cost-effective methods to reduce energy consumption in buildings. However, the prevalence of these strategies in existing U.S. homes is not well understood. To help address this issue, this research evaluated a nationally-representative sample of 1000 existing homes distributed geographically across the U.S. Using satellite images, each building was evaluated for three passive design strategies: orientation, roof color, and level of shading. Several statistically significant regional trends were identified. For example, existing homes in the High Plains, Ohio Valley, Northwest, and Southern regions show a statistically significant trend towards orientation in the East–West direction, an effective passive design strategy. Less intuitively, in terms of what would seem to be optimal passive design, buildings in the High Plains and Ohio Valley generally have lighter roof colors than buildings in the warmer Southwest region. At the national level, no statistically significant trends were found towards the passive design strategies evaluated. These trends give us no reason to believe they were a major consideration in the design of existing homes. Policy measures and education may be required to take advantage of the opportunity for cost-effective energy savings through more widespread passive solar design. - Highlights: ► GoogleMaps to examine implementation of cost-effective, observable passive solar strategies in U.S. houses. ► No national trends toward passive solar design in U.S.—a missed opportunity. ► Some regional passive solar trends in U.S. for house orientation, roof color

  8. Appropriate teaching and learning strategies for the architectural design process in pedagogic design studios

    Directory of Open Access Journals (Sweden)

    Ashraf M. Soliman

    2017-06-01

    Full Text Available The national qualification framework of a country requires a certain level of knowledge and complexity of skills for an academic degree to be recognized. For architectural programs, student workload is heavy on design courses. Therefore, each course must be carefully developed to ensure that students are not overloaded. Teaching and learning strategies have different implications for courses, which occasionally result in overloading the students. This research aims to study the three main pillars of teaching and learning strategies for each design phase in pedagogic design studios. The most appropriate model for each teaching and learning strategy, including a set of the three main pillars, is then identified for each design phase. A practical strategy for managing design studios is also determined. The aforementioned three pillars are as follows: teaching and learning methods, assigned tasks or study aspects, and design communication techniques. Two research methods, namely, a literature review and a survey among design educators, are adopted. The literature review examines aspects that contribute to the design process and its phases, teaching methods, design skills, communication methods, and studio management strategies. On the basis of the literature review, the background of developments and practices in the design education process are used as constructive tools to develop the survey for design educators. Through the survey, the pillars of teaching and learning strategies that are frequently practiced in design studios are evaluated. Results of this study are classified into three ranks using the natural breaks classification method for numerical values. Subsequently, three priority models that correspond to teaching and learning strategies, as well as to the required skills and capabilities, are established. A group-based strategy with an interdisciplinary approach is also determined to be the most suitable technique for managing the

  9. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
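The systematic random sampling scheme described above (random start, then equidistant sites) can be sketched in a few lines. The extent, interval, and seed are illustrative assumptions:

```python
import random

def systematic_random_sites(extent, interval, seed=None):
    """Systematic random sampling along one axis: one random start
    point inside the first interval, then equidistant sites spanning
    the structure's extent."""
    rng = random.Random(seed)
    start = rng.uniform(0.0, interval)
    sites = []
    x = start
    while x < extent:
        sites.append(x)
        x += interval
    return sites

sites = systematic_random_sites(extent=100.0, interval=10.0, seed=1)
```

Only the start is random; every subsequent site is fixed relative to it, which is what gives the design its efficiency compared with fully independent random placement.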

  10. Evaluating sampling strategies for larval cisco (Coregonus artedi)

    Science.gov (United States)

    Myers, J.T.; Stockwell, J.D.; Yule, D.L.; Black, J.A.

    2008-01-01

    To improve our ability to assess larval cisco (Coregonus artedi) populations in Lake Superior, we conducted a study to compare several sampling strategies. First, we compared density estimates of larval cisco concurrently captured in surface waters with a 2 x 1-m paired neuston net and a 0.5-m (diameter) conical net. Density estimates obtained from the two gear types were not significantly different, suggesting that the conical net is a reasonable alternative to the more cumbersome and costly neuston net. Next, we assessed the effect of tow pattern (sinusoidal versus straight tows) to examine if propeller wash affected larval density. We found no effect of propeller wash on the catchability of larval cisco. Given the availability of global positioning systems, we recommend sampling larval cisco using straight tows to simplify protocols and facilitate straightforward measurements of volume filtered. Finally, we investigated potential trends in larval cisco density estimates by sampling four time periods during the light period of a day at individual sites. Our results indicate no significant trends in larval density estimates during the day. We conclude estimates of larval cisco density across space are not confounded by time at a daily timescale. Well-designed, cost effective surveys of larval cisco abundance will help to further our understanding of this important Great Lakes forage species.

  11. Development of Mitigation Strategy for Beyond Design Basis External Events for NRC Design Certification

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Hak; Lee, Jae Jong; Kim, Myung Ki [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In this study, how to develop a FLEX strategy for beyond-design-basis external events for U.S. NRC design certification is examined. Applicants should develop a unit-specific FLEX strategy and establish the minimum coping capabilities consistent with a unit-specific evaluation of the potential impacts of, and responses to, BDBEEs. NEI 12-06 outlines the process to define and deploy the diverse and flexible mitigation strategies (FLEX strategies) that will increase defense-in-depth for beyond-design-basis scenarios, addressing an extended loss of alternating current (ac) power (ELAP) and loss of normal access to the ultimate heat sink (LUHS) occurring simultaneously at all units on a site. The order (EA-12-049) was issued to all reactor licensees, including holders of active licenses, Construction Permit (CP) holders, and Combined License (COL) holders. Applicants for new reactor design certification should prepare and submit a FLEX strategy for NRC staff review. Because site-specific data for the new reactor cannot be determined during the design certification application, the unit-specific FLEX strategy should be developed accordingly.

  12. Development of Mitigation Strategy for Beyond Design Basis External Events for NRC Design Certification

    International Nuclear Information System (INIS)

    Kim, Dong Hak; Lee, Jae Jong; Kim, Myung Ki

    2013-01-01

    In this study, how to develop a FLEX strategy for beyond-design-basis external events for U.S. NRC design certification is examined. Applicants should develop a unit-specific FLEX strategy and establish the minimum coping capabilities consistent with a unit-specific evaluation of the potential impacts of, and responses to, BDBEEs. NEI 12-06 outlines the process to define and deploy the diverse and flexible mitigation strategies (FLEX strategies) that will increase defense-in-depth for beyond-design-basis scenarios, addressing an extended loss of alternating current (ac) power (ELAP) and loss of normal access to the ultimate heat sink (LUHS) occurring simultaneously at all units on a site. The order (EA-12-049) was issued to all reactor licensees, including holders of active licenses, Construction Permit (CP) holders, and Combined License (COL) holders. Applicants for new reactor design certification should prepare and submit a FLEX strategy for NRC staff review. Because site-specific data for the new reactor cannot be determined during the design certification application, the unit-specific FLEX strategy should be developed accordingly.

  13. Dealing with trade-offs in destructive sampling designs for occupancy surveys.

    Directory of Open Access Journals (Sweden)

    Stefano Canessa

    Full Text Available Occupancy surveys should be designed to minimise false absences. This is commonly achieved by increasing replication or increasing the efficiency of surveys. In the case of destructive sampling designs, in which searches of individual microhabitats represent the repeat surveys, minimising false absences leads to an inherent trade-off. Surveyors can sample more low quality microhabitats, bearing the resultant financial costs and producing wider-spread impacts, or they can target high quality microhabitats where the focal species is more likely to be found and risk more severe impacts on local habitat quality. We show how this trade-off can be solved with a decision-theoretic approach, using the Millewa Skink Hemiergis millewae from southern Australia as a case study. Hemiergis millewae is an endangered reptile that is best detected using destructive sampling of grass hummocks. Within sites that were known to be occupied by H. millewae, logistic regression modelling revealed that lizards were more frequently detected in large hummocks. If this model is an accurate representation of the detection process, searching large hummocks is more efficient and requires less replication, but this strategy also entails destruction of the best microhabitats for the species. We developed an optimisation tool to calculate the minimum combination of the number and size of hummocks to search to achieve a given cumulative probability of detecting the species at a site, incorporating weights to reflect the sensitivity of the results to a surveyor's priorities. The optimisation showed that placing high weight on minimising volume necessitates impractical replication, whereas placing high weight on minimising replication requires searching very large hummocks which are less common and may be vital for H. millewae. While destructive sampling methods are sometimes necessary, surveyors must be conscious of the ecological impacts of these methods. This study provides a
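The cumulative detection probability underlying this optimisation follows directly from independent repeat searches: the chance of at least one detection in n searches with per-search probability p is 1 - (1 - p)^n. A minimal sketch, with purely illustrative detection probabilities:

```python
import math

def replicates_needed(p_detect, target=0.95):
    """Minimum number of independent searches, each with per-search
    detection probability p_detect, so that the cumulative probability
    of at least one detection reaches the target:
    1 - (1 - p)^n >= target  =>  n >= log(1 - target) / log(1 - p)."""
    return math.ceil(math.log(1 - target) / math.log(1 - p_detect))

# Illustrative values: efficient searches (e.g. large hummocks) need far
# less replication than inefficient ones, which is the trade-off at issue.
high_quality = replicates_needed(0.5)   # -> 5 searches
low_quality = replicates_needed(0.1)    # -> 29 searches
```

The steep growth in required replication as p falls is what makes targeting low-quality microhabitats so costly in both money and habitat impact.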

  14. A sampling strategy to establish existing plant configuration baselines

    International Nuclear Information System (INIS)

    Buchanan, L.P.

    1995-01-01

    The Department of Energy's Gaseous Diffusion Plants (DOEGDP) are undergoing a Safety Analysis Update Program. As part of this program, critical existing structures are being reevaluated for Natural Phenomena Hazards (NPH) based on the recommendations of UCRL-15910. The Department of Energy has specified that current plant configurations be used in the performance of these reevaluations. This paper presents the process and results of a walkdown program implemented at DOEGDP to establish the current configuration baseline for these existing critical structures for use in subsequent NPH evaluations. These structures are classified as moderate hazard facilities and were constructed in the early 1950's. The process involved a statistical sampling strategy to determine the validity of critical design information as represented on the original design drawings such as member sizes, orientation, connection details and anchorage. A floor load inventory of the dead load of the equipment, both permanently attached and spare, was also performed as well as a walkthrough inspection of the overall structure to identify any other significant anomalies

  15. Effective sampling strategy to detect food and feed contamination

    NARCIS (Netherlands)

    Bouzembrak, Yamine; Fels, van der Ine

    2018-01-01

    Sampling plans for food safety hazards are aimed to be used to determine whether a lot of food is contaminated (with microbiological or chemical hazards) or not. One of the components of sampling plans is the sampling strategy. The aim of this study was to compare the performance of three

  16. A preliminary evaluation of comminution and sampling strategies for radioactive cemented waste

    Energy Technology Data Exchange (ETDEWEB)

    Bilodeau, M.; Lastra, R.; Bouzoubaa, N. [Natural Resources Canada, Ottawa, ON (Canada); Chapman, M. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2011-07-01

    Lixiviation of Hg, U and Cs contaminants and micro-encapsulation of cemented radioactive waste (CRW) are the two main components of a CRW stabilization research project carried out at Natural Resources Canada in collaboration with Atomic Energy of Canada Limited. Unmolding CRW from the storage pail, its fragmentation into a size range suitable for both processes and the collection of a representative sample are three essential steps for providing optimal material conditions for the two studies. Separation of wires, metals and plastic incorporated into CRW samples is also required. A comminution and sampling strategy was developed to address all those needs. Dust emissions and other health and safety concerns were given full consideration. Surrogate cemented waste (SCW) was initially used for this comminution study where Cu was used as a substitute for U and Hg. SCW was characterized as a friable material through the measurement of the Bond work index of 7.7 kWh/t. A mineralogical investigation and the calibration of material heterogeneity parameters of the sampling error model showed that Cu, Hg and Cs are finely disseminated in the cement matrix. A sampling strategy was built from the model and successfully validated with radioactive waste. A larger than expected sampling error was observed with U due to the formation of large U solid phases, which were not observed with the Cu tracer. SCW samples were crushed and ground under different rock fragmentation mechanisms: compression (jaw and cone crushers, rod mill), impact (ball mill), attrition, high voltage disintegration and high pressure water (and liquid nitrogen) jetting. Cryogenic grinding was also tested with the attrition mill. Crushing and grinding technologies were assessed against criteria that were gathered from literature surveys, experiential know-how and discussion with the client and field experts. Water jetting and its liquid nitrogen variant were retained for pail cutting and waste unmolding while

  17. A preliminary evaluation of comminution and sampling strategies for radioactive cemented waste

    International Nuclear Information System (INIS)

    Bilodeau, M.; Lastra, R.; Bouzoubaa, N.; Chapman, M.

    2011-01-01

    Lixiviation of Hg, U and Cs contaminants and micro-encapsulation of cemented radioactive waste (CRW) are the two main components of a CRW stabilization research project carried out at Natural Resources Canada in collaboration with Atomic Energy of Canada Limited. Unmolding CRW from the storage pail, its fragmentation into a size range suitable for both processes and the collection of a representative sample are three essential steps for providing optimal material conditions for the two studies. Separation of wires, metals and plastic incorporated into CRW samples is also required. A comminution and sampling strategy was developed to address all those needs. Dust emissions and other health and safety concerns were given full consideration. Surrogate cemented waste (SCW) was initially used for this comminution study where Cu was used as a substitute for U and Hg. SCW was characterized as a friable material through the measurement of the Bond work index of 7.7 kWh/t. A mineralogical investigation and the calibration of material heterogeneity parameters of the sampling error model showed that Cu, Hg and Cs are finely disseminated in the cement matrix. A sampling strategy was built from the model and successfully validated with radioactive waste. A larger than expected sampling error was observed with U due to the formation of large U solid phases, which were not observed with the Cu tracer. SCW samples were crushed and ground under different rock fragmentation mechanisms: compression (jaw and cone crushers, rod mill), impact (ball mill), attrition, high voltage disintegration and high pressure water (and liquid nitrogen) jetting. Cryogenic grinding was also tested with the attrition mill. Crushing and grinding technologies were assessed against criteria that were gathered from literature surveys, experiential know-how and discussion with the client and field experts. Water jetting and its liquid nitrogen variant were retained for pail cutting and waste unmolding while

  18. Novel strategies for sample preparation in forensic toxicology.

    Science.gov (United States)

    Samanidou, Victoria; Kovatsi, Leda; Fragou, Domniki; Rentifis, Konstantinos

    2011-09-01

    This paper provides a review of novel strategies for sample preparation in forensic toxicology. The review initially outlines the principle of each technique, followed by sections addressing each class of abused drugs separately. The novel strategies currently reviewed focus on the preparation of various biological samples for the subsequent determination of opiates, benzodiazepines, amphetamines, cocaine, hallucinogens, tricyclic antidepressants, antipsychotics and cannabinoids. According to our experience, these analytes are the most frequently responsible for intoxications in Greece. The applications of techniques such as disposable pipette extraction, microextraction by packed sorbent, matrix solid-phase dispersion, solid-phase microextraction, polymer monolith microextraction, stir bar sorptive extraction and others, which are rapidly gaining acceptance in the field of toxicology, are currently reviewed.

  19. RAMI strategies in the IFMIF Test Facilities design

    Energy Technology Data Exchange (ETDEWEB)

    Abal, Javier, E-mail: javier.abal@upc.edu [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Dies, Javier [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Arroyo, José Manuel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, 28040 Madrid (Spain); Bargalló, Enric [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Casal, Natalia; García, Ángela [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, 28040 Madrid (Spain); Martínez, Gonzalo; Tapia, Carlos; De Blas, Alfredo [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Mollá, Joaquín; Ibarra, Ángel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, 28040 Madrid (Spain)

    2013-10-15

    Highlights: • We have implemented fault tolerant design strategies so that the strong availability requirements are met. • The evolution to the present design of the signal and cooling lines inside the TTC has also been compared. • The RAMI analyses have demonstrated a strong capability in being a complementary tool in the design of IFMIF Test Facilities. -- Abstract: In this paper, a RAMI analysis of the different stages in Test Facilities (TF) design is described. The comparison between the availability results has been a milestone not only to evaluate the major unavailability contributors in the updates but also to implement fault tolerant design strategies when possible. These strategies encompass a wide range of design activities: from the definition of degraded modes of operation in the Test Facilities to specific modifications in the test modules in order to guarantee their fail safe operation.

  20. RAMI strategies in the IFMIF Test Facilities design

    International Nuclear Information System (INIS)

    Abal, Javier; Dies, Javier; Arroyo, José Manuel; Bargalló, Enric; Casal, Natalia; García, Ángela; Martínez, Gonzalo; Tapia, Carlos; De Blas, Alfredo; Mollá, Joaquín; Ibarra, Ángel

    2013-01-01

    Highlights: • We have implemented fault tolerant design strategies so that the strong availability requirements are met. • The evolution to the present design of the signal and cooling lines inside the TTC has also been compared. • The RAMI analyses have demonstrated a strong capability in being a complementary tool in the design of IFMIF Test Facilities. -- Abstract: In this paper, a RAMI analysis of the different stages in Test Facilities (TF) design is described. The comparison between the availability results has been a milestone not only to evaluate the major unavailability contributors in the updates but also to implement fault tolerant design strategies when possible. These strategies encompass a wide range of design activities: from the definition of degraded modes of operation in the Test Facilities to specific modifications in the test modules in order to guarantee their fail safe operation

  1. Limited-sampling strategies for anti-infective agents: systematic review.

    Science.gov (United States)

    Sprague, Denise A; Ensom, Mary H H

    2009-09-01

    Area under the concentration-time curve (AUC) is a pharmacokinetic parameter that represents overall exposure to a drug. For selected anti-infective agents, pharmacokinetic-pharmacodynamic parameters, such as AUC/MIC (where MIC is the minimal inhibitory concentration), have been correlated with outcome in a few studies. A limited-sampling strategy may be used to estimate pharmacokinetic parameters such as AUC, without the frequent, costly, and inconvenient blood sampling that would be required to directly calculate the AUC. To discuss, by means of a systematic review, the strengths, limitations, and clinical implications of published studies involving a limited-sampling strategy for anti-infective agents and to propose improvements in methodology for future studies. The PubMed and EMBASE databases were searched using the terms "anti-infective agents", "limited sampling", "optimal sampling", "sparse sampling", "AUC monitoring", "abbreviated AUC", "abbreviated sampling", and "Bayesian". The reference lists of retrieved articles were searched manually. Included studies were classified according to modified criteria from the US Preventive Services Task Force. Twenty studies met the inclusion criteria. Six of the studies (involving didanosine, zidovudine, nevirapine, ciprofloxacin, efavirenz, and nelfinavir) were classified as providing level I evidence, 4 studies (involving vancomycin, didanosine, lamivudine, and lopinavir-ritonavir) provided level II-1 evidence, 2 studies (involving saquinavir and ceftazidime) provided level II-2 evidence, and 8 studies (involving ciprofloxacin, nelfinavir, vancomycin, ceftazidime, ganciclovir, pyrazinamide, meropenem, and alpha interferon) provided level III evidence. All of the studies providing level I evidence used prospectively collected data and proper validation procedures with separate, randomly selected index and validation groups. However, most of the included studies did not provide an adequate description of the methods or
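The AUC that a limited-sampling strategy tries to approximate is conventionally computed by the linear trapezoidal rule over a full concentration-time profile. A minimal sketch with hypothetical concentration data (published strategies typically go further, regressing full AUC on a few concentrations in an index group and validating in a separate group):

```python
def trapezoid_auc(times, concs):
    """Linear trapezoidal AUC over a concentration-time profile."""
    return sum(0.5 * (concs[i] + concs[i - 1]) * (times[i] - times[i - 1])
               for i in range(1, len(times)))

# Hypothetical concentrations (mg/L) at hours post-dose
full_times = [0, 0.5, 1, 2, 4, 8, 12, 24]
full_concs = [0.0, 4.0, 6.0, 5.0, 3.0, 1.5, 0.8, 0.2]
auc_full = trapezoid_auc(full_times, full_concs)

# Applying the trapezoid directly to a sparse subset (1, 4, 12 h)
# underestimates exposure, which is why limited-sampling strategies
# use fitted equations rather than raw sparse trapezoids.
limited = [(t, c) for t, c in zip(full_times, full_concs) if t in (1, 4, 12)]
auc_limited = trapezoid_auc([t for t, _ in limited], [c for _, c in limited])
```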

  2. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    International Nuclear Information System (INIS)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.
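A rough intuition for why coherence governs sample requirements comes from the classical mutual coherence of a finite measurement matrix, a related (though not identical) quantity to the coherence parameter bounded in this abstract. A minimal, illustrative sketch:

```python
import math

def mutual_coherence(columns):
    """Mutual coherence of a measurement matrix, given as a list of
    column vectors: the largest normalized inner product between
    distinct columns. Lower coherence generally means fewer samples
    suffice for sparse recovery via l1-minimization."""
    def norm(v):
        return math.sqrt(sum(x * x for x in v))
    mu = 0.0
    for i in range(len(columns)):
        for j in range(i + 1, len(columns)):
            dot = sum(a * b for a, b in zip(columns[i], columns[j]))
            mu = max(mu, abs(dot) / (norm(columns[i]) * norm(columns[j])))
    return mu

orthogonal = [[1.0, 0.0], [0.0, 1.0]]   # ideal case: coherence 0
correlated = [[1.0, 0.0], [1.0, 1.0]]   # coherence 1/sqrt(2)
```

Choosing the sampling distribution to shrink such coherence-like quantities is the motivation behind the importance sampling and coherence-optimal MCMC schemes described above.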

  3. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    Science.gov (United States)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  4. A comparative proteomics method for multiple samples based on an 18O-reference strategy and a quantitation and identification-decoupled strategy.

    Science.gov (United States)

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

    Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method was criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample that was created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility in protein identification results across samples. In the present study, a method combining the 18O-reference strategy and a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than other previously used comparison methods based on transferring comparison or label-free strategies. By the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated to identify differential expressed proteins, according to retention time and accurate mass. This strategy made protein identification possible for all samples using a single pooled sample, and therefore gave a good reproducibility in protein identification across multiple samples, and allowed for optimizing peptide identification separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
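The arithmetic behind the pooled-reference comparison is simple ratio normalization: each sample is quantified as a ratio against the same 18O-labeled reference, so samples measured in different runs become comparable through those ratios. A minimal sketch with hypothetical peptide intensities:

```python
def relative_abundance(sample_intensities, reference_intensities):
    """Ratio of each peptide intensity in a sample to the corresponding
    intensity in the pooled 18O-labeled reference. Because every sample
    shares the same reference, ratios are directly comparable across
    samples and runs."""
    return [s / r for s, r in zip(sample_intensities, reference_intensities)]

# Hypothetical intensities for two peptides in two samples vs the reference
ratios_a = relative_abundance([200.0, 50.0], [100.0, 100.0])   # [2.0, 0.5]
ratios_b = relative_abundance([100.0, 100.0], [100.0, 100.0])  # [1.0, 1.0]

# Cross-sample fold changes fall out of the reference-normalized ratios
fold_changes = [a / b for a, b in zip(ratios_a, ratios_b)]     # [2.0, 0.5]
```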

  5. A Strategy for Uncertainty Visualization Design

    Science.gov (United States)

    2009-10-01

    143–156, Magdeburg, Germany. [11] Thomson, J., Hetzler, E., MacEachren, A., Gahegan, M. and Pavel, M. (2005), A Typology for Visualizing Uncertainty...and Stasko [20] to bridge analytic gaps in visualization design, when tasks in the strategy overlap (and therefore complement) design frameworks

  6. Sampling design for long-term regional trends in marine rocky intertidal communities

    Science.gov (United States)

    Irvine, Gail V.; Shelley, Alice

    2013-01-01

    Probability-based designs reduce bias and allow inference of results to the pool of sites from which they were chosen. We developed and tested probability-based designs for monitoring marine rocky intertidal assemblages at Glacier Bay National Park and Preserve (GLBA), Alaska. A multilevel design was used that varied in scale and inference. The levels included aerial surveys, extensive sampling of 25 sites, and more intensive sampling of 6 sites. Aerial surveys of a subset of intertidal habitat indicated that the original target habitat of bedrock-dominated sites with slope ≤30° was rare. This unexpected finding illustrated one value of probability-based surveys and led to a shift in the target habitat type to include steeper, more mixed rocky habitat. Subsequently, we evaluated the statistical power of different sampling methods and sampling strategies to detect changes in the abundances of the predominant sessile intertidal taxa: barnacles Balanomorpha, the mussel Mytilus trossulus, and the rockweed Fucus distichus subsp. evanescens. There was greatest power to detect trends in Mytilus and lesser power for barnacles and Fucus. Because of its greater power, the extensive, coarse-grained sampling scheme was adopted in subsequent years over the intensive, fine-grained scheme. The sampling attributes that had the largest effects on power included sampling of “vertical” line transects (vs. horizontal line transects or quadrats) and increasing the number of sites. We also evaluated the power of several management-set parameters. Given equal sampling effort, sampling more sites fewer times had greater power. The information gained through intertidal monitoring is likely to be useful in assessing changes due to climate, including ocean acidification; invasive species; trampling effects; and oil spills.
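The power comparisons above (more sites versus more visits, different taxa) rest on the standard relationship between effect size, variability, sample size, and power. A minimal normal-approximation sketch, with purely illustrative numbers rather than the study's estimates:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_sample(delta, sigma, n):
    """Normal-approximation power of a two-sided test (alpha = 0.05) to
    detect a mean difference delta between two independent surveys of n
    sites each, with common standard deviation sigma. Adding sites
    raises power, mirroring the design comparisons in the abstract."""
    z_crit = 1.959963984540054  # two-sided critical value, alpha = 0.05
    se = sigma * math.sqrt(2.0 / n)
    return phi(delta / se - z_crit)

p25 = power_two_sample(delta=0.5, sigma=1.0, n=25)
p50 = power_two_sample(delta=0.5, sigma=1.0, n=50)
```

Taxa with noisier counts (larger sigma relative to the change of interest, as for barnacles and Fucus here) need many more sites for the same power, which is why sampling more sites fewer times won out.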

  7. OUTPACE long duration stations: physical variability, context of biogeochemical sampling, and evaluation of sampling strategy

    Directory of Open Access Journals (Sweden)

    A. de Verneil

    2018-04-01

    Full Text Available Research cruises to quantify biogeochemical fluxes in the ocean require taking measurements at stations lasting at least several days. A popular experimental design is the quasi-Lagrangian drifter, often mounted with in situ incubations or sediment traps, that follows the flow of water over time. After initial drifter deployment, the ship tracks the drifter for continuing measurements that are supposed to represent the same water environment. An outstanding question is how best to determine whether this is true. During the Oligotrophy to UlTra-oligotrophy PACific Experiment (OUTPACE) cruise, from 18 February to 3 April 2015 in the western tropical South Pacific, three separate stations of long duration (five days) over the upper 500 m were conducted in this quasi-Lagrangian sampling scheme. Here we present physical data to provide context for these three stations and to assess whether the sampling strategy worked, i.e., that a single body of water was sampled. After analyzing tracer variability and local water circulation at each station, we identify water layers and times where the drifter risks encountering another body of water. While almost no realization of this sampling scheme will be truly Lagrangian, due to the presence of vertical shear, the depth-resolved observations during the three stations show that most layers sampled sufficiently homogeneous physical environments during OUTPACE. By directly addressing the concerns raised by these quasi-Lagrangian sampling platforms, a protocol of best practices can begin to be formulated so that future research campaigns include the complementary datasets and analyses presented here to verify the appropriate use of the drifter platform.

  8. Sampling strategies to measure the prevalence of common recurrent infections in longitudinal studies

    Directory of Open Access Journals (Sweden)

    Luby Stephen P

    2010-08-01

    Full Text Available Abstract Background Measuring recurrent infections such as diarrhoea or respiratory infections in epidemiological studies is a methodological challenge. Problems in measuring the incidence of recurrent infections include the episode definition, recall error, and the logistics of close follow-up. Longitudinal prevalence (LP), the proportion of time ill estimated by repeated prevalence measurements, is an alternative measure to incidence for recurrent infections. In contrast to incidence, which usually requires continuous sampling, LP can be measured at intervals. This study explored how many more participants are needed for infrequent sampling to achieve the same study power as frequent sampling. Methods We developed a set of four empirical simulation models representing low- and high-risk settings with short or long episode durations. The model was used to evaluate different sampling strategies with different assumptions on recall period and recall error. Results The model identified three major factors that influence sampling strategies: (1) the clustering of episodes in individuals; (2) the duration of episodes; (3) the positive correlation between an individual's disease incidence and episode duration. Intermittent sampling (e.g., 12 times per year) often requires only a slightly larger sample size compared to continuous sampling, especially in cluster-randomized trials. The collection of period prevalence data can lead to highly biased effect estimates if the exposure variable is associated with episode duration. To maximize study power, recall periods of 3 to 7 days may be preferable over shorter periods, even if this leads to inaccuracy in the prevalence estimates. Conclusion Choosing the optimal approach to measure recurrent infections in epidemiological studies depends on the setting, the study objectives, study design and budget constraints. Sampling at intervals can contribute to making epidemiological studies and trials more efficient and valid.
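The comparison of continuous daily surveillance with intermittent prevalence visits can be sketched in a few lines. The rates, episode durations, and the gamma clustering of per-person incidence below are invented illustrations, not the authors' simulation models:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_illness(n_people=200, days=365, mean_rate=6 / 365, mean_dur=3.0):
    """Daily illness matrix with person-level clustering: each person's
    episode rate is gamma-distributed, and episodes have random durations."""
    rates = rng.gamma(shape=2.0, scale=mean_rate / 2.0, size=n_people)
    ill = np.zeros((n_people, days), dtype=bool)
    for i, lam in enumerate(rates):
        n_ep = rng.poisson(lam * days)
        starts = rng.integers(0, days, size=n_ep)
        durs = 1 + rng.poisson(mean_dur - 1, size=n_ep)
        for s, d in zip(starts, durs):
            ill[i, s:s + d] = True   # mark the episode's sick days
    return ill

ill = simulate_illness()
lp_true = ill.mean()                               # continuous (daily) sampling
visit_days = np.linspace(0, 364, 12).astype(int)   # ~monthly visits
lp_monthly = ill[:, visit_days].mean()             # intermittent sampling
```

Comparing `lp_true` and `lp_monthly` across repeated runs (and across different clustering assumptions) is the kind of exercise used to judge how much sample size intermittent designs must add.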

  9. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    This report describes the experimental and sampling design developed to assess sampling approaches and methods for detecting contamination in a building and clearing the building for use after decontamination. An Idaho National Laboratory (INL) building will be contaminated with BG (Bacillus globigii, renamed Bacillus atrophaeus), a simulant for Bacillus anthracis (BA). The contamination, sampling, decontamination, and re-sampling will occur per the experimental and sampling design. This INL-2 Sample Collection Operational Test is being planned by the Validated Sampling Plan Working Group (VSPWG). The primary objectives are: 1) Evaluate judgmental and probabilistic sampling for characterization as well as probabilistic and combined (judgment and probabilistic) sampling approaches for clearance, 2) Conduct these evaluations for gradient contamination (from low or moderate down to absent or undetectable) for different initial concentrations of the contaminant, 3) Explore judgment composite sampling approaches to reduce sample numbers, 4) Collect baseline data to serve as an indication of the actual levels of contamination in the tests. A combined judgmental and random (CJR) approach uses Bayesian methodology to combine judgmental and probabilistic samples to make clearance statements of the form “X% confidence that at least Y% of an area does not contain detectable contamination” (X%/Y% clearance statements). The INL-2 experimental design has five test events, which 1) vary the floor of the INL building on which the contaminant will be released, 2) provide for varying the amount of contaminant released to obtain desired concentration gradients, and 3) investigate overt as well as covert release of contaminants. Desirable contaminant gradients would have moderate to low concentrations of contaminant in rooms near the release point, with concentrations down to zero in other rooms. Such gradients would provide a range of contamination levels to challenge the sampling methods.
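The purely probabilistic half of an X%/Y% clearance statement has a simple closed form under a uniform prior on the contaminated fraction (the full CJR method additionally folds judgmental samples into more elaborate Bayesian priors, which this sketch does not attempt). With a Beta(1,1) prior and n all-negative random samples, the posterior on the contaminated fraction is Beta(1, n+1):

```python
import math

def clearance_confidence(n_negative, y_pct):
    """P(at least y_pct% of the area is uncontaminated) after n_negative
    clean random samples, assuming a uniform Beta(1,1) prior on the
    contaminated fraction (posterior Beta(1, n+1), CDF in closed form)."""
    t = 1.0 - y_pct / 100.0            # allowable contaminated fraction
    return 1.0 - (1.0 - t) ** (n_negative + 1)

def samples_needed(x_pct, y_pct):
    """Smallest number of all-negative samples giving an X%/Y% statement."""
    t = 1.0 - y_pct / 100.0
    n = math.log(1.0 - x_pct / 100.0) / math.log(1.0 - t) - 1.0
    return math.ceil(n)
```

For a 95%/99% statement this gives 298 clean samples, close to the familiar "rule of three" scaling; the point of the CJR approach is that credible judgmental samples can shrink this number substantially.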

  10. Design Strategies for Aptamer-Based Biosensors

    Science.gov (United States)

    Han, Kun; Liang, Zhiqiang; Zhou, Nandi

    2010-01-01

    Aptamers have been widely used as recognition elements for biosensor construction, especially in the detection of proteins or small-molecule targets, and are regarded as promising alternatives to antibodies in bioassay areas. In this review, we present an overview of reported design strategies for the fabrication of biosensors and classify them into four basic modes: target-induced structure switching mode, sandwich or sandwich-like mode, target-induced dissociation/displacement mode and competitive replacement mode. In view of the unprecedented advantages brought about by aptamers and smart design strategies, aptamer-based biosensors are expected to be among the most promising devices in bioassay-related applications. PMID:22399891

  11. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling - based on the researcher's choice, e.g. a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has actually used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
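The difference between two of the probability designs mentioned (simple vs. proportionate stratified random sampling) can be illustrated in a few lines; the population of 1000 patients split across two hypothetical clinics is invented for the example:

```python
import random

random.seed(7)
population = [{"id": i, "clinic": "A" if i < 600 else "B"} for i in range(1000)]

# Simple random sample: every unit has the same chance of selection.
srs = random.sample(population, k=50)

# Proportionate stratified random sample: draw within each stratum
# in proportion to its share of the population (600 A : 400 B -> 30 : 20).
strata = {"A": [p for p in population if p["clinic"] == "A"],
          "B": [p for p in population if p["clinic"] == "B"]}
stratified = [p for name, group in strata.items()
              for p in random.sample(group, k=round(50 * len(group) / 1000))]
```

The stratified draw guarantees each clinic its proportional share, whereas the simple random sample only achieves that in expectation.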

  12. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice, e.g. a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has actually used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  13. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice, e.g. a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has actually used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of the results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  14. Strategies for Designing and Developing Services for Manufacturing Firms

    DEFF Research Database (Denmark)

    Tan, Adrian; Matzen, Detlef; McAloone, Tim C.

    2009-01-01

    Product/service-systems (PSS) are in effect an approach to designing integrated products and services with a focus on both customer activities and product life cycle considerations. Literature offers a range of service-oriented design strategies, from product-oriented DfX approaches to more customer-oriented approaches such as integrated solutions and service design. These design strategies are mapped out in relation to how applicable they are to different types of services. Case studies from two industrial companies are used to confront the existing literature in order to begin to understand how manufacturing companies may align their business strategies with their product and service development activities.

  15. Strategies for Designing and Developing Services for Manufacturing Firms

    DEFF Research Database (Denmark)

    Tan, Adrian Ronald; Matzen, Detlef; McAloone, Tim C.

    2010-01-01

    Product/service-systems (PSS) are in effect an approach to designing integrated products and services with a focus on both customer and product life cycle activities. Literature offers a range of service-oriented design strategies, from product-oriented DfX approaches to more customer-oriented approaches such as integrated solutions and service design. These design strategies are mapped out in relation to how applicable they are to different types of services. Case studies from two industrial companies are used to confront the existing literature in order to improve understanding of how manufacturing companies may align their product and service development activities with their business strategies.

  16. Sampling designs for contaminant temporal trend analyses using sedentary species exemplified by the snails Bellamya aeruginosa and Viviparus viviparus.

    Science.gov (United States)

    Yin, Ge; Danielsson, Sara; Dahlberg, Anna-Karin; Zhou, Yihui; Qiu, Yanling; Nyberg, Elisabeth; Bignert, Anders

    2017-10-01

    Environmental monitoring typically assumes samples and sampling activities to be representative of the population being studied. Given a limited budget, an appropriate sampling strategy is essential to support the detection of temporal trends of contaminants. In the present study, based on real chemical analysis data on polybrominated diphenyl ethers in snails collected from five subsites in Tianmu Lake, computer simulation is performed to evaluate three sampling strategies by estimating the sample size required to detect an annual change of 5% with a statistical power of 80% and 90% at a significance level of 5%. The results showed that sampling from an arbitrarily selected sampling spot is the worst strategy, requiring many more individual analyses to achieve the above-mentioned criteria compared with the other two approaches. A fixed sampling site requires the lowest sample size but may not be representative of the intended study object, e.g. a lake, and is also sensitive to changes at that particular sampling site. In contrast, sampling at multiple sites along the shore each year, and using pooled samples when the cost to collect and prepare individual specimens is much lower than the cost of chemical analysis, would be the most robust and cost-efficient strategy in the long run. Using statistical power as the criterion, the results demonstrated quantitatively the consequences of various sampling strategies, and could guide users with respect to the sample sizes required, depending on sampling design, for long-term monitoring programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
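The required-sample-size question can be sketched by Monte Carlo: find the smallest number of (pooled) samples analysed per year whose log-linear trend test reaches 80% power for a 5% annual change. The variance parameters below are invented for illustration, not taken from the Tianmu Lake data:

```python
import numpy as np

rng = np.random.default_rng(3)

def power_for_n(n_per_year, n_years=10, annual_change=0.05, cv=0.5,
                n_sim=300, z_crit=1.96):
    """Power of an OLS log-linear trend test when n_per_year samples are
    analysed each year, with lognormal measurement noise of the given CV."""
    t = np.arange(n_years, dtype=float)
    mu = np.log1p(annual_change) * t          # log-scale trend
    sd = np.sqrt(np.log1p(cv ** 2))           # lognormal sd implied by the CV
    x = np.repeat(t, n_per_year)
    xc = x - x.mean()
    hits = 0
    for _ in range(n_sim):
        y = rng.normal(np.repeat(mu, n_per_year), sd)
        b = (xc @ (y - y.mean())) / (xc @ xc)
        resid = (y - y.mean()) - b * xc
        se = np.sqrt(resid @ resid / (y.size - 2) / (xc @ xc))
        hits += abs(b / se) > z_crit
    return hits / n_sim

# Smallest per-year sample size (among candidates) reaching 80% power:
required = next(n for n in (1, 2, 4, 8, 16, 32) if power_for_n(n) >= 0.8)
```

Re-running this with the variance decomposed into within-site and between-site components is what distinguishes the three strategies compared in the paper.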

  17. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using Illumina platform.

    Science.gov (United States)

    Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga

    2015-01-01

    Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations cause substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications and in some other fields (e.g. detection of rare variants). Here, we discuss how both the quality problems of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during the initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and for lowering cluster density, which in our experience consistently produce high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. Alternatively, we discuss how …
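The barcode-design remedy mentioned above (avoiding low diversity at the design stage) amounts to choosing barcodes that are mutually distant and base-diverse at each cycle. A minimal sketch, not the authors' procedure: greedily accept random barcodes at pairwise Hamming distance ≥ 3 (so single sequencing errors cannot convert one barcode into another), then report per-cycle base balance.

```python
import random

random.seed(0)

def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

def pick_barcodes(length=6, n_wanted=8, min_dist=3, pool=5000):
    """Greedily pick barcodes with pairwise Hamming distance >= min_dist.
    Per-cycle base balance (all four bases present at every position,
    which helps Illumina cluster calling) is reported but not enforced."""
    chosen = []
    for _ in range(pool):
        cand = "".join(random.choices("ACGT", k=length))
        if all(hamming(cand, c) >= min_dist for c in chosen):
            chosen.append(cand)
        if len(chosen) == n_wanted:
            break
    balanced = all(len({bc[i] for bc in chosen}) == 4 for i in range(length))
    return chosen, balanced

barcodes, balanced = pick_barcodes()
```

A production design would also enforce the balance constraint and GC bounds; this sketch only checks and reports it.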

  18. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using Illumina platform.

    Directory of Open Access Journals (Sweden)

    Abhishek Mitra

    Full Text Available Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations cause substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications and in some other fields (e.g. detection of rare variants). Here, we discuss how both the quality problems of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during the initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and for lowering cluster density, which in our experience consistently produce high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. Alternatively …

  19. Measurement of radioactivity in the environment - Soil - Part 2: Guidance for the selection of the sampling strategy, sampling and pre-treatment of samples

    International Nuclear Information System (INIS)

    2007-01-01

    This part of ISO 18589 specifies the general requirements, based on ISO 11074 and ISO/IEC 17025, for all steps in the planning (desk study and area reconnaissance) of the sampling and the preparation of samples for testing. It includes the selection of the sampling strategy, the outline of the sampling plan, the presentation of general sampling methods and equipment, as well as the methodology of the pre-treatment of samples, adapted to measurements of the activity of radionuclides in soil. This part of ISO 18589 is addressed to people responsible for determining the radioactivity present in soil for the purpose of radiation protection. It is applicable to soil from gardens, farmland, urban or industrial sites, as well as soil not affected by human activities. It is applicable to all laboratories regardless of the number of personnel or the range of testing performed. When a laboratory does not undertake one or more of the activities covered by this part of ISO 18589, such as planning, sampling or testing, the corresponding requirements do not apply. Information is provided on scope, normative references, terms, definitions and symbols, principle, sampling strategy, sampling plan, sampling process, pre-treatment of samples and recorded information. Five annexes provide information on: selection of the sampling strategy according to the objectives and the radiological characterization of the site and sampling areas; a diagram of the evolution of sample characteristics from the sampling site to the laboratory; an example sampling plan for a site divided into three sampling areas; an example sampling record for a single/composite sample; and an example record for a soil profile with soil description. A bibliography is provided.

  20. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is to set up a system of linear equations for the fractional influence of the different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. The method is also compared to a numerical integration solution for a two-source situation in which source variability is also included. A general observation from this examination is that the variability of the source profiles affects not only the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
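The RS idea for the simplest case (two sources, one marker) can be sketched as follows. All numbers are invented: a hypothetical marker value in the mixture and normal distributions for each source's marker. Each random draw solves the mass-balance equation, so the resulting distribution of fractions reflects source variability, unlike the single plug-in solution:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical two-source, one-marker mixing problem (all values invented).
m_mix = -26.0                                # marker value measured in the mixture
s1 = rng.normal(-28.0, 1.0, size=100_000)    # draws from source 1's marker distribution
s2 = rng.normal(-22.0, 1.5, size=100_000)    # draws from source 2's marker distribution

f1 = (m_mix - s2) / (s1 - s2)                # mass-balance solution, one per draw
f1 = f1[(f1 >= 0) & (f1 <= 1)]               # keep physically meaningful fractions

# Plug-in solution that ignores source variability entirely:
plug_in = (m_mix - (-22.0)) / ((-28.0) - (-22.0))
rs_mean = f1.mean()
rs_ci = np.percentile(f1, [2.5, 97.5])
```

Comparing `plug_in` with `rs_mean` and `rs_ci` illustrates the paper's observation: source variability shifts not just the uncertainty but the central estimate itself.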

  1. A Fracture Mechanical Testing and Design Strategy for FRC Structures

    DEFF Research Database (Denmark)

    Stang, Henrik; Olesen, John Forbes

    1999-01-01

    A unified testing and design strategy for fibre reinforced concrete structures is summarised. The strategy is based on fracture mechanical concepts. Emphasis is placed on material characterisation and testing specifications.

  2. Economic Design of Acceptance Sampling Plans in a Two-Stage Supply Chain

    Directory of Open Access Journals (Sweden)

    Lie-Fern Hsu

    2012-01-01

    Full Text Available Supply chain management, which is concerned with material and information flows between facilities and the final customers, has been considered the most popular operations strategy for improving organizational competitiveness. With advances in computer technology, it has become easier to derive an acceptance sampling plan satisfying both the producer's and the consumer's quality and risk requirements. However, all the available QC tables and computer software determine the sampling plan on a non-economic basis. In this paper, we design an economic model to determine the optimal sampling plan in a two-stage supply chain, one that minimizes the producer's and the consumer's total quality cost while satisfying both parties' quality and risk requirements. Numerical examples show that the optimal sampling plan is quite sensitive to the producer's product quality. The product's inspection, internal failure, and post-sale failure costs also affect the optimal sampling plan.
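The general shape of such an economic search can be sketched as follows. This is not the paper's model: the cost structure (inspection, internal failure, post-sale failure) and all parameter values are illustrative assumptions. The search enumerates single-sampling plans (n, c) that meet both the producer's risk at the AQL and the consumer's risk at the LTPD, then picks the cheapest under the assumed cost model:

```python
from math import comb

def oc(n, c, p):
    """Probability of accepting a lot with defect rate p under plan (n, c)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

def economic_plan(aql=0.01, ltpd=0.06, alpha=0.05, beta=0.10, p=0.02,
                  c_inspect=1.0, c_internal=10.0, c_postsale=100.0,
                  lot_size=1000, n_max=400):
    """Search (n, c) plans meeting both risk constraints and return the one
    with the lowest expected total quality cost (illustrative cost model)."""
    best = None
    for n in range(1, n_max + 1):
        for c in range(0, min(n, 10) + 1):
            if oc(n, c, aql) < 1 - alpha or oc(n, c, ltpd) > beta:
                continue                       # plan violates a risk requirement
            pa = oc(n, c, p)                   # acceptance probability at true quality p
            cost = (n * c_inspect                                 # inspection cost
                    + (1 - pa) * (lot_size - n) * p * c_internal  # screening rejected lots
                    + pa * (lot_size - n) * p * c_postsale)       # defects reaching customers
            if best is None or cost < best[0]:
                best = (cost, n, c)
    return best
```

Under these illustrative numbers the risk constraints alone already rule out any plan with c ≤ 2, which is why economic comparisons start from fairly large n.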

  3. Empathic design: Research strategies

    Directory of Open Access Journals (Sweden)

    Joyce Thomas

    2013-01-01

    Full Text Available This paper explores the role of empathy within new product development from the perspective of human-centred design. The authors have developed a range of empathic design tools and strategies that help to identify authentic human needs. For products and services to be effective, they need to satisfy both functional and emotional needs of individuals. In addition, the individual user needs to feel that the product and/or service has been designed ‘just for them’, otherwise they may misuse, underuse or abandon the product/service. This becomes critical with a product such as a Zimmer frame (walker), when it fails to resonate with the patient due to any stigma the patient may perceive, and thus remains unused. When training young designers to consider the wider community (people unlike themselves) during the design process, it has proven extremely valuable to take them outside their comfort zones, by seeking to develop empathy with the end user for whom they are designing. Empathic modelling offers designers the opportunity to develop greater insight and understanding, in order to support more effective design outcomes. Sensitising designers to the different ways that individuals complete daily tasks has helped to diminish the gap between themselves and others (e.g. people with disabilities). The authors intend for this paper to resonate with health care providers. Human-centred design can help to refocus the designer, by placing the individual end user’s needs at the heart of their decision-making.

  4. Empathic design: Research strategies.

    Science.gov (United States)

    Thomas, Joyce; McDonagh, Deana

    2013-01-01

    This paper explores the role of empathy within new product development from the perspective of human-centred design. The authors have developed a range of empathic design tools and strategies that help to identify authentic human needs. For products and services to be effective, they need to satisfy both functional and emotional needs of individuals. In addition, the individual user needs to feel that the product and/or service has been designed 'just for them', otherwise they may misuse, underuse or abandon the product/service. This becomes critical with a product such as a Zimmer frame (walker), when it fails to resonate with the patient due to any stigma the patient may perceive, and thus remains unused. When training young designers to consider the wider community (people unlike themselves) during the design process, it has proven extremely valuable to take them outside their comfort zones, by seeking to develop empathy with the end user for whom they are designing. Empathic modelling offers designers the opportunity to develop greater insight and understanding, in order to support more effective design outcomes. Sensitising designers to the different ways that individuals complete daily tasks has helped to diminish the gap between themselves and others (e.g. people with disabilities). The authors intend for this paper to resonate with health care providers. Human-centred design can help to refocus the designer, by placing the individual end user's needs at the heart of their decision-making.

  5. Design Requirements, Epistemic Uncertainty and Solution Development Strategies in Software Design

    DEFF Research Database (Denmark)

    Ball, Linden J.; Onarheim, Balder; Christensen, Bo Thomas

    2010-01-01

    This paper investigates the potential involvement of “epistemic uncertainty” in mediating between complex design requirements and strategic switches in software design strategies. The analysis revealed that the designers produced an initial “first-pass” solution to the given design brief in a breadth-first manner. The findings support a view of software design as involving a mixed breadth-first and depth-first solution development approach, with strategic switching to depth-first design being triggered by requirement complexity and being mediated by associated feelings of uncertainty.

  6. Developing an Integrated Design Strategy for Chip Layout Optimization

    NARCIS (Netherlands)

    Wits, Wessel Willems; Jauregui Becker, Juan Manuel; van Vliet, Frank Edward; te Riele, G.J.

    2011-01-01

    This paper presents an integrated design strategy for chip layout optimization. The strategy couples electric and thermal aspects during the conceptual design phase to improve chip performance, thermal management being one of the major topics. The layout of the chip circuitry is optimized accordingly.

  7. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Directory of Open Access Journals (Sweden)

    Lauren Hund

    Full Text Available Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.
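The "standard binomial model" baseline mentioned above (the case where clustering can be ignored) reduces to finding a decision rule d for a fixed sample size n that bounds both misclassification risks. A minimal sketch with conventional LQAS thresholds (80% upper, 50% lower coverage):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_rule(n, p_upper=0.8, p_lower=0.5, alpha=0.10, beta=0.10):
    """Smallest decision rule d such that classifying a lot as acceptable
    when >= d of n sampled individuals are 'successes' bounds both errors:
      P(< d successes | true coverage p_upper) <= alpha  (false failure)
      P(>= d successes | true coverage p_lower) <= beta  (false pass)."""
    for d in range(n + 1):
        false_fail = binom_cdf(d - 1, n, p_upper)
        false_pass = 1 - binom_cdf(d - 1, n, p_lower)
        if false_fail <= alpha and false_pass <= beta:
            return d
    return None  # no rule satisfies both risks at this n
```

With n = 19 this recovers the classic decision rule of 13; the cluster LQAS methods compared in the paper replace the binomial with beta-binomial-type models, which typically inflate the required sample size.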

  8. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Science.gov (United States)

    Hund, Lauren; Bedrick, Edward J; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  9. A weighted sampling algorithm for the design of RNA sequences with targeted secondary structure and nucleotide distribution.

    Science.gov (United States)

    Reinharz, Vladimir; Ponty, Yann; Waldispühl, Jérôme

    2013-07-01

    The design of RNA sequences folding into predefined secondary structures is a milestone for many synthetic biology and gene therapy studies. Most of the current software uses similar local search strategies (i.e. a random seed is progressively adapted to acquire the desired folding properties) and, more importantly, does not allow the user to explicitly control the nucleotide distribution, such as the GC-content, of the designed sequences. However, the latter is an important criterion for large-scale applications as it could presumably be used to design sequences with better transcription rates and/or structural plasticity. In this article, we introduce IncaRNAtion, a novel algorithm to design RNA sequences folding into target secondary structures with a predefined nucleotide distribution. IncaRNAtion uses a global sampling approach and weighted sampling techniques. We show that our approach is fast (i.e. running time comparable to or better than local search methods), seedless (we remove the bias of the seed in local search heuristics) and successfully generates high-quality sequences (i.e. thermodynamically stable) for any GC-content. To complete this study, we develop a hybrid method combining our global sampling approach with local search strategies. Remarkably, our glocal methodology outperforms both local and global approaches for sampling sequences with a specific GC-content and target structure. IncaRNAtion is available at csb.cs.mcgill.ca/incarnation/. Supplementary data are available at Bioinformatics online.
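
    The weighted-sampling idea can be illustrated in miniature (this is not IncaRNAtion's actual algorithm, which samples whole sequences compatible with a target structure): give G and C a multiplicative weight chosen so that the expected GC-content of a sampled sequence hits the target.

```python
import random

def gc_weight(target_gc):
    # Weight w on G/C (with A/U weighted 1) so that the expected GC-content
    # is the target: E[GC] = 2w / (2w + 2) = w / (w + 1)  =>  w = t / (1 - t)
    return target_gc / (1.0 - target_gc)

def sample_sequence(length, target_gc, rng):
    # Draw each nucleotide independently with the GC-biased weights
    w = gc_weight(target_gc)
    alphabet = ["A", "U", "G", "C"]
    weights = [1.0, 1.0, w, w]
    return "".join(rng.choices(alphabet, weights=weights, k=length))
```

    In the real setting the weights enter a structure-constrained partition function rather than independent draws, but the calibration of a weight to a target distribution is the same principle.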

  10. Sampling strategies to capture single-cell heterogeneity

    OpenAIRE

    Satwik Rajaram; Louise E. Heinrich; John D. Gordan; Jayant Avva; Kathy M. Bonness; Agnieszka K. Witkiewicz; James S. Malter; Chloe E. Atreya; Robert S. Warren; Lani F. Wu; Steven J. Altschuler

    2017-01-01

    Advances in single-cell technologies have highlighted the prevalence and biological significance of cellular heterogeneity. A critical question is how to design experiments that faithfully capture the true range of heterogeneity from samples of cellular populations. Here, we develop a data-driven approach, illustrated in the context of image data, that estimates the sampling depth required for prospective investigations of single-cell heterogeneity from an existing collection of samples. ...
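
    One way to read "estimates the sampling depth required ... from an existing collection of samples" is to resample the collection at increasing sizes until a summary statistic stabilizes. The sketch below uses the median of a scalar measurement as a toy stand-in for the paper's image-derived heterogeneity metrics; the function name and the stopping criterion are illustrative assumptions.

```python
import random
import statistics

def sampling_depth(population, tol, rng,
                   sizes=(10, 20, 50, 100, 200, 500), reps=50):
    """Smallest subsample size at which the mean absolute deviation of the
    subsample median from the full-collection median falls below `tol`
    (a toy criterion; a real heterogeneity metric would replace the median)."""
    full_median = statistics.median(population)
    for n in sizes:
        if n > len(population):
            break
        err = statistics.mean(
            abs(statistics.median(rng.sample(population, n)) - full_median)
            for _ in range(reps)
        )
        if err < tol:
            return n
    return None
```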

  11. 30 CFR 71.208 - Bimonthly sampling; designated work positions.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bimonthly sampling; designated work positions... UNDERGROUND COAL MINES Sampling Procedures § 71.208 Bimonthly sampling; designated work positions. (a) Each... standard when quartz is present), respirable dust sampling of designated work positions shall begin on the...

  12. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    OpenAIRE

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we comp...

  13. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    Science.gov (United States)

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration. These data…
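
    The benefit of spreading tagging effort can be reproduced with a toy deterministic calculation, assuming perfect detection and the linearly increasing survival pattern (one of the four patterns the study considers); the season length and survival range below are illustrative assumptions.

```python
def survival(day, season_len=100):
    # Toy linearly increasing daily survival probability, 0.2 -> 0.8 over the season
    return 0.2 + 0.6 * day / (season_len - 1)

def expected_estimate(tag_days, season_len=100):
    # Expected survival estimate when fish are tagged on `tag_days`,
    # assuming perfect detection (each tagged fish contributes its day's survival)
    return sum(survival(d, season_len) for d in tag_days) / len(tag_days)

front_loaded = expected_estimate(range(10))        # all tags in the first 10 days
spread = expected_estimate(range(0, 100, 10))      # tags spread across the season
truth = expected_estimate(range(100))              # season-wide mean survival
```

    Front-loaded tagging is badly biased under this pattern, while spreading the same number of tags across the season lands close to the seasonal mean, matching the simulation's conclusion.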

  14. Multimedia Matrix: A Cognitive Strategy for Designers.

    Science.gov (United States)

    Sherry, Annette C.

    This instructional development project evaluates the effect of a matrix-based strategy to assist multimedia authors in acquiring and applying principles for effective multimedia design. The Multimedia Matrix, based on the Park and Hannafin "Twenty Principles and Implications for Interactive Multimedia" design, displays a condensed…

  15. Using Linked Survey Paradata to Improve Sampling Strategies in the Medical Expenditure Panel Survey

    Directory of Open Access Journals (Sweden)

    Mirel Lisa B.

    2017-06-01

    Full Text Available Using paradata from a prior survey that is linked to a new survey can help a survey organization develop more effective sampling strategies. One example of this type of linkage or subsampling is between the National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS). MEPS is a nationally representative sample of the U.S. civilian, noninstitutionalized population based on a complex multi-stage sample design. Each year a new sample is drawn as a subsample of households from the prior year’s NHIS. The main objective of this article is to examine how paradata from a prior survey can be used in developing a sampling scheme in a subsequent survey. A framework for optimal allocation of the sample in substrata formed for this purpose is presented and evaluated for the relative effectiveness of alternative substratification schemes. The framework is applied, using real MEPS data, to illustrate how utilizing paradata from the linked survey offers the possibility of making improvements to the sampling scheme for the subsequent survey. The improvements aim to reduce the data collection costs while maintaining or increasing effective responding sample sizes and response rates for a harder-to-reach population.
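
    A cost-aware Neyman-type allocation is one standard way to formalize "optimal allocation of the sample in substrata": allocate in proportion to N_h·S_h/√c_h, where N_h is stratum size, S_h the stratum standard deviation, and c_h the per-unit cost. This is the textbook formula, not necessarily the exact MEPS scheme.

```python
from math import sqrt

def optimal_allocation(strata, n_total):
    """Cost-adjusted Neyman allocation: n_h proportional to N_h * S_h / sqrt(c_h).
    `strata` maps stratum name -> (N_h, S_h, c_h)."""
    score = {h: N * S / sqrt(c) for h, (N, S, c) in strata.items()}
    total = sum(score.values())
    return {h: n_total * s / total for h, s in score.items()}
```

    Paradata enter by informing the S_h (e.g. expected response propensity variation) and c_h (e.g. expected contact effort) used for each substratum.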

  16. Accuracy assessment of the National Forest Inventory map of Mexico: sampling designs and the fuzzy characterization of landscapes

    Directory of Open Access Journals (Sweden)

    Stéphane Couturier

    2009-10-01

    Full Text Available There is no record so far in the literature of a comprehensive method to assess the accuracy of regional scale Land Cover/Land Use (LCLU) maps in the sub-tropical belt. The elevated biodiversity and the presence of highly fragmented classes hamper the use of sampling designs commonly employed in previous assessments of mainly temperate zones. A sampling design for assessing the accuracy of the Mexican National Forest Inventory (NFI) map at community level is presented. A pilot study was conducted on the Cuitzeo Lake watershed region covering 400 000 ha of the 2000 Landsat-derived map. Various sampling designs were tested in order to find a trade-off between operational costs, a good spatial distribution of the sample and the inclusion of all scarcely distributed classes (‘rare classes’). A two-stage sampling design where the selection of Primary Sampling Units (PSUs) was done under separate schemes for commonly and scarcely distributed classes showed the best characteristics. A total of 2 023 punctual secondary sampling units were verified against their NFI map label. Issues regarding the assessment strategy and trends of class confusions are devised.
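
    A hedged sketch of what "separate schemes for commonly and scarcely distributed classes" could look like (one plausible reading, not necessarily the authors' exact design): take all PSUs containing a rare class, and draw a simple random sample from the remaining PSUs.

```python
import random

def select_psus(psus, rare_classes, n_common, rng):
    """Two-scheme PSU selection: every PSU containing a rare class is kept
    (take-all), and a simple random sample of size n_common is drawn from
    the rest. `psus` maps PSU id -> set of classes present in that PSU."""
    rare = [p for p, classes in psus.items() if classes & rare_classes]
    common = [p for p in psus if p not in rare]
    return rare + rng.sample(common, min(n_common, len(common)))
```

    The take-all scheme guarantees rare classes enter the accuracy assessment, which a single-scheme random draw would frequently miss.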

  17. Sampling Design of Soil Physical Properties in a Conilon Coffee Field

    Directory of Open Access Journals (Sweden)

    Eduardo Oliveira de Jesus Santos

    Full Text Available ABSTRACT Establishing the number of samples required to determine values of soil physical properties ultimately results in optimization of labor and allows better representation of such attributes. The objective of this study was to analyze the spatial variability of soil physical properties in a Conilon coffee field and propose a soil sampling method better attuned to conditions of the management system. The experiment was performed in a Conilon coffee field in Espírito Santo state, Brazil, under a 3.0 × 2.0 × 1.0 m (4,000 plants ha-1) double spacing design. An irregular grid, with dimensions of 107 × 95.7 m and 65 sampling points, was set up. Soil samples were collected from the 0.00-0.20 m depth from each sampling point. Data were analyzed under descriptive statistical and geostatistical methods. Using statistical parameters, the adequate number of samples for analyzing the attributes under study was established, which ranged from 1 to 11 sampling points. With the exception of particle density, all soil physical properties showed a spatial dependence structure best fitted to the spherical model. Establishment of the number of samples and spatial variability for the physical properties of soils may be useful in developing sampling strategies that minimize costs for farmers within a tolerable and predictable level of error.
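
    Establishing "the adequate number of samples ... using statistical parameters" is classically done with n = (t·CV/E)², where CV is the coefficient of variation and E the tolerable error. The code below is the textbook formula, not necessarily the exact procedure used in this study.

```python
from math import ceil

def n_required(cv_percent, allowable_error_percent, t=2.0):
    # Classical sample-size formula n = (t * CV / E)^2,
    # with CV and E both in percent and t ~ 2 for ~95% confidence
    return ceil((t * cv_percent / allowable_error_percent) ** 2)
```

    For example, a property with CV = 20% estimated to within 10% of the mean needs 16 samples, while CV = 5% needs only one, which is consistent with the wide 1-to-11 range reported across attributes.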

  18. Assessment of sampling strategies for estimation of site mean concentrations of stormwater pollutants.

    Science.gov (United States)

    McCarthy, David T; Zhang, Kefeng; Westerlund, Camilla; Viklander, Maria; Bertrand-Krajewski, Jean-Luc; Fletcher, Tim D; Deletic, Ana

    2018-02-01

    The estimation of stormwater pollutant concentrations is a primary requirement of integrated urban water management. In order to determine effective sampling strategies for estimating pollutant concentrations, data from extensive field measurements at seven different catchments was used. At all sites, 1-min resolution continuous flow measurements, as well as flow-weighted samples, were taken and analysed for total suspended solids (TSS), total nitrogen (TN) and Escherichia coli (E. coli). For each of these parameters, the data was used to calculate the Event Mean Concentrations (EMCs) for each event. The measured Site Mean Concentrations (SMCs) were taken as the volume-weighted average of these EMCs for each parameter, at each site. 17 different sampling strategies, including random and fixed strategies, were tested to estimate SMCs, which were compared with the measured SMCs. The ratios of estimated/measured SMCs were further analysed to determine the most effective sampling strategies. Results indicate that the random sampling strategies were the most promising method in reproducing SMCs for TSS and TN, while some fixed sampling strategies were better for estimating the SMC of E. coli. The differences in taking one, two or three random samples were small (up to 20% for TSS, and 10% for TN and E. coli), indicating that there is little benefit in investing in collection of more than one sample per event if attempting to estimate the SMC through monitoring of multiple events. It was estimated that an average of 27 events across the studied catchments are needed for characterising SMCs of TSS with a 90% confidence interval (CI) width of 1.0, followed by E. coli (average 12 events) and TN (average 11 events). The coefficient of variation of pollutant concentrations was linearly and significantly correlated with the 90% confidence interval ratio of the estimated/measured SMCs (R2 = 0.49; P < 0.05), and can thus be used to estimate the sampling frequency needed to accurately estimate SMCs of pollutants.
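
    The EMC and SMC definitions used above translate directly into code: an EMC is the flow-weighted mean concentration within one event, and the SMC is the volume-weighted mean of EMCs across events.

```python
def event_mean_concentration(flows, concentrations):
    # Flow-weighted EMC for one event: sum(Q_i * C_i) / sum(Q_i)
    load = sum(q * c for q, c in zip(flows, concentrations))
    return load / sum(flows)

def site_mean_concentration(events):
    # Volume-weighted SMC across events; each event is (event volume, EMC)
    total_volume = sum(v for v, _ in events)
    return sum(v * emc for v, emc in events) / total_volume
```

    A sampling strategy is then judged by how closely EMCs computed from its subsampled concentrations reproduce the SMC computed from the full flow-weighted record.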

  19. Thermally adapted design strategy of colonial houses in Surabaya

    Science.gov (United States)

    Antaryama, I. G. N.; Ekasiwi, S. N. N.; Mappajaya, A.; Ulum, M. S.

    2018-03-01

    Colonial buildings, including houses, have been considered as a representation of climate-responsive architecture. The design was thought to be a hybrid model of Dutch and tropical architecture. It was created by way of reinventing tropical and Dutch architectural design principles, and expressed in a new form, i.e. neither resembling a Dutch nor a tropical building. Aside from this new image, the colonial house does show good climatic responses. Previous research on colonial houses has generally focused on qualitative assessment of the climate performance of the building, and has tended to concentrate on building elements, e.g. walls, windows, etc. The present study is designed to give a more complete picture of the architectural design strategy of the house by exploring and analysing the thermal performance of colonial buildings and their related architectural design strategies. Field measurements were conducted during the dry season in several colonial buildings in Surabaya. Air temperature and humidity were both taken, representing internal and external thermal conditions of the building. These data were then evaluated to determine the thermal performance of the house. Finally, various design strategies were examined in order to reveal their significant contributions to its thermal performance. Results of the study in Surabaya confirm the findings of previous research conducted in other locations: thermal performance of the house is generally good. Passive design strategies such as mass effect and ventilation play an important role in determining the performance of the building.

  20. Sequential ensemble-based optimal design for parameter estimation: SEQUENTIAL ENSEMBLE-BASED OPTIMAL DESIGN

    Energy Technology Data Exchange (ETDEWEB)

    Man, Jun [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Zhang, Jiangjiang [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Li, Weixuan [Pacific Northwest National Laboratory, Richland Washington USA; Zeng, Lingzao [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside California USA

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
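
    Under a Gaussian approximation, the Shannon entropy difference between prior and posterior reduces to 0.5·ln(var_prior/var_post), and the optimal next measurement is the candidate that maximizes it. The toy single-parameter ensemble version below is a sketch of that selection step; the function names and candidate ensembles are illustrative, not from the paper.

```python
import statistics
from math import log

def entropy_difference(prior_ensemble, posterior_ensemble):
    # Gaussian-approximate Shannon entropy difference between the prior and
    # posterior parameter ensembles: 0.5 * ln(var_prior / var_post)
    return 0.5 * log(statistics.variance(prior_ensemble) /
                     statistics.variance(posterior_ensemble))

def pick_design(prior_ensemble, posterior_by_design):
    # Choose the candidate measurement whose anticipated posterior ensemble
    # (e.g. from an EnKF update per design) yields the largest entropy difference
    return max(posterior_by_design,
               key=lambda d: entropy_difference(prior_ensemble,
                                                posterior_by_design[d]))
```

    In the sequential scheme, the chosen measurement is assimilated with the EnKF and the selection is repeated with the updated ensemble.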

  1. Investigation of Architectural Strategies in Relation to Daylight and Integrated Design

    DEFF Research Database (Denmark)

    Jørgensen, Michael; Iversen, Anne; Bjerregaard Jensen, Lotte

    2012-01-01

    This paper investigates the use of daylight in three architecturally successful buildings. The aim is to discuss the challenges and opportunities of architectural daylight strategies in relation to integrated design. All these buildings were designed with the focus on a strategy of using daylight to create well-lit, exciting spaces and spatial sequences. The original ideas, thoughts, and decisions behind the designs and daylight strategy are compared with answers in questionnaires from test subjects who have experienced the space and lighting conditions created. The results indicate that the architectural daylight strategies formulated by the architects and engineers at the beginning of the design process are actually experienced by the “users” in the existing buildings. The architectural daylight strategy was different in each of the three libraries, and analysis of the results shows that daylight…

  2. Strategy update in a design and consulting company

    OpenAIRE

    Kiviniemi, Juha

    2017-01-01

    The target company of the present thesis is a small engineering company specialising in electrical design. The company’s current development stage and changes in its operating environment call for a strategy update. The objective of this study was to formulate key conclusions, based on an analysis of the company’s current strategy and its operating environment, which will later act as an input to the company’s strategy process. The strategy process itself is left outside the scope of this...

  3. Strategies for the design of bright upconversion nanoparticles for bioanalytical applications

    Science.gov (United States)

    Wiesholler, Lisa M.; Hirsch, Thomas

    2018-06-01

    In recent years, upconversion nanoparticles (UCNPs) have received great attention because of their outstanding optical properties. Especially in bioanalytical applications, this class of materials can overcome limitations of common probes such as high background fluorescence or blinking. Nevertheless, the requirements for UCNPs to be applicable in biological samples, e.g. small size, water-dispersibility, and excitation at low power density, conflict with the demand for high brightness. Therefore, much attention is paid to the enhancement of upconversion luminescence. This review discusses recent trends and strategies to boost the brightness of UCNPs, classified into three main directions: a) improving the efficiency of energy absorption by the sensitizer via coupling to plasmonic or photonic structures or via attachment of ligands for light harvesting; b) minimizing non-radiative deactivation by variations in the architecture of UCNPs; and c) changing the excitation wavelength to obtain bright particles at low excitation power density for applications in aqueous systems. These strategies are critically reviewed, including current limitations as well as future perspectives for the design of efficient UCNPs, especially for sensing applications in biological samples or cells.

  4. On efficiency of some ratio estimators in double sampling design ...

    African Journals Online (AJOL)

    In this paper, three ratio estimators in double sampling design were proposed with the intention of finding an alternative double sampling design estimator to the conventional ratio estimator in double sampling design discussed by Cochran (1997), Okafor (2002), Raj (1972) and Raj and Chandhok (1999).
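
    The conventional ratio estimator in double (two-phase) sampling, against which such alternatives are compared, uses a large first-phase sample to estimate the mean of a cheap auxiliary variable x and a small second-phase subsample (observing both x and y) to supply the ratio; a minimal sketch:

```python
def double_sampling_ratio(x_first_phase, xy_second_phase):
    """Conventional ratio estimator under double sampling:
    estimate = (ybar2 / xbar2) * xbar1, where xbar1 comes from the large
    first-phase sample and (xbar2, ybar2) from the second-phase subsample."""
    xbar1 = sum(x_first_phase) / len(x_first_phase)
    xs = [x for x, _ in xy_second_phase]
    ys = [y for _, y in xy_second_phase]
    xbar2 = sum(xs) / len(xs)
    ybar2 = sum(ys) / len(ys)
    return (ybar2 / xbar2) * xbar1
```

    The estimator gains over the plain second-phase mean when y and x are strongly proportionally related.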

  5. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  6. Chapter 2: Sampling strategies in forest hydrology and biogeochemistry

    Science.gov (United States)

    Roger C. Bales; Martha H. Conklin; Branko Kerkez; Steven Glaser; Jan W. Hopmans; Carolyn T. Hunsaker; Matt Meadows; Peter C. Hartsough

    2011-01-01

    Many aspects of forest hydrology have been based on accurate but not necessarily spatially representative measurements, reflecting the measurement capabilities that were traditionally available. Two developments are bringing about fundamental changes in sampling strategies in forest hydrology and biogeochemistry: (a) technical advances in measurement capability, as is...

  7. Sampling strategy to develop a primary core collection of apple ...

    African Journals Online (AJOL)

    2010-01-11

  8. MSM actuators: design rules and control strategies

    Energy Technology Data Exchange (ETDEWEB)

    Holz, Benedikt; Janocha, Hartmut [Laboratory of Process Automation (LPA), Saarland University, Saarbruecken (Germany); Riccardi, Leonardo; Naso, David [Department of Electronics and Electrical Science (DEE), Politecnico di Bari (Italy)

    2012-08-15

    Magnetic shape memory (MSM) alloys are comparatively new active materials which can be used for several industrial applications, ranging from precise positioning systems to advanced robotics. Beyond the material research, which deals with the basic thermo-magneto-mechanical properties of the crystals, the design as well as the control of the actuator displacement is an essential challenge. This paper addresses those two topics, trying to give the reader a useful overview of existing results, but also presents new ideas. First, it introduces and discusses in detail some possible designs, with a special emphasis on innovative actuator design concepts which are able to exploit the particular potentialities of MSM elements. The second focus of the paper is on the problem of designing a controller, i.e., an algorithm that allows the required performance to be obtained from the actuator. The proposed control strategies try to take into account two main characteristics of MSM elements: the hysteresis and the temperature dependence. The effectiveness of the strategies is demonstrated by experimental results performed on a commercially available MSM actuator demonstrator.

  9. Design compliance matrix waste sample container filling system for nested, fixed-depth sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This design compliance matrix document provides specific design related functional characteristics, constraints, and requirements for the container filling system that is part of the nested, fixed-depth sampling system. This document addresses performance, external interfaces, ALARA, Authorization Basis, environmental and design code requirements for the container filling system. The container filling system will interface with the waste stream from the fluidic pumping channels of the nested, fixed-depth sampling system and will fill containers with waste that meet the Resource Conservation and Recovery Act (RCRA) criteria for waste that contains volatile and semi-volatile organic materials. The specifications for the nested, fixed-depth sampling system are described in a Level 2 Specification document (HNF-3483, Rev. 1). The basis for this design compliance matrix document is the Tank Waste Remediation System (TWRS) desk instructions for design Compliance matrix documents (PI-CP-008-00, Rev. 0)

  10. Passive Design Strategies to Enhance Natural Ventilation in Buildings "Election of Passive Design Strategies to Achieve Natural Ventilation in Iraqi Urban Environment with Hot Arid Climate"

    Directory of Open Access Journals (Sweden)

    Ghada M.Ismael Abdul Razzaq Kamoona

    2016-06-01

    Full Text Available Natural ventilation in buildings is one of the effective strategies for achieving energy efficiency through methods of passive design; it is also efficient in providing occupants with a high range of thermal comfort and raising their productivity. Because many people take the concept of natural ventilation to mean only windows and openings, this research demonstrates the variety of passive design strategies for natural ventilation. The research problem is: insufficient knowledge about the importance and mechanism of applying passive design strategies for natural ventilation in buildings. The research objective is: analysis of passive design strategies to achieve natural ventilation in buildings, for the purpose of their proper selection for the Iraqi urban environment. Accordingly, the research comprises two parts. First, the theoretical part, which deals with the conceptual framework of natural ventilation and derives its most important aspects, adopted as a base for the practical part of the research. Second, the practical part, which analyses examples of building projects that employed various design strategies for natural ventilation, according to the theoretical framework that was drawn. The main conclusion is the necessity of adopting various passive design strategies for natural ventilation in the Iraqi urban environment with its hot, dry climate, as they have a significant impact in reducing energy consumption for ventilation and cooling, as well as in improving air quality in the indoor environments of buildings.

  11. Synthetic Multiple-Imputation Procedure for Multistage Complex Samples

    Directory of Open Access Journals (Sweden)

    Zhou Hanzhi

    2016-03-01

    Full Text Available Multiple imputation (MI) is commonly used when item-level missing data are present. However, MI requires that survey design information be built into the imputation models. For multistage stratified clustered designs, this requires dummy variables to represent strata as well as primary sampling units (PSUs) nested within each stratum in the imputation model. Such a modeling strategy is not only operationally burdensome but also inferentially inefficient when there are many strata in the sample design. Complexity only increases when sampling weights need to be modeled. This article develops a general-purpose analytic strategy for population inference from complex sample designs with item-level missingness. In a simulation study, the proposed procedures demonstrate efficient estimation and good coverage properties. We also consider an application to accommodate missing body mass index (BMI) data in the analysis of BMI percentiles using National Health and Nutrition Examination Survey (NHANES III) data. We argue that the proposed methods offer an easy-to-implement solution to problems that are not well-handled by current MI techniques. Note that, while the proposed method borrows from the MI framework to develop its inferential methods, it is not designed as an alternative strategy to release multiply imputed datasets for complex sample design data, but rather as an analytic strategy in and of itself.
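
    Whatever imputation strategy is used, MI-based inference ultimately pools the per-imputation results with Rubin's combining rules; a minimal implementation of that final step (the standard rules, not the article's new procedure):

```python
import statistics

def rubin_combine(estimates, variances):
    """Rubin's rules for m multiply imputed datasets: pooled point estimate
    qbar, within-imputation variance W, between-imputation variance B, and
    total variance T = W + (1 + 1/m) * B."""
    m = len(estimates)
    qbar = statistics.mean(estimates)
    w = statistics.mean(variances)
    b = statistics.variance(estimates)  # sample variance, denominator m - 1
    t = w + (1 + 1 / m) * b
    return qbar, t
```

    The (1 + 1/m) factor inflates the between-imputation variance to account for using a finite number of imputations.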

  12. Building communication strategy on health prevention through the human-centered design

    Directory of Open Access Journals (Sweden)

    Karine de Mello Freire

    2016-03-01

    Full Text Available A latent need has been identified for developing efficient communication strategies for the prevention of diseases, and design has been identified as a potential agent to create communication artifacts that are able to promote self-care. In order to analyze a design process that develops this kind of artifact, action research was carried out in the IAPI Health Center in Porto Alegre. The action’s goal was to design a strategy to promote self-care to prevent cervical cancer. The process was conducted using the human-centered design (HCD) approach, which seeks to create solutions desirable for people and feasible for organizations from three main phases: (a) Hear, in which inspirations originate from stories collected from people; (b) Create, which aims to translate this knowledge into prototypes; and (c) Deliver, where the prototypes are tested and developed with users. Communication strategies were supported by design studies on visual-verbal rhetoric. As a result, this design approach has shown itself adequate for creating communication strategies targeted at self-care behaviors, aiming to empower users to change their behavior.

  13. Comparison of sampling designs for estimating deforestation from landsat TM and MODIS imagery: a case study in Mato Grosso, Brazil.

    Science.gov (United States)

    Zhu, Shanyou; Zhang, Hailong; Liu, Ronggao; Cao, Yun; Zhang, Guixin

    2014-01-01

    Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.
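    The advantage of hotspot-based stratification reported above can be illustrated with a small simulation. The block counts, strata, and gamma-distributed loss values below are hypothetical stand-ins, not the paper's data:

    ```python
    import random
    import statistics

    random.seed(42)

    # Hypothetical per-block deforestation (km^2), split into two strata as a
    # stand-in for MODIS-derived hotspot vs. non-hotspot blocks.
    hot = [random.gammavariate(5.0, 2.0) for _ in range(200)]    # high-loss blocks
    cold = [random.gammavariate(1.0, 0.5) for _ in range(800)]   # low-loss blocks
    pop = hot + cold

    def srs_total(pop, n):
        """Estimate the state total from a simple random sample of n blocks."""
        s = random.sample(pop, n)
        return len(pop) * statistics.mean(s)

    def stratified_total(strata, n_per):
        """Estimate the total by expanding each stratum's sample mean separately."""
        return sum(len(st) * statistics.mean(random.sample(st, n))
                   for st, n in zip(strata, n_per))

    # Repeated draws: the stratified estimator of the total varies much less,
    # because isolating the high-loss blocks removes between-strata variance.
    srs = [srs_total(pop, 100) for _ in range(300)]
    strat = [stratified_total([hot, cold], [50, 50]) for _ in range(300)]
    print(statistics.stdev(strat) < statistics.stdev(srs))
    ```

    With skewed deforestation data, most of the population variance sits between the hotspot and non-hotspot groups, which is exactly the component stratification eliminates.
    
    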

  14. Primary energy implications of different design strategies for an apartment building

    International Nuclear Information System (INIS)

    Tettey, Uniben Yao Ayikoe; Dodoo, Ambrose; Gustavsson, Leif

    2016-01-01

    In this study, we explored the effects of different design strategies on final and primary energy use for production and operation of a newly constructed apartment building. We analysed alternatives of the building “As built” as well as built to the energy efficiency levels of the Swedish building code and passive house criteria. Our approach is based on deriving improved versions of the building alternatives from the combination of design strategies giving the lowest space heating and cooling demand and primary energy use, respectively. We found that the combination of design strategies resulting in the improved building alternatives varies depending on the approach. The improved building alternatives gave up to 19–34% reduction in operational primary energy use compared to the initial alternatives. The share of production primary energy use of the improved building alternatives was 39–54% of the total primary energy use for production, space heating, space cooling and ventilation over a 50-year lifespan, compared to 31–42% for the initial alternatives. This study emphasises the importance of incorporating appropriate design strategies to reduce primary energy use for building operation and suggests that combining such strategies with careful choice of building frame materials could result in significant primary energy savings in the built environment. - Highlights: • Primary energy implications of different design strategies were analysed. • The improved building alternatives had 19–34% lower operation primary energy use. • The improved building alternatives had higher production primary energy use. • Still, the improved building alternatives had lower overall primary energy use. • Design strategies should be combined with careful building frame material choice.

  15. DESIGNING STRATEGIES FOR IMPROVING TOTAL QUALITY MANAGEMENT IN MANUFACTURING INDUSTRIES

    Directory of Open Access Journals (Sweden)

    Alfian Nur Ubay

    2017-05-01

    Full Text Available This research was aimed at designing strategies for improving total quality management at CV XYZ and PT HIJ. The research locations were selected purposively, with the consideration that the companies are middle-class companies that have started to apply a study in line with the research topic. The experts were likewise chosen purposively. This research used a descriptive approach and quantitative analysis through questionnaires distributed by purposive sampling. The stages began with data processing, i.e. testing the questionnaire quality through validity and reliability tests; making a causality diagram; evaluating the implementation level of each company by assigning evaluation scales based on the existing conditions; building a House of Quality (HOQ) using QFD methods; analyzing the problem solutions produced by the QFD methods with 5W + 1H analysis; and finally determining the improvement priorities using fuzzy AHP methods. The results were strategies for improving the total quality management (TQM) of CV XYZ: the factor that plays the most important role is improving quality management performance; the actor with the competence to carry out the TQM improvement is the director; the prioritized goal is a commitment to improve the quality of goods and services; and the prioritized strategy is carrying out SOPs consistently. Keywords: strategies, improvement, TQM, manufacturing company, fuzzy AHP. ABSTRACT: This research aims to design strategies for improving integrated quality management at CV XYZ and PT HIJ. The research locations were chosen purposively, considering that the companies are middle-class companies that have begun to apply a study in line with the research topic. The experts were determined purposively. This research uses a descriptive approach and quantitative analysis by distributing questionnaires purposively (purposive sampling

  16. Designing Search UX Strategies for eCommerce Success

    CERN Document Server

    Nudelman, Greg

    2011-01-01

    Best practices, practical advice, and design ideas for successful ecommerce search. A glaring gap has existed in the market for a resource that offers comprehensive, actionable design patterns and design strategies for ecommerce search, but no longer. With this invaluable book, user experience designer and user researcher Greg Nudelman shares his years of experience working on popular ecommerce sites as he tackles even the most difficult ecommerce search design problems. Nudelman helps you create highly effective and intuitive ecommerce search design solutions, and he takes a unique forward-thi

  17. Sample design for the residential energy consumption survey

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  18. Entrepreneurship and response strategies to challenges in engineering and design education

    DEFF Research Database (Denmark)

    Jørgensen, Ulrik; Pineda, Andres Felipe Valderrama

    2012-01-01

    Entrepreneurship is one of the contemporary expectations of engineers and their training at engineering schools. But what is entrepreneurship? We propose three different conceptualizations of entrepreneurship in engineering and design programs: (1) the technology-driven promotion response, centered in technological development; (2) the business selection response strategy, centered in business skills (which should be additional to the technical skills); and (3) the design intervention response strategy, focused on a network approach to technology, business and society. These conceptualizations are response strategies from engineering communities, professors and institutions to perceived challenges. We argue that all engineering educators deal in one way or another with the three response strategies when approaching issues of curricular design, academic reform and the international accreditation...

  19. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
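    The optimisation step described above can be sketched with the standard normal-approximation sample size for estimating a stratum mean to a chosen precision; the dose-rate figures below are invented for illustration and are not from the Norwegian pilot data:

    ```python
    import math

    def n_for_precision(sd, rel_err, mean, z=1.96):
        """Measurements needed so the estimated mean dose rate for a stratum
        lies within rel_err of the true mean at ~95% confidence
        (normal approximation, ignoring the finite population correction)."""
        d = rel_err * mean                   # absolute precision target
        return math.ceil((z * sd / d) ** 2)

    # Hypothetical stratum: mean 100 nGy/h, within-stratum sd 30 nGy/h, +/-10%.
    print(n_for_precision(sd=30, rel_err=0.10, mean=100))  # -> 35
    ```

    Running this per house category with the pilot-project variances, then summing, is the essence of choosing the measurement count for a predetermined precision level.
    
    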

  20. Designing a fuzzy expert system for selecting knowledge management strategy

    Directory of Open Access Journals (Sweden)

    Ameneh Khadivar

    2014-12-01

    Full Text Available Knowledge management strategy is mentioned as one of the most important success factors for implementing knowledge management. KM strategy selection is a complex decision that requires consideration of several factors, and the identified factors and their impact on knowledge management strategy are inherently ambiguous. In this study, an overview of the theoretical foundations of research regarding the different knowledge management strategies was carried out, and the factors influencing knowledge management strategy selection were extracted from conceptual frameworks and models. How these factors influence the selection is established through the fuzzy Delphi method. Next, a fuzzy expert system for the selection of an appropriate knowledge management strategy is designed with respect to these factors, which include: general business strategy, organizational structure, cultural factors, IT strategy, strategic human resource management, socialization level, knowledge types, and knowledge creation and diffusion processes. According to the identified factors, the final strategy is recommended along the range from human-oriented to system-oriented, keeping a balance of explicit and implicit knowledge. The designed system's performance is tested and evaluated using information related to three Iranian organizations.

  1. System design description for sampling fuel in K basins

    International Nuclear Information System (INIS)

    Baker, R.B.

    1996-01-01

    This System Design Description provides: (1) statements of the Spent Nuclear Fuel Projects (SNFP) needs requiring sampling of fuel in the K East and K West Basins, (2) the sampling equipment functions and requirements, (3) a general work plan and the design logic being followed to develop the equipment, and (4) a summary description of the design for the sampling equipment. The report summarizes the integrated application of both the subject equipment and the canister sludge sampler in near-term characterization campaigns at K Basins

  2. Sampling design for use by the soil decontamination project

    International Nuclear Information System (INIS)

    Rutherford, D.W.; Stevens, J.R.

    1981-01-01

    This report proposes a general approach to the problem and discusses sampling of soil to map the contaminated area and to provide samples for characterization of soil components and contamination. Basic concepts in sample design are reviewed with reference to environmental transuranic studies. Common designs are reviewed and evaluated for use with specific objectives that might be required by the soil decontamination project. Examples of a hierarchical design pilot study and a combined hierarchical and grid study are proposed for the Rocky Flats 903 pad area

  3. Identifying Instructional Strategies Used to Design Mobile Learning in a Corporate Setting

    Science.gov (United States)

    Jackson-Butler, Uletta

    2016-01-01

    The purpose of this qualitative embedded multiple case study was to describe what instructional strategies corporate instructional designers were using to design mobile learning and to understand from their experiences which instructional strategies they believed enhance learning. Participants were five instructional designers who were actively…

  4. [Saarland Growth Study: sampling design].

    Science.gov (United States)

    Danker-Hopfe, H; Zabransky, S

    2000-01-01

    The use of reference data to evaluate the physical development of children and adolescents is part of the daily routine in the paediatric outpatient clinic. The construction of such references is based on the collection of extensive reference data. There are different kinds of reference data: cross-sectional references, which are based on data collected from a large representative cross-sectional sample of the population; longitudinal references, which are based on follow-up surveys of usually smaller samples of individuals from birth to maturity; and mixed longitudinal references, which are a combination of longitudinal and cross-sectional reference data. The advantages and disadvantages of the different methods of data collection and the resulting reference data are discussed. The Saarland Growth Study was conducted for several reasons: growth processes are subject to secular changes, there are no specific reference data for children and adolescents from this part of the country, and the growth charts in use in paediatric practice are possibly no longer appropriate. Therefore, the Saarland Growth Study served two purposes: a) to create current regional reference data, and b) to create a database for future studies on secular trends in the growth processes of children and adolescents from Saarland. The present contribution focuses on general remarks on the sampling design of (cross-sectional) growth surveys and its inferences for the design of the present study.

  5. Paradigms for adaptive statistical information designs: practical experiences and strategies.

    Science.gov (United States)

    Wang, Sue-Jane; Hung, H M James; O'Neill, Robert

    2012-11-10

    design. We highlight the substantial risk of planning the sample size for confirmatory trials when information is very uninformative and stipulate the advantages of adaptive statistical information designs for planning exploratory trials. Practical experiences and strategies as lessons learned from more recent adaptive design proposals will be discussed to pinpoint the improved utilities of adaptive design clinical trials and their potential to increase the chance of a successful drug development. Published 2012. This article is a US Government work and is in the public domain in the USA.

  6. Estimation after classification using lot quality assurance sampling: corrections for curtailed sampling with application to evaluating polio vaccination campaigns.

    Science.gov (United States)

    Olives, Casey; Valadez, Joseph J; Pagano, Marcello

    2014-03-01

    To assess the bias incurred when curtailment of Lot Quality Assurance Sampling (LQAS) is ignored, to present unbiased estimators, to consider the impact of cluster sampling by simulation and to apply our method to published polio immunization data from Nigeria. We present estimators of coverage when using two kinds of curtailed LQAS strategies: semicurtailed and curtailed. We study the proposed estimators with independent and clustered data using three field-tested LQAS designs for assessing polio vaccination coverage, with samples of size 60 and decision rules of 9, 21 and 33, and compare them to biased maximum likelihood estimators. Lastly, we present estimates of polio vaccination coverage from previously published data in 20 local government authorities (LGAs) from five Nigerian states. Simulations illustrate substantial bias if one ignores the curtailed sampling design. Proposed estimators show no bias. Clustering does not affect the bias of these estimators. Across simulations, standard errors show signs of inflation as clustering increases. Neither sampling strategy nor LQAS design influences estimates of polio vaccination coverage in 20 Nigerian LGAs. When coverage is low, semicurtailed LQAS strategies considerably reduce the sample size required to make a decision. Curtailed LQAS designs further reduce the sample size when coverage is high. Results presented dispel the misconception that curtailed LQAS data are unsuitable for estimation. These findings augment the utility of LQAS as a tool for monitoring vaccination efforts by demonstrating that unbiased estimation using curtailed designs is not only possible but these designs also reduce the sample size. © 2014 John Wiley & Sons Ltd.

  7. Recruitment of hard-to-reach population subgroups via adaptations of the snowball sampling strategy.

    Science.gov (United States)

    Sadler, Georgia Robins; Lee, Hau-Chen; Lim, Rod Seung-Hwan; Fullerton, Judith

    2010-09-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author's program of research are provided to demonstrate how adaptations of snowball sampling can be used effectively in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more-vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or for research studies when the recruitment of a population-based sample is not essential.

  8. Sampling strategies for the analysis of reactive low-molecular weight compounds in air

    NARCIS (Netherlands)

    Henneken, H.

    2006-01-01

    Within this thesis, new sampling and analysis strategies for the determination of airborne workplace contaminants have been developed. Special focus has been directed towards the development of air sampling methods that involve diffusive sampling. In an introductory overview, the current

  9. Comparison of Sampling Designs for Estimating Deforestation from Landsat TM and MODIS Imagery: A Case Study in Mato Grosso, Brazil

    Directory of Open Access Journals (Sweden)

    Shanyou Zhu

    2014-01-01

    Full Text Available Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.

  10. Design and development of multiple sample counting setup

    International Nuclear Information System (INIS)

    Rath, D.P.; Murali, S.; Babu, D.A.R.

    2010-01-01

    Full text: The analysis of active samples on a regular basis for ambient air activity and floor contamination from the radiochemical lab accounts for a major chunk of the operational activity in the Health Physicist's responsibility. The requirement for daily air sample analysis, with both immediate and delayed counting, from various labs, in addition to smear swipe samples from lab checks, led to the need for a system that could cater to multiple sample analysis in a time-programmed manner on a single sample loading. A multiple alpha/beta counting system was designed and fabricated. It has arrangements for loading 10 samples in slots, in order, to be counted in a time-programmed manner, with results displayed and records maintained on a PC. The paper describes the design and development of the multiple sample counting setup presently in use at the facility, which has resulted in a reduction of the man-hours consumed in counting and recording of results

  11. Conceptual design strategy for liquid-metal-wall inertial-fusion reactors

    International Nuclear Information System (INIS)

    Monsler, M.J.; Meier, W.R.

    1981-02-01

    The liquid-metal-wall chamber has emerged as an attractive reactor concept for inertial fusion energy conversion. The principal feature of this concept is a thick, free-flowing blanket of liquid metal used to protect the structure of the reactor. The development and design of liquid-metal-wall chambers over the past decade provides a basis for formulating a conceptual design strategy for such chambers. Both the attractive and unattractive features of a LMW chamber are enumerated, and a design strategy is formulated which accommodates the engineering constraints while minimizing the liquid-metal flow rate

  12. Conceptual design strategy for liquid-metal-wall inertial-fusion reactors

    Energy Technology Data Exchange (ETDEWEB)

    Monsler, M.J.; Meier, W.R.

    1981-02-01

    The liquid-metal-wall chamber has emerged as an attractive reactor concept for inertial fusion energy conversion. The principal feature of this concept is a thick, free-flowing blanket of liquid metal used to protect the structure of the reactor. The development and design of liquid-metal-wall chambers over the past decade provides a basis for formulating a conceptual design strategy for such chambers. Both the attractive and unattractive features of a LMW chamber are enumerated, and a design strategy is formulated which accommodates the engineering constraints while minimizing the liquid-metal flow rate.

  13. A computational study of a fast sampling valve designed to sample soot precursors inside a forming diesel spray plume

    International Nuclear Information System (INIS)

    Dumitrescu, Cosmin; Puzinauskas, Paulius V.; Agrawal, Ajay K.; Liu, Hao; Daly, Daniel T.

    2009-01-01

    Accurate chemical reaction mechanisms are critically needed to fully optimize combustion strategies for modern internal-combustion engines. These mechanisms are needed to predict emission formation and the chemical heat release characteristics for traditional direct-injection diesel as well as recently-developed and proposed variant combustion strategies. Experimental data acquired under conditions representative of such combustion strategies are required to validate these reaction mechanisms. This paper explores the feasibility of developing a fast sampling valve which extracts reactants at known locations in the spray reaction structure to provide these data. CHEMKIN software is used to establish the reaction timescales which dictate the required fast sampling capabilities. The sampling process is analyzed using separate FLUENT and CHEMKIN calculations. The non-reacting FLUENT CFD calculations give a quantitative estimate of the sample quantity as well as the fluid mixing and thermal history. A CHEMKIN reactor network has been created that reflects these mixing and thermal time scales and allows a theoretical evaluation of the quenching process

  14. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Full Text Available Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, this method does not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sampling error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
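    The sample size quoted above can be reproduced with Cochran's formula for a proportion plus the finite population correction, which is the standard calculation for these inputs:

    ```python
    import math

    def sample_size(N, p=0.5, d=0.05, z=1.96):
        """Cochran's sample size for estimating a proportion, with the
        finite population correction for a population of N families."""
        n0 = z ** 2 * p * (1 - p) / d ** 2      # infinite-population size (384.16)
        return math.ceil(n0 / (1 + (n0 - 1) / N))

    print(sample_size(1179))  # -> 290, matching the abstract
    ```

    For a very large N the correction vanishes and the formula reverts to the familiar n ≈ 385 for a 95% confidence level with d = 0.05 and p = 0.5.
    
    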

  15. Development of improved space sampling strategies for ocean chemical properties: Total carbon dioxide and dissolved nitrate

    Science.gov (United States)

    Goyet, Catherine; Davis, Daniel; Peltzer, Edward T.; Brewer, Peter G.

    1995-01-01

    Large-scale ocean observing programs such as the Joint Global Ocean Flux Study (JGOFS) and the World Ocean Circulation Experiment (WOCE) today must face the problem of designing an adequate sampling strategy. For ocean chemical variables, the goals and observing technologies are quite different from those for ocean physical variables (temperature, salinity, pressure). We have recently acquired data on ocean CO2 properties on WOCE cruises P16c and P17c that are sufficiently dense to test for sampling redundancy. We use linear and quadratic interpolation methods on the sampled field to investigate the minimum number of samples required to define the deep ocean total inorganic carbon (TCO2) field within the limits of experimental accuracy (+/- 4 micromol/kg). Within the limits of current measurements, these lines were oversampled in the deep ocean. Should the precision of the measurement be improved, then a denser sampling pattern may be desirable in the future. This approach rationalizes the efficient use of resources for field work and for estimating gridded TCO2 fields needed to constrain geochemical models.
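    The redundancy test described above can be sketched as follows. The profile is a smooth synthetic curve, not the P16c/P17c data, so this only illustrates the logic of dropping samples and checking the interpolation error against measurement accuracy:

    ```python
    # Full profile: a TCO2 value every 100 m from 0 to 5000 m (synthetic,
    # illustrative numbers in micromol/kg; not cruise data).
    depths = [100 * i for i in range(51)]

    def tco2(z):
        return 2000 + 0.06 * z - 5e-6 * z * z

    full = [tco2(z) for z in depths]

    # Drop every other bottle and linearly interpolate the gaps back.
    max_err = 0.0
    for i in range(1, 50, 2):                      # odd indices were dropped
        est = 0.5 * (full[i - 1] + full[i + 1])    # linear interpolation
        max_err = max(max_err, abs(est - full[i]))

    # If the worst interpolation error stays within measurement accuracy
    # (+/- 4 micromol/kg), the finer spacing is redundant for this profile.
    print(max_err < 4.0)
    ```

    The same comparison run on real station data, with linear and quadratic interpolants, is what identifies the minimum sampling density compatible with the stated accuracy.
    
    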

  16. Designing an enhanced groundwater sample collection system

    International Nuclear Information System (INIS)

    Schalla, R.

    1994-10-01

    As part of an ongoing technical support mission to achieve excellence and efficiency in environmental restoration activities at the Laboratory for Energy and Health-Related Research (LEHR), Pacific Northwest Laboratory (PNL) provided guidance on the design and construction of monitoring wells and identified the most suitable type of groundwater sampling pump and accessories for monitoring wells. The goal was to utilize a monitoring well design that would allow for hydrologic testing and reduce turbidity to minimize the impact of sampling. The sampling results of the newly designed monitoring wells were clearly superior to those of the previously installed monitoring wells. The new wells exhibited reduced turbidity, in addition to improved access for instrumentation and hydrologic testing. The variable-frequency submersible pump was selected as the best choice for obtaining groundwater samples. The literature references are listed at the end of this report. Despite some initial difficulties, the actual performance of the variable-frequency submersible pump and its accessories was effective in reducing sampling time and labor costs, and its ease of use was preferred over the previously used bladder pumps. The surface seal system, called the Dedicator, proved to be a useful accessory that prevents surface contamination while providing easy access for water-level measurements and for connecting the pump. Cost savings resulted from the use of the pre-production pumps (beta units) donated by the manufacturer for the demonstration. However, larger savings resulted from shortened field time due to the ease of using the submersible pumps and the surface seal access system. Proper deployment of the monitoring wells also resulted in cost savings and ensured representative samples

  17. Representativeness-based sampling network design for the State of Alaska

    Science.gov (United States)

    Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove

    2013-01-01

    Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...

  18. Extending cluster lot quality assurance sampling designs for surveillance programs.

    Science.gov (United States)

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
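    The paper's nonparametric procedure is not reproduced here, but the classic design-effect inflation it refines can be sketched as follows; the ICC and cluster size are illustrative assumptions a survey planner would have to supply:

    ```python
    import math

    def lqas_cluster_n(n_srs, icc, m):
        """Inflate a simple-random-sample LQAS size for two-stage cluster
        sampling via the standard design effect DEFF = 1 + (m - 1) * ICC.

        n_srs: sample size under simple random sampling
        icc:   intracluster correlation coefficient (assumed)
        m:     average number of subjects sampled per cluster (e.g. village)
        """
        deff = 1 + (m - 1) * icc
        return math.ceil(n_srs * deff)

    # Illustrative only: a 60-child LQAS design, ICC 0.1, 6 children per village.
    print(lqas_cluster_n(60, 0.1, 6))  # -> 90
    ```

    With no clustering (ICC = 0) the design effect is 1 and the simple-random-sample size is recovered unchanged.
    
    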

  19. Impact of sampling strategy on stream load estimates in till landscape of the Midwest

    Science.gov (United States)

    Vidon, P.; Hubbard, L.E.; Soyeux, E.

    2009-01-01

    Accurately estimating various solute loads in streams during storms is critical to accurately determining total maximum daily loads for regulatory purposes. This study investigates the impact of sampling strategy on solute load estimates in streams in the US Midwest. Three different solute types (nitrate, magnesium, and dissolved organic carbon (DOC)) and three sampling strategies are assessed. Regardless of the method, the average error on nitrate loads is higher than for magnesium or DOC loads, and all three methods generally underestimate DOC loads and overestimate magnesium loads. Increasing sampling frequency only slightly improves the accuracy of solute load estimates but generally improves the precision of load calculations. This type of investigation is critical for water management and environmental assessment, so that error on solute load calculations can be taken into account by landscape managers and sampling strategies optimized as a function of monitoring objectives. © 2008 Springer Science+Business Media B.V.

  20. A Sample-Based Forest Monitoring Strategy Using Landsat, AVHRR and MODIS Data to Estimate Gross Forest Cover Loss in Malaysia between 1990 and 2005

    Directory of Open Access Journals (Sweden)

    Peter Potapov

    2013-04-01

    Insular Southeast Asia is a hotspot of humid tropical forest cover loss. A sample-based monitoring approach quantifying forest cover loss from Landsat imagery was implemented to estimate gross forest cover loss for two eras, 1990–2000 and 2000–2005. For each time interval, a probability sample of 18.5 km × 18.5 km blocks was selected, and pairs of Landsat images acquired per sample block were interpreted to quantify forest cover area and gross forest cover loss. Stratified random sampling was implemented for 2000–2005 with MODIS-derived forest cover loss used to define the strata. A probability proportional to x (πpx) design was implemented for 1990–2000 with AVHRR-derived forest cover loss used as the x variable to increase the likelihood of including forest loss area in the sample. The estimated annual gross forest cover loss for Malaysia was 0.43 Mha/yr (SE = 0.04) during 1990–2000 and 0.64 Mha/yr (SE = 0.055) during 2000–2005. Our use of the πpx sampling design represents a first practical trial of this design for sampling satellite imagery. Although the design performed adequately in this study, a thorough comparative investigation of the πpx design relative to other sampling strategies is needed before general design recommendations can be put forth.
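    The stratified part of this design can be illustrated with the textbook stratified estimator of a population total and its standard error; this is a generic sketch with made-up stratum counts and per-block losses, not the πpx estimator or the Malaysia data.

```python
def stratified_estimate(strata):
    """Stratified estimate of a population total and its standard error.
    strata: list of (N_h, samples), where N_h is the number of blocks in
    stratum h and samples are the observed per-block loss areas."""
    total, var = 0.0, 0.0
    for n_blocks, ys in strata:
        n_h = len(ys)
        mean = sum(ys) / n_h
        s2 = sum((y - mean) ** 2 for y in ys) / (n_h - 1)
        total += n_blocks * mean
        # finite-population-corrected variance of the stratum total
        var += n_blocks ** 2 * (1 - n_h / n_blocks) * s2 / n_h
    return total, var ** 0.5

# two hypothetical strata of 100 and 50 blocks (not the MODIS strata)
loss_total, loss_se = stratified_estimate(
    [(100, [1.0, 2.0, 3.0]), (50, [0.5, 1.5])])
```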

  1. Thermal probe design for Europa sample acquisition

    Science.gov (United States)

    Horne, Mera F.

    2018-01-01

    The planned lander missions to the surface of Europa will access samples from the subsurface of the ice in a search for signs of life. A small thermal drill (probe) is proposed to meet the sample requirement of the Science Definition Team's (SDT) report for the Europa mission. The probe is 2 cm in diameter and 16 cm in length and is designed to access the subsurface to 10 cm deep and to collect five ice samples of approximately 7 cm3 each. The energy required to penetrate the top 10 cm of ice in a vacuum is approximately 26 Wh, and the energy to melt 7 cm3 of ice is approximately 1.2 Wh. The requirement stated in the SDT report of collecting samples from five different sites can be accommodated with repeated use of the same thermal drill. For smaller sample sizes, a smaller probe of 1.0 cm in diameter with the same length of 16 cm could be utilized; it would require approximately 6.4 Wh to penetrate the top 10 cm of ice and 0.02 Wh to collect 0.1 g of sample. The thermal drill has the advantages of simplicity of design and operation and the ability to penetrate ice over a range of densities and hardness while maintaining sample integrity.

  2. Food design strategies to increase vegetable intake

    NARCIS (Netherlands)

    Oliviero, Teresa; Fogliano, Vincenzo

    2016-01-01

    Background: Public campaigns promoting consumption of fruits and vegetables have had limited results, as consumers' habits are difficult to modify. The incorporation of fruits and vegetables into regularly eaten products is a food design strategy that leads to several advantages. Pasta is a staple food

  3. Perspectives on land snails - sampling strategies for isotopic analyses

    Science.gov (United States)

    Kwiecien, Ola; Kalinowski, Annika; Kamp, Jessica; Pellmann, Anna

    2017-04-01

    Since the seminal works of Goodfriend (1992), several substantial studies confirmed a relation between the isotopic composition of land snail shells (δ18O, δ13C) and environmental parameters like precipitation amount, moisture source, temperature and vegetation type. This relation, however, is not straightforward and is site dependent. The choice of sampling strategy (discrete or bulk sampling) and cleaning procedure (several methods can be used, but a comparison of their effects on an individual shell has not yet been achieved) further complicates the shell analysis. The advantage of using snail shells as an environmental archive lies in the snails' limited mobility, and therefore an intrinsic aptitude for recording local and site-specific conditions. Also, snail shells are often found at dated archaeological sites. An obvious drawback is that shell assemblages rarely make up a continuous record, and a single shell is only a snapshot of the environmental setting at a given time. Shells from archaeological sites might represent a dietary component, and cooking would presumably alter the isotopic signature of the aragonite material. Consequently, a proper sampling strategy is of great importance and should be adjusted to the scientific question. Here, we compare and contrast different sampling approaches using modern shells collected in Morocco, Spain and Germany. The bulk shell approach (fine-ground material) yields information on mean environmental parameters within the life span of the analyzed individuals. However, despite homogenization, replicate measurements of bulk shell material returned results with a variability greater than analytical precision (up to 2‰ for δ18O, and up to 1‰ for δ13C), calling for caution when analyzing only single individuals. Horizontal high-resolution sampling (single drill holes along growth lines) provides insights into the amplitude of seasonal variability, while vertical high-resolution sampling (multiple drill holes along the same growth line

  4. Sample preparation strategies for food and biological samples prior to nanoparticle detection and imaging

    DEFF Research Database (Denmark)

    Larsen, Erik Huusfeldt; Löschner, Katrin

    2014-01-01

    microscopy (TEM) proved to be necessary for troubleshooting of results obtained from AFFF-LS-ICP-MS. Aqueous and enzymatic extraction strategies were tested for thorough sample preparation aiming at degrading the sample matrix and liberating the AgNPs from chicken meat into liquid suspension. The resulting...... AFFF-ICP-MS fractograms, which corresponded to the enzymatic digests, showed a major nano-peak (about 80 % recovery of AgNPs spiked to the meat) plus new smaller peaks that eluted close to the void volume of the fractograms. Small, but significant shifts in retention time of AFFF peaks were observed...... for the meat sample extracts and the corresponding neat AgNP suspension, and rendered sizing by way of calibration with AgNPs as sizing standards inaccurate. In order to gain further insight into the sizes of the separated AgNPs, or their possible dissolved state, fractions of the AFFF eluate were collected

  5. Design strategy for terahertz quantum dot cascade lasers.

    Science.gov (United States)

    Burnett, Benjamin A; Williams, Benjamin S

    2016-10-31

    The development of quantum dot cascade lasers has been proposed as a path to obtain terahertz semiconductor lasers that operate at room temperature. The expected benefit is due to the suppression of nonradiative electron-phonon scattering and reduced dephasing that accompanies discretization of the electronic energy spectrum. We present numerical modeling which predicts that simple scaling of conventional quantum well based designs to the quantum dot regime will likely fail due to electrical instability associated with high-field domain formation. A design strategy adapted for terahertz quantum dot cascade lasers is presented which avoids these problems. Counterintuitively, this involves the resonant depopulation of the laser's upper state with the LO-phonon energy. The strategy is tested theoretically using a density matrix model of transport and gain, which predicts sufficient gain for lasing at stable operating points. Finally, the effect of quantum dot size inhomogeneity on the optical lineshape is explored, suggesting that the design concept is robust to a moderate amount of statistical variation.

  6. Sampling strategy for estimating human exposure pathways to consumer chemicals

    NARCIS (Netherlands)

    Papadopoulou, Eleni; Padilla-Sanchez, Juan A.; Collins, Chris D.; Cousins, Ian T.; Covaci, Adrian; de Wit, Cynthia A.; Leonards, Pim E.G.; Voorspoels, Stefan; Thomsen, Cathrine; Harrad, Stuart; Haug, Line S.

    2016-01-01

    Human exposure to consumer chemicals has become a worldwide concern. In this work, a comprehensive sampling strategy is presented, to our knowledge being the first to study all relevant exposure pathways in a single cohort using multiple methods for assessment of exposure from each exposure pathway.

  7. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  8. Dried blood spot measurement: application in tacrolimus monitoring using limited sampling strategy and abbreviated AUC estimation.

    Science.gov (United States)

    Cheung, Chi Yuen; van der Heijden, Jaques; Hoogtanders, Karin; Christiaans, Maarten; Liu, Yan Lun; Chan, Yiu Han; Choi, Koon Shing; van de Plas, Afke; Shek, Chi Chung; Chau, Ka Foon; Li, Chun Sang; van Hooff, Johannes; Stolk, Leo

    2008-02-01

    Dried blood spot (DBS) sampling and high-performance liquid chromatography tandem mass spectrometry have been developed for monitoring tacrolimus levels. Our center favors the use of a limited sampling strategy and an abbreviated formula to estimate the area under the concentration-time curve (AUC(0-12)). However, it is inconvenient for patients because they have to wait in the center for blood sampling. We investigated the application of the DBS method in tacrolimus level monitoring using the limited sampling strategy and abbreviated AUC estimation approach. Duplicate venous samples were obtained at each time point (C(0), C(2), and C(4)). To determine the stability of blood samples, one venous sample was sent to our laboratory immediately. The other duplicate venous samples, together with simultaneous fingerprick blood samples, were sent to the University of Maastricht in the Netherlands. Thirty-six patients were recruited and 108 sets of blood samples were collected. There was a highly significant relationship between AUC(0-12) estimated from venous blood samples and fingerprick blood samples (r(2) = 0.96, P AUC(0-12) strategy as drug monitoring.
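    A limited-sampling AUC of this kind is typically an abbreviated regression formula on the trough and early post-dose concentrations, validated by the squared correlation against full-profile AUCs. The sketch below uses placeholder coefficients, not the ones fitted in this study.

```python
def abbreviated_auc(c0, c2, c4, coef):
    """Abbreviated AUC(0-12) from trough (C0) and post-dose (C2, C4)
    concentrations; centers fit coef = (b0, b1, b2, b3) by regression
    against full 12-h profiles. The coefficients are placeholders."""
    b0, b1, b2, b3 = coef
    return b0 + b1 * c0 + b2 * c2 + b3 * c4

def r_squared(x, y):
    """Squared Pearson correlation, as used to compare AUCs estimated
    from venous and fingerprick (DBS) samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)
```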

  9. A conceptual design strategy for liquid-metal-wall inertial fusion reactors

    International Nuclear Information System (INIS)

    Monsler, M.J.; Meier, W.R.

    1981-01-01

    The liquid-metal-wall chamber has emerged as an attractive reactor concept for inertial fusion energy conversion. The principal feature of this concept is a thick, free-flowing blanket of liquid metal used to protect the structure of the reactor. The development and design of liquid-metal-wall chambers over the past decade are reviewed from the perspective of formulating a conceptual design strategy for such chambers. The basis for the design strategy is set by enumerating both the attractive and unattractive features of a LMW chamber. Past concepts are then reviewed to identify conceptual design approaches and physical configurations that enhance the positive aspects and minimize the negative aspects. A detailed description of the engineering considerations is given, including such topics as the selection of a liquid metal, control of radiation damage, selection of structural material, control of tritium breeding and extraction, control of wall stress, and designing for a given rep-rate. Finally, a design strategy is formulated which accommodates the engineering constraints while minimizing the liquid-metal flow rate. (orig.)

  10. The Problem Solver and The Artisan Designer: Strategies for Utilizing Design Idea Archives

    DEFF Research Database (Denmark)

    Inie, Nanna; Endo, Allison; Dow, Steven

    2018-01-01

    This paper presents the results of an extensive qualitative study investigating how professional designers utilize personal idea archives. While we know that designers archive creative ideas in different formats and on different platforms, we know little about if and how designers utilize...... these idea archives in their daily practice. Through a series of interviews (n=20) and walkthroughs of design idea archives, we identified two archetypal strategies. The Problem Solver is concerned with the task at hand, keeps relevant ideas around, and discards them when the ideas have served their purpose...

  11. Statistical sampling strategies

    International Nuclear Information System (INIS)

    Andres, T.H.

    1987-01-01

    Systems assessment codes use mathematical models to simulate natural and engineered systems. Probabilistic systems assessment codes carry out multiple simulations to reveal the uncertainty in values of output variables due to uncertainty in the values of the model parameters. In this paper, methods are described for sampling sets of parameter values to be used in a probabilistic systems assessment code. Three Monte Carlo parameter selection methods are discussed: simple random sampling, Latin hypercube sampling, and sampling using two-level orthogonal arrays. Three post-selection transformations are also described: truncation, importance transformation, and discretization. Advantages and disadvantages of each method are summarized
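    Of the three Monte Carlo selection methods mentioned, Latin hypercube sampling is the easiest to sketch: each parameter's range is divided into n equal-probability strata and exactly one value is drawn per stratum, with strata shuffled independently per parameter. This is a generic implementation on the unit hypercube, not the assessment code's own.

```python
import random

def latin_hypercube(n_samples, n_params, seed=0):
    """Latin hypercube sample on the unit hypercube: one draw per
    equal-probability stratum in each dimension, strata independently
    permuted across parameters."""
    rng = random.Random(seed)
    sample = [[0.0] * n_params for _ in range(n_samples)]
    for j in range(n_params):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i in range(n_samples):
            # value uniform within the assigned stratum
            sample[i][j] = (strata[i] + rng.random()) / n_samples
    return sample
```

    Post-selection transformations such as truncation or importance transformation would then be applied to these uniform coordinates.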

  12. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of the ACS (ALMA Common Software) framework, we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets with a lower, user-defined frequency to keep network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues, we present the performance of the sampling system evaluated on two different platforms: on a VME-based system using the VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using the Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low-cost PC-compatible hardware environment with a free and open operating system.
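    The cache-and-send transport described above can be sketched as a small buffer that forwards samples to a client callback in packets. The names and the count-based flush are illustrative; the actual system batches by a user-defined send period over the CORBA Notification Channel.

```python
class PacketingSampler:
    """Cache high-rate samples locally and forward them to a client
    callback in packets, keeping per-sample traffic off the network.
    A count-based flush stands in for the real time-based one."""

    def __init__(self, packet_size, sink):
        self.packet_size = packet_size
        self.sink = sink            # client callback receiving packets
        self.buf = []

    def sample(self, value):
        self.buf.append(value)
        if len(self.buf) == self.packet_size:
            self.sink(list(self.buf))  # ship a full packet
            self.buf.clear()
```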

  13. Recruiting hard-to-reach United States population sub-groups via adaptations of snowball sampling strategy

    Science.gov (United States)

    Sadler, Georgia Robins; Lee, Hau-Chen; Seung-Hwan Lim, Rod; Fullerton, Judith

    2011-01-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author’s program of research are provided to demonstrate how adaptations of snowball sampling can be effectively used in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or subjects for research studies when recruitment of a population based sample is not essential. PMID:20727089

  14. Design strategies from sexual exploitation and sex work studies among women and girls: Methodological considerations in a hidden and vulnerable population.

    Science.gov (United States)

    Gerassi, Lara; Edmond, Tonya; Nichols, Andrea

    2017-06-01

    The study of sex trafficking, prostitution, sex work, and sexual exploitation is associated with many methodological issues and challenges. Researchers' study designs must consider the many safety issues related to this vulnerable and hidden population. Community advisory boards and key stakeholder involvement are essential to study design to increase the safety of participants, the usefulness of study aims, and the meaningfulness of conclusions. Nonrandomized sampling strategies, which can provide rich data but require complex sampling and recruitment methods, are most often utilized when studying exploited women and girls. This article reviews the current methodological issues in studying this marginalized population as well as strategies to address challenges while working with the community in order to bring about social change. The authors also discuss their own experiences in collaborating with community organizations to conduct research in this field.

  15. Mobile Variable Depth Sampling System Design Study

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study

  16. Mobile Variable Depth Sampling System Design Study

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    2000-08-25

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  17. Sampling and analyte enrichment strategies for ambient mass spectrometry.

    Science.gov (United States)

    Li, Xianjiang; Ma, Wen; Li, Hongmei; Ai, Wanpeng; Bai, Yu; Liu, Huwei

    2018-01-01

    Ambient mass spectrometry provides great convenience for fast screening and has shown promising potential in analytical chemistry. However, its relatively low sensitivity seriously restricts its practical utility in trace compound analysis. In this review, we summarize the sampling and analyte enrichment strategies coupled with nine representative modes of ambient mass spectrometry (desorption electrospray ionization, paper spray ionization, wooden-tip spray ionization, probe electrospray ionization, coated blade spray ionization, direct analysis in real time, desorption corona beam ionization, dielectric barrier discharge ionization, and atmospheric-pressure solids analysis probe) that have dramatically increased the detection sensitivity. We believe that these advances will promote routine use of ambient mass spectrometry. Graphical abstract: Scheme of sampling strategies for ambient mass spectrometry.

  18. A satellite digital controller or 'play that PID tune again, Sam'. [Position, Integral, Derivative feedback control algorithm for design strategy]

    Science.gov (United States)

    Seltzer, S. M.

    1976-01-01

    The problem discussed is to design a digital controller for a typical satellite. The controlled plant is considered to be a rigid body acting in a plane. The controller is assumed to be a digital computer which, when combined with the proposed control algorithm, can be represented as a sampled-data system. The objective is to present a design strategy and technique for selecting numerical values for the control gains (assuming position, integral, and derivative feedback) and the sample rate. The technique is based on the parameter plane method and requires that the system be amenable to z-transform analysis.
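    The position-integral-derivative control law evaluated at sample instants can be sketched as follows, closed around a rigid body acting in a plane (a double integrator). The gains and sample period are arbitrary placeholders, not values obtained from the parameter-plane design in the paper.

```python
class DigitalPID:
    """Discrete PID control law evaluated at sample instants:
    u_k = Kp*e_k + Ki*sum(e_j*T) + Kd*(e_k - e_{k-1})/T."""

    def __init__(self, kp, ki, kd, period):
        self.kp, self.ki, self.kd, self.period = kp, ki, kd, period
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, err):
        self.integral += err * self.period
        deriv = (err - self.prev_err) / self.period
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# close the loop around a unit-inertia rigid body and regulate
# the attitude error theta to zero (placeholder gains, T = 0.1 s)
pid = DigitalPID(kp=4.0, ki=0.5, kd=4.0, period=0.1)
theta, omega = 1.0, 0.0          # initial attitude error and rate
for _ in range(600):
    u = pid.update(-theta)       # error = setpoint (0) - theta
    omega += u * pid.period      # rate update from applied torque
    theta += omega * pid.period  # attitude update
```

    With these gains the sampled closed loop is stable and the attitude error decays toward zero.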

  19. Observing System Simulation Experiments for the assessment of temperature sampling strategies in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    F. Raicich

    2003-01-01

    For the first time in the Mediterranean Sea, various temperature sampling strategies are studied and compared to each other by means of the Observing System Simulation Experiment technique. Their usefulness in the framework of the Mediterranean Forecasting System (MFS) is assessed by quantifying their impact in a Mediterranean General Circulation Model in numerical twin experiments via univariate data assimilation of temperature profiles in summer and winter conditions. Data assimilation is performed by means of the optimal interpolation algorithm implemented in the SOFA (System for Ocean Forecasting and Analysis) code. The sampling strategies studied here include various combinations of eXpendable BathyThermograph (XBT) profiles collected along Volunteer Observing Ship (VOS) tracks, Airborne XBTs (AXBTs) and sea surface temperatures. The actual sampling strategy adopted in the MFS Pilot Project during the Targeted Operational Period (TOP, winter-spring 2000) is also studied. The data impact is quantified by the error reduction relative to the free run. The most effective sampling strategies determine 25–40% error reduction, depending on the season, the geographic area and the depth range. A qualitative relationship can be recognized, in terms of the spread of information from the data positions, between basin circulation features and spatial patterns of the error reduction fields, as a function of different spatial and seasonal characteristics of the dynamics. The largest error reductions are observed when samplings are characterized by extensive spatial coverages, as in the cases of AXBTs and the combination of XBTs and surface temperatures. The sampling strategy adopted during the TOP is characterized by little impact, as a consequence of a sampling frequency that is too low. Key words. Oceanography: general (marginal and semi-enclosed seas; numerical modelling)

  1. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    Science.gov (United States)

    Xun-Ping, W; An, Z

    2017-07-27

    Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which took plant abundance as an auxiliary variable, was explored, and an experimental study in a 50 m×50 m plot in a marshland in the Poyang Lake region was performed. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; secondly, the required number of optimal sampling points for each layer was calculated through the Hammond-McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison study was performed among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA. Results The method proposed in this study (SOPA) had the smallest absolute error (0.213 8), while the traditional systematic sampling method had the largest estimate, with an absolute error of 0.924 4. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.
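    The per-layer allocation step can be illustrated with classical Neyman allocation, which assigns sample points in proportion to stratum size times within-stratum variability. The paper derives its per-layer sizes from the Hammond-McCullagh equation; this is only the textbook analogue, with invented stratum figures.

```python
def neyman_allocation(strata, n_total):
    """Allocate n_total sample points across strata in proportion to
    N_h * S_h (stratum size times within-stratum standard deviation);
    the classical analogue of the paper's per-layer calculation."""
    weights = [n_h * s_h for n_h, s_h in strata]
    total_w = sum(weights)
    return [max(1, round(n_total * w / total_w)) for w in weights]

# three hypothetical plant-abundance layers: the first is smaller but
# much more variable, so it receives the largest share of points
alloc = neyman_allocation([(100, 2.0), (100, 1.0), (200, 0.5)], 40)
```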

  2. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year database (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least precise design and empirically stratified sampling the most precise, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs
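    The comparison of simple random and seasonally stratified designs can be reproduced in miniature by simulation; the synthetic year below has an exaggerated spring peak and is purely illustrative, not the Indian Point counts.

```python
import random
import statistics

def annual_estimate(daily, n_days, rng, strata=None):
    """Expand a random subsample of daily impingement counts into an
    annual-total estimate. With strata (a list of (start, end) day
    ranges), days are allocated proportionally within each range;
    calendar seasons stand in for the study's empirical strata."""
    if strata is None:
        strata = [(0, len(daily))]       # simple random sampling
    total = 0.0
    for start, end in strata:
        n_h = max(2, round(n_days * (end - start) / len(daily)))
        picks = rng.sample(range(start, end), n_h)
        total += (end - start) * statistics.mean(daily[i] for i in picks)
    return total

# synthetic year with a strong spring impingement peak
rng = random.Random(1)
daily = [(500.0 if 90 <= d < 180 else 20.0) + rng.random()
         for d in range(365)]
seasons = [(0, 90), (90, 180), (180, 365)]
srs_runs = [annual_estimate(daily, 110, rng) for _ in range(200)]
strat_runs = [annual_estimate(daily, 110, rng, seasons) for _ in range(200)]
```

    Both estimators are close to unbiased, but stratifying on the seasonal pattern shrinks the spread of the annual estimates dramatically.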

  3. Conditional estimation of exponential random graph models from snowball sampling designs

    NARCIS (Netherlands)

    Pattison, Philippa E.; Robins, Garry L.; Snijders, Tom A. B.; Wang, Peng

    2013-01-01

    A complete survey of a network in a large population may be prohibitively difficult and costly. So it is important to estimate models for networks using data from various network sampling designs, such as link-tracing designs. We focus here on snowball sampling designs, designs in which the members

  4. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  5. Parameters Design for Logarithmic Quantizer Based on Zoom Strategy

    Directory of Open Access Journals (Sweden)

    Jingjing Yan

    2017-01-01

    This paper is concerned with the problem of designing suitable parameters for a logarithmic quantizer such that the closed-loop system is asymptotically convergent. Based on the zoom strategy, we propose two methods for quantizer parameter design, under which the state of the closed-loop system is guaranteed to enter the invariant sets after certain moments. We then show that the quantizer is unsaturated, and thus the quantization errors are bounded under the time-varying logarithmic quantization strategy. On that basis, we obtain that the closed-loop system is asymptotically convergent. A benchmark example is given to show the usefulness of the proposed methods, and the comparison results are illustrated.
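    The static logarithmic quantizer underlying such designs can be sketched as follows; the zoom strategy itself additionally rescales the quantizer range online to avoid saturation, which is omitted here.

```python
import math

def log_quantize(v, u0=1.0, rho=0.5):
    """Static logarithmic quantizer with levels u_i = rho**i * u0,
    satisfying the sector bound |q(v) - v| <= delta * |v| with
    delta = (1 - rho) / (1 + rho) over its (unbounded) level set."""
    if v == 0.0:
        return 0.0
    delta = (1.0 - rho) / (1.0 + rho)
    # unique integer i with u_i/(1+delta) < |v| <= u_i/(1-delta)
    i = math.floor(math.log(abs(v) * (1.0 - delta) / u0) / math.log(rho))
    return math.copysign(u0 * rho ** i, v)
```

    A coarser quantizer (smaller rho) gives a larger sector bound delta, which is the trade-off the parameter design has to resolve.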

  6. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

    Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and potential compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, specifications and advantages are compared between experimental and computational FBDD, and limitations and future prospects are discussed and emphasized.

  7. Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System

    International Nuclear Information System (INIS)

    LECHELT, J.A.

    2000-01-01

The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirement document for the RMCS system and to be used in governing all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items such as the breather inlet filter are covered by a different design baseline. In this case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix.

  8. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with real data from the Cancer Incidence and Mortality of Uranium Miners Study.

  9. A sampling strategy for estimating plot average annual fluxes of chemical elements from forest soils

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.; Vries, de W.

    2010-01-01

    A sampling strategy for estimating spatially averaged annual element leaching fluxes from forest soils is presented and tested in three Dutch forest monitoring plots. In this method sampling locations and times (days) are selected by probability sampling. Sampling locations were selected by

  10. Sample preparation composite and replicate strategy for assay of solid oral drug products.

    Science.gov (United States)

    Harrington, Brent; Nickerson, Beverly; Guo, Michele Xuemei; Barber, Marc; Giamalva, David; Lee, Carlos; Scrivens, Garry

    2014-12-16

In pharmaceutical analysis, the results of drug product assay testing are used to make decisions regarding the quality, efficacy, and stability of the drug product. In order to make sound risk-based decisions concerning drug product potency, an understanding of the uncertainty of the reportable assay value is required. Utilizing the most restrictive criteria in current regulatory documentation, a maximum variability attributed to method repeatability is defined for a drug product potency assay. A sampling strategy that reduces the repeatability component of the assay variability below this predefined maximum is demonstrated. The sampling strategy consists of determining the number of dosage units (k) to be prepared in a composite sample, of which there may be a number of equivalent replicate (r) sample preparations. The variability, as measured by the standard error (SE), of a potency assay consists of several sources, such as sample preparation and dosage unit variability. A sampling scheme that increases the number of sample preparations (r) and/or the number of dosage units (k) per sample preparation will reduce the assay variability and thus decrease the uncertainty around decisions made concerning the potency of the drug product. A maximum allowable repeatability component of the standard error (SE) for the potency assay is derived using material in current regulatory documents. A table of solutions for the number of dosage units per sample preparation (k) and the number of replicate sample preparations (r) is presented for any ratio of sample preparation and dosage unit variability.
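The trade-off between composites and replicates described above can be sketched with the usual variance-components model; the formula SE = sqrt((sigma_prep**2 + sigma_du**2 / k) / r) is a standard decomposition assumed here for illustration, not quoted from the paper:

```python
import math

def assay_se(sigma_prep, sigma_du, k, r):
    """Repeatability SE of the mean of r replicate preparations, each a
    composite of k dosage units (simple variance-components model)."""
    return math.sqrt((sigma_prep ** 2 + sigma_du ** 2 / k) / r)

def feasible_schemes(sigma_prep, sigma_du, se_max, k_max=10, r_max=5):
    """Enumerate (k, r) pairs whose repeatability SE meets se_max,
    analogous to the table of solutions described in the abstract."""
    return [(k, r)
            for k in range(1, k_max + 1)
            for r in range(1, r_max + 1)
            if assay_se(sigma_prep, sigma_du, k, r) <= se_max]
```

Compositing more dosage units (larger k) shrinks only the dosage-unit term, while extra replicate preparations (larger r) shrink both terms, which is why a table over (k, r) depends only on the ratio of the two variance components.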

  11. Designing multiple ligands - medicinal chemistry strategies and challenges.

    Science.gov (United States)

    Morphy, Richard; Rankovic, Zoran

    2009-01-01

It has been widely recognised in recent years that parallel modulation of multiple biological targets can be beneficial for the treatment of diseases with complex etiologies such as cancer, asthma, and psychiatric disease. In this article, current strategies for the generation of ligands with a specific multi-target profile (designed multiple ligands, or DMLs) are described and a number of illustrative examples are given. Designing multiple ligands is frequently a challenging endeavour for medicinal chemists, who need to appropriately balance affinity for two or more targets whilst obtaining physicochemical and pharmacokinetic properties that are consistent with the administration of an oral drug. Given that the properties of DMLs are influenced to a large extent by the proteomic superfamily to which the targets belong and the lead generation strategy that is pursued, an early assessment of the feasibility of any given DML project is essential.

  12. Strategies and arguments of ergonomic design for sustainability.

    Science.gov (United States)

    Marano, Antonio; Di Bucchianico, Giuseppe; Rossi, Emilio

    2012-01-01

Referring to the discussion recently promoted by Sub-Technical Committee n°4, "Ergonomics and design for sustainability", this paper presents the early results of a theoretical and methodological study on ergonomic design for sustainability. In particular, the research is based on a comparison of the common thematic structure characterizing ergonomics with the principles of sustainable development and with criteria adopted from other disciplines already oriented toward sustainability. The paper identifies an initial logical-interpretative model and describes relevant strategies of ergonomic design for sustainability, which are connected through a series of specific sustainable arguments.

  13. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software

    OpenAIRE

    Dols, W. Stuart; Persily, Andrew K.; Morrow, Jayne B.; Matzke, Brett D.; Sego, Landon H.; Nuffer, Lisa L.; Pulsipher, Brent A.

    2010-01-01

In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions, in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by vir...

  14. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    Science.gov (United States)

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality.
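Of the post-acquisition options mentioned, probabilistic quotient normalization (PQN) is compact enough to sketch; using the median spectrum across samples as the reference is one common choice and not necessarily the authors' exact implementation:

```python
import numpy as np

def pqn_normalize(X, reference=None):
    """Probabilistic quotient normalization of a samples-by-features
    intensity matrix: divide each sample by the median of its
    feature-wise quotients against a reference spectrum."""
    X = np.asarray(X, dtype=float)
    if reference is None:
        reference = np.median(X, axis=0)      # median spectrum as reference
    quotients = X / reference                 # feature-wise quotients
    dilution = np.median(quotients, axis=1)   # one dilution factor per sample
    return X / dilution[:, None]
```

Taking the median of the quotients makes the estimated dilution factor robust to the minority of features that genuinely change between samples, which is why PQN is preferred over simple total-signal scaling when concentrations vary.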

  15. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfies specific design criteria. The new design method will graphically show how the discrete

  16. Assessment of Passive vs. Active Strategies for a School Building Design

    Directory of Open Access Journals (Sweden)

    Ji Eun Kang

    2015-11-01

This paper presents a simulation study to reduce the heating and cooling energy demand of a school building in the Seoul Metropolitan Area, Korea. The aim of this study was to estimate the impact of passive vs. active approaches on energy savings in buildings using EnergyPlus simulation. Controlling lighting yielded the most significant energy saving in the original school building design, and the saving increased by 32% when the design was improved. It is noteworthy that the energy saving potential of each room varies significantly depending on the room's thermal characteristics and orientation. Thus, the analysis of energy saving should be introduced at the individual space level, not at the whole building level. Additionally, simulation studies should be involved in rational decision-making. Finally, it was concluded that priority should be given to passive building design strategies, such as building orientation, as well as control and utilization of solar radiation. These passive energy saving strategies are related to urban, architectural design, and engineering issues, and are more beneficial in terms of energy savings than active strategies.

  17. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    OpenAIRE

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the co...

  18. Stratified random sampling plans designed to assist in the determination of radon and radon daughter concentrations in underground uranium mine atmosphere

    International Nuclear Information System (INIS)

    Makepeace, C.E.

    1981-01-01

Sampling strategies for the monitoring of deleterious agents present in uranium mine air in underground and surface mining areas are described. These methods are designed to prevent overexposure of the lining of the respiratory system of uranium miners to ionizing radiation from radon and radon daughters, and whole-body overexposure to external gamma radiation. A detailed description is provided of the stratified random sampling monitoring methodology for obtaining baseline data to be used as a reference for subsequent compliance assessment.
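A stratified random sampling plan of this kind can be sketched as follows; the stratum names and the 10% sampling fraction are purely illustrative, not taken from the report:

```python
import random

def stratified_random_sample(strata, fraction=0.1, seed=42):
    """Draw a stratified random sample: within each stratum (e.g. a mine
    zone grouped by expected radon level), units are selected by simple
    random sampling at the given fraction (at least one per stratum)."""
    rng = random.Random(seed)
    sample = {}
    for name, units in strata.items():
        n = max(1, round(fraction * len(units)))
        sample[name] = rng.sample(units, n)
    return sample

# Hypothetical monitoring locations grouped by mining zone:
strata = {"stope": list(range(40)), "haulage": list(range(25)),
          "surface": list(range(10))}
baseline = stratified_random_sample(strata)
```

Stratifying first guarantees that every zone contributes to the baseline data, which an unstratified random draw over all locations would not.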

  19. Development strategy and conceptual design of China Lead-based Research Reactor

    International Nuclear Information System (INIS)

    Wu, Yican; Bai, Yunqing; Song, Yong; Huang, Qunying; Zhao, Zhumin; Hu, Liqin

    2016-01-01

Highlights: • China LEAd-based Reactor (CLEAR), proposed by the Institute of Nuclear Energy Safety Technology (INEST), is selected as the ADS reference reactor. • The Chinese ADS development program consists of three stages; during the first stage, a 10 MWth lead-based research reactor named CLEAR-I will be built with subcritical and critical dual-mode operation capability for validation of ADS transmutation system and lead-cooled fast reactor technology. • Major design principles of CLEAR-I are oriented at technology feasibility, safety reliability, experiment flexibility and technology continuity. Following the development strategy and design principles, CLEAR-I design options and conceptual design scenarios are presented. - Abstract: The Chinese Academy of Sciences (CAS) launched an engineering project to develop an Accelerator Driven System (ADS) for nuclear waste transmutation in 2011, and China LEAd-based Reactor (CLEAR), proposed by the Institute of Nuclear Energy Safety Technology (INEST), is selected as the ADS reference reactor. In this paper, the development strategy and conceptual design of the China Lead-based Research Reactor are proposed. The Chinese ADS development program consists of three stages; during the first stage, a 10 MWth lead-based research reactor named CLEAR-I will be built with subcritical and critical dual-mode operation capability for validation of ADS transmutation system and lead-cooled fast reactor technology. Major design principles of CLEAR-I are oriented at technology feasibility, safety reliability, experiment flexibility and technology continuity. Following the development strategy and design principles, CLEAR-I design options and conceptual design scenarios are presented.

  20. Sampling and analysis strategies to support waste form qualification

    International Nuclear Information System (INIS)

    Westsik, J.H. Jr.; Pulsipher, B.A.; Eggett, D.L.; Kuhn, W.L.

    1989-04-01

    As part of the waste acceptance process, waste form producers will be required to (1) demonstrate that their glass waste form will meet minimum specifications, (2) show that the process can be controlled to consistently produce an acceptable waste form, and (3) provide documentation that the waste form produced meets specifications. Key to the success of these endeavors is adequate sampling and chemical and radiochemical analyses of the waste streams from the waste tanks through the process to the final glass product. This paper suggests sampling and analysis strategies for meeting specific statistical objectives of (1) detection of compositions outside specification limits, (2) prediction of final glass product composition, and (3) estimation of composition in process vessels for both reporting and guiding succeeding process steps. 2 refs., 1 fig., 3 tabs

  1. Sample design considerations of indoor air exposure surveys

    International Nuclear Information System (INIS)

    Cox, B.G.; Mage, D.T.; Immerman, F.W.

    1988-01-01

Concern about the potential for indoor air pollution has prompted recent surveys of radon and NO2 concentrations in homes and personal exposure studies of volatile organics, carbon monoxide and pesticides, to name a few. The statistical problems in designing sample surveys that measure the physical environment are diverse and more complicated than those encountered in traditional surveys of human attitudes and attributes. This paper addresses issues encountered when designing indoor air quality (IAQ) studies. General statistical concepts related to target population definition, frame creation, and sample selection for area household surveys and telephone surveys are presented. The implications of different measurement approaches are discussed, and response rate considerations are described.

  2. Optimizing sampling design to deal with mist-net avoidance in Amazonian birds and bats.

    Directory of Open Access Journals (Sweden)

    João Tiago Marques

Mist netting is a widely used technique to sample bird and bat assemblages. However, captures often decline with time because animals learn and avoid the locations of nets. This avoidance, or net shyness, can substantially decrease sampling efficiency. We quantified the day-to-day decline in captures of Amazonian birds and bats with mist nets set at the same location for four consecutive days. We also evaluated how net avoidance influences the efficiency of surveys under different logistic scenarios using re-sampling techniques. Net avoidance caused substantial declines in bird and bat captures, although more accentuated in the latter. Most of the decline occurred between the first and second days of netting: 28% in birds and 47% in bats. Captures of commoner species were more affected. The numbers of species detected also declined. Moving nets daily to minimize the avoidance effect increased captures by 30% in birds and 70% in bats. However, moving the location of nets may cause a reduction in netting time and captures. When moving the nets cost one netting day, it was no longer advantageous to move them frequently; in bird surveys this could even decrease the number of individuals captured and species detected. Net avoidance can greatly affect sampling efficiency, but adjustments in survey design can minimize this. Whenever nets can be moved without losing netting time and the objective is to capture many individuals, they should be moved daily. If the main objective is to survey the species present, then nets should still be moved for bats, but not for birds. However, if relocating nets causes a significant loss of netting time, moving them to reduce the effects of shyness will not improve sampling efficiency in either group. Overall, our findings can improve the design of mist-netting sampling strategies in other tropical areas.

  3. A cache-friendly sampling strategy for texture-based volume rendering on GPU

    Directory of Open Access Journals (Sweden)

    Junpeng Wang

    2017-06-01

Texture-based volume rendering is a memory-intensive algorithm whose performance relies heavily on the performance of the texture cache. However, most existing texture-based volume rendering methods blindly map computational resources to texture memory, resulting in incoherent memory access patterns and low cache hit rates in certain cases. The distance between samples taken by the threads of an atomic scheduling unit (e.g., a warp of 32 threads in CUDA) of the GPU is a crucial factor that affects texture cache performance. Based on this fact, we present a new sampling strategy, called Warp Marching, for the ray-casting algorithm of texture-based volume rendering. The effects of different sample organizations and different thread-pixel mappings in the ray-casting algorithm are thoroughly analyzed. In addition, a pipelined color-blending approach is introduced, and warp-level GPU operations are leveraged to improve the efficiency of parallel execution on the GPU. The rendering performance of Warp Marching is view-independent, and it outperforms existing empty-space-skipping techniques in scenarios that render large dynamic volumes at low image resolution. Through a series of micro-benchmarks and real-life data experiments, we rigorously analyze our sampling strategies and demonstrate significant performance enhancements over existing sampling methods.

  4. Strategy for thermo-gravimetric analysis of K East fuel samples

    International Nuclear Information System (INIS)

    Lawrence, L.A.

    1997-01-01

A strategy was developed for the Thermo-Gravimetric Analysis (TGA) testing of K East fuel samples for oxidation rate determinations. Tests will first establish whether there are any differences in dry air oxidation between the K West and K East fuel. These tests will be followed by moist inert gas oxidation rate measurements. The final series of tests will consider pure water vapor, i.e., steam.

  5. Energy efficiency design strategies for buildings with grid-connected photovoltaic systems

    Science.gov (United States)

    Yimprayoon, Chanikarn

    The building sector in the United States represents more than 40% of the nation's energy consumption. Energy efficiency design strategies and renewable energy are keys to reduce building energy demand. Grid-connected photovoltaic (PV) systems installed on buildings have been the fastest growing market in the PV industry. This growth poses challenges for buildings qualified to serve in this market sector. Electricity produced from solar energy is intermittent. Matching building electricity demand with PV output can increase PV system efficiency. Through experimental methods and case studies, computer simulations were used to investigate the priorities of energy efficiency design strategies that decreased electricity demand while producing load profiles matching with unique output profiles from PV. Three building types (residential, commercial, and industrial) of varying sizes and use patterns located in 16 climate zones were modeled according to ASHRAE 90.1 requirements. Buildings were analyzed individually and as a group. Complying with ASHRAE energy standards can reduce annual electricity consumption at least 13%. With energy efficiency design strategies, the reduction could reach up to 65%, making it possible for PV systems to meet reduced demands in residential and industrial buildings. The peak electricity demand reduction could be up to 71% with integration of strategies and PV. Reducing lighting power density was the best single strategy with high overall performances. Combined strategies such as zero energy building are also recommended. Electricity consumption reductions are the sum of the reductions from strategies and PV output. However, peak electricity reductions were less than their sum because they reduced peak at different times. The potential of grid stress reduction is significant. Investment incentives from government and utilities are necessary. 
The PV system sizes on net metering interconnection should not be limited by legislation existing in

  6. Design strategies for pollution prevention in industries (life cycle design)

    International Nuclear Information System (INIS)

    Saleemi, A.R.

    1997-01-01

Pollution prevention and the adoption of clean technologies in industry are the proper strategies to fight the growing industrial pollution in Pakistan. These strategies will not only reduce the existing pollution load but will also help achieve sustainable industrial development in Pakistan. It is well established that the concept of pollution prevention demands the use of minimum resources with maximum efficiency, yielding the double benefits of resource conservation and environmental protection. The application of cleaner production and waste minimization in thousands of industries in other parts of the world has proved beyond doubt that the use of cleaner technology is cheaper than installing waste treatment plants for end-of-pipe treatment. Waste treatment plants have been blamed for not solving any pollution problem, but only transferring pollution from one environmental medium to another. The adoption of waste treatment technologies has also created many other problems. Thousands of industries in the world have changed the focus of their activities from end-of-pipe treatment to pollution prevention techniques. It is the right time to start pollution prevention activities in industry. The design of a product system in industry can be represented logically as a series of decisions and choices made individually and collectively by design participants. The choices range from the selection of materials and manufacturing processes to choices relating to the shape, form and function of a product. Product life cycle design provides a logical system for addressing pollution prevention because the full range of environmental consequences associated with the product can be considered, and it is a powerful tool for identifying and maximizing the environmental benefits of pollution prevention.
The life cycle assessment (LCA) concept suggests that decision making should be based on consideration of the cradle-to-grave characteristics of the product, process

  7. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

In this study, a comparison has been made of different sampling designs using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99, collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been assessed using the bootstrap and the jackknife. A two-stage stratified random sample design is adopted by HIES. In the first stage, enumeration blocks and villages are treated as the first-stage primary sampling units (PSU). The sample PSUs are selected with probability proportional to size. Secondary sampling units (SSU), i.e., households, are selected by systematic sampling with a random start. HIES used a single study variable. We have compared the HIES technique with some other designs, namely: stratified simple random sampling, stratified systematic sampling, stratified ranked set sampling, and stratified two-phase sampling. Ratio and regression methods were applied with two study variables: income (y) and household size (x). The jackknife and bootstrap are used for replication-based variance estimation. Simple random sampling with sample sizes of 462 to 561 gave moderate variances both by jackknife and bootstrap. Applying systematic sampling, we obtained moderate variance with a sample size of 467. In the jackknife with systematic sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes of 467 to 631; at a sample size of 952, the variance of the ratio estimator became greater than that of the regression estimator. The most efficient design turned out to be ranked set sampling compared with the other designs. Ranked set sampling with jackknife and bootstrap gives minimum variance even with the smallest sample size (467). Two-phase sampling gave poor performance. The multi-stage sampling applied by HIES gave large variances, especially when used with a single study variable.
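The ratio and regression estimators compared in the study have standard textbook forms, sketched here under the assumption that the population mean of the auxiliary variable (household size, x) is known:

```python
import numpy as np

def ratio_estimate(y, x, x_pop_mean):
    """Ratio estimator of the population mean of y: (y_bar / x_bar) * X_bar."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    return y.mean() / x.mean() * x_pop_mean

def regression_estimate(y, x, x_pop_mean):
    """Linear regression estimator: y_bar + b * (X_bar - x_bar),
    with b the sample regression slope of y on x."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    return y.mean() + b * (x_pop_mean - x.mean())
```

When y is exactly proportional to x, both estimators agree, e.g. ratio_estimate([2, 4, 6], [1, 2, 3], 4.0) gives 8.0; their variances diverge as the y-x relationship departs from a line through the origin, which is the behavior the study's jackknife and bootstrap comparisons probe.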

  8. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    Science.gov (United States)

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
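The Monte Carlo comparison of sampling designs can be illustrated on an invented effort curve; the angler counts, number of periods, and sample sizes below are toy values, not the study's data. On a smooth within-day trend, a systematic sample spreads itself across the trend, which is why it typically achieves a lower MSE than simple random sampling:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented "angler counts" over 24 short periods with a smooth
# within-day trend; the true total effort is the sum over periods.
population = 50 + 30 * np.sin(np.linspace(0, np.pi, 24))
true_total = population.sum()

def srs_total(pop, n):
    """Estimate total effort from a simple random sample of n periods."""
    return rng.choice(pop, size=n, replace=False).mean() * pop.size

def sys_total(pop, n):
    """Systematic sample: every k-th period from a random start."""
    k = pop.size // n
    start = rng.integers(k)
    return pop[start::k][:n].mean() * pop.size

def mse(estimator, reps=5000):
    """Monte Carlo mean squared error of an effort estimator."""
    ests = np.array([estimator(population, 6) for _ in range(reps)])
    return np.mean((ests - true_total) ** 2)
```

Both designs are unbiased here; the systematic design wins on MSE purely through lower variance, mirroring the paper's finding that SYS halved the MSE of the cluster designs.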

  9. Rapid E-learning Development Strategies and a Multimedia Project Design Model

    Science.gov (United States)

    Sözcü, Ömer Faruk; Ipek, Ismail

    2014-01-01

    The purpose of the study is to discuss e-learning design strategies which can be used for multimedia projects as a design model. Recent advances in instructional technologies have been found to be very important in the design of training courses by using rapid instructional design (ID) approaches. The approaches were developed to use in training…

  10. On Adaptive Extended Different Life Cycle of Product Design Strategy

    Science.gov (United States)

    Wenwen, Jiang; Zhibin, Xie

The article follows the whole lifespan of a product and the development course of an enterprise to study a company's product design and development strategy. It shows that enterprises of different natures, and enterprises at different stages of development, adopt different strategy modes, and it reveals a close causal relationship between a company's development course and its core technology and products. The results indicate that at different development stages, such as the growth period, the crisis period, the sustained stability period, the recovery period, the secondary stability period, and the decline period, an enterprise should pursue different modes of research and development strategy, such as a retrenchment strategy, a consolidation strategy, or a sustained-innovation strategy. An enterprise should break with routine management modes and introduce different research and development modes to promote its competitiveness effectively.

  11. Sampling designs matching species biology produce accurate and affordable abundance indices

    Directory of Open Access Journals (Sweden)

    Grant Harris

    2013-12-01

Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption that all individuals have an equal probability of capture; violating this assumption biases results. Does an alternative exist? Perhaps sampling only where resources attract animals (i.e., targeted sampling) would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on global positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and whether traps were stationary or moved between capture sessions. We began by identifying when to sample, and whether bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results.
Both placed traps by expert opinion and moved traps between capture

  12. Sampling designs matching species biology produce accurate and affordable abundance indices.

    Science.gov (United States)

    Harris, Grant; Farley, Sean; Russell, Gareth J; Butler, Matthew J; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km(2) cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions
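The equal-catchability assumption this abstract turns on can be illustrated numerically. The sketch below is a toy model (plain Python, illustrative parameters only; the paper's simulations used GPS-derived encounter data, not this model): it applies Chapman's bias-corrected Lincoln-Petersen estimator to two simulated capture sessions, then shows how the estimate collapses toward the catchable subpopulation when some animals never visit the sampled area.

```python
import random

def chapman(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimator."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

def two_session_estimate(n_animals, p_capture, frac_absent, rng):
    """Two capture sessions; a fraction of animals never visits the
    sampled area (capture probability zero), violating equal capture."""
    catchable = range(int(n_animals * (1 - frac_absent)))
    s1 = {i for i in catchable if rng.random() < p_capture}
    s2 = {i for i in catchable if rng.random() < p_capture}
    return chapman(len(s1), len(s2), len(s1 & s2))

rng = random.Random(42)
N, reps = 420, 2000
est_equal = sum(two_session_estimate(N, 0.3, 0.0, rng) for _ in range(reps)) / reps
est_unequal = sum(two_session_estimate(N, 0.3, 0.3, rng) for _ in range(reps)) / reps
# est_equal lands near the true 420; est_unequal tracks only the ~294
# animals that ever enter the sampled area, biasing abundance low.
```

The second scenario is exactly the failure mode the abstract warns about for targeted sampling: individuals that never visit the resource have zero capture probability, so the index describes the visiting subpopulation, not total abundance.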

  13. Sampling designs matching species biology produce accurate and affordable abundance indices

    Science.gov (United States)

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions, which

  14. Lagoa Real design. Description and evaluation of sampling system

    International Nuclear Information System (INIS)

    Hashizume, B.K.

    1982-10-01

    This report describes the sample preparation system for drill cores from the Lagoa Real project, aimed at obtaining a representative fraction of the drill-core material. The combined sampling-plus-analysis error and the analytical accuracy were determined by delayed neutron analysis. (author)

  15. Collaboration During the NASA ABoVE Airborne SAR Campaign: Sampling Strategies Used by NGEE Arctic and Other Partners in Alaska and Western Canada

    Science.gov (United States)

    Wullschleger, S. D.; Charsley-Groffman, L.; Baltzer, J. L.; Berg, A. A.; Griffith, P. C.; Jafarov, E. E.; Marsh, P.; Miller, C. E.; Schaefer, K. M.; Siqueira, P.; Wilson, C. J.; Kasischke, E. S.

    2017-12-01

    There is considerable interest in using L- and P-band Synthetic Aperture Radar (SAR) data to monitor variations in aboveground woody biomass, soil moisture, and permafrost conditions in high-latitude ecosystems. Such information is useful for quantifying spatial heterogeneity in surface and subsurface properties, and for model development and evaluation. To conduct these studies, it is desirable that field studies share a common sampling strategy so that the data from multiple sites can be combined and used to analyze variations in conditions across different landscape geomorphologies and vegetation types. In 2015, NASA launched the decade-long Arctic-Boreal Vulnerability Experiment (ABoVE) to study the sensitivity and resilience of these ecosystems to disturbance and environmental change. NASA is able to leverage its remote sensing strengths to collect airborne and satellite observations to capture important ecosystem properties and dynamics across large spatial scales. A critical component of this effort includes collection of ground-based data that can be used to analyze, calibrate and validate remote sensing products. ABoVE researchers at a large number of sites located in important Arctic and boreal ecosystems in Alaska and western Canada are following common design protocols and strategies for measuring soil moisture, thaw depth, biomass, and wetland inundation. Here we elaborate on those sampling strategies as used in the 2017 summer SAR campaign and address the sampling design and measurement protocols for supporting the ABoVE aerial activities. Plot size, transect length, and distribution of replicates across the landscape systematically allowed investigators to optimally sample a site for soil moisture, thaw depth, and organic layer thickness. Specific examples and data sets are described for the Department of Energy's Next-Generation Ecosystem Experiments (NGEE Arctic) project field sites near Nome and Barrow, Alaska. Future airborne and satellite

  16. A Waterfall Design Strategy for Using Social Media for Instruction

    Science.gov (United States)

    Ahern, Terence C.

    2016-01-01

    Using social media can create a rich learning environment that crosses all content areas. The key to creating this environment is for instructors and designers to match appropriate social media software with the intended learning outcome. This article describes an instructional design strategy that helps educators create learning activities that…

  17. Using Learning Analytics to Characterize Student Experimentation Strategies in Engineering Design

    Science.gov (United States)

    Vieira, Camilo; Goldstein, Molly Hathaway; Purzer, Senay; Magana, Alejandra J.

    2016-01-01

    Engineering design is a complex process both for students to participate in and for instructors to assess. Informed designers use the key strategy of conducting experiments as they test ideas to inform next steps. Conversely, beginning designers experiment less, often with confounding variables. These behaviours are not easy to assess in…

  18. A radial sampling strategy for uniform k-space coverage with retrospective respiratory gating in 3D ultrashort-echo-time lung imaging.

    Science.gov (United States)

    Park, Jinil; Shin, Taehoon; Yoon, Soon Ho; Goo, Jin Mo; Park, Jang-Yeon

    2016-05-01

    The purpose of this work was to develop a 3D radial-sampling strategy which maintains uniform k-space sample density after retrospective respiratory gating, and to demonstrate its feasibility in free-breathing ultrashort-echo-time lung MRI. A multi-shot, interleaved 3D radial sampling function was designed by segmenting a single-shot trajectory of projection views such that each interleaf samples k-space in an incoherent fashion. An optimal segmentation factor for the interleaved acquisition was derived based on an approximate model of respiratory patterns such that radial interleaves are evenly accepted during the retrospective gating. The optimality of the proposed sampling scheme was tested by numerical simulations and phantom experiments using human respiratory waveforms. Retrospectively respiratory-gated, free-breathing lung MRI with the proposed sampling strategy was performed in healthy subjects. The simulation yielded the most uniform k-space sample density with the optimal segmentation factor, as evidenced by the smallest standard deviation of the number of neighboring samples as well as minimal side-lobe energy in the point spread function. The optimality of the proposed scheme was also confirmed by minimal image artifacts in phantom images. Human lung images showed that the proposed sampling scheme significantly reduced streak and ring artifacts compared with conventional retrospective respiratory gating while suppressing motion-related blurring compared with full sampling without respiratory gating. In conclusion, the proposed 3D radial-sampling scheme can effectively suppress image artifacts due to non-uniform k-space sample density in retrospectively respiratory-gated lung MRI by uniformly distributing gated radial views across k-space. Copyright © 2016 John Wiley & Sons, Ltd.
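The core idea of this record, ordering radial views so that whatever subset survives respiratory gating still covers k-space evenly, can be shown with a simplified 2D analogue (this is not the paper's 3D trajectory or its segmentation-factor derivation; the block-gating model and golden-angle ordering below are illustrative assumptions):

```python
import math

# 2D analogue: compare how uniformly the retained radial spokes cover
# k-space after a toy respiratory gate when views are ordered linearly
# versus with golden-angle increments.
GOLDEN = math.pi * (3 - math.sqrt(5))   # golden-angle increment, ~111.25 deg

def max_angular_gap(angles):
    """Largest gap between adjacent spokes on the half-circle [0, pi)."""
    a = sorted(x % math.pi for x in angles)
    gaps = [b - c for b, c in zip(a[1:], a)]
    gaps.append(a[0] + math.pi - a[-1])  # wrap-around gap
    return max(gaps)

n_views = 400
linear = [i * math.pi / n_views for i in range(n_views)]
golden = [(i * GOLDEN) % math.pi for i in range(n_views)]

# Toy retrospective gate: keep alternating blocks of 25 consecutive TRs,
# mimicking acceptance windows during quiet respiration.
kept = [i for i in range(n_views) if (i // 25) % 2 == 0]

gap_linear = max_angular_gap([linear[i] for i in kept])
gap_golden = max_angular_gap([golden[i] for i in kept])
# Gating punches large wedges out of the linearly ordered acquisition,
# while the incoherent (golden-angle) ordering keeps coverage nearly uniform.
```

The largest angular gap is a crude stand-in for the paper's density-uniformity metrics (standard deviation of neighboring samples, PSF side-lobe energy), but it captures why incoherent view ordering within each interleaf matters.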

  19. Design review report for rotary mode core sample truck (RMCST) modifications for flammable gas tanks, preliminary design

    International Nuclear Information System (INIS)

    Corbett, J.E.

    1996-02-01

    This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review

  20. Analytical strategies for uranium determination in natural water and industrial effluents samples

    International Nuclear Information System (INIS)

    Santos, Juracir Silva

    2011-01-01

    The work was developed under the project 993/2007 - 'Development of analytical strategies for uranium determination in environmental and industrial samples - Environmental monitoring in the Caetite city, Bahia, Brazil' and made possible through a partnership established between Universidade Federal da Bahia and the Comissao Nacional de Energia Nuclear. Strategies were developed for uranium determination in natural water and effluents of a uranium mine. The first one was a critical evaluation of the determination of uranium by inductively coupled plasma optical emission spectrometry (ICP OES) performed using factorial and Doehlert designs involving the factors: acid concentration, radio frequency power and nebuliser gas flow rate. Five emission lines were simultaneously studied (namely: 367.007, 385.464, 385.957, 386.592 and 409.013 nm), in the presence of HNO3, CH3COOH or HCl. The determinations in HNO3 medium were the most sensitive. Among the factors studied, the gas flow rate was the most significant for the five emission lines. Calcium caused interference in the emission intensity for some lines and iron did not interfere (at least up to 10 mg L-1) in the five lines studied. The presence of 13 other elements did not affect the emission intensity of uranium for the lines chosen. The optimized method, using the line at 385.957 nm, allows the determination of uranium with a limit of quantification of 30 μg L-1 and precision expressed as RSD lower than 2.2% for uranium concentrations of either 500 or 1000 μg L-1. In the second one, a highly sensitive flow-based procedure for uranium determination in natural waters is described. A 100-cm optical path flow cell based on a liquid-core waveguide (LCW) was exploited to increase the sensitivity of the arsenazo III method, aiming to achieve the limits established by environmental regulations. The flow system was designed with solenoid micro-pumps in order to improve mixing and minimize reagent consumption, as well as

  1. Design of strategies to assess lumbar posture during work.

    NARCIS (Netherlands)

    Burdorf, A.; Riel, van M.

    1996-01-01

    Quantitative characterization of postural load on the back should describe exposure patterns among workers and factors affecting these exposure patterns. This article presents general guidelines for designing appropriate measurement strategies; how to obtain detailed data with an applicable

  2. Recommended Immunological Strategies to Screen for Botulinum Neurotoxin-Containing Samples

    Directory of Open Access Journals (Sweden)

    Stéphanie Simon

    2015-11-01

    Full Text Available Botulinum neurotoxins (BoNTs) cause the life-threatening neurological illness botulism in humans and animals and are divided into seven serotypes (BoNT/A–G), of which serotypes A, B, E, and F cause the disease in humans. BoNTs are classified as “category A” bioterrorism threat agents and are relevant in the context of the Biological Weapons Convention. An international proficiency test (PT) was conducted to evaluate detection, quantification and discrimination capabilities of 23 expert laboratories from the health, food and security areas. Here we describe three immunological strategies that proved to be successful for the detection and quantification of BoNT/A, B, and E considering the restricted sample volume (1 mL) distributed. To analyze the samples qualitatively and quantitatively, the first strategy was based on sensitive immunoenzymatic and immunochromatographic assays for fast qualitative and quantitative analyses. In the second approach, a bead-based suspension array was used for screening followed by conventional ELISA for quantification. In the third approach, an ELISA plate format assay was used for serotype specific immunodetection of BoNT-cleaved substrates, detecting the activity of the light chain, rather than the toxin protein. The results provide guidance for further steps in quality assurance and highlight problems to address in the future.

  3. Recommended Immunological Strategies to Screen for Botulinum Neurotoxin-Containing Samples.

    Science.gov (United States)

    Simon, Stéphanie; Fiebig, Uwe; Liu, Yvonne; Tierney, Rob; Dano, Julie; Worbs, Sylvia; Endermann, Tanja; Nevers, Marie-Claire; Volland, Hervé; Sesardic, Dorothea; Dorner, Martin B

    2015-11-26

    Botulinum neurotoxins (BoNTs) cause the life-threatening neurological illness botulism in humans and animals and are divided into seven serotypes (BoNT/A-G), of which serotypes A, B, E, and F cause the disease in humans. BoNTs are classified as "category A" bioterrorism threat agents and are relevant in the context of the Biological Weapons Convention. An international proficiency test (PT) was conducted to evaluate detection, quantification and discrimination capabilities of 23 expert laboratories from the health, food and security areas. Here we describe three immunological strategies that proved to be successful for the detection and quantification of BoNT/A, B, and E considering the restricted sample volume (1 mL) distributed. To analyze the samples qualitatively and quantitatively, the first strategy was based on sensitive immunoenzymatic and immunochromatographic assays for fast qualitative and quantitative analyses. In the second approach, a bead-based suspension array was used for screening followed by conventional ELISA for quantification. In the third approach, an ELISA plate format assay was used for serotype specific immunodetection of BoNT-cleaved substrates, detecting the activity of the light chain, rather than the toxin protein. The results provide guidance for further steps in quality assurance and highlight problems to address in the future.

  4. Mechanical design and simulation of an automatized sample exchanger

    International Nuclear Information System (INIS)

    Lopez, Yon; Gora, Jimmy; Bedregal, Patricia; Hernandez, Yuri; Baltuano, Oscar; Gago, Javier

    2013-01-01

    The design of a turntable-type sample exchanger for irradiation, with a capacity of up to 20 capsules, was performed. Its function is the automatic sending of samples contained in polyethylene capsules to the grid position of the reactor core for irradiation, using a pneumatic system, and subsequent analysis by neutron activation. This study shows the structural design analysis and the calculations behind the selection of motors and actuators. This development will improve efficiency in the analysis, reducing manual handling by the workers and also the radiation exposure time. (authors).

  5. Diabetes City: How Urban Game Design Strategies Can Help Diabetics

    Science.gov (United States)

    Knöll, Martin

    Computer Games are about to leave their “electronic shells” and enter the city. So-called Serious Pervasive Games (SPGs) [1] allow for hybrid - simultaneously physical and virtual - experiences, applying technologies of ubiquitous computing, communication and “intelligent” interfaces. They begin to focus on non-entertainment purposes. The following article a) presents game design strategies as a missing link between pervasive computing, Ambient Intelligence and users' everyday life, and thereby spurs a discussion of how Pervasive Healthcare, focusing on the therapy and prevention of chronic diseases, can benefit from urban game design strategies; and b) presents the development and work in progress of “DiabetesCity“ - an educational game prototype for young diabetics.

  6. Multidimensional (OLAP) Analysis for Designing Dynamic Learning Strategy

    Science.gov (United States)

    Rozeva, A.; Deliyska, B.

    2010-10-01

    Learning strategy in an intelligent learning system is generally elaborated on the basis of assessment of the following factors: learner's time for reaction, content of the learning object, amount of learning material in a learning object, learning object specification, e-learning medium and performance control. Current work proposes architecture for dynamic learning strategy design by implementing multidimensional analysis model of learning factors. The analysis model concerns on-line analytical processing (OLAP) of learner's data structured as multidimensional cube. Main components of the architecture are analysis agent for performing the OLAP operations on learner data cube, adaptation generator and knowledge selection agent for performing adaptive navigation in the learning object repository. The output of the analysis agent is involved in dynamic elaboration of learning strategy that fits best to learners profile and behavior. As a result an adaptive learning path for individual learner and for learner groups is generated.

  7. Using visual thinking strategies with nursing students to enhance nursing assessment skills: A qualitative design.

    Science.gov (United States)

    Nanavaty, Joanne

    2018-03-01

    This qualitative design study addressed the enhancement of nursing assessment skills through the use of Visual Thinking Strategies and reflection. This study advances understanding of the use of Visual Thinking Strategies and reflection as ways to explore new methods of thinking and observing patient situations relating to health care. Sixty nursing students in a licensed practical nursing program made up the sample of participants who attended an art gallery as part of a class assignment. Participants replied to a survey of interest for participation at the art gallery. Participants reviewed artwork at the gallery and shared observations with the larger group during a post-conference session in a gathering area of the museum at the end of the visit. A reflective exercise on the art gallery experience exhibited further thoughts about the art gallery experience and demonstrated the connections made to clinical practice by the student. The findings of this study support the use of Visual Thinking Strategies and reflection as effective teaching and learning tools for enhancing nursing skills. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Outcome-Dependent Sampling Design and Inference for Cox’s Proportional Hazards Model

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P.; Zhou, Haibo

    2016-01-01

    We propose a cost-effective outcome-dependent sampling design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing real data from the Cancer Incidence and Mortality of Uranium Miners Study. PMID:28090134
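The bias-correction idea behind the weighted partial likelihood can be seen in a much simpler setting. The sketch below is not the authors' Cox-model estimator; it is a toy illustration of the same principle: an outcome-dependent design oversamples large outcomes, the unweighted mean is biased, and weighting each sampled unit by its inverse inclusion probability recovers the population quantity.

```python
import random

rng = random.Random(7)
population = [rng.gauss(10.0, 3.0) for _ in range(200_000)]  # true mean 10

def inclusion_prob(y):
    """Outcome-dependent design: large outcomes are heavily oversampled."""
    return 0.9 if y > 12.0 else 0.1

sample = [(y, inclusion_prob(y)) for y in population
          if rng.random() < inclusion_prob(y)]

naive = sum(y for y, _ in sample) / len(sample)
# Inverse-probability weighting undoes the biased sampling scheme:
ipw = sum(y / p for y, p in sample) / sum(1.0 / p for _, p in sample)
# naive is pulled well above the true mean of 10; ipw sits close to it.
```

The weights 1/p here play the same role as the weights in the paper's weighted partial likelihood estimating equation: units that were unlikely to be sampled stand in for more of the population.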

  9. Reinventing classics: the hidden design strategies of renowned chefs

    OpenAIRE

    Agogué, Marine; Hatchuel, Armand

    2015-01-01

    Reinventing classics is a well-used yet complex design pattern. Indeed, a reinterpreted classic needs to relate to the original object while simultaneously challenging the initial model and providing a new and fresh look to the well established classic. However, this design strategy remains understudied, and we aimed to contribute to the literature by addressing the lack of theoretical models for reinventing classics. Reinterpreting tradition is a key process for chefs...

  10. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research.

    Science.gov (United States)

    Palinkas, Lawrence A; Horwitz, Sarah M; Green, Carla A; Wisdom, Jennifer P; Duan, Naihua; Hoagwood, Kimberly

    2015-09-01

    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research.

  11. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research

    Science.gov (United States)

    Palinkas, Lawrence A.; Horwitz, Sarah M.; Green, Carla A.; Wisdom, Jennifer P.; Duan, Naihua; Hoagwood, Kimberly

    2013-01-01

    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research. PMID:24193818

  12. Instructional Design-Based Research on Problem Solving Strategies

    Science.gov (United States)

    Emre-Akdogan, Elçin; Argün, Ziya

    2016-01-01

    The main goal of this study is to find out the effect of the instructional design method on the enhancement of problem solving abilities of students. Teaching sessions were applied to ten students who are in 11th grade, to teach them problem solving strategies which are working backwards, finding pattern, adopting a different point of view,…

  13. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: concepts of population, sample, and sampling. Initial Ramifications: sampling design and sampling scheme; random numbers and their uses in simple random sampling (SRS); drawing simple random samples with and without replacement; estimation of mean, total, and ratio of totals/means; variance and variance estimation; determination of sample sizes; appendix: more on equal probability sampling, the Horvitz-Thompson estimator, sufficiency, likelihood, a non-existence theorem. More Intricacies: unequal probability sampling strategies; PPS sampling. Exploring Improved Ways: stratified sampling; cluster sampling; multi-stage sampling; multi-phase sampling (ratio and regression estimation); controlled sampling. Modeling: super-population modeling; prediction approach; model-assisted approach; Bayesian methods; spatial smoothing; sampling on successive occasions (panel rotation); non-response and not-at-homes; weighting adj...
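Two of the chapter topics listed above, unequal probability (PPS) sampling and the Horvitz-Thompson estimator, condense into a few lines. The sketch below uses illustrative toy data (not an example from the book): it draws a Poisson-PPS sample and estimates the population total by weighting each selected unit by its inverse inclusion probability.

```python
import random

rng = random.Random(1)
sizes = [rng.randint(1, 100) for _ in range(5_000)]      # auxiliary size measure
values = [2.0 * s + rng.gauss(0.0, 5.0) for s in sizes]  # study variable ~ size
true_total = sum(values)

expected_n = 500
total_size = sum(sizes)
pi = [min(1.0, expected_n * s / total_size) for s in sizes]  # PPS inclusion probs

def horvitz_thompson_total():
    """Poisson sampling: include unit i with probability pi[i],
    then weight its observed value by 1 / pi[i]."""
    return sum(v / p for v, p in zip(values, pi) if rng.random() < p)

reps = 200
avg_estimate = sum(horvitz_thompson_total() for _ in range(reps)) / reps
# The estimator is design-unbiased: averaged over repeated samples,
# the estimate settles near true_total.
```

Because the inclusion probabilities are roughly proportional to the study variable, the per-sample variance is small, the textbook motivation for PPS designs.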

  14. Population Pharmacokinetics and Optimal Sampling Strategy for Model-Based Precision Dosing of Melphalan in Patients Undergoing Hematopoietic Stem Cell Transplantation.

    Science.gov (United States)

    Mizuno, Kana; Dong, Min; Fukuda, Tsuyoshi; Chandra, Sharat; Mehta, Parinda A; McConnell, Scott; Anaissie, Elias J; Vinks, Alexander A

    2018-05-01

    High-dose melphalan is an important component of conditioning regimens for patients undergoing hematopoietic stem cell transplantation. The current dosing strategy based on body surface area results in a high incidence of oral mucositis and gastrointestinal and liver toxicity. Pharmacokinetically guided dosing will individualize exposure and help minimize overexposure-related toxicity. The purpose of this study was to develop a population pharmacokinetic model and optimal sampling strategy. A population pharmacokinetic model was developed with NONMEM using 98 observations collected from 15 adult patients given the standard dose of 140 or 200 mg/m2 by intravenous infusion. The determinant-optimal sampling strategy was explored with PopED software. Individual area under the curve estimates were generated by Bayesian estimation using full and the proposed sparse sampling data. The predictive performance of the optimal sampling strategy was evaluated based on bias and precision estimates. The feasibility of the optimal sampling strategy was tested using pharmacokinetic data from five pediatric patients. A two-compartment model best described the data. The final model included body weight and creatinine clearance as predictors of clearance. The determinant-optimal sampling times (and windows) were identified at 0.08 (0.08-0.19), 0.61 (0.33-0.90), 2.0 (1.3-2.7), and 4.0 (3.6-4.0) h post-infusion. An excellent correlation was observed between area under the curve estimates obtained with the full and the proposed four-sample strategy (R2 = 0.98). The proposed sampling strategy promises to achieve the target area under the curve as part of precision dosing.
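How four well-placed samples can recover an exposure metric can be sketched with a hypothetical two-compartment (biexponential) curve. The coefficients below are illustrative, not the paper's melphalan estimates, and the paper derived AUCs by Bayesian estimation with the population model rather than the simple linear-up/log-down trapezoidal rule used here:

```python
import math

# Hypothetical biexponential disposition curve (mg/L, 1/h); parameters
# are assumptions for illustration only.
A, alpha, B, beta = 40.0, 3.0, 10.0, 0.5

def conc(t):
    return A * math.exp(-alpha * t) + B * math.exp(-beta * t)

auc_true = A / alpha + B / beta        # analytic AUC(0-inf)

times = [0.08, 0.61, 2.0, 4.0]         # the D-optimal times from the abstract
c = [conc(t) for t in times]

# Linear trapezoid up to the first sample, log-down trapezoids between
# samples, and log-linear tail extrapolation from the last two samples.
auc = times[0] * (conc(0.0) + c[0]) / 2.0
for (t1, c1), (t2, c2) in zip(zip(times, c), zip(times[1:], c[1:])):
    auc += (c1 - c2) / math.log(c1 / c2) * (t2 - t1)
lam_z = math.log(c[-2] / c[-1]) / (times[-1] - times[-2])
auc += c[-1] / lam_z                   # tail: C_last / lambda_z
# For this curve the four-sample AUC lands within ~5-10% of the analytic value.
```

With a model-based (Bayesian) fit instead of this non-compartmental rule, the same four samples constrain the individual parameters directly, which is why the paper reports R2 = 0.98 against the full-sampling AUCs.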

  15. Fate of organic microcontaminants in wastewater treatment and river systems: An uncertainty assessment in view of sampling strategy, and compound consumption rate and degradability.

    Science.gov (United States)

    Aymerich, I; Acuña, V; Ort, C; Rodríguez-Roda, I; Corominas, Ll

    2017-11-15

    The growing awareness of the relevance of organic microcontaminants in the environment has led to a growing number of studies on the attenuation of these compounds in wastewater treatment plants (WWTPs) and rivers. However, the effects of the sampling strategies (frequency and duration of composite samples) on the attenuation estimates are largely unknown. Our goal was to assess how frequency and duration of composite samples influence the uncertainty of the attenuation estimates in WWTPs and rivers. Furthermore, we also assessed how compound consumption rate and degradability influence uncertainty. The assessment was conducted by simulating the integrated wastewater system of Puigcerdà (NE Iberian Peninsula) using a sewer pattern generator and a coupled model of WWTP and river. Results showed that the sampling strategy is especially critical at the influent of the WWTP, particularly when the number of toilet flushes containing the compound of interest is small (≤100 flushes containing the compound per day), and less critical at the effluent of the WWTP and in the river due to the mixing effects of the WWTP. For example, at the WWTP, when evaluating a compound that is present in 50 pulses per day using a 15-min sampling frequency to collect a 24-h composite sample, the attenuation uncertainty can range from 94% (0% degradability) to 9% (90% degradability). The estimation of attenuation in rivers is less critical than in WWTPs, as the attenuation uncertainty was lower than 10% for all evaluated scenarios. Interestingly, the errors in the estimates of attenuation are usually lower than those of loads for most sampling strategies and compound characteristics (e.g. consumption and degradability), although the opposite occurs for compounds with low consumption and inappropriate sampling strategies at the WWTP. Hence, when designing a sampling campaign, one should consider the influence of compounds' consumption and degradability as well as the desired level of accuracy in
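The influence of pulse count and aliquot frequency on composite-sample load estimates can be reproduced with a toy model (illustrative only; the paper used a sewer pattern generator coupled to WWTP and river models, and the pulse width and units below are assumptions). Each micropollutant pulse is modelled as a short dispersed peak, and a time-proportional composite as equally spaced grab aliquots:

```python
import math
import random

def load_error(n_pulses, interval_s, rng, sigma=60.0, day_s=86_400):
    """Relative error of a daily load estimated from a time-proportional
    composite sample (one grab aliquot every interval_s seconds). Each
    pulse is a Gaussian peak of width sigma seconds at the sampling
    point; concentration units are arbitrary."""
    t_pulse = [rng.uniform(0.0, day_s) for _ in range(n_pulses)]

    def conc(t):
        return sum(math.exp(-0.5 * ((t - tp) / sigma) ** 2) for tp in t_pulse)

    n_aliquots = day_s // interval_s
    mean_conc = sum(conc((i + 0.5) * interval_s)
                    for i in range(n_aliquots)) / n_aliquots
    estimate = mean_conc * day_s
    true_load = n_pulses * sigma * math.sqrt(2.0 * math.pi)  # pulse integrals
    return (estimate - true_load) / true_load

rng = random.Random(3)

def rms(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

coarse = rms([load_error(50, 7200, rng) for _ in range(100)])  # 2-h aliquots
fine = rms([load_error(50, 300, rng) for _ in range(100)])     # 5-min aliquots
many = rms([load_error(400, 300, rng) for _ in range(50)])     # more pulses/day
# Uncertainty drops sharply with finer aliquot spacing and with more pulses
# per day, mirroring the WWTP-influent findings in the abstract.
```

Coarse aliquots miss most of the short peaks, so the composite either over- or undershoots the true daily load; attenuation estimates inherit this error from the influent side.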

  16. Vernacular design based on sustainable disaster's mitigation communication and education strategy

    Science.gov (United States)

    Mansoor, Alvanov Zpalanzani

    2015-04-01

    Indonesia is located between three active tectonic plates, which are prone to natural disasters such as earthquake, volcanic eruption, and also giant tidal wave-tsunami. Adequate infrastructure plays an important role in disaster mitigation, yet without good public awareness, the mitigation process won't be succeeded. The absence of awareness can lead to infrastructure mistreatment. Several reports on lack of understanding or misinterpretation of disaster mitigation especially from rural and coastal communities need to be solved, especially from communication aspects. This is an interdisciplinary study on disaster mitigation communication design and education strategy from visual communication design studies paradigm. This paper depicts research results which applying vernacular design base to elaborate sustainable mitigation communication and education strategy on various visual media and social campaigns. This paper also describes several design approaches which may becomes way to elaborate sustainable awareness and understanding on disaster mitigation among rural and coastal communities in Indonesia.

  17. Comparison of active and passive sampling strategies for the monitoring of pesticide contamination in streams

    Science.gov (United States)

    Assoumani, Azziz; Margoum, Christelle; Guillemain, Céline; Coquery, Marina

    2014-05-01

The monitoring of water bodies for organic contaminants and the determination of reliable estimates of concentrations are challenging issues, in particular for the implementation of the Water Framework Directive. Several strategies can be applied to collect water samples for the determination of their contamination level. Grab sampling is fast, easy, and requires few logistical and analytical resources for low-frequency sampling campaigns. However, this technique lacks representativeness for streams with high variations in contaminant concentrations, such as pesticides in rivers located in small agricultural watersheds. Increasing the representativeness of this sampling strategy implies greater logistical needs and higher analytical costs. Average automated sampling is therefore a solution, as it allows, in a single analysis, the determination of more accurate and more relevant estimates of concentrations. Two types of automatic sampling can be performed: time-related sampling allows the assessment of average concentrations, whereas flow-dependent sampling leads to average flux concentrations. However, the purchase and maintenance of automatic samplers are quite expensive. Passive sampling has recently been developed as an alternative to grab or average automated sampling, to obtain more realistic estimates of the average concentrations of contaminants in streams at lower cost. These devices allow the passive accumulation of contaminants from large volumes of water, resulting in ultratrace-level detection and smoothed, integrative sampling over periods ranging from days to weeks. They allow the determination of time-weighted average (TWA) concentrations of the dissolved fraction of target contaminants, but they need to be calibrated in controlled conditions prior to field applications. In other words, the kinetics of the uptake of the target contaminants into the sampler must be studied in order to determine the corresponding sampling rate
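The TWA concentration mentioned in this record is conventionally recovered from the mass accumulated in the passive sampler via the calibrated sampling rate. The relation below is the widely used linear-uptake formula for passive samplers, not a formula stated in the abstract itself, and the example numbers are hypothetical:

```python
def twa_concentration(accumulated_mass_ng, sampling_rate_l_per_day,
                      deployment_days):
    """Time-weighted average concentration from a passive sampler in its
    linear uptake phase: C_TWA = N / (R_s * t), where N is the analyte mass
    accumulated in the sampler, R_s the calibrated sampling rate and t the
    deployment time. With the units used here the result is in ng/L."""
    return accumulated_mass_ng / (sampling_rate_l_per_day * deployment_days)

# e.g. 100 ng accumulated at R_s = 0.2 L/day over a 10-day deployment
c = twa_concentration(100.0, 0.2, 10)  # 50.0 ng/L
```

This is why the calibration step emphasized in the abstract matters: an error in the sampling rate R_s propagates directly into the reported TWA concentration.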

  18. AGROECOLOGY: PRINCIPLES AND STRATEGIES FOR THE DESIGN OF SUSTAINABLE AGROECOSYSTEMS

    Directory of Open Access Journals (Sweden)

    João Carlos Canuto

    2017-04-01

The theme of this paper is the debate on principles and strategies for designing sustainable agricultural systems. The paper builds on a broad approach to principles, moves to a more specific approach to strategies, and concludes with a micro-scale perspective on design practice and the consequences of each possible option. The first objective is to open to debate the dialectic between conceptual plurality and unity in Agroecology. The problem in focus is to situate more clearly what sustainable agroecosystems are and, as a consequence, how to connect principles and strategies to make them viable. As the theoretical reference, we use the classic authors of Agroecology and some critical articles on the conceptual question. The methodology underlying the approach is based on the author's theoretical and practical experience, with a qualitative, subjective and intuitive character. The results are presented as ideas intended to contribute to the conceptual debate now in vogue and also to glimpse, on a smaller scale, the practical issue of sustainable agroecosystem design.

  19. Designing System Reforms: Using a Systems Approach to Translate Incident Analyses into Prevention Strategies

    Science.gov (United States)

    Goode, Natassia; Read, Gemma J. M.; van Mulken, Michelle R. H.; Clacy, Amanda; Salmon, Paul M.

    2016-01-01

    Advocates of systems thinking approaches argue that accident prevention strategies should focus on reforming the system rather than on fixing the “broken components.” However, little guidance exists on how organizations can translate incident data into prevention strategies that address the systemic causes of accidents. This article describes and evaluates a series of systems thinking prevention strategies that were designed in response to the analysis of multiple incidents. The study was undertaken in the led outdoor activity (LOA) sector in Australia, which delivers supervised or instructed outdoor activities such as canyoning, sea kayaking, rock climbing and camping. The design process involved workshops with practitioners, and focussed on incident data analyzed using Rasmussen's AcciMap technique. A series of reflection points based on the systemic causes of accidents was used to guide the design process, and the AcciMap technique was used to represent the prevention strategies and the relationships between them, leading to the creation of PreventiMaps. An evaluation of the PreventiMaps revealed that all of them incorporated the core principles of the systems thinking approach and many proposed prevention strategies for improving vertical integration across the LOA system. However, the majority failed to address the migration of work practices and the erosion of risk controls. Overall, the findings suggest that the design process was partially successful in helping practitioners to translate incident data into prevention strategies that addressed the systemic causes of accidents; refinement of the design process is required to focus practitioners more on designing monitoring and feedback mechanisms to support decisions at the higher levels of the system. PMID:28066296

  20. Design development of robotic system for on line sampling in fuel reprocessing

    International Nuclear Information System (INIS)

    Balasubramanian, G.R.; Venugopal, P.R.; Padmashali, G.K.

    1990-01-01

This presentation describes the design and development work being carried out on an automated sampling system for fast reactor fuel reprocessing plants. The plant proposes to use an integrated sampling system. The sample is taken across regular process streams from any intermediate hold-up pot. A robot system is planned to take the sample from the sample pot, transfer it to the sample bottle, cap the bottle and transfer the bottle to a pneumatic conveying station. The system covers a large number of sample pots. Alternative automated systems are also examined (1). (author). 4 refs., 2 figs

  1. Stratified sampling design based on data mining.

    Science.gov (United States)

    Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung

    2013-09-01

Our objective was to explore classification rules, based on data mining methodologies, to be used in defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics and then constructed decision trees on the cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, and population data from Statistics Korea. From our database, we used the data for single-specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011. Data mining resulted in five strata in general surgery with two stratification variables, the number of inpatients per specialist and the population density of the provider location, and five strata in ophthalmology with two stratification variables, the number of inpatients per specialist and the number of beds. The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by the type of provider location and number of beds explained 2% and 0.2% of the variance, respectively. This study demonstrated that data mining methods can be used in designing efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.
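The first step of the two-stage procedure in this record (cluster providers, then derive stratification rules from the cluster labels) can be sketched with a minimal stdlib-only k-means. The provider data, feature choices and cluster count below are hypothetical stand-ins for the claims-based variables used in the study:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal Lloyd's k-means for 2-D points (pure-stdlib sketch)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)

    def nearest(p):
        return min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2
                                           + (p[1] - centroids[c][1]) ** 2)

    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[nearest(p)].append(p)
        # recompute centroids; keep the old one if a cluster went empty
        centroids = [
            (sum(x for x, _ in cl) / len(cl), sum(y for _, y in cl) / len(cl))
            if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, [nearest(p) for p in points]

# Hypothetical providers: (inpatients per specialist, number of beds)
rng = random.Random(42)
providers = [(rng.gauss(m, 3), rng.gauss(b, 8))
             for m, b in [(10, 30), (40, 80), (80, 150)]
             for _ in range(30)]
centroids, labels = kmeans(providers, k=3)
# In the study, such cluster labels were then fed to a decision tree whose
# splits on the stratification variables become the stratum boundaries.
```

In practice one would use a library implementation (e.g. scikit-learn's KMeans and DecisionTreeClassifier); the sketch only shows how cluster labels turn a clustering into candidate strata.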

  2. Reasoning Strategies in the Context of Engineering Design with Everyday Materials

    Science.gov (United States)

    Worsley, Marcelo; Blikstein, Paulo

    2016-01-01

"Making" represents an increasingly popular label for describing a form of engineering design. While making is growing in popularity, there are still open questions about the strategies that students are using in these activities. Assessing and improving learning in making/engineering design contexts require that we have a better…

  3. Latent spatial models and sampling design for landscape genetics

    Science.gov (United States)

    Hanks, Ephraim M.; Hooten, Mevin B.; Knick, Steven T.; Oyler-McCance, Sara J.; Fike, Jennifer A.; Cross, Todd B.; Schwartz, Michael K.

    2016-01-01

    We propose a spatially-explicit approach for modeling genetic variation across space and illustrate how this approach can be used to optimize spatial prediction and sampling design for landscape genetic data. We propose a multinomial data model for categorical microsatellite allele data commonly used in landscape genetic studies, and introduce a latent spatial random effect to allow for spatial correlation between genetic observations. We illustrate how modern dimension reduction approaches to spatial statistics can allow for efficient computation in landscape genetic statistical models covering large spatial domains. We apply our approach to propose a retrospective spatial sampling design for greater sage-grouse (Centrocercus urophasianus) population genetics in the western United States.

  4. Integrated design strategy for product life-cycle management

    Science.gov (United States)

    Johnson, G. Patrick

    2001-02-01

    Two major trends suggest new considerations for environmentally conscious manufacturing (ECM) -- the continuation of dematerialization and the growing trend toward goods becoming services. A diversity of existing research could be integrated around those trends in ways that can enhance ECM. Major research-based achievements in information, computation, and communications systems, sophisticated and inexpensive sensing capabilities, highly automated and precise manufacturing technologies, and new materials continue to drive the phenomenon of dematerialization - the reduction of the material and energy content of per capita GDP. Knowledge is also growing about the sociology, economics, mathematics, management and organization of complex socio-economic systems. And that has driven a trend towards goods evolving into services. But even with these significant trends, the value of material, energy, information and human resources incorporated into the manufacture, use and disposal of modern products and services often far exceeds the benefits realized. Multi-disciplinary research integrating these drivers with advances in ECM concepts could be the basis for a new strategy of production. It is argued that a strategy of integrating information resources with physical and human resources over product life cycles, together with considering products as streams of service over time, could lead to significant economic payoff. That strategy leads to an overall design concept to minimize costs of all resources over the product life cycle to more fully capture benefits of all resources incorporated into modern products. It is possible by including life cycle monitoring, periodic component replacement, re-manufacture, salvage and human factor skill enhancement into initial design.

  5. Visual Sample Plan (VSP) Software: Designs and Data Analyses for Sampling Contaminated Buildings

    International Nuclear Information System (INIS)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Nuffer, Lisa L.; Hassig, Nancy L.

    2005-01-01

A new module of the Visual Sample Plan (VSP) software has been developed to provide sampling designs and data analyses for potentially contaminated buildings. An important application is assessing levels of contamination in buildings after a terrorist attack. This new module, funded by DHS through the Combating Terrorism Technology Support Office, Technical Support Working Group, was developed to provide a tailored, user-friendly and visually-orientated buildings module within the existing VSP software toolkit, the latest version of which can be downloaded from http://dqo.pnl.gov/vsp. In case of, or when planning against, a chemical, biological, or radionuclide release within a building, the VSP module can be used to quickly and easily develop and visualize technically defensible sampling schemes for walls, floors, ceilings, and other surfaces to statistically determine if contamination is present, its magnitude and extent throughout the building, and if decontamination has been effective. This paper demonstrates the features of this new VSP buildings module, which include: the ability to import building floor plans or to easily draw, manipulate, and view rooms in several ways; being able to insert doors, windows and annotations into a room; 3-D graphic room views with surfaces labeled and floor plans that show building zones that have separate air handling units. The paper will also discuss the statistical design and data analysis options available in the buildings module. Design objectives supported include comparing an average to a threshold when the data distribution is normal or unknown, and comparing measurements to a threshold to detect hotspots or to ensure most of the area is uncontaminated when the data distribution is normal or unknown.

  6. An Integrated Mixed Methods Research Design: Example of the Project Foreign Language Learning Strategies and Achievement: Analysis of Strategy Clusters and Sequences

    OpenAIRE

    Vlčková Kateřina

    2014-01-01

The presentation focused on a so-called integrated mixed methods research design, using as an example the Czech Science Foundation project no. GAP407/12/0432, "Foreign Language Learning Strategies and Achievement: Analysis of Strategy Clusters and Sequences". All main integrated parts of the mixed methods research design were discussed: the aim, theoretical framework, research questions, methods and validity threats.

  7. A design strategy for magnetorheological dampers using porous valves

    International Nuclear Information System (INIS)

    Hu, W; Robinson, R; Wereley, N M

    2009-01-01

To design a porous-valve-based magnetorheological (MR) damper, essential design parameters are presented. The key elements affecting the damper performance are identified using flow analysis in porous media and an empirical magnetic field distribution in the porous valve. Based on a known MR fluid, the relationship between the controllable force of the damper and the porous valve characteristics, i.e. porosity and tortuosity, is developed. The effect of the porosity and tortuosity on the field-off damping force is exploited by using semi-empirical flow analysis. The critical flow rate for the onset of nonlinear viscous damping force is determined. Using the above design elements, an MR damper using a by-pass porous valve is designed and tested. The experimental damper force and equivalent damping are compared with the predicted results to validate this design strategy.

  8. A design strategy for magnetorheological dampers using porous valves

    Energy Technology Data Exchange (ETDEWEB)

    Hu, W; Robinson, R; Wereley, N M [Smart Structures Laboratory, Alfred Gessow Rotorcraft Center, Department of Aerospace Engineering, University of Maryland, College Park, MD 20742 (United States)], E-mail: wereley@umd.edu

    2009-02-01

To design a porous-valve-based magnetorheological (MR) damper, essential design parameters are presented. The key elements affecting the damper performance are identified using flow analysis in porous media and an empirical magnetic field distribution in the porous valve. Based on a known MR fluid, the relationship between the controllable force of the damper and the porous valve characteristics, i.e. porosity and tortuosity, is developed. The effect of the porosity and tortuosity on the field-off damping force is exploited by using semi-empirical flow analysis. The critical flow rate for the onset of nonlinear viscous damping force is determined. Using the above design elements, an MR damper using a by-pass porous valve is designed and tested. The experimental damper force and equivalent damping are compared with the predicted results to validate this design strategy.

  9. Sampling strategies for tropical forest nutrient cycling studies: a case study in São Paulo, Brazil

    Directory of Open Access Journals (Sweden)

    G. Sparovek

    1997-12-01

The precise sampling of soil, biological or microclimatic attributes in tropical forests, which are characterized by a high diversity of species and complex spatial variability, is a difficult task. We found few basic studies to guide sampling procedures. The objective of this study was to define a sampling strategy and data analysis for some parameters frequently used in nutrient cycling studies, i.e., litter amount, total nutrient amounts in litter and its composition (Ca, Mg, K, N and P), and soil attributes at three depths (organic matter, P content, cation exchange capacity and base saturation). A natural remnant forest in the west of São Paulo State (Brazil) was selected as the study area and samples were collected in July 1989. The total amount of litter and its total nutrient amounts had a high, spatially independent variance. Conversely, the variance of litter composition was lower and the spatial dependency was peculiar to each nutrient. The sampling strategy for the estimation of litter amounts and the amount of nutrients in litter should be different from the sampling strategy for nutrient composition. For the estimation of litter amounts and the amount of nutrients in litter (related to quantity), a large number of randomly distributed determinations are needed. Otherwise, for the estimation of litter nutrient composition (related to quality), a smaller number of spatially located samples should be analyzed. The determination of sampling for soil attributes differed according to depth. Overall, surface samples (0-5 cm) showed high short-distance spatially dependent variance, whereas subsurface samples exhibited spatial dependency over longer distances. Short transects with a sampling interval of 5-10 m are recommended for surface sampling. Subsurface samples must also be spatially located, but with transects or grids with longer distances between sampling points over the entire area. Composite soil samples would not provide a complete

  10. Clinical usefulness of limited sampling strategies for estimating AUC of proton pump inhibitors.

    Science.gov (United States)

    Niioka, Takenori

    2011-03-01

Cytochrome P450 (CYP) 2C19 (CYP2C19) genotype is regarded as a useful tool to predict the area under the blood concentration-time curve (AUC) of proton pump inhibitors (PPIs). In our results, however, CYP2C19 genotype had no influence on the AUC of any PPI during fluvoxamine treatment. These findings suggest that CYP2C19 genotyping is not always a good indicator for estimating the AUC of PPIs. Limited sampling strategies (LSS) were developed to estimate the AUC simply and accurately. It is important to minimize the number of blood samples for the sake of patient acceptance. This article reviews the usefulness of LSS for estimating the AUC of three PPIs (omeprazole: OPZ, lansoprazole: LPZ and rabeprazole: RPZ). The best prediction formulas for each PPI were AUC(OPZ) = 9.24 × C(6h) + 2638.03, AUC(LPZ) = 12.32 × C(6h) + 3276.09 and AUC(RPZ) = 1.39 × C(3h) + 7.17 × C(6h) + 344.14, respectively. In order to optimize the sampling strategy for LPZ, we tried to establish an LSS for LPZ using a time point within 3 hours, based on the pharmacokinetic properties of its enantiomers. The best prediction formula using the fewest sampling points (one point) was AUC(racemic LPZ) = 6.5 × C(3h) of (R)-LPZ + 13.7 × C(3h) of (S)-LPZ − 9917.3 × G1 − 14387.2 × G2 + 7103.6 (G1: homozygous extensive metabolizer is 1 and the other genotypes are 0; G2: heterozygous extensive metabolizer is 1 and the other genotypes are 0). Such strategies, with plasma concentration monitoring at one or two time points, might be more suitable for AUC estimation than reference to CYP2C19 genotypes, particularly in the case of coadministration of CYP mediators.
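The limited-sampling prediction formulas quoted in this abstract translate directly into code. The coefficients are taken verbatim from the record; concentration and AUC units are those of the original study (not stated in the abstract), and the genotype encoding of G1/G2 follows the abstract's definition:

```python
def auc_opz(c6h):
    """Omeprazole: AUC = 9.24 * C(6h) + 2638.03."""
    return 9.24 * c6h + 2638.03

def auc_lpz(c6h):
    """Lansoprazole: AUC = 12.32 * C(6h) + 3276.09."""
    return 12.32 * c6h + 3276.09

def auc_rpz(c3h, c6h):
    """Rabeprazole: AUC = 1.39 * C(3h) + 7.17 * C(6h) + 344.14."""
    return 1.39 * c3h + 7.17 * c6h + 344.14

def auc_racemic_lpz(c3h_r, c3h_s, genotype):
    """One-point formula for racemic lansoprazole from the 3 h enantiomer
    concentrations. genotype: 'hom_EM' (homozygous extensive metabolizer),
    'het_EM' (heterozygous EM), or anything else for the other genotypes."""
    g1 = 1.0 if genotype == "hom_EM" else 0.0
    g2 = 1.0 if genotype == "het_EM" else 0.0
    return 6.5 * c3h_r + 13.7 * c3h_s - 9917.3 * g1 - 14387.2 * g2 + 7103.6
```

Each helper needs only one or two measured concentrations, which is the whole point of a limited sampling strategy.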

  11. Strategies in architectural design and urban planning in the context of energy efficiency in buildings

    Directory of Open Access Journals (Sweden)

    Vuksanović Dušan

    2007-01-01

This paper analyzes design concepts in architecture and urban planning that arise from energy-efficiency demands and are applied in the early, schematic stages of the design process, i.e., in the phase of creating the basis of an architectural or planning solution. These design strategies must be comprehensive enough for their key potentials to be applied, yet at the same time remain simple enough not to burden the designer with an excessive amount of information. Design models for passive heating, passive cooling and natural lighting, referring mainly to buildings, have been considered, together with the principles for settlements or building groups. Steering a design concept towards one of the described design principles, e.g. their application within diurnal and seasonal cycles, depends on local climatic conditions and the type of building (residential, commercial or educational). The presentation of each model is followed by an explanation of the influences (climate, program) and responses (concept, passive components) related to the strategy, and by an illustration of the strategy on a realized object (case study). Design strategies for energy efficiency are considered at different levels, e.g. the spatial organization, form and added components of buildings, as well as the structure and characteristics of the elements of the external envelope - facades and roofs.

  12. Supply chain implications of sustainable design strategies for electronics products

    OpenAIRE

    De Coster, R; Bateman, RJ; Plant, AVC

    2012-01-01

Increasing legislative and consumer pressures on manufacturers to improve sustainability necessitate that manufacturers consider the overall life cycle and not take a narrowly restricted scope when creating products. Product strategies to improve sustainability have design implications, as many of the decisions made during the design stage will determine the environmental performance of the final product. Coordination across the supply chain is potentially beneficial, as products with improved energy ef...

  13. Achieving consensus in robot swarms design and analysis of strategies for the best-of-n problem

    CERN Document Server

    Valentini, Gabriele

    2017-01-01

This book focuses on the design and analysis of collective decision-making strategies for the best-of-n problem. After providing a formalization of the structure of the best-of-n problem, supported by a comprehensive survey of the swarm robotics literature, it introduces the functioning of a collective decision-making strategy and identifies a set of mechanisms that are essential for a strategy to solve the best-of-n problem. The best-of-n problem is an abstraction that captures the frequent requirement that a robot swarm choose one option from a finite set when optimizing benefits and costs. The book leverages the identification of these mechanisms to develop a modular and model-driven methodology to design collective decision-making strategies and to analyze their performance at different levels of abstraction. Lastly, the author provides a series of case studies in which the proposed methodology is used to design different strategies, using robot experiments to show how the designed strategies can b...

  14. Females' sampling strategy to comparatively evaluate prospective mates in the peacock blenny Salaria pavo

    Science.gov (United States)

    Locatello, Lisa; Rasotto, Maria B.

    2017-08-01

Emerging evidence suggests the occurrence of comparative decision-making processes in mate choice, questioning the traditional idea of female choice based on rules of absolute preference. In such a scenario, females are expected to use a typical best-of-n sampling strategy, being able to recall previously sampled males based on memory of their quality and location. Accordingly, the quality of the preferred mate is expected to be unrelated to both the number and the sequence of female visits. We found support for these predictions in the peacock blenny, Salaria pavo, a fish whose females have the opportunity to evaluate the attractiveness of many males in a short time period and in a restricted spatial range. Indeed, even considering the variability in preference among females, most of them returned to previously sampled males for further evaluations; thus, the preferred male was not the last one in the sequence of visited males. Moreover, there was no relationship between the attractiveness of the preferred male and the number of further visits assigned to the other males. Our results suggest the occurrence of a best-of-n mate sampling strategy in the peacock blenny.
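The best-of-n rule contrasted in this record with absolute-preference rules can be illustrated with a toy simulation: under best-of-n the female samples all n males and returns to the best, while under a fixed-threshold rule she accepts the first male above an absolute threshold. The quality distribution, threshold and trial counts below are arbitrary illustrative choices:

```python
import random

def best_of_n(qualities):
    """Best-of-n rule: sample all n candidates, return to the best one."""
    return max(qualities)

def fixed_threshold(qualities, threshold):
    """Absolute-preference rule: accept the first candidate whose quality
    exceeds the threshold; otherwise settle for the last one visited."""
    for q in qualities:
        if q >= threshold:
            return q
    return qualities[-1]

rng = random.Random(7)
trials = [[rng.random() for _ in range(5)] for _ in range(2000)]
mean_best = sum(best_of_n(t) for t in trials) / len(trials)
mean_thr = sum(fixed_threshold(t, 0.7) for t in trials) / len(trials)
# Averaged over many choosers, best-of-n yields a higher mean mate quality.
```

The simulation only illustrates the decision rules; it makes no claim about the costs of revisiting males, which the comparative-sampling literature treats separately.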

  15. Optimal Sizing and Control Strategy Design for Heavy Hybrid Electric Truck

    Directory of Open Access Journals (Sweden)

    Yuan Zou

    2012-01-01

Due to the complexity of the hybrid powertrain, the control is highly involved, as it must improve the collaboration of the different components. For a specific powertrain, the components' sizing merely provides the possibility of propelling the vehicle, while the control realizes the propulsion function. The components' sizing also imposes constraints on the control design, which causes a close coupling between sizing and control strategy design. This paper presents a parametric study focused on the sizing of the powertrain components and the optimization of the power split between the engine and the electric motor for minimizing fuel consumption. A framework is put forward to accomplish the optimal sizing and control design for a heavy parallel pre-AMT hybrid truck under a natural driving schedule. The iterative plant-controller combined optimization methodology is adopted to optimize the key parameters of the plant and the control strategy simultaneously. A scalable powertrain model based on a bilevel optimization framework is built. Dynamic programming is applied to find the optimal control in the inner loop with a prescribed cycle. The parameters are optimized in the outer loop. The results are analysed and the optimal sizing and control strategy are achieved simultaneously.
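The inner-loop dynamic programming step mentioned in this record can be sketched on a toy power-split problem: a backward recursion over battery state-of-charge chooses, at each step of a demand profile, how much power the motor contributes. The integer state grid, quadratic fuel model and all numbers are illustrative, not the paper's truck model:

```python
def optimal_split(demand, soc0=5, soc_max=10):
    """Backward DP over battery state-of-charge (integer units).
    Motor power of +1 unit drains 1 SOC unit per step, -1 recharges it;
    the engine covers the remaining demand at a quadratic fuel cost."""
    horizon = len(demand)
    INF = float("inf")
    cost = [[INF] * (soc_max + 1) for _ in range(horizon + 1)]
    for s in range(soc_max + 1):
        cost[horizon][s] = 0.0            # no terminal SOC penalty (toy choice)
    choice = [[None] * (soc_max + 1) for _ in range(horizon)]
    for t in range(horizon - 1, -1, -1):
        for s in range(soc_max + 1):
            for motor in (-1, 0, 1):      # motor power, in demand units
                s_next = s - motor
                if not 0 <= s_next <= soc_max:
                    continue
                engine = demand[t] - motor
                if engine < 0:
                    continue
                fuel = 0.1 * engine ** 2  # toy fuel-rate model
                c = fuel + cost[t + 1][s_next]
                if c < cost[t][s]:
                    cost[t][s], choice[t][s] = c, motor
    # roll forward from the initial SOC to recover the (engine, motor) split
    plan, s = [], soc0
    for t in range(horizon):
        m = choice[t][s]
        plan.append((demand[t] - m, m))
        s -= m
    return cost[0][soc0], plan

total, plan = optimal_split([2, 3, 1, 4, 2])
```

In the paper's framework this DP sits in the inner loop; the outer loop would re-run it while varying component sizes (which here would change the motor limits and the fuel model).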

  16. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

Under cohort sampling designs, additional covariate data are collected on cases of a specific type and on a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, interest may arise in reusing them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  17. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

In recent decades, an increasing number of studies have analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100%. Given that most previous
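The non-robust method-of-moments estimator referred to in this record is the classical Matheron estimator, which can be sketched in a few lines; the bin width and the synthetic spatially correlated field below are illustrative:

```python
import itertools

def empirical_variogram(coords, values, bin_width=2.0, n_bins=5):
    """Classical method-of-moments (Matheron) estimator:
    gamma(h) = (1 / (2 * N(h))) * sum over pairs at lag ~h of (z_i - z_j)^2,
    with lags grouped into distance bins of width bin_width."""
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for i, j in itertools.combinations(range(len(values)), 2):
        dx = coords[i][0] - coords[j][0]
        dy = coords[i][1] - coords[j][1]
        lag = (dx * dx + dy * dy) ** 0.5
        b = int(lag // bin_width)
        if b < n_bins:
            sums[b] += (values[i] - values[j]) ** 2
            counts[b] += 1
    return [s / (2 * c) if c else None for s, c in zip(sums, counts)]

# Synthetic field on a 10 x 10 grid whose value grows with x, so the
# semivariance should increase with lag distance.
coords = [(x, y) for x in range(10) for y in range(10)]
values = [float(x) for x, _ in coords]
gamma = empirical_variogram(coords, values)
```

Because every squared difference enters the sum, a single large outlier inflates the estimate at all lags, which is exactly why the study also considers robust estimators and residual maximum likelihood.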

  18. Effects of Direct Fuel Injection Strategies on Cycle-by-Cycle Variability in a Gasoline Homogeneous Charge Compression Ignition Engine: Sample Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Jacek Hunicz

    2015-01-01

In this study we summarize and analyze experimental observations of cyclic variability in homogeneous charge compression ignition (HCCI) combustion in a single-cylinder gasoline engine. The engine was configured with negative valve overlap (NVO) to trap residual gases from prior cycles and thus enable auto-ignition in successive cycles. Correlations were developed between different fuel injection strategies and cycle-average combustion and work output profiles. Hypothesized physical mechanisms based on these correlations were then compared with trends in cycle-by-cycle predictability as revealed by sample entropy. The results of these comparisons help to clarify how fuel injection strategy can interact with prior-cycle effects to affect combustion stability, and so contribute to the design of control methods for HCCI engines.

  19. Vernacular design based on sustainable disaster’s mitigation communication and education strategy

    International Nuclear Information System (INIS)

    Mansoor, Alvanov Zpalanzani

    2015-01-01

    Indonesia is located between three active tectonic plates, which makes it prone to natural disasters such as earthquakes, volcanic eruptions, and giant tidal waves (tsunamis). Adequate infrastructure plays an important role in disaster mitigation, yet without good public awareness the mitigation process will not succeed. The absence of awareness can lead to infrastructure mistreatment. Several reported cases of lack of understanding or misinterpretation of disaster mitigation, especially among rural and coastal communities, need to be addressed, particularly from the communication side. This is an interdisciplinary study of disaster mitigation communication design and education strategy from the paradigm of visual communication design studies. This paper presents research results that apply a vernacular design basis to elaborate a sustainable mitigation communication and education strategy across various visual media and social campaigns. It also describes several design approaches that may become ways to build sustainable awareness and understanding of disaster mitigation among rural and coastal communities in Indonesia

  20. Vernacular design based on sustainable disaster’s mitigation communication and education strategy

    Energy Technology Data Exchange (ETDEWEB)

    Mansoor, Alvanov Zpalanzani, E-mail: nova.zp@gmail.com, E-mail: alvanov@fsrd.itb.ac.id [Visual Communication Design Study Program, Faculty of Art and Design, Institut Teknologi Bandung Jalan Ganesa No. 10, Bandung 40132 (Indonesia)

    2015-04-24

    Indonesia is located between three active tectonic plates, which makes it prone to natural disasters such as earthquakes, volcanic eruptions, and giant tidal waves (tsunamis). Adequate infrastructure plays an important role in disaster mitigation, yet without good public awareness the mitigation process will not succeed. The absence of awareness can lead to infrastructure mistreatment. Several reported cases of lack of understanding or misinterpretation of disaster mitigation, especially among rural and coastal communities, need to be addressed, particularly from the communication side. This is an interdisciplinary study of disaster mitigation communication design and education strategy from the paradigm of visual communication design studies. This paper presents research results that apply a vernacular design basis to elaborate a sustainable mitigation communication and education strategy across various visual media and social campaigns. It also describes several design approaches that may become ways to build sustainable awareness and understanding of disaster mitigation among rural and coastal communities in Indonesia.

  1. Addressing Underrepresentation in Sex Work Research: Reflections on Designing a Purposeful Sampling Strategy.

    Science.gov (United States)

    Bungay, Vicky; Oliffe, John; Atchison, Chris

    2016-06-01

    Men, transgender people, and those working in off-street locales have historically been underrepresented in sex work health research. Failure to include all sections of sex worker populations precludes comprehensive understandings about a range of population health issues, including potential variations in the manifestation of such issues within and between population subgroups, which in turn can impede the development of effective services and interventions. In this article, we describe our attempts to define, determine, and recruit a purposeful sample for a qualitative study examining the interrelationships between sex workers' health and the working conditions in the Vancouver off-street sex industry. Detailed is our application of ethnographic mapping approaches to generate information about population diversity and work settings within distinct geographical boundaries. Bearing in mind the challenges and the overwhelming discrimination sex workers experience, we scope recommendations for safe and effective purposeful sampling inclusive of sex workers' heterogeneity. © The Author(s) 2015.

  2. A Strategy for Material-specific e-Textile Interaction Design

    DEFF Research Database (Denmark)

    Gowrishankar, Ramyah; Bredies, Katharina; Ylirisku, Salu

    2017-01-01

    The interaction design of e-Textile products is often characterized by conventions adopted from electronic devices rather than by interactions developed to be specific to e-Textiles. We argue that textile materials feature a vast potential for the design of novel digital interactions.... Especially the shape-reformation capabilities of textiles may inform the design of expressive and aesthetically rewarding applications. In this chapter, we propose ways in which the textileness of e-Textiles can be better harnessed. We outline an e-Textile Interaction Design strategy that is based...... on defining the material-specificity of e-Textiles as its ability to deform in ways that match the expectations we have of textile materials. It embraces an open-ended exploration of textile-related interactions (e.g. stretching, folding, turning inside-out, etc.) and their potential for electronic...

  3. Strategy paper. Remedial design/remedial action 100 Area. Revision 2

    International Nuclear Information System (INIS)

    Donahoe, R.L.

    1995-10-01

    This strategy paper identifies and defines the approach for remedial design and remedial action (RD/RA) for source waste sites in the 100 Area of the Hanford Site, located in southeastern Washington State. This paper provides the basis for the US Department of Energy (DOE) to assess and approve the Environmental Restoration Contractor's (ERC) approach to RD/RA. Additionally, DOE is requesting review/agreement from the US Environmental Protection Agency (EPA) and Washington State Department of Ecology (Ecology) on the strategy presented in this document in order to expedite remedial activities

  4. User-Centered Design Strategies for Massive Open Online Courses (MOOCs)

    Science.gov (United States)

    Mendoza-Gonzalez, Ricardo, Ed.

    2016-01-01

    In today's society, educational opportunities have evolved beyond the traditional classroom setting. Most universities have implemented virtual learning environments in an effort to provide more opportunities for potential or current students seeking alternative and more affordable learning solutions. "User-Centered Design Strategies for…

  5. Design of intelligent comfort control system with human learning and minimum power control strategies

    International Nuclear Information System (INIS)

    Liang, J.; Du, R.

    2008-01-01

    This paper presents the design of an intelligent comfort control system by combining the human learning and minimum power control strategies for the heating, ventilating and air conditioning (HVAC) system. In the system, the predicted mean vote (PMV) is adopted as the control objective to improve indoor comfort level by considering six comfort related variables, whilst a direct neural network controller is designed to overcome the nonlinear feature of the PMV calculation for better performance. To achieve the highest comfort level for the specific user, a human learning strategy is designed to tune the user's comfort zone, and then, a VAV and minimum power control strategy is proposed to minimize the energy consumption further. In order to validate the system design, a series of computer simulations are performed based on a derived HVAC and thermal space model. The simulation results confirm the design of the intelligent comfort control system. In comparison to the conventional temperature controller, this system can provide a higher comfort level and better system performance, so it has great potential for HVAC applications in the future

  6. Xeml Lab: a tool that supports the design of experiments at a graphical interface and generates computer-readable metadata files, which capture information about genotypes, growth conditions, environmental perturbations and sampling strategy.

    Science.gov (United States)

    Hannemann, Jan; Poorter, Hendrik; Usadel, Björn; Bläsing, Oliver E; Finck, Alex; Tardieu, Francois; Atkin, Owen K; Pons, Thijs; Stitt, Mark; Gibon, Yves

    2009-09-01

    Data mining depends on the ability to access machine-readable metadata that describe genotypes, environmental conditions, and sampling times and strategy. This article presents Xeml Lab. The Xeml Interactive Designer provides an interactive graphical interface at which complex experiments can be designed, and concomitantly generates machine-readable metadata files. It uses a new eXtensible Mark-up Language (XML)-derived dialect termed XEML. Xeml Lab includes a new ontology for environmental conditions, called Xeml Environment Ontology. However, to provide versatility, it is designed to be generic and also accepts other commonly used ontology formats, including OBO and OWL. A review summarizing important environmental conditions that need to be controlled, monitored and captured as metadata is posted in a Wiki (http://www.codeplex.com/XeO) to promote community discussion. The usefulness of Xeml Lab is illustrated by two meta-analyses of a large set of experiments that were performed with Arabidopsis thaliana during 5 years. The first reveals sources of noise that affect measurements of metabolite levels and enzyme activities. The second shows that Arabidopsis maintains remarkably stable levels of sugars and amino acids across a wide range of photoperiod treatments, and that adjustment of starch turnover and the leaf protein content contribute to this metabolic homeostasis.

  7. Strategies of Energy Efficiency Design in Traditional Kangbaiwan Mansion in China

    Directory of Open Access Journals (Sweden)

    Song Xiaoqing

    2016-01-01

    Full Text Available The building sector is one of the highest energy-consuming sectors in the world as well as in China, so it is urgent to seek an energy-efficient path of sustainable architecture development. From the perspective of tradition, this paper focuses on the strategies of energy efficiency design embodied in excellent vernacular dwellings. Analyzing the example of the Kangbaiwan Mansion, it illustrates the advantages of its environmental ecosystem and summarizes the physical and cultural characteristics of its buildings, especially the climate-adapted overall arrangement and the sustainable strategies of natural ventilation and passive solar gain, which can be a fertile source for modern energy-efficient architecture design as well as a proper way of inheriting an outstanding traditional culture.

  8. Limited sampling strategy models for estimating the AUC of gliclazide in Chinese healthy volunteers.

    Science.gov (United States)

    Huang, Ji-Han; Wang, Kun; Huang, Xiao-Hui; He, Ying-Chun; Li, Lu-Jin; Sheng, Yu-Cheng; Yang, Juan; Zheng, Qing-Shan

    2013-06-01

    The aim of this work is to reduce the sampling cost required to estimate the area under the gliclazide plasma concentration versus time curve within 60 h (AUC0-60t). Limited sampling strategy (LSS) models were established and validated by multiple regression using 4 or fewer gliclazide concentration values. Absolute prediction error (APE), root mean square error (RMSE) and visual predictive checks were used as criteria. The results of jack-knife validation showed that 10 (25.0 %) of the 40 LSS models based on the regression analysis were not within an APE of 15 % using one concentration-time point. 90.2, 91.5 and 92.4 % of the 40 LSS models were capable of prediction using 2, 3 and 4 points, respectively. Limited sampling strategies were thus developed and validated for estimating AUC0-60t of gliclazide. This study indicates that an 80 mg dosage regimen enabled accurate prediction of AUC0-60t by the LSS model, and that 12, 6, 4 and 2 h after administration are the key sampling times. The combination of (12, 2 h), (12, 8, 2 h) or (12, 8, 4, 2 h) can be chosen as sampling times for predicting AUC0-60t in practical application according to requirements.
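A limited sampling strategy of this kind reduces to a small multiple regression; the coefficients and simulated concentrations below are hypothetical, chosen only to illustrate the APE/RMSE validation criteria the abstract names:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 40                                     # hypothetical volunteers
c2 = rng.uniform(2.0, 8.0, n)              # 2 h concentrations (assumed units)
c12 = rng.uniform(1.0, 6.0, n)             # 12 h concentrations
auc = 20.0 + 6.0 * c12 + 3.5 * c2 + rng.normal(0.0, 2.0, n)  # synthetic AUC0-60

# LSS model: AUC0-60 ~ b0 + b1*C12h + b2*C2h, fitted by least squares
X = np.column_stack([np.ones(n), c12, c2])
beta, *_ = np.linalg.lstsq(X, auc, rcond=None)
pred = X @ beta

ape = 100.0 * np.abs(pred - auc) / auc     # absolute prediction error, %
rmse = np.sqrt(np.mean((pred - auc) ** 2))
```

In the study's terms, a candidate two-point model is accepted when nearly all subjects fall within the 15 % APE band and the RMSE stays acceptably small.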

  9. Implications of clinical trial design on sample size requirements.

    Science.gov (United States)

    Leon, Andrew C

    2008-07-01

    The primary goal in designing a randomized controlled clinical trial (RCT) is to minimize bias in the estimate of treatment effect. Randomized group assignment, double-blinded assessments, and control or comparison groups reduce the risk of bias. The design must also provide sufficient statistical power to detect a clinically meaningful treatment effect and maintain a nominal level of type I error. An attempt to integrate neurocognitive science into an RCT poses additional challenges. Two particularly relevant aspects of such a design often receive insufficient attention in an RCT. Multiple outcomes inflate type I error, and an unreliable assessment process introduces bias and reduces statistical power. Here we describe how both unreliability and multiple outcomes can increase the study costs and duration and reduce the feasibility of the study. The objective of this article is to consider strategies that overcome the problems of unreliability and multiplicity.
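The two inflation effects described, multiple outcomes (handled here by a Bonferroni-corrected alpha) and unreliable measurement (attenuating the detectable effect size), can be made concrete with a standard two-sample z-approximation; the function and numbers are a textbook sketch, not the article's own calculation:

```python
from math import ceil, sqrt
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.8, m_outcomes=1, reliability=1.0):
    """Per-group n for a two-sample z-test: Bonferroni alpha/m handles
    multiple outcomes; unreliability attenuates d by sqrt(reliability)."""
    z = NormalDist()
    d_obs = d * sqrt(reliability)                    # attenuated effect
    za = z.inv_cdf(1.0 - alpha / (2.0 * m_outcomes))
    zb = z.inv_cdf(power)
    return ceil(2.0 * (za + zb) ** 2 / d_obs ** 2)

base = n_per_group(0.5)                              # one reliable outcome
inflated = n_per_group(0.5, m_outcomes=5, reliability=0.8)
```

With five outcomes and reliability 0.8, the required per-group n nearly doubles relative to a single perfectly reliable outcome, illustrating the cost and feasibility argument made above.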

  10. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme, the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the applications of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes

  11. Design and Printing Strategies in 3D Bioprinting of Cell-Hydrogels: A Review.

    Science.gov (United States)

    Lee, Jia Min; Yeong, Wai Yee

    2016-11-01

    Bioprinting is an emerging technology that allows the assembling of both living and non-living biological materials into an ideal complex layout for further tissue maturation. Bioprinting aims to produce engineered tissue or organ in a mechanized, organized, and optimized manner. Various biomaterials and techniques have been utilized to bioprint biological constructs in different shapes, sizes and resolutions. There is a need to systematically discuss and analyze the reported strategies employed to fabricate these constructs. We identified and discussed important design factors in bioprinting, namely shape and resolution, material heterogeneity, and cellular-material remodeling dynamism. Each design factor is represented by corresponding process capabilities and printing parameters. The process-design map will inspire future biomaterials research in these aspects. Design considerations such as data processing, bio-ink formulation and process selection are discussed. Various printing and crosslinking strategies, with relevant applications, are also systematically reviewed. We categorized them into 5 general bioprinting strategies, including direct bioprinting, in-process crosslinking, post-process crosslinking, indirect bioprinting and hybrid bioprinting. The opportunities and outlook in 3D bioprinting are highlighted. This review article will serve as a framework to advance computer-aided design in bioprinting technologies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Sample preparation composite and replicate strategy case studies for assay of solid oral drug products.

    Science.gov (United States)

    Nickerson, Beverly; Harrington, Brent; Li, Fasheng; Guo, Michele Xuemei

    2017-11-30

    Drug product assay is one of several tests required for new drug products to ensure the quality of the product at release and throughout the life cycle of the product. Drug product assay testing is typically performed by preparing a composite sample of multiple dosage units to obtain an assay value representative of the batch. In some cases replicate composite samples may be prepared and the reportable assay value is the average value of all the replicates. In previously published work by Harrington et al. (2014) [5], a sample preparation composite and replicate strategy for assay was developed to provide a systematic approach which accounts for variability due to the analytical method and dosage form with a standard error of the potency assay criteria based on compendia and regulatory requirements. In this work, this sample preparation composite and replicate strategy for assay is applied to several case studies to demonstrate the utility of this approach and its application at various stages of pharmaceutical drug product development. Copyright © 2017 Elsevier B.V. All rights reserved.
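The variance logic behind composite and replicate choices can be sketched with a simple two-component model; the symbols (unit-to-unit RSD, analytical-method RSD) and the numbers are assumed for illustration and are not the cited strategy's actual criteria:

```python
from math import sqrt

def reportable_se(rsd_unit, rsd_method, k_units, r_replicates):
    """SE of the reportable assay value (mean of r replicate composites,
    each compositing k dosage units): unit-to-unit variance shrinks by k,
    each preparation adds method variance, and averaging r replicate
    composites divides the total variance by r."""
    return sqrt((rsd_unit ** 2 / k_units + rsd_method ** 2) / r_replicates)

single = reportable_se(2.0, 0.6, k_units=1, r_replicates=1)      # one unit, once
composite = reportable_se(2.0, 0.6, k_units=10, r_replicates=3)  # 10-unit x 3
```

The comparison shows why compositing and replicating shrink the standard error of the reportable value, which is the quantity the strategy constrains against compendial limits.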

  13. Extending cluster Lot Quality Assurance Sampling designs for surveillance programs

    OpenAIRE

    Hund, Lauren; Pagano, Marcello

    2014-01-01

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than ...

  14. Outcomes of a systematically designed strategy for the implementation of sex education in Dutch secondary schools

    NARCIS (Netherlands)

    Wiefferink, C.H.; Poelman, J.; Linthorst, M.; Vanwesenbeeck, I.; Wijngaarden, J.C.M. van; Paulussen, T.G.W.M.

    2005-01-01

    This study examines the effects of a systematically designed innovation strategy on teachers' implementation of a sex education curriculum and its related determinants. A quasi-experimental group design was used to assess the effectiveness of the innovation strategy. Teachers filled in

  15. Combination microwave ovens: an innovative design strategy.

    Science.gov (United States)

    Tinga, Wayne R; Eke, Ken

    2012-01-01

    Reducing the sensitivity of microwave oven heating and cooking performance to load volume, load placement and load properties has been a long-standing challenge for microwave and microwave-convection oven designers. Conventional design problem and solution methods are reviewed to provide greater insight into the challenge and optimum operation of a microwave oven after which a new strategy is introduced. In this methodology, a special load isolating and energy modulating device called a transducer-exciter is used containing an iris, a launch box, a phase, amplitude and frequency modulator and a coupling plate designed to provide spatially distributed coupling to the oven. This system, when applied to a combined microwave-convection oven, gives astounding performance improvements to all kinds of baked and roasted foods including sensitive items such as cakes and pastries, with the only compromise being a reasonable reduction in the maximum available microwave power. Large and small metal utensils can be used in the oven with minimal or no performance penalty on energy uniformity and cooking results. Cooking times are greatly reduced from those in conventional ovens while maintaining excellent cooking performance.

  16. Bionic Design for Mars Sampling Scoop Inspired by Himalayan Marmot Claw

    Directory of Open Access Journals (Sweden)

    Long Xue

    2016-01-01

    Full Text Available Cave animals are often adapted to digging and life underground, with claw toes similar in structure and function to a sampling scoop. In this paper, the clawed toes of the Himalayan marmot were selected as a biological prototype for bionic research. Based on geometric parameter optimization of the clawed toes, a bionic sampling scoop for use on Mars was designed. Using a 3D laser scanner, the point cloud data of the second front claw toe were acquired. Parametric equations and contour curves for the claw were then built with cubic polynomial fitting. We obtained 18 characteristic curve equations for the internal and external contours of the claw. A bionic sampling scoop was designed according to the structural parameters of Curiosity’s sampling shovel and the contours of the Himalayan marmot’s claw. Verification tests showed that when the penetration angle was 45° and the sampling speed was 0.33 r/min, the bionic sampling scoop’s resistance torque was 49.6% less than that of the prototype sampling scoop. When the penetration angle was 60° and the sampling speed was 0.22 r/min, the resistance torque of the bionic sampling scoop was 28.8% lower than that of the prototype sampling scoop.
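The cubic polynomial fitting step mentioned for the claw contours can be illustrated with NumPy; the synthetic "claw-like" curve and noise level below are assumptions standing in for the scanned marmot data:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 30.0, 80)                # mm along the claw (assumed)
true = 0.002 * x**3 - 0.1 * x**2 + 1.5 * x    # assumed claw-like profile
y = true + rng.normal(0.0, 0.05, x.size)      # scanner noise

coeffs = np.polyfit(x, y, deg=3)              # cubic fit to one contour curve
rms_resid = np.sqrt(np.mean((np.polyval(coeffs, x) - y) ** 2))
```

Repeating such a fit over slices of the point cloud is one plausible way to arrive at a family of characteristic curve equations like the 18 reported.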

  17. Design Driven Innovation as a Differentiation Strategy - in the Context of Automotive Industry

    Directory of Open Access Journals (Sweden)

    Mosarrat Farhana

    2015-07-01

    Full Text Available Though, in the automotive industry, technology is considered as a source of innovation and development, emerging change in consumer perception has brought industry focus on design. Evolution of this industry is closely related to the convergence of technology and design. In such context, implication of design driven innovation strategy in the automotive industry has potential to be explored and to create sustained competitive advantage through balancing customers’ need, technological opportunities and product meaning. The aim of this paper is to give a holistic view of design driven innovation as a differentiation strategy in the automotive industry and its implication for strategic management through some relevant research reviews and empirical information. On the contrary, this research lacks detailed description on industry practice to provide greater breadth, since it attempts to correlate the strategic concept of design with the dynamic capability of a firm in that particular industry.

  18. Limited sampling strategy for determining metformin area under the plasma concentration-time curve

    DEFF Research Database (Denmark)

    Santoro, Ana Beatriz; Stage, Tore Bjerregaard; Struchiner, Claudio José

    2016-01-01

    AIM: The aim was to develop and validate limited sampling strategy (LSS) models to predict the area under the plasma concentration-time curve (AUC) for metformin. METHODS: Metformin plasma concentrations (n = 627) at 0-24 h after a single 500 mg dose were used for LSS development, based on all su...

  19. Evaluation of optimized bronchoalveolar lavage sampling designs for characterization of pulmonary drug distribution.

    Science.gov (United States)

    Clewe, Oskar; Karlsson, Mats O; Simonsson, Ulrika S H

    2015-12-01

    Bronchoalveolar lavage (BAL) is a pulmonary sampling technique for characterization of drug concentrations in epithelial lining fluid and alveolar cells. Two hypothetical drugs with different pulmonary distribution rates (fast and slow) were considered. An optimized BAL sampling design was generated assuming no previous information regarding the pulmonary distribution (rate and extent) and with a maximum of two samples per subject. Simulations were performed to evaluate the impact of the number of samples per subject (1 or 2) and the sample size on the relative bias and relative root mean square error of the parameter estimates (rate and extent of pulmonary distribution). The optimized BAL sampling design depends on a characterized plasma concentration time profile, a population plasma pharmacokinetic model, the limit of quantification (LOQ) of the BAL method and involves only two BAL sample time points, one early and one late. The early sample should be taken as early as possible, where concentrations in the BAL fluid ≥ LOQ. The second sample should be taken at a time point in the declining part of the plasma curve, where the plasma concentration is equivalent to the plasma concentration in the early sample. Using a previously described general pulmonary distribution model linked to a plasma population pharmacokinetic model, simulated data using the final BAL sampling design enabled characterization of both the rate and extent of pulmonary distribution. The optimized BAL sampling design enables characterization of both the rate and extent of the pulmonary distribution for both fast and slowly equilibrating drugs.

  20. Designing equitable antiretroviral allocation strategies in resource-constrained countries.

    Directory of Open Access Journals (Sweden)

    David P Wilson

    2005-02-01

    Full Text Available Recently, a global commitment has been made to expand access to antiretrovirals (ARVs) in the developing world. However, in many resource-constrained countries the number of individuals infected with HIV in need of treatment will far exceed the supply of ARVs, and only a limited number of health-care facilities (HCFs) will be available for ARV distribution. Deciding how to allocate the limited supply of ARVs among HCFs will be extremely difficult. Resource allocation decisions can be made on the basis of many epidemiological, ethical, or preferential treatment priority criteria. Here we use operations research techniques, and we show how to determine the optimal strategy for allocating ARVs among HCFs in order to satisfy the equitable criterion that each individual infected with HIV has an equal chance of receiving ARVs. We present a novel spatial mathematical model that includes heterogeneity in treatment accessibility. We show how to use our theoretical framework, in conjunction with an equity objective function, to determine an optimal equitable allocation strategy (OEAS) for ARVs in resource-constrained regions. Our equity objective function enables us to apply the egalitarian principle of equity with respect to access to health care. We use data from the detailed ARV rollout plan designed by the government of South Africa to determine an OEAS for the province of KwaZulu-Natal. We then compare this OEAS with two other ARV allocation strategies: (i) allocating ARVs only to Durban (the largest urban city in KwaZulu-Natal province) and (ii) allocating ARVs equally to all available HCFs. In addition, we compare the OEAS to the current allocation plan of the South African government (which is based upon allocating ARVs to 17 HCFs). We show that our OEAS significantly improves equity in treatment accessibility in comparison with these three ARV allocation strategies. We also quantify how the size of the
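Setting aside the paper's spatial accessibility modelling, the egalitarian criterion (every individual in need has the same chance of receiving ARVs) reduces, in the simplest case, to proportional allocation; the facility names and numbers below are made up for illustration:

```python
def equitable_allocation(need, supply):
    """Give every person in need the same coverage probability by allocating
    the limited supply proportionally to facility-level need (capped at 100%)."""
    frac = min(1.0, supply / sum(need.values()))   # common coverage probability
    return {hcf: frac * n for hcf, n in need.items()}

# hypothetical facility catchments (people needing ARVs)
need = {"Durban": 60_000, "Pietermaritzburg": 25_000, "rural HCFs": 40_000}
alloc = equitable_allocation(need, supply=50_000)
```

The paper's OEAS refines this idea by accounting for heterogeneous travel access, so that equal coverage holds for individuals rather than merely for facility catchments.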

  1. Objective sampling design in a highly heterogeneous landscape - characterizing environmental determinants of malaria vector distribution in French Guiana, in the Amazonian region.

    Science.gov (United States)

    Roux, Emmanuel; Gaborit, Pascal; Romaña, Christine A; Girod, Romain; Dessay, Nadine; Dusfour, Isabelle

    2013-12-01

    Sampling design is a key issue when establishing species inventories and characterizing habitats within highly heterogeneous landscapes. Sampling efforts in such environments may be constrained, and many field studies rely only on subjective and/or qualitative approaches to design their collection strategy. The region of Cacao, in French Guiana, provides an excellent study site to understand the presence and abundance of Anopheles mosquitoes, their species dynamics and the transmission risk of malaria across various environments. We propose an objective methodology to define a stratified sampling design. Following thorough environmental characterization, a factorial analysis of mixed groups allows the data to be reduced and non-collinear principal components to be identified while balancing the influences of the different environmental factors. These components defined new variables which could then be used in a robust k-means clustering procedure. We identified five clusters that corresponded to our sampling strata and selected sampling sites in each stratum. We validated our method by comparing the species overlap of entomological collections from the selected sites with the environmental similarities of the same sites. The Morisita index was significantly correlated (Pearson linear correlation) with environmental similarity based on i) the balanced environmental variable groups considered jointly (p = 0.001) and ii) land cover/use, validating our sampling approach. Land cover/use maps (based on high spatial resolution satellite images) were shown to be particularly useful when studying the presence, density and diversity of Anopheles mosquitoes at local scales and in very heterogeneous landscapes.
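The stratification pipeline described (dimension reduction to non-collinear components, then k-means into five strata) can be sketched with plain NumPy; the environmental table is synthetic, and ordinary PCA stands in for the factorial analysis of mixed groups, which additionally handles categorical variables:

```python
import numpy as np

rng = np.random.default_rng(7)
env = rng.normal(size=(300, 6))                       # stand-in variable table
env[:, 3] = 0.9 * env[:, 0] + 0.1 * rng.normal(size=300)   # collinear column

X = (env - env.mean(0)) / env.std(0)                  # standardize
U, s, _ = np.linalg.svd(X, full_matrices=False)       # PCA via SVD
scores = U[:, :3] * s[:3]                             # non-collinear components

def kmeans(pts, k, iters=50, seed=0):
    """Minimal Lloyd's k-means; empty clusters keep their previous center."""
    r = np.random.default_rng(seed)
    centers = pts[r.choice(len(pts), k, replace=False)]
    for _ in range(iters):
        labels = ((pts[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([pts[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

strata = kmeans(scores, k=5)                          # five sampling strata
```

Each resulting cluster plays the role of one sampling stratum, from which field sites are then drawn.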

  2. Interactive Spacecraft Trajectory Design Strategies Featuring Poincare Map Topology

    Science.gov (United States)

    Schlei, Wayne R.

    Space exploration efforts are shifting towards inexpensive and more agile vehicles. Versatility regarding spacecraft trajectories refers to the agility to correct deviations from an intended path or even the ability to adapt the future path to a new destination, all with limited spaceflight resources (i.e., small ΔV budgets). Trajectory design methods for such nimble vehicles incorporate equally versatile procedures that allow for rapid and interactive decision making while attempting to reduce ΔV budgets, leading to a versatile trajectory design platform. A versatile design paradigm requires the exploitation of Poincaré map topology, or the interconnected web of dynamical structures existing within the chaotic dynamics of multi-body gravitational models, to outline low-ΔV transfer options residing nearby a current path. This investigation details an autonomous procedure to extract the periodic orbits (topology nodes) and correlated asymptotic flow structures (the invariant manifolds representing topology links). The autonomous process summarized in this investigation (termed PMATE) overcomes discontinuities on the Poincaré section that arise in the applied multi-body model (the planar circular restricted three-body problem) and detects a wide variety of novel periodic orbits. New interactive capabilities deliver a visual-analytics foundation for versatile spaceflight design, especially for initial guess generation and manipulation. Such interactive strategies include the selection of states and arcs from Poincaré section visualizations and the capability to draw and drag trajectories to remove dependency on initial state input. Furthermore, immersive selection is expanded to cull invariant manifold structures, yielding low-ΔV or even ΔV-free transfers between periodic orbits. The application of interactive design strategies featuring a dense extraction of Poincaré map topology is demonstrated for agile spaceflight with a simple

  3. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    Science.gov (United States)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from density functions of two land-surface parameters, topographic wetness index and potential incoming solar radiation, derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points.
The selection of sample point locations has been done using
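
    The two-stage selection described above can be sketched as follows. This is an illustrative sketch only, not the authors' code; the function names and the simple dict-based polygon representation are assumptions.

    ```python
    import random

    def assign_stratum(twi, radiation, geology, twi_breaks, rad_breaks):
        """Combine quantile classes of two land-surface parameters with a geology unit."""
        twi_class = sum(twi > b for b in twi_breaks)
        rad_class = sum(radiation > b for b in rad_breaks)
        return (twi_class, rad_class, geology)

    def select_polygons(polygons, per_class=6, min_area_ha=1.0, seed=42):
        """First stage: up to `per_class` random polygons per stratum, excluding
        polygons smaller than the minimum area (hard to locate in the field)."""
        rng = random.Random(seed)
        by_class = {}
        for poly in polygons:
            if poly["area_ha"] < min_area_ha:
                continue  # exclusion rule: too small for reliable point placement
            by_class.setdefault(poly["stratum"], []).append(poly)
        selection = {}
        for stratum, members in by_class.items():
            selection[stratum] = rng.sample(members, min(per_class, len(members)))
        return selection
    ```

    In the study, this selection was repeated ten times and the spread of the resulting polygon sets compared; the second stage then draws one random point inside each chosen polygon.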

  4. Variation in Research Designs Used to Test the Effectiveness of Dissemination and Implementation Strategies: A Review

    Directory of Open Access Journals (Sweden)

    Stephanie Mazzucca

    2018-02-01

    Full Text Available Background: The need for optimal study designs in dissemination and implementation (D&I) research is increasingly recognized. Despite the wide range of study designs available for D&I research, we lack understanding of the types of designs and methodologies that are routinely used in the field. This review assesses the designs and methodologies in recently proposed D&I studies and provides resources to guide design decisions. Methods: We reviewed 404 study protocols published in the journal Implementation Science from 2/2006 to 9/2017. Eligible studies tested the efficacy or effectiveness of D&I strategies (i.e., not effectiveness of the underlying clinical or public health intervention); had a comparison by group and/or time; and used ≥1 quantitative measure. Several design elements were extracted: design category (e.g., randomized); design type [e.g., cluster randomized controlled trial (RCT)]; data type (e.g., quantitative); D&I theoretical framework; levels of treatment assignment, intervention, and measurement; and country in which the research was conducted. Each protocol was double-coded, and discrepancies were resolved through discussion. Results: Of the 404 protocols reviewed, 212 (52%) studies tested one or more implementation strategy across 208 manuscripts, therefore meeting inclusion criteria. Of the included studies, 77% utilized randomized designs, primarily cluster RCTs. The use of alternative designs (e.g., stepped wedge) increased over time. Fewer studies were quasi-experimental (17%) or observational (6%). Many study design categories (e.g., controlled pre-post, matched pair cluster design) were represented by only one or two studies. Most articles proposed quantitative and qualitative methods (61%), with the remaining 39% proposing only quantitative. Half of protocols (52%) reported using a theoretical framework to guide the study. The four most frequently reported frameworks were the Consolidated Framework for Implementation Research and RE-AIM

  5. Variation in Research Designs Used to Test the Effectiveness of Dissemination and Implementation Strategies: A Review.

    Science.gov (United States)

    Mazzucca, Stephanie; Tabak, Rachel G; Pilar, Meagan; Ramsey, Alex T; Baumann, Ana A; Kryzer, Emily; Lewis, Ericka M; Padek, Margaret; Powell, Byron J; Brownson, Ross C

    2018-01-01

    The need for optimal study designs in dissemination and implementation (D&I) research is increasingly recognized. Despite the wide range of study designs available for D&I research, we lack understanding of the types of designs and methodologies that are routinely used in the field. This review assesses the designs and methodologies in recently proposed D&I studies and provides resources to guide design decisions. We reviewed 404 study protocols published in the journal Implementation Science from 2/2006 to 9/2017. Eligible studies tested the efficacy or effectiveness of D&I strategies (i.e., not effectiveness of the underlying clinical or public health intervention); had a comparison by group and/or time; and used ≥1 quantitative measure. Several design elements were extracted: design category (e.g., randomized); design type [e.g., cluster randomized controlled trial (RCT)]; data type (e.g., quantitative); D&I theoretical framework; levels of treatment assignment, intervention, and measurement; and country in which the research was conducted. Each protocol was double-coded, and discrepancies were resolved through discussion. Of the 404 protocols reviewed, 212 (52%) studies tested one or more implementation strategy across 208 manuscripts, therefore meeting inclusion criteria. Of the included studies, 77% utilized randomized designs, primarily cluster RCTs. The use of alternative designs (e.g., stepped wedge) increased over time. Fewer studies were quasi-experimental (17%) or observational (6%). Many study design categories (e.g., controlled pre-post, matched pair cluster design) were represented by only one or two studies. Most articles proposed quantitative and qualitative methods (61%), with the remaining 39% proposing only quantitative. Half of protocols (52%) reported using a theoretical framework to guide the study. The four most frequently reported frameworks were the Consolidated Framework for Implementation Research and RE-AIM (n = 16 each), followed by

  6. Novel design methods and control strategies for oil and gas offshore power systems

    DEFF Research Database (Denmark)

    Pierobon, Leonardo

    This doctoral thesis is devoted to the research of innovative design methods and control strategies for power systems supplying future and existing offshore oil and gas facilities. The author uses these methods to address five research challenges: i) the definition of the optimal waste heat recovery ... technology, ii) the identification of the best working fluid to design efficient, light and cost-competitive waste heat recovery units, iii) the integration of dynamic criteria in the project phase to discard infeasible designs, iv) the development of a novel control strategy to optimally operate the power ... content), or when the thermal stresses on the working fluid should be minimized. Additionally, the controller is demonstrated to improve the dynamic flexibility of the plant compared to the reference controller designed by the gas turbine manufacturer. The model predictive control can reduce the frequency ...

  7. The Strategy Blueprint: A Strategy Process Computer-Aided Design Tool

    OpenAIRE

    Aldea, Adina Ioana; Febriani, Tania Rizki; Daneva, Maya; Iacob, Maria Eugenia

    2017-01-01

    Strategy has always been a main concern of organizations because it dictates their direction, and therefore determines their success. Thus, organizations need to have adequate support to guide them through their strategy formulation process. The goal of this research is to develop a computer-based tool, known as ‘the Strategy Blueprint’, consisting of a combination of nine strategy techniques, which can help organizations define the most suitable strategy, based on the internal and external f...

  8. Product design and business model strategies for a circular economy

    NARCIS (Netherlands)

    Bocken, N.M.P.; de Pauw, I.C.; Bakker, C.A.; van der Grinten, B.

    The transition within business from a linear to a circular economy brings with it a range of practical challenges for companies. The following question is addressed: What are the product design and business model strategies for companies that want to move to a circular economy model? This paper

  9. [Strategies for biobank networks. Classification of different approaches for locating samples and an outlook on the future within the BBMRI-ERIC].

    Science.gov (United States)

    Lablans, Martin; Kadioglu, Dennis; Mate, Sebastian; Leb, Ines; Prokosch, Hans-Ulrich; Ückert, Frank

    2016-03-01

    Medical research projects often require more biological material than can be supplied by a single biobank. For this reason, a multitude of strategies support locating potential research partners with matching material without requiring centralization of sample storage. Classification of different strategies for biobank networks, in particular for locating suitable samples. Description of an IT infrastructure combining these strategies. Existing strategies can be classified according to three criteria: (a) granularity of sample data: coarse bank-level data (catalogue) vs. fine-granular sample-level data, (b) location of sample data: central (central search service) vs. decentral storage (federated search services), and (c) level of automation: automatic (query-based, federated search service) vs. semi-automatic (inquiry-based, decentral search). All mentioned search services require data integration. Metadata help to overcome semantic heterogeneity. The "Common Service IT" in BBMRI-ERIC (Biobanking and BioMolecular Resources Research Infrastructure) unites a catalogue, the decentral search and metadata in an integrated platform. As a result, researchers receive versatile tools to search suitable biomaterial, while biobanks retain a high degree of data sovereignty. Despite their differences, the presented strategies for biobank networks do not rule each other out but can complement and even benefit from each other.

  10. Comparison of sampling strategies for object-based classification of urban vegetation from Very High Resolution satellite images

    Science.gov (United States)

    Rougier, Simon; Puissant, Anne; Stumpf, André; Lachiche, Nicolas

    2016-09-01

    Vegetation monitoring is becoming a major issue in the urban environment due to the services it provides, and necessitates accurate and up-to-date mapping. Very High Resolution satellite images enable a detailed mapping of urban tree and herbaceous vegetation. Several supervised classifications with statistical learning techniques have provided good results for the detection of urban vegetation but require a large amount of training data. In this context, this study investigates the performance of different sampling strategies in order to reduce the number of examples needed. Two window-based active learning algorithms from the state of the art are compared to a classical stratified random sampling, and a third strategy combining active learning and stratified approaches is proposed. The efficiency of these strategies is evaluated on two medium-sized French cities, Strasbourg and Rennes, associated with different datasets. Results demonstrate that classical stratified random sampling can in some cases be just as effective as active learning methods and that it should be used more frequently to evaluate new active learning methods. Moreover, the active learning strategies proposed in this work make it possible to reduce the computational runtime by selecting multiple windows at each iteration without increasing the number of windows needed.
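
    The two families of strategies compared above can be contrasted with minimal selection functions. This is a generic sketch, not the study's implementation; the uncertainty-based criterion, the scores, and all names are illustrative assumptions.

    ```python
    import random

    def uncertainty_sampling(scores, k):
        """Active learning sketch: pick the k unlabeled items whose predicted
        class probability is closest to 0.5 (i.e., the model is least certain)."""
        return sorted(range(len(scores)), key=lambda i: abs(scores[i] - 0.5))[:k]

    def stratified_random_sampling(strata, k, seed=0):
        """Classical baseline: pick ~k items spread evenly across strata."""
        rng = random.Random(seed)
        groups = {}
        for i, s in enumerate(strata):
            groups.setdefault(s, []).append(i)
        per = max(1, k // len(groups))
        picked = []
        for members in groups.values():
            picked.extend(rng.sample(members, min(per, len(members))))
        return picked[:k]
    ```

    A combined strategy, as proposed in the study, would draw the candidate pool per stratum and then rank candidates by model uncertainty.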

  11. Design/Operations review of core sampling trucks and associated equipment

    International Nuclear Information System (INIS)

    Shrivastava, H.P.

    1996-01-01

    A systematic review of the design and operations of the core sampling trucks was commissioned by Characterization Equipment Engineering of the Westinghouse Hanford Company in October 1995. The review team reviewed the design documents, specifications, operating procedure, training manuals and safety analysis reports. The review process, findings and corrective actions are summarized in this supporting document

  12. Practical iterative learning control with frequency domain design and sampled data implementation

    CERN Document Server

    Wang, Danwei; Zhang, Bin

    2014-01-01

    This book is on iterative learning control (ILC), with a focus on design and implementation. We approach the ILC design based on frequency-domain analysis and address the ILC implementation based on sampled-data methods. This is the first book on ILC from the frequency-domain and sampled-data methodologies. The frequency-domain design methods offer ILC users insight into the convergence performance, which is of practical benefit. This book presents a comprehensive framework with various methodologies to ensure that the learnable bandwidth in the ILC system is set with a balance between learning performance and learning stability. The sampled-data implementation ensures effective execution of ILC in practical dynamic systems. The presented sampled-data ILC methods also ensure the balance of performance and stability of the learning process. Furthermore, the presented theories and methodologies are tested with an ILC-controlled robotic system. The experimental results show that the machines can work in much h...

  13. Teaching Design in Middle-School: Instructors' Concerns and Scaffolding Strategies

    Science.gov (United States)

    Bamberger, Yael M.; Cahill, Clara S.

    2013-04-01

    This study deals with engineering education in the middle-school level. Its focus is instructors' concerns in teaching design, as well as scaffolding strategies that can help teachers deal with these concerns. Through participatory action research, nine instructors engaged in a process of development and instruction of a curriculum about energy along with engineering design. A 50-h curriculum was piloted during a summer camp for 38 middle-school students. Data was collected through instructors' materials: observation field notes, daily reflections and post-camp discussions. In addition, students' artifacts and planning graphical models were collected in order to explore how instructors' concerns were aligned with students' learning. Findings indicate three main tensions that reflect instructors' main concerns: how to provide sufficient scaffolding yet encourage creativity, how to scaffold hands-on experiences that promote mindful planning, and how to scaffold students' modeling practices. Pedagogical strategies for teaching design that developed through this work are described, as well as the ways they address the National Research Council (A framework for K-12 science education: practices, crosscutting concepts, and core ideas. National Academies Press, Washington, DC, 2011) core ideas of engineering education and the International Technological Literacy standards (ITEA in Standards for technological literacy, 3rd edn. International Technology education Association, Reston, VA, 2007).

  14. Design of sampling tools for Monte Carlo particle transport code JMCT

    International Nuclear Information System (INIS)

    Shangguan Danhua; Li Gang; Zhang Baoyin; Deng Li

    2012-01-01

    A class of sampling tools for the general Monte Carlo particle transport code JMCT is designed. Two ways are provided to sample from distributions: one is the use of special sampling methods for specific distributions; the other is the use of general sampling methods for arbitrary discrete distributions and one-dimensional continuous distributions on a finite interval. Some open source codes are included in the general sampling method for the maximum convenience of users. The sampling results show that distributions popular in particle transport can be sampled correctly with these tools, and the user's convenience is assured. (authors)
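
    The two sampling routes described - a special method for a specific distribution, and general inverse-transform methods for arbitrary discrete or tabulated 1D continuous distributions - can be sketched as follows. This is a minimal illustration under stated assumptions, not JMCT code.

    ```python
    import bisect
    import math
    import random

    def sample_normal_box_muller(mu=0.0, sigma=1.0, rng=random.random):
        """Special method for one specific distribution: normal via Box-Muller."""
        u1, u2 = 1.0 - rng(), rng()  # 1 - u avoids log(0)
        return mu + sigma * math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)

    def sample_discrete(values, weights, rng=random.random):
        """General method: inverse-transform sampling from an arbitrary
        discrete distribution given by non-negative weights."""
        cdf, acc = [], 0.0
        for w in weights:
            acc += w
            cdf.append(acc)
        return values[bisect.bisect_left(cdf, rng() * acc)]

    def sample_continuous_table(xs, pdf, rng=random.random):
        """General method for a 1D density tabulated on a finite interval:
        build a trapezoidal CDF, then invert it linearly within the bracketing
        bin (assumes strictly positive pdf values on the grid)."""
        cdf = [0.0]
        for i in range(1, len(xs)):
            cdf.append(cdf[-1] + 0.5 * (pdf[i] + pdf[i - 1]) * (xs[i] - xs[i - 1]))
        u = rng() * cdf[-1]
        i = max(1, bisect.bisect_left(cdf, u))
        t = (u - cdf[i - 1]) / (cdf[i] - cdf[i - 1])
        return xs[i - 1] + t * (xs[i] - xs[i - 1])
    ```

    Special methods are typically faster and exact for their target distribution; the general table-based route trades a small interpolation error for applicability to any user-supplied density.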

  15. A comparison of sample preparation strategies for biological tissues and subsequent trace element analysis using LA-ICP-MS.

    Science.gov (United States)

    Bonta, Maximilian; Török, Szilvia; Hegedus, Balazs; Döme, Balazs; Limbeck, Andreas

    2017-03-01

    Laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) is one of the most commonly applied methods for lateral trace element distribution analysis in medical studies. Many improvements of the technique regarding quantification and achievable lateral resolution have been made in recent years. Nevertheless, sample preparation is also of major importance, and the optimal sample preparation strategy has still not been defined. While conventional histology employs a number of sample pre-treatment strategies, little is known about the effect of these approaches on the lateral distributions of elements and/or their quantities in tissues. The technique of formalin fixation and paraffin embedding (FFPE) has emerged as the gold standard in tissue preparation. However, its potential use for elemental distribution studies is questionable due to the large number of sample preparation steps. In this work, LA-ICP-MS was used to examine the applicability of the FFPE sample preparation approach for elemental distribution studies. Qualitative elemental distributions as well as quantitative concentrations in cryo-cut tissues and in FFPE samples were compared. Results showed that some metals (especially Na and K) are severely affected by the FFPE process, whereas others (e.g., Mn, Ni) are less influenced. Based on these results, a general recommendation can be given: FFPE samples are completely unsuitable for the analysis of alkali metals. When analyzing transition metals, FFPE samples can give results comparable to snap-frozen tissues. Graphical abstract: Sample preparation strategies for biological tissues are compared with regard to the elemental distributions and average trace element concentrations.

  16. Optimization of Region-of-Interest Sampling Strategies for Hepatic MRI Proton Density Fat Fraction Quantification

    Science.gov (United States)

    Hong, Cheng William; Wolfson, Tanya; Sy, Ethan Z.; Schlein, Alexandra N.; Hooker, Jonathan C.; Dehkordy, Soudabeh Fazeli; Hamilton, Gavin; Reeder, Scott B.; Loomba, Rohit; Sirlin, Claude B.

    2017-01-01

    BACKGROUND Clinical trials utilizing proton density fat fraction (PDFF) as an imaging biomarker for hepatic steatosis have used a laborious region-of-interest (ROI) sampling strategy of placing an ROI in each hepatic segment. PURPOSE To identify a strategy with the fewest ROIs that consistently achieves close agreement with the nine-ROI strategy. STUDY TYPE Retrospective secondary analysis of prospectively acquired clinical research data. POPULATION A total of 391 adults (173 men, 218 women) with known or suspected NAFLD. FIELD STRENGTH/SEQUENCE Confounder-corrected chemical-shift-encoded 3T MRI using a 2D multiecho gradient-recalled echo technique. ASSESSMENT An ROI was placed in each hepatic segment. Mean nine-ROI PDFF and segmental PDFF standard deviation were computed. Segmental and lobar PDFF were compared. PDFF was estimated using every combinatorial subset of ROIs and compared to the nine-ROI average. STATISTICAL TESTING Mean nine-ROI PDFF and segmental PDFF standard deviation were summarized descriptively. Segmental PDFF was compared using a one-way analysis of variance, and lobar PDFF was compared using a paired t-test and a Bland–Altman analysis. The PDFF estimated by every subset of ROIs was informally compared to the nine-ROI average using median intraclass correlation coefficients (ICCs) and Bland–Altman analyses. RESULTS The study population's mean whole-liver PDFF was 10.1±8.9% (range: 1.1–44.1%). Although there was no significant difference in average segmental (P=0.452) or lobar (P=0.154) PDFF, left and right lobe PDFF differed by at least 1.5 percentage points in 25.1% (98/391) of patients. Any strategy with ≥4 ROIs had ICC >0.995. 115 of 126 four-ROI strategies (91%) had limits of agreement (LOA) <1.5%; […] had ICC >0.995, and 2/36 (6%) of two-ROI strategies and 46/84 (55%) of three-ROI strategies had LOA <1.5%. DATA CONCLUSION Four-ROI sampling strategies with two ROIs in the left and right lobes achieve close agreement with nine-ROI PDFF. Level of

  17. Optimization of region-of-interest sampling strategies for hepatic MRI proton density fat fraction quantification.

    Science.gov (United States)

    Hong, Cheng William; Wolfson, Tanya; Sy, Ethan Z; Schlein, Alexandra N; Hooker, Jonathan C; Fazeli Dehkordy, Soudabeh; Hamilton, Gavin; Reeder, Scott B; Loomba, Rohit; Sirlin, Claude B

    2018-04-01

    Clinical trials utilizing proton density fat fraction (PDFF) as an imaging biomarker for hepatic steatosis have used a laborious region-of-interest (ROI) sampling strategy of placing an ROI in each hepatic segment. To identify a strategy with the fewest ROIs that consistently achieves close agreement with the nine-ROI strategy. Retrospective secondary analysis of prospectively acquired clinical research data. A total of 391 adults (173 men, 218 women) with known or suspected NAFLD. Confounder-corrected chemical-shift-encoded 3T MRI using a 2D multiecho gradient-recalled echo technique. An ROI was placed in each hepatic segment. Mean nine-ROI PDFF and segmental PDFF standard deviation were computed. Segmental and lobar PDFF were compared. PDFF was estimated using every combinatorial subset of ROIs and compared to the nine-ROI average. Mean nine-ROI PDFF and segmental PDFF standard deviation were summarized descriptively. Segmental PDFF was compared using a one-way analysis of variance, and lobar PDFF was compared using a paired t-test and a Bland-Altman analysis. The PDFF estimated by every subset of ROIs was informally compared to the nine-ROI average using median intraclass correlation coefficients (ICCs) and Bland-Altman analyses. The study population's mean whole-liver PDFF was 10.1 ± 8.9% (range: 1.1-44.1%). Although there was no significant difference in average segmental (P = 0.452) or lobar (P = 0.154) PDFF, left and right lobe PDFF differed by at least 1.5 percentage points in 25.1% (98/391) of patients. Any strategy with ≥4 ROIs had ICC >0.995. 115 of 126 four-ROI strategies (91%) had limits of agreement (LOA) <1.5%; […] had ICC >0.995, and 2/36 (6%) of two-ROI strategies and 46/84 (55%) of three-ROI strategies had LOA <1.5%. Four-ROI sampling strategies with two ROIs in the left and right lobes achieve close agreement with nine-ROI PDFF. Level of Evidence: 3. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:988-994. © 2017 International Society for Magnetic Resonance
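
    The exhaustive subset evaluation described in the two records above can be sketched as follows: for every combination of k of the nine segmental ROIs, compare the subset mean against the nine-ROI mean across patients in Bland-Altman fashion. The function name and the list-of-lists data layout are illustrative assumptions, not the authors' code.

    ```python
    import itertools
    import statistics

    def subset_agreement(pdff, k):
        """For each combination of k ROI indices (out of 9), compute the
        Bland-Altman-style bias and 95% limits of agreement between the
        subset-mean PDFF and the nine-ROI mean, across patients.
        `pdff` is a list of 9-value lists, one per patient."""
        results = {}
        full_means = [statistics.mean(p) for p in pdff]
        for combo in itertools.combinations(range(9), k):
            diffs = [statistics.mean([p[i] for i in combo]) - m
                     for p, m in zip(pdff, full_means)]
            bias = statistics.mean(diffs)
            sd = statistics.stdev(diffs) if len(diffs) > 1 else 0.0
            results[combo] = (bias, bias - 1.96 * sd, bias + 1.96 * sd)
        return results
    ```

    Ranking the combinations by the width of their limits of agreement identifies which few-ROI strategies track the full nine-ROI average most closely.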

  18. High-Payoff Space Transportation Design Approach with a Technology Integration Strategy

    Science.gov (United States)

    McCleskey, C. M.; Rhodes, R. E.; Chen, T.; Robinson, J.

    2011-01-01

    A general architectural design sequence is described to create a highly efficient, operable, and supportable design that achieves an affordable, repeatable, and sustainable transportation function. The paper covers the following aspects of this approach in more detail: (1) vehicle architectural concept considerations (including important strategies for greater reusability); (2) vehicle element propulsion system packaging considerations; (3) vehicle element functional definition; (4) external ground servicing and access considerations; and, (5) simplified guidance, navigation, flight control and avionics communications considerations. Additionally, a technology integration strategy is forwarded that includes: (a) ground and flight test prior to production commitments; (b) parallel stage propellant storage, such as concentric-nested tanks; (c) high thrust, LOX-rich, LOX-cooled first stage earth-to-orbit main engine; (d) non-toxic, day-of-launch-loaded propellants for upper stages and in-space propulsion; (e) electric propulsion and aero stage control.

  19. Case-based learning in VTLE: An effective strategy for improving learning design

    OpenAIRE

    Guàrdia Ortiz, Lourdes; Sangrà, Albert; Maina, Marcelo Fabián

    2014-01-01

    This article presents preliminary research from an instructional design perspective on the design of the case method as an integral part of pedagogy and technology. Key features and benefits of using this teaching and learning strategy in a Virtual Teaching and Learning Environment (VTLE) are identified, taking into account the requirements of the European Higher Education Area (EHEA) for competence-based curriculum design. The implications of these findings for a learning object appro...

  20. Passive solar design strategies: Remodeling guidelines for conserving energy at home

    Science.gov (United States)

    The idea of passive solar is simple, but applying it effectively does require information and attention to the details of design and construction. Some passive solar techniques are modest and low-cost, and require only small changes in remodelers' typical practice. At the other end of the spectrum, some passive solar systems can almost eliminate a house's need for purchased heating (and in some cases, cooling) energy - but probably at a relatively high first cost. In between is a broad range of energy-conserving passive solar techniques. Whether or not they are cost-effective, practical, and attractive enough to offer a market advantage to any individual remodeler depends on very specific factors such as local costs, climate, and market characteristics. Passive Solar Design Strategies: Remodeling Guidelines For Conserving Energy At Home is written to give remodelers the information they need to make these decisions. Passive Solar Design Strategies is a package in three basic parts: the guidelines contain information about passive solar techniques and how they work, and provide specific examples of systems which will save various percentages of energy; the worksheets offer a simple, fill-in-the-blank method to pre-evaluate the performance of a specific design; and the worked example demonstrates how to complete the worksheets for a typical residence.

  1. Sampling of temporal networks: Methods and biases

    Science.gov (United States)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter

    2017-11-01

    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure, and thus caution is necessary when generalizing results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common across a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling method be problem oriented to minimize the potential biases for the specific research questions at hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
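
    Uniform node sampling, the best-performing strategy in the study above, can be sketched for a temporal contact list as follows. The representation of events as (time, node, node) triples and all names here are illustrative assumptions.

    ```python
    import random

    def sample_nodes_uniform(events, fraction, seed=0):
        """Uniform node subsampling of a temporal network: keep a random
        fraction of the nodes and retain only the contacts whose endpoints
        were both kept. `events` is a list of (t, u, v) triples."""
        rng = random.Random(seed)
        nodes = {n for _, u, v in events for n in (u, v)}
        kept = set(rng.sample(sorted(nodes), max(1, int(fraction * len(nodes)))))
        return [(t, u, v) for t, u, v in events if u in kept and v in kept]
    ```

    Comparing statistics such as link activity or temporal path counts between `events` and the subsampled list is one way to quantify the bias a given sampling fraction introduces.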

  2. Nature-inspired design strategies in sustainable product development : A case study of student projects

    NARCIS (Netherlands)

    De Pauw, I.C.; Karana, E.; Kandachar, P.V.

    2012-01-01

    In design practice, Nature-Inspired Design Strategies (NIDS) can be applied when developing sustainable products. However, knowledge on how this actually helps designers is lacking. This study explores the effects of applying Cradle to Cradle and Biomimicry in student projects, as compared to using

  3. Optimizing incomplete sample designs for item response model parameters

    NARCIS (Netherlands)

    van der Linden, Willem J.

    Several models for optimizing incomplete sample designs with respect to information on the item parameters are presented. The following cases are considered: (1) known ability parameters; (2) unknown ability parameters; (3) item sets with multiple ability scales; and (4) response models with

  4. Analytical sample preparation strategies for the determination of antimalarial drugs in human whole blood, plasma and urine

    DEFF Research Database (Denmark)

    Casas, Monica Escolà; Hansen, Martin; Krogh, Kristine A

    2014-01-01

    the available sample preparation strategies combined with liquid chromatographic (LC) analysis to determine antimalarials in whole blood, plasma and urine published over the last decade. Sample preparation can be done by protein precipitation, solid-phase extraction, liquid-liquid extraction or dilution. After...

  5. Modeling in the quality by design environment: Regulatory requirements and recommendations for design space and control strategy appointment.

    Science.gov (United States)

    Djuris, Jelena; Djuric, Zorica

    2017-11-30

    Mathematical models can be used as an integral part of the quality by design (QbD) concept throughout the product lifecycle for a variety of purposes, including appointment of the design space and control strategy, continual improvement and risk assessment. Examples of different mathematical modeling techniques (mechanistic, empirical and hybrid) in pharmaceutical development and process monitoring or control are provided in the presented review. In the QbD context, mathematical models are predominantly used to support design space and/or control strategies. Considering their impact on the final product quality, models can be divided into the following categories: high-, medium- and low-impact models. Although there are regulatory guidelines on the topic of modeling applications, review of QbD-based submissions containing modeling elements has revealed concerns regarding the scale-dependency of design spaces and verification of model predictions at the commercial scale of manufacturing, especially regarding real-time release (RTR) models. The authors provide a critical overview of good modeling practices and introduce the concepts of multiple-unit, adaptive and dynamic design space, multivariate specifications and methods for process uncertainty analysis. RTR specification with mathematical models and different approaches to multivariate statistical process control supporting process analytical technologies are also presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Strategies of design, development and activation of the Nova control system

    International Nuclear Information System (INIS)

    Holloway, F.W.

    1983-01-01

    Nova and Novette are large complex experimental laser facilities which require extensive and sophisticated control systems for their successful operation. Often, in major controls projects, certain invisible aspects of the project, such as overall strategy, management, resources and historical constraints, have a more profound effect upon success than any specific hardware/software design. The design and performance of the Nova/Novette laser control system will be presented with special emphasis upon these often controversial aspects

  7. Strategies of design, development and activation of the Nova control system

    Energy Technology Data Exchange (ETDEWEB)

    Holloway, F.W.

    1983-06-30

    Nova and Novette are large complex experimental laser facilities which require extensive and sophisticated control systems for their successful operation. Often, in major controls projects, certain invisible aspects of the project, such as overall strategy, management, resources and historical constraints, have a more profound effect upon success than any specific hardware/software design. The design and performance of the Nova/Novette laser control system will be presented with special emphasis upon these often controversial aspects.

  8. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  9. Comprehension with Instructional Media for Middle School Science: Holistic Performative Design Strategy and Cognitive Load

    Science.gov (United States)

    Peterson, Matthew Owen

    This study identifies three distinct levels of text-image integration in page design in a linear relationship of lesser to greater integration: prose primary, prose subsumed, and fully integrated strategies. Science textbook pages were redesigned according to these holistic design strategies for 158 7th-grade students. There were three separate treatment tests, as well as a pre-test and post-test, and pilot tests with both undergraduate students and the subjects themselves. Subjects found the fully integrated strategy to produce the most visually interesting designs and the prose primary strategy to produce the least interesting, with prose subsumed definitively in between (according to 95% confidence intervals). The strategy employed significantly altered interest in science subject matter in one of three treatments (ANOVA, P=0.0446), where a Student's t-test revealed that the prose subsumed strategy produced higher interest in subject matter than prose primary. The strategy employed significantly altered comprehension of abstract relationships in one of three treatments (ANOVA, P=0.0202), where a Student's t-test revealed that the fully integrated strategy resulted in greater comprehension than prose primary. For the same treatment condition significant differences were found through ANOVA for factual-level knowledge (P=0.0289) but not conceptual-level knowledge ( P=0.0586). For factual-level knowledge prose primary resulted in lesser comprehension than both prose subsumed and fully integrated. Comprehension is defined according to cognitive load theory. No strategy impact on perception of task difficulty was found. This study was approved by North Carolina State University's Institutional Review Board and Wake County Public School System's Research Review Committee.

  10. Improvements to robotics-inspired conformational sampling in rosetta.

    Directory of Open Access Journals (Sweden)

    Amelie Stein

    Full Text Available To accurately predict protein conformations in atomic detail, a computational method must be capable of sampling models sufficiently close to the native structure. All-atom sampling is difficult because of the vast number of possible conformations and extremely rugged energy landscapes. Here, we test three sampling strategies to address these difficulties: conformational diversification, intensification of torsion and omega-angle sampling, and parameter annealing. We evaluate these strategies in the context of the robotics-based kinematic closure (KIC) method for local conformational sampling in Rosetta on an established benchmark set of 45 12-residue protein segments without regular secondary structure. We quantify performance as the fraction of sub-Angstrom models generated. While improvements with individual strategies are only modest, the combination of intensification and annealing strategies into a new "next-generation KIC" method yields a four-fold increase over standard KIC in the median percentage of sub-Angstrom models across the dataset. Such improvements enable progress on more difficult problems, as demonstrated on longer segments, several of which could not be accurately remodeled with previous methods. Given its improved sampling capability, next-generation KIC should allow advances in other applications such as local conformational remodeling of multiple segments simultaneously, flexible backbone sequence design, and development of more accurate energy functions.

  11. The Study on Mental Health at Work: Design and sampling.

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
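The two-stage cluster draw described above can be sketched as follows. The stage counts (206 of 12,227 municipalities, 13,590 addresses) come from the abstract, while the address registry itself is simulated for illustration; the per-municipality address counts are invented.

```python
import random

random.seed(42)

# Stage 1: draw 206 municipalities from the 12,227 in the register.
stage1 = random.sample(range(12227), 206)

# Stage 2: build a simulated address registry for the selected municipalities
# (100-500 addresses each, invented) and draw 13,590 addresses from it.
registry = {m: [f"addr-{m}-{i}" for i in range(random.randint(100, 500))]
            for m in stage1}
pool = [addr for addrs in registry.values() for addr in addrs]
stage2 = random.sample(pool, 13590)

print(len(stage1), len(stage2))  # → 206 13590
```

Because every stage-2 address belongs to a stage-1 municipality, the draw preserves the cluster structure that the bias tests in the abstract then have to account for.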

  12. The Study on Mental Health at Work: Design and sampling

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-01-01

    Aims: The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. Methods: S-MGA is a representative study of German employees aged 31–60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. Results: In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. Conclusions: There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment. PMID:28673202

  13. New sampling strategy using a Bayesian approach to assess iohexol clearance in kidney transplant recipients.

    Science.gov (United States)

    Benz-de Bretagne, I; Le Guellec, C; Halimi, J M; Gatault, P; Barbet, C; Alnajjar, A; Büchler, M; Lebranchu, Y; Andres, Christian Robert; Vourcʼh, P; Blasco, H

    2012-06-01

    Measurement of the glomerular filtration rate (GFR) is a major issue for clinicians managing kidney transplant recipients. GFR can be determined by estimating the plasma clearance of iohexol, a nonradiolabeled compound. For practical and convenient application for patients and caregivers, it is important that a minimal number of samples are drawn. The aim of this study was to develop and validate a Bayesian model with fewer samples for reliable prediction of GFR in kidney transplant recipients. Iohexol plasma concentration-time curves from 95 patients were divided into an index (n = 63) and a validation set (n = 32). Samples (n = 4-6 per patient) were obtained during the elimination phase, that is, between 120 and 270 minutes. Individual reference values of iohexol clearance (CL(iohexol)) were calculated from k (elimination slope) and V (volume of distribution from intercept). Individual CL(iohexol) values were then introduced into the Bröchner-Mortensen equation to obtain the GFR (reference value). A population pharmacokinetic model was developed from the index set and validated using standard methods. For the validation set, we tested various combinations of 1, 2, or 3 sampling times to estimate CL(iohexol). According to the different combinations tested, a maximum a posteriori Bayesian estimation of CL(iohexol) was obtained from population parameters. Individual estimates of GFR were compared with individual reference values through analysis of bias and precision. A capability analysis allowed us to determine the best sampling strategy for Bayesian estimation. A 1-compartment model best described our data. Covariate analysis showed that uremia, serum creatinine, and age were significantly associated with k(e), and weight with V. The strategy, including samples drawn at 120 and 270 minutes, allowed accurate prediction of GFR (mean bias: -3.71%, mean imprecision: 7.77%). With this strategy, about 20% of individual predictions were outside the bounds of acceptance set at ± 10
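The slope-intercept calculation the abstract describes (elimination slope k, intercept-derived volume V, then a Bröchner-Mortensen correction) can be sketched for the two-sample 120/270-minute strategy. The dose and concentrations below are illustrative, and the adult correction coefficients are the widely cited ones, not values taken from this study.

```python
import math

def iohexol_gfr(c1, c2, t1=120.0, t2=270.0, dose_mg=3235.0):
    """Two-sample slope-intercept iohexol clearance -> corrected GFR (mL/min).

    c1, c2: plasma concentrations (mg/L) at t1, t2 minutes after injection.
    Dose and concentrations are illustrative, not data from the study.
    """
    k = (math.log(c1) - math.log(c2)) / (t2 - t1)   # elimination slope (1/min)
    c0 = c1 * math.exp(k * t1)                      # back-extrapolated intercept
    v = dose_mg / c0                                # volume of distribution (L)
    cl = k * v * 1000.0                             # uncorrected clearance (mL/min)
    # Broechner-Mortensen correction for the neglected distribution phase
    # (commonly cited adult coefficients; assumed here, not quoted from the paper):
    return 0.990778 * cl - 0.001218 * cl ** 2

print(round(iohexol_gfr(60.0, 35.0), 1))
```

The correction is needed because a one-compartment fit to late samples overestimates clearance by ignoring the early distribution phase.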

  14. Towards a strategy of reliable fusion first-wall design

    International Nuclear Information System (INIS)

    Schultz, J.H.

    1981-05-01

    Fusion first walls are subject to a large number of possible failure mechanisms, including erosion due to sputtering, arcing, blistering and vaporization and crack growth due to thermal and magnetic stresses. Each of these failure mechanisms is poorly characterized and has the potential of being severe. A strategy for designing reliably in the face of great uncertainty is discussed. Topological features beneficial to reactor availability are identified. The integration of limiter pumping with rf wave launching is discussed, as a means of simplifying reactor design. The concept of a sewer limiter is introduced, as a possible long-life limiter topology. The concept of flexible armor is discussed, as a means of extending maximum life.

  15. A simulation approach to assessing sampling strategies for insect pests: an example with the balsam gall midge.

    Directory of Open Access Journals (Sweden)

    R Drew Carleton

    Full Text Available Estimation of pest density is a basic requirement for integrated pest management in agriculture and forestry, and efficiency in density estimation is a common goal. Sequential sampling techniques promise efficient sampling, but their application can involve cumbersome mathematics and/or intensive warm-up sampling when pests have complex within- or between-site distributions. We provide tools for assessing the efficiency of sequential sampling and of alternative, simpler sampling plans, using computer simulation with "pre-sampling" data. We illustrate our approach using data for balsam gall midge (Paradiplosis tumifex) attack in Christmas tree farms. Paradiplosis tumifex proved recalcitrant to sequential sampling techniques. Midge distributions could not be fit by a common negative binomial distribution across sites. Local parameterization, using warm-up samples to estimate the clumping parameter k for each site, performed poorly: k estimates were unreliable even for samples of n ∼ 100 trees. These methods were further confounded by significant within-site spatial autocorrelation. Much simpler sampling schemes, involving random or belt-transect sampling to preset sample sizes, were effective and efficient for P. tumifex. Sampling via belt transects (through the longest dimension of a stand) was the most efficient, with sample means converging on true mean density for sample sizes of n ∼ 25-40 trees. Pre-sampling and simulation techniques provide a simple method for assessing sampling strategies for estimating insect infestation. We suspect that many pests will resemble P. tumifex in challenging the assumptions of sequential sampling methods. Our software will allow practitioners to optimize sampling strategies before they are brought to real-world applications, while potentially avoiding the need for the cumbersome calculations required for sequential sampling methods.
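The pre-sampling simulation idea can be sketched with a gamma-Poisson (negative binomial) stand and simple random samples of increasing size. The mean density and clumping parameter k below are illustrative, not the study's estimates.

```python
import math
import random

random.seed(7)

def nbinom_draw(mean, k):
    """Negative binomial count via a gamma-Poisson mixture (clumping parameter k)."""
    lam = random.gammavariate(k, mean / k)
    # Knuth's Poisson sampler; adequate for the small means typical of pest counts
    threshold, p, count = math.exp(-lam), 1.0, 0
    while True:
        p *= random.random()
        if p <= threshold:
            return count
        count += 1

# One simulated stand of 500 trees (density and clumping are illustrative values)
true_mean, k = 3.0, 0.8
stand = [nbinom_draw(true_mean, k) for _ in range(500)]

# How quickly does a simple random sample converge on the true density?
for n in (10, 25, 40, 100):
    sample = random.sample(stand, n)
    print(n, round(sum(sample) / n, 2))
```

Repeating such draws many times over real pre-sampling data is what lets one compare candidate sample sizes and schemes before committing to fieldwork.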

  16. A methodology for the sustainable design and implementation strategy of CO2 utilization processes

    DEFF Research Database (Denmark)

    Roh, Kosan; Frauzem, Rebecca; Nguyen, Tuan B. H.

    2016-01-01

    design and analysis is discussed as only limited amounts of process data are available for determining the optimal processing path and in the third stage the issue of implementation strategy is considered. As examples, two CO2 utilization methods for methanol production, combined reforming and direct...... synthesis are considered. Methanol plants employing such methods are developed using synthesis-design and simulation tools and their evaluation indicators are calculated under various implementation strategies. It is demonstrated that integrating or replacing an existing conventional methanol plant......

  17. Reasoning Strategies in the Context of Engineering Design with Everyday Materials

    OpenAIRE

    Worsley, Marcelo; Blikstein, Paulo

    2017-01-01

    ‘‘Making’’ represents an increasingly popular label for describing a form of engineering design. While making is growing in popularity, there are still open questions about the strategies that students are using in these activities. Assessing and improving learning in making/ engineering design contexts require that we have a better understanding of where students’ ideas are coming from and a better way to characterize student progress in open-ended learning environments. In this article, we ...

  18. Formation Design Strategy for SCOPE High-Elliptic Formation Flying Mission

    Science.gov (United States)

    Tsuda, Yuichi

    2007-01-01

    The new formation design strategy using simulated annealing (SA) optimization is presented. The SA algorithm is useful to survey the whole solution space of optimal formations, taking into account realistic constraints composed of continuous and discrete functions. It is revealed that this method is applicable not only to circular orbits but also to high-elliptic orbit formation flying. The developed algorithm is first tested with a simple cart-wheel motion example, and then applied to the formation design for SCOPE. SCOPE is the next-generation geomagnetotail observation mission planned in JAXA, utilizing formation flying technology in a high-elliptic orbit. A distinctive and useful heuristic is found by investigating the SA results, showing the effectiveness of the proposed design process.
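A minimal simulated-annealing loop of the kind described can be sketched as follows. The objective here is a toy separation-matching function, not the SCOPE formation constraints, and all parameters (step size, cooling schedule) are illustrative.

```python
import math
import random

random.seed(1)

def anneal(objective, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000):
    """Minimal simulated-annealing minimizer (a sketch, not the SCOPE optimizer)."""
    x, fx = list(x0), objective(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        cand = [xi + random.uniform(-step, step) for xi in x]
        fc = objective(cand)
        # always accept downhill moves; accept uphill moves with Boltzmann probability
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling   # geometric cooling schedule
    return best, fbest

# Toy objective: match two target along-track separations
target = (1.0, 2.0)
sol, val = anneal(lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target)),
                  [5.0, -5.0])
print([round(s, 2) for s in sol], round(val, 4))
```

The Boltzmann acceptance of occasional uphill moves is what lets SA escape local minima early on, which is why it can handle the mixed continuous/discrete constraints the abstract mentions.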

  19. A generalized strategy for designing (19)F/(1)H dual-frequency MRI coil for small animal imaging at 4.7 Tesla.

    Science.gov (United States)

    Hu, Lingzhi; Hockett, Frank D; Chen, Junjie; Zhang, Lei; Caruthers, Shelton D; Lanza, Gregory M; Wickline, Samuel A

    2011-07-01

    To propose and test a universal strategy for building a (19)F/(1)H dual-frequency RF coil that permits multiple coil geometries. The feasibility of designing a (19)F/(1)H dual-frequency RF coil based on a coupled resonator model was investigated. A series capacitive matching network enables robust impedance matching for both harmonic oscillating modes of the coupled resonator. Two typical designs of (19)F/(1)H volume coils (birdcage and saddle) at 4.7T were implemented and evaluated with electrical bench tests and in vivo (19)F/(1)H dual-nuclei imaging. For various combinations of internal resistances of the sample coil and secondary resonator, numerical solutions for the tunable capacitors to optimize impedance matching were obtained using a root-seeking program. Identical and homogeneous B1 field distributions at the (19)F and (1)H frequencies were observed in the bench test and phantom images. Finally, in vivo mouse imaging confirmed the sensitivity and homogeneity of the (19)F/(1)H dual-frequency coil design. A generalized strategy for designing (19)F/(1)H dual-frequency coils based on the coupled resonator approach was developed and validated. A unique feature of this design is that it preserves the B1 field homogeneity of the RF coil at both resonant frequencies. Thus it minimizes the susceptibility effect on image co-registration. Copyright © 2011 Wiley-Liss, Inc.
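The mode splitting of two identical coupled resonators, f± = f0/√(1 ∓ κ), can be inverted to place the two modes on the (19)F and (1)H Larmor frequencies at 4.7 T. This is a textbook coupled-LC sketch for illustration, not the paper's full matching-network solution; the gyromagnetic ratios are standard values.

```python
import math

GAMMA_H, GAMMA_F = 42.577, 40.078       # gyromagnetic ratios (MHz/T)
B0 = 4.7                                # field strength (T)
f_h, f_f = GAMMA_H * B0, GAMMA_F * B0   # target Larmor frequencies (MHz)

# Two identical coupled LC resonators split into modes f± = f0 / sqrt(1 ∓ kappa).
# Invert that relation to find the coupling coefficient kappa and the uncoupled
# frequency f0 that put the two modes on the 1H and 19F frequencies:
r2 = (f_h / f_f) ** 2
kappa = (r2 - 1) / (r2 + 1)
f0 = math.sqrt(f_h * f_f) * (1 - kappa ** 2) ** 0.25

print(round(f_h, 1), round(f_f, 1), round(kappa, 4), round(f0, 1))
```

Because the 19F and 1H frequencies differ by only about 6%, the required coupling coefficient is small, which is consistent with using one coil structure for both nuclei.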

  20. Lifetime design strategy for binary geothermal plants considering degradation of geothermal resource productivity

    International Nuclear Information System (INIS)

    Budisulistyo, Denny; Wong, Choon Seng; Krumdieck, Susan

    2017-01-01

    Highlights: • A new lifetime strategy for binary plants considering thermal resource degradation. • The net present value and energy return on investment are selected as indicators. • The results indicate that the design based on point 2 has the best revenue. • Improving plant performance by parameter adjustments and adaptable designs. - Abstract: This work proposes a lifetime design strategy for binary geothermal plants which takes into account heat resource degradation. A model of the resource temperature and mass flow rate decline over a 30 year plant life is developed from a survey of data. The standard approach to optimise a basic subcritical cycle of n-pentane working fluid and select component sizes is used for the resource characteristics in years 1, 7, 15 and 30. The performances of the four plants designed for the different resource conditions are then simulated over the plant life to obtain the best lifetime design. The net present value and energy return on investment are selected as the measures of merit. The production history of a real geothermal well in the Taupo Volcanic Zone, New Zealand, is used as a case study for the lifetime design strategy. The results indicate that the operational parameters (such as mass flow rate of n-pentane, inlet turbine pressure and air mass flow rate) and plant performance (net power output) decrease over the whole plant life. The best lifetime plant design was at year 7 with partly degraded conditions. This condition has the highest net present value at USD 6,894,615 and energy return on investment at 4.15. Detailed thermo-economic analysis was carried out with the aim of improving the plant performance to overcome the resource degradation in two ways: operational parameter adjustments and adaptable designs. The results show that mass flow rates of n-pentane and air cooling should be adjusted to maintain the performance over the plant life. The plant design can also be adapted by installing a recuperator.
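The two measures of merit, net present value and energy return on investment, can be sketched directly. All cash-flow and energy figures below are invented for illustration, not the Taupo Volcanic Zone case-study values; the 3%/year decay stands in for the resource degradation the abstract describes.

```python
def npv(cashflows, rate):
    """Net present value; cashflows[t] is the net cash flow in year t (t = 0 is today)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def eroi(energy_delivered, energy_invested):
    """Energy return on investment: lifetime energy delivered / energy invested."""
    return energy_delivered / energy_invested

# Illustrative 30-year plant whose revenue decays 3%/year with the resource
# (all figures invented, not the case-study numbers):
capex = 10e6                                       # up-front cost ($)
flows = [-capex] + [1.2e6 * 0.97 ** t for t in range(30)]
print(round(npv(flows, 0.08)))                     # positive at an 8% discount rate
print(round(eroi(sum(30e3 * 0.97 ** t for t in range(30)), 220e3), 2))
```

Evaluating the same declining cash-flow series at different design points is essentially how the four candidate plant designs are ranked over the whole life.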

  1. Metal-Organic Frameworks: Building Block Design Strategies for the Synthesis of MOFs.

    KAUST Repository

    Luebke, Ryan

    2014-01-01

    A significant and ongoing challenge in materials chemistry and furthermore solid state chemistry is to design materials with the desired properties and characteristics. The field of Metal-Organic Frameworks (MOFs) offers several strategies

  2. Sampling strategy for estimating human exposure pathways to consumer chemicals

    Directory of Open Access Journals (Sweden)

    Eleni Papadopoulou

    2016-03-01

    Full Text Available Human exposure to consumer chemicals has become a worldwide concern. In this work, a comprehensive sampling strategy is presented, to our knowledge being the first to study all relevant exposure pathways in a single cohort using multiple methods for assessment of exposure from each exposure pathway. The selected groups of chemicals to be studied are consumer chemicals whose production and use are currently in a state of transition: per- and polyfluorinated alkyl substances (PFASs), traditional and “emerging” brominated flame retardants (BFRs and EBFRs), organophosphate esters (OPEs) and phthalate esters (PEs). Information about human exposure to these contaminants is needed due to existing data gaps on human exposure intakes from multiple exposure pathways and relationships between internal and external exposure. Indoor environment, food and biological samples were collected from 61 participants and their households in the Oslo area (Norway) on two consecutive days, during winter 2013-14. Air, dust, hand wipes, and duplicate diet (food and drink) samples were collected as indicators of external exposure, and blood, urine, blood spots, hair, nails and saliva as indicators of internal exposure. A food diary, food frequency questionnaire (FFQ) and indoor environment questionnaire were also implemented. Approximately 2000 samples were collected in total and participant views on their experiences of this campaign were collected via questionnaire. While 91% of our participants were positive about future participation in a similar project, some tasks were viewed as problematic. Completing the food diary and collection of duplicate food/drink portions were the tasks most frequently reported as “hard”/”very hard”. Nevertheless, a strong positive correlation between the reported total mass of food/drinks in the food record and the total weight of the food/drinks in the collection bottles was observed, being an indication of accurate performance.

  3. Passive solar design strategies: Remodeling guidelines for conserving energy at home

    Energy Technology Data Exchange (ETDEWEB)

    1991-01-01

    The idea of passive solar is simple, but applying it effectively does require information and attention to the details of design and construction. Some passive solar techniques are modest and low-cost, and require only small changes in a remodeler's typical practice. At the other end of the spectrum, some passive solar systems can almost eliminate a house's need for purchased heating (and in some cases, cooling) energy -- but probably at a relatively high first cost. In between are a broad range of energy-conserving passive solar techniques. Whether or not they are cost-effective, practical and attractive enough to offer a market advantage to any individual remodeler depends on very specific factors such as local costs, climate, and market characteristics. Passive Solar Design Strategies: Remodeling Guidelines for Conserving Energy at Home is written to help give remodelers the information they need to make these decisions. Passive Solar Design Strategies is a package in three basic parts: The Guidelines contain information about passive solar techniques and how they work, and provide specific examples of systems which will save various percentages of energy; The Worksheets offer a simple, fill-in-the-blank method to pre-evaluate the performance of a specific design; The Worked Example demonstrates how to complete the worksheets for a typical residence.

  4. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
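The design-effect comparison that motivates NSM can be reproduced in miniature: on a small network with a spatially clustered trait, a random-walk (RDS-like) sample has a much larger variance than simple random sampling, so its design effect is well above 1. The ring-lattice graph and trait below are toy constructions, not the Add Health or Facebook data.

```python
import random

random.seed(3)

# Toy network: a ring lattice whose binary trait is clustered on one side
n = 200
neighbors = {i: [(i - 1) % n, (i + 1) % n, (i - 5) % n, (i + 5) % n]
             for i in range(n)}
trait = [1 if i < n // 2 else 0 for i in range(n)]   # true prevalence 0.5

def walk_mean(size):
    """Random-walk (RDS-like) sample mean: successive referrals along edges."""
    node = random.randrange(n)
    total = 0
    for _ in range(size):
        total += trait[node]
        node = random.choice(neighbors[node])
    return total / size

def srs_mean(size):
    return sum(trait[i] for i in random.sample(range(n), size)) / size

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

reps, size = 2000, 30
de = (variance([walk_mean(size) for _ in range(reps)])
      / variance([srs_mean(size) for _ in range(reps)]))
print(round(de, 1))   # design effect well above 1 for the clustered trait
```

The walk rarely crosses between the two halves of the lattice within 30 steps, so each walk's mean is near 0 or 1; NSM's List and Search modes are designed precisely to break out of such local neighborhoods.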

  5. Biomedical engineering strategies in system design space.

    Science.gov (United States)

    Savageau, Michael A

    2011-04-01

    Modern systems biology and synthetic bioengineering face two major challenges in relating properties of the genetic components of a natural or engineered system to its integrated behavior. The first is the fundamental unsolved problem of relating the digital representation of the genotype to the analog representation of the parameters for the molecular components. For example, knowing the DNA sequence does not allow one to determine the kinetic parameters of an enzyme. The second is the fundamental unsolved problem of relating the parameters of the components and the environment to the phenotype of the global system. For example, knowing the parameters does not tell one how many qualitatively distinct phenotypes are in the organism's repertoire or the relative fitness of the phenotypes in different environments. These also are challenges for biomedical engineers as they attempt to develop therapeutic strategies to treat pathology or to redirect normal cellular functions for biotechnological purposes. In this article, the second of these fundamental challenges will be addressed, and the notion of a "system design space" for relating the parameter space of components to the phenotype space of bioengineering systems will be focused upon. First, the concept of a system design space will be motivated by introducing one of its key components from an intuitive perspective. Second, a simple linear example will be used to illustrate a generic method for constructing the design space in which qualitatively distinct phenotypes can be identified and counted, their fitness analyzed and compared, and their tolerance to change measured. Third, two examples of nonlinear systems from different areas of biomedical engineering will be presented. Finally, after noting a few other applications that have made use of the system design space approach to reveal important design principles, concluding remarks concerning challenges and opportunities for further development will be offered.

  6. Design strategy for the combined system of shunt passive and series active filters

    OpenAIRE

    Fujita, Hideki; Akagi, Hirofumi

    1991-01-01

    A design strategy for the combined power filter for a three-phase twelve-pulse thyristor rectifier is proposed. The shunt passive filter, which can minimize the output voltage of the series active filter, is designed and tested in a prototype model. A specially designed shunt passive filter makes it possible to reduce the required rating of the series active filter to 60% of that needed with a conventional shunt passive filter.

  7. Demonstrating Empathy: A Phenomenological Study of Instructional Designers Making Instructional Strategy Decisions for Adult Learners

    Science.gov (United States)

    Vann, Linda S.

    2017-01-01

    Instructional designers are tasked with making instructional strategy decisions to facilitate achievement of learning outcomes as part of their professional responsibilities. While the instructional design process includes learner analysis, that analysis alone does not embody opportunities to assist instructional designers with demonstrations of…

  8. A Comparison of Three Online Recruitment Strategies for Engaging Parents.

    Science.gov (United States)

    Dworkin, Jodi; Hessel, Heather; Gliske, Kate; Rudi, Jessie H

    2016-10-01

    Family scientists can face the challenge of effectively and efficiently recruiting normative samples of parents and families. Utilizing the Internet to recruit parents is a strategic way to find participants where they already are, enabling researchers to overcome many of the barriers to in-person recruitment. The present study was designed to compare three online recruitment strategies for recruiting parents: e-mail Listservs, Facebook, and Amazon Mechanical Turk (MTurk). Analyses revealed differences in the effectiveness and efficiency of data collection. In particular, MTurk resulted in the most demographically diverse sample, in a short period of time, with little cost. Listservs reached a large number of participants and resulted in a comparatively homogeneous sample. Facebook was not successful in recruiting a general sample of parents. Findings provide information that can help family researchers and practitioners be intentional about recruitment strategies and study design.

  9. Surveillance Monitoring Management for General Care Units: Strategy, Design, and Implementation.

    Science.gov (United States)

    McGrath, Susan P; Taenzer, Andreas H; Karon, Nancy; Blike, George

    2016-07-01

    The growing number of monitoring devices, combined with suboptimal patient monitoring and alarm management strategies, has increased "alarm fatigue," which has led to serious consequences. Most reported alarm management approaches have focused on the critical care setting. Since 2007 Dartmouth-Hitchcock (Lebanon, New Hampshire) has developed a generalizable and effective design, implementation, and performance evaluation approach to alarm systems for continuous monitoring in general care settings (that is, patient surveillance monitoring). In late 2007, a patient surveillance monitoring system was piloted on the basis of a structured design and implementation approach in a 36-bed orthopedics unit. Beginning in early 2009, it was expanded to cover more than 200 inpatient beds in all medicine and surgical units, except for psychiatry and labor and delivery. Improvements in clinical outcomes (reduction of unplanned transfers by 50% and reduction of rescue events by more than 60% in 2008) and approximately two alarms per patient per 12-hour nursing shift in the original pilot unit have been sustained across most D-H general care units in spite of increasing patient acuity and unit occupancy. Sample analysis of pager notifications indicates that more than 85% of all alarm conditions are resolved within 30 seconds and that more than 99% are resolved before escalation is triggered. The D-H surveillance monitoring system employs several important, generalizable features to manage alarms in a general care setting: alarm delays, static thresholds set appropriately for the prevalence of events in this setting, directed alarm annunciation, and policy-driven customization of thresholds to allow clinicians to respond to needs of individual patients. The systematic approach to design, implementation, and performance management has been key to the success of the system.
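The alarm-delay idea (annunciate only when a signal stays past its threshold for a sustained interval) can be sketched as follows. The SpO2 threshold and 30-second delay are illustrative assumptions, not the published D-H settings, and the sample stream is invented.

```python
def should_annunciate(samples, threshold, delay_samples):
    """Fire the alarm only when the signal stays below threshold for
    delay_samples consecutive readings (a sketch of the alarm-delay idea,
    not the Dartmouth-Hitchcock implementation)."""
    run = 0
    for value in samples:
        run = run + 1 if value < threshold else 0
        if run >= delay_samples:
            return True
    return False

# SpO2 stream sampled every 5 s; alarm on < 80% sustained for 30 s (6 samples)
transient = [95, 78, 79, 96, 95, 94, 95, 96, 95, 95]   # brief dip: suppressed
sustained = [95, 78, 77, 76, 75, 76, 77, 95, 95, 95]   # 30 s below: alarms
print(should_annunciate(transient, 80, 6), should_annunciate(sustained, 80, 6))
# → False True
```

Suppressing self-resolving transients like the first stream is how delays cut the nuisance-alarm rate, which is consistent with the abstract's observation that most alarm conditions resolve within 30 seconds.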

  10. Linking Serious Game Narratives with Pedagogical Theories and Pedagogical Design Strategies

    Science.gov (United States)

    De Troyer, Olga; Van Broeckhoven, Frederik; Vlieghe, Joachim

    2017-01-01

    Narrative-based serious games present pedagogical content and interventions through an interactive narrative. To ensure effective learning in such kind of serious games, designers are not only faced with the challenge of creating a compelling narrative, but also with the additional challenge of incorporating suitable pedagogical strategies.…

  11. Gathering Requirements for Teacher Tools: Strategies for Empowering Teachers Through Co-Design

    Science.gov (United States)

    Matuk, Camillia; Gerard, Libby; Lim-Breitbart, Jonathan; Linn, Marcia

    2016-02-01

    Technology can enhance teachers' practice in multiple ways. It can help them better understand patterns in their students' thinking, manage class progress at individual and group levels, and obtain evidence to inform modifications to curriculum and instruction. Such technology is most effective when it is aligned with teachers' goals and expectations. Participatory methods, which involve teachers closely in the design process, are widely recommended for establishing accurate design requirements that address users' needs. By collaborating with researchers, teachers can contribute their professional expertise to shape the tools of their practice, and ultimately ensure their sustained use. However, there is little guidance available for maintaining effective teacher-researcher design partnerships. We describe four strategies for engaging teachers in designing tools intended to support and enhance their practice within a web-based science learning environment: discussing physical artifacts, reacting to scenarios, customizing prototypes, and writing user stories. Using design artifacts and documents of teachers' reflections, we illustrate how we applied these techniques over 5 years of annual professional development workshops, and examine their affordances for eliciting teachers' ideas. We reflect on how these approaches have helped inform technology refinements and innovations. We moreover discuss the further benefits these strategies have had in encouraging teachers to reflect on their own practice and on the roles of technology in supporting it; and in allowing researchers to gain a deeper understanding of the relationship between technology, teaching, and design.

  12. Optimal sampling strategy for data mining

    International Nuclear Information System (INIS)

    Ghaffar, A.; Shahbaz, M.; Mahmood, W.

    2013-01-01

    Technologies such as the Internet, corporate intranets, data warehouses, ERPs, satellites, digital sensors, embedded systems, and mobile networks are all generating such massive amounts of data that it is getting very difficult to analyze and understand all of it, even using data mining tools. Huge datasets are becoming a difficult challenge for classification algorithms. With increasing amounts of data, data mining algorithms are getting slower and analysis is getting less interactive. Sampling can be a solution: using a fraction of the computing resources, sampling can often provide the same level of accuracy. The process of sampling requires much care because many factors are involved in the determination of the correct sample size. The approach proposed in this paper tries to solve this problem. Based on a statistical formula, after setting some parameters, it returns a sample size, called the sufficient sample size, which is then selected through probability sampling. Results indicate the usefulness of this technique in coping with the problem of huge datasets. (author)
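
    The abstract does not reproduce its statistical formula, so as a hedged illustration only: a widely used choice for a "sufficient sample size" is Cochran's formula with a finite-population correction, followed by simple random (probability) sampling. The function names and parameter values below are illustrative assumptions, not the paper's.

```python
import math
import random

def sufficient_sample_size(population_size, confidence_z=1.96, margin=0.05, p=0.5):
    """Cochran's formula with finite-population correction.

    confidence_z: z-score for the confidence level (1.96 ~ 95%)
    margin:      acceptable margin of error
    p:           estimated proportion (0.5 is the most conservative choice)
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / margin ** 2
    n = n0 / (1 + (n0 - 1) / population_size)
    return math.ceil(n)

def draw_sample(dataset, n, seed=0):
    """Probability (simple random) sampling of n records."""
    rng = random.Random(seed)
    return rng.sample(dataset, n)

n = sufficient_sample_size(population_size=10_000)  # -> 370
subset = draw_sample(list(range(10_000)), n)
```

    For a population of 10,000 records at a 95% confidence level and a 5% margin of error, this yields a sample of 370 records, which illustrates how a modest probability sample can stand in for a huge dataset.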

  13. Development of high-immersive simulation system for designing maintenance strategy and its application to CLEAR-I

    International Nuclear Information System (INIS)

    Yang, Zihui; He, Tao; Shang, Leiming; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • A self-developed system simulates and verifies the maintenance strategy. • Large-scale 3D radiation scenes can be rapidly built and rendered in real time. • A high-efficiency simulation approach is provided by tracking 6-DOF body motion. • Applied to CLEAR-I, it proved the maintenance strategy was feasible. - Abstract: The maintenance strategy is imperative and must be considered in the design process of an advanced nuclear reactor as early as possible. The design and validation of the maintenance strategy are key challenges that remain to be resolved due to the complex mechanical structure, the expensive physical mockup, and the potential risk of radiation hazards. In this paper, a high-immersive interactive simulation system has been developed to simulate and verify the maintenance strategy using virtual reality. Main features include: (1) rapid modeling and real-time rendering of the large-scale three-dimensional radiation scene, (2) real-time interactive roaming by tracking the 6-DOF body motion, and (3) interactive disassembly simulation of the mechanical structure controlled by hand gestures. The system provides designers with an intuitive experience environment, an efficient and flexible interactive mode, and whole-process simulation of the maintenance strategy. It has been applied to simulate and verify the maintenance strategy of the spallation target proton beam window for the China LEAd-based research Reactor (CLEAR-I). The simulation result proved that the maintenance strategy for the proton beam window was reasonable and feasible. The system is useful for verifying maintenance strategies, optimizing maintenance operations, and reducing intervention time and worker exposure dose. It provides a high-efficiency, low-cost way to simulate maintenance strategies to meet the requirements of ALARA rules and can be extended to other nuclear facilities.

  14. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    Science.gov (United States)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-02-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failure. Thus, to ensure normal production and the safety of machine operators, it is essential to detect failures as soon as possible. However, it is a major challenge to balance detection efficiency against big-data acquisition given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist sampling rate, which eases the pressure generated by large-scale data. The large volume of vibration data from a faulty roller bearing is first reduced by a down-sampling strategy that preserves the fault features by selecting peaks to represent the data segments in the time domain. A problem arises, however, in that the fault features may be weaker than before: when the noise is stronger than the vibration signals, noise may be mistaken for the peaks, making the fault features impossible to extract by commonly used envelope analysis. Here we employ compressive sensing theory to overcome this problem; it enhances the signal and further reduces the sample size. Moreover, it is capable of detecting fault features from a small number of samples using an orthogonal matching pursuit approach, which overcomes the shortcomings of the multiple down-sampling algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults.
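
    The peak-selection step described above can be sketched in a few lines. This is a hedged illustration of the general idea (segment the time-domain signal, keep the largest-magnitude sample per segment); the paper's actual segment lengths and repeated down-sampling passes are not reproduced here.

```python
def peak_downsample(signal, segment_len):
    """Represent each segment of the signal by its largest-magnitude
    sample, shrinking the data while preserving impulsive fault peaks.
    A sketch of the 'select peaks per segment' idea only; the paper's
    exact segmentation and repetition scheme may differ."""
    peaks = []
    for start in range(0, len(signal), segment_len):
        segment = signal[start:start + segment_len]
        peaks.append(max(segment, key=abs))  # keep the peak, sign included
    return peaks

x = [0.1, -0.2, 3.0, 0.05, -0.1, -2.5, 0.3, 0.2]
print(peak_downsample(x, 4))  # -> [3.0, -2.5]
```

    Note that when noise exceeds the vibration amplitude, the selected "peak" may be a noise sample, which is exactly the weakness the compressive sensing stage is introduced to compensate for.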

  15. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    International Nuclear Information System (INIS)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-01-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failure. Thus, to ensure normal production and the safety of machine operators, it is essential to detect failures as soon as possible. However, it is a major challenge to balance detection efficiency against big-data acquisition given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist sampling rate, which eases the pressure generated by large-scale data. The large volume of vibration data from a faulty roller bearing is first reduced by a down-sampling strategy that preserves the fault features by selecting peaks to represent the data segments in the time domain. A problem arises, however, in that the fault features may be weaker than before: when the noise is stronger than the vibration signals, noise may be mistaken for the peaks, making the fault features impossible to extract by commonly used envelope analysis. Here we employ compressive sensing theory to overcome this problem; it enhances the signal and further reduces the sample size. Moreover, it is capable of detecting fault features from a small number of samples using an orthogonal matching pursuit approach, which overcomes the shortcomings of the multiple down-sampling algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults. (paper)

  16. Design of a gravity corer for near shore sediment sampling

    Digital Repository Service at National Institute of Oceanography (India)

    Bhat, S.T.; Sonawane, A.V.; Nayak, B.U.

    For the purpose of geotechnical investigation, a gravity corer has been designed and fabricated to obtain undisturbed sediment core samples from near shore waters. The corer was successfully operated at 75 stations up to a water depth of 30 m. Simplicity...

  17. Instructional Strategies to Support Creativity and Innovation in Education

    Science.gov (United States)

    Seechaliao, Thapanee

    2017-01-01

    The purpose of the study focused on the instructional strategies that support creation of creative and innovative education. The sample for this study consisted of 11 experts in the field of instructional strategies that support innovation of education. Among them, five were specialists in design and development of teaching and learning, three…

  18. Measuring strategies for learning regulation in medical education: scale reliability and dimensionality in a Swedish sample.

    Science.gov (United States)

    Edelbring, Samuel

    2012-08-15

    The degree of learners' self-regulated learning and dependence on external regulation influence learning processes in higher education. These regulation strategies are commonly measured by questionnaires developed in settings other than those in which they are used, thereby requiring renewed validation. The aim of this study was to psychometrically evaluate the learning regulation strategy scales from the Inventory of Learning Styles with Swedish medical students (N = 206). The regulation scales were evaluated regarding their reliability, scale dimensionality and interrelations. The primary evaluation focused on dimensionality and was performed with Mokken scale analysis. To assist future scale refinement, additional item analysis, such as item-to-scale correlations, was performed. Scale scores in the Swedish sample displayed good reliability in relation to published results: Cronbach's alpha of 0.82, 0.72, and 0.65 for the self-regulation, external regulation, and lack of regulation scales, respectively. Scale dimensionality was adequate for self-regulation and its subscales, whereas external regulation and lack of regulation were less unidimensional. The established theoretical scales were largely replicated in the exploratory analysis. The item analysis identified two items that contributed little to their respective scales. The results indicate that these scales have an adequate capacity for detecting the three theoretically proposed learning regulation strategies in the medical education sample. Further construct validity should be sought by interpreting scale scores in relation to specific learning activities. Using established scales for measuring students' regulation strategies enables a broad empirical base for increasing knowledge of regulation strategies in relation to different disciplinary settings and contributes to theoretical development.
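
    The reliability figures quoted above are Cronbach's alpha values, which can be computed directly from item scores. A minimal sketch of the standard formula (not the study's own analysis code) using only the Python standard library:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item score lists (one list per item,
    respondents in the same order in each list). Sample variances are
    used throughout, which keeps the ratio consistent."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent totals
    item_var_sum = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Two perfectly covarying items -> alpha = 1.0
print(cronbach_alpha([[1, 2, 3, 4], [2, 3, 4, 5]]))
```

    Values around 0.7 and above are conventionally read as acceptable internal consistency, which is why the 0.65 reported for the lack-of-regulation scale is the weakest of the three.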

  19. Effective strategy making: Co-designing scenarios as a tool for effective strategic planning

    Directory of Open Access Journals (Sweden)

    Jan Vogelij

    2015-08-01

    Co-producing, as making together a product, and collaboration towards a strategy, defined as a common exploration of possibilities. The interactions of the participants in the strategy making process, potentially creating trust and enhancing social cohesion, may be more important than the resulting concept for a development strategy. Therefore our attention focused on the way strategies are made in the black box of specific processes. The interactions aiming at co-ownership and collaboration are central in this novel ’interactions approach’. The resulting strategy does not necessarily contain (infra)structural projects; a spatial strategy concerns a selected localised policy and argumentative framework for future development. Societal relevance Being a planning consultant with over forty years of experience in practice, the author highly values practical applicability in society. Understanding of practice-related factors for effectiveness of strategy making is important because effective public management saves costs for society; a framework expressing the aimed-for direction of development provides clarity for private initiatives; an agreed strategy helps to coordinate sector policies; several EU and national subsidies for projects require a locally agreed structural frame. Because a development vision concerns the direction of an envisaged development, its effectiveness is defined in terms of performance of the argumentative framework based on the story lines developed in the discussions during the strategy making process. Our search aims at identifying recommendations for enhanced chances for performing strategies in praxis. Approach A theoretical frame was composed based on literature in the fields of planning theory, policy analysis and design. Research questions were formulated concerning the importance of the process related variables: open process management, co-ownership, co-design, the application of scenarios and visualizations.
Concerning the importance of the place

  20. A binary logistic regression model with complex sampling design of ...

    African Journals Online (AJOL)

    2017-09-03

    Sep 3, 2017 ... Bi-variable and multi-variable binary logistic regression models with complex sampling design were fitted. ... Data were entered into STATA-12 and analyzed using SPSS-21. ...

  1. Experimental design matters for statistical analysis

    DEFF Research Database (Denmark)

    Jensen, Signe Marie; Schaarschmidt, Frank; Onofri, Andrea

    2018-01-01

    , the experimental design is often more or less neglected when analyzing data. Two data examples were analyzed using different modelling strategies: Firstly, in a randomized complete block design, mean heights of maize treated with a herbicide and one of several adjuvants were compared. Secondly, translocation...... of an insecticide applied to maize as a seed treatment was evaluated using incomplete data from an unbalanced design with several layers of hierarchical sampling. Extensive simulations were carried out to further substantiate the effects of different modelling strategies. RESULTS: It was shown that results from sub...

  2. Inner strategies of coping with operational work amongst SAPS officers

    Directory of Open Access Journals (Sweden)

    Masefako A. Gumani

    2013-11-01

    Research purpose: The objective of this study was to describe inner coping strategies used by officers in the Vhembe district (South Africa) to reconstruct stressful and traumatic experiences at work. Motivation for the study: Most studies on coping amongst SAPS officers focus on organisational stress and not on the impact of the officers’ operational work. Research design, approach and method: An exploratory design was used and 20 SAPS officers were selected through purposive sampling. In-depth face-to-face and telephone interviews, as well as diaries were used to collect data, which were analysed using content thematic data analysis. Main findings: The results showed that the main categories of coping strategies that led to management of the impact of operational work amongst the selected sample were centred around problem-focused and emotion-focused strategies, with some use of reappraisal and minimal use of avoidance. Considering the context of the officers’ work, the list of dimensions of inner coping strategies amongst SAPS officers should be extended. Practical/managerial implications: Intervention programmes designed for the SAPS, including critical incident stress debriefing, should take the operational officers’ inner strategies into account to improve the management of the impact of their work. Contribution/value-add: This study contributes to the body of knowledge on the inner coping strategies amongst SAPS officers, with special reference to operational work in a specific setting.

  3. WRAP Module 1 sampling strategy and waste characterization alternatives study

    Energy Technology Data Exchange (ETDEWEB)

    Bergeson, C.L.

    1994-09-30

    The Waste Receiving and Processing Module 1 Facility is designed to examine, process, certify, and ship drums and boxes of solid wastes that have a surface dose equivalent of less than 200 mrem/h. These wastes will include low-level and transuranic wastes that are retrievably stored in the 200 Area burial grounds and facilities in addition to newly generated wastes. Certification of retrievably stored wastes processed in WRAP 1 is required to meet the waste acceptance criteria for onsite treatment and disposal of low-level waste and mixed low-level waste and the Waste Isolation Pilot Plant Waste Acceptance Criteria for the disposal of TRU waste. In addition, these wastes will need to be certified for packaging in TRUPACT-II shipping containers. Characterization of the retrievably stored waste is needed to support the certification process. Characterization data will be obtained from historical records, process knowledge, nondestructive examination, nondestructive assay, visual inspection of the waste, head-gas sampling, and analysis of samples taken from the waste containers. Sample characterization refers to the method or methods that are used to test waste samples for specific analytes. The focus of this study is the sample characterization needed to accurately identify the hazardous and radioactive constituents present in the retrieved wastes that will be processed in WRAP 1. In addition, some sampling and characterization will be required to support NDA calculations and to provide an over-check for the characterization of newly generated wastes. This study results in the baseline definition of WRAP 1 sampling and analysis requirements and identifies alternative methods to meet these requirements in an efficient and economical manner.

  4. WRAP Module 1 sampling strategy and waste characterization alternatives study

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    The Waste Receiving and Processing Module 1 Facility is designed to examine, process, certify, and ship drums and boxes of solid wastes that have a surface dose equivalent of less than 200 mrem/h. These wastes will include low-level and transuranic wastes that are retrievably stored in the 200 Area burial grounds and facilities in addition to newly generated wastes. Certification of retrievably stored wastes processed in WRAP 1 is required to meet the waste acceptance criteria for onsite treatment and disposal of low-level waste and mixed low-level waste and the Waste Isolation Pilot Plant Waste Acceptance Criteria for the disposal of TRU waste. In addition, these wastes will need to be certified for packaging in TRUPACT-II shipping containers. Characterization of the retrievably stored waste is needed to support the certification process. Characterization data will be obtained from historical records, process knowledge, nondestructive examination, nondestructive assay, visual inspection of the waste, head-gas sampling, and analysis of samples taken from the waste containers. Sample characterization refers to the method or methods that are used to test waste samples for specific analytes. The focus of this study is the sample characterization needed to accurately identify the hazardous and radioactive constituents present in the retrieved wastes that will be processed in WRAP 1. In addition, some sampling and characterization will be required to support NDA calculations and to provide an over-check for the characterization of newly generated wastes. This study results in the baseline definition of WRAP 1 sampling and analysis requirements and identifies alternative methods to meet these requirements in an efficient and economical manner.

  5. A strategy for reducing turnaround time in design optimization using a distributed computer system

    Science.gov (United States)

    Young, Katherine C.; Padula, Sharon L.; Rogers, James L.

    1988-01-01

    There is a need to explore methods for reducing the lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means of reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.
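
    The decomposition idea is straightforward to sketch with a worker pool: independent analysis runs are farmed out concurrently and the results gathered in order. The analysis function below is a hypothetical stand-in, and an in-process thread pool substitutes for the paper's network of smaller computers (separate machines or processes would be used for real CPU-bound analyses).

```python
from concurrent.futures import ThreadPoolExecutor

def analyze(design_point):
    """Hypothetical stand-in for one run of a validated analysis code."""
    x, y = design_point
    return x * x + y * y

def evaluate_in_parallel(design_points, workers=4):
    """Farm independent analysis runs out to a pool of workers, the way a
    decomposed design problem distributes subproblems across computers.
    map() returns results in submission order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze, design_points))

points = [(1, 2), (3, 4), (5, 6)]
print(evaluate_in_parallel(points))  # -> [5, 25, 61]
```

    This only pays off when the subproblems are independent, which matches the paper's observation that the approach suits problems that lend themselves to decomposition.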

  6. Are quantitative trait-dependent sampling designs cost-effective for analysis of rare and common variants?

    Science.gov (United States)

    Yilmaz, Yildiz E; Bull, Shelley B

    2011-11-29

    Use of trait-dependent sampling designs in whole-genome association studies of sequence data can reduce total sequencing costs with modest losses of statistical efficiency. In a quantitative trait (QT) analysis of data from the Genetic Analysis Workshop 17 mini-exome for unrelated individuals in the Asian subpopulation, we investigate alternative designs that sequence only 50% of the entire cohort. In addition to a simple random sampling design, we consider extreme-phenotype designs that are of increasing interest in genetic association analysis of QTs, especially in studies concerned with the detection of rare genetic variants. We also evaluate a novel sampling design in which all individuals have a nonzero probability of being selected into the sample but in which individuals with extreme phenotypes have a proportionately larger probability. We take differential sampling of individuals with informative trait values into account by inverse probability weighting using standard survey methods which thus generalizes to the source population. In replicate 1 data, we applied the designs in association analysis of Q1 with both rare and common variants in the FLT1 gene, based on knowledge of the generating model. Using all 200 replicate data sets, we similarly analyzed Q1 and Q4 (which is known to be free of association with FLT1) to evaluate relative efficiency, type I error, and power. Simulation study results suggest that the QT-dependent selection designs generally yield greater than 50% relative efficiency compared to using the entire cohort, implying cost-effectiveness of 50% sample selection and worthwhile reduction of sequencing costs.
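
    The inverse probability weighting used to generalize from a trait-dependent sample back to the source population can be illustrated with a tiny sketch (a Hajek-style weighted mean; the study's actual survey-method implementation is not reproduced here):

```python
def ipw_mean(values, inclusion_probs):
    """Inverse-probability-weighted mean: each sampled unit is weighted
    by 1/p_i, the inverse of its probability of selection, so units with
    extreme phenotypes (sampled with high probability) do not dominate
    the population-level estimate."""
    weights = [1.0 / p for p in inclusion_probs]
    return sum(w * y for w, y in zip(weights, values)) / sum(weights)

# A unit sampled with p = 0.5 stands in for two population units
print(ipw_mean([10.0, 20.0], [0.5, 1.0]))  # ≈ 13.33
```

    With equal inclusion probabilities the estimator reduces to the ordinary sample mean, which is the simple-random-sampling design in the comparison above.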

  7. Catch, effort and sampling strategies in the highly variable sardine fisheries around East Java, Indonesia.

    NARCIS (Netherlands)

    Pet, J.S.; Densen, van W.L.T.; Machiels, M.A.M.; Sukkel, M.; Setyohady, D.; Tumuljadi, A.

    1997-01-01

    Temporal and spatial patterns in the fishery for Sardinella spp. around East Java, Indonesia, were studied in an attempt to develop an efficient catch and effort sampling strategy for this highly variable fishery. The inter-annual and monthly variation in catch, effort and catch per unit of effort

  8. Performance Analysis and Design Strategy for a Second-Order, Fixed-Gain, Position-Velocity-Measured (α-β-η-θ) Tracking Filter

    Directory of Open Access Journals (Sweden)

    Kenshi Saho

    2017-07-01

    We present a strategy for designing an α-β-η-θ filter, a fixed-gain moving-object tracking filter using position and velocity measurements. First, performance indices and stability conditions for the filter are analytically derived. Then, an optimal gain design strategy using these results is proposed and its relationship to the position-velocity-measured (PVM) Kalman filter is shown. Numerical analyses demonstrate the effectiveness of the proposed strategy, as well as a performance improvement over the traditional position-only-measured α-β filter. Moreover, we apply an α-β-η-θ filter designed using this strategy to ultra-wideband Doppler radar tracking in numerical simulations. We verify that the proposed strategy can easily design the gains for an α-β-η-θ filter based on the performance of the ultra-wideband Doppler radar and a rough approximation of the target’s acceleration. Moreover, its effectiveness in predicting the steady-state performance in designing the position-velocity-measured Kalman filter is also demonstrated.
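
    The paper's exact filter equations are not reproduced in this record, so the sketch below is a hedged illustration of a generic second-order, fixed-gain, position-velocity-measured tracker: a constant-velocity prediction followed by fixed-gain corrections of both states using both measurement residuals. The placement of the gains α, β, η, and θ is an assumption and may differ from the paper's definitions.

```python
def track(meas, T, alpha, beta, eta, theta, x0=0.0, v0=0.0):
    """Fixed-gain tracking with position AND velocity measurements.
    Each step: constant-velocity prediction, then correction of position
    and velocity by fixed gains on the two residuals. Illustrative gain
    placement, not the paper's exact filter."""
    x, v = x0, v0
    estimates = []
    for x_m, v_m in meas:
        x_p = x + T * v              # predict position
        v_p = v                      # predict velocity (constant-velocity model)
        rx = x_m - x_p               # position residual
        rv = v_m - v_p               # velocity residual
        x = x_p + alpha * rx + eta * T * rv
        v = v_p + (beta / T) * rx + theta * rv
        estimates.append((x, v))
    return estimates

# Constant-velocity target (v = 2) with exact measurements: the
# estimates converge to the true trajectory.
meas = [(2.0 * k, 2.0) for k in range(1, 51)]
est = track(meas, T=1.0, alpha=0.5, beta=0.2, eta=0.3, theta=0.5)
```

    With stable gains the estimation error decays geometrically; gain selection trades this convergence speed against noise smoothing, which is what the paper's performance indices quantify.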

  9. Applying Instructional Design Strategies and Behavior Theory to Household Disaster Preparedness Training.

    Science.gov (United States)

    Thomas, Tracy N; Sobelson, Robyn K; Wigington, Corinne J; Davis, Alyson L; Harp, Victoria H; Leander-Griffith, Michelle; Cioffi, Joan P

    Interventions and media campaigns promoting household disaster preparedness have produced mixed results in affecting behaviors. In large part, this is due to the limited application of instructional design strategies and behavior theory, such as the Transtheoretical Model (TTM). This study describes the development and evaluation of Ready CDC, an intervention designed to increase household disaster preparedness among the Centers for Disease Control and Prevention (CDC) workforce. (1) Describe the instructional design strategies employed in the development of Ready CDC and (2) evaluate the intervention's impact on behavior change and factors influencing stage progression for household disaster preparedness behavior. Ready CDC was adapted from the Federal Emergency Management Agency's (FEMA's) Ready campaign. Offered to CDC staff September 2013-November 2015, it consisted of a preassessment of preparedness attitudes and behaviors, an in-person training, behavioral reinforcement communications, and a 3-month follow-up postassessment. Ready CDC employed well-accepted design strategies, including presenting stimulus material and enhancing transfer of desired behavior. Excluding those in the TTM "maintenance" stage at baseline, this study determined 44% of 208 participants progressed at least 1 stage for developing a written disaster plan. Moreover, assessment of progression by stage found among participants in the "precontemplation" (n = 16), "contemplation" (n = 15), and "preparation" (n = 125) stages at baseline for assembling an emergency kit, 25%, 27%, and 43% moved beyond the "preparation" stage, respectively. Factors influencing stage movement included knowledge, attitudes, and community resiliency but varied depending on baseline stage of change. Employing instructional strategies and behavioral theories in preparedness interventions optimizes the potential for individuals to adopt preparedness behaviors. Study findings suggest that stage movement toward

  10. Sampling strategies for subsampled segmented EPI PRF thermometry in MR guided high intensity focused ultrasound

    Science.gov (United States)

    Odéen, Henrik; Todd, Nick; Diakite, Mahamadou; Minalga, Emilee; Payne, Allison; Parker, Dennis L.

    2014-01-01

    Purpose: To investigate k-space subsampling strategies to achieve fast, large field-of-view (FOV) temperature monitoring using segmented echo planar imaging (EPI) proton resonance frequency shift thermometry for MR guided high intensity focused ultrasound (MRgHIFU) applications. Methods: Five different k-space sampling approaches were investigated, varying sample spacing (equally vs nonequally spaced within the echo train), sampling density (variable sampling density in zero, one, and two dimensions), and utilizing sequential or centric sampling. Three of the schemes utilized sequential sampling with the sampling density varied in zero, one, and two dimensions, to investigate sampling the k-space center more frequently. Two of the schemes utilized centric sampling to acquire the k-space center with a longer echo time for improved phase measurements, and vary the sampling density in zero and two dimensions, respectively. Phantom experiments and a theoretical point spread function analysis were performed to investigate their performance. Variable density sampling in zero and two dimensions was also implemented in a non-EPI GRE pulse sequence for comparison. All subsampled data were reconstructed with a previously described temporally constrained reconstruction (TCR) algorithm. Results: The accuracy of each sampling strategy in measuring the temperature rise in the HIFU focal spot was measured in terms of the root-mean-square-error (RMSE) compared to fully sampled “truth.” For the schemes utilizing sequential sampling, the accuracy was found to improve with the dimensionality of the variable density sampling, giving values of 0.65 °C, 0.49 °C, and 0.35 °C for density variation in zero, one, and two dimensions, respectively. The schemes utilizing centric sampling were found to underestimate the temperature rise, with RMSE values of 1.05 °C and 1.31 °C, for variable density sampling in zero and two dimensions, respectively. Similar subsampling schemes
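
    As a concrete (and deliberately simplified) illustration of varying the sampling density in one dimension, the sketch below builds a Cartesian phase-encode mask that fully samples a central band of k-space and uniformly subsamples the periphery. It is a generic stand-in, not one of the five EPI sampling schemes compared in the paper.

```python
def vd_mask_1d(n_lines, acs_width, outer_step):
    """One-dimensional variable-density Cartesian subsampling mask:
    fully sample a central autocalibration (ACS) band of k-space and
    take every `outer_step`-th phase-encode line outside it. The k-space
    center carries most of the image energy, so sampling it densely
    helps a constrained reconstruction (such as TCR) recover the rest."""
    center = n_lines // 2
    half = acs_width // 2
    mask = []
    for ky in range(n_lines):
        in_acs = abs(ky - center) <= half
        mask.append(in_acs or (ky % outer_step == 0))
    return mask

mask = vd_mask_1d(n_lines=64, acs_width=8, outer_step=4)
print(sum(mask), "of", len(mask), "lines sampled")  # 22 of 64
```

    The subsampling factor follows directly from the mask (here roughly 3x), and extending the same density idea to two dimensions corresponds to the two-dimensional variable-density schemes that performed best above.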

  11. Multi-infill strategy for kriging models used in variable fidelity optimization

    Directory of Open Access Journals (Sweden)

    Chao SONG

    2018-03-01

    In this paper, a computationally efficient optimization method for aerodynamic design has been developed. The low-fidelity model and the multi-infill strategy are utilized in this approach. Low-fidelity data are employed to provide a good global trend for model prediction, and multiple sample points chosen by different infill criteria in each updating cycle are used to enhance the exploitation and exploration ability of the optimization approach. By taking advantage of the low-fidelity model and the multi-infill strategy, no initial sample for the high-fidelity model is needed. This approach is applied to an airfoil design case and a high-dimensional wing design case. It saves a large number of high-fidelity function evaluations during initial model construction. Moreover, a faster reduction of the aerodynamic objective is achieved when compared to ordinary kriging using the multi-infill strategy and to a variable-fidelity model using a single infill criterion. The results indicate that the developed approach has promising application to efficient aerodynamic design when high-fidelity analyses are involved. Keywords: Aerodynamics, Infill criteria, Kriging models, Multi-infill, Optimization
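
    One common infill criterion that a multi-infill strategy can combine with others (for example, pure exploitation of the predicted minimum and pure exploration of the maximum-variance point) is expected improvement, computable in closed form from the kriging prediction. A hedged sketch; the paper's particular set of infill criteria is not reproduced here.

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement of a kriging prediction (mean mu, standard
    deviation sigma) over the best observed value f_best, for
    minimization. Large when mu is low (exploitation) or sigma is
    large (exploration), so maximizing it balances the two."""
    if sigma <= 0.0:
        return 0.0                      # no uncertainty, no improvement
    z = (f_best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (f_best - mu) * cdf + sigma * pdf

# At mu == f_best the criterion is purely uncertainty-driven:
print(expected_improvement(1.0, 1.0, 1.0))  # ≈ 0.3989
```

    Selecting one new sample per criterion per updating cycle, rather than a single point, is what gives the multi-infill approach its simultaneous exploitation and exploration.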

  12. Evaluation of design, leak monitoring, and NDE strategies to assure PBMR Helium pressure boundary reliability - HTR2008-58037

    International Nuclear Information System (INIS)

    Fleming, K. N.; Smit, K.

    2008-01-01

    This paper discusses the reliability and integrity management (RIM) strategies that have been applied in the design of the PBMR passive metallic components for the helium pressure boundary (HPB) to meet reliability targets and to evaluate what combination of strategies are needed to meet the targets. The strategies considered include deterministic design strategies to reduce or eliminate the potential for specific damage mechanisms, use of an on-line leak monitoring system and associated design provisions that provide a high degree of leak detection reliability, and periodic nondestructive examinations combined with repair and replacement strategies to reduce the probability that degradation would lead to pipe ruptures. The PBMR RIM program for passive metallic piping components uses a leak-before-break philosophy. A Markov model developed for use in LWR risk-informed in-service inspection evaluations was applied to investigate the impact of alternative RIM strategies and plant age assumptions on the pipe rupture frequencies as a function of rupture size. Some key results of this investigation are presented in this paper. (authors)
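A Markov model of the kind described, degradation states with monitoring-and-repair transitions, can be sketched as follows (the states and yearly transition probabilities are illustrative, not PBMR values):

```python
import numpy as np

# states: 0 = intact, 1 = flaw, 2 = leak, 3 = rupture (absorbing)
def rupture_probability(p_flaw, p_leak, p_rupture, p_repair, years):
    """Probability of reaching the rupture state within `years`, given
    per-year transition probabilities. Leak detection plus repair returns
    the leak state to intact, mimicking a leak-before-break RIM strategy."""
    P = np.array([
        [1 - p_flaw, p_flaw,                    0.0,    0.0],
        [0.0,        1 - p_leak - p_rupture,    p_leak, p_rupture],
        [p_repair,   0.0, 1 - p_repair - p_rupture,     p_rupture],
        [0.0,        0.0,                       0.0,    1.0],
    ])
    state = np.array([1.0, 0.0, 0.0, 0.0])      # start intact
    state = state @ np.linalg.matrix_power(P, years)
    return float(state[3])

# with on-line leak monitoring (high repair rate) vs without
with_monitoring    = rupture_probability(1e-3, 1e-2, 1e-4, 0.9, 40)
without_monitoring = rupture_probability(1e-3, 1e-2, 1e-4, 0.0, 40)
assert with_monitoring < without_monitoring
```

The comparison mirrors the paper's use of the Markov model to quantify how alternative RIM strategies change rupture frequency over plant life.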

  13. poolHiTS: A Shifted Transversal Design based pooling strategy for high-throughput drug screening

    Directory of Open Access Journals (Sweden)

    Woolf Peter J

    2008-05-01

    Full Text Available Abstract Background A key goal of drug discovery is to increase the throughput of small molecule screens without sacrificing screening accuracy. High-throughput screening (HTS) in drug discovery involves testing a large number of compounds in a biological assay to identify active compounds. Normally, molecules from a large compound library are tested individually to identify the activity of each molecule. Usually a small number of compounds are found to be active; however, the presence of false positive and false negative testing errors suggests that this one-drug one-assay screening strategy can be significantly improved. Pooling designs are testing schemes that test mixtures of compounds in each assay, thereby generating a screen of the whole compound library in fewer tests. By repeatedly testing compounds in different combinations, pooling designs also allow for error-correction. These pooled designs, for specific experiment parameters, can be simply and efficiently created using the Shifted Transversal Design (STD) pooling algorithm. However, drug screening contains a number of key constraints that require specific modifications if this pooling approach is to be useful for practical screen designs. Results In this paper, we introduce a pooling strategy called poolHiTS (Pooled High-Throughput Screening), which is based on the STD algorithm. In poolHiTS, we implement a limit on the number of compounds that can be mixed in a single assay. In addition, we show that the STD-based pooling strategy is limited in the error-correction that it can achieve. Due to the mixing constraint, we show that it is more efficient to split a large library into smaller blocks of compounds, which are then tested using an optimized strategy repeated for each block. We package the optimal block selection algorithm into poolHiTS. The MATLAB codes for the poolHiTS algorithm and the corresponding decoding strategy are also provided. Conclusion We have produced a practical version
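The STD construction underlying poolHiTS can be sketched as follows; this is a simplified reading of the published algorithm, so the parameter names and the exact layer rule here are assumptions, not the paper's MATLAB code:

```python
import math

def std_layers(n, q, k):
    """Sketch of a Shifted Transversal Design (STD) pooling layout: each of
    n compounds is placed in exactly one of q pools in each of k layers
    (q prime, k <= q + 1). The pool index is a polynomial in the layer
    number over the base-q digits of the compound index, which bounds how
    many layers any two compounds can share a pool in."""
    width = max(1, math.ceil(math.log(max(n, 2), q)))  # base-q digits needed
    layout = []            # layout[j][p] = set of compounds in pool p, layer j
    for j in range(k):
        pools = [set() for _ in range(q)]
        for i in range(n):
            digits = [(i // q**m) % q for m in range(width)]
            if j < q:
                p = sum(d * pow(j, m, q) for m, d in enumerate(digits)) % q
            else:          # the optional (q+1)-th layer uses the top digit
                p = digits[-1]
            pools[p].add(i)
        layout.append(pools)
    return layout

layers = std_layers(n=100, q=7, k=8)   # 8 layers of 7 pools for 100 compounds
assert all(sum(len(p) for p in pools) == 100 for pools in layers)
```

Because two distinct compounds correspond to distinct low-degree polynomials over GF(q), they can share a pool in only a bounded number of layers, which is what gives the design its error-correcting redundancy.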

  14. Field screening sampling and analysis strategy and methodology for the 183-H Solar Evaporation Basins: Phase 2, Soils

    International Nuclear Information System (INIS)

    Antipas, A.; Hopkins, A.M.; Wasemiller, M.A.; McCain, R.G.

    1996-01-01

    This document provides a sampling/analytical strategy and methodology for Resource Conservation and Recovery Act (RCRA) closure of the 183-H Solar Evaporation Basins within the boundaries and requirements identified in the initial Phase II Sampling and Analysis Plan for RCRA Closure of the 183-H Solar Evaporation Basins.

  15. Modelling of in-stream nitrogen and phosphorus concentrations using different sampling strategies for calibration data

    Science.gov (United States)

    Jomaa, Seifeddine; Jiang, Sanyuan; Yang, Xiaoqiang; Rode, Michael

    2016-04-01

    It is known that a good evaluation and prediction of surface water pollution is mainly limited by the monitoring strategy and the capability of the hydrological water quality model to reproduce the internal processes. To this end, a compromise sampling frequency, which can reflect the dynamical behaviour of leached nutrient fluxes responding to changes in land use, agricultural practices and point sources, and an appropriate process-based water quality model are required. The objective of this study was to test the identification of hydrological water quality model parameters (nitrogen and phosphorus) under two different monitoring strategies: (1) a regular grab-sampling approach and (2) regular grab-sampling with additional monitoring during hydrological events using automatic samplers. First, the semi-distributed hydrological water quality model HYPE (Hydrological Predictions for the Environment) was successfully calibrated for discharge (NSE = 0.86), nitrate-N (lowest NSE for nitrate-N load = 0.69), particulate phosphorus and soluble phosphorus in the Selke catchment (463 km2, central Germany) for the period 1994-1998, using the regular grab-sampling approach (biweekly to monthly for nitrogen and phosphorus concentrations). Second, the model was successfully validated for the period 1999-2010 for discharge, nitrate-N, particulate phosphorus and soluble phosphorus (lowest NSE for soluble phosphorus load = 0.54). Results showed that when additional sampling during events with a random grab-sampling approach was used (period 2011-2013), the hydrological model could reproduce only the nitrate-N and soluble phosphorus concentrations reasonably well. However, when additional sampling during the hydrological events was considered, the HYPE model could not represent the measured particulate phosphorus. This reflects the importance of suspended sediment during the hydrological events in increasing the concentrations of particulate phosphorus. 
The HYPE model could
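The NSE values cited above are Nash-Sutcliffe efficiencies; for reference, a minimal computation (the observation series below is illustrative):

```python
import numpy as np

def nse(simulated, observed):
    """Nash-Sutcliffe efficiency: 1 indicates a perfect fit; 0 means the
    model predicts no better than the mean of the observations."""
    sim, obs = np.asarray(simulated, float), np.asarray(observed, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.0, 2.0, 3.0, 4.0])
assert nse(obs, obs) == 1.0                      # perfect model
assert nse(np.full(4, obs.mean()), obs) == 0.0   # mean-only model
```

Values can go negative when the model performs worse than simply predicting the observed mean, which is why an NSE of 0.54 still counts as a usable fit.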

  16. Strategy revealing phenotypic differences among synthetic oscillator designs.

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    2014-09-19

    Considerable progress has been made in identifying and characterizing the component parts of genetic oscillators, which play central roles in all organisms. Nonlinear interaction among components is sufficiently complex that mathematical models are required to elucidate their elusive integrated behavior. Although natural and synthetic oscillators exhibit common architectures, there are numerous differences that are poorly understood. Utilizing synthetic biology to uncover basic principles of simpler circuits is a way to advance understanding of natural circadian clocks and rhythms. Following this strategy, we address the following questions: What are the implications of different architectures and molecular modes of transcriptional control for the phenotypic repertoire of genetic oscillators? Are there designs that are more realizable or robust? We compare synthetic oscillators involving one of three architectures and various combinations of the two modes of transcriptional control using a methodology that provides three innovations: a rigorous definition of phenotype, a procedure for deconstructing complex systems into qualitatively distinct phenotypes, and a graphical representation for illuminating the relationship between genotype, environment, and the qualitatively distinct phenotypes of a system. These methods provide a global perspective on the behavioral repertoire, facilitate comparisons of alternatives, and assist the rational design of synthetic gene circuitry. In particular, the results of their application here reveal distinctive phenotypes for several designs that have been studied experimentally as well as a best design among the alternatives that has yet to be constructed and tested.

  17. Sample size reassessment for a two-stage design controlling the false discovery rate.

    Science.gov (United States)

    Zehetmayer, Sonja; Graf, Alexandra C; Posch, Martin

    2015-11-01

    Sample size calculations for gene expression microarray and NGS-RNA-Seq experiments are challenging because the overall power depends on unknown quantities such as the proportion of true null hypotheses and the distribution of the effect sizes under the alternative. We propose a two-stage design with an adaptive interim analysis where these quantities are estimated from the interim data. The second-stage sample size is chosen based on these estimates to achieve a specific overall power. The proposed procedure controls the power in all considered scenarios except for very low first-stage sample sizes. The false discovery rate (FDR) is controlled despite the data-dependent choice of sample size. The two-stage design can be a useful tool to determine the sample size of high-dimensional studies if in the planning phase there is high uncertainty regarding the expected effect sizes and variability.
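FDR control in this setting is typically achieved with the Benjamini-Hochberg step-up procedure; a minimal sketch (the p-values below are illustrative):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean rejection
    list that controls the false discovery rate at level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):   # largest rank whose p-value
        if pvals[i] <= rank * alpha / m:        # passes the stepped threshold
            k = rank
    reject = [False] * m
    for i in order[:k]:
        reject[i] = True
    return reject

rejected = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.27, 0.6])
```

With m = 6 and alpha = 0.05 the stepped thresholds are rank * 0.05 / 6, so only the two smallest p-values are rejected in this example.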

  18. A low-power multi port register file design using a low-swing strategy

    International Nuclear Information System (INIS)

    Yan Hao; Liu Yan; Hua Siliang; Wang Donghui; Hou Chaohuan

    2012-01-01

    A low-power register file is designed by using a low-swing strategy and modified NAND address decoders. The proposed low-swing strategy is based on the feedback scheme and uses dynamic logic to reduce the active feedback power. This method contains two parts: a WRITE and a READ strategy. In the WRITE low-swing scheme, the modified memory cell is used to support low-swing WRITE. The modified NAND decoder not only dissipates less power, but also enables a great deal of area reduction. Compared with the conventional single-ended register file, the low-swing strategy saves 34.5% and 51.15% of the bit-line power in WRITE and READ, respectively. The post-simulation results indicate a 39.4% power improvement when the twelve ports are all busy. (semiconductor integrated circuits)

  19. Measuring strategies for learning regulation in medical education: Scale reliability and dimensionality in a Swedish sample

    Directory of Open Access Journals (Sweden)

    Edelbring Samuel

    2012-08-01

    Full Text Available Abstract Background The degree of learners’ self-regulated learning and dependence on external regulation influence learning processes in higher education. These regulation strategies are commonly measured by questionnaires developed in settings other than those in which they are being used, thereby requiring renewed validation. The aim of this study was to psychometrically evaluate the learning regulation strategy scales from the Inventory of Learning Styles with Swedish medical students (N = 206). Methods The regulation scales were evaluated regarding their reliability, scale dimensionality and interrelations. The primary evaluation focused on dimensionality and was performed with Mokken scale analysis. To assist future scale refinement, additional item analysis, such as item-to-scale correlations, was performed. Results Scale scores in the Swedish sample displayed good reliability in relation to published results: Cronbach’s alpha of 0.82, 0.72, and 0.65 for the self-regulation, external regulation and lack of regulation scales, respectively. The dimensionality of the scales was adequate for self-regulation and its subscales, whereas external regulation and lack of regulation displayed less unidimensionality. The established theoretical scales were largely replicated in the exploratory analysis. The item analysis identified two items that contributed little to their respective scales. Discussion The results indicate that these scales have an adequate capacity for detecting the three theoretically proposed learning regulation strategies in the medical education sample. Further construct validity should be sought by interpreting scale scores in relation to specific learning activities. Using established scales for measuring students’ regulation strategies enables a broad empirical base for increasing knowledge on regulation strategies in relation to different disciplinary settings and contributes to theoretical development.
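The reported reliabilities are Cronbach's alpha values; for reference, a minimal computation of alpha from an item-score matrix (the toy scores below are illustrative, not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    X = np.asarray(items, float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# toy 4-item scale answered by 5 respondents
scores = np.array([
    [4, 4, 3, 4],
    [2, 2, 2, 3],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [1, 2, 1, 2],
])
alpha = cronbach_alpha(scores)
```

Higher inter-item consistency inflates the variance of the row totals relative to the item variances, which is what pushes alpha toward 1.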

  20. Taking the stairs instead: The impact of workplace design standards on health promotion strategies

    Directory of Open Access Journals (Sweden)

    Marian Tye

    2013-01-01

    Full Text Available Abstract Background Comprehensive health promotion in Western Australia has been conducted from the points of view of policy development, promotion, education and service delivery. Much of this recent work has been focused on supporting workplaces – but there has yet to be any real focus on the design of the actual physical workplace environment from a health promotion perspective. Aims This paper is aimed at highlighting the gap in health promotion knowledge by addressing how the disciplines of architecture and health promotion can work together to challenge the regulations that dictate design practice and ultimately bridge that gap for long-term change. The overarching aim is to undertake further evidence-based research that will inform best practice in the planning and design of workplaces to reduce sedentary behaviour and increase opportunities for physical activity. Method Within this wide objective this paper focuses in particular on the idea of stairs-versus-lift movement strategies within office buildings. By examining building design guidelines from a health promotion perspective we expose a central dichotomy, where health promotion posters say “Take the stairs instead” whereas the language of building design suggests that the lift is best. Results From a design point of view, the National Codes of Construction (NCC, formerly known as the Building Codes of Australia, BCA), the essential technical regulation for all building design and construction, primarily addresses the concepts of ‘egress’ and ‘travel distance’ for escape in the event of fire, and building access in terms of universal access. Additionally, the Property Council of Australia’s Guide to Office Building Quality prioritises lift performance criteria along with the quality and experience of lift use as a major grading factor. There is no provision in either set of standards for staircase quality and experience. Conclusion The stairs, despite being promoted

  1. The ability of environmental healthcare design strategies To impact event related anxiety in paediatric patients: A comprehensive systematic review.

    Science.gov (United States)

    Norton-Westwood, Deborah; Pearson, Alan; Robertson-Malt, Suzanne

    sought from the period of 1980 to 2010. Methods of the Review Data for each study were extracted and assessed by two independent reviewers for methodological validity prior to inclusion in the review, using the Joanna Briggs Institute standardised critical appraisal instruments for qualitative data (JBI-QARI) and for the Meta Analysis of Statistics Assessment (JBI-MAStARI). Results Twenty studies were reviewed: seven of a descriptive experimental design, three of mixed methodologies and thirteen of various qualitative research design methodologies, inclusive of observational, grounded theory, ethnography and phenomenology. Conclusions The design of the built environment does have the ability to impact, either positively or negatively, the level of anxiety and fear that children experience when exposed to a healthcare setting. The coping strategies engaged by, and unique to, each paediatric age group need to not only be understood but supported and reflected in the built environment. Implications for research Architects and healthcare researchers need to collaborate to establish a solid base of evidence related to this important area of interest. Irrespective of the challenges that researchers face in attempting to randomise, manipulate and control the numerous environmental variables that impact a question such as this, such challenges need not, and should not, prevent or discourage future research. An innovative solution to the challenges faced by researchers in this field is the use of computer modelling and/or simulation of the hospital environment. Through the use of simulated environments researchers can directly observe user preferences and/or physiological responses. Implication for practice This review highlights an insightful look into the preferences of children as consumers. Although sample sizes were small and results were not quantified in measurable outcomes, the ability for such studies to inform design should not be underestimated. 
Design strategies both from a

  2. Bayesian assessment of the expected data impact on prediction confidence in optimal sampling design

    Science.gov (United States)

    Leube, P. C.; Geiges, A.; Nowak, W.

    2012-02-01

    Incorporating hydro(geo)logical data, such as head and tracer data, into stochastic models of (subsurface) flow and transport helps to reduce prediction uncertainty. Because of financial limitations for investigation campaigns, information needs toward modeling or prediction goals should be satisfied efficiently and rationally. Optimal design techniques find the best one among a set of investigation strategies. They optimize the expected impact of data on prediction confidence or related objectives prior to data collection. We introduce a new optimal design method, called PreDIA(gnosis) (Preposterior Data Impact Assessor). PreDIA derives the relevant probability distributions and measures of data utility within a fully Bayesian, generalized, flexible, and accurate framework. It extends the bootstrap filter (BF) and related frameworks to optimal design by marginalizing utility measures over the yet unknown data values. PreDIA is a strictly formal information-processing scheme free of linearizations. It works with arbitrary simulation tools, provides full flexibility concerning measurement types (linear, nonlinear, direct, indirect), allows for any desired task-driven formulations, and can account for various sources of uncertainty (e.g., heterogeneity, geostatistical assumptions, boundary conditions, measurement values, model structure uncertainty, a large class of model errors) via Bayesian geostatistics and model averaging. Existing methods fail to simultaneously provide these crucial advantages, which our method buys at relatively higher computational costs. We demonstrate the applicability and advantages of PreDIA over conventional linearized methods in a synthetic example of subsurface transport. In the example, we show that informative data are often invisible to linearized methods that confuse zero correlation with statistical independence. Hence, PreDIA will often lead to substantially better sampling designs. 
Finally, we extend our example to specifically
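The preposterior idea behind PreDIA, reweighting an ensemble for each hypothetical data value (bootstrap-filter style) and averaging the resulting posterior prediction variance over the yet-unknown data, can be sketched on a toy scalar problem (all models and numbers below are illustrative, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
theta = rng.normal(0.0, 1.0, n)      # ensemble of the uncertain parameter
pred = theta ** 2                     # nonlinear prediction of interest
sigma_eps = 0.3                       # measurement error std

def expected_posterior_variance(h, n_data=200):
    """Expected posterior variance of `pred` for a candidate measurement
    h(theta), marginalized over the unknown data values by Monte Carlo."""
    data = h(theta) + rng.normal(0.0, sigma_eps, n)   # hypothetical data
    ev = 0.0
    for d in data[:n_data]:
        w = np.exp(-0.5 * ((h(theta) - d) / sigma_eps) ** 2)  # likelihood
        w /= w.sum()                                          # weights
        mean = np.sum(w * pred)
        ev += np.sum(w * (pred - mean) ** 2)
    return ev / n_data

prior_var = pred.var()
informative = expected_posterior_variance(lambda t: t)    # measures theta
useless = expected_posterior_variance(lambda t: 0.0 * t)  # pure noise
assert informative < useless
```

Note that pred = theta**2 has zero correlation with theta yet is strongly dependent on it; a linearized method would score the measurement as uninformative, while the weighting scheme above correctly detects its value.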

  3. The Strategy Blueprint : A Strategy Process Computer-Aided Design Tool

    NARCIS (Netherlands)

    Aldea, Adina Ioana; Febriani, Tania Rizki; Daneva, Maya; Iacob, Maria Eugenia

    2017-01-01

    Strategy has always been a main concern of organizations because it dictates their direction, and therefore determines their success. Thus, organizations need to have adequate support to guide them through their strategy formulation process. The goal of this research is to develop a computer-based

  4. Performance Assessment Strategies: A computational framework for conceptual design of large roofs

    Directory of Open Access Journals (Sweden)

    Michela Turrin

    2014-01-01

    Full Text Available Using engineering performance evaluations to explore design alternatives during the conceptual phase of architectural design helps to understand the relationships between form and performance, and is crucial for developing well-performing final designs. Computer-aided conceptual design has the potential to aid the design team in discovering and highlighting these relationships, especially by means of procedural and parametric geometry to support the generation of geometric designs, and building performance simulation tools to support performance assessments. However, current tools and methods for computer-aided conceptual design in architecture do not explicitly reveal, nor allow for backtracking, the relationships between performance and geometry of the design. They currently support post-engineering, rather than the early design decisions and the design exploration process. Focusing on large roofs, this research aims at developing a computational design approach to support designers in performance-driven explorations. The approach is meant to facilitate the multidisciplinary integration and the learning process of the designer; it is not meant to constrain the process in precompiled procedures or hard engineering formulations, nor to automate it by delegating the design creativity to computational procedures. PAS (Performance Assessment Strategies) as a method is the main output of the research. It consists of a framework including guidelines and an extensible library of procedures for parametric modelling. It is structured in three parts. Pre-PAS provides guidelines for defining a design strategy, leading toward the parameterization process. Model-PAS provides guidelines, procedures and scripts for building the parametric models. Explore-PAS supports the assessment of solutions based on numeric evaluations and performance simulations, until the identification of a suitable design solution. PAS has been developed based on action research. 
Several case studies

  5. Tectonic design strategies

    DEFF Research Database (Denmark)

    Beim, Anne

    2000-01-01

    as the poetics of construction, thus it may be considered as an essential activity in the development of the architectural design process.  Similar to the complex nature of the tectonic, the design process is an ongoing movement of interpretation, mediation, and decision making where the skills of the architect...

  6. Evaluation Strategy

    DEFF Research Database (Denmark)

    Coto Chotto, Mayela; Wentzer, Helle; Dirckinck-Holmfeld, Lone

    2009-01-01

    The paper presents an evaluation strategy based on deliberate ideals and principles of dialogue design. The evaluation strategy is based on experiential phenomenology, taking the point of departure for the design and evaluation processes in the experienced practitioners themselves. The article presents the evaluation strategy and methodology of the research project Making Online Path to Enter new Markets (MOPEM), an EU research project with partners from educational institutions of technology and business in five European countries.

  7. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  8. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Full Text Available Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  9. An Exploratory Study of Self-Regulated Learning Strategies in a Design Project by Students in Grades 9-12

    Science.gov (United States)

    Lawanto, Oenardi; Butler, Deborah; Cartier, Sylvie; Santoso, Harry; Lawanto, Kevin; Clark, David

    2013-01-01

    This exploratory study evaluated self-regulated learning (SRL) strategies of 27 students in grades 9-12 during an engineering design project. The specific focus of the study was on student task interpretation and its relation to planning and cognitive strategies in design activities. Two research questions guided the study: (1) To what degree was…

  10. Design for mosquito abundance, diversity, and phenology sampling within the National Ecological Observatory Network

    Science.gov (United States)

    Hoekman, D.; Springer, Yuri P.; Barker, C.M.; Barrera, R.; Blackmore, M.S.; Bradshaw, W.E.; Foley, D. H.; Ginsberg, Howard; Hayden, M. H.; Holzapfel, C. M.; Juliano, S. A.; Kramer, L. D.; LaDeau, S. L.; Livdahl, T. P.; Moore, C. G.; Nasci, R.S.; Reisen, W.K.; Savage, H. M.

    2016-01-01

    The National Ecological Observatory Network (NEON) intends to monitor mosquito populations across its broad geographical range of sites because of their prevalence in food webs, sensitivity to abiotic factors and relevance for human health. We describe the design of mosquito population sampling in the context of NEON’s long term continental scale monitoring program, emphasizing the sampling design schedule, priorities and collection methods. Freely available NEON data and associated field and laboratory samples will increase our understanding of how mosquito abundance, demography, diversity and phenology are responding to land use and climate change.

  11. Design for a Crane Metallic Structure Based on Imperialist Competitive Algorithm and Inverse Reliability Strategy

    Science.gov (United States)

    Fan, Xiao-Ning; Zhi, Bo

    2017-07-01

    Uncertainties in parameters such as materials, loading, and geometry are inevitable in designing metallic structures for cranes. When considering these uncertainty factors, reliability-based design optimization (RBDO) offers a more reasonable design approach. However, existing RBDO methods for crane metallic structures are prone to low convergence speed and high computational cost. A unilevel RBDO method, combining a discrete imperialist competitive algorithm with an inverse reliability strategy based on the performance measure approach, is developed. Application of the imperialist competitive algorithm at the optimization level significantly improves the convergence speed of this RBDO method. At the reliability analysis level, the inverse reliability strategy is used to determine the feasibility of each probabilistic constraint at each design point by calculating its α-percentile performance, thereby avoiding the convergence failure, calculation error, and disproportionate computational effort encountered using conventional moment and simulation methods. Application of the RBDO method to an actual crane structure shows that the developed RBDO realizes a design with the best tradeoff between economy and safety, at about one-third of the convergence time and computational cost of the existing method. This paper provides a scientific and effective design approach for the design of metallic structures of cranes.

  12. Practical experiences with an extended screening strategy for genetically modified organisms (GMOs) in real-life samples.

    Science.gov (United States)

    Scholtens, Ingrid; Laurensse, Emile; Molenaar, Bonnie; Zaaijer, Stephanie; Gaballo, Heidi; Boleij, Peter; Bak, Arno; Kok, Esther

    2013-09-25

    Nowadays most animal feed products imported into Europe have a GMO (genetically modified organism) label. This means that they contain European Union (EU)-authorized GMOs. For enforcement of these labeling requirements, it is necessary, with the rising number of EU-authorized GMOs, to perform an increasing number of analyses. In addition to this, it is necessary to test products for the potential presence of EU-unauthorized GMOs. Analysis for EU-authorized and -unauthorized GMOs in animal feed has thus become laborious and expensive. Initial screening steps may reduce the number of GMO identification methods that need to be applied, but with the increasing diversity, screening with GMO elements has also become more complex. For the present study, the application of an informative detailed 24-element screening and subsequent identification strategy was applied in 50 animal feed samples. Almost all feed samples were labeled as containing GMO-derived materials. The main goal of the study was therefore to investigate if a detailed screening strategy would reduce the number of subsequent identification analyses. An additional goal was to test the samples in this way for the potential presence of EU-unauthorized GMOs. Finally, to test the robustness of the approach, eight of the samples were tested in a concise interlaboratory study. No significant differences were found between the results of the two laboratories.

  13. Digital Content Strategies

    OpenAIRE

    Halbheer, Daniel; Stahl, Florian; Koenigsberg, Oded; Lehmann, Donald R

    2013-01-01

    This paper studies content strategies for online publishers of digital information goods. It examines sampling strategies and compares their performance to paid content and free content strategies. A sampling strategy, where some of the content is offered for free and consumers are charged for access to the rest, is known as a "metered model" in the newspaper industry. We analyze optimal decisions concerning the size of the sample and the price of the paid content when sampling serves the dua...

  14. Towards an optimal sampling strategy for assessing genetic variation within and among white clover (Trifolium repens L.) cultivars using AFLP

    Directory of Open Access Journals (Sweden)

    Khosro Mehdi Khanlou

    2011-01-01

    Cost reduction in plant breeding and conservation programs depends largely on correctly defining the minimal sample size required for the trustworthy assessment of intra- and inter-cultivar genetic variation. White clover, an important pasture legume, was chosen for studying this aspect. In clonal plants such as white clover, an appropriate sampling scheme eliminates the redundant analysis of identical genotypes. The aim was to define an optimal sampling strategy, i.e., the minimum sample size and appropriate sampling scheme for white clover cultivars, using AFLP data (283 loci) from three popular types. A grid-based sampling scheme, with an interplant distance of at least 40 cm, was sufficient to avoid any excess of replicates. Simulations revealed that the number of samples substantially influenced genetic diversity parameters. With fewer than 15 samples per cultivar, the expected heterozygosity (He) and Shannon diversity index (I) were greatly underestimated, whereas 20 samples covered more than 95% of the total intra-cultivar genetic variation. Based on AMOVA, a sample of 20 plants per cultivar was apparently sufficient to accurately quantify individual genetic structuring. The recommended sampling strategy facilitates the efficient characterization of diversity in white clover, for both conservation and exploitation.

  15. Glass sampling program during DWPF Integrated Cold Runs

    International Nuclear Information System (INIS)

    Plodinec, M.J.

    1990-01-01

    The described glass sampling program is designed to achieve two objectives: to demonstrate the Defense Waste Processing Facility's (DWPF) ability to control and verify the radionuclide release properties of the glass product, and to confirm DWPF's readiness to obtain glass samples during production and SRL's readiness to analyze and test those samples remotely. The DWPF strategy for controlling the radionuclide release properties of the glass product, and for verifying its acceptability, is described in this report. The basic approach of the test program is then defined.

  16. ASIT--A Problem Solving Strategy for Education and Eco-Friendly Sustainable Design

    Science.gov (United States)

    Turner, Steve

    2009-01-01

    There is growing recognition of the role teaching and learning experiences in technology education can contribute to Education for Sustainable Development. It appears, however, that in the Technology Education classroom little or no change has been achieved to the practice of designing and problem solving strategies oriented towards sustainable…

  17. Multi-saline sample distillation apparatus for hydrogen isotope analyses : design and accuracy

    Science.gov (United States)

    Hassan, Afifa Afifi

    1981-01-01

    A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 °C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated. (USGS)

  18. Serum sample containing endogenous antibodies interfering with multiple hormone immunoassays. Laboratory strategies to detect interference

    Directory of Open Access Journals (Sweden)

    Elena García-González

    2016-04-01

    Objectives: Endogenous antibodies (EA) may interfere with immunoassays, causing erroneous results for hormone analyses. As this interference arises (in most cases) from the assay format, and most immunoassays, even from different manufacturers, are constructed in a similar way, it is possible for a single type of EA to interfere with different immunoassays. Here we describe the case of a patient whose serum sample contained EA that interfered with several hormone tests, and discuss the strategies deployed to detect the interference. Subjects and methods: Over a period of four years, a 30-year-old man was subjected to a plethora of laboratory and imaging diagnostic procedures as a consequence of elevated hormone results, mainly of pituitary origin, which did not correlate with the overall clinical picture. Results: Once analytical interference was suspected, the best laboratory approaches to investigate it were sample reanalysis on an alternative platform and sample incubation with antibody-blocking tubes. Construction of an in-house ‘nonsense’ sandwich assay was also a valuable strategy to confirm interference. In contrast, serial sample dilutions were of no value in our case, while polyethylene glycol (PEG) precipitation gave inconclusive results, probably due to the use of inappropriate PEG concentrations for several of the tests assayed. Conclusions: Clinicians and laboratorians must be aware of the drawbacks of immunometric assays, and alert to the possibility of EA interference when results do not fit the clinical pattern. Keywords: Endogenous antibodies, Immunoassay, Interference, Pituitary hormones, Case report

  19. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    International Nuclear Information System (INIS)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B.

    2015-01-01

    A wireless-based, custom-built aerosol sampling network has been designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in a field measurement campaign in which sodium aerosol dispersion experiments have been conducted as part of environmental impact studies related to the sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each with a custom-built sampling head and wireless control networking designed with Programmable System on Chip (PSoC™) and XBee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area or uneven terrain where manual operation is difficult because simultaneous operation and status logging are required.

  1. [Identification of Systemic Contaminations with Legionella Spec. in Drinking Water Plumbing Systems: Sampling Strategies and Corresponding Parameters].

    Science.gov (United States)

    Völker, S; Schreiber, C; Müller, H; Zacharias, N; Kistemann, T

    2017-05-01

    After the amendment of the Drinking Water Ordinance in 2011, the requirements for the hygienic-microbiological monitoring of drinking water installations have increased significantly. In the BMBF-funded project "Biofilm Management" (2010-2014), we examined the extent to which established sampling strategies in practice can uncover drinking water plumbing systems systemically colonized with Legionella. Moreover, we investigated additional parameters that might be suitable for detecting systemic contaminations. We subjected the drinking water plumbing systems of 8 buildings with known microbial contamination (Legionella) to intensive hygienic-microbiological sampling with high spatial and temporal resolution. A total of 626 hot drinking water samples were analyzed with classical culture-based methods. In addition, comprehensive hygienic observations were conducted in each building, and qualitative interviews with operators and users were carried out. Collected tap-specific parameters were quantitatively analyzed by means of sensitivity and accuracy calculations. The systemic presence of Legionella in drinking water plumbing systems has a high spatial and temporal variability. Established sampling strategies were only partially suitable for detecting long-term Legionella contamination in practice. In particular, the sampling of hot water at the calorifier and circulation re-entrance showed little significance in terms of contamination events. To detect the systemic presence of Legionella, the parameters stagnation (qualitatively assessed) and temperature (compliance with the 5K-rule) showed better results. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Shielding design of highly activated sample storage at reactor TRIGA PUSPATI

    International Nuclear Information System (INIS)

    Naim Syauqi Hamzah; Julia Abdul Karim; Mohamad Hairie Rabir; Muhd Husamuddin Abdul Khalil; Mohd Amin Sharifuldin Salleh

    2010-01-01

    Radiation protection has always been one of the most important considerations in the management of Reaktor TRIGA PUSPATI (RTP). Currently, demand for sample activation has increased from a variety of applicants in different research fields. Radiological hazards may occur if sample evaluations are misjudged or miscalculated. At present, there is no appropriate storage for highly activated samples. For that purpose, a special irradiated-sample storage box should be provided in order to segregate highly activated samples that produce high dose levels from typical activated samples that produce lower dose levels (1-2 mR/hr). In this study, the thicknesses required by common shielding materials such as lead and concrete to reduce a highly activated radiotracer sample (potassium bromide) with an initial exposure dose of 5 R/hr to background level (0.05 mR/hr) were determined. Analyses were done using several methods, including the conventional shielding equation, half-value layer calculation, and the MicroShield computer code. A design for a new irradiated-sample storage box for RTP capable of containing high-level gamma radioactivity is then proposed. (author)
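
    The attenuation arithmetic behind such an analysis can be sketched with the half-value-layer method mentioned above. The 1 cm lead HVL used here is an illustrative assumption (the true value depends on the gamma energies of the activated sample), not a figure from the study.

```python
import math

def shield_thickness(dose_in_mR_hr, dose_out_mR_hr, hvl_cm):
    """Number of half-value layers and thickness needed to attenuate a
    narrow-beam gamma dose rate from dose_in to dose_out."""
    n_hvl = math.log2(dose_in_mR_hr / dose_out_mR_hr)
    return n_hvl, n_hvl * hvl_cm

# 5 R/hr = 5000 mR/hr, reduced to the 0.05 mR/hr background level
n, t_lead = shield_thickness(5000.0, 0.05, hvl_cm=1.0)  # ~1 cm HVL assumed for lead
# n = log2(100000) ≈ 16.6 half-value layers
```

    Buildup factors and source geometry, which a code such as MicroShield accounts for, would increase the thickness beyond this narrow-beam estimate.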

  3. Passive solar design strategies: Remodeling guidelines for conserving energy at home. [Final report]

    Energy Technology Data Exchange (ETDEWEB)

    1991-12-31

    The idea of passive solar is simple, but applying it effectively does require information and attention to the details of design and construction. Some passive solar techniques are modest and low-cost, and require only small changes in a remodeler's typical practice. At the other end of the spectrum, some passive solar systems can almost eliminate a house's need for purchased heating (and in some cases, cooling) energy -- but probably at a relatively high first cost. In between are a broad range of energy-conserving passive solar techniques. Whether or not they are cost-effective, practical and attractive enough to offer a market advantage to any individual remodeler depends on very specific factors such as local costs, climate, and market characteristics. Passive Solar Design Strategies: Remodeling Guidelines for Conserving Energy at Home is written to help give remodelers the information they need to make these decisions. Passive Solar Design Strategies is a package in three basic parts: The Guidelines contain information about passive solar techniques and how they work, and provide specific examples of systems which will save various percentages of energy; The Worksheets offer a simple, fill-in-the-blank method to pre-evaluate the performance of a specific design; The Worked Example demonstrates how to complete the worksheets for a typical residence.

  4. Design of Underwater Robot Lines Based on a Hybrid Automatic Optimization Strategy

    Institute of Scientific and Technical Information of China (English)

    Wenjing Lyu; Weilin Luo

    2014-01-01

    In this paper, a hybrid automatic optimization strategy is proposed for the design of underwater robot lines. Isight is introduced as an integration platform. The construction of this platform is based on user programming and several commercial software packages, including UG 6.0, GAMBIT 2.4.6, and FLUENT 12.0. An intelligent parameter optimization method, particle swarm optimization, is incorporated into the platform. To verify the proposed strategy, a simulation is conducted on the underwater robot model 5470, which originates from the DTRC SUBOFF project. With the automatic optimization platform, the minimal resistance is taken as the optimization goal; the wet surface area as the constraint condition; and the length of the fore-body, the maximum body radius, and the after-body's minimum radius as the design variables. For the CFD calculation, the RANS equations and the standard turbulence model are used for direct numerical simulation. Analysis of the simulation results shows that the platform is efficient and feasible. Through the platform, a variety of schemes for the design of the lines are generated and the optimal solution is achieved. The combination of the intelligent optimization algorithm and the numerical simulation ensures a global optimal solution and improves the efficiency of the search for solutions.
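
    The particle swarm step of such a platform can be sketched in a few lines. The quadratic objective below merely stands in for the CFD-computed resistance, and the swarm parameters are generic textbook values, not those of the Isight setup.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, lo=-5.0, hi=5.0, seed=1):
    """Minimal global-best particle swarm optimizer."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pbest_f.argmin()].copy()
    w, c1, c2 = 0.72, 1.49, 1.49          # common constriction-like weights
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)        # keep particles inside the bounds
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, float(pbest_f.min())

# stand-in for the resistance objective: a simple quadratic with minimum at 0
best_x, best_f = pso(lambda z: float((z ** 2).sum()), dim=3)
```

    In the actual platform each objective evaluation would be a full CFD run, so the swarm size and iteration count are chosen far more conservatively.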

  5. Design for experience in the fashion industry: coping strategies in the era of homogenization

    OpenAIRE

    PETERMANS, Ann; Kent, Anthony

    2013-01-01

    The aim of this paper is to focus on how the globalised fashion industry impacts on the way that strategies of clothing brands are translated into the design of their retail environments. The first research question explores the conceptualisation of retail environments by retailers, designers and planners so that they can continue to be perceived as ‘unique’ by customers. The second asks whether retailers, designers and planners can manage the trend towards homogenization in the near future. ...

  6. Two specialized delayed-neutron detector designs for assays of fissionable elements in water and sediment samples

    International Nuclear Information System (INIS)

    Balestrini, S.J.; Balagna, J.P.; Menlove, H.O.

    1976-01-01

    Two specialized neutron-sensitive detectors are described, which are employed for rapid assays of fissionable elements by sensing delayed neutrons emitted by samples after they have been irradiated in a nuclear reactor. The more sensitive of the two detectors, designed to assay uranium in water samples, is 40% efficient; the other, designed for sediment sample assays, is 27% efficient. These detectors are also designed to operate under water, which serves as inexpensive shielding against neutron leakage from the reactor and neutrons from cosmic rays. (Auth.)

  7. A Minimum Shuffle Core Design Strategy for ESBWR

    International Nuclear Information System (INIS)

    Karve, A.A.; Fawcett, R.M.

    2008-01-01

    The Economic Simplified Boiling Water Reactor (ESBWR) is GEH's next evolution of advanced BWR technology. There are 1132 fuel bundles in the core and the thermal power is 4500 MWt. Similar to conventional plants, there is an outage after a specified period of operation, when the plant shuts down. During the outage a specified fraction of fuel bundles are discharged from the core, the core is loaded with the same fraction of fresh fuel, and fuel is shuffled to obtain an optimum core design that meets the goals for successful operation of the next cycle. The discharge, load, and the associated shuffles are time-consuming and expensive tasks that impact the overall outage schedule and costs. Therefore, there is an incentive to keep maneuvers to a minimum and to perform them more efficiently. The benefits for a large core, such as the ESBWR with 1132 fuel bundles, are escalated. This study focuses on a core reload design strategy to minimize the total number of shuffles during an outage. A traditional equilibrium cycle is used as a reference basis, which sets the reference number of shuffles. In the minimum shuffle core design, however, a set of two equilibrium cycles (N and N+1, referred to as a 'bi-equilibrium' cycle) is envisioned where the fresh fuel of cycle N (that becomes the once-burnt fuel of cycle N+1) ideally does not move in the two cycles. The cost of fuel efficiency is determined for obtaining such a core loading by comparing it to the traditional equilibrium cycle. There are several additional degrees of freedom when designing a bi-equilibrium cycle that could be utilized, and the potential benefits of these flexibilities are assessed. In summary, the feasibility of a minimum shuffle fuel cycle and core design for an ESBWR is studied. The cost of fuel efficiency is assessed in comparison to the traditional design. (authors)

  8. Logistics Sourcing Strategies in Supply Chain Design

    OpenAIRE

    Liu, Liwen

    2007-01-01

    A company's logistics sourcing strategy determines whether it structures and organizes logistics within the company or company group, or integrates logistics upstream and downstream in the supply chain. First, three different types of logistics sourcing strategies in supply chain design are described, and the theoretical background for the development of these strategies, including both transaction cost theory and network theory, is analyzed. Two special cases about logistics sourcing strategy decis...

  9. A phoswich detector design for improved spatial sampling in PET

    Science.gov (United States)

    Thiessen, Jonathan D.; Koschan, Merry A.; Melcher, Charles L.; Meng, Fang; Schellenberg, Graham; Goertzen, Andrew L.

    2018-02-01

    Block detector designs, utilizing a pixelated scintillator array coupled to a photosensor array in a light-sharing design, are commonly used for positron emission tomography (PET) imaging applications. In practice, the spatial sampling of these designs is limited by the crystal pitch, which must be large enough for individual crystals to be resolved in the detector flood image. Replacing the conventional 2D scintillator array with an array of phoswich elements, each consisting of an optically coupled side-by-side scintillator pair, may improve spatial sampling in one direction of the array without requiring smaller crystal elements to be resolved. To test the feasibility of this design, a 4 × 4 phoswich array was constructed, with each phoswich element consisting of two optically coupled, 3.17 × 1.58 × 10 mm³ LSO crystals co-doped with cerium and calcium. The amount of calcium doping was varied to create a 'fast' LSO crystal with a decay time of 32.9 ns and a 'slow' LSO crystal with a decay time of 41.2 ns. Using a Hamamatsu R8900U-00-C12 position-sensitive photomultiplier tube (PS-PMT) and a CAEN V1720 250 MS/s waveform digitizer, we were able to show effective discrimination of the fast and slow LSO crystals in the phoswich array. Although a side-by-side phoswich array is feasible, reflections at the crystal boundary due to a mismatch between the refractive index of the optical adhesive (n = 1.5) and LSO (n = 1.82) caused it to behave optically as an 8 × 4 array rather than a 4 × 4 array. Direct coupling of each phoswich element to individual photodetector elements may be necessary with the current phoswich array design. Alternatively, in order to implement this phoswich design with a conventional light-sharing PET block detector, a high refractive index optical adhesive is necessary to closely match the refractive index of LSO.
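
    The decay-time discrimination described above can be illustrated with ideal exponential pulses and a tail-to-total charge ratio. The 40 ns gate below is an invented illustrative choice, not the digitizer configuration used in the paper.

```python
import math

def tail_to_total(tau_ns, gate_start_ns=40.0):
    """Tail-to-total charge ratio of an ideal exponential pulse exp(-t/tau):
    the integral from gate_start to infinity over the integral from 0 to
    infinity, which reduces analytically to exp(-gate_start/tau)."""
    return math.exp(-gate_start_ns / tau_ns)

r_fast = tail_to_total(32.9)   # 'fast' Ca co-doped LSO (32.9 ns decay)
r_slow = tail_to_total(41.2)   # 'slow' LSO (41.2 ns decay)

# the slower crystal leaves more charge in the tail, so r_slow > r_fast;
# a cut halfway between the two ratios would assign each event to a crystal
threshold = 0.5 * (r_fast + r_slow)
```

    Real pulses add photostatistics and electronic noise, so the two ratio distributions overlap and the achievable separation must be measured, as the authors did with the waveform digitizer.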

  10. Robust Control Mixer Method for Reconfigurable Control Design Using Model Matching Strategy

    DEFF Research Database (Denmark)

    Yang, Zhenyu; Blanke, Mogens; Verhagen, Michel

    2007-01-01

    A novel control mixer method for reconfigurable control designs is developed. The proposed method extends the matrix form of the conventional control mixer concept into an LTI dynamic system form. The H_inf control technique is employed for these dynamic module designs after an augmented control system is constructed through a model-matching strategy. The stability, performance, and robustness of the reconfigured system can be guaranteed when some conditions are satisfied. To illustrate the effectiveness of the proposed method, a robot system subjected to failures is used to demonstrate

  11. Conservative strategy-based ensemble surrogate model for optimal groundwater remediation design at DNAPLs-contaminated sites

    Science.gov (United States)

    Ouyang, Qi; Lu, Wenxi; Lin, Jin; Deng, Wenbing; Cheng, Weiguo

    2017-08-01

    The surrogate-based simulation-optimization techniques are frequently used for optimal groundwater remediation design. When this technique is used, surrogate errors caused by surrogate-modeling uncertainty may lead to generation of infeasible designs. In this paper, a conservative strategy that pushes the optimal design into the feasible region was used to address surrogate-modeling uncertainty. In addition, chance-constrained programming (CCP) was adopted to compare with the conservative strategy in addressing this uncertainty. Three methods, multi-gene genetic programming (MGGP), Kriging (KRG) and support vector regression (SVR), were used to construct surrogate models for a time-consuming multi-phase flow model. To improve the performance of the surrogate model, ensemble surrogates were constructed based on combinations of different stand-alone surrogate models. The results show that: (1) the surrogate-modeling uncertainty was successfully addressed by the conservative strategy, which means that this method is promising for addressing surrogate-modeling uncertainty. (2) The ensemble surrogate model that combines MGGP with KRG showed the most favorable performance, which indicates that this ensemble surrogate can utilize both stand-alone surrogate models to improve the performance of the surrogate model.
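
    The idea of an ensemble surrogate can be sketched generically: two cheap stand-alone surrogates (here simple polynomial fits standing in for Kriging and MGGP) are blended with weights inversely proportional to their training error. This is a minimal inverse-error-weighted combination, not the paper's conservative-strategy formulation, and the test function is invented.

```python
import numpy as np

rng = np.random.default_rng(3)
f = lambda x: np.sin(3 * x) + 0.5 * x        # stand-in for the costly simulator
xs = rng.uniform(-2, 2, 40)
ys = f(xs)

# two stand-alone surrogates of different flexibility
s1 = np.poly1d(np.polyfit(xs, ys, 3))        # low-order fit
s2 = np.poly1d(np.polyfit(xs, ys, 7))        # high-order fit

# weight each surrogate by its inverse mean-squared training error
e1 = ((s1(xs) - ys) ** 2).mean()
e2 = ((s2(xs) - ys) ** 2).mean()
w1 = (1 / e1) / (1 / e1 + 1 / e2)
w2 = (1 / e2) / (1 / e1 + 1 / e2)
ensemble = lambda x: w1 * s1(x) + w2 * s2(x)
```

    Because the ensemble is a convex combination, its pointwise error is bounded by a weighted average of the members' errors, which is the basic motivation for combining surrogates.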

  12. Principal component analysis applied to Fourier transform infrared spectroscopy for the design of calibration sets for glycerol prediction models in wine and for the detection and classification of outlier samples.

    Science.gov (United States)

    Nieuwoudt, Helene H; Prior, Bernard A; Pretorius, Isak S; Manley, Marena; Bauer, Florian F

    2004-06-16

    Principal component analysis (PCA) was used to identify the main sources of variation in the Fourier transform infrared (FT-IR) spectra of 329 wines of various styles. The FT-IR spectra were gathered using a specialized WineScan instrument. The main sources of variation included the reducing sugar and alcohol content of the samples, as well as the stage of fermentation and the maturation period of the wines. The implications of the variation between the different wine styles for the design of calibration models with accurate predictive abilities were investigated using glycerol calibration in wine as a model system. PCA enabled the identification and interpretation of samples that were poorly predicted by the calibration models, as well as the detection of individual samples in the sample set that had atypical spectra (i.e., outlier samples). The Soft Independent Modeling of Class Analogy (SIMCA) approach was used to establish a model for the classification of the outlier samples. A glycerol calibration for wine was developed (reducing sugar content 8% v/v) with satisfactory predictive ability (SEP = 0.40 g/L). The RPD value (ratio of the standard deviation of the data to the standard error of prediction) was 5.6, indicating that the calibration is suitable for quantification purposes. A calibration for glycerol in special late harvest and noble late harvest wines (RS 31-147 g/L, alcohol > 11.6% v/v) with a prediction error SECV = 0.65 g/L, was also established. This study yielded an analytical strategy that combined the careful design of calibration sets with measures that facilitated the early detection and interpretation of poorly predicted samples and outlier samples in a sample set. The strategy provided a powerful means of quality control, which is necessary for the generation of accurate prediction data and therefore for the successful implementation of FT-IR in the routine analytical laboratory.
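
    The outlier-screening step described above can be sketched with a plain PCA Q-residual (SPE) check. Synthetic data stands in for the FT-IR spectra, and the simple three-sigma limit is a stand-in for the SIMCA classification used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
# calibration set: 50 "spectra" of 20 variables lying near a 2-D subspace
basis = rng.normal(size=(2, 20))
X = rng.normal(size=(50, 2)) @ basis + 0.01 * rng.normal(size=(50, 20))

mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:2]                                  # loadings of a 2-component model

def q_residual(samples):
    """Squared residual (SPE) of samples with respect to the PCA model."""
    xc = np.atleast_2d(samples) - mu
    return ((xc - (xc @ P.T) @ P) ** 2).sum(axis=1)

# simple Q limit from the calibration residuals
limit = q_residual(X).mean() + 3 * q_residual(X).std()

good = rng.normal(size=2) @ basis           # sample consistent with the model
bad = good + 2.0 * rng.normal(size=20)      # atypical (outlier) sample
flags = q_residual(np.vstack([good, bad])) > limit
```

    A sample with a large Q-residual lies off the calibration subspace and would be inspected or excluded before trusting its glycerol prediction.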

  13. Quality-control design for surface-water sampling in the National Water-Quality Network

    Science.gov (United States)

    Riskin, Melissa L.; Reutter, David C.; Martin, Jeffrey D.; Mueller, David K.

    2018-04-10

    The data-quality objectives for samples collected at surface-water sites in the National Water-Quality Network include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of environmental conditions. Quality-control samples provide insight into how well the samples collected at surface-water sites represent the true environmental conditions. Quality-control samples used in this program include field blanks, replicates, and field matrix spikes. This report describes the design for collection of these quality-control samples and the data management needed to properly identify these samples in the U.S. Geological Survey’s national database.

  14. Active living by design sustainability strategies.

    Science.gov (United States)

    Kraft, M Katherine; Lee, Joanne J; Brennan, Laura K

    2012-11-01

    Despite substantial progress in translating health promotion research into practice, community initiatives still struggle to maintain changes once grant funding has ended. Researchers, funders, and community practitioners are interested in practices that maintain and sustain their efforts. This qualitative study conducted a content analysis of evaluation findings from Active Living by Design (ALbD) to identify activities that community coalitions implemented to maintain their initiatives and secure ongoing influence in communities. Investigators analyzed data from interviews, focus groups, and the Progress Reporting System to identify sustainability approaches clustering into five areas: partnership expansion, sustainable funding, permanent advisory committees, policy change, and institution/organization change. Partnership expansion occurred across sectors and disciplines and into broader geographic areas. Additional funding extended beyond grants to earned income streams and dedicated tax revenues. Permanent advisory committees were established to inform decision makers about a range of active living impacts. Policy changes in zoning and comprehensive plans ensured maintenance of health-promoting built environments. Sustainability through institution/organization changes led to the allocation of dedicated staff and the incorporation of active living values into agency missions. Active Living by Design partnerships defined and messaged their projects to align with policymakers' interests and broad partnership audiences. They found innovative supporters and adapted their original vision to include quality of life, nonmotorized transport, and other complementary efforts that expanded their reach and influence. These sustainability strategies altered awareness within communities, changed community decision-making processes, and created policy changes that have the potential to maintain environments that promote physical activity for years to come.

  15. Planning Considerations for a Mars Sample Receiving Facility: Summary and Interpretation of Three Design Studies

    Science.gov (United States)

    Beaty, David W.; Allen, Carlton C.; Bass, Deborah S.; Buxbaum, Karen L.; Campbell, James K.; Lindstrom, David J.; Miller, Sylvia L.; Papanastassiou, Dimitri A.

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, while ensuring strict containment and contamination control of the samples while in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  16. Low-sensitivity H∞ filter design for linear delta operator systems with sampling time jitter

    Science.gov (United States)

    Guo, Xiang-Gui; Yang, Guang-Hong

    2012-04-01

    This article is concerned with the problem of designing H∞ filters for a class of linear discrete-time systems with low sensitivity to sampling time jitter via a delta operator approach. The delta-domain model is used to avoid the inherent numerical ill-conditioning resulting from the use of the standard shift-domain model at high sampling rates. Based on the projection lemma, in combination with the descriptor system approach often used to solve delay-related problems, a novel bounded real lemma with three slack variables for delta operator systems is presented. A sensitivity approach based on this novel lemma is proposed to mitigate the effects of sampling time jitter on system performance. The problem of designing a low-sensitivity filter can then be reduced to a convex optimisation problem. An important consideration in the design of such filters is the optimal trade-off between the standard H∞ criterion and the sensitivity of the transfer function with respect to sampling time jitter. Finally, a numerical example demonstrating the validity of the proposed design method is given.
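
    The conditioning argument can be seen directly from the delta operator's definition, δ = (q − 1)/T: as the sampling period T shrinks, shift-domain poles crowd toward z = 1 while their delta-domain counterparts stay near the continuous-time values. A toy first-order example with invented numbers:

```python
a = -1.0                        # continuous-time pole of dx/dt = a*x
poles_shift, poles_delta = [], []
for T in (1e-1, 1e-3, 1e-6):    # increasingly fast sampling
    z = 1.0 + a * T             # Euler shift-domain pole: crowds toward 1
    d = (z - 1.0) / T           # delta-domain pole: stays near a
    poles_shift.append(z)
    poles_delta.append(d)
```

    At T = 1 µs the shift-domain pole is 0.999999, barely distinguishable from 1 in finite precision, while the delta-domain parameter remains an O(1) quantity, which is the practical motivation for working in the delta domain at high sampling rates.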

  17. Study and Practice of Design Strategy

    Institute of Scientific and Technical Information of China (English)

    刘方

    2015-01-01

    In Western countries, industrial design is deeply involved in strategic work, driving strategic innovation at multiple levels: overall strategy, business, and products. Design strategy is one of the latest achievements in the development of industrial design. It deeply combines the advantages of business and design, has notable features that distinguish it from traditional strategy, and is better adapted to the demands of the new economy represented by the experience economy and the Internet model. Drawing on design strategy cases from the author's own practice, this paper discusses the basic features of design strategy and practical experience with it, and concludes that in China, design strategy likewise has real meaning and value: it can not only support the business transformation of local companies, but also help them face the challenges of the new economic situation more rationally.

  18. A balancing act? The implications of mixed strategies for performance measurement system design

    NARCIS (Netherlands)

    Dekker, H.C.; Groot, T.L.C.M.; Schoute, M.

    2013-01-01

    This paper examines how firms design performance measurement systems (PMSs) to support the pursuit of mixed strategies. In particular, we examine the implications of firms' joint strategic emphasis on both low cost and differentiation for their use of performance measurement and incentive

  19. Competitive Supply Chain Network Design Considering Marketing Strategies: A Hybrid Metaheuristic Algorithm

    Directory of Open Access Journals (Sweden)

    Ali Akbar Hasani

    2016-11-01

    Full Text Available In this paper, a comprehensive model is proposed to design a multi-period, multi-echelon, multi-product, inventory-controlled supply chain network. Various marketing strategies and guerrilla marketing approaches are considered in the design process under static competition conditions. The goal of the proposed model is to respond efficiently to customers’ demands in the presence of pre-existing competitors and price-inelastic demand. The proposed optimization model simultaneously considers multiple objectives that incorporate both the market share and the total profit of the supply chain network. To tackle the proposed multi-objective mixed-integer nonlinear programming model, an efficient hybrid metaheuristic algorithm is developed that combines a Taguchi-based non-dominated sorting genetic algorithm-II and particle swarm optimization. A variable neighborhood decomposition search is applied to enhance the local search process of the proposed hybrid solution algorithm. Computational results illustrate that the proposed model and solution algorithm are notably efficient in dealing with competitive pressure by adopting proper marketing strategies.

  20. Architectural Design Space Exploration of an FPGA-based Compressed Sampling Engine

    DEFF Research Database (Denmark)

    El-Sayed, Mohammad; Koch, Peter; Le Moullec, Yannick

    2015-01-01

    We present the architectural design space exploration of a compressed sampling engine for use in a wireless heart-rate monitoring system. We show how parallelism affects execution time at the register transfer level. Furthermore, two example solutions (modified semi-parallel and full...

  1. Predictive Sampling of Rare Conformational Events in Aqueous Solution: Designing a Generalized Orthogonal Space Tempering Method.

    Science.gov (United States)

    Lu, Chao; Li, Xubin; Wu, Dongsheng; Zheng, Lianqing; Yang, Wei

    2016-01-12

    In aqueous solution, solute conformational transitions are governed by intimate interplays of the fluctuations of solute-solute, solute-water, and water-water interactions. To promote molecular fluctuations to enhance sampling of essential conformational changes, a common strategy is to construct an expanded Hamiltonian through a series of Hamiltonian perturbations and thereby broaden the distribution of certain interactions of focus. Due to a lack of active sampling of configuration response to Hamiltonian transitions, it is challenging for common expanded Hamiltonian methods to robustly explore solvent-mediated rare conformational events. The orthogonal space sampling (OSS) scheme, as exemplified by the orthogonal space random walk and orthogonal space tempering methods, provides a general framework for synchronous acceleration of slow configuration responses. To more effectively sample conformational transitions in aqueous solution, in this work, we devised a generalized orthogonal space tempering (gOST) algorithm. Specifically, in the Hamiltonian perturbation part, a solvent-accessible-surface-area-dependent term is introduced to implicitly perturb near-solute water-water fluctuations; more importantly, in the orthogonal space response part, the generalized force order parameter is extended to a two-dimensional order parameter set, in which essential solute-solvent and solute-solute components are treated separately. The gOST algorithm is evaluated through a molecular dynamics simulation study on the explicitly solvated deca-alanine (Ala10) peptide. On the basis of a fully automated sampling protocol, the gOST simulation enabled repetitive folding and unfolding of the solvated peptide within a single continuous trajectory and allowed for detailed constructions of Ala10 folding/unfolding free energy surfaces. The gOST result reveals that solvent cooperative fluctuations play a pivotal role in Ala10 folding/unfolding transitions. In addition, our assessment

  2. Design Strategies for Balancing Exertion Games

    DEFF Research Database (Denmark)

    Jensen, Mads Møller; Grønbæk, Kaj

    2016-01-01

    In sports, if players' physical and technical abilities are mismatched, the competition is often uninteresting for them. With the emergence of exertion games, this could be changing. Player balancing, known from video games, allows players with different skill levels to compete; however, it is unclear how balancing mechanisms should be applied in exertion games, where physical and digital elements are fused. In this paper, we present an exertion game and three approaches for balancing it: a physical, an explicit-digital and an implicit-digital balancing approach. A user study that compares these three approaches is used to investigate the qualities and challenges within each approach and explore how the player experience is affected by them. Based on these findings, we suggest four design strategies for balancing exertion games, so that players will stay engaged in the game and contain...

  3. Rats track odour trails accurately using a multi-layered strategy with near-optimal sampling.

    Science.gov (United States)

    Khan, Adil Ghani; Sarangi, Manaswini; Bhalla, Upinder Singh

    2012-02-28

    Tracking odour trails is a crucial behaviour for many animals, often leading to food, mates or away from danger. It is an excellent example of active sampling, where the animal itself controls how to sense the environment. Here we show that rats can track odour trails accurately with near-optimal sampling. We trained rats to follow odour trails drawn on paper spooled through a treadmill. By recording local field potentials (LFPs) from the olfactory bulb, and sniffing rates, we find that sniffing but not LFPs differ between tracking and non-tracking conditions. Rats can track odours within ~1 cm, and this accuracy is degraded when one nostril is closed. Moreover, they show path prediction on encountering a fork, wide 'casting' sweeps on encountering a gap and detection of reappearance of the trail in 1-2 sniffs. We suggest that rats use a multi-layered strategy, and achieve efficient sampling and high accuracy in this complex task.

  4. Developing a weighting strategy to include mobile phone numbers into an ongoing population health survey using an overlapping dual-frame design with limited benchmark information.

    Science.gov (United States)

    Barr, Margo L; Ferguson, Raymond A; Hughes, Phil J; Steel, David G

    2014-09-04

    In 2012 mobile phone numbers were included into the ongoing New South Wales Population Health Survey (NSWPHS) using an overlapping dual-frame design. Previously in the NSWPHS the sample was selected using random digit dialing (RDD) of landline phone numbers. The survey was undertaken using computer assisted telephone interviewing (CATI). The weighting strategy needed to be significantly expanded to manage the differing probabilities of selection by frame, including that of children of mobile-only phone users, and to adjust for the increased chance of selection of dual-phone users. This paper describes the development of the final weighting strategy to properly combine the data from two overlapping sample frames accounting for the fact that population benchmarks for the different sampling frames were not available at the state or regional level. Estimates of the number of phone numbers for the landline and mobile phone frames used to calculate the differing probabilities of selection by frame, for New South Wales (NSW) and by stratum, were obtained by apportioning Australian estimates as none were available for NSW. The weighting strategy was then developed by calculating person selection probabilities, selection weights, applying a constant composite factor to the dual-phone users sample weights, and benchmarking to the latest NSW population by age group, sex and stratum. Data from the NSWPHS for the first quarter of 2012 was used to test the weighting strategy. This consisted of data on 3395 respondents with 2171 (64%) from the landline frame and 1224 (36%) from the mobile frame. However, in order to calculate the weights, data needed to be available for all core weighting variables and so 3378 respondents, 2933 adults and 445 children, had sufficient data to be included. Average person weights were 3.3 times higher for the mobile-only respondents, 1.3 times higher for the landline-only respondents and 1.7 times higher for dual-phone users in the mobile frame
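    A minimal sketch of the composite-weighting step described above, assuming simple random sampling of numbers within each frame; the frame sizes, sample sizes and the composite factor are illustrative assumptions, not the NSWPHS figures:

```python
# Sketch of composite weighting in an overlapping dual-frame telephone survey.
# All frame sizes, sample sizes and the composite factor are illustrative.

def base_weight(frame_size, sample_size):
    """Inverse selection probability under simple random sampling of numbers."""
    return frame_size / sample_size

def composite_weight(group, w_landline, w_mobile, lam=0.5):
    """Adjust dual-phone users' weights by a constant composite factor lam to
    correct for their increased chance of selection via either frame."""
    if group == "landline_only":
        return w_landline
    if group == "mobile_only":
        return w_mobile
    if group == "dual_landline":      # dual-phone user sampled via landline frame
        return lam * w_landline
    if group == "dual_mobile":        # dual-phone user sampled via mobile frame
        return (1.0 - lam) * w_mobile
    raise ValueError(f"unknown group: {group}")

# Illustrative numbers only:
w_l = base_weight(4_000_000, 2171)    # landline frame
w_m = base_weight(6_000_000, 1224)    # mobile frame
print(composite_weight("dual_landline", w_l, w_m))
```

    After this step, the survey's actual procedure also benchmarks the weights to population totals by age group, sex and stratum, which is not sketched here.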

  5. Usability Design Strategies for Children: Developing Children's Learning and Knowledge in Decreasing Their Dental Anxiety

    Science.gov (United States)

    Yahaya, Wan Ahmad Jaafar Wan; Salam, Sobihatun Nur Abdul

    2010-01-01

    This paper presents an example of how usability design strategies for children can be built into educational material, using a CD-ROM-based multimedia application that assists parents and teachers in developing children's learning and knowledge about decreasing dental anxiety, as well as motivating children aged 7-9 years old to reduce their anxious feelings towards…

  6. Designing Programme Implementation Strategies to Increase the Adoption and Use of Biosand Water Filters in Rural India

    OpenAIRE

    Tommy K.K. Ngai; Richard A. Fenner

    2014-01-01

    Low-cost household water treatment systems are innovations designed to improve the quality of drinking water at the point of use. This study investigates how an NGO can design appropriate programme strategies in order to increase the adoption and sustained use of household sand filters in rural India. A system dynamics computer model was developed and used to assess 18 potential programme strategies for their effectiveness in increasing filter use at two and ten years into the future, under s...

  7. The State of Environmentally Sustainable Interior Design Practice

    OpenAIRE

    Mihyun Kang; Denise A. Guerin

    2009-01-01

    Problem statement: Research that investigates how interior designers use environmentally sustainable interior design criteria in their design solutions has not been done. To provide a base to develop education strategies for sustainable interior design, this study examined the state of environmentally sustainable interior design practice. Approach: A national, Internet-based survey of interior design practitioners was conducted. To collect data, the random sample of US interior design practit...

  8. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    Science.gov (United States)

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies related to the distribution of metallic elements in biological samples are one of the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging the metallic elements in various kinds of biological samples. However, in such literature, there is a lack of articles dedicated to reviewing calibration strategies, and their problems, nomenclature, definitions, ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples including (1) nomenclature; (2) definitions, and (3) selected and sophisticated, examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize the division of calibration methods that are different than those hitherto used. This article is the first work in literature that refers to and emphasizes many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Designing, implementing and monitoring social impact mitigation strategies: Lessons from Forest Industry Structural Adjustment Packages

    International Nuclear Information System (INIS)

    Loxton, Edwina A.; Schirmer, Jacki; Kanowski, Peter

    2013-01-01

    Social impact mitigation strategies are implemented by the proponents of policies and projects with the intent of reducing the negative, and increasing the positive social impacts of their activities, and facilitating the achievement of policy/project goals. Evaluation of mitigation strategies is critical to improving their future success and cost-effectiveness. This paper evaluates two Forest Industry Structural Adjustment Packages (FISAP) implemented in Australia in the 1990s to 2000s as part of broader policy changes that reduced access to timber from publicly owned native forests. It assesses the effectiveness of the structure, design, implementation and monitoring of the FISAPs, and highlights the interactions between these four elements and their influence on social impacts. The two FISAPs were found to be effective in terms of reducing negative impacts, encouraging positive impacts and contributing towards policy goals, although they did not mitigate negative impacts in all cases, and sometimes interacted with external factors and additional policy changes to contribute to significant short and long term negative impacts. -- Highlights: ► Mitigation strategies aim to reduce negative and enhance positive social impacts ► Mitigation strategy design, implementation, and monitoring are critical to success ► Effective mitigation enhanced the capacity of recipients to respond to change ► Mitigation strategies influenced multiple interacting positive and negative impacts ► Success required good communication, transparency, support, resources and timing

  10. Evaluation of peptide designing strategy against subunit reassociation in mucin 1: A steered molecular dynamics approach.

    Directory of Open Access Journals (Sweden)

    J Lesitha Jeeva Kumari

    Full Text Available Subunit reassociation in mucin 1, a breast cancer tumor marker, is reported as one of the critical factors for its cytoplasmic activation. Inhibition of its heterodimeric association would therefore result in loss of its function and alter disease progression. The present study aimed at evaluating peptide inhibitor design strategies that may serve as antagonists against this receptor-ligand alliance. Several peptides and their derivatives were designed based on native residues, the subunit interface, hydrogen bonding and secondary structure. Docking studies with the peptides were carried out on the receptor subunit, and their binding affinities were evaluated using steered molecular dynamics simulation and umbrella sampling. Our results showed that among all the different classes of peptides evaluated, the receptor-based peptide showed the highest binding affinity. This result is concurrent with the experimental observation that the receptor-ligand alliance in mucin 1 is highly specific. Our results also show that a peptide ligand against this subunit association is stabilized only through native-residue inter-protein interactions, irrespective of the peptide structure, peptide length and number of hydrogen bonds. Consistency in binding affinity, pull force and free energy barrier was observed only with the receptor-derived peptides, which resulted in favorable inter-protein interactions at the interface. Several observations were made and discussed which will eventually lead to designing efficient peptide inhibitors against mucin 1 heterodimeric subunit reassociation.

  11. An Optimal Sample Data Usage Strategy to Minimize Overfitting and Underfitting Effects in Regression Tree Models Based on Remotely-Sensed Data

    Directory of Open Access Journals (Sweden)

    Yingxin Gu

    2016-11-01

    Full Text Available Regression tree models have been widely used for remote sensing-based ecosystem mapping. Improper use of the sample data (model training and testing data) may cause overfitting and underfitting effects in the model. The goal of this study is to develop an optimal sample data usage strategy for any dataset and identify an appropriate number of rules in the regression tree model that will improve its accuracy and robustness. Landsat 8 data and Moderate-Resolution Imaging Spectroradiometer-scaled Normalized Difference Vegetation Index (NDVI) were used to develop regression tree models. A Python procedure was designed to generate random replications of model parameter options across a range of model development data sizes and rule number constraints. The mean absolute difference (MAD) between the predicted and actual NDVI (scaled NDVI, with values from 0–200) and its variability across the different randomized replications were calculated to assess the accuracy and stability of the models. In our case study, a six-rule regression tree model developed from 80% of the sample data had the lowest MAD (MADtraining = 2.5 and MADtesting = 2.4), which was suggested as the optimal model. This study demonstrates how the training data and rule number selections impact model accuracy and provides important guidance for future remote-sensing-based ecosystem modeling.

  12. Mechanism Design of Fashion Virtual Enterprise under Monitoring Strategy

    Directory of Open Access Journals (Sweden)

    Min Huang

    2014-01-01

    Full Text Available Designing a revenue sharing contract to prevent moral hazard is one of the most important issues in a virtual enterprise (VE). As the partners’ productive effort levels cannot be observed by the owner and the other partners, a moral hazard problem usually arises in a VE. To mitigate the moral hazard, the owner sets a monitoring effort, at a monitoring cost. Considering a risk-neutral owner and multiple downside risk-averse partners, we address the owner’s problem of determining the monitoring effort and incentive intensity to maximize his profit while the partners determine their productive effort to maximize their own profits. A principal-agent model of this problem is proposed. By solving the model, the optimal strategies of the owner and the partners are derived. By comparing with the no-monitoring scenario, we find that implementing a suitable monitoring strategy can reduce the moral hazard effectively. Finally, analysis of the partners’ risk attitudes reveals that the lower the risk level of the partner is, the more the owner wants. These results suggest that a VE should focus not only on risk attitude but also on monitoring.

  13. Strategies for monitoring of priority pollutant emission barriers

    DEFF Research Database (Denmark)

    Pettersson, Maria; De Keyser, Webbey; Birch, Heidi

    2010-01-01

    The objective of Task 7.5 was to develop tools for model-based planning of sampling campaigns in the design of monitoring strategies for priority pollutant emission barriers. Using integrated urban wastewater system (IUWS) models, measurement campaigns can be designed to improve the calibration … to be implemented in the IUWS model, as well as the sampling and measuring devices that will be used. The simulation results are presented as a Substance Flow Analysis (SFA). These SFAs can be compared with empirical SFAs and can also be used to set up measurement campaigns aiming at gathering information...

  14. Assessment of off-design performance of a small-scale combined cooling and power system using an alternative operating strategy for gas turbine

    International Nuclear Information System (INIS)

    Han, Wei; Chen, Qiang; Lin, Ru-mou; Jin, Hong-guang

    2015-01-01

    Highlights: • We develop an off-design model for a CCP system driven by a gas turbine. • An alternative operating strategy is proposed to improve system performance. • Off-design performance of the combined cooling and power (CCP) system is enhanced. • Effects of the different operating strategies are analyzed and compared. • The performance enhancement mechanism of the proposed operating strategy is presented. - Abstract: A small-scale combined cooling and power (CCP) system usually serves district air conditioning in addition to power generation. The typical system consists of a gas turbine and an exhaust gas-fired absorption refrigerator. The surplus heat of the gas turbine is recovered to generate cooling energy. In this way, the CCP system has a high overall efficiency at the design point. However, the CCP system usually runs under off-design conditions because the users’ demand varies frequently. The operating strategy of the gas turbine affects the thermodynamic performance of both the turbine itself and the entire CCP system. Operating strategies for gas turbines include reducing the turbine inlet temperature (TIT) and compressor inlet air throttling (IAT). A CCP system, consisting of an OPRA gas turbine and a double-effect absorption refrigerator, is investigated to identify the effects of the different operating strategies. The CCP system is simulated based on partial-load models of the gas turbine and the absorption refrigerator. The off-design performance of the CCP system is compared under the different operating strategies. The results show that the IAT strategy is the better one. At 50% of the rated power output of the gas turbine, the IAT operating strategy can increase overall system efficiency by 10% compared with the TIT strategy. In general, the IAT operating strategy is suited to other gas turbines. However, the benefits of IAT should be investigated in the future, when a different gas turbine is adopted. This study may provide a new operating

  15. Prescription drug samples--does this marketing strategy counteract policies for quality use of medicines?

    Science.gov (United States)

    Groves, K E M; Sketris, I; Tett, S E

    2003-08-01

    Prescription drug samples, as used by the pharmaceutical industry to market their products, are of current interest because of their influence on prescribing, and their potential impact on consumer safety. Very little research has been conducted into the use and misuse of prescription drug samples, and the influence of samples on health policies designed to improve the rational use of medicines. This is a topical issue in the prescription drug debate, with increasing costs and increasing concerns about optimizing use of medicines. This manuscript critically evaluates the research that has been conducted to date about prescription drug samples, discusses the issues raised in the context of traditional marketing theory, and suggests possible alternatives for the future.

  16. A Cost-Constrained Sampling Strategy in Support of LAI Product Validation in Mountainous Areas

    Directory of Open Access Journals (Sweden)

    Gaofei Yin

    2016-08-01

    Full Text Available Increasing attention is being paid to leaf area index (LAI) retrieval in mountainous areas. Mountainous areas present extreme topographic variability, and are characterized by more spatial heterogeneity and inaccessibility compared with flat terrain. It is difficult to collect representative ground-truth measurements, and the validation of LAI in mountainous areas is still problematic. A cost-constrained sampling strategy (CSS) in support of LAI validation was presented in this study. To account for the influence of rugged terrain on implementation cost, a cost-objective function was incorporated into the traditional conditioned Latin hypercube (CLH) sampling strategy. A case study in Hailuogou, Sichuan province, China was used to assess the efficiency of CSS. Normalized difference vegetation index (NDVI), land cover type, and slope were selected as auxiliary variables to represent the variability of LAI in the study area. Results show that CSS can satisfactorily capture the variability across the site extent while minimizing field effort. One appealing feature of CSS is that the compromise between representativeness and implementation cost can be regulated according to the actual surface heterogeneity and budget constraints, which makes CSS flexible. Although the proposed method was only validated for the auxiliary variables rather than for LAI measurements, it serves as a starting point for establishing the locations of field plots and facilitates the preparation of field campaigns in mountainous areas.
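    A greatly simplified sketch of how a cost term can be folded into the sampling objective. The greedy selection, the `alpha` trade-off weight and the candidate scores are illustrative assumptions; the paper itself builds on conditioned Latin hypercube sampling, typically optimized with simulated annealing rather than a greedy pass:

```python
# Toy illustration: each candidate plot carries a representativeness loss
# (how poorly it covers the auxiliary-variable space) and an access cost
# (e.g. slope-driven). A lower combined objective is better.

def objective(candidate, alpha=0.5):
    """Weighted trade-off between representativeness loss and field cost."""
    return alpha * candidate["repr_loss"] + (1.0 - alpha) * candidate["cost"]

def select_plots(candidates, n, alpha=0.5):
    """Greedy stand-in for the annealing search used in conditioned LHS."""
    return sorted(candidates, key=lambda c: objective(c, alpha))[:n]

candidates = [
    {"id": "A", "repr_loss": 1.0, "cost": 9.0},  # representative but remote
    {"id": "B", "repr_loss": 2.0, "cost": 1.0},  # accessible, less representative
    {"id": "C", "repr_loss": 5.0, "cost": 5.0},
]
print([c["id"] for c in select_plots(candidates, 2)])
```

    Raising `alpha` favours representativeness over accessibility, which is exactly the budget-dependent compromise the abstract describes.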

  17. NATURAL VENTILATION: A PASSIVE DESIGN STRATEGY IN DESIGNING HOTEL LOBBIES – CASES FROM TROPICAL MALAYSIA

    Directory of Open Access Journals (Sweden)

    Abdul Malik Abdul Rahman

    2009-07-01

    Full Text Available When the Malaysian government increased electricity tariff by up to 12% in early 2006 and also another increase in early July 2008, most commercial buildings were affected by the move. The hardest hit would be the hotel industry as they are among the economic forefronts of the nation. Already burdened with the rigorous efforts of filling their rooms with guests, they now have to re-strategize to sustain business. Energy bills to pay for cooling have always been the biggest burden. Cooling the air is an intangible and a never-ending wasteful activity. Cold room for food is on for 24 hours for obvious reasons. To overcome this, one strategy was considered to be part and parcel of the overall building design so as to contribute to the reduction of the high dependency of energy consumption for cooling. The challenge here is to reduce electricity consumption without compromising the comfort of the guests and also reduce the overhead costs to give a more competitive edge in hotel room rates. Among other passive design elements this paper considers two natural ventilation occurrences and locations that can be relied upon for Malaysian hotel designs.

  18. A novel directional asymmetric sampling search algorithm for fast block-matching motion estimation

    Science.gov (United States)

    Li, Yue-e.; Wang, Qiang

    2011-11-01

    This paper proposes a novel directional asymmetric sampling search (DASS) algorithm for video compression. Making full use of the error information (block distortions) of the search patterns, eight different directional search patterns are designed for various situations. A local sampling search strategy is employed for the search of large motion vectors. To further speed up the search, an early termination strategy is adopted in the DASS procedure. Compared to conventional fast algorithms, the proposed method achieves the most satisfactory PSNR values for all test sequences.
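    The block distortion driving such search patterns is typically the sum of absolute differences (SAD); the following sketch shows SAD-based candidate selection, with the DASS direction patterns themselves omitted:

```python
def sad(block_a, block_b):
    """Sum of absolute differences: the block-distortion (error) measure."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def extract(frame, top, left, size):
    """Cut a size x size block out of a reference frame."""
    return [row[left:left + size] for row in frame[top:top + size]]

def best_match(block, ref_frame, candidates, size):
    """Pick the candidate (top, left) position with the lowest distortion,
    as a search pattern would at each step."""
    return min(candidates,
               key=lambda tl: sad(block, extract(ref_frame, tl[0], tl[1], size)))

current_block = [[5, 6], [7, 8]]
reference = [[0, 0, 0],
             [0, 5, 6],
             [0, 7, 8]]
print(best_match(current_block, reference, [(0, 0), (0, 1), (1, 1)], 2))  # → (1, 1)
```

    A pattern-based search such as DASS evaluates only a small, shaped subset of candidate positions per step and recentres on the winner, instead of exhaustively scanning the whole search window.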

  19. Using formative research to design a context-specific behaviour change strategy to improve infant and young child feeding practices and nutrition in Nepal.

    Science.gov (United States)

    Locks, Lindsey M; Pandey, Pooja R; Osei, Akoto K; Spiro, David S; Adhikari, Debendra P; Haselow, Nancy J; Quinn, Victoria J; Nielsen, Jennifer N

    2015-10-01

    Global recommendations on strategies to improve infant feeding, care and nutrition are clear; however, there is limited literature that explains methods for tailoring these recommendations to the local context where programmes are implemented. This paper aims to: (1) highlight the individual, cultural and environmental factors revealed by formative research to affect infant and young child feeding and care practices in Baitadi district of Far Western Nepal; and (2) outline how both quantitative and qualitative research methods were used to design a context-specific behaviour change strategy to improve child nutrition. Quantitative data on 750 children aged 12-23 months and their families were collected via surveys administered to mothers. The participants were selected using a multistage cluster sampling technique. The survey asked about knowledge, attitude and behaviours relating to infant and young child feeding. Qualitative data on breastfeeding and complementary feeding beliefs and practices were also collected from a separate sample via focus group discussions with mothers, and key informant interviews with mothers-in-law and husbands. Key findings revealed gaps in knowledge among many informants resulting in suboptimal infant and young child feeding practices - particularly with relation to duration of exclusive breastfeeding and dietary diversity of complementary foods. The findings from this research were then incorporated into a context-specific nutrition behaviour change communication strategy. © 2013 Helen Keller International © 2013 John Wiley & Sons, Ltd.

  20. IDENTIFYING PRODUCT AND PRICE STRATEGIES FOR DESIGNING TRANSACTIONAL BANKING PACKAGES ADDRESSED TO SMES (CONSIDERATIONS

    Directory of Open Access Journals (Sweden)

    Giuca Simona-Mihaela

    2014-07-01

    Full Text Available This paper aims to provide guidelines for designing efficient product and price strategies, through proposed business cases that could be used especially for banking products addressed to SMEs. While identifying the optimal product and price strategy and designing the product catalogue structure, the marketing specialist should consider existing portfolio behaviour and estimate the growth potential (if possible, of the overall portfolio), with a focus on accurately defining the additional impact of the newly proposed product or products. A business case contains estimates of the results to be generated by products to be launched or optimized. This paper presents complex schemes for business case scenarios covering the migration of an existing portfolio to new products, but also considers new client acquisition based on important features of the products. The pricing strategy is not a simple task to manage. Especially for transactional packages (for which the price is lower than the separate services included), some segments or clusters may generate losses for the bank if they previously used the services at a higher price than that of the package. Therefore, the decision to set specific prices needs to be based on an accurate and complex analysis, as presented in this paper. The assumptions used in a business case need to be relevant for the entire process of designing and launching a product; therefore, they can always be adjusted for a better calculation of the impact. Whether or not the assumptions and prices remain as initially proposed, the steps to be followed are the same. Segmentation also plays an important role in designing the product strategy, since the target for a product or product catalogue can be represented by a segment, a sub-segment or a cluster of many segments. The initial segmentation does not always represent the clustering for the product strategy. Sometimes, behaviour of existing clients

  1. Design and Promotion Strategy of Marketing Platform of Aquatic Auction based on Internet

    Science.gov (United States)

    Peng, Jianliang

    To support online trading and promotion of aquatic products and related materials between supply and demand, this paper proposes the design content and effective promotion strategies for an online marketing platform for aquatic auctions. Design elements involve the positioning of customer service and the basic functions of the platform, including the purchase of general orders, online auctions, information dissemination, recommendation of fine products, human services, and payment preferences. Based on network and mobile e-commerce transaction support, the auction platform handles the transaction of aquatic products well in advance. The results have important practical value for the design and application of an online marketing platform for aquatic auctions.

  2. Design and Demonstration of a Material-Plasma Exposure Target Station for Neutron Irradiated Samples

    International Nuclear Information System (INIS)

    Rapp, Juergen; Aaron, A. M.; Bell, Gary L.; Burgess, Thomas W.; Ellis, Ronald James; Giuliano, D.; Howard, R.; Kiggans, James O.; Lessard, Timothy L.; Ohriner, Evan Keith; Perkins, Dale E.; Varma, Venugopal Koikal

    2015-01-01

    Fusion energy is the most promising energy source for the future, and one of the most important problems to be solved in progressing to a commercial fusion reactor is the identification of plasma-facing materials compatible with the extreme conditions in the fusion reactor environment. The development of plasma-material interaction (PMI) science and the technology of plasma-facing components are key elements in the development of the next-step fusion device in the United States, the so-called Fusion Nuclear Science Facility (FNSF). All of these PMI issues and the uncertain impact of the 14-MeV neutron irradiation have been identified in numerous expert panel reports to the fusion community. The 2007 Greenwald report classifies reactor plasma-facing components (PFCs) and materials as the only Tier 1 issues, requiring a "... major extrapolation from the current state of knowledge, need for qualitative improvements and substantial development for both the short and long term." The Greenwald report goes on to list 19 gaps in understanding and performance related to the plasma-material interface for the technology facilities needed for DEMO-oriented R&D and DEMO itself. Of the 15 major gaps, six (G7, G9, G10, G12, G13) can possibly be addressed with ORNL's proposal of an advanced Material Plasma Exposure eXperiment. Establishing this mid-scale plasma materials test facility at ORNL is a key element in ORNL's strategy to secure a leadership role for decades of fusion R&D. That is to say, our end goal is to bring the "signature facility" FNSF home to ORNL. This project is related to the pre-conceptual design of an innovative target station for a future Material-Plasma Exposure eXperiment (MPEX). 
The target station will be designed to expose candidate fusion reactor plasma-facing materials and components (PFMs and PFCs) to conditions anticipated in fusion reactors, where PFCs will be exposed to dense high-temperature hydrogen plasmas providing steady-state heat fluxes of

  3. Proposed strategies for designing sustainable high-rise apartment buildings in Ho Chi Minh City responding to critical urban issues

    Science.gov (United States)

    Truong, Nguyen Hoang Long; Huan Giang, Ngoc; Binh Duong, Trong

    2018-03-01

    This paper aims at finding practical strategies for designing sustainable high-rise apartment buildings in Ho Chi Minh City that respond to varied municipal issues. Two steps are taken. Step 1 identifies the critical issues of Ho Chi Minh City associated with high-rise apartment building projects. Step 2 finds potential and applicable strategies that address the critical issues from Step 1, with reference to seven selected assessment methods. The study arrives at a set of 58 strategies applicable to designing sustainable high-rise apartment buildings in Ho Chi Minh City.

  4. The Personal Health Technology Design Space

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Frost, Mads

    2016-01-01

    Interest is increasing in personal health technologies that utilize mobile platforms for improved health and well-being. However, although a wide variety of these systems exist, each is designed quite differently and materializes many different and more or less explicit design assumptions. To enable designers to make informed and well-articulated design decisions, the authors propose a design space for personal health technologies. This space consists of 10 dimensions related to the design of data sampling strategies, visualization and feedback approaches, treatment models, and regulatory...

  5. Interactive Control System, Intended Strategy, Implemented Strategy dan Emergent Strategy

    OpenAIRE

    Tubagus Ismail; Darjat Sudrajat

    2012-01-01

    The purpose of this study was to examine the relationship between management control systems (MCS) and strategy formation processes, namely: intended strategy, emergent strategy and implemented strategy. The focus of MCS in this study was the interactive control system. The study was based on Structural Equation Modeling (SEM) as its multivariate analysis instrument. The samples were upper-middle managers of manufacturing companies in Banten Province, DKI Jakarta Province and West Java Province. AM...

  6. Evaluation of 5-FU pharmacokinetics in cancer patients with DPD deficiency using a Bayesian limited sampling strategy

    NARCIS (Netherlands)

    Van Kuilenburg, A.; Hausler, P.; Schalhorn, A.; Tanck, M.; Proost, J.H.; Terborg, C.; Behnke, D.; Schwabe, W.; Jabschinsky, K.; Maring, J.G.

    Aims: Dihydropyrimidine dehydrogenase (DPD) is the initial enzyme in the catabolism of 5-fluorouracil (5FU) and DPD deficiency is an important pharmacogenetic syndrome. The main purpose of this study was to develop a limited sampling strategy to evaluate the pharmacokinetics of 5FU and to detect

  7. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Science.gov (United States)

    Deitchler, Megan; Deconinck, Hedwig; Bergeron, Gilles

    2008-01-01

    The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 design for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct a LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design are appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data collection in emergency
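The precision trade-off among these cluster designs can be sketched numerically with the standard design-effect model, deff = 1 + (m − 1) × ICC. The prevalence and ICC values below are hypothetical illustrations, not figures from the study:

```python
import math

def ci_half_width(p, clusters, per_cluster, icc, z=1.96):
    """95% CI half-width for a prevalence estimate under cluster sampling,
    using the standard design effect deff = 1 + (m - 1) * ICC."""
    n = clusters * per_cluster
    deff = 1 + (per_cluster - 1) * icc
    return z * math.sqrt(p * (1 - p) * deff / n)

designs = {"30 x 30": (30, 30), "33 x 6": (33, 6), "67 x 3": (67, 3)}
for icc in (0.05, 0.30):  # low vs high intra-cluster correlation
    for name, (c, m) in designs.items():
        print(f"ICC={icc:.2f} {name}: +/- {ci_half_width(0.20, c, m, icc):.3f}")
```

With low ICC the larger 30 × 30 sample gives the narrowest intervals; with high ICC (as for the immunization and vitamin A indicators noted above) many small clusters win despite the smaller total sample.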

  8. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Directory of Open Access Journals (Sweden)

    Deconinck Hedwig

    2008-05-01

    Full Text Available Abstract The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 design for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct a LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design are appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data

  9. Parametric Design Strategies for Collaborative Urban Design

    DEFF Research Database (Denmark)

    Steinø, Nicolai; Yıldırım, Miray Baş; Özkar, Mine

    2013-01-01

    to the collaboration between professionals, participation by different non-professional stakeholders, such as residents, local authorities, non-governmental organizations and investors, is another important component of collaborative urban design processes. The involvement of community in decision making process...... implications of planning and design decisions, unless they are presented with relatively detailed architectural models, whether physical or virtual. This however, typically presents steep demands in terms of time and resources. As a foundation for our work with parametric urban design lies the hypothesis...... to solve different scripting challenges. The paper is organized into an introduction, three main sections and closing section with conclusions and perspectives. The first section of the paper gives a theoretical discussion of the notion of collaborative design and the challenges of collaborative urban...

  10. A Review of Different Behavior Modification Strategies Designed to Reduce Sedentary Screen Behaviors in Children

    Directory of Open Access Journals (Sweden)

    Jeremy A. Steeves

    2012-01-01

    Full Text Available Previous research suggests that reducing sedentary screen behaviors may be a strategy for preventing and treating obesity in children. This systematic review describes strategies used in interventions designed to either solely target sedentary screen behaviors or multiple health behaviors, including sedentary screen behaviors. Eighteen studies were included in this paper; eight targeting sedentary screen behaviors only, and ten targeting multiple health behaviors. All studies used behavior modification strategies for reducing sedentary screen behaviors in children (aged 1–12 years). Nine studies only used behavior modification strategies, and nine studies supplemented behavior modification strategies with an electronic device to enhance sedentary screen behaviors reductions. Many interventions (50%) significantly reduced sedentary screen behaviors; however, the magnitude of the significant reductions varied greatly (−0.44 to −3.1 h/day) and may have been influenced by the primary focus of the intervention, number of behavior modification strategies used, and other tools used to limit sedentary screen behaviors.

  11. A review of different behavior modification strategies designed to reduce sedentary screen behaviors in children.

    Science.gov (United States)

    Steeves, Jeremy A; Thompson, Dixie L; Bassett, David R; Fitzhugh, Eugene C; Raynor, Hollie A

    2012-01-01

    Previous research suggests that reducing sedentary screen behaviors may be a strategy for preventing and treating obesity in children. This systematic review describes strategies used in interventions designed to either solely target sedentary screen behaviors or multiple health behaviors, including sedentary screen behaviors. Eighteen studies were included in this paper; eight targeting sedentary screen behaviors only, and ten targeting multiple health behaviors. All studies used behavior modification strategies for reducing sedentary screen behaviors in children (aged 1-12 years). Nine studies only used behavior modification strategies, and nine studies supplemented behavior modification strategies with an electronic device to enhance sedentary screen behaviors reductions. Many interventions (50%) significantly reduced sedentary screen behaviors; however the magnitude of the significant reductions varied greatly (-0.44 to -3.1 h/day) and may have been influenced by the primary focus of the intervention, number of behavior modification strategies used, and other tools used to limit sedentary screen behaviors.

  12. Collaborative Design Strategy: Knowledge exchange for roof design

    NARCIS (Netherlands)

    Zeiler, W.; Quanjel, E.M.C.J.; Roozenburg, N.; Chen, L-L; Stappers, P.J.

    2011-01-01

    In the (Dutch) building industry, suboptimal use of knowledge by participants during the design phase causes damage and failure costs and hinders innovative sustainable solutions. Therefore a design tool was developed to support design knowledge exchange between different design team

  13. Reading Skills and Strategies: Assessing Primary School Students’ Awareness in L1 and EFL Strategy Use

    Directory of Open Access Journals (Sweden)

    Evdokimos Aivazoglou

    2014-09-01

    Full Text Available The present study was designed and conducted with the purpose of assessing primary school students' awareness in GL1 (Greek as first language) and EFL (English as a foreign language) strategy use and investigating the relations between reported reading strategy use in the first (L1) and foreign language (FL). The sample (455 students attending the fifth and sixth grades of primary schools in Northern Greece) was first categorized into skilled and less skilled L1 and EFL readers through screening reading comprehension tests, one in L1 and one in FL, before filling in the reading strategy questionnaires. The findings revealed participants' preference for "problem solving" strategies, with "global strategies" coming next. Girls proved to be more aware of their reading strategy use, with boys reporting more frequent use in both languages. Also, skilled readers were found to use reading strategies more effectively, and appeared to be more flexible in transferring strategies from L1 to FL compared to less skilled readers.

  14. Evolutionary optimization and game strategies for advanced multi-disciplinary design applications to aeronautics and UAV design

    CERN Document Server

    Periaux, Jacques; Lee, Dong Seop Chris

    2015-01-01

    Many complex aeronautical design problems can be formulated with efficient multi-objective evolutionary optimization methods and game strategies. This book describes the role of advanced innovative evolution tools in the solution, or the set of solutions, of single or multi-disciplinary optimization. These tools use the concepts of multi-population, asynchronous parallelization and hierarchical topology, which allow different models including precise, intermediate and approximate models, with each node belonging to a different hierarchical layer handled by a different evolutionary algorithm. The efficiency of evolutionary algorithms for both single and multi-objective optimization problems is significantly improved by the coupling of EAs with games, and in particular by a new dynamic methodology named "Hybridized Nash-Pareto games". Multi-objective optimization techniques and robust design problems taking into account uncertainties are introduced and explained in detail. Several applications dealing with c...

  15. Decision Tree and Survey Development for Support in Agricultural Sampling Strategies during Nuclear and Radiological Emergencies

    International Nuclear Information System (INIS)

    Yi, Amelia Lee Zhi; Dercon, Gerd

    2017-01-01

    In the event of a severe nuclear or radiological accident, the release of radionuclides results in contamination of land surfaces affecting agricultural and food resources. Speedy accumulation of information and guidance on decision making is essential in enhancing the ability of stakeholders to strategize for immediate countermeasure strategies. Support tools such as decision trees and sampling protocols allow for swift response by governmental bodies and assist in proper management of the situation. While such tools exist, they focus mainly on protecting public well-being and not food safety management strategies. Consideration of the latter is necessary as it has long-term implications especially to agriculturally dependent Member States. However, it is a research gap that remains to be filled.

  16. Limited sampling strategies drawn within 3 hours postdose poorly predict mycophenolic acid area-under-the-curve after enteric-coated mycophenolate sodium.

    NARCIS (Netherlands)

    Winter, B.C. de; Gelder, T. van; Mathôt, R.A.A.; Glander, P.; Tedesco-Silva, H.; Hilbrands, L.B.; Budde, K.; Hest, R.M. van

    2009-01-01

    Previous studies predicted that limited sampling strategies (LSS) for estimation of mycophenolic acid (MPA) area-under-the-curve (AUC(0-12)) after ingestion of enteric-coated mycophenolate sodium (EC-MPS) using a clinically feasible sampling scheme may have poor predictive performance. Failure of

  17. A novel sampling design to explore gene-longevity associations

    DEFF Research Database (Denmark)

    De Rango, Francesco; Dato, Serena; Bellizzi, Dina

    2008-01-01

    To investigate the genetic contribution to familial similarity in longevity, we set up a novel experimental design where cousin-pairs born from siblings who were concordant or discordant for the longevity trait were analyzed. To check this design, two chromosomal regions already known to encompass...... from concordant and discordant siblings. In addition, we analyzed haplotype transmission from centenarians to offspring, and a statistically significant Transmission Ratio Distortion (TRD) was observed for both chromosomal regions in the discordant families (P=0.007 for 6p21.3 and P=0.015 for 11p15.......5). In concordant families, a marginally significant TRD was observed at 6p21.3 only (P=0.06). Although no significant difference emerged between the two groups of cousin-pairs, our study gave new insights on the hindrances to recruiting a suitable sample to obtain significant IBD data on longevity...

  18. Reproducibility of R-fMRI metrics on the impact of different strategies for multiple comparison correction and sample sizes.

    Science.gov (United States)

    Chen, Xiao; Lu, Bin; Yan, Chao-Gan

    2018-01-01

    Concerns regarding reproducibility of resting-state functional magnetic resonance imaging (R-fMRI) findings have been raised. Little is known about how to operationally define R-fMRI reproducibility and to what extent it is affected by multiple comparison correction strategies and sample size. We comprehensively assessed two aspects of reproducibility, test-retest reliability and replicability, on widely used R-fMRI metrics in both between-subject contrasts of sex differences and within-subject comparisons of eyes-open and eyes-closed (EOEC) conditions. We noted that the permutation test with Threshold-Free Cluster Enhancement (TFCE), a strict multiple comparison correction strategy, reached the best balance between family-wise error rate (under 5%) and test-retest reliability/replicability (e.g., 0.68 for test-retest reliability and 0.25 for replicability of amplitude of low-frequency fluctuations (ALFF) for between-subject sex differences, 0.49 for replicability of ALFF for within-subject EOEC differences). Although R-fMRI indices attained moderate reliability, they replicated poorly in distinct datasets (replicability < 0.3 for between-subject sex differences, < 0.5 for within-subject EOEC differences). By randomly drawing different sample sizes from a single site, we found reliability, sensitivity and positive predictive value (PPV) rose as sample size increased. Small sample sizes (e.g., < 80 [40 per group]) not only minimized power (sensitivity < 2%), but also decreased the likelihood that significant results reflect "true" effects (PPV < 0.26) in sex differences. Our findings have implications for how to select multiple comparison correction strategies and highlight the importance of sufficiently large sample sizes in R-fMRI studies to enhance reproducibility. Hum Brain Mapp 39:300-318, 2018. © 2017 Wiley Periodicals, Inc.
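The relation between power (which rises with sample size) and the chance that a significant result reflects a true effect can be made concrete with the standard positive-predictive-value identity. This is a generic sketch, not the authors' computation; the prior probability of a true effect is a hypothetical input:

```python
def ppv(power, alpha=0.05, prior=0.2):
    """Positive predictive value of a significant finding:
    P(true effect | significant) = power*prior / (power*prior + alpha*(1-prior))."""
    return (power * prior) / (power * prior + alpha * (1 - prior))

# Power grows with sample size, and PPV grows with power:
for power in (0.02, 0.30, 0.80):
    print(f"power={power:.2f}  PPV={ppv(power):.2f}")
```

At the 2% sensitivity reported for very small groups, under this assumed prior fewer than one in ten significant results would reflect a true effect.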

  19. Simulation based design strategy for EMC compliance of components in hybrid vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Maass, Uwe; Ndip, Ivan; Hoene, Eckard; Guttowski, Stephan [Fraunhofer-Institut fuer Zuverlaessigkeit und Mikrointegration (IZM), Berlin (Germany); Tschoban, Christian; Lang, Klaus-Dieter [Technische Univ. Berlin (Germany)

    2012-11-01

    The design of components for the power train of hybrid vehicles needs to take into account EMC compliance standards related to hazardous electromagnetic fields. Using a simulation-based design strategy allows for virtual EMC tests in parallel to the mechanical/electrical power design and thus reduces (re-)design time and costs. Taking as an example a high-voltage battery for a hybrid vehicle, the emitted magnetic fields outside the battery are examined. The simulation strategy is based on 3D EM simulations using a full-wave and an eddy-current solver. The simulation models are based on the actual CAD data from the mechanical construction, resulting in a high geometrical aspect ratio. The impact of simulation-specific aspects such as boundary conditions and excitation is given. It was found that using field simulations it is possible to identify noise sources and coupling paths as well as aid the construction of the battery. (orig.)

  20. Evaluating sampling strategy for DNA barcoding study of coastal and inland halo-tolerant Poaceae and Chenopodiaceae: A case study for increased sample size.

    Science.gov (United States)

    Yao, Peng-Cheng; Gao, Hai-Yan; Wei, Ya-Nan; Zhang, Jian-Hang; Chen, Xiao-Yong; Li, Hong-Qing

    2017-01-01

    Environmental conditions in coastal salt marsh habitats have led to the development of specialist genetic adaptations. We evaluated six DNA barcode loci of the 53 species of Poaceae and 15 species of Chenopodiaceae from China's coastal salt marsh area and inland area. Our results indicate that the optimum DNA barcode was ITS for coastal salt-tolerant Poaceae and matK for the Chenopodiaceae. Sampling strategies for ten common species of Poaceae and Chenopodiaceae were analyzed according to the optimum barcode. We found that by increasing the number of samples collected from the coastal salt marsh area on the basis of inland samples, the number of haplotypes of Arundinella hirta, Digitaria ciliaris, Eleusine indica, Imperata cylindrica, Setaria viridis, and Chenopodium glaucum increased, with a principal coordinate plot clearly showing increased distribution points. The results of a Mann-Whitney test showed that for Digitaria ciliaris, Eleusine indica, Imperata cylindrica, and Setaria viridis, the distribution of intraspecific genetic distances was significantly different when samples from the coastal salt marsh area were included (P < 0.05). For Imperata cylindrica and Chenopodium album, average intraspecific distance tended to reach stability. These results indicate that the sample size for DNA barcoding of globally distributed species should be increased to 11-15.
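A Mann-Whitney comparison of the kind used above can be reproduced with a short rank-based implementation. This is a generic normal-approximation sketch (no tie correction in the variance), not the authors' analysis, and the genetic-distance values are made up:

```python
import math
from statistics import NormalDist

def midranks(values):
    """1-based ranks, averaging over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return ranks

def mann_whitney(x, y):
    """Two-sided Mann-Whitney U test, normal approximation (no tie correction)."""
    n1, n2 = len(x), len(y)
    ranks = midranks(list(x) + list(y))
    u = sum(ranks[:n1]) - n1 * (n1 + 1) / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - n1 * n2 / 2) / sigma
    return u, 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical intraspecific distances: inland-only vs inland + coastal samples
inland = [0.001, 0.002, 0.002, 0.003, 0.003, 0.004, 0.004, 0.005]
with_coastal = [0.004, 0.005, 0.006, 0.006, 0.007, 0.008, 0.009, 0.010]
u, p = mann_whitney(inland, with_coastal)
print(u, round(p, 4))
```

For the small samples typical of barcoding work, an exact test (e.g. `scipy.stats.mannwhitneyu`) would normally be preferred over this large-sample approximation.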

  1. Assessment of long-term gas sampling design at two commercial manure-belt layer barns.

    Science.gov (United States)

    Chai, Li-Long; Ni, Ji-Qin; Chen, Yan; Diehl, Claude A; Heber, Albert J; Lim, Teng T

    2010-06-01

    Understanding temporal and spatial variations of aerial pollutant concentrations is important for designing air quality monitoring systems. In long-term and continuous air quality monitoring in large livestock and poultry barns, these systems usually use location-shared analyzers and sensors and can only sample air at a limited number of locations. To assess the validity of the gas sampling design at a commercial layer farm, a new methodology was developed to map pollutant gas concentrations using portable sensors under steady-state or quasi-steady-state barn conditions. Three assessment tests were conducted from December 2008 to February 2009 in two manure-belt layer barns. Each barn was 140.2 m long and 19.5 m wide and had 250,000 birds. Each test included four measurements of ammonia and carbon dioxide concentrations at 20 locations that covered all operating fans, including six of the fans used in the long-term sampling that represented three zones along the lengths of the barns, to generate data for complete-barn monitoring. To simulate the long-term monitoring, gas concentrations from the six long-term sampling locations were extracted from the 20 assessment locations. Statistical analyses were performed to test the variances (F-test) and sample means (t-test) between the 6- and 20-sample data. The study clearly demonstrated ammonia and carbon dioxide concentration gradients that were characterized by increasing concentrations from the west to east ends of the barns following the under-cage manure-belt travel direction. Mean concentrations increased from 7.1 to 47.7 parts per million (ppm) for ammonia and from 2303 to 3454 ppm for carbon dioxide from the west to east of the barns. Variations of mean gas concentrations were much less apparent between the south and north sides of the barns, because they were 21.2 and 20.9 ppm for ammonia and 2979 and 2951 ppm for carbon dioxide, respectively. 
The null hypotheses that the variances and means between the 6- and 20
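The variance and mean comparisons described above rest on the classical variance-ratio F statistic and a t statistic for the means. A minimal sketch with hypothetical ammonia readings (not the study's data):

```python
import math
from statistics import mean, variance

def f_statistic(a, b):
    """Variance-ratio F statistic (larger sample variance on top)."""
    va, vb = variance(a), variance(b)
    return max(va, vb) / min(va, vb)

def welch_t(a, b):
    """Welch's t statistic for comparing means with unequal variances."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

six_loc = [7.5, 12.0, 21.0, 30.0, 38.0, 46.0]  # hypothetical NH3 at 6 locations, ppm
twenty_loc = [7.1, 9.0, 11.5, 14.0, 16.0, 18.5, 20.0, 22.0, 24.0, 26.0,
              28.0, 30.0, 32.0, 34.5, 36.0, 38.0, 40.0, 42.5, 45.0, 47.7]
print(round(f_statistic(six_loc, twenty_loc), 2), round(welch_t(six_loc, twenty_loc), 2))
```

Comparing the computed statistics against the F and t critical values (or p-values) then decides whether the 6-location subset represents the 20-location measurements.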

  2. Sampling design considerations for demographic studies: a case of colonial seabirds

    Science.gov (United States)

    Kendall, William L.; Converse, Sarah J.; Doherty, Paul F.; Naughton, Maura B.; Anders, Angela; Hines, James E.; Flint, Elizabeth

    2009-01-01

    For the purposes of making many informed conservation decisions, the main goal for data collection is to assess population status and allow prediction of the consequences of candidate management actions. Reducing the bias and variance of estimates of population parameters reduces uncertainty in population status and projections, thereby reducing the overall uncertainty under which a population manager must make a decision. In capture-recapture studies, imperfect detection of individuals, unobservable life-history states, local movement outside study areas, and tag loss can cause bias or precision problems with estimates of population parameters. Furthermore, excessive disturbance to individuals during capture-recapture sampling may be of concern because disturbance may have demographic consequences. We address these problems using as an example a monitoring program for Black-footed Albatross (Phoebastria nigripes) and Laysan Albatross (Phoebastria immutabilis) nesting populations in the northwestern Hawaiian Islands. To mitigate these estimation problems, we describe a synergistic combination of sampling design and modeling approaches. Solutions include multiple capture periods per season and multistate, robust design statistical models, dead recoveries and incidental observations, telemetry and data loggers, buffer areas around study plots to neutralize the effect of local movements outside study plots, and double banding and statistical models that account for band loss. We also present a variation on the robust capture-recapture design and a corresponding statistical model that minimizes disturbance to individuals. For the albatross case study, this less invasive robust design was more time efficient and, when used in combination with a traditional robust design, reduced the standard error of detection probability by 14% with only two hours of additional effort in the field. These field techniques and associated modeling approaches are applicable to studies of

  3. Standard methods for sampling and sample preparation for gamma spectroscopy

    International Nuclear Information System (INIS)

    Taskaeva, M.; Taskaev, E.; Nikolov, P.

    1993-01-01

    The strategy for sampling and sample preparation is outlined: the necessary number of samples; analysis and treatment of the results received; the quantity of analysed material according to the radionuclide concentrations and analytical methods; and the minimal quantity and kind of data needed for drawing final conclusions and making decisions on the basis of the results received. This strategy was tested in gamma-spectroscopic analysis of radionuclide contamination in the region of the Eleshnitsa Uranium Mines. The water samples were taken and stored according to ASTM D 3370-82. The general sampling procedures were in conformity with the recommendations of ISO 5667. The radionuclides were concentrated by coprecipitation with iron hydroxide and ion exchange. The sampling of soil samples complied with the rules of ASTM C 998, and their sample preparation with ASTM C 999. After preparation the samples were sealed hermetically and measured. (author)
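The "necessary number of samples" step in such a strategy is commonly derived from the observed variability and the tolerable uncertainty of the mean. A generic sketch with hypothetical numbers (not values from the Eleshnitsa survey):

```python
import math

def samples_needed(sd, margin, z=1.96):
    """Minimum number of samples so that the 95% CI half-width of the
    mean activity concentration does not exceed `margin` (same units as sd)."""
    return math.ceil((z * sd / margin) ** 2)

# Hypothetical: SD of 12 Bq/kg among soil samples, target half-width of 5 Bq/kg
print(samples_needed(12, 5))
```

Halving the tolerable margin roughly quadruples the required number of samples, which is why the strategy ties sample counts to the analytical task at hand.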

  4. The design and analysis of a teaching and learning strategy in Biophysics Course

    Directory of Open Access Journals (Sweden)

    Beatriz Aiziczon

    2010-01-01

    Full Text Available This work presents the design and analysis of a teaching and learning strategy for Biophysics in the medical degree program, within the framework of the Ausubelian Significant Learning Model, to move beyond the transmission-reception model of knowledge. It is an integrative module constructed from our previous theoretical model and based on the authors' previous works (AIZICZON; CUDMANI, 2004, 2005, 2007). We analyze applications of the concept-map strategy and advance organizers in medical education (AUSUBEL, 1981; MOREIRA, 1983, 1999), promoting the integration of concepts and allowing progressive differentiation and integrative reconciliation as well as formative evaluation. In this work we analyze the experience with teachers.

  5. Classifier-guided sampling for discrete variable, discontinuous design space exploration: Convergence and computational performance

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shahan, David W. [HRL Labs., LLC, Malibu, CA (United States); Seepersad, Carolyn Conner [Univ. of Texas, Austin, TX (United States)

    2014-04-22

    A classifier-guided sampling (CGS) method is introduced for solving engineering design optimization problems with discrete and/or continuous variables and continuous and/or discontinuous responses. The method merges concepts from metamodel-guided sampling and population-based optimization algorithms. The CGS method uses a Bayesian network classifier for predicting the performance of new designs based on a set of known observations or training points. Unlike most metamodeling techniques, however, the classifier assigns a categorical class label to a new design, rather than predicting the resulting response in continuous space, and thereby accommodates nondifferentiable and discontinuous functions of discrete or categorical variables. The CGS method uses these classifiers to guide a population-based sampling process towards combinations of discrete and/or continuous variable values with a high probability of yielding preferred performance. Accordingly, the CGS method is appropriate for discrete/discontinuous design problems that are ill-suited for conventional metamodeling techniques and too computationally expensive to be solved by population-based algorithms alone. In addition, the rates of convergence and computational properties of the CGS method are investigated when applied to a set of discrete variable optimization problems. Results show that the CGS method significantly improves the rate of convergence towards known global optima, on average, when compared to genetic algorithms.
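The core loop of the record above — train a classifier on already-evaluated designs, then steer sampling toward candidates the classifier labels as promising — can be sketched in a few lines. Everything here is an illustrative assumption: the toy objective `f`, the 4×4 discrete domain, and a categorical naive Bayes classifier standing in for the paper's Bayesian network classifier.

```python
import random
from collections import Counter

def f(x):
    """Toy discrete objective to minimize (invented for illustration)."""
    return (x[0] - 2) ** 2 + (x[1] - 1) ** 2

def train_naive_bayes(points, labels, n_values):
    """Categorical naive Bayes with Laplace smoothing, standing in for the
    Bayesian network classifier of the CGS method."""
    prior = Counter(labels)
    counts = {c: [Counter() for _ in points[0]] for c in prior}
    for x, y in zip(points, labels):
        for i, v in enumerate(x):
            counts[y][i][v] += 1
    def prob(c, x):
        p = prior[c] / len(labels)
        for i, v in enumerate(x):
            p *= (counts[c][i][v] + 1) / (prior[c] + n_values)
        return p
    return prob

random.seed(0)
domain = [(a, b) for a in range(4) for b in range(4)]   # 16 candidate designs
train = random.sample(domain, 8)                        # designs already evaluated
good = set(sorted(train, key=f)[:len(train) // 2])      # label the better half "good"
labels = ["good" if x in good else "bad" for x in train]
prob = train_naive_bayes(train, labels, n_values=4)

def posterior_good(x):
    g, b = prob("good", x), prob("bad", x)
    return g / (g + b)

# guidance step: rank all candidates by predicted probability of performing well
ranked = sorted(domain, key=posterior_good, reverse=True)
```

Evaluating the top-ranked candidates next is the guidance step; a full CGS loop would iterate, adding each newly evaluated point to the training set before re-ranking.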

  6. Designing a two-rank acceptance sampling plan for quality inspection of geospatial data products

    Science.gov (United States)

    Tong, Xiaohua; Wang, Zhenhua; Xie, Huan; Liang, Dan; Jiang, Zuoqin; Li, Jinchao; Li, Jun

    2011-10-01

    To address the disadvantages of classical sampling plans designed for traditional industrial products, we propose an original two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first rank sampling plan is to inspect the lot consisting of map sheets, and the second is to inspect the lot consisting of features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, which covers two lot size cases. The first case is for a small lot size with nonconformities being modeled by a hypergeometric distribution function, and the second is for a larger lot size with nonconformities being modeled by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that: (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items and (2) the proposed acceptance sampling plan based on TRASP performs better than other classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard used specifically for industrial outputs, i.e., "lots with different sizes corresponding to the same sampling plan."
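The two lot-size cases in the record above correspond to two standard acceptance-probability models, which can be computed directly. This is a generic single-stage sketch with hypothetical numbers, not the TRASP optimization itself:

```python
from math import comb, exp, factorial

def accept_prob_hypergeometric(lot_size, defectives, sample_size, c):
    """Small-lot case: P(accept) = P(nonconforming items found in the sample <= c)."""
    total = comb(lot_size, sample_size)
    dmax = min(c, defectives, sample_size)
    return sum(comb(defectives, d) * comb(lot_size - defectives, sample_size - d)
               for d in range(dmax + 1)) / total

def accept_prob_poisson(sample_size, defect_rate, c):
    """Large-lot approximation: nonconforming count ~ Poisson(n * p)."""
    lam = sample_size * defect_rate
    return sum(exp(-lam) * lam ** d / factorial(d) for d in range(c + 1))
```

Choosing the sample size `n` and acceptance number `c` so that these probabilities meet producer's and consumer's risk targets at the AQL is the optimization the paper formulates.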

  7. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    Science.gov (United States)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the

  8. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimen for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimen from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical

  9. Optimal sampling designs for large-scale fishery sample surveys in Greece

    Directory of Open Access Journals (Sweden)

    G. BAZIGOS

    2007-12-01

    The paper deals with the optimization of the following three large-scale sample surveys: biological sample survey of commercial landings (BSCL), experimental fishing sample survey (EFSS), and commercial landings and effort sample survey (CLES).

  10. motivational strategies and possible influence on secondary school ...

    African Journals Online (AJOL)

    UDUAK

    This study investigated the influence of motivational strategies on teachers' teaching performance in public secondary schools in Uyo – Urban, Akwa Ibom State. One hypothesis was formulated to guide the study, and an ex post facto design was adopted for the study. A sample of three hundred and sixty (360) teachers were ...

  11. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    OpenAIRE

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-01-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve ten-day to five-month sea-ice forecasts. As target regions for the forecasts we select the Chukchi Sea, a...

  12. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    KAUST Repository

    Beck, Joakim

    2018-02-19

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized for a specified error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a single-loop Monte Carlo method that uses the Laplace approximation of the return value of the inner loop. The first demonstration example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.

  13. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    Science.gov (United States)

    Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl

    2018-06-01

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
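The classical double-loop Monte Carlo estimator that the record above improves upon can be sketched for a toy linear-Gaussian model, where the expected information gain has the closed form 0.5·ln(1 + σ_θ²/σ_ε²) for comparison. The model and sample sizes are illustrative assumptions; the paper's Laplace-based importance sampling replaces the naive inner loop shown here.

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def eig_double_loop(sigma_theta, sigma_eps, n_outer=1000, m_inner=1000, seed=1):
    """Nested Monte Carlo estimate of expected information gain for y = theta + eps.
    The inner prior sample is reused across outer iterations (a common simplification)."""
    rng = random.Random(seed)
    inner = [rng.gauss(0.0, sigma_theta) for _ in range(m_inner)]
    total = 0.0
    for _ in range(n_outer):
        theta = rng.gauss(0.0, sigma_theta)        # draw parameter from the prior
        y = theta + rng.gauss(0.0, sigma_eps)      # simulate an observation
        like = gauss_pdf(y, theta, sigma_eps)      # p(y | theta)
        evidence = sum(gauss_pdf(y, t, sigma_eps) for t in inner) / m_inner  # inner loop
        total += math.log(like) - math.log(evidence)
    return total / n_outer

est = eig_double_loop(1.0, 1.0)
```

With σ_θ = σ_ε = 1 the estimate should be close to 0.5·ln 2 ≈ 0.347. The inner evidence average is exactly the quantity that underflows when the inner sample is small, which motivates the importance-sampling variant.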

  14. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    Science.gov (United States)

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
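The stratified method described in the record above can be sketched as follows: an auxiliary value per location defines strata, grains are sampled at random within each stratum, and stratum estimates are combined with population weights. Here a hypothetical distance-decay curve stands in for the gene-flow model output, and the auxiliary variable is taken to equal the true local rate (i.e. a perfect model); all numbers are invented for illustration.

```python
import math
import random
from statistics import mean

def stratified_rate_estimate(rates, strata_bounds, n_per_stratum, rng):
    """Stratified estimate of the field-wide transgene presence rate.
    `rates` holds the true presence probability at each location."""
    strata = {}
    for i, r in enumerate(rates):
        key = sum(r >= b for b in strata_bounds)        # stratum from auxiliary value
        strata.setdefault(key, []).append(i)
    n_total = len(rates)
    estimate = 0.0
    for members in strata.values():
        sampled = rng.sample(members, min(n_per_stratum, len(members)))
        hits = sum(rng.random() < rates[i] for i in sampled)   # simulated grain tests
        estimate += (len(members) / n_total) * (hits / len(sampled))
    return estimate

rng = random.Random(7)
# hypothetical field: presence rate decays with distance to the closest GM field
rates = [0.05 * math.exp(-d / 20) for d in range(200)]
true_rate = sum(rates) / len(rates)
est = mean(stratified_rate_estimate(rates, [0.01, 0.03], 30, rng) for _ in range(200))
```

Because each stratum is relatively homogeneous, the stratified estimator attains a given accuracy with fewer grains than simple random sampling over the whole field, which is the paper's central finding.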

  15. Single-subject withdrawal designs in delayed matching-to-sample procedures

    OpenAIRE

    Eilifsen, Christoffer; Arntzen, Erik

    2011-01-01

    In most studies of delayed matching-to-sample (DMTS) and stimulus equivalence, the delay has remained fixed throughout a single experimental condition. We wanted to expand on the DMTS and stimulus equivalence literature by examining the effects of using titrating delays with different starting points during the establishment of conditional discriminations prerequisite for stimulus equivalence. In Experiment 1, a variation of a single-subject withdrawal design was used. Ten adults were exposed...

  16. An Alternative View of Some FIA Sample Design and Analysis Issues

    Science.gov (United States)

    Paul C. Van Deusen

    2005-01-01

    Sample design and analysis decisions are the result of compromises and inputs from many sources. The end result would likely change if different individuals or groups were involved in the planning process. Discussed here are some alternatives to the procedures that are currently being used for the annual inventory. The purpose is to indicate that alternatives exist and...

  17. Lessons Learned in Evaluating a Multisite, Comprehensive Teen Dating Violence Prevention Strategy: Design and Challenges of the Evaluation of Dating Matters: Strategies to Promote Healthy Teen Relationships.

    Science.gov (United States)

    Niolon, Phyllis Holditch; Taylor, Bruce G; Latzman, Natasha E; Vivolo-Kantor, Alana M; Valle, Linda Anne; Tharp, Andra T

    2016-03-01

    This paper describes the multisite, longitudinal cluster randomized controlled trial (RCT) design of the evaluation of the Dating Matters: Strategies to Promote Healthy Relationships initiative, and discusses challenges faced in conducting this evaluation. Health departments in 4 communities are partnering with middle schools in high-risk, urban communities to implement 2 models of teen dating violence (TDV) prevention over 4 years. Schools were randomized to receive either the Dating Matters comprehensive strategy or the "standard of care" strategy (an existing, evidence-based TDV prevention curriculum). Our design permits comparison of the relative effectiveness of the comprehensive and standard of care strategies. Multiple cohorts of students from 46 middle schools are surveyed in middle school and high school, and parents and educators from participating schools are also surveyed. Challenges discussed in conducting a multisite RCT include site variability, separation of implementation and evaluation responsibilities, school retention, parent engagement in research activities, and working within the context of high-risk urban schools and communities. We discuss the strengths and weaknesses of our approaches to these challenges in the hopes of informing future research. Despite multiple challenges, the design of the Dating Matters evaluation remains strong. We hope this paper provides researchers who are conducting complex evaluations of behavioral interventions with thoughtful discussion of the challenges we have faced and potential solutions to such challenges.

  18. Optimization of sample preparation variables for wedelolactone from Eclipta alba using Box-Behnken experimental design followed by HPLC identification.

    Science.gov (United States)

    Patil, A A; Sachin, B S; Shinde, D B; Wakte, P S

    2013-07-01

    Coumestan wedelolactone is an important phytocomponent from Eclipta alba (L.) Hassk. It possesses diverse pharmacological activities, which have prompted the development of various extraction techniques and strategies for its better utilization. The aim of the present study is to develop and optimize supercritical carbon dioxide assisted sample preparation and HPLC identification of wedelolactone from E. alba (L.) Hassk. The response surface methodology was employed to study the optimization of sample preparation using supercritical carbon dioxide for wedelolactone from E. alba (L.) Hassk. The optimized sample preparation involves the investigation of quantitative effects of sample preparation parameters viz. operating pressure, temperature, modifier concentration and time on yield of wedelolactone using Box-Behnken design. The wedelolactone content was determined using validated HPLC methodology. The experimental data were fitted to second-order polynomial equation using multiple regression analysis and analyzed using the appropriate statistical method. By solving the regression equation and analyzing 3D plots, the optimum extraction conditions were found to be: extraction pressure, 25 MPa; temperature, 56 °C; modifier concentration, 9.44% and extraction time, 60 min. Optimum extraction conditions demonstrated wedelolactone yield of 15.37 ± 0.63 mg/100 g E. alba (L.) Hassk, which was in good agreement with the predicted values. Temperature and modifier concentration showed significant effect on the wedelolactone yield. The supercritical carbon dioxide extraction showed higher selectivity than the conventional Soxhlet assisted extraction method. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
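A Box-Behnken design of the kind used in the record above pairs every two factors at their ±1 coded levels while holding the others at the midpoint, plus center replicates. A minimal generator, assuming the standard coded-level construction (the four factor names are taken from the abstract; the number of center points is an illustrative choice):

```python
from itertools import combinations, product

def box_behnken(k, center_points=3):
    """Coded-level Box-Behnken design: every factor pair takes all (+/-1, +/-1)
    combinations while the remaining factors sit at the 0 midpoint."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0] * k for _ in range(center_points)])
    return runs

# four factors: pressure, temperature, modifier concentration, extraction time
design = box_behnken(4)
```

For four factors this yields the classic 27-run design (24 edge-midpoint runs plus 3 center replicates), to which the second-order polynomial of the response surface is then fitted.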

  19. Standard operating procedures for collection of soil and sediment samples for the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study

    Science.gov (United States)

    Fisher, Shawn C.; Reilly, Timothy J.; Jones, Daniel K.; Benzel, William M.; Griffin, Dale W.; Loftin, Keith A.; Iwanowicz, Luke R.; Cohl, Jonathan A.

    2015-12-17

    An understanding of the effects on human and ecological health brought by major coastal storms or flooding events is typically limited because of a lack of regionally consistent baseline and trends data in locations proximal to potential contaminant sources and mitigation activities, sensitive ecosystems, and recreational facilities where exposures are probable. In an attempt to close this gap, the U.S. Geological Survey (USGS) has implemented the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study to collect regional sediment-quality data prior to and in response to future coastal storms. The standard operating procedure (SOP) detailed in this document serves as the sample-collection protocol for the SCoRR strategy by providing step-by-step instructions for site preparation, sample collection and processing, and shipping of soil and surficial sediment (for example, bed sediment, marsh sediment, or beach material). The objectives of the SCoRR strategy pilot study are (1) to create a baseline of soil-, sand-, marsh sediment-, and bed-sediment-quality data from sites located in the coastal counties from Maine to Virginia based on their potential risk of being contaminated in the event of a major coastal storm or flooding (defined as Resiliency mode); and (2) respond to major coastal storms and flooding by reoccupying select baseline sites and sampling within days of the event (defined as Response mode). For both modes, samples are collected in a consistent manner to minimize bias and maximize quality control by ensuring that all sampling personnel across the region collect, document, and process soil and sediment samples following the procedures outlined in this SOP. Samples are analyzed using four USGS-developed screening methods—inorganic geochemistry, organic geochemistry, pathogens, and biological assays—which are also outlined in this SOP. Because the SCoRR strategy employs a multi-metric approach for sample analyses, this

  20. A nested-PCR strategy for molecular diagnosis of mollicutes in uncultured biological samples from cows with vulvovaginitis.

    Science.gov (United States)

    Voltarelli, Daniele Cristina; de Alcântara, Brígida Kussumoto; Lunardi, Michele; Alfieri, Alice Fernandes; de Arruda Leme, Raquel; Alfieri, Amauri Alcindo

    2018-01-01

    Bacteria classified in Mycoplasma (M. bovis and M. bovigenitalium) and Ureaplasma (U. diversum) genera are associated with granular vulvovaginitis that affect heifers and cows at reproductive age. The traditional means for detection and speciation of mollicutes from clinical samples have been culture and serology. However, challenges experienced with these laboratory methods have hampered assessment of their impact in pathogenesis and epidemiology in cattle worldwide. The aim of this study was to develop a PCR strategy to detect and primarily discriminate between the main species of mollicutes associated with reproductive disorders of cattle in uncultured clinical samples. In order to amplify the 16S-23S rRNA internal transcribed spacer region of the genome, a consensual and species-specific nested-PCR assay was developed to identify and discriminate between main species of mollicutes. In addition, 31 vaginal swab samples from dairy and beef affected cows were investigated. This nested-PCR strategy was successfully employed in the diagnosis of single and mixed mollicute infections of diseased cows from cattle herds from Brazil. The developed system enabled the rapid and unambiguous identification of the main mollicute species known to be associated with this cattle reproductive disorder through differential amplification of partial fragments of the ITS region of mollicute genomes. The development of rapid and sensitive tools for mollicute detection and discrimination without the need for previous cultures or sequencing of PCR products is a high priority for accurate diagnosis in animal health. Therefore, the PCR strategy described herein may be helpful for diagnosis of this class of bacteria in genital swabs submitted to veterinary diagnostic laboratories, not demanding expertise in mycoplasma culture and identification. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Relative effectiveness of context-based teaching strategy on senior ...

    African Journals Online (AJOL)

    This study adopted the quasi experimental research design to examine the relative effectiveness of context-based teaching strategy on senior secondary school students' achievements in inorganic chemistry. The sample consists of 451 SSII chemistry students (224 males and 227 females) drawn from four out of 46 ...

  2. STRATEGI PEMBELAJARAN, KEMAMPUAN AKADEMIK, KEMAMPUAN PEMECAHAN MASALAH, DAN HASIL BELAJAR BIOLOGI

    OpenAIRE

    I Wayan Karmana

    2012-01-01

    Abstract: The Learning Strategy, Academic Capability, Problem Solving Skills, and Cognitive Achievement In Biology. This study investigates the effects of learning strategy, academic capability, and their interaction on problem solving skills, critical thinking, metacognitive awareness, and cognitive achievement in biology. This is a quasi experimental study using pretest-posttest non-equivalent control group design. The sample includes 60 tenth grade students of Senior High School 4 Mataram....

  3. Design strategy for improving the energy efficiency in series hydraulic/electric synergy system

    International Nuclear Information System (INIS)

    Ramakrishnan, R.; Hiremath, Somashekhar S.; Singaperumal, M.

    2014-01-01

    Battery is a vital subsystem in an electric vehicle with a regenerative braking system. The energy efficiency of an electric vehicle is improved by storing the regenerated energy in an electric battery during braking and reusing it during subsequent acceleration. A battery possesses a relatively poor power density and charges the regenerated energy slowly when compared to hydro-pneumatic accumulators. A series hydraulic/electric synergy system – an energy-efficient mechatronic system – is proposed to overcome the drawbacks of the conventional electric vehicle with regenerative braking. Even though the electric battery provides a higher energy density than the accumulator system, optimal sizing of the hydro-pneumatic accumulator and of other process parameters in the system is required to provide better energy density and efficiency. However, a trade-off prevails between the energy delivered and the energy consumed by the system. This gives rise to a multiple-objective problem. The proposed multi-objective design optimization procedure, based on an evolutionary strategy algorithm, maximizes the energy efficiency of the system. The system simulation results after optimization show that the optimal system parameters increase the energy efficiency by 3% and the hydraulic regeneration efficiency by 17.3%. The suggested design methodology provides a basis for the design of a series hydraulic/electric synergy system as an energy-efficient and zero-emission system. - Highlights: • Dynamic analysis of SHESS to investigate energy efficiency. • Optimization of system parameters based on multi-objective design strategy. • Evaluation of improvements in system energy efficiency and hydraulic regeneration energy. • Identification of conditions at which hydraulic regenerative efficiency is maximized for minimum energy consumption. • Results confirm advantages of using SHESS
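The evolutionary-strategy idea in the record above can be illustrated with a minimal single-objective (1+1)-ES. The toy efficiency surrogate, its peak location, and all parameters are invented for illustration; the actual study optimizes multiple objectives simultaneously.

```python
import random

def one_plus_one_es(objective, x0, sigma=0.5, iters=300, seed=3):
    """Minimal (1+1) evolution strategy with a 1/5th-success-style step-size rule."""
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = objective(cand)
        if fc > fx:                 # keep the offspring only if it improves
            x, fx = cand, fc
            sigma *= 1.5            # expand step size after a success
        else:
            sigma *= 0.9            # contract after a failure
    return x, fx

def efficiency(x):
    """Hypothetical smooth surrogate for system energy efficiency, peaking at an
    invented optimal accumulator size x[0] and precharge pressure x[1]."""
    return 1.0 - 0.1 * (x[0] - 3.0) ** 2 - 0.1 * (x[1] - 1.5) ** 2

best_x, best_f = one_plus_one_es(efficiency, [0.0, 0.0])
```

A multi-objective variant would replace the single scalar comparison with a dominance check over energy delivered and energy consumed, retaining a Pareto front rather than a single incumbent.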

  4. Design and Demonstration of a Material-Plasma Exposure Target Station for Neutron Irradiated Samples

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, Juergen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Aaron, A. M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bell, Gary L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Burgess, Thomas W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ellis, Ronald James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Giuliano, D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Howard, R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kiggans, James O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lessard, Timothy L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ohriner, Evan Keith [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Perkins, Dale E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Varma, Venugopal Koikal [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-10-20

    Fusion energy is the most promising energy source for the future, and one of the most important problems to be solved progressing to a commercial fusion reactor is the identification of plasma-facing materials compatible with the extreme conditions in the fusion reactor environment. The development of plasma–material interaction (PMI) science and the technology of plasma-facing components are key elements in the development of the next step fusion device in the United States, the so-called Fusion Nuclear Science Facility (FNSF). All of these PMI issues and the uncertain impact of the 14-MeV neutron irradiation have been identified in numerous expert panel reports to the fusion community. The 2007 Greenwald report classifies reactor plasma-facing components (PFCs) and materials as the only Tier 1 issues, requiring a “. . . major extrapolation from the current state of knowledge, need for qualitative improvements and substantial development for both the short and long term.” The Greenwald report goes on to list 19 gaps in understanding and performance related to the plasma–material interface for the technology facilities needed for DEMO-oriented R&D and DEMO itself. Of the 15 major gaps, six (G7, G9, G10, G12, G13) can possibly be addressed with ORNL’s proposal of an advanced Material Plasma Exposure eXperiment. Establishing this mid-scale plasma materials test facility at ORNL is a key element in ORNL’s strategy to secure a leadership role for decades of fusion R&D. That is to say, our end goal is to bring the “signature facility” FNSF home to ORNL. This project is related to the pre-conceptual design of an innovative target station for a future Material–Plasma Exposure eXperiment (MPEX). The target station will be designed to expose candidate fusion reactor plasma-facing materials and components (PFMs and PFCs) to conditions anticipated in fusion reactors, where PFCs will be exposed to dense high-temperature hydrogen plasmas providing steady

  5. A UAV-Based Fog Collector Design for Fine-Scale Aerobiological Sampling

    Science.gov (United States)

    Gentry, Diana; Guarro, Marcello; Demachkie, Isabella Siham; Stumfall, Isabel; Dahlgren, Robert P.

    2017-01-01

    Airborne microbes are found throughout the troposphere and into the stratosphere. Knowing how the activity of airborne microorganisms can alter water, carbon, and other geochemical cycles is vital to a full understanding of local and global ecosystems. Just as on the land or in the ocean, atmospheric regions vary in habitability; the underlying geochemical, climatic, and ecological dynamics must be characterized at different scales to be effectively modeled. Most aerobiological studies have focused on a high level: 'How high are airborne microbes found?' and 'How far can they travel?' Most fog and cloud water studies collect from stationary ground stations (point) or along flight transects (1D). To complement and provide context for this data, we have designed a UAV-based modified fog and cloud water collector to retrieve 4D-resolved samples for biological and chemical analysis.Our design uses a passive impacting collector hanging from a rigid rod suspended between two multi-rotor UAVs. The suspension design reduces the effect of turbulence and potential for contamination from the UAV downwash. The UAVs are currently modeled in a leader-follower configuration, taking advantage of recent advances in modular UAVs, UAV swarming, and flight planning.The collector itself is a hydrophobic mesh. Materials including Tyvek, PTFE, nylon, and polypropylene monofilament fabricated via laser cutting, CNC knife, or 3D printing were characterized for droplet collection efficiency using a benchtop atomizer and particle counter. Because the meshes can be easily and inexpensively fabricated, a set can be pre-sterilized and brought to the field for 'hot swapping' to decrease cross-contamination between flight sessions or use as negative controls.An onboard sensor and logging system records the time and location of each sample; when combined with flight tracking data, the samples can be resolved into a 4D volumetric map of the fog bank. 
Collected samples can be returned to the lab for

  6. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcome. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.
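The estimator behind such a stratified random design can be sketched in a few lines. The stratum sizes and oiled-area measurements below are invented for illustration; they are not data from the Prince William Sound surveys.

```python
import math

def stratified_estimate(strata):
    """Stratified estimate of total oiled area and its standard error.

    strata: list of dicts with
      N - number of sampling units (e.g., beach segments) in the stratum
      y - list of oiled-area measurements from the sampled units
    """
    total, var = 0.0, 0.0
    for s in strata:
        n = len(s["y"])
        mean = sum(s["y"]) / n
        s2 = sum((v - mean) ** 2 for v in s["y"]) / (n - 1)
        total += s["N"] * mean                           # expand sample mean to stratum total
        var += s["N"] ** 2 * (1 - n / s["N"]) * s2 / n   # with finite-population correction
    return total, math.sqrt(var)

# Invented data: two strata of heavily and lightly oiled beach segments
strata = [
    {"N": 60,  "y": [12.0, 8.5, 15.2, 9.8]},     # heavily oiled stratum
    {"N": 240, "y": [0.0, 1.1, 0.0, 0.4, 0.2]},  # lightly oiled stratum
]
total, se = stratified_estimate(strata)
```

Stratifying by expected oiling concentrates effort where variability is high; the standard error then supports the temporal-trend tests the abstract calls for.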

  7. Frontiers in biomaterials the design, synthetic strategies and biocompatibility of polymer scaffolds for biomedical application

    CERN Document Server

    Cao, Shunsheng

    2014-01-01

    "Frontiers in Biomaterials: The Design, Synthetic Strategies and Biocompatibility of Polymer Scaffolds for Biomedical Application, Volume 1" highlights the importance of biomaterials and their interaction with biological systems. The need for the development of biomaterials as scaffolds for tissue regeneration is driven by the increasing demand for materials that mimic the functions of the extracellular matrices of body tissues. This ebook covers the latest challenges on the biocompatibility of scaffolds over time after implantation and discusses the requirement of innovative technologies and strategies f

  8. A two-stage Bayesian design with sample size reestimation and subgroup analysis for phase II binary response trials.

    Science.gov (United States)

    Zhong, Wei; Koopmeiners, Joseph S; Carlin, Bradley P

    2013-11-01

    Frequentist sample size determination for binary outcome data in a two-arm clinical trial requires initial guesses of the event probabilities for the two treatments. Misspecification of these event rates may lead to a poor estimate of the necessary sample size. In contrast, the Bayesian approach, which considers the treatment effect to be a random variable having some distribution, may offer a better, more flexible approach. The Bayesian sample size proposed by Whitehead et al. (2008) for exploratory studies on efficacy justifies the acceptable minimum sample size by a "conclusiveness" condition. In this work, we introduce a new two-stage Bayesian design with sample size reestimation at the interim stage. Our design inherits the properties of good interpretation and easy implementation from Whitehead et al. (2008), generalizes their method to a two-sample setting, and uses a fully Bayesian predictive approach to reduce an overly large initial sample size when necessary. Moreover, our design can be extended to allow patient-level covariates via logistic regression, now adjusting sample size within each subgroup based on interim analyses. We illustrate the benefits of our approach with a design in non-Hodgkin lymphoma with a simple binary covariate (patient gender), offering an initial step toward within-trial personalized medicine. Copyright © 2013 Elsevier Inc. All rights reserved.
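A building block of such a design is the posterior predictive probability that the trial, if run to its planned size, will end in success. The sketch below uses a conjugate Beta-binomial model with an illustrative success rule (final posterior probability that the response rate exceeds p0 must exceed 0.95); it is not the exact criterion of Whitehead et al. (2008) or of this paper, and all counts are invented.

```python
from math import lgamma, log, exp

def beta_binom_pmf(k, n, a, b):
    """P(k successes in n future patients) under a Beta(a, b) posterior."""
    logc = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
    logb = (lgamma(a + k) + lgamma(b + n - k) - lgamma(a + b + n)
            - (lgamma(a) + lgamma(b) - lgamma(a + b)))
    return exp(logc + logb)

def post_prob_gt(x, n, p0, a=1.0, b=1.0, steps=4000):
    """Pr(response rate > p0 | x successes in n) for a Beta(a+x, b+n-x)
    posterior, by trapezoidal integration of the density over (p0, 1)."""
    aa, bb = a + x, b + n - x
    log_norm = lgamma(aa) + lgamma(bb) - lgamma(aa + bb)
    h = (1.0 - p0) / steps
    total = 0.0
    for i in range(steps + 1):
        p = min(max(p0 + i * h, 1e-12), 1 - 1e-12)
        dens = exp((aa - 1) * log(p) + (bb - 1) * log(1 - p) - log_norm)
        total += dens * (0.5 if i in (0, steps) else 1.0)
    return total * h

def predictive_success(x1, n1, n2, p0, a=1.0, b=1.0, conf=0.95):
    """Probability, given interim data (x1 of n1), that the final analysis
    of n1 + n2 patients declares success: Pr(rate > p0) > conf."""
    aa, bb = a + x1, b + n1 - x1
    return sum(beta_binom_pmf(k, n2, aa, bb)
               for k in range(n2 + 1)
               if post_prob_gt(x1 + k, n1 + n2, p0, a, b) > conf)

# Interim: 15 responders among the first 20 patients; 20 more planned.
pp = predictive_success(x1=15, n1=20, n2=20, p0=0.4)
```

If the predictive probability is already very high, the second stage can be trimmed; if it is very low, the trial might stop for futility, which is the intuition behind interim sample size reestimation.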

  9. Sampling design and procedures for fixed surface-water sites in the Georgia-Florida coastal plain study unit, 1993

    Science.gov (United States)

    Hatzell, H.H.; Oaksford, E.T.; Asbury, C.E.

    1995-01-01

    The implementation of design guidelines for the National Water-Quality Assessment (NAWQA) Program has resulted in the development of new sampling procedures and the modification of existing procedures commonly used in the Water Resources Division of the U.S. Geological Survey. The Georgia-Florida Coastal Plain (GAFL) study unit began the intensive data collection phase of the program in October 1992. This report documents the implementation of the NAWQA guidelines by describing the sampling design and procedures for collecting surface-water samples in the GAFL study unit in 1993. This documentation is provided for agencies that use water-quality data and for future study units that will be entering the intensive phase of data collection. The sampling design is intended to account for large- and small-scale spatial variations, and temporal variations, in water quality for the study area. Nine fixed sites were selected in drainage basins of different sizes and different land-use characteristics located in different land-resource provinces. Each of the nine fixed sites was sampled regularly for a combination of six constituent groups composed of physical and chemical constituents: field measurements, major ions and metals, nutrients, organic carbon, pesticides, and suspended sediments. Some sites were also sampled during high-flow conditions and storm events. Discussion of the sampling procedure is divided into three phases: sample collection, sample splitting, and sample processing. At the four fixed sites that were sampled intensively, a cone splitter was used to split approximately nine liters of stream water into the volumes required for analysis of each constituent group except organic carbon. An example of the sample splitting schemes designed to provide the sample volumes required for each constituent group is described in detail.
Information about onsite sample processing has been organized into a flowchart that describes a pathway for each of

  10. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    Science.gov (United States)

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

    Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some but inconclusive evidence of effect modification by a genomic marker. Two-stage designs allow early stopping for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker-positive and marker-negative subgroups and the prevalence of marker-positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios, where at least some effect is present in the marker-negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size, as compared with the group sequential design, when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.

  11. Solid-Phase Extraction Strategies to Surmount Body Fluid Sample Complexity in High-Throughput Mass Spectrometry-Based Proteomics

    Science.gov (United States)

    Bladergroen, Marco R.; van der Burgt, Yuri E. M.

    2015-01-01

    For large-scale and standardized applications in mass spectrometry (MS)-based proteomics, automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply to both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE), including affinity enrichment strategies, have been automated. Obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI) without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071

  12. Evaluating sampling strategy for DNA barcoding study of coastal and inland halo-tolerant Poaceae and Chenopodiaceae: A case study for increased sample size.

    Directory of Open Access Journals (Sweden)

    Peng-Cheng Yao

    Full Text Available Environmental conditions in coastal salt marsh habitats have led to the development of specialist genetic adaptations. We evaluated six DNA barcode loci of the 53 species of Poaceae and 15 species of Chenopodiaceae from China's coastal salt marsh area and inland area. Our results indicate that the optimum DNA barcode was ITS for coastal salt-tolerant Poaceae and matK for the Chenopodiaceae. Sampling strategies for ten common species of Poaceae and Chenopodiaceae were analyzed according to the optimum barcode. We found that by increasing the number of samples collected from the coastal salt marsh area on the basis of inland samples, the number of haplotypes of Arundinella hirta, Digitaria ciliaris, Eleusine indica, Imperata cylindrica, Setaria viridis, and Chenopodium glaucum increased, with a principal coordinate plot clearly showing increased distribution points. The results of a Mann-Whitney test showed that for Digitaria ciliaris, Eleusine indica, Imperata cylindrica, and Setaria viridis, the distribution of intraspecific genetic distances was significantly different when samples from the coastal salt marsh area were included (P < 0.01). These results suggest that increasing the sample size in specialist habitats can improve measurements of intraspecific genetic diversity, and will have a positive effect on the application of DNA barcodes in widely distributed species. The results of random sampling showed that when sample size reached 11 for Chloris virgata, Chenopodium glaucum, and Dysphania ambrosioides, 13 for Setaria viridis, and 15 for Eleusine indica, Imperata cylindrica and Chenopodium album, average intraspecific distance tended to reach stability. These results indicate that the sample size for DNA barcoding of globally distributed species should be increased to 11-15.
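The random-subsampling check for a plateau in haplotype counts can be sketched as follows; the population and haplotype frequencies are invented, not taken from the study.

```python
import random

def haplotype_accumulation(haplotypes, reps=200, seed=1):
    """Mean number of distinct haplotypes found in random subsamples of
    each size n = 1..len(haplotypes), drawn without replacement."""
    rng = random.Random(seed)
    curve = []
    for n in range(1, len(haplotypes) + 1):
        counts = [len(set(rng.sample(haplotypes, n))) for _ in range(reps)]
        curve.append(sum(counts) / reps)
    return curve

# Invented population: 20 individuals carrying 5 haplotypes at uneven frequencies
pop = ["H1"] * 8 + ["H2"] * 5 + ["H3"] * 4 + ["H4"] * 2 + ["H5"]
curve = haplotype_accumulation(pop)
```

The sample size at which the curve flattens indicates roughly how many individuals are needed to capture the population's haplotype diversity, which is the logic behind the 11-15 recommendation.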

  13. Parameter-Invariant Hierarchical Exclusive Alphabet Design for 2-WRC with HDF Strategy

    Directory of Open Access Journals (Sweden)

    T. Uřičář

    2010-01-01

    Full Text Available Hierarchical eXclusive Code (HXC for the Hierarchical Decode and Forward (HDF strategy in the Wireless 2-Way Relay Channel (2-WRC has the achievable rate region extended beyond the classical MAC region. Although direct HXC design is in general highly complex, a layered approach to HXC design is a feasible solution. While the outer layer code of the layered HXC can be any state-of-the-art capacity approaching code, the inner layer must be designed in such a way that the exclusive property of hierarchical symbols (received at the relay will be provided. The simplest case of the inner HXC layer is a simple signal space channel symbol memoryless mapper called Hierarchical eXclusive Alphabet (HXA. The proper design of HXA is important, especially in the case of parametric channels, where channel parametrization (e.g. phase rotation can violate the exclusive property of hierarchical symbols (as seen by the relay, resulting in significant capacity degradation. In this paper we introduce an example of a geometrical approach to Parameter-Invariant HXA design, and we show that the corresponding hierarchical MAC capacity region extends beyond the classical MAC region, irrespective of the channel pametrization.

  14. Sustained Attention Across the Life Span in a Sample of 10,000: Dissociating Ability and Strategy.

    Science.gov (United States)

    Fortenbaugh, Francesca C; DeGutis, Joseph; Germine, Laura; Wilmer, Jeremy B; Grosso, Mallory; Russo, Kathryn; Esterman, Michael

    2015-09-01

    Normal and abnormal differences in sustained visual attention have long been of interest to scientists, educators, and clinicians. Still lacking, however, is a clear understanding of how sustained visual attention varies across the broad sweep of the human life span. In the present study, we filled this gap in two ways. First, using an unprecedentedly large 10,430-person sample, we modeled age-related differences with substantially greater precision than have prior efforts. Second, using the recently developed gradual-onset continuous performance test (gradCPT), we parsed sustained-attention performance over the life span into its ability and strategy components. We found that after the age of 15 years, the strategy and ability trajectories saliently diverge. Strategy becomes monotonically more conservative with age, whereas ability peaks in the early 40s and is followed by a gradual decline in older adults. These observed life-span trajectories for sustained attention are distinct from results of other life-span studies focusing on fluid and crystallized intelligence. © The Author(s) 2015.
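The ability/strategy decomposition is conceptually close to signal detection theory's separation of sensitivity (d′) from response criterion (c). Below is a minimal sketch of that classical computation with invented counts; it illustrates the general idea, not the gradCPT's exact scoring.

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, fas, crs):
    """Sensitivity (d') and response criterion (c) from a confusion table.

    A log-linear correction (+0.5) keeps rates away from exactly 0 or 1,
    where the inverse normal CDF is undefined.
    """
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (fas + 0.5) / (fas + crs + 1)
    d_prime = z(hit_rate) - z(fa_rate)          # ability: target/non-target separation
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # strategy: conservative (> 0) vs liberal (< 0)
    return d_prime, criterion

# Invented performance: 90/100 targets detected, 20/100 false alarms
d, c = dprime_criterion(hits=90, misses=10, fas=20, crs=80)
```

Under this framing, the paper's finding reads as c drifting upward (more conservative) with age while d′ peaks in the early 40s.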

  15. Design-based estimators for snowball sampling

    OpenAIRE

    Shafie, Termeh

    2010-01-01

    Snowball sampling, where existing study subjects recruit further subjects from among their acquaintances, is a popular approach when sampling from hidden populations. Since people with many in-links are more likely to be selected, there will be a selection bias in the samples obtained. In order to eliminate this bias, the sample data must be weighted. However, the exact selection probabilities are unknown for snowball samples and need to be approximated in an appropriate way. This paper proposes d...
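The weighting idea can be sketched as follows. Treating each subject's acquaintance count (degree) as proportional to their selection probability is the simplifying assumption here; the paper's actual approximations are more refined, and the data below are invented.

```python
def degree_weighted_mean(values, degrees):
    """Hansen-Hurwitz-style mean: each subject weighted by 1/degree,
    since subjects with many in-links are more likely to be recruited."""
    weights = [1.0 / d for d in degrees]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Invented sample in which high-degree subjects are over-represented
values  = [10, 12, 30, 32, 35]   # outcome of interest
degrees = [1, 1, 10, 10, 10]     # acquaintance counts (selection-probability proxy)
naive = sum(values) / len(values)               # biased toward well-connected subjects
weighted = degree_weighted_mean(values, degrees)
```

If the outcome correlates with connectedness, the naive mean overshoots; the inverse-degree weights pull the estimate back toward the hidden population.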

  16. Service platforms management strategy: case study of an interior design firm

    Directory of Open Access Journals (Sweden)

    Leonel Del Rey de Melo Filho

    2015-03-01

    Full Text Available Platform management is a strategic tool for firms of various sizes, although it remains little studied in the service sector. The aim of this paper is to investigate the use of platform management to achieve flexibility and operational dynamism in service projects. The studied platform is evaluated as a strategic resource in a particular case. The contributions of the service platform were explored from Resource-Based View (RBV) and Service Marketing (SM) perspectives, to study their effects on the firm's performance. The research strategy used was an exploratory case study in an interior design firm. The data collection techniques included participant observation, document analysis, and a focus group with firm managers. The research demonstrated that platform management is a strategic resource that assists with the planning of internal capabilities and market positioning, and provides better customer service.

  17. Sampling and energy evaluation challenges in ligand binding protein design.

    Science.gov (United States)

    Dou, Jiayi; Doyle, Lindsey; Greisen, Per Jr; Schena, Alberto; Park, Hahnbeom; Johnsson, Kai; Stoddard, Barry L; Baker, David

    2017-12-01

    The steroid hormone 17α-hydroxyprogesterone (17-OHP) is a biomarker for congenital adrenal hyperplasia and hence there is considerable interest in development of sensors for this compound. We used computational protein design to generate protein models with binding sites for 17-OHP containing an extended, nonpolar, shape-complementary binding pocket for the four-ring core of the compound, and hydrogen bonding residues at the base of the pocket to interact with carbonyl and hydroxyl groups at the more polar end of the ligand. Eight of 16 designed proteins experimentally tested bind 17-OHP with micromolar affinity. A co-crystal structure of one of the designs revealed that 17-OHP is rotated 180° around a pseudo-two-fold axis in the compound and displays multiple binding modes within the pocket, while still interacting with all of the designed residues in the engineered site. Subsequent rounds of mutagenesis and binding selection improved the ligand affinity to nanomolar range, while appearing to constrain the ligand to a single bound conformation that maintains the same "flipped" orientation relative to the original design. We trace the discrepancy in the design calculations to two sources: first, a failure to model subtle backbone changes which alter the distribution of sidechain rotameric states and second, an underestimation of the energetic cost of desolvating the carbonyl and hydroxyl groups of the ligand. The difference between design model and crystal structure thus arises from both sampling limitations and energy function inaccuracies that are exacerbated by the near two-fold symmetry of the molecule. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  18. Teaching Strategies and Gender in Higher Education Instrumental Studios

    Science.gov (United States)

    Zhukov, Katie

    2012-01-01

    This study investigates instrumental music teaching strategies in higher education settings, in order to identify those employed and their frequency and context of use. An instrument- and gender-balanced sample of 24 lessons from five institutions was analysed using a researcher-designed observational instrument. The results reveal the…

  19. Effects of Direct and Indirect Instructional Strategies on Students ...

    African Journals Online (AJOL)

    This is a quasi-experimental study designed to determine the effects of Direct and Indirect instructional strategies on Mathematics achievement among junior secondary school students. The population consisted of students in a public secondary school in Owerri, Imo State. A sample of 102 students from two (2) intact ...

  20. THE EFFECT OF THE VEE HEURISTIC STRATEGY ON THE ABILITY TO UNDERSTAND MATHEMATICAL CONCEPTS

    Directory of Open Access Journals (Sweden)

    Otong Suhyanto

    2016-12-01

    only design. The subjects of this study were 55 university students, comprising 28 in the experimental group and 27 in the control group, selected by cluster random sampling. Students' ability to understand mathematical concepts was measured with an essay test. The results reveal that the mathematical concept understanding of students taught with the Vee heuristic learning strategy was higher than that of students taught with a conventional learning strategy: the mean test score was 83.96 for the Vee heuristic group versus 78.3 for the conventional group.

  1. iPhone application development strategies for efficient mobile design and delivery

    CERN Document Server

    Hahn, Jim

    2011-01-01

    iPhone application development is explained here in an accessible treatment for the generalist Library and Information Science (LIS) practitioner. Future information-seeking practices by users will take place across a diverse array of ubiquitous computing devices. iPhone applications represent one of the most compelling new platforms for which to remediate and re-engineer library service. Strategies of efficient mobile design and delivery include adapting computing best practices of data independence and adhering to web standards as articulated by the W3C. These best practices apply across the

  2. Learning algebra through MCREST strategy in junior high school students

    Science.gov (United States)

    Siregar, Nurfadilah; Kusumah, Yaya S.; Sabandar, J.; Dahlan, J. A.

    2017-09-01

    The aims of this paper are to describe the use of the MCREST strategy in learning algebra and to obtain empirical evidence on its effect, especially on reasoning ability. Eighth-grade students at one school in Cimahi City were chosen as the sample of this study. Using a pre-test and post-test control group design, the data were then analyzed with descriptive and inferential statistics. The results show that students taught with the MCREST strategy achieved better results on the reasoning-ability test than students who received direct instruction. This means the MCREST strategy has a positive impact on learning algebra.

  3. Adaptive clinical trial designs with pre-specified rules for modifying the sample size: understanding efficient types of adaptation.

    Science.gov (United States)

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2013-04-15

    Adaptive clinical trial design has been proposed as a promising new approach that may improve the drug discovery process. Proponents of adaptive sample size re-estimation promote its ability to avoid 'up-front' commitment of resources, better address the complicated decisions faced by data monitoring committees, and minimize accrual to studies having delayed ascertainment of outcomes. We investigate aspects of adaptation rules, such as timing of the adaptation analysis and magnitude of sample size adjustment, that lead to greater or lesser statistical efficiency. Owing in part to the recent Food and Drug Administration guidance that promotes the use of pre-specified sampling plans, we evaluate alternative approaches in the context of well-defined, pre-specified adaptation. We quantify the relative costs and benefits of fixed sample, group sequential, and pre-specified adaptive designs with respect to standard operating characteristics such as type I error, maximal sample size, power, and expected sample size under a range of alternatives. Our results build on others' prior research by demonstrating in realistic settings that simple and easily implemented pre-specified adaptive designs provide only very small efficiency gains over group sequential designs with the same number of analyses. In addition, we describe optimal rules for modifying the sample size, providing efficient adaptation boundaries on a variety of scales for the interim test statistic for adaptation analyses occurring at several different stages of the trial. We thus provide insight into what are good and bad choices of adaptive sampling plans when the added flexibility of adaptive designs is desired. Copyright © 2012 John Wiley & Sons, Ltd.
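A toy Monte Carlo makes the trade-offs concrete. The rule below (a one-sample z-test with known unit variance, re-estimating the final n at the interim to target 90% power at the estimated effect, within pre-specified bounds) is an invented illustration; unlike the designs studied in the paper, it makes no adjustment to preserve the type I error rate across the adaptation.

```python
import random
from statistics import NormalDist

Z = NormalDist()

def simulate_adaptive(delta, n1=50, n_min=100, n_max=200,
                      alpha=0.025, reps=2000, seed=7):
    """Empirical power and expected sample size of a toy pre-specified
    sample size re-estimation rule, under true effect size `delta`."""
    rng = random.Random(seed)
    crit = Z.inv_cdf(1 - alpha)
    rejections, total_n = 0, 0
    for _ in range(reps):
        interim = [rng.gauss(delta, 1) for _ in range(n1)]
        est = max(sum(interim) / n1, 0.05)   # interim effect estimate, floored
        # choose final n so that power is roughly 0.9 at the estimated effect
        n = min(max(int(((crit + Z.inv_cdf(0.9)) / est) ** 2), n_min), n_max)
        data = interim + [rng.gauss(delta, 1) for _ in range(n - n1)]
        zstat = sum(data) / n ** 0.5         # one-sample z statistic
        rejections += zstat > crit
        total_n += n
    return rejections / reps, total_n / reps

power, avg_n = simulate_adaptive(delta=0.4)
```

Comparing `avg_n` against a fixed design's n (and repeating under `delta=0`) quantifies the efficiency gain and the type I error inflation that pre-specified boundary adjustments must absorb.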

  4. Design strategy for control of inherently safe reactors

    International Nuclear Information System (INIS)

    Chisholm, G.H.

    1984-01-01

    Reactor power plant safety is assured through a combination of engineered barriers to radiation release (e.g., reactor containment) and active reactor safety systems that shut the reactor down and remove decay heat. While not specifically identified as safety systems, the control systems responsible for continuous operation of plant subsystems are the first line of defense for mitigating radiation releases and for plant protection. Inherently safe reactors take advantage of passive system features for the decay-heat removal and reactor shutdown functions normally ascribed to active reactor safety systems. The advent of these reactors may permit restructuring of the present control system design strategy. This restructuring is based on the fact that authority for protection against unlikely accidents is placed, as much as practical, upon the passive features of the system instead of the traditional placement upon the plant protection system (PPS). Consequently, reactor control may be simplified, allowing the reliability of control systems to be improved and more easily defended.

  5. Development the conceptual design of Knowledge Based System for Integrated Maintenance Strategy and Operation

    Science.gov (United States)

    Milana; Khan, M. K.; Munive, J. E.

    2014-07-01

    The importance of maintenance has escalated significantly with the increasing automation of manufacturing processes. This shift turns the traditional view of maintenance as an inevitable cost into a driver of business competitiveness. Consequently, maintenance strategy and operation decisions need to be synchronized with business and manufacturing concerns. This paper shows the development of a conceptual design of a Knowledge Based System for Integrated Maintenance Strategy and Operation (KBIMSO). The framework of KBIMSO is elaborated to show how the KBIMSO works to reach maintenance decisions. To address the multi-criteria nature of maintenance decision making, the KB system is embedded with GAP and AHP to support integrated maintenance strategy and operation, which is novel in this area. The KBIMSO is useful for reviewing the existing maintenance system and giving reasonable recommendations for maintenance decisions with respect to the business and manufacturing perspective.
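AHP's core computation, turning a pairwise-comparison matrix into priority weights, can be sketched with the row geometric-mean method, a common approximation to the principal eigenvector. The criteria and judgment values below are invented; the paper's GAP component is not shown.

```python
from math import prod

def ahp_priorities(M):
    """Priority weights from a pairwise-comparison matrix:
    row geometric means, normalized to sum to 1."""
    n = len(M)
    gms = [prod(row) ** (1.0 / n) for row in M]
    total = sum(gms)
    return [g / total for g in gms]

# Invented judgments for three maintenance criteria (Saaty's 1-9 scale):
# row i, column j answers "how much more important is criterion i than j?"
M = [
    [1.0, 1/3, 1/5],   # cost     vs (cost, downtime, safety)
    [3.0, 1.0, 1/2],   # downtime vs ...
    [5.0, 2.0, 1.0],   # safety   vs ...
]
weights = ahp_priorities(M)
```

A full AHP application would also check the consistency ratio of `M` before trusting the weights.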

  6. Development the conceptual design of Knowledge Based System for Integrated Maintenance Strategy and Operation

    International Nuclear Information System (INIS)

    Milana; Khan, M K; Munive, J E

    2014-01-01

    The importance of maintenance has escalated significantly with the increasing automation of manufacturing processes. This shift turns the traditional view of maintenance as an inevitable cost into a driver of business competitiveness. Consequently, maintenance strategy and operation decisions need to be synchronized with business and manufacturing concerns. This paper shows the development of a conceptual design of a Knowledge Based System for Integrated Maintenance Strategy and Operation (KBIMSO). The framework of KBIMSO is elaborated to show how the KBIMSO works to reach maintenance decisions. To address the multi-criteria nature of maintenance decision making, the KB system is embedded with GAP and AHP to support integrated maintenance strategy and operation, which is novel in this area. The KBIMSO is useful for reviewing the existing maintenance system and giving reasonable recommendations for maintenance decisions with respect to the business and manufacturing perspective.

  7. Strategies for designing novel functional meat products.

    Science.gov (United States)

    Arihara, Keizo

    2006-09-01

    In recent years, much attention has been paid to the physiological functions of foods, due to increasing concerns about health. Although information on the physiological functions of meat was limited until recently, several attractive meat-based bioactive compounds, such as carnosine, anserine, L-carnitine, and conjugated linoleic acid, have been studied. Emphasizing these activities is one possible approach for improving the health image of meat and developing functional meat products. This article outlines the potential benefits of representative meat-based bioactive compounds for human health and gives an overview of meat-based functional products. Strategies for designing novel functional meat products utilizing bioactive peptides and/or probiotic bacteria are also discussed. This article focuses particularly on the possibility of meat protein-derived bioactive peptides, such as antihypertensive peptides. There are still some hurdles in developing and marketing novel functional meat products, since such products are unconventional and consumers in many countries perceive meat and meat products to be bad for health. Along with the accumulation of scientific data, there is an urgent need to inform consumers of the exact functional value of meat and meat products, including novel functional foods.

  8. Radial line-scans as representative sampling strategy in dried-droplet laser ablation of liquid samples deposited on pre-cut filter paper disks

    Energy Technology Data Exchange (ETDEWEB)

    Nischkauer, Winfried [Institute of Chemical Technologies and Analytics, Vienna University of Technology, Vienna (Austria); Department of Analytical Chemistry, Ghent University, Ghent (Belgium); Vanhaecke, Frank [Department of Analytical Chemistry, Ghent University, Ghent (Belgium); Bernacchi, Sébastien; Herwig, Christoph [Institute of Chemical Engineering, Vienna University of Technology, Vienna (Austria); Limbeck, Andreas, E-mail: Andreas.Limbeck@tuwien.ac.at [Institute of Chemical Technologies and Analytics, Vienna University of Technology, Vienna (Austria)

    2014-11-01

    Nebulising liquid samples and using the aerosol thus obtained for further analysis is the standard method in many current analytical techniques, also with inductively coupled plasma (ICP)-based devices. With such a set-up, quantification via external calibration is usually straightforward for samples with aqueous or close-to-aqueous matrix composition. However, there is a variety of more complex samples. Such samples can be found in medical, biological, technological and industrial contexts and can range from body fluids, like blood or urine, to fuel additives or fermentation broths. Specialized nebulizer systems or careful digestion and dilution are required to tackle such demanding sample matrices. One alternative approach is to convert the liquid into a dried solid and to use laser ablation for sample introduction. Up to now, this approach required the application of internal standards or matrix-adjusted calibration due to matrix effects. In this contribution, we show a way to circumvent these matrix effects while using simple external calibration for quantification. The principle of representative sampling that we propose uses radial line-scans across the dried residue. This compensates for the centro-symmetric inhomogeneities typically observed in dried spots. The effectiveness of the proposed sampling strategy is exemplified via the determination of phosphorus in biochemical fermentation media, although we postulate that the presented measurement protocol is universally viable. Detection limits using laser ablation-ICP-optical emission spectrometry were on the order of 40 μg mL⁻¹, with a reproducibility of 10 % relative standard deviation (n = 4, concentration = 10 times the quantification limit). The reported sensitivity is fit-for-purpose in the biochemical context described here, but could be improved using ICP-mass spectrometry, should future analytical tasks require it. Trueness of the proposed method was investigated by cross-validation with

  9. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Directory of Open Access Journals (Sweden)

    Antonio Di Franco

    Full Text Available Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within

  10. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Science.gov (United States)

    Di Franco, Antonio; Bulleri, Fabio; Pennetta, Antonio; De Benedetto, Giuseppe; Clarke, K Robert; Guidetti, Paolo

    2014-01-01

    Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within-otolith replication in the

  11. OSIRIS-REx Touch-and-Go (TAG) Mission Design for Asteroid Sample Collection

    Science.gov (United States)

    May, Alexander; Sutter, Brian; Linn, Timothy; Bierhaus, Beau; Berry, Kevin; Mink, Ron

    2014-01-01

    The Origins Spectral Interpretation Resource Identification Security Regolith Explorer (OSIRIS-REx) mission is a NASA New Frontiers mission launching in September 2016 to rendezvous with the near-Earth asteroid Bennu in October 2018. After several months of proximity operations to characterize the asteroid, OSIRIS-REx flies a Touch-And-Go (TAG) trajectory to the asteroid's surface to collect at least 60 g of pristine regolith sample for Earth return. This paper provides mission and flight system overviews, with more details on the TAG mission design and key events that occur to safely and successfully collect the sample. An overview of the navigation performed relative to a chosen sample site, along with the maneuvers to reach the desired site is described. Safety monitoring during descent is performed with onboard sensors providing an option to abort, troubleshoot, and try again if necessary. Sample collection occurs using a collection device at the end of an articulating robotic arm during a brief five second contact period, while a constant force spring mechanism in the arm assists to rebound the spacecraft away from the surface. Finally, the sample is measured quantitatively utilizing the law of conservation of angular momentum, along with qualitative data from imagery of the sampling device. Upon sample mass verification, the arm places the sample into the Stardust-heritage Sample Return Capsule (SRC) for return to Earth in September 2023.
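
The quantitative mass measurement via conservation of angular momentum can be sketched in a few lines. The point-mass-on-an-arm model and all numbers below are illustrative assumptions, not OSIRIS-REx flight parameters.

```python
def sample_mass_from_spin(inertia_before, omega_before, omega_after, r_arm):
    """Estimate collected sample mass from a spin maneuver.

    Toy model: the spacecraft's angular momentum L = I * omega is assumed
    unchanged between the pre- and post-sampling spin measurements, so a
    sample of mass m held at radius r_arm from the spin axis slows the spin:
        inertia_before * omega_before = (inertia_before + m * r_arm**2) * omega_after
    """
    inertia_after = inertia_before * omega_before / omega_after
    return (inertia_after - inertia_before) / r_arm**2

# Illustrative values: 3000 kg*m^2 spacecraft inertia, sampler head 3 m
# from the spin axis, spin slowing from 0.1000 to 0.0999 rad/s.
m_kg = sample_mass_from_spin(3000.0, 0.1000, 0.0999, 3.0)
```

With these invented inputs the estimate comes out around a third of a kilogram, well above the 60 g requirement; the flight measurement would additionally have to contend with effects this toy model ignores, such as arm flexibility and propellant slosh.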

  12. Strategies to address participant misrepresentation for eligibility in Web-based research.

    Science.gov (United States)

    Kramer, Jessica; Rubin, Amy; Coster, Wendy; Helmuth, Eric; Hermos, John; Rosenbloom, David; Moed, Rich; Dooley, Meghan; Kao, Ying-Chia; Liljenquist, Kendra; Brief, Deborah; Enggasser, Justin; Keane, Terence; Roy, Monica; Lachowicz, Mark

    2014-03-01

    Emerging methodological research suggests that the World Wide Web ("Web") is an appropriate venue for survey data collection, and a promising area for delivering behavioral intervention. However, the use of the Web for research raises concerns regarding sample validity, particularly when the Web is used for recruitment and enrollment. The purpose of this paper is to describe the challenges experienced in two different Web-based studies in which participant misrepresentation threatened sample validity: a survey study and an online intervention study. The lessons learned from these experiences generated three types of strategies researchers can use to reduce the likelihood of participant misrepresentation for eligibility in Web-based research. Examples of procedural/design strategies, technical/software strategies and data analytic strategies are provided along with the methodological strengths and limitations of specific strategies. The discussion includes a series of considerations to guide researchers in the selection of strategies that may be most appropriate given the aims, resources and target population of their studies. Copyright © 2014 John Wiley & Sons, Ltd.

  13. Robotic Irradiated Sample Handling Concept Design in Reactor TRIGA PUSPATI using Simulation Software

    International Nuclear Information System (INIS)

    Mohd Khairulezwan Abdul Manan; Mohd Sabri Minhat; Ridzuan Abdul Mutalib; Zareen Khan Abdul Jalil Khan; Nurfarhana Ayuni Joha

    2015-01-01

    This paper introduces the concept design of a Robotic Irradiated Sample Handling Machine for Reactor TRIGA PUSPATI, developed using Webots, a graphical simulation application designed as a general, flexible and open platform for work in robotics. Webots has proven to be a useful tool in many fields of robotics, such as manipulator programming, mobile robot control (wheeled, sub-aquatic and walking robots), distance computation, sensor simulation, collision detection, motion planning and so on. Webots is used as the common interface for all the applications. Some practical cases and applications for this concept design are illustrated in the paper to present the possibilities of this simulation software. (author)

  14. Actual distribution of Cronobacter spp. in industrial batches of powdered infant formula and consequences for performance of sampling strategies.

    Science.gov (United States)

    Jongenburger, I; Reij, M W; Boer, E P J; Gorris, L G M; Zwietering, M H

    2011-11-15

    The actual spatial distribution of microorganisms within a batch of food influences the results of sampling for microbiological testing when this distribution is non-homogeneous. In the case of pathogens being non-homogeneously distributed, it markedly influences public health risk. This study investigated the spatial distribution of Cronobacter spp. in powdered infant formula (PIF) on industrial batch-scale for both a recalled batch as well as a reference batch. Additionally, local spatial occurrence of clusters of Cronobacter cells was assessed, as well as the performance of typical sampling strategies to determine the presence of the microorganisms. The concentration of Cronobacter spp. was assessed in the course of the filling time of each batch, by taking samples of 333 g using the most probable number (MPN) enrichment technique. The occurrence of clusters of Cronobacter spp. cells was investigated by plate counting. From the recalled batch, 415 MPN samples were drawn. The expected heterogeneous distribution of Cronobacter spp. could be quantified from these samples, which showed no detectable level (detection limit of -2.52 log CFU/g) in 58% of samples, whilst in the remainder concentrations were found to be between -2.52 and 2.75 log CFU/g. The estimated average concentration in the recalled batch was -2.78 log CFU/g, with a standard deviation of 1.10 log CFU/g. The estimated average concentration in the reference batch was -4.41 log CFU/g, with 99% of the 93 samples being below the detection limit. In the recalled batch, clusters of cells occurred sporadically in 8 out of 2290 samples of 1 g taken. The two largest clusters contained 123 (2.09 log CFU/g) and 560 (2.75 log CFU/g) cells. Various sampling strategies were evaluated for the recalled batch. Taking more and smaller samples while keeping the total sampling weight constant considerably improved the performance of the sampling plans to detect such a type of contaminated batch. Compared to random sampling
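
The advantage of taking more, smaller samples at constant total weight can be reproduced with a toy contamination model: a fraction of the batch is contaminated at a fixed concentration with Poisson-distributed cell counts, while the rest is sterile. The parameters are invented for illustration and are not the paper's fitted values.

```python
import math

def detect_probability(n_samples, grams_each, frac_contaminated, cfu_per_g):
    """P(at least one of n independent samples contains >= 1 cell)."""
    # A sample detects if it lands in the contaminated fraction AND
    # draws at least one cell (Poisson with mean cfu_per_g * grams_each).
    p_hit = frac_contaminated * (1.0 - math.exp(-cfu_per_g * grams_each))
    return 1.0 - (1.0 - p_hit) ** n_samples

# Same total sampling weight of 300 g, two plans:
one_large = detect_probability(1, 300.0, 0.1, 0.5)    # 1 sample of 300 g
many_small = detect_probability(30, 10.0, 0.1, 0.5)   # 30 samples of 10 g
```

Under these assumed parameters the single 300 g sample detects with probability of only about 0.10 (it misses whenever it lands in the sterile 90 % of the batch), whereas thirty 10 g samples detect with probability above 0.95.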

  15. Designing Programme Implementation Strategies to Increase the Adoption and Use of Biosand Water Filters in Rural India

    Directory of Open Access Journals (Sweden)

    Tommy K.K. Ngai

    2014-06-01

    Full Text Available Low-cost household water treatment systems are innovations designed to improve the quality of drinking water at the point of use. This study investigates how an NGO can design appropriate programme strategies in order to increase the adoption and sustained use of household sand filters in rural India. A system dynamics computer model was developed and used to assess 18 potential programme strategies for their effectiveness in increasing filter use at two and ten years into the future, under seven scenarios of how the external context may plausibly evolve. The results showed that the optimal choice of strategy is influenced by the macroeconomic situation, donor funding, presence of alternative options, and the evaluation time frame. The analysis also revealed some key programme management challenges, including the trade-off between optimising short- or long-term gains, and counter-intuitive results, such as higher subsidy fund allocation leading to fewer filters being distributed, and technology advances leading to fewer sales. This study outlines how an NGO can choose effective strategies in consideration of complex system interactions, and demonstrates that small NGOs can dramatically increase their programme outcomes without necessarily increasing their operational budget.

  16. Development strategy and process models for phased automation of design and digital manufacturing electronics

    Science.gov (United States)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

    A strategy for assuring the quality of electronics is of primary importance. To provide quality, the sequence of processes is considered and modelled as a Markov chain. The improvement is achieved with simple database-supported design-for-manufacturing means, allowing future step-by-step development. A phased automation of the design and digital manufacturing of electronics is proposed. MATLAB modelling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product throughout the sequence of processes, up to the whole life cycle.
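
As a minimal illustration of modelling a process sequence with a Markov chain, consider an absorbing chain in which each stage loops back to itself (rework) until it passes. The first-pass yields below are assumptions chosen for the example, not figures from the paper.

```python
def expected_stage_executions(first_pass_yields):
    """For a chain where stage i repeats with probability 1 - p_i and
    advances with probability p_i, the run count per stage is geometric
    with expectation 1 / p_i."""
    return [1.0 / p for p in first_pass_yields]

# Hypothetical design -> manufacturing -> test yields of 95 %, 90 %, 80 %:
runs = expected_stage_executions([0.95, 0.90, 0.80])
total_runs = sum(runs)  # expected stage executions per finished product
```

Even this minimal chain quantifies where rework accumulates (here, the 80 %-yield test stage), which is the kind of question a fuller process-sequence model answers.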

  17. More than friendship is required : an empirical test of cooperative firm strategies

    OpenAIRE

    Pesämaa, Ossi; Hair Jr, Joseph F

    2006-01-01

    Purpose - The purpose of this paper is to examine a proposed six-construct theoretical model of factors influencing successful cooperative relationships and strategy development. Design/methodology/approach - A theoretical model of strategy development and cooperative relationships was tested. Qualitative research among key experts identified 15 successful regional tourism networks. Two successful cooperative networks were selected based on annual revenues. A sample of 254 small and mediu...

  18. PAT Design Strategy for Energy Recovery in Water Distribution Networks by Electrical Regulation

    Directory of Open Access Journals (Sweden)

    Helena M. Ramos

    2013-01-01

    Full Text Available In the management of water distribution networks, large energy savings can be yielded by exploiting the head drop due to the network pressure control strategy, i.e., for leak reduction. Hydropower in small streams is already exploited, but technical solutions combining efficiency and economic convenience are still required. In water distribution networks, an additional design problem arises from the necessity of ensuring a required head drop under variable operating conditions, i.e., head and discharge variations. Both a hydraulic regulation (HR, via a series-parallel hydraulic circuit) and an electrical regulation (ER, via inverter) are feasible solutions. A design procedure for the selection of a production device in a series-parallel hydraulic circuit has been recently proposed. The procedure, named VOS (Variable Operating Strategy), is based on overall plant efficiency criteria and is applied to a water distribution network where a PAT (pump as a turbine) is used in order to produce energy. In the present paper the VOS design procedure has been extended to the electrical regulation, and a comparison of HR and ER efficiency and flexibility within a water distribution network is shown: HR was found to be more flexible and more efficient than ER. Finally, a preliminary economic study has been carried out in order to show the viability of both systems, and a shorter payback period of the electromechanical equipment was found for HR mode.

  19. Evaluation of single and two-stage adaptive sampling designs for estimation of density and abundance of freshwater mussels in a large river

    Science.gov (United States)

    Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.

    2011-01-01

    Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations, which varied in density and degree of spatial clustering. Because of logistics and costs of large river sampling and spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate that occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on the sampling designs for the freshwater mussels in the UMR, and presumably other large rivers.
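
The dependence of estimator precision on density and spatial clustering noted above can be illustrated in closed form: with simple random quadrat sampling, Poisson counts stand in for a spatially random population and negative-binomial counts for a clustered one. This is a sketch under those textbook assumptions, not the paper's simulation protocol.

```python
import math

def cv_of_mean_density(mu, n_quadrats, dispersion_k=None):
    """CV of the sample-mean density from n simple-random quadrats.
    dispersion_k=None -> Poisson counts (variance = mu);
    finite k          -> negative binomial (variance = mu + mu**2 / k),
    i.e. stronger clustering as k shrinks."""
    var = mu if dispersion_k is None else mu + mu * mu / dispersion_k
    return math.sqrt(var / n_quadrats) / mu

random_cv = cv_of_mean_density(2.0, 50)                       # spatially random
clustered_cv = cv_of_mean_density(2.0, 50, dispersion_k=0.5)  # clustered
```

At equal density and sample size, the clustered population here yields a CV more than twice that of the random one, consistent with the finding that precision was determined largely by density, spatial distribution and sample size.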

  20. Metal Organic Frameworks: Explorations and Design Strategies for MOF Synthesis

    KAUST Repository

    AbdulHalim, Rasha

    2016-11-27

    Metal-Organic Frameworks (MOFs) represent an emerging class of functional crystalline solid-state materials. In the early discovery of this now rapidly growing class of materials, significant challenges were often encountered. However, MOFs today, with their vast structural modularity, reflected by the huge library of available chemical building blocks, and their exceptional, controllable porosity, stand as the most promising candidates to address many of the overbearing societal challenges pertaining to energy and environmental sustainability. A variety of design strategies have been enumerated in the literature which rely on the use of predesigned building blocks, paving the way towards potentially more predictable structures. The two major design strategies presented in this work are the molecular building block (MBB) and supermolecular building block (SBB) based approaches for the rational assembly of functional MOF materials with the desired structural features. In this context, we targeted two highly connected MOF platforms, namely rht-MOF and shp-MOF. These two MOF platforms are classified based on their topology, defined as the underlying connectivity of their respective nets, as edge-transitive binodal nets; shp being a (4,12)-connected net and rht being a (3,24)-connected net. These highly connected nets were deliberately targeted due to the limited number of possible nets for connecting their associated basic building units. Two highly porous materials were designed and successfully constructed, namely Y-shp-MOF-5 and rht-MOF-10. Y-shp-MOF-5 features phenomenal water stability and exquisite behavior when exposed to water, positioning this microporous material as the best adsorbent for moisture control applications. The shp-MOF platform proved to be modular with respect to ligand functionalization, imparting significant behavioral changes when hydrophilic and hydrophobic functionalized ligands were introduced on the resultant MOF. On the other hand, rht

  1. Establishing Design Strategies and an Assessment Tool of Home Appliances to Promote Sustainable Behavior for the New Poor

    Directory of Open Access Journals (Sweden)

    Jui-Che Tu

    2018-05-01

    Full Text Available Environmental benefits related to home appliance life cycles depend on how these products are used. Designing home appliances that promote sustainable behavior is an effective way to reduce environmental impacts. This study aimed to increase opportunities for promoting sustainable behavior practices among the new poor through home appliances, a topic rarely discussed in the fields of design for sustainable behavior (DfSB) and product design. In particular, relevant assessment tools or indicators are lacking in DfSB, and people’s use of home appliances is generally unsustainable. Therefore, repertory grid technology was used to understand the perceptions of the new poor, develop an assessment tool, and construct design strategies for home appliances that promote sustainable behavior. Data were collected from the new poor and from designers. Through cluster and principal component analyses, three strategy types were proposed that corresponded to different product features, suggestions, and guidance. In addition, the effectiveness and potential of an assessment tool were demonstrated using the Wilcoxon rank test. The findings could be used by designers, retailers, and green marketers to propose effective product design programs that promote sustainable behavior of the new poor during product use.

  2. Design and characterization of poly(dimethylsiloxane)-based valves for interfacing continuous-flow sampling to microchip electrophoresis.

    Science.gov (United States)

    Li, Michelle W; Huynh, Bryan H; Hulvey, Matthew K; Lunte, Susan M; Martin, R Scott

    2006-02-15

    This work describes the fabrication and evaluation of a poly(dimethylsiloxane) (PDMS)-based device that enables the discrete injection of a sample plug from a continuous-flow stream into a microchannel for subsequent analysis by electrophoresis. Devices were fabricated by aligning valving and flow channel layers followed by plasma sealing the combined layers onto a glass plate that contained fittings for the introduction of liquid sample and nitrogen gas. The design incorporates a reduced-volume pneumatic valve that actuates (on the order of hundreds of milliseconds) to allow analyte from a continuously flowing sampling channel to be injected into a separation channel for electrophoresis. The injector design was optimized to include a pushback channel to flush away stagnant sample associated with the injector dead volume. The effect of the valve actuation time, the pushback voltage, and the sampling stream flow rate on the performance of the device was characterized. Using the optimized design and an injection frequency of 0.64 Hz showed that the injection process is reproducible (RSD of 1.77%, n = 15). Concentration change experiments using fluorescein as the analyte showed that the device could achieve a lag time as small as 14 s. Finally, to demonstrate the potential uses of this device, the microchip was coupled to a microdialysis probe to monitor a concentration change and sample a fluorescein dye mixture.

  3. Sampling design for the Study of Cardiovascular Risks in Adolescents (ERICA

    Directory of Open Access Journals (Sweden)

    Mauricio Teixeira Leite de Vasconcellos

    2015-05-01

    Full Text Available The Study of Cardiovascular Risk in Adolescents (ERICA) aims to estimate the prevalence of cardiovascular risk factors and metabolic syndrome in adolescents (12-17 years) enrolled in public and private schools of the 273 municipalities with over 100,000 inhabitants in Brazil. The study population was stratified into 32 geographical strata (27 capitals and five sets with other municipalities in each macro-region of the country) and a sample of 1,251 schools was selected with probability proportional to size. In each school three combinations of shift (morning and afternoon) and grade were selected, and within each of these combinations, one class was selected. All eligible students in the selected classes were included in the study. The design sampling weights were calculated by the product of the reciprocals of the inclusion probabilities in each sampling stage, and were later calibrated considering the projections of the numbers of adolescents enrolled in schools located in the geographical strata by sex and age.
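
The weight construction described (product of the reciprocals of the stage-wise inclusion probabilities, then calibration to enrolment projections) can be sketched as follows. The inclusion probabilities and the stratum total are hypothetical.

```python
def design_weight(p_school, p_class, p_student=1.0):
    """Basic weight: product of the reciprocals of the inclusion
    probabilities at each sampling stage."""
    return 1.0 / (p_school * p_class * p_student)

def calibrate(weights, known_total):
    """Simplest calibration (post-stratification): rescale the weights
    in a stratum so they sum to the known number of enrolled students."""
    factor = known_total / sum(weights)
    return [w * factor for w in weights]

# Two sampled students in one hypothetical stratum; all eligible students
# in a selected class are included, so p_student = 1.
w = [design_weight(0.05, 0.5), design_weight(0.10, 0.25)]
w_cal = calibrate(w, 1000.0)  # stratum known to enrol 1,000 adolescents
```

Full calibration in such studies is usually done per sex-and-age cell rather than with a single stratum total, but the rescaling logic is the same.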

  4. Designing an implementation strategy to improve interprofessional shared decision making in sciatica: study protocol of the DISC study

    Directory of Open Access Journals (Sweden)

    Hofstede Stefanie N

    2012-06-01

    Full Text Available Abstract Background Sciatica is a common condition worldwide that is characterized by radiating leg pain and regularly caused by a herniated disc with nerve root compression. Sciatica patients with persisting leg pain after six to eight weeks were found to have similar clinical outcomes and associated costs after prolonged conservative treatment or surgery at one year follow-up. Guidelines recommend that the team of professionals involved in sciatica care and patients jointly decide about treatment options, so-called interprofessional shared decision making (SDM). However, there are strong indications that SDM for sciatica patients is not integrated in daily practice. We designed a study aiming to explore the barriers and facilitators associated with the everyday embedding of SDM for sciatica patients. All related relevant professionals and patients are involved to develop a tailored strategy to implement SDM for sciatica patients. Methods The study consists of two phases: identification of barriers and facilitators and development of an implementation strategy. First, barriers and facilitators are explored using semi-structured interviews among eight professionals of each (para)medical discipline involved in sciatica care (general practitioners, physical therapists, neurologists, neurosurgeons, and orthopedic surgeons). In addition, three focus groups will be conducted among patients. Second, the identified barriers and facilitators will be ranked using a questionnaire among a representative Dutch sample of 200 GPs, 200 physical therapists, 200 neurologists, all 124 neurosurgeons, 200 orthopedic surgeons, and 100 patients. A tailored team-based implementation strategy will be developed based on the results of the first phase using the principles of intervention mapping and an expert panel. Discussion Little is known about effective strategies to increase the uptake of SDM. Most implementation strategies only target a single discipline, whereas

  5. Designing an implementation strategy to improve interprofessional shared decision making in sciatica: study protocol of the DISC study.

    Science.gov (United States)

    Hofstede, Stefanie N; Marang-van de Mheen, Perla J; Assendelft, Willem J J; Vleggeert-Lankamp, Carmen L A; Stiggelbout, Anne M; Vroomen, Patrick C A J; van den Hout, Wilbert B; Vliet Vlieland, Thea P M; van Bodegom-Vos, Leti

    2012-06-15

    Sciatica is a common condition worldwide that is characterized by radiating leg pain and regularly caused by a herniated disc with nerve root compression. Sciatica patients with persisting leg pain after six to eight weeks were found to have similar clinical outcomes and associated costs after prolonged conservative treatment or surgery at one year follow-up. Guidelines recommend that the team of professionals involved in sciatica care and patients jointly decide about treatment options, so-called interprofessional shared decision making (SDM). However, there are strong indications that SDM for sciatica patients is not integrated in daily practice. We designed a study aiming to explore the barriers and facilitators associated with the everyday embedding of SDM for sciatica patients. All related relevant professionals and patients are involved to develop a tailored strategy to implement SDM for sciatica patients. The study consists of two phases: identification of barriers and facilitators and development of an implementation strategy. First, barriers and facilitators are explored using semi-structured interviews among eight professionals of each (para)medical discipline involved in sciatica care (general practitioners, physical therapists, neurologists, neurosurgeons, and orthopedic surgeons). In addition, three focus groups will be conducted among patients. Second, the identified barriers and facilitators will be ranked using a questionnaire among a representative Dutch sample of 200 GPs, 200 physical therapists, 200 neurologists, all 124 neurosurgeons, 200 orthopedic surgeons, and 100 patients. A tailored team-based implementation strategy will be developed based on the results of the first phase using the principles of intervention mapping and an expert panel. Little is known about effective strategies to increase the uptake of SDM. Most implementation strategies only target a single discipline, whereas multiple disciplines are involved in SDM among sciatica

  6. Radial line-scans as representative sampling strategy in dried-droplet laser ablation of liquid samples deposited on pre-cut filter paper disks.

    Science.gov (United States)

    Nischkauer, Winfried; Vanhaecke, Frank; Bernacchi, Sébastien; Herwig, Christoph; Limbeck, Andreas

    2014-11-01

    Nebulising liquid samples and using the aerosol thus obtained for further analysis is the standard method in many current analytical techniques, also with inductively coupled plasma (ICP)-based devices. With such a set-up, quantification via external calibration is usually straightforward for samples with aqueous or close-to-aqueous matrix composition. However, there is a variety of more complex samples. Such samples can be found in medical, biological, technological and industrial contexts and can range from body fluids, like blood or urine, to fuel additives or fermentation broths. Specialized nebulizer systems or careful digestion and dilution are required to tackle such demanding sample matrices. One alternative approach is to convert the liquid into a dried solid and to use laser ablation for sample introduction. Up to now, this approach required the application of internal standards or matrix-adjusted calibration due to matrix effects. In this contribution, we show a way to circumvent these matrix effects while using simple external calibration for quantification. The principle of representative sampling that we propose uses radial line-scans across the dried residue. This compensates for centro-symmetric inhomogeneities typically observed in dried spots. The effectiveness of the proposed sampling strategy is exemplified via the determination of phosphorus in biochemical fermentation media. However, the universal viability of the presented measurement protocol is postulated. Detection limits using laser ablation-ICP-optical emission spectrometry were in the order of 40 μg mL⁻¹ with a reproducibility of 10 % relative standard deviation (n = 4, concentration = 10 times the quantification limit). The reported sensitivity is fit-for-purpose in the biochemical context described here, but could be improved using ICP-mass spectrometry, if future analytical tasks would require it. Trueness of the proposed method was investigated by cross-validation with

  7. Radial line-scans as representative sampling strategy in dried-droplet laser ablation of liquid samples deposited on pre-cut filter paper disks

    Science.gov (United States)

    Nischkauer, Winfried; Vanhaecke, Frank; Bernacchi, Sébastien; Herwig, Christoph; Limbeck, Andreas

    2014-11-01

    Nebulising liquid samples and using the aerosol thus obtained for further analysis is the standard method in many current analytical techniques, also with inductively coupled plasma (ICP)-based devices. With such a set-up, quantification via external calibration is usually straightforward for samples with aqueous or close-to-aqueous matrix composition. However, there is a variety of more complex samples. Such samples can be found in medical, biological, technological and industrial contexts and can range from body fluids, like blood or urine, to fuel additives or fermentation broths. Specialized nebulizer systems or careful digestion and dilution are required to tackle such demanding sample matrices. One alternative approach is to convert the liquid into a dried solid and to use laser ablation for sample introduction. Up to now, this approach required the application of internal standards or matrix-adjusted calibration due to matrix effects. In this contribution, we show a way to circumvent these matrix effects while using simple external calibration for quantification. The principle of representative sampling that we propose uses radial line-scans across the dried residue. This compensates for centro-symmetric inhomogeneities typically observed in dried spots. The effectiveness of the proposed sampling strategy is exemplified via the determination of phosphorus in biochemical fermentation media. However, the universal viability of the presented measurement protocol is postulated. Detection limits using laser ablation-ICP-optical emission spectrometry were in the order of 40 μg mL⁻¹ with a reproducibility of 10 % relative standard deviation (n = 4, concentration = 10 times the quantification limit). The reported sensitivity is fit-for-purpose in the biochemical context described here, but could be improved using ICP-mass spectrometry, if future analytical tasks would require it. Trueness of the proposed method was investigated by cross-validation with

  8. A design-based approximation to the Bayes Information Criterion in finite population sampling

    Directory of Open Access Journals (Sweden)

    Enrico Fabrizi

    2014-05-01

    In this article, various issues related to the implementation of the usual Bayesian Information Criterion (BIC) are critically examined in the context of modelling a finite population. A suitable design-based approximation to the BIC is proposed in order to avoid deriving the exact likelihood of the sample, which is often very complex under finite population sampling. The approximation is justified using a theoretical argument and a Monte Carlo simulation study.
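
The flavour of a design-based BIC can be sketched as follows. This is an illustration only, not Fabrizi's exact approximation: the weighting scheme and the use of the Kish effective sample size in the penalty are assumptions made for demonstration.

```python
import numpy as np

# Survey-weighted data; the population is truly linear in x.
rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 1, n)
w = rng.uniform(0.5, 2.0, n)               # hypothetical sampling weights
y = 1.0 + 2.0 * x + rng.normal(0, 0.3, n)

def weighted_bic(X, y, w):
    """Gaussian pseudo-BIC from weighted least squares, penalised by an
    effective (not nominal) sample size."""
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    resid = y - X @ beta
    sigma2 = np.sum(w * resid ** 2) / np.sum(w)
    n_eff = np.sum(w) ** 2 / np.sum(w ** 2)  # Kish effective sample size
    loglik = -0.5 * n_eff * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + (X.shape[1] + 1) * np.log(n_eff)

X_const = np.ones((n, 1))                  # intercept-only model
X_lin = np.column_stack([np.ones(n), x])   # true linear model
print(weighted_bic(X_lin, y, w) < weighted_bic(X_const, y, w))  # True
```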

  9. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel multi-body dynamics simulation tool for studying TAG sampling. The G-TAG multi-body simulation tool provides an environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact it with a sampling collection device, and then ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical and potentially high-risk. Consequently, the TAG event must be well characterized and studied by simulation and analysis for proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort and specifically addresses comet sample-return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample-collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models were developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques are more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, location, etc., of spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal.

  10. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts, due to Gautschi (1957), to various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's linear systematic sampling…
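
Linear systematic sampling with multiple random starts, the scheme being extended above, can be sketched like this. The selection rule used here (distinct random starts within an enlarged interval) is one common variant, chosen for simplicity; it is not claimed to match the paper's exact estimators.

```python
import random

def systematic_multi_start(N, n, m, rng=random):
    """Draw n of the units 0..N-1 as m interleaved systematic samples."""
    assert n % m == 0 and N % n == 0, "sketch assumes N is a multiple of n"
    K = N * m // n                      # enlarged sampling interval
    starts = rng.sample(range(K), m)    # m distinct random starts
    return sorted(s + j * K for s in starts for j in range(n // m))

random.seed(42)
units = systematic_multi_start(N=60, n=12, m=3)
print(units)
```

With a single start (m = 1) this reduces to ordinary linear systematic sampling; multiple starts make unbiased variance estimation possible, which is the motivation for Gautschi's extension.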

  11. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    Science.gov (United States)

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

  12. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
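
The sampling-based propagation described above can be sketched in miniature. The focal elements, masses, and stand-in model below are invented for illustration: each focal interval of the input is sampled to estimate the image interval of the model output, and belief/plausibility of an output event are then sums of masses over those image intervals.

```python
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    """Stand-in for a computationally expensive model."""
    return x ** 2 + 1

# Dempster-Shafer structure on the input: (interval, basic probability mass).
focal = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]

def propagate(focal, f, n_samp=1000):
    """Estimate each focal element's image [min f, max f] by sampling."""
    out = []
    for (lo, hi), mass in focal:
        ys = f(rng.uniform(lo, hi, n_samp))
        out.append(((ys.min(), ys.max()), mass))
    return out

def bel_pl(out, c):
    """Belief and plausibility of the event {y <= c}."""
    bel = sum(m for (lo, hi), m in out if hi <= c)   # image fully inside
    pl = sum(m for (lo, hi), m in out if lo <= c)    # image intersects
    return bel, pl

out = propagate(focal, f)
bel, pl = bel_pl(out, c=2.0)
print(bel, pl)
```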

  13. Design, analysis, and interpretation of field quality-control data for water-sampling projects

    Science.gov (United States)

    Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.

    2015-01-01

    The process of obtaining and analyzing water samples from the environment includes a number of steps that can affect the reported result. The equipment used to collect and filter samples, the bottles used for specific subsamples, any added preservatives, sample storage in the field, and shipment to the laboratory have the potential to affect how accurately samples represent the environment from which they were collected. During the early 1990s, the U.S. Geological Survey implemented policies to include the routine collection of quality-control samples in order to evaluate these effects and to ensure that water-quality data were adequately representing environmental conditions. Since that time, the U.S. Geological Survey Office of Water Quality has provided training in how to design effective field quality-control sampling programs and how to evaluate the resultant quality-control data. This report documents that training material and provides a reference for methods used to analyze quality-control data.
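
Two routine evaluations of field quality-control data of the kind the report covers can be sketched as follows; the data values are invented, and the mean-plus-three-standard-deviations bound is one common convention, not necessarily the report's exact method.

```python
import statistics

# Hypothetical field-blank results (mg/L) and environmental/replicate pairs.
blanks = [0.02, 0.03, 0.01, 0.02, 0.04]
replicates = [(5.1, 5.3), (2.0, 2.1), (9.8, 9.4)]

# Upper bound on potential contamination bias from field blanks.
bias_bound = statistics.mean(blanks) + 3 * statistics.stdev(blanks)

# Relative percent difference (RPD) for each replicate pair,
# a standard measure of sampling-plus-analytical variability.
rpd = [abs(a - b) / ((a + b) / 2) * 100 for a, b in replicates]

print(f"blank-based bias bound: {bias_bound:.3f} mg/L")
print(f"max RPD: {max(rpd):.1f} %")
```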

  14. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Stirpe, D.; Picard, R.R.

    1985-01-01

    Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, we have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability, without using approximations. Using realistic assumptions for measurement uncertainty parameters, the simulation results support two conclusions: (1) the previously used conservative approximations can be expensive because they lead to larger sample sizes than needed; and (2) the optimal verification strategy, as well as the falsification strategy, is highly dependent on the underlying uncertainty parameters of the measurement instruments. 1 ref., 3 figs.
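
For the attributes side of the problem, the exact (approximation-free) sample size can be computed directly from the hypergeometric non-detection probability; the population size, number of falsified items, and required detection probability below are illustrative values, not the paper's.

```python
from math import comb

def attributes_sample_size(N, D, p_det):
    """Smallest n such that sampling n of N items without replacement
    detects at least one of D falsified items with probability >= p_det."""
    for n in range(1, N + 1):
        # P(no falsified item in the sample), exact hypergeometric form.
        p_miss = comb(N - D, n) / comb(N, n) if n <= N - D else 0.0
        if 1 - p_miss >= p_det:
            return n
    return N

n = attributes_sample_size(N=1000, D=50, p_det=0.95)
print(n)
```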

  15. Design and building of a homemade sample changer for automation of the irradiation in neutron activation analysis technique

    International Nuclear Information System (INIS)

    Gago, Javier; Hernandez, Yuri; Baltuano, Oscar; Bedregal, Patricia; Lopez, Yon; Urquizo, Rafael

    2014-01-01

    Because the RP-10 research reactor operates during weekends, it was necessary to design and build a sample changer for irradiation as part of the automation of the neutron activation analysis technique. The device consists of an aluminum turntable disk that can accommodate 19 polyethylene capsules containing samples, which are sent from the laboratory to the irradiation position via the pneumatic transfer system. The system is operated from a control switchboard that sends and returns capsules after a variable preset time and by two different routes, allowing the determination of short-, medium- and long-lived radionuclides. A further mechanism, called the 'exchange valve', changes the travel paths (pipelines) so that irradiated samples can be stored for a longer time in the reactor hall. The system design has allowed complete automation of this technique, enabling the irradiation of samples without the presence of an analyst. The design, construction and operation of the device are described in this article. (authors)

  16. Sampling strategies and materials for investigating large reactive particle complaints from Valley Village homeowners near a coal-fired power plant

    International Nuclear Information System (INIS)

    Chang, A.; Davis, H.; Frazar, B.; Haines, B.

    1997-01-01

    This paper presents Phase 3's sampling strategies, techniques, methods and substrates, used to assist the District in resolving complaints involving yellowish-brown staining and spotting of homes, cars, etc. These spots could not easily be washed off, and some were permanent. The sampling strategies for the three phases were as follows: Phase 1, the identification of the reactive particles, conducted in October 1989 by the APCD and IITRI; Phase 2, a study of the size distribution and concentration of reactive-particle deposition as a function of distance and direction, conducted by Radian and LG and E; and Phase 3, the determination of the frequency of soiling events over a full year, conducted in 1995 by the APCD and IITRI. The sampling methods included two primary substrates, ACE sheets and painted steel, and four secondary substrates: a mailbox, aluminum siding, painted wood panels and roof tiles. The secondary substrates were the main objects of the Valley Village complaints. The sampling technique included five Valley Village (VV) soiling/staining assessment sites and one site southwest of the power plant serving as a background/upwind site. The five VV sites northeast of the power plant covered a 50-degree sector extending 3/4 mile from the stacks. Hourly meteorological data for wind speed and wind direction were collected. With this sampling technique, fifteen staining episodes were detected, nine of them in the summer of 1995.

  17. Design of a sample acquisition system for the Mars exobiological penetrator

    Science.gov (United States)

    Thomson, Ron; Gwynne, Owen

    1988-01-01

    The Mars Exobiological Penetrator will be embedded at several locations on the Martian surface. It contains various scientific instruments, such as an Alpha-Particle Instrument (API), Differential Scanning Calorimeter (DSC), Evolved Gas Analyzer (EGA) and accelerometers. A sample is required for analysis in the API and DSC. To avoid impact-contaminated material, this sample must be taken from soil more than 2 cm away from the penetrator shell. This study examines the design of a dedicated sampling system, including deployment, suspension, fore/afterbody coupling, sample gathering and placement. To prevent subsurface material from entering the penetrator sampling compartment during impact, a plug is placed in the exit hole of the wall. A U-lever device holds this plug in the penetrator wall; the U-lever rotates upon initial motion of the core-grinder mechanism (CGM), releasing the plug. Research points to a combination of coring and grinding as a plausible solution to the problem of dry drilling. The CGM, driven by two compressed springs, is deployed along a tracking system. A slowly varying load (i.e., springs) is favored over fixed-displacement motion because of its adaptability to different material hardnesses; however, to accommodate sampling in low-density soil, two dashpots set a maximum transverse velocity. In addition, minimal power use is achieved by unidirectional motion of the CGM. The sample is transported to the scientific instruments on a sample placement tray, likewise driven by a compressed spring to avoid unnecessary power usage. This paper also explores possible modifications for size, weight, and time, as well as possible future studies.

  18. Optimization of sampling pattern and the design of Fourier ptychographic illuminator.

    Science.gov (United States)

    Guo, Kaikai; Dong, Siyuan; Nanda, Pariksheet; Zheng, Guoan

    2015-03-09

    Fourier ptychography (FP) is a recently developed imaging approach that facilitates high-resolution imaging beyond the cutoff frequency of the employed optics. In the original FP approach, a periodic LED array is used for sample illumination, and therefore the scanning pattern is a uniform grid in Fourier space. Such a uniform sampling scheme leads to three major problems for FP: 1) it requires a large number of raw images, 2) it introduces raster-grid artifacts in the reconstruction process, and 3) it requires a high-dynamic-range detector. Here, we investigate scanning sequences and sampling patterns to optimize the FP approach. For most biological samples, signal energy is concentrated in the low-frequency region, and as such, we can perform non-uniform Fourier sampling in FP by considering the signal structure. In contrast, conventional ptychography performs uniform sampling over the entire real space. To implement the non-uniform Fourier sampling scheme in FP, we designed and built an illuminator using LEDs mounted on a 3D-printed plastic case. The advantages of this illuminator are threefold: 1) it reduces the number of image acquisitions by at least 50% (68 raw images versus 137 in the original FP setup), 2) it departs from the translational symmetry of sampling to solve the raster-grid artifact problem, and 3) it reduces the dynamic range of the captured images 6-fold. The results reported in this paper significantly shorten acquisition time and improve the quality of FP reconstructions. They may provide new insights for developing Fourier ptychographic imaging platforms and find important applications in digital pathology.
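
A non-uniform Fourier sampling pattern of the kind described can be sketched as below. This is purely illustrative: the radial warping exponent and the random placement are assumptions, not the authors' LED layout; the point is only that the pattern concentrates samples at low spatial frequencies, where most biological samples carry their signal energy.

```python
import numpy as np

def nonuniform_pattern(n_leds=68, gamma=2.0, seed=7):
    """Generate n_leds positions in the unit Fourier disc, biased toward
    the centre (low frequencies) by the warping exponent gamma > 1."""
    rng = np.random.default_rng(seed)
    r = rng.uniform(0, 1, n_leds) ** gamma      # warp radii toward 0
    theta = rng.uniform(0, 2 * np.pi, n_leds)
    return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

pts = nonuniform_pattern()
# Fraction of LEDs in the inner half of Fourier space; a uniform-in-disc
# pattern would put only ~25% there.
inner = np.mean(np.hypot(pts[:, 0], pts[:, 1]) < 0.5)
print(f"fraction of LEDs at low frequencies: {inner:.2f}")
```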

  19. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power via an a priori sample size calculation is of paramount importance when designing a research study. Five specific empirical components make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies among different constructs with similar forms, is presented to help understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
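
For the common two-group comparison, the a priori calculation described above reduces to a short formula combining alpha, desired power, and a standardized effect size d. The sketch below uses the normal approximation rather than the exact t-based formula that dedicated power software applies, so its answers run slightly low.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Sample size per group for a two-sided two-sample comparison
    (normal approximation)."""
    z = NormalDist().inv_cdf
    return ceil(2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2)

# Medium standardized effect (d = 0.5), alpha = 0.05, power = 0.80:
print(n_per_group(0.5))  # 63 per group under the normal approximation
```

Note how the components interact: halving the effect size (d = 0.25) roughly quadruples the required n, which is the kind of isomorphic dependency the article formalizes.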

  20. Baseline Design Compliance Matrix for the Type 4 In Situ Vapor Samplers and Supernate and Sludge and Soft Saltcake Grab Sampling

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    The DOE has identified a need to sample vapor space, exhaust ducts, supernate, sludge, and soft saltcake in waste tanks that store radioactive waste. This document provides the Design Compliance Matrix (DCM) for the Type 4 In-Situ Vapor Sampling (ISVS) system and the Grab Sampling System used for this type of sampling. The DCM identifies the design requirements, and the source of those requirements, for the Type 4 ISVS system and the Grab Sampling System. The DCM is a single-source compilation of design requirements for sampling and sampling-support equipment, and it supports the configuration management of these systems.