WorldWideScience

Sample records for sampling strategy based

  1. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics, which assumes that data are independently and identically distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  2. A comparative proteomics method for multiple samples based on a 18O-reference strategy and a quantitation and identification-decoupled strategy.

    Science.gov (United States)

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

    Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method has been criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility of protein identification results across samples. In the present study, a method combining the 18O-reference strategy with a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than previously used comparison methods based on transferring comparison or label-free strategies. In the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated, according to retention time and accurate mass, to identify differentially expressed proteins. This strategy made protein identification for all samples possible using a single pooled sample, and therefore gave good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
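
The core of the reference strategy can be sketched numerically: when every sample is measured against the same pooled reference, any two samples can be compared through their reference ratios. The abundance values below are hypothetical, purely for illustration.

```python
# Hypothetical protein abundances (arbitrary units) of one protein in three samples
samples = {"A": 10.0, "B": 15.0, "C": 6.0}

# The 18O-labeled reference pools equal amounts of all samples
reference = sum(samples.values()) / len(samples)

# Each sample is quantified as a ratio against the shared reference
ratio_to_ref = {k: v / reference for k, v in samples.items()}

# Any two samples can then be compared through the reference;
# the transferred comparison matches their direct ratio
direct = samples["A"] / samples["B"]
via_ref = ratio_to_ref["A"] / ratio_to_ref["B"]
```

Because the reference cancels out, the pairwise comparison is consistent no matter how many samples are in the study.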

  3. A cache-friendly sampling strategy for texture-based volume rendering on GPU

    Directory of Open Access Journals (Sweden)

    Junpeng Wang

    2017-06-01

    Texture-based volume rendering is a memory-intensive algorithm whose performance relies heavily on the performance of the texture cache. However, most existing texture-based volume rendering methods blindly map computational resources to texture memory, resulting in incoherent memory access patterns and low cache hit rates in certain cases. The distance between samples taken by threads of an atomic scheduling unit (e.g. a warp of 32 threads in CUDA) of the GPU is a crucial factor that affects texture cache performance. Based on this fact, we present a new sampling strategy, called Warp Marching, for the ray-casting algorithm of texture-based volume rendering. The effects of different sample organizations and different thread-pixel mappings in the ray-casting algorithm are thoroughly analyzed. Also, a pipelined color-blending approach is introduced, and the power of warp-level GPU operations is leveraged to improve the efficiency of parallel executions on the GPU. In addition, the rendering performance of Warp Marching is view-independent, and it outperforms existing empty space skipping techniques in scenarios that need to render large dynamic volumes at low image resolution. Through a series of micro-benchmarking and real-life data experiments, we rigorously analyze our sampling strategies and demonstrate significant performance enhancements over existing sampling methods.

  4. Evaluation of sampling strategies to estimate crown biomass

    Directory of Open Access Journals (Sweden)

    Krishna P Poudel

    2015-01-01

    Background Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass of a tree. Crown biomass estimation is useful for different purposes, including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies for estimating crown biomass and the effect of sample size on those estimates. Methods Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results Compared to all other methods, stratified sampling with the probability-proportional-to-size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced results approximately similar to simple random sampling, but it further decreased RMSE when information on branch diameter was used in the design and estimation phases. Conclusions Use of
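
Probability-proportional-to-size sampling, one of the strategies evaluated above, can be sketched as follows. The branch data, the use of diameter as the size measure, and the Hansen-Hurwitz estimator are illustrative assumptions, not the study's exact design.

```python
import random

# Hypothetical branch data for one tree: (diameter_cm, biomass_kg)
branches = [(2.1, 0.8), (3.4, 2.1), (1.2, 0.3), (4.8, 4.0),
            (2.9, 1.5), (3.9, 2.8), (1.7, 0.6), (5.2, 4.9)]

def pps_estimate_total(branches, n, seed=0):
    """Hansen-Hurwitz sketch: draw n branches with replacement, with
    selection probability proportional to diameter (the size measure),
    and estimate the tree's total crown biomass."""
    rng = random.Random(seed)
    total_size = sum(d for d, _ in branches)
    probs = [d / total_size for d, _ in branches]
    est = 0.0
    for _ in range(n):
        i = rng.choices(range(len(branches)), weights=probs, k=1)[0]
        est += branches[i][1] / probs[i]  # y_i / p_i
    return est / n

true_total = sum(b for _, b in branches)        # 17.0 kg
estimate = pps_estimate_total(branches, 3)      # only 3 branches measured
```

Because larger branches carry more biomass, weighting selection by diameter keeps the estimator's variance low even with very few branches measured per tree.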

  5. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    Science.gov (United States)

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterizes the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  6. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    Science.gov (United States)

    Xun-Ping, W; An, Z

    2017-07-27

    Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland regions endemic for schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which takes plant abundance as an auxiliary variable, was explored in an experimental study in a 50 m × 50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; secondly, the required number of optimal sampling points for each layer was calculated through the Hammond McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA was performed. Results The method (SOPA) proposed in this study had the minimal absolute error of 0.2138; the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study achieves higher estimation accuracy than the other four methods.
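
The idea of allocating more sampling points to more variable strata can be sketched with classical Neyman allocation; this is a stand-in for the paper's Hammond McCullagh calculation, and the stratum areas and standard deviations below are hypothetical.

```python
# Hypothetical plant-abundance strata: (area_fraction, std_dev of snail density)
strata = [(0.35, 1.2), (0.25, 2.5), (0.20, 3.1), (0.15, 4.0), (0.05, 5.5)]

def neyman_allocation(strata, n_total):
    """Allocate n_total sample points to strata in proportion to
    stratum weight times within-stratum standard deviation, with at
    least one point per stratum."""
    weights = [w * s for w, s in strata]
    total = sum(weights)
    return [max(1, round(n_total * x / total)) for x in weights]

alloc = neyman_allocation(strata, 50)
# Small but highly variable strata still receive several points,
# while the large homogeneous stratum needs relatively few.
```
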

  7. Validated sampling strategy for assessing contaminants in soil stockpiles

    International Nuclear Information System (INIS)

    Lame, Frank; Honders, Ton; Derksen, Giljam; Gadella, Michiel

    2005-01-01

    Dutch legislation on the reuse of soil requires a sampling strategy to determine the degree of contamination. This sampling strategy was developed in three stages. Its main aim is to obtain a single analytical result, representative of the true mean concentration of the soil stockpile. The development process started with an investigation into how sample pre-treatment could be used to obtain representative results from composite samples of heterogeneous soil stockpiles. Combining a large number of random increments allows stockpile heterogeneity to be fully represented in the sample. The resulting pre-treatment method was then combined with a theoretical approach to determine the necessary number of increments per composite sample. At the second stage, the sampling strategy was evaluated using computerised models of contaminant heterogeneity in soil stockpiles. The now theoretically based sampling strategy was implemented by the Netherlands Centre for Soil Treatment in 1995. It was applied to all types of soil stockpiles, ranging from clean to heavily contaminated, over a period of four years. This resulted in a database containing the analytical results of 2570 soil stockpiles. At the final stage these results were used for a thorough validation of the sampling strategy. It was concluded that the model approach has indeed resulted in a sampling strategy that achieves analytical results representative of the mean concentration of soil stockpiles. - A sampling strategy that ensures analytical results representative of the mean concentration in soil stockpiles is presented and validated
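
The first stage of the strategy, combining many random increments into one composite sample, can be illustrated with a small simulation; the lognormal concentration field is an assumption chosen only to mimic a heterogeneous stockpile.

```python
import random

def composite_mean(stockpile, n_increments, rng):
    """Combine n random increments into one composite sample; the single
    analytical result approximates the stockpile's true mean concentration."""
    increments = rng.choices(stockpile, k=n_increments)
    return sum(increments) / n_increments

rng = random.Random(42)
# Hypothetical heterogeneous stockpile: lognormal concentrations (mg/kg)
stockpile = [rng.lognormvariate(3.0, 1.0) for _ in range(10_000)]
true_mean = sum(stockpile) / len(stockpile)

few = composite_mean(stockpile, 5, rng)     # few increments: unreliable
many = composite_mean(stockpile, 200, rng)  # many increments: representative
```

The more increments enter the composite, the better the single analysis represents the stockpile's heterogeneity, which is why the validated strategy fixes a minimum number of increments per composite.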

  8. Population Pharmacokinetics and Optimal Sampling Strategy for Model-Based Precision Dosing of Melphalan in Patients Undergoing Hematopoietic Stem Cell Transplantation.

    Science.gov (United States)

    Mizuno, Kana; Dong, Min; Fukuda, Tsuyoshi; Chandra, Sharat; Mehta, Parinda A; McConnell, Scott; Anaissie, Elias J; Vinks, Alexander A

    2018-05-01

    High-dose melphalan is an important component of conditioning regimens for patients undergoing hematopoietic stem cell transplantation. The current dosing strategy based on body surface area results in a high incidence of oral mucositis and gastrointestinal and liver toxicity. Pharmacokinetically guided dosing will individualize exposure and help minimize overexposure-related toxicity. The purpose of this study was to develop a population pharmacokinetic model and optimal sampling strategy. A population pharmacokinetic model was developed with NONMEM using 98 observations collected from 15 adult patients given the standard dose of 140 or 200 mg/m² by intravenous infusion. The determinant-optimal sampling strategy was explored with PopED software. Individual area under the curve estimates were generated by Bayesian estimation using the full and the proposed sparse sampling data. The predictive performance of the optimal sampling strategy was evaluated based on bias and precision estimates. The feasibility of the optimal sampling strategy was tested using pharmacokinetic data from five pediatric patients. A two-compartment model best described the data. The final model included body weight and creatinine clearance as predictors of clearance. The determinant-optimal sampling strategies (and windows) were identified at 0.08 (0.08-0.19), 0.61 (0.33-0.90), 2.0 (1.3-2.7), and 4.0 (3.6-4.0) h post-infusion. An excellent correlation was observed between area under the curve estimates obtained with the full and the proposed four-sample strategy (R² = 0.98). The proposed strategy promises to achieve the target area under the curve as part of precision dosing.

  9. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    International Nuclear Information System (INIS)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.
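
The sparse-recovery setting can be sketched compactly. For simplicity this sketch uses a noiseless Legendre expansion under its natural (uniform) sampling distribution, and orthogonal matching pursuit as an inexpensive stand-in for the ℓ1-minimization solver; sizes, support, and coefficients are illustrative.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)
P, n, s = 15, 200, 3                     # basis size, samples, sparsity

# Orthonormal Legendre basis under the uniform density on [-1, 1]
x = rng.uniform(-1.0, 1.0, n)
Psi = np.stack([np.sqrt(2 * k + 1) * legendre.legval(x, np.eye(P)[k])
                for k in range(P)], axis=1)

c_true = np.zeros(P)
c_true[[1, 4, 7]] = [2.0, -1.0, 1.0]     # sparse PC coefficients
y = Psi @ c_true                         # noiseless model evaluations

def omp(A, y, s):
    """Orthogonal matching pursuit: greedily add the column most correlated
    with the residual, then re-fit the active set by least squares."""
    idx, r = [], y.copy()
    for _ in range(s):
        idx.append(int(np.argmax(np.abs(A.T @ r))))
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        r = y - A[:, idx] @ coef
    c = np.zeros(A.shape[1])
    c[idx] = coef
    return c

c_hat = omp(Psi, y, s)                   # recovers the sparse coefficients
```

With far fewer samples than basis functions (n much less than needed for full least squares at high order), the sparse solver identifies the active coefficients; the coherence of the sampling distribution governs how small n can be.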

  10. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    Science.gov (United States)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  11. A Bayesian sampling strategy for hazardous waste site characterization

    International Nuclear Information System (INIS)

    Skalski, J.R.

    1987-12-01

    Prior knowledge based on historical records or physical evidence often suggests the existence of a hazardous waste site. Initial surveys may provide additional or even conflicting evidence of site contamination. This article presents a Bayes sampling strategy that allocates sampling at a site using this prior knowledge. This sampling strategy minimizes the environmental risks of missing chemical or radionuclide hot spots at a waste site. The environmental risk is shown to be proportional to the size of the undetected hot spot or inversely proportional to the probability of hot spot detection. 12 refs., 2 figs
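
The inverse-proportionality claim can be illustrated with a textbook grid-sampling model; the square grid, the circular hot spot, and the validity restriction (radius at most half the spacing) are assumptions of this sketch, not the article's full Bayesian allocation.

```python
import math

def detection_probability(radius, grid_spacing):
    """Chance that at least one node of a square sampling grid with the
    given spacing falls inside a circular hot spot (exact for
    radius <= grid_spacing / 2; capped at 1 otherwise)."""
    return min(math.pi * radius**2 / grid_spacing**2, 1.0)

def relative_risk(radius, grid_spacing):
    """Environmental risk taken as inversely proportional to the
    probability of detecting the hot spot."""
    return 1.0 / detection_probability(radius, grid_spacing)

# Halving the grid spacing quadruples the detection probability,
# and thus quarters the risk of missing the hot spot.
p_coarse = detection_probability(1.0, 10.0)
p_fine = detection_probability(1.0, 5.0)
```
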

  12. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Directory of Open Access Journals (Sweden)

    Jake M Ferguson

    2014-06-01

    The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  13. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Science.gov (United States)

    Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W

    2014-06-01

    The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  14. Soil sampling strategies: Evaluation of different approaches

    Energy Technology Data Exchange (ETDEWEB)

    De Zorzi, Paolo [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy)], E-mail: paolo.dezorzi@apat.it; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy); Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia [Agenzia Regionale per la Prevenzione e Protezione dell' Ambiente del Veneto, ARPA Veneto, U.O. Centro Qualita Dati, Via Spalato, 14-36045 Vicenza (Italy)

    2008-11-15

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  15. Soil sampling strategies: Evaluation of different approaches

    International Nuclear Information System (INIS)

    De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-01-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  16. Soil sampling strategies: evaluation of different approaches.

    Science.gov (United States)

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-11-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  17. A comparison of temporal and location-based sampling strategies for global positioning system-triggered electronic diaries.

    Science.gov (United States)

    Törnros, Tobias; Dorn, Helen; Reichert, Markus; Ebner-Priemer, Ulrich; Salize, Hans-Joachim; Tost, Heike; Meyer-Lindenberg, Andreas; Zipf, Alexander

    2016-11-21

    Self-reporting is a well-established approach within the medical and psychological sciences. In order to avoid recall bias, i.e. past events being remembered inaccurately, the reports can be filled out on a smartphone in real-time and in the natural environment. This is often referred to as ambulatory assessment and the reports are usually triggered at regular time intervals. With this sampling scheme, however, rare events (e.g. a visit to a park or recreation area) are likely to be missed. When addressing the correlation between mood and the environment, it may therefore be beneficial to include participant locations within the ambulatory assessment sampling scheme. Based on the geographical coordinates, the database query system then decides if a self-report should be triggered or not. We simulated four different ambulatory assessment sampling schemes based on movement data (coordinates by minute) from 143 voluntary participants tracked for seven consecutive days. Two location-based sampling schemes incorporating the environmental characteristics (land use and population density) at each participant's location were introduced and compared to a time-based sampling scheme triggering a report on the hour as well as to a sampling scheme incorporating physical activity. We show that location-based sampling schemes trigger a report less often, but we obtain more unique trigger positions and a greater spatial spread in comparison to sampling strategies based on time and distance. Additionally, the location-based methods trigger significantly more often at rarely visited types of land use and less often outside the study region where no underlying environmental data are available.
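
The trigger decision the database query system makes can be sketched as a simple rule; the land-use classes and the minimum gap between prompts are hypothetical parameters, not the study's exact scheme.

```python
# Hypothetical land-use classes treated as "rarely visited"
RARE_LAND_USE = {"park", "forest", "water"}

def should_trigger(land_use, minutes_since_last, min_gap=60):
    """Location-based trigger rule (sketch): prompt a diary entry when the
    participant is at a rarely visited land-use class, but never more than
    once per min_gap minutes."""
    return minutes_since_last >= min_gap and land_use in RARE_LAND_USE

a = should_trigger("park", 75)         # rare land use, gap elapsed
b = should_trigger("residential", 75)  # common land use: no prompt
c = should_trigger("park", 10)         # too soon after the last prompt
```

Compared with a fixed hourly schedule, such a rule fires less often overall but concentrates prompts at the rare environments of interest.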

  18. Comparison of sampling strategies for object-based classification of urban vegetation from Very High Resolution satellite images

    Science.gov (United States)

    Rougier, Simon; Puissant, Anne; Stumpf, André; Lachiche, Nicolas

    2016-09-01

    Vegetation monitoring is becoming a major issue in the urban environment because of the services vegetation provides, and it necessitates accurate and up-to-date mapping. Very High Resolution satellite images enable detailed mapping of urban tree and herbaceous vegetation. Several supervised classifications with statistical learning techniques have provided good results for the detection of urban vegetation, but they require a large amount of training data. In this context, this study investigates the performance of different sampling strategies in order to reduce the number of examples needed. Two state-of-the-art window-based active learning algorithms are compared to a classical stratified random sampling, and a third strategy combining active learning and stratification is proposed. The efficiency of these strategies is evaluated on two medium-size French cities, Strasbourg and Rennes, associated with different datasets. Results demonstrate that classical stratified random sampling can in some cases be just as effective as active learning methods, and that it should be used more frequently to evaluate new active learning methods. Moreover, the active learning strategies proposed in this work reduce the computational runtime by selecting multiple windows at each iteration without increasing the number of windows needed.
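
A single active-learning step can be sketched with entropy-based uncertainty sampling, a common selection criterion; the window names and class probabilities below are hypothetical, and the two published algorithms may use different criteria.

```python
import math

def entropy(probs):
    """Shannon entropy of a class-probability vector (natural log)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_batch(candidates, k):
    """Active-learning step (sketch): rank unlabeled windows by predictive
    entropy and pick the k most uncertain ones for labeling."""
    return sorted(candidates, key=lambda c: entropy(c[1]), reverse=True)[:k]

# Hypothetical unlabeled windows with classifier probability estimates
cands = [("w1", [0.9, 0.1]), ("w2", [0.5, 0.5]),
         ("w3", [0.7, 0.3]), ("w4", [0.34, 0.66])]
picked = select_batch(cands, 2)  # the two most ambiguous windows
```

Selecting several windows per iteration, as the study's strategies do, amortizes the cost of retraining the classifier over each batch of new labels.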

  19. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    Science.gov (United States)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-02-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failure. Thus, to ensure normal production and the safety of machine operators, it is essential to detect failures as soon as possible. However, it is a major challenge to maintain a balance between detection efficiency and big-data acquisition given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist sampling rate, which can ease the pressure generated by large-scale data. The large volume of vibration data from a faulty roller bearing is first reduced by a down-sampling strategy that preserves the fault features by selecting peaks to represent the data segments in the time domain. A problem arises, however, in that the fault features may become weaker than before: when the noise is stronger than the vibration signal, noise may be mistaken for peaks, so that the fault features cannot be extracted by commonly used envelope analysis. Here we employ compressive sensing theory to overcome this problem; it enhances the signal and reduces the sample size further. Moreover, it is capable of detecting fault features from a small number of samples based on an orthogonal matching pursuit approach, which overcomes the shortcomings of the multiple down-sampling algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults.
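
The peak-selecting down-sampling step can be sketched as follows; the synthetic signal (a weak sinusoid with periodic fault impacts) and the segment length are illustrative assumptions.

```python
import numpy as np

def peak_downsample(signal, segment_len):
    """Down-sampling sketch: split the signal into fixed-length segments
    and keep the sample of largest magnitude in each, so that impact
    peaks caused by bearing faults survive the rate reduction."""
    n = len(signal) // segment_len
    segs = np.asarray(signal[:n * segment_len]).reshape(n, segment_len)
    idx = np.argmax(np.abs(segs), axis=1)
    return segs[np.arange(n), idx]

t = np.linspace(0, 1, 8000, endpoint=False)
x = 0.2 * np.sin(2 * np.pi * 50 * t)  # background vibration
x[::800] += 1.0                       # periodic fault impacts
y = peak_downsample(x, 100)           # 100x fewer samples, impacts kept
```

All ten impacts remain visible in the reduced signal; as the abstract notes, the step fails when noise exceeds the impacts, which is where the compressive-sensing stage takes over.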

  20. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    International Nuclear Information System (INIS)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-01-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failure. Thus, to ensure normal production and the safety of machine operators, it is essential to detect failures as soon as possible. However, it is a major challenge to maintain a balance between detection efficiency and big-data acquisition given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist sampling rate, which can ease the pressure generated by large-scale data. The large volume of vibration data from a faulty roller bearing is first reduced by a down-sampling strategy that preserves the fault features by selecting peaks to represent the data segments in the time domain. A problem arises, however, in that the fault features may become weaker than before: when the noise is stronger than the vibration signal, noise may be mistaken for peaks, so that the fault features cannot be extracted by commonly used envelope analysis. Here we employ compressive sensing theory to overcome this problem; it enhances the signal and reduces the sample size further. Moreover, it is capable of detecting fault features from a small number of samples based on an orthogonal matching pursuit approach, which overcomes the shortcomings of the multiple down-sampling algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults.

  1. Spent nuclear fuel sampling strategy

    International Nuclear Information System (INIS)

    Bergmann, D.W.

    1995-01-01

    This report proposes a strategy for sampling the spent nuclear fuel (SNF) stored in the 105-K Basins (105-K East and 105-K West). The strategy will support decisions concerning path-forward SNF disposition efforts in the following areas: (1) SNF isolation activities, such as repackaging/overpacking to a newly constructed staging facility; (2) conditioning processes for fuel stabilization; and (3) interim storage options. The strategy was developed without following the Data Quality Objective (DQO) methodology; it is, however, intended to augment the SNF project DQOs. The SNF sampling strategy is derived by evaluating the current storage condition of the SNF and the factors that affected SNF corrosion/degradation.

  2. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
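A minimal sketch of such a sampling-based propagation (my own illustration of the general idea, not the strategy of this report): each input focal element (an interval carrying a basic probability assignment) is sampled to estimate the image of the model, and belief/plausibility of an output condition are accumulated over the focal elements whose images lie entirely inside, or merely intersect, the target set.

```python
import numpy as np

def bel_pl(focal_elements, masses, model, threshold, n_samples=1000):
    """Sampling-based belief/plausibility that model(x) <= threshold.

    focal_elements: list of (lo, hi) input intervals whose masses sum to 1.
    The image of each focal element is estimated by uniform sampling;
    belief accumulates masses of elements mapping entirely into the set,
    plausibility accumulates masses of elements that intersect it."""
    rng = np.random.default_rng(1)
    bel = pl = 0.0
    for (lo, hi), m in zip(focal_elements, masses):
        ys = model(rng.uniform(lo, hi, n_samples))
        if ys.max() <= threshold:        # whole focal element maps into the set
            bel += m
        if ys.min() <= threshold:        # at least part of it does
            pl += m
    return bel, pl

model = lambda x: x ** 2
focal = [(0.0, 1.0), (0.5, 2.0), (1.5, 3.0)]
masses = [0.5, 0.3, 0.2]
bel, pl = bel_pl(focal, masses, model, threshold=1.0)
print(bel, pl)  # 0.5 0.8: only the first element is surely inside; two may be
```

The computational burden the abstract describes comes from repeating this inner sampling loop for every focal element of a high-dimensional input structure, which is exactly what the presented strategy aims to tame.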

  3. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    Science.gov (United States)

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

  4. User-driven sampling strategies in image exploitation

    Science.gov (United States)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.

  5. A Fast and Robust Feature-Based Scan-Matching Method in 3D SLAM and the Effect of Sampling Strategies

    Directory of Open Access Journals (Sweden)

    Cihan Ulas

    2013-11-01

    Full Text Available Simultaneous localization and mapping (SLAM) plays an important role in fully autonomous systems when a GNSS (global navigation satellite system) is not available. Studies in both 2D indoor and 3D outdoor SLAM are based on the appearance of environments and utilize scan-matching methods to find the rigid-body transformation parameters between two consecutive scans. In this study, a fast and robust scan-matching method based on feature extraction is introduced. Since the method matches certain geometric structures, such as plane segments, outliers and noise in the point cloud are largely eliminated, making the proposed scan-matching algorithm more robust than conventional methods. Moreover, the registration time and the number of iterations are significantly reduced, since the number of matching points is efficiently decreased. As a scan-matching framework, an improved version of the normal distribution transform (NDT) is used. The probability density functions (PDFs) of the reference scan are generated as in the traditional NDT, and the feature extraction, based on stochastic plane detection, is applied only to the input scan. Using an experimental dataset collected in an outdoor environment (a university campus), we obtained satisfactory performance results. Moreover, the feature-extraction part of the algorithm can be regarded as a special sampling strategy for scan matching and is compared to other sampling strategies, such as random sampling and grid-based sampling, the latter of which was first used in the NDT. Thus, this study also shows the effect of subsampling on the performance of the NDT.
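The grid-based sampling strategy mentioned above can be sketched as a simple voxel-grid subsampler. This is an illustrative stand-in, not the authors' implementation: points are bucketed into fixed-size cells and each occupied cell contributes one centroid.

```python
import numpy as np

def grid_subsample(points, cell_size):
    """Grid-based subsampling: keep one representative point (the centroid)
    per occupied cell. A common sampling strategy applied to a scan before
    registration methods such as NDT."""
    keys = np.floor(points / cell_size).astype(np.int64)
    cells = {}
    for key, p in zip(map(tuple, keys), points):   # group points by cell
        cells.setdefault(key, []).append(p)
    return np.array([np.mean(ps, axis=0) for ps in cells.values()])

rng = np.random.default_rng(2)
cloud = rng.uniform(0, 10, size=(5000, 3))         # dense synthetic scan
sub = grid_subsample(cloud, cell_size=2.0)
print(len(sub))   # at most 5*5*5 = 125 occupied cells
```

Feature-based sampling, by contrast, would replace the uniform cells with detected plane segments, concentrating the retained points on structurally informative regions.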

  6. Adaptive sampling strategies with high-throughput molecular dynamics

    Science.gov (United States)

    Clementi, Cecilia

    Despite recent significant hardware and software developments, the complete thermodynamic and kinetic characterization of large macromolecular complexes by molecular simulations still presents significant challenges. The high dimensionality of these systems and the complexity of the associated potential energy surfaces (creating multiple metastable regions connected by high free-energy barriers) do not usually allow adequate sampling of the relevant regions of their configurational space by means of a single, long Molecular Dynamics (MD) trajectory. Several different approaches have been proposed to tackle this sampling problem. We focus on the development of ensemble simulation strategies, where data from a large number of weakly coupled simulations are integrated to explore the configurational landscape of a complex system more efficiently. Ensemble methods are of increasing interest as the hardware roadmap is now mostly based on increasing core counts, rather than clock speeds. The main challenge in the development of an ensemble approach for efficient sampling is in the design of strategies to adaptively distribute the trajectories over the relevant regions of the systems' configurational space, without using any a priori information on the system's global properties. We will discuss the definition of smart adaptive sampling approaches that can redirect computational resources towards unexplored yet relevant regions. Our approaches are based on new developments in dimensionality reduction for high-dimensional dynamical systems, and on optimal redistribution of resources. NSF CHE-1152344, NSF CHE-1265929, Welch Foundation C-1570.
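The idea of redirecting resources toward unexplored regions can be sketched with a least-counts seeding rule, a simplified illustration of adaptive seeding in general rather than this abstract's specific method: new trajectories are restarted from the discretized states visited least often so far.

```python
import numpy as np

def pick_restart_states(state_visits, n_restarts):
    """Least-counts adaptive seeding: relaunch trajectories from the
    least-visited discrete states, so the ensemble spreads over the
    configurational space instead of re-sampling well-explored basins."""
    order = np.argsort(state_visits)      # least-visited states first
    return order[:n_restarts]

visits = np.array([120, 3, 45, 0, 78, 9])   # hypothetical visit counts
print(pick_restart_states(visits, 3))        # [3 1 5]
```

In a real workflow the discrete states would come from a dimensionality-reduction/clustering step over the accumulated trajectory data, and the loop of simulate-cluster-reseed would repeat until the landscape is covered.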

  7. Chapter 2: Sampling strategies in forest hydrology and biogeochemistry

    Science.gov (United States)

    Roger C. Bales; Martha H. Conklin; Branko Kerkez; Steven Glaser; Jan W. Hopmans; Carolyn T. Hunsaker; Matt Meadows; Peter C. Hartsough

    2011-01-01

    Many aspects of forest hydrology have been based on accurate but not necessarily spatially representative measurements, reflecting the measurement capabilities that were traditionally available. Two developments are bringing about fundamental changes in sampling strategies in forest hydrology and biogeochemistry: (a) technical advances in measurement capability, as is...

  8. Solid-Phase Extraction Strategies to Surmount Body Fluid Sample Complexity in High-Throughput Mass Spectrometry-Based Proteomics

    Science.gov (United States)

    Bladergroen, Marco R.; van der Burgt, Yuri E. M.

    2015-01-01

    For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS-acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply for both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE) including affinity enrichment strategies have been automated. Obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI) without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071

  9. Recruitment of hard-to-reach population subgroups via adaptations of the snowball sampling strategy.

    Science.gov (United States)

    Sadler, Georgia Robins; Lee, Hau-Chen; Lim, Rod Seung-Hwan; Fullerton, Judith

    2010-09-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author's program of research are provided to demonstrate how adaptations of snowball sampling can be used effectively in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more-vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or for research studies when the recruitment of a population-based sample is not essential.

  10. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
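The optimisation described, choosing how many measurements to take in each house category given pilot-survey variances, is classically done with Neyman allocation. A hedged sketch follows; the category sizes and standard deviations below are invented for illustration, not the Norwegian pilot data.

```python
import numpy as np

def neyman_allocation(strata_sizes, strata_sds, total_n):
    """Neyman (optimum) allocation: distribute a fixed total sample size
    across strata proportionally to N_h * S_h, which minimises the
    variance of the stratified mean for a given total n."""
    weights = np.asarray(strata_sizes, float) * np.asarray(strata_sds, float)
    alloc = total_n * weights / weights.sum()
    return np.round(alloc).astype(int)

# Hypothetical house categories with within-category dose-rate SDs
sizes = [5000, 2000, 500]     # houses per category
sds = [10.0, 30.0, 5.0]       # within-category SD from the pilot survey
print(neyman_allocation(sizes, sds, 200))   # [ 89 107   4]
```

Note how the small but highly variable middle stratum receives the largest share; a purely proportional allocation would instead mirror the stratum sizes.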

  11. Sampling strategies for estimating brook trout effective population size

    Science.gov (United States)

    Andrew R. Whiteley; Jason A. Coombs; Mark Hudy; Zachary Robinson; Keith H. Nislow; Benjamin H. Letcher

    2012-01-01

    The influence of sampling strategy on estimates of effective population size (Ne) from single-sample genetic methods has not been rigorously examined, though these methods are increasingly used. For headwater salmonids, spatially close kin association among age-0 individuals suggests that sampling strategy (number of individuals and location from...

  12. Measurement of radioactivity in the environment - Soil - Part 2: Guidance for the selection of the sampling strategy, sampling and pre-treatment of samples

    International Nuclear Information System (INIS)

    2007-01-01

    This part of ISO 18589 specifies the general requirements, based on ISO 11074 and ISO/IEC 17025, for all steps in the planning (desk study and area reconnaissance) of the sampling and the preparation of samples for testing. It includes the selection of the sampling strategy, the outline of the sampling plan, the presentation of general sampling methods and equipment, as well as the methodology of the pre-treatment of samples adapted to the measurement of the activity of radionuclides in soil. This part of ISO 18589 is addressed to the people responsible for determining the radioactivity present in soil for the purpose of radiation protection. It is applicable to soil from gardens, farmland, urban or industrial sites, as well as soil not affected by human activities. This part of ISO 18589 is applicable to all laboratories regardless of the number of personnel or the range of the testing performed. When a laboratory does not undertake one or more of the activities covered by this part of ISO 18589, such as planning, sampling or testing, the corresponding requirements do not apply. Information is provided on scope, normative references, terms, definitions and symbols, principle, sampling strategy, sampling plan, sampling process, pre-treatment of samples and recorded information. Five annexes cover: selection of the sampling strategy according to the objectives and the radiological characterization of the site and sampling areas; a diagram of the evolution of sample characteristics from the sampling site to the laboratory; an example sampling plan for a site divided into three sampling areas; an example sampling record for a single/composite sample; and an example record for a soil profile with soil description. A bibliography is provided.

  13. Recruiting hard-to-reach United States population sub-groups via adaptations of snowball sampling strategy

    Science.gov (United States)

    Sadler, Georgia Robins; Lee, Hau-Chen; Seung-Hwan Lim, Rod; Fullerton, Judith

    2011-01-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author’s program of research are provided to demonstrate how adaptations of snowball sampling can be effectively used in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or subjects for research studies when recruitment of a population based sample is not essential. PMID:20727089

  14. New Approach Based on Compressive Sampling for Sample Rate Enhancement in DASs for Low-Cost Sensing Nodes

    Directory of Open Access Journals (Sweden)

    Francesco Bonavolontà

    2014-10-01

    Full Text Available The paper deals with the problem of improving the maximum sample rate of the analog-to-digital converters (ADCs) included in low-cost wireless sensing nodes. To this aim, the authors propose an efficient acquisition strategy based on the combined use of a high-resolution time-basis and compressive sampling. In particular, the high-resolution time-basis is adopted to provide a proper sequence of random sampling instants, and a suitable software procedure, based on the compressive sampling approach, is exploited to reconstruct the signal of interest from the acquired samples. Thanks to the proposed strategy, the effective sample rate of the reconstructed signal can be as high as the frequency of the considered time-basis, thus significantly improving the inherent ADC sample rate. Several tests are carried out in simulated and real conditions to assess the performance of the proposed acquisition strategy in terms of reconstruction error. In particular, the results obtained in experimental tests with the ADCs included in actual 8- and 32-bit microcontrollers highlight the possibility of achieving an effective sample rate up to 50 times higher than the original ADC sample rate.
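Compressive-sampling reconstruction of this kind typically relies on a sparse solver such as orthogonal matching pursuit. The following is a self-contained sketch of generic OMP on a random Gaussian dictionary, not the authors' firmware or reconstruction procedure; the dimensions and sparse coefficients are arbitrary illustrative choices.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select the dictionary atom
    most correlated with the residual, then re-fit all selected atoms by
    least squares. Returns a k-sparse coefficient estimate."""
    support, x = [], np.zeros(A.shape[1])
    residual = y.astype(float).copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(0)
m, n, k = 80, 256, 3                       # far fewer measurements than atoms
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[10, 97, 200]] = [3.0, -2.0, 1.5]   # sparse signal coefficients
y = A @ x_true                             # compressed measurements
x_hat = omp(A, y, k)
print(np.allclose(x_hat, x_true, atol=1e-6))
```

The key property the abstract exploits is that a signal sparse in some basis can be recovered from many fewer (randomly timed) samples than uniform Nyquist sampling would require.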

  15. Mendelian breeding units versus standard sampling strategies: mitochondrial DNA variation in southwest Sardinia

    Directory of Open Access Journals (Sweden)

    Daria Sanna

    2011-01-01

    Full Text Available We report a sampling strategy based on Mendelian Breeding Units (MBUs), each representing an interbreeding group of individuals sharing a common gene pool. The identification of MBUs is crucial for case-control experimental design in association studies. The aim of this work was to evaluate the possible existence of bias in terms of genetic variability and haplogroup frequencies in the MBU sample, due to severe sample selection. To reach this goal, the MBU sampling strategy was compared to a standard selection of individuals according to their surname and place of birth. We analysed mitochondrial DNA variation (first hypervariable segment and coding region) in unrelated healthy subjects from two different areas of Sardinia: the area around the town of Cabras and the western Campidano area. No statistically significant differences were observed when the two sampling methods were compared, indicating that the stringent sample selection needed to establish an MBU does not alter the original genetic variability and haplogroup distribution. Therefore, the MBU sampling strategy can be considered a useful tool in association studies of complex traits.

  16. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    Science.gov (United States)

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

    Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality. Copyright © 2016 Elsevier B.V. All rights reserved.
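Of the normalization steps listed, probabilistic quotient normalization (PQN) is the easiest to sketch. The implementation and the toy two-sample data below are my own illustration, not the paper's code: each sample is divided by the median ratio of its features to a reference profile, correcting for overall dilution differences.

```python
import numpy as np

def pqn(X, reference=None):
    """Probabilistic quotient normalization: divide each sample (row) by
    the median quotient of its features against a reference profile,
    which estimates that sample's dilution factor."""
    X = np.asarray(X, float)
    if reference is None:
        reference = np.median(X, axis=0)       # median profile as reference
    quotients = X / reference
    dilution = np.median(quotients, axis=1)    # one factor per sample
    return X / dilution[:, None], dilution

# Two identical metabolic profiles, the second diluted 2-fold
profile = np.array([10.0, 50.0, 3.0, 7.0])
X = np.vstack([profile, 0.5 * profile])
X_norm, d = pqn(X)
print(d)                                   # estimated dilution factors
print(np.allclose(X_norm[0], X_norm[1]))   # True: dilution removed
```

Using the median of the quotients, rather than a total-signal ratio like MSTUS, makes the estimate robust to a few metabolites that genuinely change between samples, which is why PQN is attractive when kidney function alters creatinine and osmolality.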

  17. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Unlike classical random walk sampling, the CSN and NR strategies focus on the influence of the seed-node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdős-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks. The major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are then studied. Similar conclusions can be reached with all three random walk strategies. Firstly, networks with small scales and simple structures are conducive to sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within a limited number of steps. Thirdly, all the degree distributions of the subnets are slightly biased to the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, prominent characteristics such as the larger clustering coefficient and the fluctuation of the degree distribution are reproduced well by these random walk strategies.
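The no-retracing (NR) idea is straightforward to sketch; the toy graph and function below are illustrative assumptions, not the paper's code. The walker never immediately reverses the edge it just traversed unless it is trapped at a dead end.

```python
import random

def nr_random_walk(adj, seed_node, steps, rng=None):
    """No-retracing random walk: at each step, avoid stepping straight
    back along the edge just traversed (unless the node is a dead end).
    Returns the set of sampled nodes."""
    rng = rng or random.Random(0)
    current, previous = seed_node, None
    sampled = {seed_node}
    for _ in range(steps):
        candidates = [v for v in adj[current] if v != previous]
        if not candidates:                  # dead end: allow retracing
            candidates = adj[current]
        previous, current = current, rng.choice(candidates)
        sampled.add(current)
    return sampled

# Toy graph: a 6-cycle with a chord between nodes 0 and 3
adj = {0: [1, 5, 3], 1: [0, 2], 2: [1, 3], 3: [2, 4, 0], 4: [3, 5], 5: [4, 0]}
sub = nr_random_walk(adj, seed_node=0, steps=20)
print(sub)   # sampled node set
```

Forbidding immediate backtracking reduces path overlap, which is the mechanism the abstract credits for the NR strategy's better clustering-coefficient estimates.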

  18. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    Science.gov (United States)

    Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes

    2017-01-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...

  19. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software.

    Science.gov (United States)

    Dols, W Stuart; Persily, Andrew K; Morrow, Jayne B; Matzke, Brett D; Sego, Landon H; Nuffer, Lisa L; Pulsipher, Brent A

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority area zones within the building, and sampling designs and strategies could then be developed based on those zones.

  20. Recommended Immunological Strategies to Screen for Botulinum Neurotoxin-Containing Samples

    Directory of Open Access Journals (Sweden)

    Stéphanie Simon

    2015-11-01

    Full Text Available Botulinum neurotoxins (BoNTs) cause the life-threatening neurological illness botulism in humans and animals and are divided into seven serotypes (BoNT/A–G), of which serotypes A, B, E, and F cause the disease in humans. BoNTs are classified as “category A” bioterrorism threat agents and are relevant in the context of the Biological Weapons Convention. An international proficiency test (PT) was conducted to evaluate detection, quantification and discrimination capabilities of 23 expert laboratories from the health, food and security areas. Here we describe three immunological strategies that proved to be successful for the detection and quantification of BoNT/A, B, and E considering the restricted sample volume (1 mL) distributed. To analyze the samples qualitatively and quantitatively, the first strategy was based on sensitive immunoenzymatic and immunochromatographic assays for fast qualitative and quantitative analyses. In the second approach, a bead-based suspension array was used for screening followed by conventional ELISA for quantification. In the third approach, an ELISA plate format assay was used for serotype specific immunodetection of BoNT-cleaved substrates, detecting the activity of the light chain, rather than the toxin protein. The results provide guidance for further steps in quality assurance and highlight problems to address in the future.

  1. Recommended Immunological Strategies to Screen for Botulinum Neurotoxin-Containing Samples.

    Science.gov (United States)

    Simon, Stéphanie; Fiebig, Uwe; Liu, Yvonne; Tierney, Rob; Dano, Julie; Worbs, Sylvia; Endermann, Tanja; Nevers, Marie-Claire; Volland, Hervé; Sesardic, Dorothea; Dorner, Martin B

    2015-11-26

    Botulinum neurotoxins (BoNTs) cause the life-threatening neurological illness botulism in humans and animals and are divided into seven serotypes (BoNT/A-G), of which serotypes A, B, E, and F cause the disease in humans. BoNTs are classified as "category A" bioterrorism threat agents and are relevant in the context of the Biological Weapons Convention. An international proficiency test (PT) was conducted to evaluate detection, quantification and discrimination capabilities of 23 expert laboratories from the health, food and security areas. Here we describe three immunological strategies that proved to be successful for the detection and quantification of BoNT/A, B, and E considering the restricted sample volume (1 mL) distributed. To analyze the samples qualitatively and quantitatively, the first strategy was based on sensitive immunoenzymatic and immunochromatographic assays for fast qualitative and quantitative analyses. In the second approach, a bead-based suspension array was used for screening followed by conventional ELISA for quantification. In the third approach, an ELISA plate format assay was used for serotype specific immunodetection of BoNT-cleaved substrates, detecting the activity of the light chain, rather than the toxin protein. The results provide guidance for further steps in quality assurance and highlight problems to address in the future.

  2. Sampling strategies in antimicrobial resistance monitoring: evaluating how precision and sensitivity vary with the number of animals sampled per farm.

    Directory of Open Access Journals (Sweden)

    Takehisa Yamamoto

    Full Text Available Because antimicrobial resistance in food-producing animals is a major public health concern, many countries have implemented antimicrobial monitoring systems at a national level. When designing a sampling scheme for antimicrobial resistance monitoring, it is necessary to consider both cost effectiveness and statistical plausibility. In this study, we examined how sampling scheme precision and sensitivity can vary with the number of animals sampled from each farm, while keeping the overall sample size constant to avoid additional sampling costs. Five sampling strategies were investigated. These employed 1, 2, 3, 4 or 6 animal samples per farm, with a total of 12 animals sampled in each strategy. A total of 1,500 Escherichia coli isolates from 300 fattening pigs on 30 farms were tested for resistance against 12 antimicrobials. The performance of each sampling strategy was evaluated by bootstrap resampling from the observational data. In the bootstrapping procedure, farms, animals, and isolates were selected randomly with replacement, and a total of 10,000 replications were conducted. For each antimicrobial, we observed that the standard deviation and 2.5-97.5 percentile interval of resistance prevalence were smallest in the sampling strategy that employed 1 animal per farm. The proportion of bootstrap samples that included at least 1 isolate with resistance was also evaluated as an indicator of the sensitivity of the sampling strategy to previously unidentified antimicrobial resistance. The proportion was greatest with 1 sample per farm and decreased with larger samples per farm. We concluded that when the total number of samples is pre-specified, the most precise and sensitive sampling strategy involves collecting 1 sample per farm.
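The two-stage bootstrap comparison can be sketched as follows. The farm data are synthetic and the function is my own illustration of the resampling idea, not the study's code; the clustering of resistance within farms is deliberately exaggerated to make the effect visible.

```python
import random

def bootstrap_prevalence(farm_isolates, animals_per_farm, farms_needed, reps, rng):
    """Two-stage bootstrap of resistance prevalence: resample farms with
    replacement, then animals within each farm, keeping the total number
    of sampled animals fixed (farms_needed * animals_per_farm).
    Returns the replicate prevalence estimates."""
    farms = list(farm_isolates)
    estimates = []
    for _ in range(reps):
        chosen = []
        for _ in range(farms_needed):
            farm = rng.choice(farms)
            chosen += rng.choices(farm_isolates[farm], k=animals_per_farm)
        estimates.append(sum(chosen) / len(chosen))
    return estimates

# Hypothetical data: 1 = resistant isolate, with strong farm-level clustering
gen = random.Random(42)
farm_isolates = {f: [1 if gen.random() < (0.6 if f < 5 else 0.05) else 0
                     for _ in range(50)] for f in range(30)}

# Strategy A: 12 farms x 1 animal; strategy B: 2 farms x 6 animals (total 12)
a = bootstrap_prevalence(farm_isolates, 1, 12, 2000, random.Random(0))
b = bootstrap_prevalence(farm_isolates, 6, 2, 2000, random.Random(0))
sd = lambda xs: (sum((x - sum(xs) / len(xs)) ** 2 for x in xs) / len(xs)) ** 0.5
print(sd(a) < sd(b))   # spreading samples across farms is more precise
```

Because resistance clusters by farm, the between-farm variance dominates; dividing it by 12 farms instead of 2 is what drives the precision gain the study reports.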

  3. A proposal of optimal sampling design using a modularity strategy

    Science.gov (United States)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

    Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for various management tasks. The planning of pressure observations, in terms of spatial distribution and number, is named sampling design, and it has traditionally been addressed with a view to model calibration. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., in order to manage background leakages, to detect anomalies and bursts, to guarantee service quality, etc. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and to leakage management, has been addressed through optimal network segmentation and the modularity index, using a multiobjective strategy. Optimal network segmentation is the basis for identifying network modules by means of optimal conceptual cuts, which are the candidate locations of the closed gates or flow meters creating the DMAs. Starting from the WDN-oriented modularity index as a metric for WDN segmentation, this paper proposes a new way to perform sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system based mainly on network topology and on weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.

  4. Methodology Series Module 5: Sampling Strategies

    OpenAIRE

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice, i.e., populations that are accessible and available. Some of the non-probabilit...

  5. Effective sampling strategy to detect food and feed contamination

    NARCIS (Netherlands)

    Bouzembrak, Yamine; Fels, van der Ine

    2018-01-01

    Sampling plans for food safety hazards are used to determine whether a lot of food is contaminated (with microbiological or chemical hazards) or not. One of the components of a sampling plan is the sampling strategy. The aim of this study was to compare the performance of three

  6. Females' sampling strategy to comparatively evaluate prospective mates in the peacock blenny Salaria pavo

    Science.gov (United States)

    Locatello, Lisa; Rasotto, Maria B.

    2017-08-01

    Emerging evidence suggests the occurrence of comparative decision-making processes in mate choice, questioning the traditional idea of female choice based on rules of absolute preference. In such a scenario, females are expected to use a typical best-of-n sampling strategy, being able to recall previously sampled males based on memory of their quality and location. Accordingly, the quality of the preferred mate is expected to be unrelated to both the number and the sequence of female visits. We found support for these predictions in the peacock blenny, Salaria pavo, a fish whose females have the opportunity to evaluate the attractiveness of many males in a short time period and within a restricted spatial range. Indeed, even considering the variability in preference among females, most of them returned to previously sampled males for further evaluations; thus, the preferred male did not represent the last one in the sequence of visited males. Moreover, there was no relationship between the attractiveness of the preferred male and the number of further visits assigned to the other males. Our results suggest the occurrence of a best-of-n mate sampling strategy in the peacock blenny.

  7. Novel strategies for sample preparation in forensic toxicology.

    Science.gov (United States)

    Samanidou, Victoria; Kovatsi, Leda; Fragou, Domniki; Rentifis, Konstantinos

    2011-09-01

    This paper provides a review of novel strategies for sample preparation in forensic toxicology. The review initially outlines the principle of each technique, followed by sections addressing each class of abused drugs separately. The novel strategies currently reviewed focus on the preparation of various biological samples for the subsequent determination of opiates, benzodiazepines, amphetamines, cocaine, hallucinogens, tricyclic antidepressants, antipsychotics and cannabinoids. According to our experience, these analytes are the most frequently responsible for intoxications in Greece. The applications of techniques such as disposable pipette extraction, microextraction by packed sorbent, matrix solid-phase dispersion, solid-phase microextraction, polymer monolith microextraction, stir bar sorptive extraction and others, which are rapidly gaining acceptance in the field of toxicology, are currently reviewed.

  8. Limited-sampling strategies for anti-infective agents: systematic review.

    Science.gov (United States)

    Sprague, Denise A; Ensom, Mary H H

    2009-09-01

    Area under the concentration-time curve (AUC) is a pharmacokinetic parameter that represents overall exposure to a drug. For selected anti-infective agents, pharmacokinetic-pharmacodynamic parameters, such as AUC/MIC (where MIC is the minimal inhibitory concentration), have been correlated with outcome in a few studies. A limited-sampling strategy may be used to estimate pharmacokinetic parameters such as AUC, without the frequent, costly, and inconvenient blood sampling that would be required to directly calculate the AUC. To discuss, by means of a systematic review, the strengths, limitations, and clinical implications of published studies involving a limited-sampling strategy for anti-infective agents and to propose improvements in methodology for future studies. The PubMed and EMBASE databases were searched using the terms "anti-infective agents", "limited sampling", "optimal sampling", "sparse sampling", "AUC monitoring", "abbreviated AUC", "abbreviated sampling", and "Bayesian". The reference lists of retrieved articles were searched manually. Included studies were classified according to modified criteria from the US Preventive Services Task Force. Twenty studies met the inclusion criteria. Six of the studies (involving didanosine, zidovudine, nevirapine, ciprofloxacin, efavirenz, and nelfinavir) were classified as providing level I evidence, 4 studies (involving vancomycin, didanosine, lamivudine, and lopinavir-ritonavir) provided level II-1 evidence, 2 studies (involving saquinavir and ceftazidime) provided level II-2 evidence, and 8 studies (involving ciprofloxacin, nelfinavir, vancomycin, ceftazidime, ganciclovir, pyrazinamide, meropenem, and alpha interferon) provided level III evidence. All of the studies providing level I evidence used prospectively collected data and proper validation procedures with separate, randomly selected index and validation groups. 
However, most of the included studies did not provide an adequate description of the methods or
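As an illustration of the limited-sampling idea reviewed above, the sketch below fits a two-timepoint linear model to synthetic concentration-time profiles and checks its bias on a held-out validation group, mirroring the index/validation design the level I studies used. The one-compartment parameters, sampling times, and group sizes are invented for the example and do not correspond to any of the reviewed studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-compartment oral PK profiles (invented parameters).
t = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 12.0, 24.0])  # sampling times (h)

def profile(cl, ka=1.5, v=30.0, dose=500.0):
    ke = cl / v
    return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

cl = rng.uniform(5.0, 20.0, 40)            # per-subject clearance (L/h)
conc = np.array([profile(c) for c in cl])
# "Full" AUC by the linear trapezoidal rule over the rich profile.
auc = ((conc[:, 1:] + conc[:, :-1]) / 2 * np.diff(t)).sum(axis=1)

# Index group (first 20 subjects): regress AUC on the 2 h and 8 h samples.
X = np.column_stack([np.ones(20), conc[:20, 2], conc[:20, 5]])
coef, *_ = np.linalg.lstsq(X, auc[:20], rcond=None)

# Validation group: predict AUC from only those two concentrations.
Xv = np.column_stack([np.ones(20), conc[20:, 2], conc[20:, 5]])
pred = Xv @ coef
mpe = float(np.median((pred - auc[20:]) / auc[20:] * 100))  # bias, %
print(f"median % prediction error: {mpe:.2f}")
```

Two well-chosen samples recover the full AUC closely here because concentration at each timepoint is a smooth function of clearance; real studies must additionally validate the chosen timepoints against clinical sampling constraints.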

  9. Sample design and gamma-ray counting strategy of neutron activation system for triton burnup measurements in KSTAR

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Jungmin [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Cheon, Mun Seong [ITER Korea, National Fusion Research Institute, Daejeon (Korea, Republic of); Chung, Kyoung-Jae, E-mail: jkjlsh1@snu.ac.kr [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Hwang, Y.S. [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of)

    2016-11-01

    Highlights: • Sample design for triton burnup ratio measurement is carried out. • Samples for 14.1 MeV neutron measurements are selected for KSTAR. • Si and Cu are the most suitable materials for d-t neutron measurements. • Appropriate γ-ray counting strategies for each selected sample are established. - Abstract: For the purpose of triton burnup measurements in Korea Superconducting Tokamak Advanced Research (KSTAR) deuterium plasmas, appropriate neutron activation system (NAS) samples for 14.1 MeV d-t neutron measurements have been designed and a gamma-ray counting strategy has been established. Neutronics calculations were performed with the MCNP5 neutron transport code for the KSTAR neutral-beam-heated deuterium plasma discharges. Based on those calculations and the assumed d-t neutron yield, the activities induced by d-t neutrons were estimated with the inventory code FISPACT-2007 for candidate sample materials: Si, Cu, Al, Fe, Nb, Co, Ti, and Ni. It was found that Si, Cu, Al, and Fe are suitable for the KSTAR NAS in terms of the minimum detectable activity (MDA), calculated based on the standard deviation of blank measurements. Considering background gamma-rays radiated from surrounding structures activated by thermalized fusion neutrons, an appropriate gamma-ray counting strategy for each selected sample is established.

  10. Potential-Decomposition Strategy in Markov Chain Monte Carlo Sampling Algorithms

    International Nuclear Information System (INIS)

    Shangguan Danhua; Bao Jingdong

    2010-01-01

    We introduce the potential-decomposition strategy (PDS), which can be used in Markov chain Monte Carlo sampling algorithms. PDS can be designed to make particles move in a modified potential that favors diffusion in phase space; then, by rejecting some trial samples, the target distributions can be sampled in an unbiased manner. Furthermore, if the accepted trial samples are insufficient, they can be recycled as initial states to form more unbiased samples. This strategy can greatly improve efficiency when the original potential has multiple metastable states separated by large barriers. We apply PDS to the 2D Ising model and to a double-well potential model with a large barrier, demonstrating in these two representative examples that convergence is accelerated by orders of magnitude.
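The accept/reject idea behind PDS can be illustrated with a delayed-acceptance-style Metropolis sketch: trial moves are generated with respect to a flattened potential U_mod, and a second rejection step based on the residual ΔU = U − U_mod restores detailed balance with respect to the original target exp(−U). The double-well form, flattened potential, and step sizes below are illustrative choices, not the paper's exact algorithm.

```python
import math
import random

random.seed(0)

def U(x):        # double-well target potential with a barrier at x = 0
    return 2.0 * (x * x - 1.0) ** 2

def U_mod(x):    # flattened potential that eases barrier crossing
    return 0.5 * x * x

def dU(x):       # residual: U = U_mod + dU
    return U(x) - U_mod(x)

def step(x):
    # Inner Metropolis move w.r.t. the flattened potential U_mod ...
    y = x + random.uniform(-1.0, 1.0)
    if random.random() >= math.exp(min(0.0, U_mod(x) - U_mod(y))):
        y = x
    # ... then a second accept/reject with the residual dU restores
    # detailed balance w.r.t. the original target exp(-U).
    if random.random() < math.exp(min(0.0, dU(x) - dU(y))):
        return y
    return x

x, left, n = 0.0, 0, 200_000
for _ in range(n):
    x = step(x)
    left += x < 0
print(left / n)   # both wells visited; for this symmetric target, near 0.5
```

Because the inner kernel is reversible with respect to exp(−U_mod), the second acceptance ratio reduces to exp(−(ΔU(y) − ΔU(x))), so the composite chain samples exp(−U) without bias while crossing the barrier far more often than a plain Metropolis chain would.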

  11. Limited sampling strategy for determining metformin area under the plasma concentration-time curve

    DEFF Research Database (Denmark)

    Santoro, Ana Beatriz; Stage, Tore Bjerregaard; Struchiner, Claudio José

    2016-01-01

    AIM: The aim was to develop and validate limited sampling strategy (LSS) models to predict the area under the plasma concentration-time curve (AUC) for metformin. METHODS: Metformin plasma concentrations (n = 627) at 0-24 h after a single 500 mg dose were used for LSS development, based on all su...

  12. Evaluation of Multiple Linear Regression-Based Limited Sampling Strategies for Enteric-Coated Mycophenolate Sodium in Adult Kidney Transplant Recipients.

    Science.gov (United States)

    Brooks, Emily K; Tett, Susan E; Isbel, Nicole M; McWhinney, Brett; Staatz, Christine E

    2018-04-01

    Although multiple linear regression-based limited sampling strategies (LSSs) have been published for enteric-coated mycophenolate sodium, none have been evaluated for the prediction of subsequent mycophenolic acid (MPA) exposure. This study aimed to examine the predictive performance of the published LSSs for the estimation of future MPA area under the concentration-time curve from 0 to 12 hours (AUC0-12) in renal transplant recipients. Total MPA plasma concentrations were measured in 20 adult renal transplant patients on 2 occasions a week apart. All subjects received concomitant tacrolimus and were approximately 1 month after transplant. Samples were taken at 0, 0.33, 0.5, 1, 1.5, 2, 2.5, 3, 3.5, 4, 6, and 8 hours and 0, 0.25, 0.5, 0.75, 1, 1.25, 1.5, 2, 3, 4, 6, 9, and 12 hours after dose on the first and second sampling occasions, respectively. Predicted MPA AUC0-12 was calculated using 19 published LSSs and data from the first or second sampling occasion for each patient, and compared with the second-occasion full MPA AUC0-12 calculated using the linear trapezoidal rule. Bias (median percentage prediction error) and imprecision (median absolute prediction error) were determined. Accurate prediction of the full MPA AUC0-12 with a multiple linear regression-based LSS was not possible without concentrations up to at least 8 hours after the dose.
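The evaluation metrics used in this study (trapezoidal AUC, and bias/imprecision as median percentage and median absolute percentage prediction errors) are straightforward to compute; the patient values below are hypothetical.

```python
import statistics

def trapezoid_auc(times, concs):
    """AUC by the linear trapezoidal rule."""
    return sum((c1 + c2) / 2 * (t2 - t1)
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

def bias_imprecision(predicted, measured):
    """Median percentage prediction error (bias) and
    median absolute percentage prediction error (imprecision)."""
    pe = [(p - m) / m * 100 for p, m in zip(predicted, measured)]
    return statistics.median(pe), statistics.median(abs(e) for e in pe)

# Hypothetical AUC0-12 values (mg*h/L) for four patients.
measured  = [40.0, 55.0, 35.0, 60.0]
predicted = [42.0, 50.0, 36.0, 66.0]
bias, imprecision = bias_imprecision(predicted, measured)
print(round(bias, 1), round(imprecision, 1))  # → 3.9 7.0
```

Using the median rather than the mean makes both metrics robust to the occasional patient whose LSS prediction is far off, which is why they are the conventional choice in this literature.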

  13. Sample preparation composite and replicate strategy for assay of solid oral drug products.

    Science.gov (United States)

    Harrington, Brent; Nickerson, Beverly; Guo, Michele Xuemei; Barber, Marc; Giamalva, David; Lee, Carlos; Scrivens, Garry

    2014-12-16

    In pharmaceutical analysis, the results of drug product assay testing are used to make decisions regarding the quality, efficacy, and stability of the drug product. In order to make sound risk-based decisions concerning drug product potency, an understanding of the uncertainty of the reportable assay value is required. Utilizing the most restrictive criteria in current regulatory documentation, a maximum variability attributed to method repeatability is defined for a drug product potency assay. A sampling strategy that reduces the repeatability component of the assay variability below this predefined maximum is demonstrated. The sampling strategy consists of determining the number of dosage units (k) to be prepared in a composite sample, of which there may be a number of equivalent replicate (r) sample preparations. The variability, as measured by the standard error (SE), of a potency assay consists of several sources, such as sample preparation and dosage unit variability. A sampling scheme that increases the number of sample preparations (r) and/or the number of dosage units (k) per sample preparation will reduce the assay variability and thus decrease the uncertainty around decisions made concerning the potency of the drug product. A maximum allowable repeatability component of the standard error (SE) for the potency assay is derived using material in current regulatory documents. A table of solutions for the number of dosage units per sample preparation (k) and the number of replicate sample preparations (r) is presented for any ratio of sample preparation and dosage unit variability.
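One common variance model behind such composite/replicate schemes (an assumption here, not a formula quoted from the paper) puts SE² = σ_sp²/r + σ_du²/(r·k): replicate preparations shrink both components, while extra dosage units per composite dilute only the dosage-unit term.

```python
def assay_se(sigma_sp, sigma_du, r, k):
    """Standard error of the reportable assay value when r replicate
    composites are each prepared from k dosage units.
    sigma_sp: sample-preparation SD; sigma_du: dosage-unit SD.
    (Assumed variance model, not quoted from the paper.)"""
    return (sigma_sp ** 2 / r + sigma_du ** 2 / (r * k)) ** 0.5

# Hypothetical SDs (% label claim): more units per composite dilutes the
# dosage-unit term; more replicates shrinks both terms.
base       = assay_se(1.0, 2.0, r=1, k=1)
more_units = assay_se(1.0, 2.0, r=1, k=10)
more_reps  = assay_se(1.0, 2.0, r=3, k=10)
print(round(base, 2), round(more_units, 2), round(more_reps, 2))
```

This is why compositing alone cannot drive the SE below σ_sp: once the dosage-unit term is diluted, only additional replicate preparations reduce the remaining preparation variability.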

  14. Strategies to address participant misrepresentation for eligibility in Web-based research.

    Science.gov (United States)

    Kramer, Jessica; Rubin, Amy; Coster, Wendy; Helmuth, Eric; Hermos, John; Rosenbloom, David; Moed, Rich; Dooley, Meghan; Kao, Ying-Chia; Liljenquist, Kendra; Brief, Deborah; Enggasser, Justin; Keane, Terence; Roy, Monica; Lachowicz, Mark

    2014-03-01

    Emerging methodological research suggests that the World Wide Web ("Web") is an appropriate venue for survey data collection, and a promising area for delivering behavioral intervention. However, the use of the Web for research raises concerns regarding sample validity, particularly when the Web is used for recruitment and enrollment. The purpose of this paper is to describe the challenges experienced in two different Web-based studies in which participant misrepresentation threatened sample validity: a survey study and an online intervention study. The lessons learned from these experiences generated three types of strategies researchers can use to reduce the likelihood of participant misrepresentation for eligibility in Web-based research. Examples of procedural/design strategies, technical/software strategies and data analytic strategies are provided along with the methodological strengths and limitations of specific strategies. The discussion includes a series of considerations to guide researchers in the selection of strategies that may be most appropriate given the aims, resources and target population of their studies. Copyright © 2014 John Wiley & Sons, Ltd.

  15. [Strategies for biobank networks. Classification of different approaches for locating samples and an outlook on the future within the BBMRI-ERIC].

    Science.gov (United States)

    Lablans, Martin; Kadioglu, Dennis; Mate, Sebastian; Leb, Ines; Prokosch, Hans-Ulrich; Ückert, Frank

    2016-03-01

    Medical research projects often require more biological material than can be supplied by a single biobank. For this reason, a multitude of strategies support locating potential research partners with matching material without requiring centralization of sample storage. Classification of different strategies for biobank networks, in particular for locating suitable samples. Description of an IT infrastructure combining these strategies. Existing strategies can be classified according to three criteria: (a) granularity of sample data: coarse bank-level data (catalogue) vs. fine-granular sample-level data, (b) location of sample data: central (central search service) vs. decentral storage (federated search services), and (c) level of automation: automatic (query-based, federated search service) vs. semi-automatic (inquiry-based, decentral search). All mentioned search services require data integration. Metadata help to overcome semantic heterogeneity. The "Common Service IT" in BBMRI-ERIC (Biobanking and BioMolecular Resources Research Infrastructure) unites a catalogue, the decentral search and metadata in an integrated platform. As a result, researchers receive versatile tools to search suitable biomaterial, while biobanks retain a high degree of data sovereignty. Despite their differences, the presented strategies for biobank networks do not rule each other out but can complement and even benefit from each other.

  16. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice, i.e., populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
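The two probability sampling methods named above can be contrasted in a few lines; the patient frame and strata below are invented for illustration.

```python
import random
random.seed(7)

# Hypothetical sampling frame of 12 patients, each with a clinic site.
frame = [("P%02d" % i, site) for i, site in enumerate(
    ["urban"] * 8 + ["rural"] * 4)]

# Simple random sample: every patient has the same chance of selection,
# but a small stratum may end up under- or unrepresented by chance.
srs = random.sample(frame, 6)

# Stratified random sample: sample within each site proportionally
# (here 4 urban + 2 rural), guaranteeing both strata are represented.
urban = [p for p in frame if p[1] == "urban"]
rural = [p for p in frame if p[1] == "rural"]
stratified = random.sample(urban, 4) + random.sample(rural, 2)

print(len(srs), sum(p[1] == "rural" for p in stratified))  # → 6 2
```

A convenience sample, by contrast, would simply take the first six patients who present at the clinic, which is why it must not be reported as a 'random sample'.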

  17. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice, i.e., populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  18. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice, i.e., populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study. PMID:27688438

  19. Comparison of sampling strategies for tobacco retailer inspections to maximize coverage in vulnerable areas and minimize cost.

    Science.gov (United States)

    Lee, Joseph G L; Shook-Sa, Bonnie E; Bowling, J Michael; Ribisl, Kurt M

    2017-06-23

    In the United States, tens of thousands of inspections of tobacco retailers are conducted each year. Various sampling choices can reduce travel costs, emphasize enforcement in areas with greater non-compliance, and allow for comparability between states and over time. We sought to develop a model sampling strategy for state tobacco retailer inspections. Using a 2014 list of 10,161 North Carolina tobacco retailers, we compared results from simple random sampling; stratified sampling clustered at the ZIP-code level; and stratified sampling clustered at the census-tract level. We conducted a simulation of repeated sampling and compared the approaches on their level of statistical precision, coverage, and retailer dispersion. While maintaining an adequate design effect and statistical precision appropriate for a public health enforcement program, both the stratified, clustered ZIP- and tract-based approaches were feasible. Both ZIP and tract strategies yielded improvements over simple random sampling, with relative improvements, respectively, in average distance between retailers (reduced 5.0% and 1.9%), percent Black residents in sampled neighborhoods (increased 17.2% and 32.6%), percent Hispanic residents in sampled neighborhoods (reduced 2.2% and increased 18.3%), percentage of sampled retailers located near schools (increased 61.3% and 37.5%), and poverty rate in sampled neighborhoods (increased 14.0% and 38.2%). States can make retailer inspections more efficient and targeted with stratified, clustered sampling. Use of statistically appropriate sampling strategies like these should be considered by states, researchers, and the Food and Drug Administration to improve program impact and allow for comparisons over time and across states. The authors present a model tobacco retailer sampling strategy for promoting compliance and reducing costs that could be used by U.S. states and the Food and Drug Administration (FDA). The design is feasible to implement in North Carolina. Use of
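A two-stage stratified, clustered draw of the kind compared above can be sketched as follows; the retailer list, strata, and stage sizes are hypothetical, not the North Carolina data.

```python
import random
random.seed(3)

# Hypothetical retailer list: (retailer_id, stratum, zip_code).
retailers = [(i, "high_poverty" if i % 3 == 0 else "other",
              "27%03d" % (i % 40)) for i in range(1000)]

def stratified_cluster_sample(retailers, clusters_per_stratum=5,
                              retailers_per_cluster=4):
    """Stage 1: draw ZIP-code clusters within each stratum.
    Stage 2: draw retailers within each sampled cluster."""
    sample = []
    strata = {s for _, s, _ in retailers}
    for stratum in strata:
        zips = sorted({z for _, s, z in retailers if s == stratum})
        for z in random.sample(zips, clusters_per_stratum):
            in_cluster = [r for r in retailers
                          if r[1] == stratum and r[2] == z]
            take = min(retailers_per_cluster, len(in_cluster))
            sample.extend(random.sample(in_cluster, take))
    return sample

inspections = stratified_cluster_sample(retailers)
print(len(inspections))  # inspections grouped by ZIP cut travel costs
```

Stratification guarantees coverage of the priority stratum, while clustering keeps each inspector's visits geographically close; the cost is a design effect that the simulation study above quantifies.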

  20. Mars Sample Return - Launch and Detection Strategies for Orbital Rendezvous

    Science.gov (United States)

    Woolley, Ryan C.; Mattingly, Richard L.; Riedel, Joseph E.; Sturm, Erick J.

    2011-01-01

    This study sets forth conceptual mission design strategies for the ascent and rendezvous phase of the proposed NASA/ESA joint Mars Sample Return Campaign. The current notional mission architecture calls for the launch of an acquisition/cache rover in 2018, an orbiter with an Earth return vehicle in 2022, and a fetch rover and ascent vehicle in 2024. Strategies are presented to launch the sample into a coplanar orbit with the Orbiter which facilitate robust optical detection, orbit determination, and rendezvous. Repeating ground track orbits exist at 457 and 572 km which provide multiple launch opportunities with similar geometries for detection and rendezvous.

  1. Mars Sample Return: Launch and Detection Strategies for Orbital Rendezvous

    Science.gov (United States)

    Woolley, Ryan C.; Mattingly, Richard L.; Riedel, Joseph E.; Sturm, Erick J.

    2011-01-01

    This study sets forth conceptual mission design strategies for the ascent and rendezvous phase of the proposed NASA/ESA joint Mars Sample Return Campaign. The current notional mission architecture calls for the launch of an acquisition/caching rover in 2018, an Earth return orbiter in 2022, and a fetch rover with ascent vehicle in 2024. Strategies are presented to launch the sample into a nearly coplanar orbit with the Orbiter which would facilitate robust optical detection, orbit determination, and rendezvous. Repeating ground track orbits exist at 457 and 572 km which would provide multiple launch opportunities with similar geometries for detection and rendezvous.

  2. Do Culture-based Segments Predict Selection of Market Strategy?

    Directory of Open Access Journals (Sweden)

    Veronika Jadczaková

    2015-01-01

    Full Text Available Academics and practitioners have already acknowledged the importance of unobservable segmentation bases (such as psychographics), yet they still focus on how well these bases are capable of describing relevant segments (the identifiability criterion) rather than on how precisely these segments can predict (the predictability criterion). Therefore, this paper intends to add to the debate on this topic by exploring whether culture-based segments account for the selection of market strategy. To do so, a set of market strategy variables over a sample of 251 manufacturing firms was first regressed on a set of 19 cultural variables using canonical correlation analysis. Having found a significant relationship in the first canonical function, it was further examined by means of correspondence analysis which cultural segments – if any – are linked to which market strategies. However, as the correspondence analysis failed to find a significant relationship, it may be concluded that business culture might relate to the adoption of market strategy but not to the cultural groupings presented in the paper.

  3. An Optimal Sample Data Usage Strategy to Minimize Overfitting and Underfitting Effects in Regression Tree Models Based on Remotely-Sensed Data

    Directory of Open Access Journals (Sweden)

    Yingxin Gu

    2016-11-01

    Full Text Available Regression tree models have been widely used for remote sensing-based ecosystem mapping. Improper use of the sample data (model training and testing data) may cause overfitting and underfitting effects in the model. The goal of this study is to develop an optimal sample data usage strategy for any dataset and identify an appropriate number of rules in the regression tree model that will improve its accuracy and robustness. Landsat 8 data and Moderate Resolution Imaging Spectroradiometer-scaled Normalized Difference Vegetation Index (NDVI) were used to develop the regression tree models. A Python procedure was designed to generate random replications of model parameter options across a range of model-development data sizes and rule number constraints. The mean absolute difference (MAD) between the predicted and actual NDVI (scaled NDVI, values from 0–200), and its variability across the different randomized replications, were calculated to assess the accuracy and stability of the models. In our case study, a six-rule regression tree model developed from 80% of the sample data had the lowest MAD (MADtraining = 2.5 and MADtesting = 2.4), and was suggested as the optimal model. This study demonstrates how the training data and rule number selections impact model accuracy and provides important guidance for future remote sensing-based ecosystem modeling.
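The overfitting/underfitting trade-off the study measures with MAD can be reproduced on synthetic data. The sketch below substitutes polynomial degree for the regression tree's rule count (an illustrative simplification, not the paper's procedure) and uses the same 80/20 split and MAD criterion; the response curve and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical NDVI-like response (scaled 0-200) with noise.
x = np.linspace(0, 1, 120)
y = 100 + 60 * np.sin(2 * np.pi * x) + rng.normal(0, 5, x.size)

# 80/20 split into model-development (training) and test data.
idx = rng.permutation(x.size)
train, test = idx[:96], idx[96:]

def mads(degree):
    """Fit a polynomial of the given complexity and return the
    mean absolute difference (MAD) on the training and test sets.
    (Polynomial degree stands in for the tree's rule count.)"""
    coef = np.polyfit(x[train], y[train], degree)
    err = lambda i: float(np.mean(np.abs(np.polyval(coef, x[i]) - y[i])))
    return err(train), err(test)

under_tr, under_te = mads(1)    # too simple: underfits both sets
good_tr, good_te = mads(5)      # adequate complexity
over_tr, over_te = mads(20)     # too complex: starts fitting noise
print(round(under_te, 1), round(good_te, 1))
```

As in the study, the optimum is the complexity whose test-set MAD bottoms out: the underfit model is poor on both sets, while the overfit model keeps improving on training data without a matching gain on held-out data.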

  4. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    Science.gov (United States)

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To conduct a survey to estimate the distribution of mammographic breast density in Korean women, appropriate sampling strategies for representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times using stratified random sampling to investigate the distribution of breast density of 1,340,362 women. According to the simulation results, using a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, we estimated the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.
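A repeated-survey simulation of the kind used to verify the stratified design can be sketched as follows; the stratum sizes and density proportions below are hypothetical, not NCSP figures.

```python
import random
random.seed(11)

# Hypothetical population: per-stratum sizes and probability of a
# "dense breast" classification (invented, not the study's figures).
strata = {"metropolitan": (800_000, 0.55),
          "urban":        (400_000, 0.50),
          "rural":        (140_000, 0.45)}
N = sum(n for n, _ in strata.values())
true_p = sum(n * p for n, p in strata.values()) / N

def survey(total_n=4000):
    """One stratified random sample with proportional allocation."""
    est = 0.0
    for n, p in strata.values():
        n_h = round(total_n * n / N)          # proportional allocation
        hits = sum(random.random() < p for _ in range(n_h))
        est += (n / N) * (hits / n_h)         # stratum-weighted estimate
    return est

# Repeat the survey to check the design's precision, as in the paper.
reps = [survey() for _ in range(200)]
err = max(abs(e - true_p) for e in reps)
print(f"worst absolute error over 200 simulated surveys: {err:.4f}")
```

With a total sample of 4,000 the stratum-weighted estimate stays tightly around the true population proportion across replications, which is the kind of evidence the simulation study used to justify its sample size.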

  5. Relative effectiveness of context-based teaching strategy on senior ...

    African Journals Online (AJOL)

    This study adopted the quasi experimental research design to examine the relative effectiveness of context-based teaching strategy on senior secondary school students' achievements in inorganic chemistry. The sample consists of 451 SSII chemistry students (224 males and 227 females) drawn from four out of 46 ...

  6. Sample preparation composite and replicate strategy case studies for assay of solid oral drug products.

    Science.gov (United States)

    Nickerson, Beverly; Harrington, Brent; Li, Fasheng; Guo, Michele Xuemei

    2017-11-30

    Drug product assay is one of several tests required for new drug products to ensure the quality of the product at release and throughout the life cycle of the product. Drug product assay testing is typically performed by preparing a composite sample of multiple dosage units to obtain an assay value representative of the batch. In some cases replicate composite samples may be prepared and the reportable assay value is the average value of all the replicates. In previously published work by Harrington et al. (2014) [5], a sample preparation composite and replicate strategy for assay was developed to provide a systematic approach which accounts for variability due to the analytical method and dosage form with a standard error of the potency assay criteria based on compendia and regulatory requirements. In this work, this sample preparation composite and replicate strategy for assay is applied to several case studies to demonstrate the utility of this approach and its application at various stages of pharmaceutical drug product development. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Assessment of sampling strategies for estimation of site mean concentrations of stormwater pollutants.

    Science.gov (United States)

    McCarthy, David T; Zhang, Kefeng; Westerlund, Camilla; Viklander, Maria; Bertrand-Krajewski, Jean-Luc; Fletcher, Tim D; Deletic, Ana

    2018-02-01

    The estimation of stormwater pollutant concentrations is a primary requirement of integrated urban water management. In order to determine effective sampling strategies for estimating pollutant concentrations, data from extensive field measurements at seven different catchments were used. At all sites, 1-min resolution continuous flow measurements, as well as flow-weighted samples, were taken and analysed for total suspended solids (TSS), total nitrogen (TN) and Escherichia coli (E. coli). For each of these parameters, the data were used to calculate the Event Mean Concentration (EMC) for each event. The measured Site Mean Concentrations (SMCs) were taken as the volume-weighted average of these EMCs for each parameter, at each site. 17 different sampling strategies, including random and fixed strategies, were tested to estimate SMCs, which were compared with the measured SMCs. The ratios of estimated/measured SMCs were further analysed to determine the most effective sampling strategies. Results indicate that the random sampling strategies were the most promising method for reproducing SMCs for TSS and TN, while some fixed sampling strategies were better for estimating the SMC of E. coli. The differences between taking one, two or three random samples were small (up to 20% for TSS, and 10% for TN and E. coli), indicating that there is little benefit in investing in collection of more than one sample per event if attempting to estimate the SMC through monitoring of multiple events. It was estimated that an average of 27 events across the studied catchments are needed for characterising SMCs of TSS with a 90% confidence interval (CI) width of 1.0, followed by E. coli (average 12 events) and TN (average 11 events). The coefficient of variation of pollutant concentrations was linearly and significantly correlated to the 90% confidence interval ratio of the estimated/measured SMCs (R2 = 0.49; P sampling frequency needed to accurately estimate SMCs of pollutants.
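The EMC and SMC definitions above reduce to volume-weighted means; a minimal sketch with invented sample values:

```python
# Event Mean Concentration (EMC) and Site Mean Concentration (SMC)
# from flow-weighted samples; illustrative numbers, not the study's data.
def emc(concs, volumes):
    """Volume-weighted mean concentration for one event."""
    return sum(c * v for c, v in zip(concs, volumes)) / sum(volumes)

def smc(emcs, event_volumes):
    """Volume-weighted average of EMCs across monitored events."""
    return (sum(e * v for e, v in zip(emcs, event_volumes))
            / sum(event_volumes))

# Two storm events with TSS samples (mg/L) and sampled runoff volumes (m3).
e1 = emc([120, 80, 40], [50, 100, 150])
e2 = emc([200, 150], [30, 70])
site_mean = smc([e1, e2], [300, 100])
print(round(e1), round(e2), round(site_mean))  # → 67 165 91
```

Volume weighting at both levels is what makes the SMC load-representative: a large, dilute event and a small, concentrated event contribute in proportion to the pollutant mass they actually carry.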

  8. Structure-based feeding strategies: A key component of child nutrition.

    Science.gov (United States)

    Taylor, Maija B; Emley, Elizabeth; Pratt, Mercedes; Musher-Eizenman, Dara R

    2017-07-01

    This study examined the relationship between structure, autonomy promotion, and control feeding strategies and parent-reported child diet. Participants (N = 497) were parents of children ages 2.5 to 7.5 years recruited from Amazon Mechanical Turk. The sample was predominantly Caucasian (79%) and educated (61% college graduates), with most reports from mothers (76%). Data were collected via an online survey that included measures of parent feeding strategies and child dietary intake. Use of structure-based feeding strategies explained 21% of the variance in child consumption of added sugar, 12% of the variance in child intake of added sugar from sugar-sweetened beverages, and 16% of the variance in child consumption of fruits and vegetables. Higher unhealthy food availability and permissive feeding uniquely predicted higher child added sugar intake and child consumption of added sugar from sugar-sweetened beverages. Greater healthy food availability uniquely predicted higher child fruit and vegetable intake. Conclusions and future directions: In Caucasian, educated families, structure-based feeding strategies appear to be a relatively stronger correlate of parent-reported child intake of added sugar and fruits and vegetables than autonomy promotion and control feeding strategies. Longitudinal research may be needed to reveal the relationships between autonomy promotion and control feeding strategies and child diet. If future studies report similar findings, researchers may want to focus more heavily on investigating the impact of teaching parents stimulus-control techniques and feeding-related assertiveness skills on child dietary intake. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Limited sampling strategy models for estimating the AUC of gliclazide in Chinese healthy volunteers.

    Science.gov (United States)

    Huang, Ji-Han; Wang, Kun; Huang, Xiao-Hui; He, Ying-Chun; Li, Lu-Jin; Sheng, Yu-Cheng; Yang, Juan; Zheng, Qing-Shan

    2013-06-01

    The aim of this work was to reduce the sampling cost required for estimating the area under the gliclazide plasma concentration versus time curve within 60 h (AUC0-60t). Limited sampling strategy (LSS) models were established and validated using multiple regression models with four or fewer gliclazide concentration values. Absolute prediction error (APE), root mean square error (RMSE) and visual predictive check were used as criteria. The results of Jack-Knife validation showed that 10 (25.0%) of the 40 LSS models based on the regression analysis were not within an APE of 15% using one concentration-time point. 90.2, 91.5 and 92.4% of the 40 LSS models were capable of prediction using 2, 3 and 4 points, respectively. Limited sampling strategies were thus developed and validated for estimating AUC0-60t of gliclazide. This study indicates that the implementation of an 80 mg dosage regimen enabled accurate predictions of AUC0-60t by the LSS model, and that 12, 6, 4 and 2 h after administration are the key sampling times. The combinations (12, 2 h), (12, 8, 2 h) or (12, 8, 4, 2 h) can be chosen as sampling hours for predicting AUC0-60t in practice, according to requirements.
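A limited sampling strategy of this kind is an ordinary multiple regression of observed AUC on a few timed concentrations. A sketch with hypothetical training data (the study's concentrations and regression coefficients are not reproduced here):

```python
import numpy as np

# Limited sampling strategy (LSS): estimate AUC0-60t from a few timed
# concentrations via multiple linear regression, as in the abstract.
# The training values below are hypothetical, not the study's data.

# Concentrations at 2 h and 12 h post-dose (mg/L) for five subjects:
conc = np.array([[2.1, 1.0],
                 [3.0, 1.6],
                 [1.5, 0.7],
                 [2.6, 1.2],
                 [3.4, 1.9]])
auc = np.array([40.0, 58.0, 29.0, 48.0, 66.0])  # observed AUC0-60t

# Ordinary least squares with an intercept term:
X = np.column_stack([np.ones(len(auc)), conc])
coef, *_ = np.linalg.lstsq(X, auc, rcond=None)

def predict_auc(c2, c12):
    """AUC estimate from a (12, 2 h) limited sampling model."""
    return coef[0] + coef[1] * c2 + coef[2] * c12

pred = predict_auc(2.8, 1.4)  # a new subject's two samples
```

Validation would then compare such predictions against richly sampled AUCs using APE and RMSE, as the abstract describes.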

  10. Towards an optimal sampling strategy for assessing genetic variation within and among white clover (Trifolium repens L.) cultivars using AFLP

    Directory of Open Access Journals (Sweden)

    Khosro Mehdi Khanlou

    2011-01-01

    Cost reduction in plant breeding and conservation programs depends largely on correctly defining the minimal sample size required for the trustworthy assessment of intra- and inter-cultivar genetic variation. White clover, an important pasture legume, was chosen for studying this aspect. In clonal plants such as white clover, an appropriate sampling scheme eliminates the redundant analysis of identical genotypes. The aim was to define an optimal sampling strategy, i.e., the minimum sample size and appropriate sampling scheme for white clover cultivars, by using AFLP data (283 loci) from three popular types. A grid-based sampling scheme, with an interplant distance of at least 40 cm, was sufficient to avoid any excess of replicates. Simulations revealed that the number of samples substantially influenced genetic diversity parameters. When using fewer than 15 samples per cultivar, the expected heterozygosity (He) and Shannon diversity index (I) were greatly underestimated, whereas with 20 samples, more than 95% of total intra-cultivar genetic variation was covered. Based on AMOVA, a sample of 20 plants per cultivar was apparently sufficient to accurately quantify genetic structuring. The recommended sampling strategy facilitates the efficient characterization of diversity in white clover, for both conservation and exploitation.

  11. Sampling strategies for indoor radon investigations

    International Nuclear Information System (INIS)

    Prichard, H.M.

    1983-01-01

    Recent investigations prompted by concern about the environmental effects of residential energy conservation have produced many accounts of indoor radon concentrations far above background levels. In many instances, time-normalized annual exposures exceeded the 4 WLM per year standard currently used for uranium mining. Further investigations of indoor radon exposures are necessary to judge the extent of the problem and to estimate the practicality of health effects studies. A number of trends can be discerned as more indoor surveys are reported. It is becoming increasingly clear that local geological factors play a major, if not dominant, role in determining the distribution of indoor radon concentrations in a given area. Within a given locale, indoor radon concentrations tend to be log-normally distributed, and sample means differ markedly from one region to another. The appreciation of geological factors and the general log-normality of radon distributions will improve the accuracy of population dose estimates and facilitate the design of preliminary health effects studies. The relative merits of grab samples, short- and long-term integrated samples, and more complicated dose assessment strategies are discussed in the context of several types of epidemiological investigations. A new passive radon sampler with a 24-hour integration time is described and evaluated as a tool for pilot investigations.
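Because indoor radon concentrations in a locale tend to be log-normal, population dose estimates are usually built from the geometric mean and geometric standard deviation of a survey. A sketch of that arithmetic (the survey values here are hypothetical):

```python
import math

# For a log-normal distribution, the arithmetic (population) mean
# follows from the geometric mean (GM) and geometric standard
# deviation (GSD): mean = GM * exp((ln GSD)^2 / 2).
# The survey values below are hypothetical.

def lognormal_arithmetic_mean(gm, gsd):
    return gm * math.exp(0.5 * math.log(gsd) ** 2)

gm, gsd = 1.2, 2.5  # hypothetical GM (pCi/L) and GSD of a local survey
print(round(lognormal_arithmetic_mean(gm, gsd), 2))  # → 1.83
```

The skew of the log-normal means the arithmetic mean, and hence the population dose, sits well above the geometric mean when the GSD is large.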

  12. Contingency inferences driven by base rates: Valid by sampling

    Directory of Open Access Journals (Sweden)

    Florian Kutzner

    2011-04-01

    Fiedler et al. (2009) reviewed evidence for the utilization of a contingency inference strategy termed pseudocontingencies (PCs). In PCs, the more frequent levels (and, by implication, the less frequent levels) are assumed to be associated. PCs have been obtained using a wide range of task settings and dependent measures. Yet, the readiness with which decision makers rely on PCs is poorly understood. A computer simulation explored two potential sources of subjective validity of PCs. First, PCs are shown to perform above chance level when the task is to infer the sign of moderate to strong population contingencies from a sample of observations. Second, contingency inferences based on PCs and inferences based on cell frequencies are shown to partially agree across samples. Intriguingly, this criterion and convergent validity are by-products of random sampling error, highlighting the inductive nature of contingency inferences.

  13. Sampling strategies for tropical forest nutrient cycling studies: a case study in São Paulo, Brazil

    Directory of Open Access Journals (Sweden)

    G. Sparovek

    1997-12-01

    The precise sampling of soil, biological or microclimatic attributes in tropical forests, which are characterized by a high diversity of species and complex spatial variability, is a difficult task. We found few basic studies to guide sampling procedures. The objective of this study was to define a sampling strategy and data analysis for some parameters frequently used in nutrient cycling studies, i.e., litter amount, total nutrient amounts in litter and its composition (Ca, Mg, K, N and P), and soil attributes at three depths (organic matter, P content, cation exchange capacity and base saturation). A natural remnant forest in the west of São Paulo State (Brazil) was selected as the study area and samples were collected in July 1989. The total amount of litter and its total nutrient amounts had a high spatially independent variance. Conversely, the variance of litter composition was lower and the spatial dependency was peculiar to each nutrient. The sampling strategy for estimating litter amounts and the amounts of nutrients in litter should therefore differ from the sampling strategy for nutrient composition. For the estimation of litter amounts and the amounts of nutrients in litter (related to quantity), a large number of randomly distributed determinations are needed. Conversely, for the estimation of litter nutrient composition (related to quality), a smaller number of spatially located samples should be analyzed. The appropriate sampling scheme for soil attributes differed with depth. Overall, surface samples (0-5 cm) showed high short-distance spatially dependent variance, whereas subsurface samples exhibited spatial dependency over longer distances. Short transects with a sampling interval of 5-10 m are recommended for surface sampling. Subsurface samples must also be spatially located, but with transects or grids with longer distances between sampling points over the entire area. Composite soil samples would not provide a complete

  14. Influence of Pre-question and genre-based instructional strategies on reading

    Directory of Open Access Journals (Sweden)

    Titi J. Fola-Adebayo

    2014-12-01

    This study investigated the influence of pre-question and genre-based instructional strategies on science undergraduates' achievement in, and attitude to, reading. Using purposive sampling, two specialised universities in Nigeria were selected, and stratified sampling was employed in assigning students to research groups based on gender and performance in a verbal ability test. Two hundred and eighty-five students participated in the study. A pre-post randomised block experimental design was used with three experimental groups and one control group. The experimental procedures, involving pre-question, genre-based instruction and a combination of pre-question and genre-based instructional strategies, were used for the experimental groups for four weeks, whilst the control group received normal teacher input. Data were collected through a Reading Comprehension Achievement Test and a Students' Attitude Questionnaire. Qualitative data, obtained from videotapes of classroom interactions, were subjected to conversation and interaction analyses, and quantitative data were analysed with Analysis of Covariance (ANCOVA). The results indicate that although there was no significant main effect of instructional strategy on students' achievement in reading comprehension, there was a significant main effect of instructional strategy on students' attitude to reading (F(3, 231) = 30.9; p < .05). Findings from the qualitative enquiry revealed that female students were more voluble and assertive in their responses, probably because of the need to resist male domination, whilst male students used discourse strategies to affirm their authority. The study indicated that the combination of pre-question and genre-based approaches was the most effective in enhancing students' attitude to reading. Reading is one of the most useful of the Language Arts skills, which learners need for academic reasons and for lifelong learning. The globalised world demands that the second language

  15. A belief-based evolutionarily stable strategy.

    Science.gov (United States)

    Deng, Xinyang; Wang, Zhen; Liu, Qi; Deng, Yong; Mahadevan, Sankaran

    2014-11-21

    As an equilibrium refinement of the Nash equilibrium, the evolutionarily stable strategy (ESS) is a key concept in evolutionary game theory and has attracted growing interest. An ESS can be either a pure strategy or a mixed strategy. Even though randomness is allowed in a mixed strategy, the selection probability of a pure strategy within a mixed strategy may fluctuate due to the impact of many factors, and this fluctuation leads to more uncertainty. In this paper, such uncertainty in mixed strategies is taken into consideration: a belief strategy is proposed in terms of Dempster-Shafer evidence theory. Furthermore, based on the proposed belief strategy, a belief-based ESS is developed. The belief strategy and belief-based ESS reduce to the mixed strategy and mixed ESS as special cases, providing more realistic and powerful tools to describe interactions among agents. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Sampling strategy to develop a primary core collection of apple ...

    African Journals Online (AJOL)

    PRECIOUS

    2010-01-11

    Jan 11, 2010 ... Physiology and Molecular Biology for Fruit, Tree, Beijing 100193, China. ... analyzed on genetic diversity to ensure their represen- .... strategy, cluster and random sampling. .... on isozyme data―A simulation study, Theor.

  17. Observer-Based Stabilization of Spacecraft Rendezvous with Variable Sampling and Sensor Nonlinearity

    Directory of Open Access Journals (Sweden)

    Zhuoshi Li

    2013-01-01

    This paper addresses the observer-based control problem of spacecraft rendezvous with a nonuniform sampling period. The relative dynamic model is based on the classical Clohessy-Wiltshire equation, and sensor nonlinearity and sampling are considered together in a unified framework. The purpose of this paper is to perform an observer-based controller synthesis using sampled and saturated output measurements, such that the resulting closed-loop system is exponentially stable. A time-dependent Lyapunov functional is developed which depends on time and on the upper bound of the sampling period, and which does not grow along the input update times. The controller design problem is solved in terms of the linear matrix inequality method, and the obtained results are less conservative than those obtained using traditional Lyapunov functionals. Finally, a numerical simulation example is built to show the validity of the developed sampled-data control strategy.

  18. Effect of different bleaching strategies on microhardness of a silorane-based composite resin.

    Science.gov (United States)

    Bahari, Mahmoud; Savadi Oskoee, Siavash; Mohammadi, Narmin; Ebrahimi Chaharom, Mohammad Esmaeel; Godrati, Mostafa; Savadi Oskoee, Ayda

    2016-01-01

    Background. Dentists' awareness of the effects of bleaching agents on the surface and mechanical properties of restorative materials is of utmost importance. Therefore, this in vitro study investigated the effects of different bleaching strategies on the microhardness of a silorane-based composite resin. Methods. Eighty samples of a silorane-based composite resin (4 mm in diameter and 2 mm in thickness) were prepared within acrylic molds. The samples were polished and randomly assigned to 4 groups (n=20). Group 1 (controls) was stored in distilled water for 2 weeks. The samples in group 2 underwent a bleaching procedure with 15% carbamide peroxide for two hours daily for two weeks. The samples in group 3 were bleached with 35% hydrogen peroxide twice, 5 days apart, for 30 minutes each time. The samples in group 4 underwent a bleaching procedure with light-activated 35% hydrogen peroxide under LED light once for 40 minutes. The microhardness of the samples was then determined using the Vickers method. Data were analyzed with one-way ANOVA and post hoc Tukey tests. Results. All bleaching agents significantly decreased microhardness compared to the control group (P < 0.05). Conclusion. Bleaching agents decreased the microhardness of silorane-based composite resin restorations, the magnitude of the decrease depending on the bleaching strategy used.

  19. Optimal sampling strategy for data mining

    International Nuclear Information System (INIS)

    Ghaffar, A.; Shahbaz, M.; Mahmood, W.

    2013-01-01

    Modern technologies such as the Internet, corporate intranets, data warehouses, ERP systems, satellites, digital sensors, embedded systems and mobile networks are generating such a massive amount of data that it is getting very difficult to analyze and understand it all, even using data mining tools. Huge datasets are becoming a difficult challenge for classification algorithms. With increasing amounts of data, data mining algorithms are getting slower and analysis is getting less interactive. Sampling can be a solution: using a fraction of the computing resources, sampling can often provide the same level of accuracy. The process of sampling requires much care because many factors are involved in the determination of the correct sample size. The approach proposed in this paper tries to find a solution to this problem. Based on a statistical formula, after setting some parameters, it returns a sample size, called the "sufficient sample size", which is then selected through probability sampling. Results indicate the usefulness of this technique in coping with the problem of huge datasets. (author)
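The abstract does not state which statistical formula it uses; a common choice for this kind of sample-size determination is Cochran's formula with a finite population correction, sketched here as an assumption rather than the paper's actual method:

```python
import math
import random

# Cochran's sample-size formula with a finite population correction,
# used here as an assumed stand-in for the paper's unstated
# "sufficient sample size" formula.

def sufficient_sample_size(population, z=1.96, p=0.5, e=0.05):
    """z: confidence coefficient, p: expected proportion, e: margin of error."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)   # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))

n = sufficient_sample_size(population=1_000_000)
print(n)  # → 385

# The records themselves are then chosen by probability sampling:
random.seed(0)
sample_ids = random.sample(range(1_000_000), n)
```

The formula's key property for data mining is that the required sample size plateaus as the dataset grows, so even very large datasets need only a few hundred records at this confidence level.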

  20. Population pharmacokinetic analysis of clopidogrel in healthy Jordanian subjects with emphasis on optimal sampling strategy.

    Science.gov (United States)

    Yousef, A M; Melhem, M; Xue, B; Arafat, T; Reynolds, D K; Van Wart, S A

    2013-05-01

    Clopidogrel is metabolized primarily into an inactive carboxyl metabolite (clopidogrel-IM) or, to a lesser extent, an active thiol metabolite. A population pharmacokinetic (PK) model was developed using NONMEM(®) to describe the time course of clopidogrel-IM in plasma and to design a sparse-sampling strategy to predict clopidogrel-IM exposures for use in characterizing anti-platelet activity. Serial blood samples from 76 healthy Jordanian subjects administered a single 75 mg oral dose of clopidogrel were collected and assayed for clopidogrel-IM using reverse-phase high performance liquid chromatography. A two-compartment (2-CMT) PK model with first-order absorption and elimination plus an absorption lag-time was evaluated, as well as a variation of this model designed to mimic enterohepatic recycling (EHC). Optimal PK sampling strategies (OSS) were determined using WinPOPT based upon collection of 3-12 post-dose samples. The two-compartment model with EHC provided the best fit and reduced bias in C(max) (median prediction error (PE%) of 9.58% versus 12.2%) relative to the basic two-compartment model; AUC(0-24) was similar for both models (median PE% = 1.39%). The OSS for fitting the two-compartment model with EHC required the collection of seven samples (0.25, 1, 2, 4, 5, 6 and 12 h). Reasonably unbiased and precise exposures were obtained when re-fitting this model to a reduced dataset considering only these sampling times. A two-compartment model considering EHC best characterized the time course of clopidogrel-IM in plasma. Use of the suggested OSS will allow for the collection of fewer PK samples when assessing clopidogrel-IM exposures. Copyright © 2013 John Wiley & Sons, Ltd.

  1. A comparison of sample preparation strategies for biological tissues and subsequent trace element analysis using LA-ICP-MS.

    Science.gov (United States)

    Bonta, Maximilian; Török, Szilvia; Hegedus, Balazs; Döme, Balazs; Limbeck, Andreas

    2017-03-01

    Laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) is one of the most commonly applied methods for lateral trace element distribution analysis in medical studies. Many improvements of the technique regarding quantification and achievable lateral resolution have been made in recent years. Nevertheless, sample preparation is also of major importance, and the optimal sample preparation strategy has still not been defined. While conventional histology offers a number of sample pre-treatment strategies, little is known about the effect of these approaches on the lateral distributions of elements and/or their quantities in tissues. The technique of formalin fixation and paraffin embedding (FFPE) has emerged as the gold standard in tissue preparation. However, its potential use for elemental distribution studies is questionable due to the large number of sample preparation steps. In this work, LA-ICP-MS was used to examine the applicability of the FFPE sample preparation approach for elemental distribution studies. Qualitative elemental distributions as well as quantitative concentrations in cryo-cut tissues and in FFPE samples were compared. Results showed that some metals (especially Na and K) are severely affected by the FFPE process, whereas others (e.g., Mn, Ni) are less influenced. Based on these results, a general recommendation can be given: FFPE samples are completely unsuitable for the analysis of alkali metals. When analyzing transition metals, FFPE samples can give results comparable to snap-frozen tissues. Graphical abstract: Sample preparation strategies for biological tissues are compared with regard to elemental distributions and average trace element concentrations.

  2. Economics of place-based monitoring under the safe drinking water act, part II: design and development of place-based monitoring strategies.

    Science.gov (United States)

    Brands, Edwin; Rajagopal, R

    2008-08-01

    The goals of environmental legislation and associated regulations are to protect public health, natural resources, and ecosystems. In this context, monitoring programs should provide timely and relevant information so that the regulatory community can implement legislation in a cost-effective and efficient manner. The Safe Drinking Water Act (SDWA) of 1974 attempts to ensure that public water systems (PWSs) supply safe water to their consumers. As is the case with many other federal environmental statutes, SDWA monitoring has been implemented in a relatively uniform fashion across the United States. In this three-part series, spatial and temporal patterns in water quality data are utilized to develop, compare, and evaluate the economic performance of alternative place-based monitoring approaches relative to current monitoring practice. Part II: Several factors affect the performance of monitoring strategies, including measurable objectives, the required precision of estimates, acceptable confidence levels for such estimates, and the available sampling budget. In this paper, we develop place-based monitoring strategies based on extensive analysis of available historical water quality data (1960-1994) for 19 Iowa community water systems. These systems supply potable water to over 350,000 people. In the context of drinking water, the objective is to protect public health by utilizing monitoring resources to characterize contaminants that are detectable and close to exceeding health standards. A place-based monitoring strategy was developed in which contaminants were selected based on their historical occurrence, rather than their appearance on the SDWA contaminant list. In a subset of the water systems, the temporal frequency of monitoring for one ubiquitous contaminant, nitrate, was tailored to patterns in its historical occurrence and concentration. Three sampling allocation models (linear, quadratic, and cubic) based on historic patterns in peak occurrence were developed and
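One way to realize linear, quadratic, and cubic allocation models of the kind mentioned above is to weight each month's share of a fixed sampling budget by the historical concentration raised to the power 1, 2, or 3. The exact models are not given in the abstract, so both the weighting scheme and the monthly means below are assumptions for illustration:

```python
# Allocating a fixed annual sampling budget across months in
# proportion to historical nitrate levels raised to the power k,
# with k = 1, 2, 3 giving linear, quadratic and cubic weighting.
# The weighting scheme and monthly means are assumptions, not the
# paper's actual allocation models.

def allocate_samples(monthly_means, budget, k):
    weights = [m ** k for m in monthly_means]
    raw = [budget * w / sum(weights) for w in weights]
    alloc = [int(r) for r in raw]
    # Largest-remainder rounding so the allocations sum to the budget.
    order = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in order[: budget - sum(alloc)]:
        alloc[i] += 1
    return alloc

# Hypothetical monthly mean nitrate (mg/L), peaking in late spring:
means = [2, 2, 4, 8, 12, 10, 6, 4, 3, 2, 2, 2]
for k in (1, 2, 3):
    print(k, allocate_samples(means, budget=12, k=k))
```

Higher powers concentrate the sampling effort more sharply on the historical peak months, which is the intended behavior when the goal is catching exceedances.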

  3. A DC Microgrid Coordinated Control Strategy Based on Integrator Current-Sharing

    Directory of Open Access Journals (Sweden)

    Liyuan Gao

    2017-08-01

    The DC microgrid has become a new trend in microgrid research, with the advantages of high reliability, simple control and low losses. To address the drawbacks of traditional droop control strategies, an improved DC droop control strategy based on integrator current-sharing is introduced. In this strategy, the principle of eliminating deviation through an integrator is used to construct the current-sharing term, making the power-sharing between different distributed generation (DG) units uniform and reasonable, which reduces the circulating current between DG units. Furthermore, at the system coordinated control level, a hierarchical/droop control strategy based on the DC bus voltage is proposed. In this strategy, the operation modes of the AC main network and micro-sources are determined by detecting the DC voltage variation, which ensures the power balance of the DC microgrid under different operating conditions. Meanwhile, communication between different DG units is not needed; each DG unit only needs to sample the DC bus voltage, which retains the plug-and-play feature of the DC microgrid. The proposed control strategy is validated by simulation on a DC microgrid with permanent magnet synchronous generator-based wind turbines, solar arrays and energy storage batteries, and can be applied to small commercial or residential buildings.
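The integrator current-sharing mechanism can be sketched with a toy two-source network: without correction, unequal line resistances make the currents unequal; an integral term on each source's current error restores equal sharing. All electrical parameters below are illustrative assumptions, not the paper's simulation model:

```python
# Two parallel DC sources under droop control with an integral
# current-sharing correction, sketching the mechanism the abstract
# describes. All electrical parameters are illustrative assumptions.

V_REF = 48.0          # droop reference voltage (V)
R_LOAD = 4.0          # common resistive load (ohm)
R_DROOP = [0.5, 0.5]  # virtual droop resistances (ohm)
R_LINE = [0.1, 0.3]   # unequal line resistances -> unequal sharing

def bus_and_currents(c):
    """Solve the resistive network for bus voltage and source currents."""
    rt = [rd + rl for rd, rl in zip(R_DROOP, R_LINE)]
    v_bus = sum((V_REF + ci) / r for ci, r in zip(c, rt)) / (
        1.0 / R_LOAD + sum(1.0 / r for r in rt))
    return v_bus, [(V_REF + ci - v_bus) / r for ci, r in zip(c, rt)]

def simulate(ki=2.0, dt=0.01, steps=2000):
    c = [0.0, 0.0]  # integrator states (current-sharing corrections)
    for _ in range(steps):
        _, i = bus_and_currents(c)
        i_avg = sum(i) / len(i)
        # Each integrator drives its source current toward the average.
        c = [ci + ki * (i_avg - ii) * dt for ci, ii in zip(c, i)]
    return bus_and_currents(c)

v_bus, (i1, i2) = simulate()
```

The corrections sum to zero by construction, so the droop characteristic is only redistributed between units, not shifted, which is what lets the scheme work without inter-unit communication.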

  4. Sampling strategies for the analysis of reactive low-molecular weight compounds in air

    NARCIS (Netherlands)

    Henneken, H.

    2006-01-01

    Within this thesis, new sampling and analysis strategies for the determination of airborne workplace contaminants have been developed. Special focus has been directed towards the development of air sampling methods that involve diffusive sampling. In an introductory overview, the current

  5. Selecting Sample Preparation Workflows for Mass Spectrometry-Based Proteomic and Phosphoproteomic Analysis of Patient Samples with Acute Myeloid Leukemia.

    Science.gov (United States)

    Hernandez-Valladares, Maria; Aasebø, Elise; Selheim, Frode; Berven, Frode S; Bruserud, Øystein

    2016-08-22

    Global mass spectrometry (MS)-based proteomic and phosphoproteomic studies of acute myeloid leukemia (AML) biomarkers represent a powerful strategy to identify and confirm proteins and their phosphorylated modifications that could be applied in diagnosis and prognosis, as a support for individual treatment regimens and selection of patients for bone marrow transplant. MS-based studies require optimal and reproducible workflows that allow a satisfactory coverage of the proteome and its modifications. Preparation of samples for global MS analysis is a crucial step and it usually requires method testing, tuning and optimization. Different proteomic workflows that have been used to prepare AML patient samples for global MS analysis usually include a standard protein in-solution digestion procedure with a urea-based lysis buffer. The enrichment of phosphopeptides from AML patient samples has previously been carried out either with immobilized metal affinity chromatography (IMAC) or metal oxide affinity chromatography (MOAC). We have recently tested several methods of sample preparation for MS analysis of the AML proteome and phosphoproteome and introduced filter-aided sample preparation (FASP) as a superior methodology for the sensitive and reproducible generation of peptides from patient samples. FASP-prepared peptides can be further fractionated or IMAC-enriched for proteome or phosphoproteome analyses. Herein, we will review both in-solution and FASP-based sample preparation workflows and encourage the use of the latter for the highest protein and phosphorylation coverage and reproducibility.

  6. A Geostatistical Approach to Indoor Surface Sampling Strategies

    DEFF Research Database (Denmark)

    Schneider, Thomas; Petersen, Ole Holm; Nielsen, Allan Aasbjerg

    1990-01-01

    Particulate surface contamination is of concern in production industries such as food processing, aerospace, electronics and semiconductor manufacturing. There is also an increased awareness that surface contamination should be monitored in industrial hygiene surveys. A conceptual and theoretical framework for designing sampling strategies is thus developed. The distribution and spatial correlation of surface contamination can be characterized using concepts from geostatistical science, where spatial applications of statistics are most developed. The theory is summarized, and particulate surface contamination, sampled from small areas on a table, has been used to illustrate the method. First, the spatial correlation is modelled and the parameters estimated from the data. Next, it is shown how the contamination at positions not measured can be estimated with kriging, a minimum mean square error method.
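The spatial-correlation modelling step described above typically starts from an empirical semivariogram, to which a model is then fitted before kriging. A minimal sketch on synthetic transect data (the study's table measurements are not reproduced here):

```python
import numpy as np

# Empirical semivariogram, the first step in modelling the spatial
# correlation of surface contamination. Positions and values below
# are synthetic stand-ins for the study's table measurements.

def semivariogram(x, z, lags, tol):
    """gamma(h) = half the mean squared difference of pairs at lag h."""
    gammas = []
    for h in lags:
        pairs = [(z[i] - z[j]) ** 2
                 for i in range(len(x)) for j in range(i + 1, len(x))
                 if abs(abs(x[i] - x[j]) - h) <= tol]
        gammas.append(0.5 * float(np.mean(pairs)) if pairs else float("nan"))
    return gammas

rng = np.random.default_rng(1)
x = np.arange(0.0, 50.0, 2.0)  # sampling positions along a transect (cm)
# Smooth spatial trend plus noise, so nearby points are more alike:
z = np.sin(x / 8.0) + 0.1 * rng.standard_normal(x.size)

gam = semivariogram(x, z, lags=[2, 4, 8, 16], tol=0.5)
```

A rising semivariogram at short lags indicates spatial dependence; the fitted model then supplies the weights kriging uses to estimate contamination at unmeasured positions.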

  7. Interactive Control System, Intended Strategy, Implemented Strategy and Emergent Strategy

    OpenAIRE

    Tubagus Ismail; Darjat Sudrajat

    2012-01-01

    The purpose of this study was to examine the relationship between the management control system (MCS) and strategy formation processes, namely: intended strategy, emergent strategy and implemented strategy. The focus of MCS in this study was the interactive control system. The study was based on Structural Equation Modeling (SEM) as its multivariate analysis instrument. The samples were upper-middle managers of manufacturing companies in Banten Province, DKI Jakarta Province and West Java Province. AM...

  8. Using Linked Survey Paradata to Improve Sampling Strategies in the Medical Expenditure Panel Survey

    Directory of Open Access Journals (Sweden)

    Mirel Lisa B.

    2017-06-01

    Using paradata from a prior survey that is linked to a new survey can help a survey organization develop more effective sampling strategies. One example of this type of linkage or subsampling is between the National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS). MEPS is a nationally representative sample of the U.S. civilian, noninstitutionalized population based on a complex multi-stage sample design. Each year a new sample is drawn as a subsample of households from the prior year's NHIS. The main objective of this article is to examine how paradata from a prior survey can be used in developing a sampling scheme in a subsequent survey. A framework for optimal allocation of the sample in substrata formed for this purpose is presented and evaluated for the relative effectiveness of alternative substratification schemes. The framework is applied, using real MEPS data, to illustrate how utilizing paradata from the linked survey offers the possibility of making improvements to the sampling scheme for the subsequent survey. The improvements aim to reduce data collection costs while maintaining or increasing effective responding sample sizes and response rates for a harder-to-reach population.
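A standard form of the optimal-allocation framework described above is Neyman allocation with unit costs, where each substratum's share of the sample grows with its size and variability and shrinks with its per-unit cost. The stratum sizes, standard deviations and costs below are hypothetical:

```python
import math

# Neyman-style optimal allocation of a fixed sample across substrata,
# a standard form of the allocation framework the abstract describes.
# Stratum sizes, standard deviations and unit costs are hypothetical.

def optimal_allocation(sizes, sds, costs, total_n):
    """n_h proportional to N_h * S_h / sqrt(c_h)."""
    weights = [n * s / math.sqrt(c) for n, s, c in zip(sizes, sds, costs)]
    return [round(total_n * w / sum(weights)) for w in weights]

sizes = [5000, 3000, 2000]  # households per substratum (N_h)
sds = [1.2, 0.8, 2.0]       # outcome standard deviations (S_h)
costs = [1.0, 1.0, 4.0]     # relative data collection cost per unit
print(optimal_allocation(sizes, sds, costs, total_n=1000))  # → [577, 231, 192]
```

Substrata built from linked paradata (e.g., predicted response propensity) would supply the variance and cost inputs in practice.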

  9. Impact of sampling strategy on stream load estimates in till landscape of the Midwest

    Science.gov (United States)

    Vidon, P.; Hubbard, L.E.; Soyeux, E.

    2009-01-01

    Accurately estimating various solute loads in streams during storms is critical to accurately determining maximum daily loads for regulatory purposes. This study investigates the impact of sampling strategy on solute load estimates in streams in the US Midwest. Three different solute types (nitrate, magnesium, and dissolved organic carbon (DOC)) and three sampling strategies are assessed. Regardless of the method, the average error on nitrate loads is higher than for magnesium or DOC loads, and all three methods generally underestimate DOC loads and overestimate magnesium loads. Increasing sampling frequency only slightly improves the accuracy of solute load estimates but generally improves the precision of load calculations. This type of investigation is critical for water management and environmental assessment, so that error on solute load calculations can be taken into account by landscape managers and sampling strategies optimized as a function of monitoring objectives. © 2008 Springer Science+Business Media B.V.
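The kind of sampling-strategy error studied above can be illustrated by comparing a load computed from dense concentration data with one computed from sparse fixed-interval samples held constant between visits. All series below are synthetic, not the study's field data:

```python
import math

# Stream solute load = concentration x discharge integrated over time.
# Sketch comparing the "true" load from dense 1-min data with an
# estimate from a sparse fixed-interval strategy. All series below
# are synthetic; the study's field data are not reproduced here.

DT = 60.0  # seconds between dense (1-min) observations
N = 600    # number of 1-min observations (a 10 h storm window)

def load(conc, flow, dt):
    """Total load: sum of c(t) * q(t) * dt."""
    return sum(c * q * dt for c, q in zip(conc, flow))

# Synthetic storm: discharge rises and recedes; concentration peaks first.
flow = [1.0 + 4.0 * math.exp(-(((t - 200) / 120.0) ** 2)) for t in range(N)]
conc = [5.0 + 20.0 * math.exp(-(((t - 150) / 90.0) ** 2)) for t in range(N)]

true_load = load(conc, flow, DT)

# Fixed-interval strategy: one concentration sample per hour, held
# constant until the next sample (flow is still gauged continuously).
STEP = 60
sparse_conc = [conc[(t // STEP) * STEP] for t in range(N)]
est_load = load(sparse_conc, flow, DT)
error_pct = 100.0 * (est_load - true_load) / true_load
```

Because concentration and discharge peak at different times, sparse sampling biases the product, which is the mechanism behind the systematic over- and underestimates reported for magnesium and DOC.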

  10. A Sample-Based Forest Monitoring Strategy Using Landsat, AVHRR and MODIS Data to Estimate Gross Forest Cover Loss in Malaysia between 1990 and 2005

    Directory of Open Access Journals (Sweden)

    Peter Potapov

    2013-04-01

    Full Text Available Insular Southeast Asia is a hotspot of humid tropical forest cover loss. A sample-based monitoring approach quantifying forest cover loss from Landsat imagery was implemented to estimate gross forest cover loss for two eras, 1990–2000 and 2000–2005. For each time interval, a probability sample of 18.5 km × 18.5 km blocks was selected, and pairs of Landsat images acquired per sample block were interpreted to quantify forest cover area and gross forest cover loss. Stratified random sampling was implemented for 2000–2005 with MODIS-derived forest cover loss used to define the strata. A probability proportional to x (πpx) design was implemented for 1990–2000 with AVHRR-derived forest cover loss used as the x variable to increase the likelihood of including forest loss area in the sample. The estimated annual gross forest cover loss for Malaysia was 0.43 Mha/yr (SE = 0.04) during 1990–2000 and 0.64 Mha/yr (SE = 0.055) during 2000–2005. Our use of the πpx sampling design represents a first practical trial of this design for sampling satellite imagery. Although the design performed adequately in this study, a thorough comparative investigation of the πpx design relative to other sampling strategies is needed before general design recommendations can be put forth.
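
    The πpx design itself is not detailed in the abstract; the standard way probability-proportional-to-size selection is implemented is systematically against cumulated size measures. A minimal sketch (the block "sizes" standing in for AVHRR-derived loss are made up):

    ```python
    import random

    def pps_systematic(sizes, n, rng):
        """Systematic probability-proportional-to-size selection.

        sizes: auxiliary size measure x per unit (here, a stand-in for
        AVHRR-derived loss area per block). Units are selected with
        probability proportional to x by walking a fixed step through
        the cumulated sizes from a random start.
        """
        total = float(sum(sizes))
        step = total / n
        start = rng.uniform(0, step)
        points = [start + k * step for k in range(n)]
        sample, cum, i = [], 0.0, 0
        for idx, x in enumerate(sizes):
            cum += x
            while i < n and points[i] < cum:
                sample.append(idx)
                i += 1
        return sample

    rng = random.Random(42)
    sizes = [5, 1, 1, 8, 2, 1, 6, 1, 3, 2]  # hypothetical block sizes
    sample = pps_systematic(sizes, 3, rng)
    ```

    Blocks with larger x are hit more often across repeated draws, which is exactly the property used to concentrate the 1990–2000 sample on likely forest-loss areas; estimation then weights each block by the inverse of its inclusion probability.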

  11. Perspectives on land snails - sampling strategies for isotopic analyses

    Science.gov (United States)

    Kwiecien, Ola; Kalinowski, Annika; Kamp, Jessica; Pellmann, Anna

    2017-04-01

    Since the seminal works of Goodfriend (1992), several substantial studies confirmed a relation between the isotopic composition of land snail shells (δ18O, δ13C) and environmental parameters like precipitation amount, moisture source, temperature and vegetation type. This relation, however, is not straightforward and is site dependent. The choice of sampling strategy (discrete or bulk sampling) and cleaning procedure (several methods can be used, but a comparison of their effects on an individual shell has not yet been achieved) further complicates the shell analysis. The advantage of using snail shells as an environmental archive lies in the snails' limited mobility, and therefore an intrinsic aptitude for recording local and site-specific conditions. Also, snail shells are often found at dated archaeological sites. An obvious drawback is that shell assemblages rarely make up a continuous record, and a single shell is only a snapshot of the environmental setting at a given time. Shells from archaeological sites might represent a dietary component, and cooking would presumably alter the isotopic signature of the aragonite material. Consequently, a proper sampling strategy is of great importance and should be adjusted to the scientific question. Here, we compare and contrast different sampling approaches using modern shells collected in Morocco, Spain and Germany. The bulk shell approach (fine-ground material) yields information on mean environmental parameters within the life span of the analyzed individuals. However, despite homogenization, replicate measurements of bulk shell material returned results with a variability greater than analytical precision (up to 2‰ for δ18O, and up to 1‰ for δ13C), calling for caution when analyzing only single individuals. Horizontal high-resolution sampling (single drill holes along growth lines) provides insights into the amplitude of seasonal variability, while vertical high-resolution sampling (multiple drill holes along the same growth line

  12. The Prediction of Coping Strategies Based on Personality Traits in Irritants Affiliates

    Directory of Open Access Journals (Sweden)

    Ali Masoud Rostami

    2013-07-01

    Full Text Available Objective: This study aimed to predict coping strategies based on the personality characteristics of stimulant-dependent people. Method: The research method was correlational. The population consisted of all stimulant-dependent individuals (n=402) who had been consecutively admitted to addiction centers of Tehran over the last year and had a drug dependence diagnosis. By systematic sampling, 201 subjects were selected from this population. The NEO personality inventory (NEO-FFI short form) and the Lazarus-Folkman coping strategies questionnaire (WCQ-short form) were administered to the selected sample. Results: The results showed a positive correlation between the personality dimension of neuroticism and the coping strategies of avoidance and disengagement, and a negative one with restraint. There was a positive correlation between the personality dimension of extroversion and avoidance and seeking social support, and a negative one with assuming responsibility. Also, there was a positive correlation between the personality dimension of agreeableness and the coping strategies of restraint and seeking social support. The conscientiousness personality dimension did not predict any coping strategies in stimulant-dependent patients. Conclusion: Personality traits can predict coping strategies in stimulant-dependent individuals. In the treatment of addicted patients, attention to the evaluation of patients' personality characteristics is suggested.

  13. Sample preparation strategies for food and biological samples prior to nanoparticle detection and imaging

    DEFF Research Database (Denmark)

    Larsen, Erik Huusfeldt; Löschner, Katrin

    2014-01-01

    microscopy (TEM) proved to be necessary for troubleshooting of results obtained from AFFF-LS-ICP-MS. Aqueous and enzymatic extraction strategies were tested for thorough sample preparation, aiming to degrade the sample matrix and liberate the AgNPs from chicken meat into liquid suspension. The resulting...... AFFF-ICP-MS fractograms, which corresponded to the enzymatic digests, showed a major nano-peak (about 80 % recovery of the AgNPs spiked to the meat) plus new smaller peaks that eluted close to the void volume of the fractograms. Small but significant shifts in retention time of AFFF peaks were observed...... for the meat sample extracts and the corresponding neat AgNP suspension, and rendered sizing by way of calibration with AgNPs as sizing standards inaccurate. In order to gain further insight into the sizes of the separated AgNPs, or their possible dissolved state, fractions of the AFFF eluate were collected...

  14. Sampling strategy for estimating human exposure pathways to consumer chemicals

    NARCIS (Netherlands)

    Papadopoulou, Eleni; Padilla-Sanchez, Juan A.; Collins, Chris D.; Cousins, Ian T.; Covaci, Adrian; de Wit, Cynthia A.; Leonards, Pim E.G.; Voorspoels, Stefan; Thomsen, Cathrine; Harrad, Stuart; Haug, Line S.

    2016-01-01

    Human exposure to consumer chemicals has become a worldwide concern. In this work, a comprehensive sampling strategy is presented, to our knowledge being the first to study all relevant exposure pathways in a single cohort using multiple methods for assessment of exposure from each exposure pathway.

  15. Dried blood spot measurement: application in tacrolimus monitoring using limited sampling strategy and abbreviated AUC estimation.

    Science.gov (United States)

    Cheung, Chi Yuen; van der Heijden, Jaques; Hoogtanders, Karin; Christiaans, Maarten; Liu, Yan Lun; Chan, Yiu Han; Choi, Koon Shing; van de Plas, Afke; Shek, Chi Chung; Chau, Ka Foon; Li, Chun Sang; van Hooff, Johannes; Stolk, Leo

    2008-02-01

    Dried blood spot (DBS) sampling and high-performance liquid chromatography tandem mass spectrometry have been developed for monitoring tacrolimus levels. Our center favors the use of a limited sampling strategy and an abbreviated formula to estimate the area under the concentration-time curve (AUC(0-12)). However, it is inconvenient for patients because they have to wait in the center for blood sampling. We investigated the application of the DBS method in tacrolimus level monitoring using a limited sampling strategy and an abbreviated AUC estimation approach. Duplicate venous samples were obtained at each time point (C(0), C(2), and C(4)). To determine the stability of blood samples, one venous sample was sent to our laboratory immediately. The other duplicate venous samples, together with simultaneous fingerprick blood samples, were sent to the University of Maastricht in the Netherlands. Thirty-six patients were recruited and 108 sets of blood samples were collected. There was a highly significant relationship between AUC(0-12) estimated from venous blood samples and fingerprick blood samples (r(2) = 0.96, P AUC(0-12) strategy as drug monitoring.
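
    The abbreviated formula and its coefficients are not given in the abstract; limited sampling strategies of this kind typically regress full AUC(0-12) values on a few timed concentrations. A sketch with placeholder coefficients (the numbers below are hypothetical, not the center's validated formula):

    ```python
    def abbreviated_auc(c0, c2, c4, coef=(5.0, 1.5, 2.5, 3.0)):
        """Abbreviated AUC(0-12) estimate from three timed levels.

        AUC = a + b0*C0 + b2*C2 + b4*C4. The coefficients here are
        placeholders; real ones come from multiple linear regression
        of full 12 h concentration-time profiles in a validation cohort.
        """
        a, b0, b2, b4 = coef
        return a + b0 * c0 + b2 * c2 + b4 * c4

    # Hypothetical tacrolimus levels (ng/mL) at 0, 2 and 4 h post-dose.
    auc = abbreviated_auc(6.0, 15.0, 10.0)
    ```

    The study's point is that the three inputs can come from fingerprick DBS cards instead of timed venous draws, so the patient need not wait at the center.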

  16. Stable isotope labeling strategy based on coding theory

    Energy Technology Data Exchange (ETDEWEB)

    Kasai, Takuma; Koshiba, Seizo; Yokoyama, Jun; Kigawa, Takanori, E-mail: kigawa@riken.jp [RIKEN Quantitative Biology Center (QBiC), Laboratory for Biomolecular Structure and Dynamics (Japan)

    2015-10-15

    We describe a strategy for stable isotope-aided protein nuclear magnetic resonance (NMR) analysis, called stable isotope encoding. The basic idea of this strategy is that amino-acid selective labeling can be considered as “encoding and decoding” processes, in which the information of amino acid type is encoded by the stable isotope labeling ratio of the corresponding residue and it is decoded by analyzing NMR spectra. According to the idea, the strategy can diminish the required number of labelled samples by increasing information content per sample, enabling discrimination of 19 kinds of non-proline amino acids with only three labeled samples. The idea also enables this strategy to combine with information technologies, such as error detection by check digit, to improve the robustness of analyses with low quality data. Stable isotope encoding will facilitate NMR analyses of proteins under non-ideal conditions, such as those in large complex systems, with low-solubility, and in living cells.
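
    A toy version of the encoding idea: with three labeling-ratio levels and three samples there are 27 possible codewords, enough to assign each of the 19 non-proline amino acids a unique code. The levels and assignment below are illustrative, not the authors' actual scheme:

    ```python
    from itertools import product

    AMINO_ACIDS = list("ACDEFGHIKLMNQRSTVWY")  # 19 non-proline residues
    LEVELS = (0.0, 0.5, 1.0)  # hypothetical labeling ratios per sample

    # Three samples suffice: 3 levels over 3 samples give 27 >= 19 codes.
    codebook = dict(zip(AMINO_ACIDS, product(LEVELS, repeat=3)))

    def decode(observed):
        """Recover the amino acid type from its per-sample labeling ratios,
        i.e. the 'decoding' step performed by analyzing the NMR spectra."""
        inverse = {code: aa for aa, code in codebook.items()}
        return inverse[observed]
    ```

    The check-digit idea mentioned in the abstract would correspond to choosing codewords with extra redundancy (a larger minimum distance between codes) so that a mis-measured ratio is detectable rather than silently decoded to the wrong residue type.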

  17. Stable isotope labeling strategy based on coding theory

    International Nuclear Information System (INIS)

    Kasai, Takuma; Koshiba, Seizo; Yokoyama, Jun; Kigawa, Takanori

    2015-01-01

    We describe a strategy for stable isotope-aided protein nuclear magnetic resonance (NMR) analysis, called stable isotope encoding. The basic idea of this strategy is that amino-acid selective labeling can be considered as “encoding and decoding” processes, in which the information of amino acid type is encoded by the stable isotope labeling ratio of the corresponding residue and it is decoded by analyzing NMR spectra. According to the idea, the strategy can diminish the required number of labelled samples by increasing information content per sample, enabling discrimination of 19 kinds of non-proline amino acids with only three labeled samples. The idea also enables this strategy to combine with information technologies, such as error detection by check digit, to improve the robustness of analyses with low quality data. Stable isotope encoding will facilitate NMR analyses of proteins under non-ideal conditions, such as those in large complex systems, with low-solubility, and in living cells.

  18. Selecting Sample Preparation Workflows for Mass Spectrometry-Based Proteomic and Phosphoproteomic Analysis of Patient Samples with Acute Myeloid Leukemia

    Directory of Open Access Journals (Sweden)

    Maria Hernandez-Valladares

    2016-08-01

    Full Text Available Global mass spectrometry (MS)-based proteomic and phosphoproteomic studies of acute myeloid leukemia (AML) biomarkers represent a powerful strategy to identify and confirm proteins and their phosphorylated modifications that could be applied in diagnosis and prognosis, as a support for individual treatment regimens and selection of patients for bone marrow transplant. MS-based studies require optimal and reproducible workflows that allow a satisfactory coverage of the proteome and its modifications. Preparation of samples for global MS analysis is a crucial step and it usually requires method testing, tuning and optimization. Different proteomic workflows that have been used to prepare AML patient samples for global MS analysis usually include a standard protein in-solution digestion procedure with a urea-based lysis buffer. The enrichment of phosphopeptides from AML patient samples has previously been carried out either with immobilized metal affinity chromatography (IMAC) or metal oxide affinity chromatography (MOAC). We have recently tested several methods of sample preparation for MS analysis of the AML proteome and phosphoproteome and introduced filter-aided sample preparation (FASP) as a superior methodology for the sensitive and reproducible generation of peptides from patient samples. FASP-prepared peptides can be further fractionated or IMAC-enriched for proteome or phosphoproteome analyses. Herein, we will review both in-solution and FASP-based sample preparation workflows and encourage the use of the latter for the highest protein and phosphorylation coverage and reproducibility.

  19. Statistical sampling strategies

    International Nuclear Information System (INIS)

    Andres, T.H.

    1987-01-01

    Systems assessment codes use mathematical models to simulate natural and engineered systems. Probabilistic systems assessment codes carry out multiple simulations to reveal the uncertainty in values of output variables due to uncertainty in the values of the model parameters. In this paper, methods are described for sampling sets of parameter values to be used in a probabilistic systems assessment code. Three Monte Carlo parameter selection methods are discussed: simple random sampling, Latin hypercube sampling, and sampling using two-level orthogonal arrays. Three post-selection transformations are also described: truncation, importance transformation, and discretization. Advantages and disadvantages of each method are summarized.
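
    Of the three selection methods mentioned, Latin hypercube sampling is easily sketched: each parameter's range is split into n equal strata and each stratum receives exactly one point, via an independent random permutation per dimension. This is a minimal illustration, not the code used in the assessment codes themselves:

    ```python
    import random

    def latin_hypercube(n, dims, rng):
        """n points in [0,1)^dims with exactly one point per stratum
        in every dimension (jittered within each stratum)."""
        cols = []
        for _ in range(dims):
            perm = list(range(n))
            rng.shuffle(perm)  # random stratum order for this dimension
            cols.append([(p + rng.random()) / n for p in perm])
        return list(zip(*cols))

    rng = random.Random(0)
    pts = latin_hypercube(10, 2, rng)
    ```

    Compared with simple random sampling of the same size, this stratification guarantees marginal coverage of each parameter's range, which is why it tends to need fewer model runs for stable uncertainty estimates. The post-selection transformations the paper describes would then map these uniform coordinates onto each parameter's actual distribution.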

  20. Efficiency enhancement of optimized Latin hypercube sampling strategies: Application to Monte Carlo uncertainty analysis and meta-modeling

    Science.gov (United States)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans

    2015-02-01

    The majority of literature regarding optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm, on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has been previously made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then, by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models. The study also illustrates that this
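
    The distinction between the two initial designs can be made concrete: random LHS jitters each point within its stratum, while midpoint LHS centers it. A minimal sketch, together with the maximin criterion (smallest pairwise distance) often used to judge space-filling; the specific criterion and optimizer used in the paper may differ:

    ```python
    import math
    import random

    def lhs(n, dims, rng, midpoint=False):
        """Latin hypercube design; midpoint=True centers each point
        in its stratum, midpoint=False jitters it randomly."""
        cols = []
        for _ in range(dims):
            perm = list(range(n))
            rng.shuffle(perm)
            offset = (lambda: 0.5) if midpoint else rng.random
            cols.append([(p + offset()) / n for p in perm])
        return list(zip(*cols))

    def maximin(points):
        """Space-filling criterion: the smallest pairwise distance
        (larger is better; optimizers try to maximize it)."""
        return min(
            math.dist(a, b)
            for i, a in enumerate(points)
            for b in points[i + 1:]
        )

    rng = random.Random(1)
    random_design = lhs(8, 2, rng, midpoint=False)
    midpoint_design = lhs(8, 2, rng, midpoint=True)
    ```

    An OLHS optimizer would start from one of these designs and permute stratum assignments to improve the criterion; the study's finding is that starting from the midpoint variant tends to yield better-optimized designs.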

  1. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    Science.gov (United States)

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures, etc. As usually observed, in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction and these residual free energy barriers could greatly abolish the sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimension subset of the target system; then the "Hamiltonian lagging" problem, which reveals the fact that necessary structural relaxation falls behind the move of the collective variable, may be likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.

  2. Measuring strategies for learning regulation in medical education: scale reliability and dimensionality in a Swedish sample.

    Science.gov (United States)

    Edelbring, Samuel

    2012-08-15

    The degree of learners' self-regulated learning and dependence on external regulation influence learning processes in higher education. These regulation strategies are commonly measured by questionnaires developed in other settings than in which they are being used, thereby requiring renewed validation. The aim of this study was to psychometrically evaluate the learning regulation strategy scales from the Inventory of Learning Styles with Swedish medical students (N = 206). The regulation scales were evaluated regarding their reliability, scale dimensionality and interrelations. The primary evaluation focused on dimensionality and was performed with Mokken scale analysis. To assist future scale refinement, additional item analysis, such as item-to-scale correlations, was performed. Scale scores in the Swedish sample displayed good reliability in relation to published results: Cronbach's alpha: 0.82, 0.72, and 0.65 for self-regulation, external regulation and lack of regulation scales respectively. The dimensionalities in scales were adequate for self-regulation and its subscales, whereas external regulation and lack of regulation displayed less unidimensionality. The established theoretical scales were largely replicated in the exploratory analysis. The item analysis identified two items that contributed little to their respective scales. The results indicate that these scales have an adequate capacity for detecting the three theoretically proposed learning regulation strategies in the medical education sample. Further construct validity should be sought by interpreting scale scores in relation to specific learning activities. Using established scales for measuring students' regulation strategies enables a broad empirical base for increasing knowledge on regulation strategies in relation to different disciplinary settings and contributes to theoretical development.

  3. A sampling strategy to establish existing plant configuration baselines

    International Nuclear Information System (INIS)

    Buchanan, L.P.

    1995-01-01

    The Department of Energy's Gaseous Diffusion Plants (DOEGDP) are undergoing a Safety Analysis Update Program. As part of this program, critical existing structures are being reevaluated for Natural Phenomena Hazards (NPH) based on the recommendations of UCRL-15910. The Department of Energy has specified that current plant configurations be used in the performance of these reevaluations. This paper presents the process and results of a walkdown program implemented at DOEGDP to establish the current configuration baseline for these existing critical structures for use in subsequent NPH evaluations. These structures are classified as moderate hazard facilities and were constructed in the early 1950s. The process involved a statistical sampling strategy to determine the validity of critical design information as represented on the original design drawings such as member sizes, orientation, connection details and anchorage. A floor load inventory of the dead load of the equipment, both permanently attached and spare, was also performed, as well as a walkthrough inspection of the overall structure to identify any other significant anomalies.

  4. Sampling and analyte enrichment strategies for ambient mass spectrometry.

    Science.gov (United States)

    Li, Xianjiang; Ma, Wen; Li, Hongmei; Ai, Wanpeng; Bai, Yu; Liu, Huwei

    2018-01-01

    Ambient mass spectrometry provides great convenience for fast screening, and has shown promising potential in analytical chemistry. However, its relatively low sensitivity seriously restricts its practical utility in trace compound analysis. In this review, we summarize the sampling and analyte enrichment strategies coupled with nine modes of representative ambient mass spectrometry (desorption electrospray ionization, paper spray ionization, wooden-tip spray ionization, probe electrospray ionization, coated blade spray ionization, direct analysis in real time, desorption corona beam ionization, dielectric barrier discharge ionization, and atmospheric-pressure solids analysis probe) that have dramatically increased the detection sensitivity. We believe that these advances will promote routine use of ambient mass spectrometry. Graphical abstract Scheme of sampling strategies for ambient mass spectrometry.

  5. Observing System Simulation Experiments for the assessment of temperature sampling strategies in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    F. Raicich

    2003-01-01

    Full Text Available For the first time in the Mediterranean Sea various temperature sampling strategies are studied and compared to each other by means of the Observing System Simulation Experiment technique. Their usefulness in the framework of the Mediterranean Forecasting System (MFS) is assessed by quantifying their impact in a Mediterranean General Circulation Model in numerical twin experiments via univariate data assimilation of temperature profiles in summer and winter conditions. Data assimilation is performed by means of the optimal interpolation algorithm implemented in the SOFA (System for Ocean Forecasting and Analysis) code. The sampling strategies studied here include various combinations of eXpendable BathyThermograph (XBT) profiles collected along Volunteer Observing Ship (VOS) tracks, Airborne XBTs (AXBTs) and sea surface temperatures. The actual sampling strategy adopted in the MFS Pilot Project during the Targeted Operational Period (TOP, winter-spring 2000) is also studied. The data impact is quantified by the error reduction relative to the free run. The most effective sampling strategies determine 25–40% error reduction, depending on the season, the geographic area and the depth range. A qualitative relationship can be recognized, in terms of the spread of information from the data positions, between basin circulation features and spatial patterns of the error reduction fields, as a function of different spatial and seasonal characteristics of the dynamics. The largest error reductions are observed when samplings are characterized by extensive spatial coverage, as in the cases of AXBTs and the combination of XBTs and surface temperatures. The sampling strategy adopted during the TOP is characterized by little impact, as a consequence of a sampling frequency that is too low. Key words. Oceanography: general (marginal and semi-enclosed seas); numerical modelling
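
    The impact metric used in such twin experiments is simple to state: the percent reduction of the assimilation run's error relative to the free (no-assimilation) run. A one-line sketch, with hypothetical RMSE values:

    ```python
    def error_reduction(rmse_assim, rmse_free):
        """Percent error reduction of an assimilation run relative to
        the free run; 100% would mean the error is fully removed."""
        return 100.0 * (1.0 - rmse_assim / rmse_free)

    # Hypothetical temperature RMSEs (degC) against the "truth" run.
    er = error_reduction(0.30, 0.45)
    ```

    In the study this quantity is mapped by season, region and depth range, which is how the 25–40% figures for the best strategies were obtained.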

  6. Observing System Simulation Experiments for the assessment of temperature sampling strategies in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    F. Raicich

    Full Text Available For the first time in the Mediterranean Sea various temperature sampling strategies are studied and compared to each other by means of the Observing System Simulation Experiment technique. Their usefulness in the framework of the Mediterranean Forecasting System (MFS) is assessed by quantifying their impact in a Mediterranean General Circulation Model in numerical twin experiments via univariate data assimilation of temperature profiles in summer and winter conditions. Data assimilation is performed by means of the optimal interpolation algorithm implemented in the SOFA (System for Ocean Forecasting and Analysis) code. The sampling strategies studied here include various combinations of eXpendable BathyThermograph (XBT) profiles collected along Volunteer Observing Ship (VOS) tracks, Airborne XBTs (AXBTs) and sea surface temperatures. The actual sampling strategy adopted in the MFS Pilot Project during the Targeted Operational Period (TOP, winter-spring 2000) is also studied.

    The data impact is quantified by the error reduction relative to the free run. The most effective sampling strategies determine 25–40% error reduction, depending on the season, the geographic area and the depth range. A qualitative relationship can be recognized, in terms of the spread of information from the data positions, between basin circulation features and spatial patterns of the error reduction fields, as a function of different spatial and seasonal characteristics of the dynamics. The largest error reductions are observed when samplings are characterized by extensive spatial coverage, as in the cases of AXBTs and the combination of XBTs and surface temperatures. The sampling strategy adopted during the TOP is characterized by little impact, as a consequence of a sampling frequency that is too low.

    Key words. Oceanography: general (marginal and semi-enclosed seas); numerical modelling

  7. A novel multi-scale adaptive sampling-based approach for energy saving in leak detection for WSN-based water pipelines

    Science.gov (United States)

    Saqib, Najam us; Faizan Mysorewala, Muhammad; Cheded, Lahouari

    2017-12-01

    In this paper, we propose a novel monitoring strategy for a wireless sensor network (WSN)-based water pipeline network. Our strategy uses a multi-pronged approach to reduce energy consumption based on the use of two types of vibration sensors and pressure sensors, all having different energy levels, and a hierarchical adaptive sampling mechanism to determine the sampling frequency. The sampling rate of the sensors is adjusted according to the bandwidth of the vibration signal being monitored by using a wavelet-based adaptive thresholding scheme that calculates the new sampling frequency for the following cycle. In this multimodal sensing scheme, the duty-cycling approach is used for all sensors to reduce the sampling instances, such that the high-energy, high-precision (HE-HP) vibration sensors have low duty cycles, and the low-energy, low-precision (LE-LP) vibration sensors have high duty cycles. The low duty-cycling (HE-HP) vibration sensor adjusts the sampling frequency of the high duty-cycling (LE-LP) vibration sensor. The simulated test bed considered here consists of a water pipeline network which uses pressure and vibration sensors, with the latter having different energy consumptions and precision levels, at various locations in the network. This is all the more useful for energy conservation during extended monitoring. It is shown that by using the novel features of our proposed scheme, a significant reduction in energy consumption is achieved and the leak is effectively detected by the sensor node that is closest to it. Finally, both the total energy consumed by monitoring as well as the time to detect the leak by a WSN node are computed, and show the superiority of our proposed hierarchical adaptive sampling algorithm over a non-adaptive sampling approach.
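
    The paper's wavelet-based thresholding scheme is not reproduced here; the sketch below captures only the general idea with a single-level Haar split: when the high-frequency share of the window's energy is large the next cycle's sampling rate is raised, and when it is small the rate is lowered. Thresholds, rates and signals are all illustrative:

    ```python
    def haar_detail_energy_fraction(samples):
        """Fraction of window energy in the Haar high-frequency band."""
        detail = [(samples[i] - samples[i + 1]) / 2.0
                  for i in range(0, len(samples) - 1, 2)]
        approx = [(samples[i] + samples[i + 1]) / 2.0
                  for i in range(0, len(samples) - 1, 2)]
        e_d = sum(d * d for d in detail)
        e_a = sum(a * a for a in approx)
        return e_d / (e_d + e_a) if e_d + e_a else 0.0

    def next_rate(rate_hz, samples, hi=0.1, lo=0.01):
        """Double the rate when high-frequency energy dominates,
        halve it when the window is quiescent, else keep it."""
        frac = haar_detail_energy_fraction(samples)
        if frac > hi:
            return rate_hz * 2.0
        if frac < lo:
            return rate_hz / 2.0
        return rate_hz

    smooth = [1.0] * 16       # quiescent pipe segment
    bursty = [1.0, -1.0] * 8  # strong high-frequency vibration content
    ```

    In the hierarchical scheme described in the abstract, a decision of this kind made by the sparsely duty-cycled HE-HP sensor would set the sampling frequency of the LE-LP sensor for the following cycle.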

  8. A sampling strategy for estimating plot average annual fluxes of chemical elements from forest soils

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.; Vries, de W.

    2010-01-01

    A sampling strategy for estimating spatially averaged annual element leaching fluxes from forest soils is presented and tested in three Dutch forest monitoring plots. In this method sampling locations and times (days) are selected by probability sampling. Sampling locations were selected by

  9. [Identification of Systemic Contaminations with Legionella Spec. in Drinking Water Plumbing Systems: Sampling Strategies and Corresponding Parameters].

    Science.gov (United States)

    Völker, S; Schreiber, C; Müller, H; Zacharias, N; Kistemann, T

    2017-05-01

    After the amendment of the Drinking Water Ordinance in 2011, the requirements for the hygienic-microbiological monitoring of drinking water installations have increased significantly. In the BMBF-funded project "Biofilm Management" (2010-2014), we examined the extent to which established sampling strategies in practice can uncover drinking water plumbing systems systemically colonized with Legionella. Moreover, we investigated additional parameters that might be suitable for detecting systemic contaminations. We subjected the drinking water plumbing systems of 8 buildings with known microbial contamination (Legionella) to an intensive hygienic-microbiological sampling with high spatial and temporal resolution. A total of 626 drinking hot water samples were analyzed with classical culture-based methods. In addition, comprehensive hygienic observations were conducted in each building and qualitative interviews with operators and users were applied. Collected tap-specific parameters were quantitatively analyzed by means of sensitivity and accuracy calculations. The systemic presence of Legionella in drinking water plumbing systems has a high spatial and temporal variability. Established sampling strategies were only partially suitable to detect long-term Legionella contaminations in practice. In particular, the sampling of hot water at the calorifier and circulation re-entrance showed little significance in terms of contamination events. To detect the systemic presence of Legionella, the parameters stagnation (qualitatively assessed) and temperature (compliance with the 5K-rule) showed better results. © Georg Thieme Verlag KG Stuttgart · New York.

  10. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software

    OpenAIRE

    Dols, W. Stuart; Persily, Andrew K.; Morrow, Jayne B.; Matzke, Brett D.; Sego, Landon H.; Nuffer, Lisa L.; Pulsipher, Brent A.

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, have simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by vir...

  11. Measuring strategies for learning regulation in medical education: Scale reliability and dimensionality in a Swedish sample

    Directory of Open Access Journals (Sweden)

    Edelbring Samuel

    2012-08-01

    Abstract. Background: The degree of learners' self-regulated learning and their dependence on external regulation influence learning processes in higher education. These regulation strategies are commonly measured by questionnaires developed in settings other than those in which they are used, thereby requiring renewed validation. The aim of this study was to psychometrically evaluate the learning regulation strategy scales from the Inventory of Learning Styles with Swedish medical students (N = 206). Methods: The regulation scales were evaluated with regard to reliability, scale dimensionality and interrelations. The primary evaluation focused on dimensionality and was performed with Mokken scale analysis. To assist future scale refinement, additional item analysis, such as item-to-scale correlations, was performed. Results: Scale scores in the Swedish sample displayed good reliability in relation to published results: Cronbach's alpha was 0.82, 0.72, and 0.65 for the self-regulation, external regulation and lack of regulation scales, respectively. Dimensionality was adequate for self-regulation and its subscales, whereas external regulation and lack of regulation displayed less unidimensionality. The established theoretical scales were largely replicated in the exploratory analysis. The item analysis identified two items that contributed little to their respective scales. Discussion: The results indicate that these scales have an adequate capacity for detecting the three theoretically proposed learning regulation strategies in the medical education sample. Further construct validity should be sought by interpreting scale scores in relation to specific learning activities. Using established scales for measuring students' regulation strategies enables a broad empirical base for increasing knowledge of regulation strategies in relation to different disciplinary settings, and contributes to theoretical development.
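    The Cronbach's alpha reliability coefficient reported for each scale can be computed from item-level scores as sketched below; the three Likert items and five respondents are invented for illustration, not data from the Swedish sample.

```python
# Sketch: Cronbach's alpha, k/(k-1) * (1 - sum(item variances)/variance(totals)).
# Item data are illustrative only.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)  # population variance

def cronbach_alpha(items):
    """items: list of per-item score lists, all of equal length (respondents)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three hypothetical Likert items answered by five respondents
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [3, 5, 4, 4, 1]]
print(round(cronbach_alpha(items), 2))
```

    The same ratio results whether population or sample variance is used, since the common factor cancels between numerator and denominator.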

  12. Memory and communication support in dementia: research-based strategies for caregivers.

    Science.gov (United States)

    Smith, Erin R; Broughton, Megan; Baker, Rosemary; Pachana, Nancy A; Angwin, Anthony J; Humphreys, Michael S; Mitchell, Leander; Byrne, Gerard J; Copland, David A; Gallois, Cindy; Hegney, Desley; Chenery, Helen J

    2011-03-01

    Difficulties with memory and communication are prominent and distressing features of dementia which impact on the person with dementia and contribute to caregiver stress and burden. There is a need to provide caregivers with strategies to support and maximize memory and communication abilities in people with dementia. In this project, a team of clinicians, researchers and educators in neuropsychology, psychogeriatrics, nursing and speech pathology translated research-based knowledge from these fields into a program of practical strategies for everyday use by family and professional caregivers. From the available research evidence, the project team identified compensatory or facilitative strategies to assist with common areas of difficulty, and structured these under the mnemonics RECAPS (for memory) and MESSAGE (for communication). This information was adapted for presentation in a DVD-based education program in accordance with known characteristics of effective caregiver education. The resultant DVD comprises (1) information on the nature and importance of memory and communication in everyday life; (2) explanations of common patterns of difficulty and preserved ability in memory and communication across the stages of dementia; (3) acted vignettes demonstrating the strategies, based on authentic samples of speech in dementia; and (4) scenarios to prompt the viewer to consider the benefits of using the strategies. Using a knowledge-translation framework, information and strategies can be provided to family and professional caregivers to help them optimize residual memory and communication in people with dementia. Future development of the materials, incorporating consumer feedback, will focus on methods for enabling wider dissemination.

  13. Learning Efficiency of Two ICT-Based Instructional Strategies in Greek Sheep Farmers

    Science.gov (United States)

    Bellos, Georgios; Mikropoulos, Tassos A.; Deligeorgis, Stylianos; Kominakis, Antonis

    2016-01-01

    Purpose: The objective of the present study was to compare the learning efficiency of two information and communications technology (ICT)-based instructional strategies (multimedia presentation (MP) and concept mapping) in a sample (n = 187) of Greek sheep farmers operating mainly in Western Greece. Design/methodology/approach: In total, 15…

  14. Sampling strategies and materials for investigating large reactive particle complaints from Valley Village homeowners near a coal-fired power plant

    International Nuclear Information System (INIS)

    Chang, A.; Davis, H.; Frazar, B.; Haines, B.

    1997-01-01

    This paper presents Phase 3's sampling strategies, techniques, methods and substrates for assisting the District in resolving complaints involving yellowish-brown staining and spotting of homes, cars, etc. These spots could not be easily washed off, and some were permanent. The sampling strategies for the three phases were based on: Phase 1, the identification of the reactive particles, conducted in October 1989 by APCD and IITRI; Phase 2, a study of the size distribution and concentration of reactive particle deposition as a function of distance and direction, conducted by Radian and LG and E; and Phase 3, the determination of the frequency of soiling events over a full year's duration, conducted in 1995 by APCD and IITRI. The sampling methods included two primary substrates, ACE sheets and painted steel, and four secondary substrates: mailboxes, aluminum siding, painted wood panels and roof tiles. The secondary substrates were the main objects of the Valley Village complaints. The sampling scheme included five Valley Village (VV) soiling/staining assessment sites and one site southwest of the power plant as a background/upwind site. The five VV sites northeast of the power plant covered a 50-degree sector extending 3/4 mile from the stacks. Hourly meteorological data for wind speed and wind direction were collected. Based on this sampling scheme, fifteen staining episodes were detected, nine of them in the summer of 1995

  15. Standard methods for sampling and sample preparation for gamma spectroscopy

    International Nuclear Information System (INIS)

    Taskaeva, M.; Taskaev, E.; Nikolov, P.

    1993-01-01

    The strategy for sampling and sample preparation is outlined: the necessary number of samples; analysis and treatment of the results received; the quantity of analysed material according to the radionuclide concentrations and analytical methods; and the minimal quantity and kind of data needed for drawing final conclusions and making decisions on the basis of the results received. This strategy was tested in gamma spectroscopic analysis of radionuclide contamination in the region of the Eleshnitsa Uranium Mines. The water samples were taken and stored according to ASTM D 3370-82. The general sampling procedures were in conformity with the recommendations of ISO 5667. The radionuclides were concentrated by coprecipitation with iron hydroxide and by ion exchange. The sampling of soil samples complied with the rules of ASTM C 998, and their sample preparation with ASTM C 999. After preparation the samples were sealed hermetically and measured. (author)

  16. Modelling of in-stream nitrogen and phosphorus concentrations using different sampling strategies for calibration data

    Science.gov (United States)

    Jomaa, Seifeddine; Jiang, Sanyuan; Yang, Xiaoqiang; Rode, Michael

    2016-04-01

    It is known that a good evaluation and prediction of surface water pollution is mainly limited by the monitoring strategy and by the capability of the hydrological water quality model to reproduce the internal processes. To this end, a compromise sampling frequency, which can reflect the dynamic behaviour of leached nutrient fluxes responding to changes in land use, agricultural practices and point sources, and an appropriate process-based water quality model are required. The objective of this study was to test the identification of hydrological water quality model parameters (nitrogen and phosphorus) under two different monitoring strategies: (1) a regular grab-sampling approach and (2) regular grab-sampling with additional monitoring during hydrological events using automatic samplers. First, the semi-distributed hydrological water quality model HYPE (Hydrological Predictions for the Environment) was successfully calibrated (1994-1998) for discharge (NSE = 0.86), nitrate-N (lowest NSE for nitrate-N load = 0.69), particulate phosphorus and soluble phosphorus in the Selke catchment (463 km2, central Germany) using the regular grab-sampling approach (biweekly to monthly for nitrogen and phosphorus concentrations). Second, the model was successfully validated for the period 1999-2010 for discharge, nitrate-N, particulate phosphorus and soluble phosphorus (lowest NSE for soluble phosphorus load = 0.54). Results showed that when additional sampling during events with a random grab-sampling approach was used (period 2011-2013), the hydrological model could reproduce only the nitrate-N and soluble phosphorus concentrations reasonably well. However, when additional sampling during the hydrological events was considered, the HYPE model could not represent the measured particulate phosphorus. This reflects the importance of suspended sediment during hydrological events in increasing the concentrations of particulate phosphorus. The HYPE model could
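    The Nash-Sutcliffe efficiency (NSE) values quoted above compare simulated against observed series; the calculation can be sketched in a few lines. The observed/simulated values below are illustrative, not Selke catchment data.

```python
# Sketch: Nash-Sutcliffe efficiency, NSE = 1 - SSE / variance-sum of observations.
# NSE = 1 means a perfect fit; NSE <= 0 means no better than the observed mean.

def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - num / den

# Illustrative discharge-like series
obs = [2.0, 3.5, 5.0, 4.0, 2.5]
sim = [2.2, 3.0, 4.8, 4.3, 2.4]
print(round(nse(obs, sim), 3))
```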

  17. Sampling and analysis strategies to support waste form qualification

    International Nuclear Information System (INIS)

    Westsik, J.H. Jr.; Pulsipher, B.A.; Eggett, D.L.; Kuhn, W.L.

    1989-04-01

    As part of the waste acceptance process, waste form producers will be required to (1) demonstrate that their glass waste form will meet minimum specifications, (2) show that the process can be controlled to consistently produce an acceptable waste form, and (3) provide documentation that the waste form produced meets specifications. Key to the success of these endeavors is adequate sampling and chemical and radiochemical analyses of the waste streams from the waste tanks through the process to the final glass product. This paper suggests sampling and analysis strategies for meeting specific statistical objectives of (1) detection of compositions outside specification limits, (2) prediction of final glass product composition, and (3) estimation of composition in process vessels for both reporting and guiding succeeding process steps. 2 refs., 1 fig., 3 tabs

  18. Strategy for thermo-gravimetric analysis of K East fuel samples

    International Nuclear Information System (INIS)

    Lawrence, L.A.

    1997-01-01

    A strategy was developed for the Thermo-Gravimetric Analysis (TGA) testing of K East fuel samples for oxidation rate determinations. Tests will first establish if there are any differences for dry air oxidation between the K West and K East fuel. These tests will be followed by moist inert gas oxidation rate measurements. The final series of tests will consider pure water vapor i.e., steam

  19. Getting men in the room: perceptions of effective strategies to initiate men's involvement in gender-based violence prevention in a global sample.

    Science.gov (United States)

    Casey, Erin A; Leek, Cliff; Tolman, Richard M; Allen, Christopher T; Carlson, Juliana M

    2017-09-01

    As engaging men in gender-based violence prevention efforts becomes an increasingly institutionalised component of gender equity work globally, clarity is needed about the strategies that best initiate male-identified individuals' involvement in these efforts. The purpose of this study was to examine the perceived relevance and effectiveness of men's engagement strategies from the perspective of men around the world who have organised or attended gender-based violence prevention events. Participants responded to an online survey (available in English, French and Spanish) and rated the effectiveness of 15 discrete engagement strategies derived from earlier qualitative work. Participants also provided suggestions regarding strategies in open-ended comments. Listed strategies cut across the social ecological spectrum and represented both venues in which to reach men, and the content of violence prevention messaging. Results suggest that all strategies, on average, were perceived as effective across regions of the world, with strategies that tailor messaging to topics of particular concern to men (such as fatherhood and healthy relationships) rated most highly. Open-ended comments also surfaced tensions, particularly related to the role of a gender analysis in initial men's engagement efforts. Findings suggest the promise of cross-regional adaptation and information sharing regarding successful approaches to initiating men's anti-violence involvement.

  20. A census-weighted, spatially-stratified household sampling strategy for urban malaria epidemiology

    Directory of Open Access Journals (Sweden)

    Slutsker Laurence

    2008-02-01

    Abstract. Background: Urban malaria is likely to become increasingly important as a consequence of the growing proportion of Africans living in cities. A novel sampling strategy was developed for urban areas to generate a sample simultaneously representative of the population and of inhabited environments. Such a strategy should facilitate analysis of important epidemiological relationships in this ecological context. Methods: Census maps and summary data for Kisumu, Kenya, were used to create a pseudo-sampling frame using the geographic coordinates of census-sampled structures. For every enumeration area (EA) designated as urban by the census (n = 535), a sample of structures equal to one-tenth the number of households was selected. In EAs designated as rural (n = 32), a geographically random sample totalling one-tenth the number of households was selected from a grid of points at 100 m intervals. The selected samples were cross-referenced to a geographic information system, and coordinates were transferred to handheld global positioning units. Interviewers found the closest eligible household to the sampling point and interviewed the caregiver of a child aged Results: 4,336 interviews were completed in 473 of the 567 study area EAs from June 2002 through February 2003. EAs without completed interviews were randomly distributed, and non-response was approximately 2%. Mean distance from the assigned sampling point to the completed interview was 74.6 m, and was significantly less in urban than in rural EAs, even when controlling for the number of households. The selected sample had significantly more children and females of childbearing age than the general population, and fewer older individuals. Conclusion: This method selected a sample that was simultaneously population-representative and inclusive of important environmental variation. The use of a pseudo-sampling frame and pre-programmed handheld GPS units is more efficient and may yield a more complete sample than
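    The per-EA allocation described in the Methods (one-tenth of the households in each enumeration area, drawn from structure coordinates) can be sketched as below. The EA identifiers, household counts and coordinates are invented for illustration.

```python
import random

# Sketch: sample round(households * 0.1) structures in each enumeration
# area (EA), as in the census-weighted strategy above. Data are invented.

def allocate(eas, fraction=0.1, seed=1):
    """Return {ea_id: list of sampled structure coordinates}."""
    rng = random.Random(seed)
    sample = {}
    for ea in eas:
        n = max(1, round(ea["households"] * fraction))
        sample[ea["id"]] = rng.sample(ea["structures"],
                                      min(n, len(ea["structures"])))
    return sample

eas = [
    {"id": "EA-001", "households": 42,
     "structures": [(34.75 + i * 1e-4, -0.09) for i in range(60)]},
    {"id": "EA-002", "households": 27,
     "structures": [(34.76 + i * 1e-4, -0.10) for i in range(30)]},
]
picked = allocate(eas)
print({k: len(v) for k, v in picked.items()})
```

    In the field, each sampled coordinate would then be loaded onto a GPS unit and the closest eligible household interviewed, as the abstract describes.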

  1. Cell-Based Strategies for Meniscus Tissue Engineering

    Science.gov (United States)

    Niu, Wei; Guo, Weimin; Han, Shufeng; Zhu, Yun; Liu, Shuyun; Guo, Quanyi

    2016-01-01

    Meniscus injuries remain a significant challenge due to the poor healing potential of the inner avascular zone. Following a series of studies and clinical trials, tissue engineering is considered a promising prospect for meniscus repair and regeneration. As one of the key factors in tissue engineering, cells are believed to be highly beneficial in generating bionic meniscus structures to replace injured ones in patients. Therefore, cell-based strategies for meniscus tissue engineering play a fundamental role in meniscal regeneration. According to current studies, the main cell-based strategies for meniscus tissue engineering are single cell type strategies; cell coculture strategies also were applied to meniscus tissue engineering. Likewise, on the one side, the zonal recapitulation strategies based on mimicking meniscal differing cells and internal architectures have received wide attentions. On the other side, cell self-assembling strategies without any scaffolds may be a better way to build a bionic meniscus. In this review, we primarily discuss cell seeds for meniscus tissue engineering and their application strategies. We also discuss recent advances and achievements in meniscus repair experiments that further improve our understanding of meniscus tissue engineering. PMID:27274735

  2. THE EFFECT OF THE PROJECT-BASED LEARNING (PjBL) STRATEGY ON THE CRITICAL THINKING SKILLS OF CLASS XI SCIENCE STUDENTS ON COLLOID MATERIAL

    Directory of Open Access Journals (Sweden)

    Nur Hikmah

    2016-11-01

    Education in the 21st century aims to develop students' intellectual abilities so that they can solve the problems they face in real life. Project-Based Learning is one instructional strategy for developing the skills required in the 21st century. Through a given project, students are not only required to achieve the learning objectives that have been set, but are also trained to face a world of work that requires the ability to access, synthesize and communicate information, and to work together to solve complex problems, thereby improving their critical thinking skills. This research is a quasi-experiment with a posttest-only control group design. It aims to determine the influence of the Project-Based Learning (PjBL) strategy on the critical thinking skills of class XI IPA students at SMAN 1 Malua on colloid material. The research population comprised students of class XI IPA at SMAN 1 Malua, sampled with a random cluster sampling technique. Data were analyzed using an independent-sample t-test in SPSS 20 for Windows at the 0.05 level of significance. The results showed a significance level for critical thinking skills of 0.001, which indicates that there is a difference between the critical thinking skills of students taught using the Project-Based Learning (PjBL) strategy and those of students taught using conventional methods.

  3. The Implementation of Knowledge Strategy-Based Entrepreneurial Capacity to Achieve Sustainable Competitive Advantage

    OpenAIRE

    Widodo

    2013-01-01

    This study aims to develop a model of knowledge strategy-based entrepreneurial capacity to achieve sustainable competitive advantage of rural banking in Central Java province. The sampling method is purposive sampling, considering the characteristics of the population items, namely: operational experience of at least 5 years and representatives of each area of rural banking in Semarang, Surakarta and Purwokerto. The sample size is 150 of 251 (59.7%) top managers of rural bank...

  4. Evaluating sampling strategies for larval cisco (Coregonus artedi)

    Science.gov (United States)

    Myers, J.T.; Stockwell, J.D.; Yule, D.L.; Black, J.A.

    2008-01-01

    To improve our ability to assess larval cisco (Coregonus artedi) populations in Lake Superior, we conducted a study to compare several sampling strategies. First, we compared density estimates of larval cisco concurrently captured in surface waters with a 2 x 1-m paired neuston net and a 0.5-m (diameter) conical net. Density estimates obtained from the two gear types were not significantly different, suggesting that the conical net is a reasonable alternative to the more cumbersome and costly neuston net. Next, we assessed the effect of tow pattern (sinusoidal versus straight tows) to examine if propeller wash affected larval density. We found no effect of propeller wash on the catchability of larval cisco. Given the availability of global positioning systems, we recommend sampling larval cisco using straight tows to simplify protocols and facilitate straightforward measurements of volume filtered. Finally, we investigated potential trends in larval cisco density estimates by sampling four time periods during the light period of a day at individual sites. Our results indicate no significant trends in larval density estimates during the day. We conclude estimates of larval cisco density across space are not confounded by time at a daily timescale. Well-designed, cost effective surveys of larval cisco abundance will help to further our understanding of this important Great Lakes forage species.

  5. Unsupervised Performance Evaluation Strategy for Bridge Superstructure Based on Fuzzy Clustering and Field Data

    Directory of Open Access Journals (Sweden)

    Yubo Jiao

    2013-01-01

    Performance evaluation of a bridge is critical for determining the optimal maintenance strategy. An unsupervised bridge superstructure state assessment method is proposed in this paper, based on fuzzy clustering and field-measured bridge data. Firstly, the evaluation index system of the bridge is constructed. Secondly, a certain number of bridge health monitoring data are selected as clustering samples to obtain the fuzzy similarity matrix and the fuzzy equivalent matrix. Finally, different thresholds are selected to form dynamic clustering maps, and the best classification is determined based on statistical analysis. The clustering result is regarded as a sample base, and the bridge state can be evaluated by calculating the fuzzy nearness between the unknown bridge state data and the sample base. Nanping Bridge in Jilin Province was selected as the engineering project to verify the effectiveness of the proposed method.
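    The step from a fuzzy similarity matrix to a fuzzy equivalent matrix, and the threshold cuts that produce the dynamic clustering maps, can be sketched as follows. The 4x4 similarity matrix is invented for illustration; the paper's actual index values are not reproduced here.

```python
# Sketch: fuzzy equivalence clustering. The fuzzy equivalent matrix is the
# max-min transitive closure of the similarity matrix; a lambda-cut at a
# chosen threshold then groups the samples.

def maxmin_compose(a, b):
    n = len(a)
    return [[max(min(a[i][k], b[k][j]) for k in range(n)) for j in range(n)]
            for i in range(n)]

def transitive_closure(r):
    """Square with max-min composition until R*R == R."""
    while True:
        r2 = maxmin_compose(r, r)
        if r2 == r:
            return r
        r = r2

def lambda_cut(r, lam):
    """Group indices whose closure similarity >= lam."""
    n = len(r)
    clusters, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        group = [j for j in range(n) if r[i][j] >= lam]
        seen.update(group)
        clusters.append(group)
    return clusters

# Illustrative reflexive, symmetric fuzzy similarity matrix for 4 samples
R = [[1.0, 0.8, 0.4, 0.5],
     [0.8, 1.0, 0.4, 0.5],
     [0.4, 0.4, 1.0, 0.6],
     [0.5, 0.5, 0.6, 1.0]]
T = transitive_closure(R)
print(lambda_cut(T, 0.6))
```

    Sweeping the threshold from 1.0 downward yields the nested partitions of a dynamic clustering map, from singletons to one all-inclusive cluster.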

  6. Silicon based ultrafast optical waveform sampling

    DEFF Research Database (Denmark)

    Ji, Hua; Galili, Michael; Pu, Minhao

    2010-01-01

    A 300 nm x 450 nm x 5 mm silicon nanowire is designed and fabricated for a four-wave-mixing based non-linear optical gate. Based on this silicon nanowire, an ultra-fast optical sampling system is successfully demonstrated using a free-running fiber laser with a carbon nanotube-based mode-locker as the sampling source. A clear eye-diagram of a 320 Gbit/s data signal is obtained. The temporal resolution of the sampling system is estimated to be 360 fs.

  7. Comparison of active and passive sampling strategies for the monitoring of pesticide contamination in streams

    Science.gov (United States)

    Assoumani, Azziz; Margoum, Christelle; Guillemain, Céline; Coquery, Marina

    2014-05-01

    The monitoring of water bodies for organic contaminants, and the determination of reliable estimates of concentrations, are challenging issues, in particular for the implementation of the Water Framework Directive. Several strategies can be applied to collect water samples for the determination of their contamination level. Grab sampling is fast, easy, and requires little logistical and analytical effort for low-frequency sampling campaigns. However, this technique lacks representativeness for streams with high variations of contaminant concentrations, such as pesticides in rivers located in small agricultural watersheds. Increasing the representativeness of this sampling strategy implies greater logistical needs and higher analytical costs. Average automated sampling is therefore a solution, as it allows, in a single analysis, the determination of more accurate and more relevant estimates of concentrations. Two types of automatic sampling can be performed: time-related sampling allows the assessment of average concentrations, whereas flow-dependent sampling leads to average flux concentrations. However, the purchase and the maintenance of automatic samplers are quite expensive. Passive sampling has recently been developed as an alternative to grab or average automated sampling, to obtain, at lower cost, more realistic estimates of the average concentrations of contaminants in streams. These devices allow the passive accumulation of contaminants from large volumes of water, resulting in ultratrace-level detection and smoothed integrative sampling over periods ranging from days to weeks. They allow the determination of time-weighted average (TWA) concentrations of the dissolved fraction of target contaminants, but they need to be calibrated in controlled conditions prior to field applications. In other words, the kinetics of the uptake of the target contaminants into the sampler must be studied in order to determine the corresponding sampling rate
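    Once the sampling rate is known from calibration, the TWA concentration follows from the accumulated mass; the standard relation C = m / (Rs x t) can be sketched as below. The deployment numbers are hypothetical, not from the study.

```python
# Sketch: time-weighted average (TWA) concentration from a passive sampler,
# C = m / (Rs * t), where Rs is the lab-calibrated sampling rate.
# All numbers are illustrative assumptions.

def twa_concentration(mass_ng, sampling_rate_l_per_day, days):
    """TWA concentration (ng/L) of the dissolved fraction over the deployment."""
    return mass_ng / (sampling_rate_l_per_day * days)

# Hypothetical deployment: 84 ng accumulated over 14 days, Rs = 0.12 L/day
c = twa_concentration(84.0, 0.12, 14)
print(round(c, 1))  # ng/L
```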

  8. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  9. Patch-based visual tracking with online representative sample selection

    Science.gov (United States)

    Ou, Weihua; Yuan, Di; Li, Donghao; Liu, Bin; Xia, Daoxun; Zeng, Wu

    2017-05-01

    Occlusion is one of the most challenging problems in visual object tracking. Recently, a lot of discriminative methods have been proposed to deal with this problem. For the discriminative methods, it is difficult to select the representative samples for the target template updating. In general, the holistic bounding boxes that contain tracked results are selected as the positive samples. However, when the objects are occluded, this simple strategy easily introduces the noises into the training data set and the target template and then leads the tracker to drift away from the target seriously. To address this problem, we propose a robust patch-based visual tracker with online representative sample selection. Different from previous works, we divide the object and the candidates into several patches uniformly and propose a score function to calculate the score of each patch independently. Then, the average score is adopted to determine the optimal candidate. Finally, we utilize the non-negative least square method to find the representative samples, which are used to update the target template. The experimental results on the object tracking benchmark 2013 and on the 13 challenging sequences show that the proposed method is robust to the occlusion and achieves promising results.
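    The non-negative least squares step that selects representative samples can be sketched as follows: the current target appearance y is reconstructed from candidate samples (columns of A), and candidates with non-zero weights are kept for the template update. The solver below is a simple coordinate-descent stand-in, and the feature vectors are invented for illustration.

```python
# Sketch: representative-sample selection via non-negative least squares,
# min ||A x - y||^2 subject to x >= 0, by cyclic coordinate descent.
# A's columns are candidate patch-feature vectors; data are illustrative.

def nnls(A, y, iters=500):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    col = [[A[i][j] for i in range(m)] for j in range(n)]
    for _ in range(iters):
        for j in range(n):
            # residual with candidate j removed
            r = [y[i] - sum(A[i][k] * x[k] for k in range(n) if k != j)
                 for i in range(m)]
            num = sum(col[j][i] * r[i] for i in range(m))
            den = sum(c * c for c in col[j])
            x[j] = max(0.0, num / den)  # clamp to the non-negative orthant
    return x

# Three candidate feature vectors; y is mostly explained by the first two
A = [[1.0, 0.0, 0.5],
     [0.0, 1.0, 0.5],
     [0.0, 0.0, 0.5]]
y = [1.0, 0.2, 0.0]
w = nnls(A, y)
representative = [j for j, wj in enumerate(w) if wj > 1e-3]
print(representative)
```

    Candidates receiving (near-)zero weight contribute nothing to the reconstruction and would be excluded from the template update.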

  10. An Energy Efficient Localization Strategy for Outdoor Objects based on Intelligent Light-Intensity Sampling

    OpenAIRE

    Sandnes, Frode Eika

    2010-01-01

    A simple and low cost strategy for implementing pervasive objects that identify and track their own geographical location is proposed. The strategy, which is not reliant on any GIS infrastructure such as GPS, is realized using an electronic artifact with a built in clock, a light sensor, or low-cost digital camera, persistent storage such as flash and sufficient computational circuitry to make elementary trigonometric computations. The object monitors the lighting conditions and thereby detec...
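    One way such lighting-based localization can work is to derive longitude from the timing of solar noon seen by the light sensor. The sketch below is an assumption about the general technique, not the paper's algorithm: it ignores the equation of time and assumes the built-in clock runs on UTC; the sunrise/sunset times are invented.

```python
# Sketch: longitude from light-sensor sunrise/sunset times. Solar noon is
# the midpoint of daylight; each hour of offset from 12:00 UTC corresponds
# to 15 degrees of longitude. Simplified: no equation-of-time correction.

def longitude_from_light(sunrise_utc_h, sunset_utc_h):
    """Degrees east of Greenwich from observed daylight interval (UTC hours)."""
    solar_noon = (sunrise_utc_h + sunset_utc_h) / 2.0
    return (12.0 - solar_noon) * 15.0

# Hypothetical observation: sunrise 04:30 UTC, sunset ~18:50 UTC
lon = longitude_from_light(4.5, 18.8333)
print(round(lon, 1))  # degrees east
```

    Latitude can similarly be constrained from day length and the date, which is presumably where the trigonometric computations mentioned in the abstract come in.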

  11. Quantitative learning strategies based on word networks

    Science.gov (United States)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires a considerable effort, but the way that vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, learning process optimization will have significant impact and improvement towards English learning and teaching. The recent developments of big data analysis and complex network science provide additional opportunities to design and further investigate the strategies in English learning. In this paper, quantitative English learning strategies based on word network and word usage information are proposed. The strategies integrate the words frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for the unlearned words and dynamically updating of the network are studied and analyzed. The results suggest that quantitative strategies significantly improve learning efficiency while maintaining effectiveness. Especially, the optimized-weight-first strategy and segmented strategies outperform other strategies. The results provide opportunities for researchers and practitioners to reconsider the way of English teaching and designing vocabularies quantitatively by balancing the efficiency and learning costs based on the word network.
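    A minimal sketch of the "optimized-weight-first" idea: an unlearned word's learning weight combines its usage frequency with support from already-learned neighbours in the word network, and the highest-weight word is learned next. The graph, frequencies and the additive weighting formula are illustrative assumptions, not the paper's exact model.

```python
# Sketch: pick the next word to learn from a word network. Weight =
# frequency + number of already-learned neighbours. All data are invented.

def next_word(freq, edges, learned):
    """Return the unlearned word maximizing freq + learned-neighbour support."""
    def weight(w):
        support = sum(1 for u, v in edges
                      if (u == w and v in learned) or (v == w and u in learned))
        return freq[w] + support
    candidates = [w for w in freq if w not in learned]
    return max(candidates, key=weight)

freq = {"take": 9, "give": 8, "umbrella": 2, "run": 5}
edges = [("take", "give"), ("take", "umbrella"), ("give", "run")]
learned = {"take"}
print(next_word(freq, edges, learned))
```

    After each pick, the learned set grows and the weights of the remaining words are recomputed, which is the dynamic network update the abstract describes.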

  12. Technical bases and guidance for the use of composite soil sampling for demonstrating compliance with radiological release criteria

    International Nuclear Information System (INIS)

    Vitkus, Timothy J.

    2012-01-01

    This guidance provides information on methodologies and the technical bases that licensees should consider for incorporating composite sampling strategies into final status survey (FSS) plans. In addition, this guidance also includes appropriate uses of composite sampling for generating the data for other decommissioning site investigations such as characterization or other preliminary site investigations
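    The usual screening logic behind composite soil sampling can be sketched as follows; the decision rule and the numbers are a general illustration, not taken from this guidance. If a k-aliquot composite reads above criterion/k, a single constituent aliquot could exceed the criterion, so individual follow-up measurements are needed.

```python
# Sketch: composite-sample screening against a release criterion. A composite
# of k equal aliquots measures roughly their mean, so any one aliquot can be
# at most k times the composite result. Numbers are illustrative.

def composite_screen(composite_result, criterion, k):
    """True if no single aliquot can possibly exceed the criterion."""
    return composite_result <= criterion / k

# 4-aliquot composite, hypothetical release criterion of 1.0 Bq/g
print(composite_screen(0.2, 1.0, 4))  # no aliquot can exceed 1.0: passes
print(composite_screen(0.3, 1.0, 4))  # an aliquot might exceed 1.0: re-measure
```

    The trade-off is fewer analyses per survey unit against a k-fold tighter effective decision level for the composite measurement.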

  13. Technical bases and guidance for the use of composite soil sampling for demonstrating compliance with radiological release criteria

    Energy Technology Data Exchange (ETDEWEB)

    Vitkus, Timothy J. [Oak Ridge Institute for Science and Education, Oak Ridge, TN (United States). Independent Environmental Assessment and Verification Program

    2012-04-24

    This guidance provides information on methodologies and the technical bases that licensees should consider for incorporating composite sampling strategies into final status survey (FSS) plans. In addition, this guidance also includes appropriate uses of composite sampling for generating the data for other decommissioning site investigations such as characterization or other preliminary site investigations.

  14. Adaptive Angular Sampling for SPECT Imaging

    OpenAIRE

    Li, Nan; Meng, Ling-Jian

    2011-01-01

    This paper presents an analytical approach for performing adaptive angular sampling in single photon emission computed tomography (SPECT) imaging. It allows for a rapid determination of the optimum sampling strategy that minimizes image variance in regions-of-interest (ROIs). The proposed method consists of three key components: (a) a set of closed-form equations for evaluating image variance and resolution attainable with a given sampling strategy, (b) a gradient-based algor...

  15. Interactive Control System, Intended Strategy, Implemented Strategy dan Emergent Strategy

    Directory of Open Access Journals (Sweden)

    Tubagus Ismail

    2012-09-01

Full Text Available The purpose of this study was to examine the relationship between the management control system (MCS) and strategy formation processes, namely intended strategy, emergent strategy and implemented strategy. The focus of MCS in this study was the interactive control system. The study used Structural Equation Modeling (SEM) as its multivariate analysis instrument, with AMOS 16 software as an additional tool to resolve problems in SEM modeling. The sample consisted of upper-middle managers of manufacturing companies in Banten Province, DKI Jakarta Province and West Java Province. The study found that the interactive control system had a positive and significant influence on intended strategy, on implemented strategy, and on emergent strategy. The limitation of this study is that the empirical model used only a one-way relationship between the process of strategy formation and the interactive control system.

  16. Sampling strategies to measure the prevalence of common recurrent infections in longitudinal studies

    Directory of Open Access Journals (Sweden)

    Luby Stephen P

    2010-08-01

Full Text Available Abstract Background Measuring recurrent infections such as diarrhoea or respiratory infections in epidemiological studies is a methodological challenge. Problems in measuring the incidence of recurrent infections include the episode definition, recall error, and the logistics of close follow-up. Longitudinal prevalence (LP), the proportion of time ill estimated by repeated prevalence measurements, is an alternative to incidence measures of recurrent infections. In contrast to incidence, which usually requires continuous sampling, LP can be measured at intervals. This study explored how many more participants are needed for infrequent sampling to achieve the same study power as frequent sampling. Methods We developed a set of four empirical simulation models representing low- and high-risk settings with short or long episode durations. The model was used to evaluate different sampling strategies under different assumptions on recall period and recall error. Results The model identified three major factors that influence sampling strategies: (1) the clustering of episodes in individuals; (2) the duration of episodes; (3) the positive correlation between an individual's disease incidence and episode duration. Intermittent sampling (e.g. 12 times per year) often requires only a slightly larger sample size compared to continuous sampling, especially in cluster-randomized trials. The collection of period prevalence data can lead to highly biased effect estimates if the exposure variable is associated with episode duration. To maximize study power, recall periods of 3 to 7 days may be preferable over shorter periods, even if this leads to inaccuracy in the prevalence estimates. Conclusion Choosing the optimal approach to measure recurrent infections in epidemiological studies depends on the setting, the study objectives, study design and budget constraints. 
Sampling at intervals can contribute to making epidemiological studies and trials more efficient, valid
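The trade-off described above, between continuous and intermittent prevalence sampling, can be illustrated with a toy simulation. This sketch is ours, not the paper's model: a hypothetical episode process with illustrative onset and duration parameters, comparing longitudinal prevalence estimated from daily follow-up with that from 12 visits per year.

```python
import random

# Toy simulation of longitudinal prevalence (LP): the fraction of days
# ill, estimated either from daily follow-up or from intermittent visits.
# Episode model and all rates are illustrative assumptions.

def simulate_person(days=360, p_onset=0.02, duration=4, rng=random):
    """One person's illness calendar: on each healthy day an episode of
    fixed duration starts with probability p_onset."""
    ill = [False] * days
    d = 0
    while d < days:
        if rng.random() < p_onset:
            for j in range(d, min(d + duration, days)):
                ill[j] = True
            d += duration
        else:
            d += 1
    return ill

rng = random.Random(1)
people = [simulate_person(rng=rng) for _ in range(500)]

# Continuous (daily) sampling:
daily_lp = sum(sum(p) for p in people) / (500 * 360)
# Intermittent sampling, 12 visits per year:
visit_days = range(0, 360, 30)
intermittent_lp = sum(p[d] for p in people for d in visit_days) / (500 * 12)
print(round(daily_lp, 3), round(intermittent_lp, 3))
```

With enough participants the intermittent estimate tracks the daily one closely; the paper's point is how much the required sample size grows as visits become sparser.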

  17. OUTPACE long duration stations: physical variability, context of biogeochemical sampling, and evaluation of sampling strategy

    Directory of Open Access Journals (Sweden)

    A. de Verneil

    2018-04-01

Full Text Available Research cruises to quantify biogeochemical fluxes in the ocean require taking measurements at stations lasting at least several days. A popular experimental design is the quasi-Lagrangian drifter, often mounted with in situ incubations or sediment traps that follow the flow of water over time. After initial drifter deployment, the ship tracks the drifter for continuing measurements that are supposed to represent the same water environment. An outstanding question is how best to determine whether this is true. During the Oligotrophy to UlTra-oligotrophy PACific Experiment (OUTPACE) cruise, from 18 February to 3 April 2015 in the western tropical South Pacific, three separate stations of long duration (five days) over the upper 500 m were conducted in this quasi-Lagrangian sampling scheme. Here we present physical data to provide context for these three stations and to assess whether the sampling strategy worked, i.e., that a single body of water was sampled. After analyzing tracer variability and local water circulation at each station, we identify water layers and times where the drifter risks encountering another body of water. While almost no realization of this sampling scheme will be truly Lagrangian, due to the presence of vertical shear, the depth-resolved observations during the three stations show that most layers sampled sufficiently homogeneous physical environments during OUTPACE. By directly addressing the concerns raised by these quasi-Lagrangian sampling platforms, a protocol of best practices can begin to be formulated so that future research campaigns include the complementary datasets and analyses presented here to verify the appropriate use of the drifter platform.

  18. Developing a fuzzy ANP model for performance appraisal based on firm strategy

    Directory of Open Access Journals (Sweden)

    Seid Mohammad Reza Mirahmadi

    2018-10-01

Full Text Available The purpose of this study is to develop a fuzzy Analytic Network Process (ANP) model that can evaluate employee performance under different strategies. A team of experts in strategic human resource management and the senior management of an organization engaged in steel production were involved in the study. The data collection tool was a questionnaire designed based on the criteria of the organization's performance appraisal system. The results showed that in the cost leadership strategy, compliance with the work hierarchy, quantity of work and the ability to make important decisions carried the highest coefficients, while in the focus strategy, participation in group work, supervisory and administrative ability and decision-making ability had the highest importance. In the differentiation strategy, innovation and creativity, quality and offering constructive suggestions received higher ratings than other criteria. Finally, the developed model was used to evaluate the performance of a sample employee.

  19. Strategy-based listening and pragmatic comprehension

    Directory of Open Access Journals (Sweden)

    Corsetti, Cristiane Ruzicki

    2014-01-01

Full Text Available This article addresses the role of strategy-based listening as an alternative methodological approach to developing pragmatic comprehension in L2 contexts. Pragmatic comprehension refers to the understanding of speech acts and conversational implicatures. Listening comprehension comprises both bottom-up and top-down processes. Strategy-based listening encompasses the activation of pragmatic knowledge through pre-listening activities and the development of specific listening micro-skills. An empirical classroom project, carried out in 2009 with a group of eight learners preparing for the IELTS examination, corroborated the following assumptions: in order to achieve listening proficiency, learners need practice in making inferences, as semantic and pragmatic inferences are embedded in verbal communication; and semantic and pragmatic aspects affecting the meaning of utterances can be highlighted via comprehension activities focusing on specific listening subskills. The results of the classroom project suggested that strategy-based listening is potentially capable of directly enhancing pragmatic comprehension, but were inconclusive with regard to pragmatic production.

  20. Neural Substrates of Similarity and Rule-based Strategies in Judgment

    Directory of Open Access Journals (Sweden)

Bettina von Helversen

    2014-10-01

Full Text Available Making accurate judgments is a core human competence and a prerequisite for success in many areas of life. Plenty of evidence exists that people can employ different judgment strategies to solve identical judgment problems. In categorization, it has been demonstrated that similarity-based and rule-based strategies are associated with activity in different brain regions. Building on this research, the present work tests whether solving two identical judgment problems recruits different neural substrates depending on people's judgment strategies. Combining cognitive modeling of judgment strategies at the behavioral level with functional magnetic resonance imaging (fMRI), we compare brain activity when using two archetypal judgment strategies: a similarity-based exemplar strategy and a rule-based heuristic strategy. Using an exemplar-based strategy should recruit areas involved in long-term memory processes to a larger extent than a heuristic strategy. In contrast, using a heuristic strategy should recruit areas involved in the application of rules to a larger extent than an exemplar-based strategy. Largely consistent with our hypotheses, we found that using an exemplar-based strategy led to relatively higher BOLD activity in the anterior prefrontal and inferior parietal cortex, presumably related to retrieval and selective attention processes. In contrast, using a heuristic strategy led to relatively higher activity in areas in the dorsolateral prefrontal and the temporal-parietal cortex associated with cognitive control and information integration. Thus, even when people solve identical judgment problems, different neural substrates can be recruited depending on the judgment strategy involved.

  1. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    Science.gov (United States)

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
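The reweighting idea, using a model-derived auxiliary variable to correct a random sample, can be sketched with the classical ratio estimator. The data and variable names below are illustrative, not from the study:

```python
# Hedged sketch of a ratio-reweighting estimator for the field-wide
# transgene presence rate. It assumes an auxiliary variable (simulated
# cross-pollination from a gene-flow model) is known at every sampled
# location and on average over the whole field. All numbers are made up.

def ratio_estimate(y_sample, x_sample, x_field_mean):
    """Classical ratio estimator: scale the sample mean of the observed
    rate y by how representative the sampled auxiliary values x are of
    the field-wide auxiliary mean."""
    y_bar = sum(y_sample) / len(y_sample)
    x_bar = sum(x_sample) / len(x_sample)
    return y_bar * (x_field_mean / x_bar)

# Observed transgene rates at sampled points (illustrative):
y = [0.012, 0.004, 0.020, 0.001]
# Model-predicted cross-pollination at the same points:
x = [0.010, 0.005, 0.018, 0.002]
# Model-predicted mean over the whole field:
x_mean_field = 0.006

print(ratio_estimate(y, x, x_mean_field))
```

If the sampled locations over-represent high-pollination zones (x̄ above the field mean), the estimator scales the raw sample mean down accordingly, which is the intuition behind the accuracy gains reported in the abstract.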

  2. Clinical usefulness of limited sampling strategies for estimating AUC of proton pump inhibitors.

    Science.gov (United States)

    Niioka, Takenori

    2011-03-01

Cytochrome P450 (CYP) 2C19 (CYP2C19) genotype is regarded as a useful tool to predict the area under the blood concentration-time curve (AUC) of proton pump inhibitors (PPIs). In our results, however, CYP2C19 genotype had no influence on the AUC of any PPI during fluvoxamine treatment. These findings suggest that CYP2C19 genotyping is not always a good indicator for estimating the AUC of PPIs. Limited sampling strategies (LSS) were developed to estimate AUC simply and accurately. It is important to minimize the number of blood samples for the sake of patient acceptance. This article reviews the usefulness of LSS for estimating the AUC of three PPIs (omeprazole: OPZ, lansoprazole: LPZ and rabeprazole: RPZ). The best prediction formulas for each PPI were AUC(OPZ) = 9.24 × C(6h) + 2638.03, AUC(LPZ) = 12.32 × C(6h) + 3276.09 and AUC(RPZ) = 1.39 × C(3h) + 7.17 × C(6h) + 344.14, respectively. In order to optimize the sampling strategy for LPZ, we tried to establish an LSS for LPZ using a time point within 3 hours, exploiting the pharmacokinetic properties of its enantiomers. The best prediction formula using the fewest sampling points (one time point) was AUC(racemic LPZ) = 6.5 × C(3h) of (R)-LPZ + 13.7 × C(3h) of (S)-LPZ - 9917.3 × G1 - 14387.2 × G2 + 7103.6 (G1: homozygous extensive metabolizer is 1 and the other genotypes are 0; G2: heterozygous extensive metabolizer is 1 and the other genotypes are 0). Such strategies, based on plasma concentration monitoring at one or two time points, may be more suitable for AUC estimation than reference to CYP2C19 genotype, particularly in the case of coadministration of CYP mediators.
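For illustration, the three single-analyte prediction formulas quoted in the abstract translate directly into code. Concentration and AUC units follow the original study and are assumed here; the input values in the usage lines are hypothetical:

```python
# Hedged sketch of the limited-sampling formulas quoted in the abstract.
# Units for the concentrations C(3h)/C(6h) and for the resulting AUC
# follow the original study and are assumed, not restated, here.

def auc_opz(c6h: float) -> float:
    """Omeprazole AUC estimated from the 6 h plasma concentration."""
    return 9.24 * c6h + 2638.03

def auc_lpz(c6h: float) -> float:
    """Lansoprazole AUC estimated from the 6 h plasma concentration."""
    return 12.32 * c6h + 3276.09

def auc_rpz(c3h: float, c6h: float) -> float:
    """Rabeprazole AUC estimated from the 3 h and 6 h concentrations."""
    return 1.39 * c3h + 7.17 * c6h + 344.14

# Hypothetical concentrations, for illustration only:
print(auc_opz(500.0))        # single 6 h sample suffices for OPZ
print(auc_rpz(800.0, 400.0)) # RPZ needs two time points
```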

  3. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards

    OpenAIRE

    Bornstein, Marc H.; Jager, Justin; Putnick, Diane L.

    2013-01-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study’s t...

  4. Knowledge Governance Strategies in Project-based Organizations

    DEFF Research Database (Denmark)

    Pemsel, Sofia; Müller, Ralf; Söderlund, Jonas

    2016-01-01

Knowledge governance (KG) aims at strategically influencing knowledge processes by implementing governance mechanisms. Little is known about whether, how, or why such strategies differ among firms. We utilize a large-scale empirical study of 20 organizations to develop a typology of KG strategies in project-based organizations; we then explore how these strategies emerge and affect organizational knowledge processes. Six strategies are identified: Protector, Deliverer, Polisher, Explorer, Supporter, and Analyzer. This paper posits a multi-level categorization model to facilitate comparisons among KG strategies. We uncover three main drivers of organizations' chosen knowledge governance strategies, namely attitudes about humans, knowledge, and knowledge control.

  5. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Science.gov (United States)

    Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga

    2015-01-01

    Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. 
Alternatively, we discuss how

  6. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Directory of Open Access Journals (Sweden)

    Abhishek Mitra

Full Text Available Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. 
Alternatively

  7. Value of recruitment strategies used in a primary care practice-based trial.

    Science.gov (United States)

    Ellis, Shellie D; Bertoni, Alain G; Bonds, Denise E; Clinch, C Randall; Balasubramanyam, Aarthi; Blackwell, Caroline; Chen, Haiying; Lischke, Michael; Goff, David C

    2007-05-01

"Physicians-recruiting-physicians" is the preferred recruitment approach for practice-based research. However, yields are variable, and the approach can be costly and lead to biased, unrepresentative samples. We sought to explore the potential efficiency of alternative methods. We conducted a retrospective analysis of the yield and cost of 10 recruitment strategies used to recruit primary care practices to a randomized trial to improve cardiovascular disease risk factor management. We measured response and recruitment yields and the resources used to estimate the value of each strategy. Providers at recruited practices were surveyed about their motivation for participation. Response to 6 opt-in marketing strategies was 0.40% (53/13290), ranging from 0% to 2.86% by strategy; 33.96% (18/53) of responders were recruited to the study. Of those recruited from opt-out strategies, 8.68% joined the study, ranging from 5.35% to 41.67% per strategy. A strategy that combined both opt-in and opt-out approaches resulted in a 51.14% (90/176) response and a 10.80% (19/90) recruitment rate. The cost of recruitment was $613 per recruited practice. Recruitment approaches based on in-person meetings (41.67%), previous relationships (33.33%), and borrowing an Area Health Education Center's established networks (10.80%) yielded the most recruited practices per effort and were the most cost efficient. Individual providers who chose to participate were motivated by interest in improving their clinical practice (80.5%), contributing to CVD primary prevention (54.4%), and invigorating their practice with new ideas (42.1%). This analysis provides suggestions for future recruitment efforts and research. Translational studies with limited funds could consider multi-modal recruitment approaches including in-person presentations to practice groups and exploitation of previous relationships, which require the providers to opt out, and interactive opt-in approaches which rely on borrowed networks. These

  8. Improvement of a sample preparation method assisted by sodium deoxycholate for mass-spectrometry-based shotgun membrane proteomics.

    Science.gov (United States)

    Lin, Yong; Lin, Haiyan; Liu, Zhonghua; Wang, Kunbo; Yan, Yujun

    2014-11-01

    In current shotgun-proteomics-based biological discovery, the identification of membrane proteins is a challenge. This is especially true for integral membrane proteins due to their highly hydrophobic nature and low abundance. Thus, much effort has been directed at sample preparation strategies such as use of detergents, chaotropes, and organic solvents. We previously described a sample preparation method for shotgun membrane proteomics, the sodium deoxycholate assisted method, which cleverly circumvents many of the challenges associated with traditional sample preparation methods. However, the method is associated with significant sample loss due to the slightly weaker extraction/solubilization ability of sodium deoxycholate when it is used at relatively low concentrations such as 1%. Hence, we present an enhanced sodium deoxycholate sample preparation strategy that first uses a high concentration of sodium deoxycholate (5%) to lyse membranes and extract/solubilize hydrophobic membrane proteins, and then dilutes the detergent to 1% for a more efficient digestion. We then applied the improved method to shotgun analysis of proteins from rat liver membrane enriched fraction. Compared with other representative sample preparation strategies including our previous sodium deoxycholate assisted method, the enhanced sodium deoxycholate method exhibited superior sensitivity, coverage, and reliability for the identification of membrane proteins particularly those with high hydrophobicity and/or multiple transmembrane domains. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. SPR based immunosensor for detection of Legionella pneumophila in water samples

    Science.gov (United States)

De Lorenzis, Enrico; Manera, Maria G.; Montagna, Giovanni; Cimaglia, Fabio; Chiesa, Maurizio; Poltronieri, Palmiro; Santino, Angelo; Rella, Roberto

    2013-05-01

Detection of legionellae by water sampling is an important factor in epidemiological investigations of Legionnaires' disease and its prevention. To avoid the labor-intensive problems of conventional methods, an alternative, highly sensitive and simple method is proposed for detecting L. pneumophila in aqueous samples. A compact Surface Plasmon Resonance (SPR) instrumentation prototype, provided with proper microfluidic tools, was built. The developed immunosensor is capable of dynamically following the binding between antigens and the corresponding antibody molecules immobilized on the SPR sensor surface. The immobilization strategy used in this work includes an efficient step that orients the antibodies on the sensor surface. The feasibility of integrating SPR-based biosensing setups with microfluidic technologies, resulting in a low-cost and portable biosensor, is demonstrated.

  10. Graphene-based sample supports for in situ high-resolution TEM electrical investigations

    International Nuclear Information System (INIS)

    Westenfelder, B; Scholz, F; Meyer, J C; Biskupek, J; Algara-Siller, G; Lechner, L G; Kaiser, U; Kusterer, J; Kohn, E; Krill, C E III

    2011-01-01

Specially designed transmission electron microscopy (TEM) sample carriers have been developed to enable atomically resolved studies of the heat-induced evolution of adsorbates on graphene and their influence on electrical conductivity. Here, we present a strategy for realizing graphene-based carriers, evaluating its design with respect to fabrication effort and application potential. We demonstrate that electrical current can lead to very high temperatures in suspended graphene membranes, and we determine that current-induced cleaning of graphene results from Joule heating.

  11. Representativeness-based sampling network design for the State of Alaska

    Science.gov (United States)

    Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove

    2013-01-01

    Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...

  12. Gaussian process based intelligent sampling for measuring nano-structure surfaces

    Science.gov (United States)

    Sun, L. J.; Ren, M. J.; Yin, Y. H.

    2016-09-01

Nanotechnology is the science and engineering of manipulating matter at the nano scale, which can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers with very low efficiency. Intelligent sampling strategies are therefore required to improve the scanning efficiency when measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method uses Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimate of the Gaussian process is computed by combining the prior probability distribution with the likelihood function. Each sampling point is then adaptively selected by determining, among the candidates, the position most likely to lie outside the required tolerance zone, which is inserted to update the model iteratively. Simulations on both nominal and manufactured nano-structure surfaces have been conducted to verify the validity of the proposed method. The results imply that the proposed method significantly improves measurement efficiency when measuring large structured surfaces.
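A minimal sketch of the iterative select-and-update loop described above is given below. It assumes numpy and substitutes a simpler max-posterior-variance selection rule for the paper's tolerance-zone criterion; all function names and parameter values are ours:

```python
import numpy as np

# Hedged sketch of Gaussian-process-based adaptive sampling on a 1-D
# profile: fit a GP to the points measured so far, then measure next
# wherever the posterior is most uncertain. The RBF kernel, length
# scale, and test surface are illustrative assumptions.

def rbf(a, b, length=0.1):
    """Squared-exponential kernel between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

def gp_posterior(xs, ys, xq, noise=1e-6):
    """GP posterior mean and variance at query points xq."""
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(xs, xq)
    Kss = rbf(xq, xq)
    sol = np.linalg.solve(K, Ks)
    return sol.T @ ys, np.diag(Kss - Ks.T @ sol)

surface = lambda x: np.sin(8 * x)   # stand-in for the measured profile
candidates = np.linspace(0.0, 1.0, 200)
xs = np.array([0.0, 1.0])           # initial measurements
ys = surface(xs)

for _ in range(10):                 # adaptively insert 10 more points
    _, var = gp_posterior(xs, ys, candidates)
    nxt = candidates[np.argmax(var)]    # most uncertain location
    xs = np.append(xs, nxt)
    ys = np.append(ys, surface(nxt))

mean, var = gp_posterior(xs, ys, candidates)
print(float(var.max()))             # residual posterior uncertainty
```

Each iteration measures where the model is least sure, so the residual variance shrinks much faster than with the same number of uniformly spaced points.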

  13. Parallel Representation of Value-Based and Finite State-Based Strategies in the Ventral and Dorsal Striatum.

    Directory of Open Access Journals (Sweden)

    Makoto Ito

    2015-11-01

Full Text Available Previous theoretical studies of animal and human behavioral learning have focused on the dichotomy of the value-based strategy using action value functions to predict rewards and the model-based strategy using internal models to predict environmental states. However, animals and humans often take simple procedural behaviors, such as the "win-stay, lose-switch" strategy without explicit prediction of rewards or states. Here we consider another strategy, the finite state-based strategy, in which a subject selects an action depending on its discrete internal state and updates the state depending on the action chosen and the reward outcome. By analyzing choice behavior of rats in a free-choice task, we found that the finite state-based strategy fitted their behavioral choices more accurately than value-based and model-based strategies did. When fitted models were run autonomously with the same task, only the finite state-based strategy could reproduce the key feature of choice sequences. Analyses of neural activity recorded from the dorsolateral striatum (DLS), the dorsomedial striatum (DMS), and the ventral striatum (VS) identified significant fractions of neurons in all three subareas for which activities were correlated with individual states of the finite state-based strategy. The signal of internal states at the time of choice was found in DMS, and for clusters of states was found in VS. In addition, action values and state values of the value-based strategy were encoded in DMS and VS, respectively. These results suggest that both the value-based strategy and the finite state-based strategy are implemented in the striatum.

  14. Parallel Representation of Value-Based and Finite State-Based Strategies in the Ventral and Dorsal Striatum.

    Science.gov (United States)

    Ito, Makoto; Doya, Kenji

    2015-11-01

    Previous theoretical studies of animal and human behavioral learning have focused on the dichotomy of the value-based strategy using action value functions to predict rewards and the model-based strategy using internal models to predict environmental states. However, animals and humans often take simple procedural behaviors, such as the "win-stay, lose-switch" strategy without explicit prediction of rewards or states. Here we consider another strategy, the finite state-based strategy, in which a subject selects an action depending on its discrete internal state and updates the state depending on the action chosen and the reward outcome. By analyzing choice behavior of rats in a free-choice task, we found that the finite state-based strategy fitted their behavioral choices more accurately than value-based and model-based strategies did. When fitted models were run autonomously with the same task, only the finite state-based strategy could reproduce the key feature of choice sequences. Analyses of neural activity recorded from the dorsolateral striatum (DLS), the dorsomedial striatum (DMS), and the ventral striatum (VS) identified significant fractions of neurons in all three subareas for which activities were correlated with individual states of the finite state-based strategy. The signal of internal states at the time of choice was found in DMS, and for clusters of states was found in VS. In addition, action values and state values of the value-based strategy were encoded in DMS and VS, respectively. These results suggest that both the value-based strategy and the finite state-based strategy are implemented in the striatum.
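The "win-stay, lose-switch" rule mentioned above is the simplest finite state-based strategy: the internal state is just the action to repeat next, updated from the last action and its outcome, with no reward or state prediction. A toy two-choice simulation (the reward probabilities are illustrative, not those of the rat task):

```python
import random

# Hedged sketch of "win-stay, lose-switch" as a two-action finite
# state-based strategy. The internal state IS the next action; it is
# updated only from (action, outcome), never from learned values.

def win_stay_lose_switch(reward_probs, trials=1000, seed=0):
    """Run WSLS on a two-armed Bernoulli bandit; return the reward rate."""
    rng = random.Random(seed)
    state = 0                      # internal state = action to take next
    rewards = 0
    for _ in range(trials):
        action = state
        rewarded = rng.random() < reward_probs[action]
        rewards += rewarded
        # State transition depends only on action and outcome:
        state = action if rewarded else 1 - action
    return rewards / trials

print(win_stay_lose_switch([0.7, 0.3]))
```

With reward probabilities 0.7 and 0.3, the induced Markov chain over actions yields an expected reward rate of 0.7 × 0.7 + 0.3 × 0.3 = 0.58, below what a value-based learner that settles on the better arm would earn, which is one behavioral signature distinguishing the strategies.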

  15. Utilizing the ultrasensitive Schistosoma up-converting phosphor lateral flow circulating anodic antigen (UCP-LF CAA) assay for sample pooling-strategies.

    Science.gov (United States)

    Corstjens, Paul L A M; Hoekstra, Pytsje T; de Dood, Claudia J; van Dam, Govert J

    2017-11-01

    Methodological applications of the high-sensitivity genus-specific Schistosoma CAA strip test, allowing detection of single-worm active infections (ultimate sensitivity), are discussed for efficient utilization in sample pooling strategies. Besides relevant cost reduction, pooling of samples rather than individual testing can provide valuable data for large-scale mapping, surveillance, and monitoring. The laboratory-based CAA strip test utilizes luminescent quantitative up-converting phosphor (UCP) reporter particles and a rapid user-friendly lateral flow (LF) assay format. The test includes a sample preparation step that permits virtually unlimited sample concentration with urine, reaching ultimate sensitivity (single-worm detection) at 100% specificity. This facilitates testing large urine pools from many individuals with minimal loss of sensitivity and specificity. The test determines the average CAA level of the individuals in the pool, thus indicating overall worm burden and prevalence. When test results are required at the individual level, smaller pools need to be analysed, with the pool size based on expected prevalence or, when unknown, on the average CAA level of a larger group; CAA-negative pools do not require individual test results and thus reduce the number of tests. Straightforward pooling strategies indicate that at the sub-population level the CAA strip test is an efficient assay for general mapping, identification of hotspots, determination of stratified infection levels, and accurate monitoring of mass drug administrations (MDA). At the individual level, the number of tests can be reduced, e.g. in low-endemicity settings, as the pool size can be increased as prevalence decreases. At the sub-population level, average CAA concentrations determined in urine pools can be an appropriate measure indicating worm burden. Pooling strategies allowing this type of large-scale testing are feasible with the various CAA strip test formats and do not affect
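The economics of pooling described above follow standard two-stage group-testing arithmetic: one test clears a whole negative pool, while a positive pool triggers individual retesting. A back-of-the-envelope sketch using Dorfman's classic scheme (a generic illustration, not the specific CAA protocol):

```python
def expected_tests_per_person(p, k):
    """Expected number of tests per individual under two-stage (Dorfman)
    pooling: one pool test shared by k people, plus k individual tests
    whenever the pool is positive, which happens with probability
    1 - (1 - p)**k for prevalence p."""
    return 1.0 / k + 1.0 - (1.0 - p) ** k

# At low prevalence, larger pools cut the testing burden sharply:
best_k = min(range(2, 51), key=lambda k: expected_tests_per_person(0.01, k))
```

At 1% prevalence the optimum pool size comes out around 11, reducing the expected number of tests to roughly a fifth of one-test-per-person, which is the intuition behind increasing pool size as prevalence decreases.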

  16. The use of a genetic algorithm-based search strategy in geostatistics: application to a set of anisotropic piezometric head data

    Science.gov (United States)

    Abedini, M. J.; Nasseri, M.; Burn, D. H.

    2012-04-01

    In any geostatistical study, an important consideration is the choice of an appropriate, repeatable, and objective search strategy that controls the nearby samples to be included in the location-specific estimation procedure. Almost all geostatistical software available in the market puts the onus on the user to supply search strategy parameters in a heuristic manner. These parameters are solely controlled by geographical coordinates that are defined for the entire area under study, and the user has no guidance as to how to choose these parameters. The main thesis of the current study is that the selection of search strategy parameters has to be driven by data—both the spatial coordinates and the sample values—and cannot be chosen beforehand. For this purpose, a genetic-algorithm-based ordinary kriging with moving neighborhood technique is proposed. The search capability of a genetic algorithm is exploited to search the feature space for appropriate, either local or global, search strategy parameters. Radius of circle/sphere and/or radii of standard or rotated ellipse/ellipsoid are considered as the decision variables to be optimized by GA. The superiority of GA-based ordinary kriging is demonstrated through application to the Wolfcamp Aquifer piezometric head data. Assessment of numerical results showed that definition of search strategy parameters based on both geographical coordinates and sample values improves cross-validation statistics when compared with that based on geographical coordinates alone. In the case of a variable search neighborhood for each estimation point, optimization of local search strategy parameters for an elliptical support domain—the orientation of which is dictated by anisotropic axes—via GA was able to capture the dynamics of piezometric head in west Texas/New Mexico in an efficient way.
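The data-driven selection of search-strategy parameters described above can be sketched in miniature. For brevity this toy replaces ordinary kriging with inverse-distance weighting and optimizes only a single isotropic search radius by leave-one-out cross-validation; in the study, the radii and orientation of a rotated ellipse/ellipsoid would enter as additional genes:

```python
import math, random

def idw_estimate(pts, vals, target, radius, power=2.0):
    """Inverse-distance estimate using only neighbours within `radius`
    (a simple stand-in for ordinary kriging with a moving neighbourhood)."""
    num = den = 0.0
    for (x, y), v in zip(pts, vals):
        d = math.hypot(x - target[0], y - target[1])
        if 0.0 < d <= radius:
            w = 1.0 / d ** power
            num += w * v
            den += w
    return num / den if den else None

def cv_error(pts, vals, radius):
    """Leave-one-out cross-validation RMSE for a given search radius."""
    errs = []
    for i in range(len(pts)):
        est = idw_estimate(pts[:i] + pts[i + 1:], vals[:i] + vals[i + 1:],
                           pts[i], radius)
        if est is not None:
            errs.append((est - vals[i]) ** 2)
    return math.sqrt(sum(errs) / len(errs)) if errs else float("inf")

def ga_radius(pts, vals, pop=20, gens=15, seed=1):
    """Tiny real-coded GA: truncation selection, blend crossover, and
    Gaussian mutation, searching for the radius minimising CV error."""
    rng = random.Random(seed)
    population = [rng.uniform(0.1, 10.0) for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=lambda r: cv_error(pts, vals, r))
        parents = scored[:pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.2)
            children.append(min(10.0, max(0.1, child)))
        population = parents + children
    return min(population, key=lambda r: cv_error(pts, vals, r))
```

The fitness function ties the search neighborhood to both coordinates and sample values, which is the core argument of the paper: the radius is scored by how well held-out samples are predicted, not chosen heuristically beforehand.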

  17. Individual-based and group-based occupational exposure assessment: some equations to evaluate different strategies.

    NARCIS (Netherlands)

    Tielemans, E.; Kupper, L.L.; Kromhout, H.; Heederik, D.; Houba, R.

    1998-01-01

    Basically, two strategies can be considered for the analysis of hazardous pollutants in the work environment: group-based and individual-based strategies. This paper provides existing and recently derived equations for both strategies describing the influence of several factors on attenuation and on

  18. Strategy selection in cue-based decision making.

    Science.gov (United States)

    Bryant, David J

    2014-06-01

    People can make use of a range of heuristic and rational, compensatory strategies to perform a multiple-cue judgment task. It has been proposed that people are sensitive to the amount of cognitive effort required to employ decision strategies. Experiment 1 employed a dual-task methodology to investigate whether participants' preference for heuristic versus compensatory decision strategies can be altered by increasing the cognitive demands of the task. As indicated by participants' decision times, a secondary task interfered more with the performance of a heuristic than compensatory decision strategy but did not affect the proportions of participants using either type of strategy. A stimulus set effect suggested that the conjunction of cue salience and cue validity might play a determining role in strategy selection. The results of Experiment 2 indicated that when a perceptually salient cue was also the most valid, the majority of participants preferred a single-cue heuristic strategy. Overall, the results contradict the view that heuristics are more likely to be adopted when a task is made more cognitively demanding. It is argued that people employ 2 learning processes during training, one an associative learning process in which cue-outcome associations are developed by sampling multiple cues, and another that involves the sequential examination of single cues to serve as a basis for a single-cue heuristic.
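The contrast between a single-cue heuristic and a compensatory strategy can be made concrete. In this hypothetical sketch, each alternative is a vector of binary cue values and each cue carries a validity weight; the heuristic decides on the single most valid discriminating cue (in the spirit of take-the-best), while the compensatory rule sums validity-weighted cues:

```python
def single_cue_choice(cue_profiles, validities):
    """Heuristic: decide on the single most valid cue that discriminates."""
    order = sorted(range(len(validities)), key=lambda i: -validities[i])
    a, b = cue_profiles
    for i in order:
        if a[i] != b[i]:
            return 0 if a[i] > b[i] else 1
    return 0  # guess when no cue discriminates

def compensatory_choice(cue_profiles, validities):
    """Compensatory: weight every cue by its validity and sum the evidence."""
    a, b = cue_profiles
    score_a = sum(w * c for w, c in zip(validities, a))
    score_b = sum(w * c for w, c in zip(validities, b))
    return 0 if score_a >= score_b else 1
```

With validities (0.9, 0.7, 0.6) and profiles (1, 0, 0) versus (0, 1, 1), the two strategies disagree: the heuristic follows the most valid cue, while the compensatory rule lets the two weaker cues outvote it, which is what makes strategy use identifiable from choice data.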

  19. A preliminary evaluation of comminution and sampling strategies for radioactive cemented waste

    Energy Technology Data Exchange (ETDEWEB)

    Bilodeau, M.; Lastra, R.; Bouzoubaa, N. [Natural Resources Canada, Ottawa, ON (Canada); Chapman, M. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2011-07-01

    Lixiviation of Hg, U and Cs contaminants and micro-encapsulation of cemented radioactive waste (CRW) are the two main components of a CRW stabilization research project carried out at Natural Resources Canada in collaboration with Atomic Energy of Canada Limited. Unmolding CRW from the storage pail, its fragmentation into a size range suitable for both processes and the collection of a representative sample are three essential steps for providing optimal material conditions for the two studies. Separation of wires, metals and plastic incorporated into CRW samples is also required. A comminution and sampling strategy was developed to address all those needs. Dust emissions and other health and safety concerns were given full consideration. Surrogate cemented waste (SCW) was initially used for this comminution study where Cu was used as a substitute for U and Hg. SCW was characterized as a friable material through the measurement of the Bond work index of 7.7 kWh/t. A mineralogical investigation and the calibration of material heterogeneity parameters of the sampling error model showed that Cu, Hg and Cs are finely disseminated in the cement matrix. A sampling strategy was built from the model and successfully validated with radioactive waste. A larger than expected sampling error was observed with U due to the formation of large U solid phases, which were not observed with the Cu tracer. SCW samples were crushed and ground under different rock fragmentation mechanisms: compression (jaw and cone crushers, rod mill), impact (ball mill), attrition, high voltage disintegration and high pressure water (and liquid nitrogen) jetting. Cryogenic grinding was also tested with the attrition mill. Crushing and grinding technologies were assessed against criteria that were gathered from literature surveys, experiential know-how and discussion with the client and field experts. Water jetting and its liquid nitrogen variant were retained for pail cutting and waste unmolding while

  20. A preliminary evaluation of comminution and sampling strategies for radioactive cemented waste

    International Nuclear Information System (INIS)

    Bilodeau, M.; Lastra, R.; Bouzoubaa, N.; Chapman, M.

    2011-01-01

    Lixiviation of Hg, U and Cs contaminants and micro-encapsulation of cemented radioactive waste (CRW) are the two main components of a CRW stabilization research project carried out at Natural Resources Canada in collaboration with Atomic Energy of Canada Limited. Unmolding CRW from the storage pail, its fragmentation into a size range suitable for both processes and the collection of a representative sample are three essential steps for providing optimal material conditions for the two studies. Separation of wires, metals and plastic incorporated into CRW samples is also required. A comminution and sampling strategy was developed to address all those needs. Dust emissions and other health and safety concerns were given full consideration. Surrogate cemented waste (SCW) was initially used for this comminution study where Cu was used as a substitute for U and Hg. SCW was characterized as a friable material through the measurement of the Bond work index of 7.7 kWh/t. A mineralogical investigation and the calibration of material heterogeneity parameters of the sampling error model showed that Cu, Hg and Cs are finely disseminated in the cement matrix. A sampling strategy was built from the model and successfully validated with radioactive waste. A larger than expected sampling error was observed with U due to the formation of large U solid phases, which were not observed with the Cu tracer. SCW samples were crushed and ground under different rock fragmentation mechanisms: compression (jaw and cone crushers, rod mill), impact (ball mill), attrition, high voltage disintegration and high pressure water (and liquid nitrogen) jetting. Cryogenic grinding was also tested with the attrition mill. Crushing and grinding technologies were assessed against criteria that were gathered from literature surveys, experiential know-how and discussion with the client and field experts. Water jetting and its liquid nitrogen variant were retained for pail cutting and waste unmolding while

  1. A radial sampling strategy for uniform k-space coverage with retrospective respiratory gating in 3D ultrashort-echo-time lung imaging.

    Science.gov (United States)

    Park, Jinil; Shin, Taehoon; Yoon, Soon Ho; Goo, Jin Mo; Park, Jang-Yeon

    2016-05-01

    The purpose of this work was to develop a 3D radial-sampling strategy which maintains uniform k-space sample density after retrospective respiratory gating, and demonstrate its feasibility in free-breathing ultrashort-echo-time lung MRI. A multi-shot, interleaved 3D radial sampling function was designed by segmenting a single-shot trajectory of projection views such that each interleaf samples k-space in an incoherent fashion. An optimal segmentation factor for the interleaved acquisition was derived based on an approximate model of respiratory patterns such that radial interleaves are evenly accepted during the retrospective gating. The optimality of the proposed sampling scheme was tested by numerical simulations and phantom experiments using human respiratory waveforms. Retrospectively respiratory-gated, free-breathing lung MRI with the proposed sampling strategy was performed in healthy subjects. The simulation yielded the most uniform k-space sample density with the optimal segmentation factor, as evidenced by the smallest standard deviation of the number of neighboring samples as well as minimal side-lobe energy in the point spread function. The optimality of the proposed scheme was also confirmed by minimal image artifacts in phantom images. Human lung images showed that the proposed sampling scheme significantly reduced streak and ring artifacts compared with the conventional retrospective respiratory gating while suppressing motion-related blurring compared with full sampling without respiratory gating. In conclusion, the proposed 3D radial-sampling scheme can effectively suppress the image artifacts due to non-uniform k-space sample density in retrospectively respiratory-gated lung MRI by uniformly distributing gated radial views across the k-space. Copyright © 2016 John Wiley & Sons, Ltd.
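The uniformity property being engineered here can be illustrated in a simplified 2D analogy (this is not the paper's 3D interleaving scheme): with golden-angle view ordering, any contiguous block of accepted views covers angle space nearly uniformly, which is the kind of gating-robust coverage the segmentation factor is tuned to preserve:

```python
import math

GOLDEN_ANGLE = math.pi * (math.sqrt(5) - 1) / 2   # ~111.25 degrees, in radians

def golden_angle_views(n_views):
    """Golden-angle ordering of 2D radial views: any contiguous subset
    (e.g. the views surviving a respiratory gate) samples [0, pi)
    almost uniformly."""
    return [(k * GOLDEN_ANGLE) % math.pi for k in range(n_views)]

def max_gap(angles):
    """Largest angular gap -- a simple uniformity metric for a view set."""
    s = sorted(angles)
    gaps = [b - a for a, b in zip(s, s[1:])]
    gaps.append(s[0] + math.pi - s[-1])   # wrap-around gap
    return max(gaps)
```

Even if a gate keeps only the first 40% of views, the largest gap stays within a small constant factor of the ideal mean spacing, rather than leaving whole angular sectors unsampled as a sequential ordering would.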

  2. Innovative recruitment using online networks: lessons learned from an online study of alcohol and other drug use utilizing a web-based, respondent-driven sampling (webRDS) strategy.

    Science.gov (United States)

    Bauermeister, José A; Zimmerman, Marc A; Johns, Michelle M; Glowacki, Pietreck; Stoddard, Sarah; Volz, Erik

    2012-09-01

    We used a web version of Respondent-Driven Sampling (webRDS) to recruit a sample of young adults (ages 18-24) and examined whether this strategy would result in alcohol and other drug (AOD) prevalence estimates comparable to national estimates (National Survey on Drug Use and Health [NSDUH]). We recruited 22 initial participants (seeds) via Facebook to complete a web survey examining AOD risk correlates. Sequential, incentivized recruitment continued until our desired sample size was achieved. After correcting for webRDS clustering effects, we contrasted our AOD prevalence estimates (past 30 days) to NSDUH estimates by comparing the 95% confidence intervals of prevalence estimates. We found comparable AOD prevalence estimates between our sample and NSDUH for the past 30 days for alcohol, marijuana, cocaine, Ecstasy (3,4-methylenedioxymethamphetamine, or MDMA), and hallucinogens. Cigarette use was lower than NSDUH estimates. WebRDS may be a suitable strategy to recruit young adults online. We discuss the unique strengths and challenges that may be encountered by public health researchers using webRDS methods.
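Contrasting two prevalence estimates by comparing their 95% confidence intervals, as done above, can be sketched as follows (Wilson score intervals for illustration; note that non-overlap of intervals is a conservative test of a difference, so this mirrors the paper's screening-level comparison rather than a formal hypothesis test):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a proportion (successes out of n)."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

def cis_overlap(ci1, ci2):
    """Do two confidence intervals overlap? Used here as a crude
    comparability check between sample and national estimates."""
    return ci1[0] <= ci2[1] and ci2[0] <= ci1[1]
```

For example, an observed prevalence of 30/100 yields an interval of roughly (0.22, 0.40); a national estimate whose interval overlaps that range would be judged "comparable" under this criterion.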

  3. Standard operating procedures for collection of soil and sediment samples for the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study

    Science.gov (United States)

    Fisher, Shawn C.; Reilly, Timothy J.; Jones, Daniel K.; Benzel, William M.; Griffin, Dale W.; Loftin, Keith A.; Iwanowicz, Luke R.; Cohl, Jonathan A.

    2015-12-17

    An understanding of the effects on human and ecological health brought by major coastal storms or flooding events is typically limited because of a lack of regionally consistent baseline and trends data in locations proximal to potential contaminant sources and mitigation activities, sensitive ecosystems, and recreational facilities where exposures are probable. In an attempt to close this gap, the U.S. Geological Survey (USGS) has implemented the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study to collect regional sediment-quality data prior to and in response to future coastal storms. The standard operating procedure (SOP) detailed in this document serves as the sample-collection protocol for the SCoRR strategy by providing step-by-step instructions for site preparation, sample collection and processing, and shipping of soil and surficial sediment (for example, bed sediment, marsh sediment, or beach material). The objectives of the SCoRR strategy pilot study are (1) to create a baseline of soil-, sand-, marsh sediment-, and bed-sediment-quality data from sites located in the coastal counties from Maine to Virginia based on their potential risk of being contaminated in the event of a major coastal storm or flooding (defined as Resiliency mode); and (2) respond to major coastal storms and flooding by reoccupying select baseline sites and sampling within days of the event (defined as Response mode). For both modes, samples are collected in a consistent manner to minimize bias and maximize quality control by ensuring that all sampling personnel across the region collect, document, and process soil and sediment samples following the procedures outlined in this SOP. Samples are analyzed using four USGS-developed screening methods—inorganic geochemistry, organic geochemistry, pathogens, and biological assays—which are also outlined in this SOP. Because the SCoRR strategy employs a multi-metric approach for sample analyses, this

  4. Problems with sampling desert tortoises: A simulation analysis based on field data

    Science.gov (United States)

    Freilich, J.E.; Camp, R.J.; Duda, J.J.; Karl, A.E.

    2005-01-01

    The desert tortoise (Gopherus agassizii) was listed as a U.S. threatened species in 1990 based largely on population declines inferred from mark-recapture surveys of 2.59-km2 (1-mi2) plots. Since then, several census methods have been proposed and tested, but all methods still pose logistical or statistical difficulties. We conducted computer simulations using actual tortoise location data from two 1-mi2 plot surveys in southern California, USA, to identify strengths and weaknesses of current sampling strategies. We considered tortoise population estimates based on these plots as "truth" and then tested various sampling methods based on sampling smaller plots or transect lines passing through the mile squares. Data were analyzed using Schnabel's mark-recapture estimate and program CAPTURE. Experimental subsampling with replacement of the 1-mi2 data using 1-km2 and 0.25-km2 plot boundaries produced data sets of smaller plot sizes, which we compared to estimates from the 1-mi2 plots. We also tested distance sampling by saturating a 1-mi2 site with computer-simulated transect lines, once again evaluating bias in density estimates. Subsampling estimates from 1-km2 plots did not differ significantly from the estimates derived at 1-mi2. The 0.25-km2 subsamples significantly overestimated population sizes, chiefly because too few recaptures were made. Distance sampling simulations were biased 80% of the time and had high coefficient-of-variation-to-density ratios. Furthermore, a prospective power analysis suggested limited ability to detect population declines as high as 50%. We concluded that poor performance and bias of both sampling procedures was driven by insufficient sample size, suggesting that all efforts must be directed to increasing numbers found in order to produce reliable results. Our results suggest that present methods may not be capable of accurately estimating desert tortoise populations.
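Schnabel's multiple-census estimator, used above as the analysis benchmark, accumulates marked animals across capture occasions. A minimal sketch with hypothetical capture counts (the bias-adjusted form with +1 in the denominator; the field data themselves are not reproduced here):

```python
def schnabel(catches, recaptures):
    """Schnabel multiple-census estimate of closed population size.

    catches[t]    -- number of animals caught on occasion t
    recaptures[t] -- how many of those were already marked

    Marked animals accumulate across occasions; the estimate is
    sum(C_t * M_t) / (sum(R_t) + 1), where M_t is the number marked
    before occasion t.
    """
    marked = 0
    num = den = 0
    for c, r in zip(catches, recaptures):
        num += c * marked
        den += r
        marked += c - r          # newly marked animals join the marked pool
    return num / (den + 1)
```

The abstract's core finding follows directly from the denominator: when subplots are too small, sum(R_t) collapses toward zero and the estimate inflates, which is why the 0.25-km2 subsamples overestimated population size.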

  5. Activity-Based Information Integrating the operations strategy

    Directory of Open Access Journals (Sweden)

    José Augusto da Rocha de Araujo

    2005-12-01

    Full Text Available In the globalized world, companies seek new operations strategies to ensure world corporate success. This article analyzes how cost management models – both traditional and activity-based – aid the planning and management of globalized corporate operations. The efficacy of the models' application depends on their alignment with the competitive strategy. Companies must evaluate the nature of the competition and its competitive priorities; they should then define the necessary and sufficient level of dependence on cost information. In this article, three dependence levels are presented: operational, decision support and strategic control. The results of the research show the importance of alignment between the cost management model and the competitive strategy for corporate success, and confirm the adequacy of the activity-based costing model as a supporting tool for decision making in a global strategy. Case studies of world-class companies in Brazil are presented.

  6. Efficient community-based control strategies in adaptive networks

    International Nuclear Information System (INIS)

    Yang Hui; Tang Ming; Zhang Haifeng

    2012-01-01

    Most studies on adaptive networks concentrate on the properties of steady state, but neglect transient dynamics. In this study, we pay attention to the emergence of community structure in the transient process and the effects of community-based control strategies on epidemic spreading. First, by normalizing the modularity, we investigate the evolution of community structure during the transient process, and find that a strong community structure is induced by the rewiring mechanism in the early stage of epidemic dynamics, which, remarkably, delays the outbreak of disease. We then study the effects of control strategies started at different stages on the prevalence. Both immunization and quarantine strategies indicate that it is not ‘the earlier, the better’ for the implementation of control measures. And the optimal control effect is obtained if control measures can be efficiently implemented in the period of a strong community structure. For the immunization strategy, immunizing the susceptible nodes on susceptible–infected links and immunizing susceptible nodes randomly have similar control effects. However, for the quarantine strategy, quarantining the infected nodes on susceptible–infected links can yield a far better result than quarantining infected nodes randomly. More significantly, the community-based quarantine strategy performs better than the community-based immunization strategy. This study may shed new light on the forecast and the prevention of epidemics among humans. (paper)

  7. Analytical strategies for uranium determination in natural water and industrial effluents samples

    International Nuclear Information System (INIS)

    Santos, Juracir Silva

    2011-01-01

    The work was developed under the project 993/2007 - 'Development of analytical strategies for uranium determination in environmental and industrial samples - Environmental monitoring in the Caetite city, Bahia, Brazil' and made possible through a partnership established between Universidade Federal da Bahia and the Comissao Nacional de Energia Nuclear. Strategies were developed for uranium determination in natural water and effluents of a uranium mine. The first one was a critical evaluation of the determination of uranium by inductively coupled plasma optical emission spectrometry (ICP OES) performed using factorial and Doehlert designs involving the factors: acid concentration, radio frequency power and nebuliser gas flow rate. Five emission lines were simultaneously studied (namely: 367.007, 385.464, 385.957, 386.592 and 409.013 nm), in the presence of HNO3, H3CCOOH or HCl. The determinations in HNO3 medium were the most sensitive. Among the factors studied, the gas flow rate was the most significant for the five emission lines. Calcium caused interference in the emission intensity for some lines and iron did not interfere (at least up to 10 mg L-1) in the five lines studied. The presence of 13 other elements did not affect the emission intensity of uranium for the lines chosen. The optimized method, using the line at 385.957 nm, allows the determination of uranium with a limit of quantification of 30 μg L-1 and precision expressed as RSD lower than 2.2% for uranium concentrations of either 500 or 1000 μg L-1. In the second one, a highly sensitive flow-based procedure for uranium determination in natural waters is described. A 100-cm optical path flow cell based on a liquid-core waveguide (LCW) was exploited to increase sensitivity of the arsenazo III method, aiming to achieve the limits established by environmental regulations. The flow system was designed with solenoid micro-pumps in order to improve mixing and minimize reagent consumption, as well as

  8. Comparative effectiveness and acceptability of home-based and clinic-based sampling methods for sexually transmissible infections screening in females aged 14-50 years: a systematic review and meta-analysis.

    Science.gov (United States)

    Odesanmi, Tolulope Y; Wasti, Sharada P; Odesanmi, Omolola S; Adegbola, Omololu; Oguntuase, Olubukola O; Mahmood, Sajid

    2013-12-01

    Home-based sampling is a strategy to enhance uptake of sexually transmissible infection (STI) screening. This review aimed to compare the screening uptake levels of home-based self-sampling and clinic-based specimen collection for STIs (chlamydia (Chlamydia trachomatis), gonorrhoea (Neisseria gonorrhoeae) and trichomoniasis) in females aged 14-50 years. Acceptability and effect on specimen quality were determined. Sixteen electronic databases were searched from inception to September 2012. Randomised controlled trials (RCTs) comparing the uptake levels of home-based self-sampling and clinic-based sampling for chlamydia, gonorrhoea and trichomoniasis in females aged 14-50 years were eligible for inclusion. The risk of bias in the trials was assessed. Risk ratios (RRs) for dichotomous outcomes were meta-analysed. Of 3065 papers, six studies with seven RCTs contributed to the final review. Compared with clinic-based methods, home-based screening increased uptake significantly (P=0.001-0.05) in five trials and was substantiated in a meta-analysis (RR: 1.55; 95% confidence interval: 1.30-1.85; P=0.00001) of two trials. In three trials, a significant preference for home-based testing (P=0.001-0.05) was expressed. No significant difference was observed in specimen quality. Sampling was rated as easy by a significantly higher number of women (P=0.01) in the clinic group in one trial. The review provides evidence that home-based testing results in greater uptake of STI screening in females (14-50 years) than clinic-based testing without compromising quality in the developed world. Home collection strategies should be added to clinic-based screening programs to enhance uptake.
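The meta-analytic pooling reported above (RR 1.55, 95% CI 1.30-1.85) follows the standard inverse-variance method on the log risk-ratio scale. A generic fixed-effect sketch with hypothetical trial counts (not the review's data):

```python
import math

def pooled_risk_ratio(trials):
    """Fixed-effect (inverse-variance) pooled risk ratio with 95% CI.

    Each trial is a tuple (events_a, n_a, events_b, n_b); each trial's
    weight is the reciprocal variance of its log risk ratio.
    """
    num = den = 0.0
    for ea, na, eb, nb in trials:
        rr = (ea / na) / (eb / nb)
        var = 1 / ea - 1 / na + 1 / eb - 1 / nb   # var of log(RR)
        w = 1.0 / var
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se = math.sqrt(1.0 / den)
    return math.exp(log_rr), (math.exp(log_rr - 1.96 * se),
                              math.exp(log_rr + 1.96 * se))
```

A single hypothetical trial with 50/100 uptake at home versus 40/100 at the clinic gives RR = 1.25; pooling more trials narrows the interval around the weighted average of the log risk ratios.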

  9. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards.

    Science.gov (United States)

    Bornstein, Marc H; Jager, Justin; Putnick, Diane L

    2013-12-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study's target population, whether they yield representative and generalizable estimates of subsamples within a study's target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce "noise" related to variation in subsamples and whether that "noise" can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting.

  10. Vanguard/rearguard strategy for the evaluation of the degradation of yoghurt samples based on the direct analysis of the volatiles profile through headspace-gas chromatography-mass spectrometry.

    Science.gov (United States)

    Carrillo-Carrión, C; Cárdenas, S; Valcárcel, M

    2007-02-02

    A vanguard/rearguard analytical strategy for the monitoring of the degradation of yoghurt samples is proposed. The method is based on the headspace-gas chromatography-mass spectrometry (HS-GC-MS) instrumental coupling. In this combination, the chromatographic column is firstly used as an interface between the HS and the MS (vanguard mode), avoiding separation of the volatile components by maintaining the chromatographic oven at a high, constant temperature. By changing the thermal conditions of the oven, the aldehydes can be properly separated for individual identification/quantification (rearguard mode). In the vanguard method, the quantification of the volatile aldehydes was performed through partial least squares regression and given as a total index. The rearguard method permits the detection of the aldehydes at concentrations between 12 and 35 ng/g. Both methods were applied to the study of the environmental factors favouring the presence of the volatile aldehydes (C5-C9) in the yoghurt samples. Principal component analysis of the total concentration of aldehydes with time (from 0 to 30 days) demonstrates the capability of the HS-MS coupling for the estimation of the quality losses of the samples. The results were corroborated by the HS-GC-MS, which also indicates that pentanal was present in the yoghurt from the beginning of the study and that the combination of light/oxygen was the most negative influence for sample conservation.

  11. Radial line-scans as representative sampling strategy in dried-droplet laser ablation of liquid samples deposited on pre-cut filter paper disks.

    Science.gov (United States)

    Nischkauer, Winfried; Vanhaecke, Frank; Bernacchi, Sébastien; Herwig, Christoph; Limbeck, Andreas

    2014-11-01

    Nebulising liquid samples and using the aerosol thus obtained for further analysis is the standard method in many current analytical techniques, also with inductively coupled plasma (ICP)-based devices. With such a set-up, quantification via external calibration is usually straightforward for samples with aqueous or close-to-aqueous matrix composition. However, there is a variety of more complex samples. Such samples can be found in medical, biological, technological and industrial contexts and can range from body fluids, like blood or urine, to fuel additives or fermentation broths. Specialized nebulizer systems or careful digestion and dilution are required to tackle such demanding sample matrices. One alternative approach is to convert the liquid into a dried solid and to use laser ablation for sample introduction. Up to now, this approach required the application of internal standards or matrix-adjusted calibration due to matrix effects. In this contribution, we show a way to circumvent these matrix effects while using simple external calibration for quantification. The principle of representative sampling that we propose uses radial line-scans across the dried residue. This compensates for centro-symmetric inhomogeneities typically observed in dried spots. The effectiveness of the proposed sampling strategy is exemplified via the determination of phosphorus in biochemical fermentation media. However, the universal viability of the presented measurement protocol is postulated. Detection limits using laser ablation-ICP-optical emission spectrometry were in the order of 40 μg mL-1 with a reproducibility of 10% relative standard deviation (n = 4, concentration = 10 times the quantification limit). The reported sensitivity is fit-for-purpose in the biochemical context described here, but could be improved using ICP-mass spectrometry, if future analytical tasks would require it. Trueness of the proposed method was investigated by cross-validation with

  12. Radial line-scans as representative sampling strategy in dried-droplet laser ablation of liquid samples deposited on pre-cut filter paper disks

    Science.gov (United States)

    Nischkauer, Winfried; Vanhaecke, Frank; Bernacchi, Sébastien; Herwig, Christoph; Limbeck, Andreas

    2014-11-01

    Nebulising liquid samples and using the aerosol thus obtained for further analysis is the standard method in many current analytical techniques, also with inductively coupled plasma (ICP)-based devices. With such a set-up, quantification via external calibration is usually straightforward for samples with aqueous or close-to-aqueous matrix composition. However, there is a variety of more complex samples. Such samples can be found in medical, biological, technological and industrial contexts and can range from body fluids, like blood or urine, to fuel additives or fermentation broths. Specialized nebulizer systems or careful digestion and dilution are required to tackle such demanding sample matrices. One alternative approach is to convert the liquid into a dried solid and to use laser ablation for sample introduction. Up to now, this approach required the application of internal standards or matrix-adjusted calibration due to matrix effects. In this contribution, we show a way to circumvent these matrix effects while using simple external calibration for quantification. The principle of representative sampling that we propose uses radial line-scans across the dried residue. This compensates for centro-symmetric inhomogeneities typically observed in dried spots. The effectiveness of the proposed sampling strategy is exemplified via the determination of phosphorus in biochemical fermentation media. However, the universal viability of the presented measurement protocol is postulated. Detection limits using laser ablation-ICP-optical emission spectrometry were in the order of 40 μg mL⁻¹ with a reproducibility of 10 % relative standard deviation (n = 4, concentration = 10 times the quantification limit). The reported sensitivity is fit-for-purpose in the biochemical context described here, but could be improved using ICP-mass spectrometry, if future analytical tasks would require it. Trueness of the proposed method was investigated by cross-validation with

  13. Instruction of Research-Based Comprehension Strategies in Basal Reading Programs

    Science.gov (United States)

    Pilonieta, Paola

    2010-01-01

    Research supports using research-based comprehension strategies; however, comprehension strategy instruction is not highly visible in basal reading programs or classroom instruction, resulting in many students who struggle with comprehension. A content analysis examined which research-based comprehension strategies were presented in five…

  14. Manufacturing strategies for time based competitive advantages

    OpenAIRE

    Lin, Yong; Ma, Shihua; Zhou, Li

    2012-01-01

    Purpose – The main purpose of this paper is to investigate the current manufacturing strategies and practices of bus manufacturers in China, and to propose a framework of manufacturing strategies for time-based competitive advantages. Design/methodology/approach – The conceptual research framework is devised from a review of the literature, and case studies are used to investigate the manufacturing strategies and practices in place in the case companies. Data is collected through semi-stru...

  15. Radial line-scans as representative sampling strategy in dried-droplet laser ablation of liquid samples deposited on pre-cut filter paper disks

    Energy Technology Data Exchange (ETDEWEB)

    Nischkauer, Winfried [Institute of Chemical Technologies and Analytics, Vienna University of Technology, Vienna (Austria); Department of Analytical Chemistry, Ghent University, Ghent (Belgium); Vanhaecke, Frank [Department of Analytical Chemistry, Ghent University, Ghent (Belgium); Bernacchi, Sébastien; Herwig, Christoph [Institute of Chemical Engineering, Vienna University of Technology, Vienna (Austria); Limbeck, Andreas, E-mail: Andreas.Limbeck@tuwien.ac.at [Institute of Chemical Technologies and Analytics, Vienna University of Technology, Vienna (Austria)

    2014-11-01

    Nebulising liquid samples and using the aerosol thus obtained for further analysis is the standard method in many current analytical techniques, also with inductively coupled plasma (ICP)-based devices. With such a set-up, quantification via external calibration is usually straightforward for samples with aqueous or close-to-aqueous matrix composition. However, there is a variety of more complex samples. Such samples can be found in medical, biological, technological and industrial contexts and can range from body fluids, like blood or urine, to fuel additives or fermentation broths. Specialized nebulizer systems or careful digestion and dilution are required to tackle such demanding sample matrices. One alternative approach is to convert the liquid into a dried solid and to use laser ablation for sample introduction. Up to now, this approach required the application of internal standards or matrix-adjusted calibration due to matrix effects. In this contribution, we show a way to circumvent these matrix effects while using simple external calibration for quantification. The principle of representative sampling that we propose uses radial line-scans across the dried residue. This compensates for centro-symmetric inhomogeneities typically observed in dried spots. The effectiveness of the proposed sampling strategy is exemplified via the determination of phosphorus in biochemical fermentation media. However, the universal viability of the presented measurement protocol is postulated. Detection limits using laser ablation-ICP-optical emission spectrometry were in the order of 40 μg mL⁻¹ with a reproducibility of 10 % relative standard deviation (n = 4, concentration = 10 times the quantification limit). The reported sensitivity is fit-for-purpose in the biochemical context described here, but could be improved using ICP-mass spectrometry, if future analytical tasks would require it. Trueness of the proposed method was investigated by cross-validation with
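
    The compensation argument in the record above lends itself to a quick numerical sketch: if the analyte density in the dried spot depends only on the distance from the centre (the centro-symmetric "coffee-ring" case), a scan along any radial line samples every radius exactly once and therefore returns the same average regardless of direction, whereas a single ablation crater does not. The profile function below is purely illustrative, not measured data.

```python
import math

def deposit(r):
    # Hypothetical centro-symmetric profile: analyte density depends
    # only on the distance r (0 = centre, 1 = rim) and is strongly
    # enriched near the rim, as in a "coffee-ring" residue.
    return 1.0 + 4.0 * math.exp(-((r - 0.9) ** 2) / 0.01)

def radial_line_mean(n=200):
    # Average the signal along one radial line from centre to rim.
    # Every radius is sampled once, so the result is independent of
    # the line's direction -- the scan is representative.
    rs = [(i + 0.5) / n for i in range(n)]
    return sum(deposit(r) for r in rs) / n

# A single crater near the centre (r = 0.2) sees almost none of the
# rim-enriched analyte, while the radial line-scan averages over it.
single_crater = deposit(0.2)
line_scan = radial_line_mean()
```

Under these assumptions, the line-scan mean is a fixed property of the residue, which is why a matrix-independent external calibration becomes plausible.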

  16. Optimization of Region-of-Interest Sampling Strategies for Hepatic MRI Proton Density Fat Fraction Quantification

    Science.gov (United States)

    Hong, Cheng William; Wolfson, Tanya; Sy, Ethan Z.; Schlein, Alexandra N.; Hooker, Jonathan C.; Dehkordy, Soudabeh Fazeli; Hamilton, Gavin; Reeder, Scott B.; Loomba, Rohit; Sirlin, Claude B.

    2017-01-01

    BACKGROUND Clinical trials utilizing proton density fat fraction (PDFF) as an imaging biomarker for hepatic steatosis have used a laborious region-of-interest (ROI) sampling strategy of placing an ROI in each hepatic segment. PURPOSE To identify a strategy with the fewest ROIs that consistently achieves close agreement with the nine-ROI strategy. STUDY TYPE Retrospective secondary analysis of prospectively acquired clinical research data. POPULATION A total of 391 adults (173 men, 218 women) with known or suspected NAFLD. FIELD STRENGTH/SEQUENCE Confounder-corrected chemical-shift-encoded 3T MRI using a 2D multiecho gradient-recalled echo technique. ASSESSMENT An ROI was placed in each hepatic segment. Mean nine-ROI PDFF and segmental PDFF standard deviation were computed. Segmental and lobar PDFF were compared. PDFF was estimated using every combinatorial subset of ROIs and compared to the nine-ROI average. STATISTICAL TESTING Mean nine-ROI PDFF and segmental PDFF standard deviation were summarized descriptively. Segmental PDFF was compared using a one-way analysis of variance, and lobar PDFF was compared using a paired t-test and a Bland–Altman analysis. The PDFF estimated by every subset of ROIs was informally compared to the nine-ROI average using median intraclass correlation coefficients (ICCs) and Bland–Altman analyses. RESULTS The study population’s mean whole-liver PDFF was 10.1±8.9% (range: 1.1–44.1%). Although there was no significant difference in average segmental (P=0.452) or lobar (P=0.154) PDFF, left and right lobe PDFF differed by at least 1.5 percentage points in 25.1% (98/391) of patients. Any strategy with ≥ 4 ROIs had ICC >0.995. 115 of 126 four-ROI strategies (91%) had limits of agreement (LOA) 0.995, and 2/36 (6%) of two-ROI strategies and 46/84 (55%) of three-ROI strategies had LOA <1.5%. DATA CONCLUSION Four-ROI sampling strategies with two ROIs in the left and right lobes achieve close agreement with nine-ROI PDFF. Level of

  17. Optimization of region-of-interest sampling strategies for hepatic MRI proton density fat fraction quantification.

    Science.gov (United States)

    Hong, Cheng William; Wolfson, Tanya; Sy, Ethan Z; Schlein, Alexandra N; Hooker, Jonathan C; Fazeli Dehkordy, Soudabeh; Hamilton, Gavin; Reeder, Scott B; Loomba, Rohit; Sirlin, Claude B

    2018-04-01

    Clinical trials utilizing proton density fat fraction (PDFF) as an imaging biomarker for hepatic steatosis have used a laborious region-of-interest (ROI) sampling strategy of placing an ROI in each hepatic segment. To identify a strategy with the fewest ROIs that consistently achieves close agreement with the nine-ROI strategy. Retrospective secondary analysis of prospectively acquired clinical research data. A total of 391 adults (173 men, 218 women) with known or suspected NAFLD. Confounder-corrected chemical-shift-encoded 3T MRI using a 2D multiecho gradient-recalled echo technique. An ROI was placed in each hepatic segment. Mean nine-ROI PDFF and segmental PDFF standard deviation were computed. Segmental and lobar PDFF were compared. PDFF was estimated using every combinatorial subset of ROIs and compared to the nine-ROI average. Mean nine-ROI PDFF and segmental PDFF standard deviation were summarized descriptively. Segmental PDFF was compared using a one-way analysis of variance, and lobar PDFF was compared using a paired t-test and a Bland-Altman analysis. The PDFF estimated by every subset of ROIs was informally compared to the nine-ROI average using median intraclass correlation coefficients (ICCs) and Bland-Altman analyses. The study population's mean whole-liver PDFF was 10.1 ± 8.9% (range: 1.1-44.1%). Although there was no significant difference in average segmental (P = 0.452) or lobar (P = 0.154) PDFF, left and right lobe PDFF differed by at least 1.5 percentage points in 25.1% (98/391) of patients. Any strategy with ≥4 ROIs had ICC >0.995. 115 of 126 four-ROI strategies (91%) had limits of agreement (LOA) 0.995, and 2/36 (6%) of two-ROI strategies and 46/84 (55%) of three-ROI strategies had LOA <1.5%. Four-ROI sampling strategies with two ROIs in the left and right lobes achieve close agreement with nine-ROI PDFF. 3 Technical Efficacy: Stage 2 J. Magn. Reson. Imaging 2018;47:988-994. © 2017 International Society for Magnetic Resonance
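
    The exhaustive combinatorial comparison described in the two records above can be sketched in a few lines: enumerate every k-ROI subset, average it, and record the worst absolute deviation from the nine-ROI mean. The segmental PDFF values here are synthetic stand-ins, not patient data.

```python
from itertools import combinations
from statistics import mean

# Hypothetical per-segment PDFF values (%) for one subject; real input
# would be the nine hepatic-segment ROI measurements.
segments = [8.1, 9.4, 10.2, 11.0, 9.8, 10.5, 12.3, 7.9, 9.1]
reference = mean(segments)  # the nine-ROI average

def worst_error(k):
    # Largest |subset mean - nine-ROI mean| over all k-ROI subsets,
    # mirroring the study's exhaustive subset comparison.
    return max(abs(mean(c) - reference) for c in combinations(segments, k))

# The worst-case error shrinks as more ROIs are averaged.
errors = {k: worst_error(k) for k in range(1, 9)}
```

Over many patients one would summarise these per-subset errors with ICCs and Bland–Altman limits, as the study does; the monotone shrinkage with k is what makes a small, well-placed subset viable.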

  18. A typology for campus-based alcohol prevention: moving toward environmental management strategies.

    Science.gov (United States)

    DeJong, William; Langford, Linda M

    2002-03-01

    This article outlines a typology of programs and policies for preventing and treating campus-based alcohol-related problems, reviews recent case studies showing the promise of campus-based environmental management strategies and reports findings from a national survey of U.S. colleges and universities about available resources for pursuing environmentally focused prevention. The typology is grounded in a social ecological framework, which recognizes that health-related behaviors are affected through multiple levels of influence: intrapersonal (individual) factors, interpersonal (group) processes, institutional factors, community factors and public policy. The survey on prevention resources and activities was mailed to senior administrators responsible for their school's institutional response to substance use problems. The study sample was an equal probability sample of 365 2- and 4-year U.S. campuses. The response rate was 76.9%. Recent case studies suggest the value of environmentally focused alcohol prevention approaches on campus, but more rigorous research is needed to establish their effectiveness. The administrators' survey showed that most U.S. colleges have not yet installed the basic infrastructure required for developing, implementing and evaluating environmental management strategies. The typology of campus-based prevention options can be used to categorize current efforts and to inform strategic planning of multilevel interventions. Additional colleges and universities should establish a permanent campus task force that reports directly to the president, participate actively in a campus-community coalition that seeks to change the availability of alcohol in the local community and join a state-level association that speaks out on state and federal policy issues.

  19. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is by setting up a system of linear equations for the fractional influence of different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. This method is also compared to a numerical integration solution for a two-source situation where source variability also is included. A general observation from this examination is that the variability of the source profiles not only affects the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
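
    The bias the record describes (source variability shifting not only the precision but also the mean/median of the estimated fractions) can be illustrated with a minimal two-source, one-marker random-sampling sketch. The end-member means and standard deviations below are invented for illustration.

```python
import random
from statistics import median

random.seed(0)

# Two-source mixing for a single marker:
#   m = f*s1 + (1 - f)*s2   =>   f = (m - s2) / (s1 - s2)
# Instead of fixed end-members, draw s1 and s2 from distributions and
# propagate their variability by random sampling (all values invented).
m = -26.0                                   # measured mixture value

def draw_f():
    s1 = random.gauss(-28.0, 1.0)           # source 1 end-member
    s2 = random.gauss(-22.0, 1.0)           # source 2 end-member
    return (m - s2) / (s1 - s2)

fs = sorted(draw_f() for _ in range(100_000))
point = (m - (-22.0)) / (-28.0 - (-22.0))   # naive fixed-endmember estimate
lo, mid, hi = fs[2500], median(fs), fs[97500]  # 95% interval and median
```

Because f is a nonlinear function of the end-members, the sampled median need not coincide exactly with the naive point estimate, which is the statistical bias the random-sampling strategy accounts for.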

  20. Fine mapping quantitative trait loci under selective phenotyping strategies based on linkage and linkage disequilibrium criteria

    DEFF Research Database (Denmark)

    Ansari-Mahyari, S; Berg, P; Lund, M S

    2009-01-01

    disequilibrium-based sampling criteria (LDC) for selecting individuals to phenotype are compared to random phenotyping in a quantitative trait loci (QTL) verification experiment using stochastic simulation. Several strategies based on LAC and LDC for selecting the most informative 30%, 40% or 50% of individuals...... for phenotyping to extract maximum power and precision in a QTL fine mapping experiment were developed and assessed. Linkage analyses for the mapping was performed for individuals sampled on LAC within families and combined linkage disequilibrium and linkage analyses was performed for individuals sampled across...... the whole population based on LDC. The results showed that selecting individuals with similar haplotypes to the paternal haplotypes (minimum recombination criterion) using LAC compared to random phenotyping gave at least the same power to detect a QTL but decreased the accuracy of the QTL position. However...

  1. Analytical sample preparation strategies for the determination of antimalarial drugs in human whole blood, plasma and urine

    DEFF Research Database (Denmark)

    Casas, Monica Escolà; Hansen, Martin; Krogh, Kristine A

    2014-01-01

    the available sample preparation strategies combined with liquid chromatographic (LC) analysis to determine antimalarials in whole blood, plasma and urine published over the last decade. Sample preparation can be done by protein precipitation, solid-phase extraction, liquid-liquid extraction or dilution. After...

  2. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  3. Evaluation of the Frequency for Gas Sampling for the High Burnup Confirmatory Data Project

    Energy Technology Data Exchange (ETDEWEB)

    Stockman, Christine T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Alsaed, Halim A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bryan, Charles R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marschman, Steven C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Scaglione, John M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-05-01

    This report provides a technically based gas sampling frequency strategy for the High Burnup (HBU) Confirmatory Data Project. The evaluation of (1) the types and magnitudes of gases that could be present in the project cask and (2) the degradation mechanisms that could change gas compositions culminates in an adaptive gas sampling frequency strategy. This adaptive strategy is compared against the sampling frequency that has been developed based on operational considerations.

  4. New sampling strategy using a Bayesian approach to assess iohexol clearance in kidney transplant recipients.

    Science.gov (United States)

    Benz-de Bretagne, I; Le Guellec, C; Halimi, J M; Gatault, P; Barbet, C; Alnajjar, A; Büchler, M; Lebranchu, Y; Andres, Christian Robert; Vourcʼh, P; Blasco, H

    2012-06-01

    Glomerular filtration rate (GFR) measurement is a major issue in kidney transplant recipients for clinicians. GFR can be determined by estimating the plasma clearance of iohexol, a nonradiolabeled compound. For practical and convenient application for patients and caregivers, it is important that a minimal number of samples are drawn. The aim of this study was to develop and validate a Bayesian model with fewer samples for reliable prediction of GFR in kidney transplant recipients. Iohexol plasma concentration-time curves from 95 patients were divided into an index (n = 63) and a validation set (n = 32). Samples (n = 4-6 per patient) were obtained during the elimination phase, that is, between 120 and 270 minutes. Individual reference values of iohexol clearance (CL(iohexol)) were calculated from k (elimination slope) and V (volume of distribution from intercept). Individual CL(iohexol) values were then introduced into the Bröchner-Mortensen equation to obtain the GFR (reference value). A population pharmacokinetic model was developed from the index set and validated using standard methods. For the validation set, we tested various combinations of 1, 2, or 3 sampling time to estimate CL(iohexol). According to the different combinations tested, a maximum a posteriori Bayesian estimation of CL(iohexol) was obtained from population parameters. Individual estimates of GFR were compared with individual reference values through analysis of bias and precision. A capability analysis allowed us to determine the best sampling strategy for Bayesian estimation. A 1-compartment model best described our data. Covariate analysis showed that uremia, serum creatinine, and age were significantly associated with k(e), and weight with V. The strategy, including samples drawn at 120 and 270 minutes, allowed accurate prediction of GFR (mean bias: -3.71%, mean imprecision: 7.77%). With this strategy, about 20% of individual predictions were outside the bounds of acceptance set at ± 10
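
    A stripped-down version of the limited-sampling idea can be sketched as a maximum a posteriori fit of a one-compartment model to the two retained samples (120 and 270 minutes), with Gaussian population priors penalising implausible parameter values. Every numerical value below is hypothetical, not the study's population model, and the crude grid search stands in for a proper optimiser.

```python
import math

# Minimal MAP sketch: one-compartment model C(t) = (D/V) * exp(-k*t),
# fit to two samples plus population priors on k and V.
D = 3235.0                               # dose (mg), hypothetical
times = [120.0, 270.0]                   # sampling times (min)
obs = [180.0, 95.0]                      # concentrations (mg/L), synthetic

prior = {"k": (0.005, 0.002), "V": (12.0, 4.0)}   # (mean, sd), hypothetical
sigma = 5.0                              # residual sd (mg/L), hypothetical

def neg_log_post(k, V):
    # Negative log posterior = data misfit + prior penalty (constants dropped).
    fit = sum((c - (D / V) * math.exp(-k * t)) ** 2 / (2 * sigma ** 2)
              for t, c in zip(times, obs))
    pen = sum((x - mu) ** 2 / (2 * sd ** 2)
              for x, (mu, sd) in ((k, prior["k"]), (V, prior["V"])))
    return fit + pen

# Crude grid search standing in for a real optimiser.
k_map, V_map = min(((k, V)
                    for k in (i * 1e-4 for i in range(20, 120))
                    for V in (10.0 + 0.05 * j for j in range(200))),
                   key=lambda kv: neg_log_post(*kv))
cl_map = k_map * V_map                   # clearance CL = k * V (L/min)
```

The clearance estimate would then feed into the Bröchner-Mortensen correction to yield GFR; the point of the sketch is that informative priors let two samples pin down k and V.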

  5. Enhancing adaptive sparse grid approximations and improving refinement strategies using adjoint-based a posteriori error estimates

    Science.gov (United States)

    Jakeman, J. D.; Wildey, T.

    2015-01-01

    In this paper we present an algorithm for adaptive sparse grid approximations of quantities of interest computed from discretized partial differential equations. We use adjoint-based a posteriori error estimates of the physical discretization error and the interpolation error in the sparse grid to enhance the sparse grid approximation and to drive adaptivity of the sparse grid. Utilizing these error estimates provides significantly more accurate functional values for random samples of the sparse grid approximation. We also demonstrate that alternative refinement strategies based upon a posteriori error estimates can lead to further increases in accuracy in the approximation over traditional hierarchical surplus based strategies. Throughout this paper we also provide and test a framework for balancing the physical discretization error with the stochastic interpolation error of the enhanced sparse grid approximation.

  6. Enhancing adaptive sparse grid approximations and improving refinement strategies using adjoint-based a posteriori error estimates

    International Nuclear Information System (INIS)

    Jakeman, J.D.; Wildey, T.

    2015-01-01

    In this paper we present an algorithm for adaptive sparse grid approximations of quantities of interest computed from discretized partial differential equations. We use adjoint-based a posteriori error estimates of the physical discretization error and the interpolation error in the sparse grid to enhance the sparse grid approximation and to drive adaptivity of the sparse grid. Utilizing these error estimates provides significantly more accurate functional values for random samples of the sparse grid approximation. We also demonstrate that alternative refinement strategies based upon a posteriori error estimates can lead to further increases in accuracy in the approximation over traditional hierarchical surplus based strategies. Throughout this paper we also provide and test a framework for balancing the physical discretization error with the stochastic interpolation error of the enhanced sparse grid approximation

  7. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards

    Science.gov (United States)

    Bornstein, Marc H.; Jager, Justin; Putnick, Diane L.

    2014-01-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study’s target population, whether they yield representative and generalizable estimates of subsamples within a study’s target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce “noise” related to variation in subsamples and whether that “noise” can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting. PMID:25580049

  8. A simulation approach to assessing sampling strategies for insect pests: an example with the balsam gall midge.

    Directory of Open Access Journals (Sweden)

    R Drew Carleton

    Full Text Available Estimation of pest density is a basic requirement for integrated pest management in agriculture and forestry, and efficiency in density estimation is a common goal. Sequential sampling techniques promise efficient sampling, but their application can involve cumbersome mathematics and/or intensive warm-up sampling when pests have complex within- or between-site distributions. We provide tools for assessing the efficiency of sequential sampling and of alternative, simpler sampling plans, using computer simulation with "pre-sampling" data. We illustrate our approach using data for balsam gall midge (Paradiplosis tumifex attack in Christmas tree farms. Paradiplosis tumifex proved recalcitrant to sequential sampling techniques. Midge distributions could not be fit by a common negative binomial distribution across sites. Local parameterization, using warm-up samples to estimate the clumping parameter k for each site, performed poorly: k estimates were unreliable even for samples of n ∼ 100 trees. These methods were further confounded by significant within-site spatial autocorrelation. Much simpler sampling schemes, involving random or belt-transect sampling to preset sample sizes, were effective and efficient for P. tumifex. Sampling via belt transects (through the longest dimension of a stand was the most efficient, with sample means converging on true mean density for sample sizes of n ∼ 25-40 trees. Pre-sampling and simulation techniques provide a simple method for assessing sampling strategies for estimating insect infestation. We suspect that many pests will resemble P. tumifex in challenging the assumptions of sequential sampling methods. Our software will allow practitioners to optimize sampling strategies before they are brought to real-world applications, while potentially avoiding the need for the cumbersome calculations required for sequential sampling methods.

  9. A Region-Based Strategy for Collaborative Roadmap Construction

    KAUST Repository

    Denny, Jory; Sandström, Read; Julian, Nicole; Amato, Nancy M.

    2015-01-01

    © Springer International Publishing Switzerland 2015. Motion planning has seen much attention over the past two decades. A great deal of progress has been made in sampling-based planning, whereby a planner builds an approximate representation of the planning space. While these planners have demonstrated success in many scenarios, there are still difficult problems where they lack robustness or efficiency, e.g., certain types of narrow spaces. Conversely, human intuition can often determine an approximate solution to these problems quite effectively, but humans lack the speed and precision necessary to perform the corresponding low-level tasks (such as collision checking) in a timely manner. In this work, we introduce a novel strategy called Region Steering in which the user and a PRM planner work cooperatively to map the space while maintaining the probabilistic completeness property of the PRM planner. Region Steering utilizes two-way communication to integrate the strengths of both the user and the planner, thereby overcoming the weaknesses inherent to relying on either one alone. In one communication direction, a user can input regions, or bounding volumes in the workspace, to bias sampling towards or away from these areas. In the other direction, the planner displays its progress to the user and colors the regions based on their perceived usefulness. We demonstrate that Region Steering provides roadmap customizability, reduced mapping time, and smaller roadmap sizes compared with fully automated PRMs, e.g., Gaussian PRM.
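
    The "attract region" half of the two-way communication can be sketched as biased sampling: with some probability, the next roadmap sample is drawn from a user-supplied bounding box instead of the whole workspace. Mixing in uniform workspace samples preserves global coverage (and hence the probabilistic-completeness argument). The region and probability below are arbitrary illustrations.

```python
import random

random.seed(2)

# With probability p_region, sample inside the user-drawn box;
# otherwise sample the whole workspace uniformly.
WORKSPACE = ((0.0, 10.0), (0.0, 10.0))
ATTRACT = ((4.0, 5.0), (4.0, 5.0))       # hypothetical narrow-passage region

def sample(p_region=0.7):
    box = ATTRACT if random.random() < p_region else WORKSPACE
    return tuple(random.uniform(lo, hi) for lo, hi in box)

pts = [sample() for _ in range(10_000)]
in_region = sum(4.0 <= x <= 5.0 and 4.0 <= y <= 5.0 for x, y in pts)
```

In a full PRM, each sampled configuration would still be collision-checked and connected to neighbours; only the sampling distribution changes.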

  10. Investigating the Effectiveness of Teaching Methods Based on a Four-Step Constructivist Strategy

    Science.gov (United States)

    Çalik, Muammer; Ayas, Alipaşa; Coll, Richard K.

    2010-02-01

    This paper reports on an investigation of the effectiveness of an intervention using several different methods for teaching solution chemistry. The teaching strategy comprised a four-step approach derived from a constructivist view of learning. A sample consisting of 44 students (18 boys and 26 girls) was selected purposively from two different Grade 9 classes in the city of Trabzon, Turkey. Data collection employed a purpose-designed ‘solution chemistry concept test’, consisting of 17 items, with the quantitative data from the survey supported by qualitative interview data. The findings suggest that using different methods embedded within the four-step constructivist-based teaching strategy enables students to refute some alternative conceptions, but does not completely eliminate student alternative conceptions for solution chemistry.

  11. Hybrid Multiple Soft-Sensor Models of Grinding Granularity Based on Cuckoo Searching Algorithm and Hysteresis Switching Strategy

    Directory of Open Access Journals (Sweden)

    Jie-Sheng Wang

    2015-01-01

    Full Text Available According to the characteristics of the grinding process and the accuracy requirements of technical indicators, a hybrid multiple soft-sensor modeling method for grinding granularity is proposed based on the cuckoo searching (CS) algorithm and hysteresis switching (HS) strategy. Firstly, a mechanism soft-sensor model of grinding granularity is deduced based on the technique characteristics and a large body of experimental data from the grinding process. Meanwhile, the BP neural network soft-sensor model and wavelet neural network (WNN) soft-sensor model are set up. Then, the hybrid multiple soft-sensor model based on the hysteresis switching strategy is realized. That is to say, the optimum model is selected as the current predictive model according to the switching performance index at each sampling instant. Finally, the cuckoo searching algorithm is adopted to optimize the performance parameters of the hysteresis switching strategy. Simulation results show that the proposed model has better generalization results and prediction precision, which can satisfy the real-time control requirements of the grinding classification process.
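
    The hysteresis switching idea (adopt the rival model only when its recent-error index beats the active model's by a margin) can be sketched independently of the neural-network details. The error sequences, margin, and window below are invented for illustration.

```python
# Hysteresis switching between two candidate soft-sensor models: switch
# to the rival model only when its recent-error index beats the active
# model's by margin h, which suppresses chattering between models.
def hysteresis_select(errors_a, errors_b, h=0.5, window=3):
    active, chosen = "A", []
    for t in range(len(errors_a)):
        start = max(0, t - window + 1)
        ja = sum(errors_a[start:t + 1])   # switching performance index, model A
        jb = sum(errors_b[start:t + 1])   # switching performance index, model B
        if active == "A" and jb + h < ja:
            active = "B"
        elif active == "B" and ja + h < jb:
            active = "A"
        chosen.append(active)
    return chosen

# Invented error traces: model B becomes clearly better halfway through,
# and the margin h delays the switch until the evidence is decisive.
ea = [0.1, 0.1, 0.1, 0.9, 1.0, 1.1, 1.2]
eb = [0.8, 0.9, 0.8, 0.2, 0.1, 0.1, 0.1]
chosen = hysteresis_select(ea, eb)
```

In the paper's scheme the CS algorithm would tune h and the window; here they are fixed by hand.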

  12. Implementation of Technology-based Patient Engagement Strategies within Practice-based Research Networks.

    Science.gov (United States)

    Careyva, Beth; Shaak, Kyle; Mills, Geoffrey; Johnson, Melanie; Goodrich, Samantha; Stello, Brian; Wallace, Lorraine S

    2016-01-01

    Technology-based patient engagement strategies (such as patient portals) are increasingly available, yet little is known about current use and barriers within practice-based research networks (PBRNs). PBRN directors have unique opportunities to inform the implementation of patient-facing technology and to translate these findings into practice. PBRN directors were queried regarding technology-based patient engagement strategies as part of the 2015 CAFM Educational Research Alliance (CERA) survey of PBRN directors. A total of 102 PBRN directors were identified via the Agency for Healthcare Research and Quality's registry; 54 of 96 eligible PBRN directors completed the survey, for a response rate of 56%. Use of technology-based patient engagement strategies within PBRNs was limited, with less than half of respondents reporting experience with the most frequently named tools (risk assessments/decision aids). Information technology (IT) support was the top barrier, followed by low rates of portal enrollment. For engaging participant practices, workload and practice leadership were cited as most important, with fewer respondents noting concerns about patient privacy. Given limited use of patient-facing technologies, PBRNs have an opportunity to clarify the optimal use of these strategies. Providing IT support and addressing clinician concerns regarding workload may facilitate the inclusion of innovative technologies in PBRNs. © Copyright 2016 by the American Board of Family Medicine.

  13. How to handle speciose clades? Mass taxon-sampling as a strategy towards illuminating the natural history of Campanula (Campanuloideae).

    Directory of Open Access Journals (Sweden)

    Guilhem Mansion

    Full Text Available BACKGROUND: Speciose clades usually harbor species with a broad spectrum of adaptive strategies and complex distribution patterns, and thus constitute ideal systems to disentangle biotic and abiotic causes underlying species diversification. The delimitation of such study systems to test evolutionary hypotheses is difficult because they often rely on artificial genus concepts as starting points. One of the most prominent examples is the bellflower genus Campanula, with some 420 species, but up to 600 species when including all lineages to which Campanula is paraphyletic. We generated a large alignment of petD group II intron sequences to include more than 70% of described species as a reference. By comparison with partial data sets we could then assess the impact of selective taxon-sampling strategies on phylogenetic reconstruction and subsequent evolutionary conclusions. METHODOLOGY/PRINCIPAL FINDINGS: Phylogenetic analyses based on maximum parsimony (PAUP, PRAP), Bayesian inference (MrBayes), and maximum likelihood (RAxML) were first carried out on the large reference data set (D680). Parameters including tree topology, branch support, and age estimates were then compared to those obtained from smaller data sets resulting from "classification-guided" (D088) and "phylogeny-guided" (D101) sampling. Analyses of D088 failed to fully recover the phylogenetic diversity in Campanula, whereas D101 inferred significantly different branch support and age estimates. CONCLUSIONS/SIGNIFICANCE: A short genomic region with high phylogenetic utility allowed us to easily generate a comprehensive phylogenetic framework for the speciose Campanula clade. Our approach recovered 17 well-supported and circumscribed sub-lineages. Knowing these will be instrumental for developing more specific evolutionary hypotheses and guiding future research; we highlight the predictive value of a mass taxon-sampling strategy as a first essential step towards illuminating the detailed

  14. Sampling strategy for estimating human exposure pathways to consumer chemicals

    Directory of Open Access Journals (Sweden)

    Eleni Papadopoulou

    2016-03-01

    Full Text Available Human exposure to consumer chemicals has become a worldwide concern. In this work, a comprehensive sampling strategy is presented, to our knowledge the first to study all relevant exposure pathways in a single cohort using multiple methods for assessment of exposure from each pathway. The selected groups of chemicals to be studied are consumer chemicals whose production and use are currently in a state of transition: per- and polyfluorinated alkyl substances (PFASs), traditional and "emerging" brominated flame retardants (BFRs and EBFRs), organophosphate esters (OPEs), and phthalate esters (PEs). Information about human exposure to these contaminants is needed due to existing data gaps on human exposure intakes from multiple exposure pathways and on relationships between internal and external exposure. Indoor environment, food and biological samples were collected from 61 participants and their households in the Oslo area (Norway) on two consecutive days, during winter 2013-14. Air, dust, hand wipes, and duplicate diet (food and drink) samples were collected as indicators of external exposure, and blood, urine, blood spots, hair, nails and saliva as indicators of internal exposure. A food diary, food frequency questionnaire (FFQ) and indoor environment questionnaire were also implemented. Approximately 2000 samples were collected in total, and participant views on their experiences of this campaign were collected via questionnaire. While 91% of our participants were positive about future participation in a similar project, some tasks were viewed as problematic. Completing the food diary and collecting duplicate food/drink portions were the tasks most frequently reported as "hard"/"very hard". Nevertheless, a strong positive correlation between the reported total mass of food/drinks in the food record and the total weight of the food/drinks in the collection bottles was observed, an indication of accurate performance.

  15. A two-stage cluster sampling method using gridded population data, a GIS, and Google Earth™ imagery in a population-based mortality survey in Iraq

    Directory of Open Access Journals (Sweden)

    Galway LP

    2012-04-01

    Full Text Available Background Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, so mortality must instead be estimated using retrospective population-based surveys. Results We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method utilizes gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage, and Google Earth™ imagery and sampling grids to select households in the second sampling stage. The sampling method was implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Conclusion Sampling is a challenge in retrospective population-based mortality studies, and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context-specific challenges of the study setting. This sampling strategy, or variations on it, is adaptable and should be considered and tested in other conflict settings.
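
    A first-stage selection over gridded population data is commonly done with systematic probability-proportional-to-size (PPS) sampling. The sketch below illustrates that idea; the cell populations, seed, and cluster count are made up for illustration and are not taken from the Iraq study.

```python
import random

def pps_systematic(populations, n_clusters, rng):
    """First-stage cluster selection with probability proportional to size.

    populations: estimated population of each grid cell (from gridded data).
    Systematic PPS: lay n_clusters equally spaced points over the cumulative
    population total, starting at a random offset; a cell is selected each
    time a point falls inside its cumulative interval.
    """
    total = sum(populations)
    step = total / n_clusters
    start = rng.uniform(0, step)
    points = [start + i * step for i in range(n_clusters)]
    selected, cumulative, j = [], 0.0, 0
    for cell, pop in enumerate(populations):
        cumulative += pop
        while j < n_clusters and points[j] <= cumulative:
            selected.append(cell)
            j += 1
    return selected
```

    Empty cells can never be hit, and heavily populated cells may be selected more than once, which is the intended PPS behavior.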

  16. Adding-point strategy for reduced-order hypersonic aerothermodynamics modeling based on fuzzy clustering

    Science.gov (United States)

    Chen, Xin; Liu, Li; Zhou, Sida; Yue, Zhenjiang

    2016-09-01

    Reduced-order models (ROMs) based on snapshots of high-fidelity CFD simulations have received great attention recently due to their capability of capturing the features of complex geometries and flow configurations. To improve the efficiency and precision of ROMs, it is indispensable to add extra sampling points to the initial snapshots, since the number of sampling points needed to achieve an adequately accurate ROM is generally unknown a priori, while a large number of initial sampling points reduces the parsimony of the ROMs. A fuzzy-clustering-based adding-point strategy is proposed, in which fuzzy clustering acts as an indicator of the region where the precision of the ROM is relatively low. The proposed method is applied to construct ROMs for benchmark mathematical examples and a numerical example of hypersonic aerothermodynamics prediction for a typical control surface. The proposed method achieves a 34.5% improvement in efficiency over the estimated mean squared error prediction algorithm while showing the same level of prediction accuracy.
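
    The fuzzy-clustering indicator rests on membership degrees like those of fuzzy c-means. A minimal 1-D sketch of the membership computation is below; the fuzzifier m = 2 and the handling of points coinciding with a center are standard textbook choices, not details from the paper.

```python
def fcm_memberships(points, centers, m=2.0):
    """Fuzzy c-means membership of each 1-D sample in each cluster.

    u[i][j] = 1 / sum_k (d_ij / d_ik)^(2/(m-1)); a point coinciding with
    a center gets full membership in that cluster. Ambiguous points
    (memberships near 1/c) can flag regions where extra samples help.
    """
    u = []
    for p in points:
        d = [abs(p - c) for c in centers]
        if 0.0 in d:
            u.append([1.0 if dj == 0.0 else 0.0 for dj in d])
            continue
        u.append([1.0 / sum((dj / dk) ** (2.0 / (m - 1.0)) for dk in d)
                  for dj in d])
    return u
```

    A point exactly between two centers gets membership 0.5 in each, the kind of ambiguity an adding-point strategy would target.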

  17. D4 S4: A Four Dimensions Instructional Strategy for Web-based and Blended Learning

    Directory of Open Access Journals (Sweden)

    Hamdy A. ABDELAZIZ,

    2012-08-01

    Full Text Available Web-based education is facing a paradigm shift under the rapid development of information and communication technology. The new paradigm of learning requires special techniques of course design, special instructional models, and special methods of evaluation. This paper investigates the effectiveness of an adaptive instructional strategy for teaching and learning through the Web and blended learning environments. The central theme of this strategy is that instructional strategies give instructors and students a conceptual as well as a practical mode of delivery from which to teach and learn. Considering and applying a new instructional strategy can help instructors to understand the uses of pedagogical content knowledge, as well as to reflect on the role of technological content knowledge, which can be adapted and/or adopted in teaching at all educational levels and environments. The main objective of this paper was to develop a holonomic instructional strategy for Web-based and blended learning, guided by the non-linear and interactive features of learning environments. The strategy consists of four dimensions: designing, developing, delving, and distributing. In this new instructional strategy, learning is holonomic and adaptive. Learning occurs in an open learning environment, in which instructors design a shared vision, develop a sharable e-learning task, delve students' learning through scaffolding, and salvage students' knowledge. The expected outcome of this instructional strategy is that each learner will develop a cognitive schema to organize and construct knowledge and meaning in similar learning contexts, which may increase the generalizability, trustworthiness, and transferability of learning. The results of applying this new strategy show that it is effective in developing both achievement and deep learning among a sample of graduate students.

  18. A RSSI-based parameter tracking strategy for constrained position localization

    Science.gov (United States)

    Du, Jinze; Diouris, Jean-François; Wang, Yide

    2017-12-01

    In this paper, a received signal strength indicator (RSSI)-based parameter tracking strategy for constrained position localization is proposed. To estimate the channel model parameters, the least mean squares (LMS) method is combined with the trilateration method. In the context of applications where positions are constrained to a grid, a novel tracking strategy is proposed to determine the real position and obtain the actual parameters in the monitored region. Based on practical data acquired from a real localization system, an experimental channel model is constructed to provide RSSI values and verify the proposed tracking strategy. Quantitative criteria are given to guarantee the efficiency of the proposed strategy by providing a trade-off between grid resolution and parameter variation. The simulation results show good behavior of the proposed strategy in the presence of space-time variation of the propagation channel. Compared with existing RSSI-based algorithms, the proposed strategy exhibits better localization accuracy but consumes more calculation time. In addition, a tracking test is performed to validate its effectiveness.
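
    RSSI channel-parameter estimation typically fits the log-distance model RSSI = A - 10·n·log10(d) from (RSSI, distance) pairs. The sketch below uses a normalized LMS update for numerical stability; the initial guesses, step size, and synthetic data are assumptions for illustration, not the paper's implementation.

```python
import math

def fit_path_loss(rssi, distances, mu=0.5, epochs=200):
    """Fit RSSI = A - 10*n*log10(d) with a normalized LMS update.

    Normalizing the step by (1 + x^2) keeps the iteration stable for
    the regressor magnitudes involved. Returns the estimates (A, n).
    """
    A, n = -40.0, 2.0  # assumed initial guesses
    for _ in range(epochs):
        for r, d in zip(rssi, distances):
            x = -10.0 * math.log10(d)     # regressor for the exponent n
            err = r - (A + n * x)         # prediction error at this sample
            step = mu * err / (1.0 + x * x)
            A += step
            n += step * x
    return A, n
```

    With the fitted (A, n), each anchor's RSSI reading inverts to a distance estimate, and trilateration over three or more anchors yields the position.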

  19. Clustering of samples and elements based on multi-variable chemical data

    International Nuclear Information System (INIS)

    Op de Beeck, J.

    1984-01-01

    Clustering and classification are defined in the context of multivariable chemical analysis data. Classical multivariate techniques, commonly used to interpret such data, are shown to be based on probabilistic and geometrical principles which are not justified for analytical data, since in that case one assumes or expects a system of more or less systematically related objects (samples) as defined by measurements on more or less systematically interdependent variables (elements). For the specific analytical problem of a data set concerning a large number of trace elements determined in a large number of samples, a deterministic cluster analysis can be used to develop the underlying classification structure. Three main steps can be distinguished: diagnostic evaluation and preprocessing of the raw input data; computation of a symmetric matrix of pairwise standardized dissimilarity values between all possible pairs of samples and/or elements; and an ultrametric clustering strategy to produce the final classification as a dendrogram. The software packages designed to perform these tasks are discussed and final results are given. Conclusions are formulated concerning the dangers of using multivariate clustering and classification software packages as a black box.
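
    One concrete instance of the dissimilarity-matrix and dendrogram steps is agglomerative single-linkage clustering, sketched below. Single linkage is an illustrative choice among the ultrametric strategies the record alludes to, not necessarily the one used there.

```python
def single_linkage(d):
    """Agglomerative clustering on a symmetric dissimilarity matrix d.

    Returns the merge history [(members, height), ...]; plotting heights
    against merges yields the dendrogram. Cluster-to-cluster dissimilarity
    is the minimum pairwise value (single linkage).
    """
    clusters = [[i] for i in range(len(d))]
    merges = []
    while len(clusters) > 1:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                height = min(d[i][j] for i in clusters[a] for j in clusters[b])
                if best is None or height < best[0]:
                    best = (height, a, b)
        height, a, b = best
        merged = sorted(clusters[a] + clusters[b])
        merges.append((merged, height))
        clusters = [c for k, c in enumerate(clusters) if k not in (a, b)]
        clusters.append(merged)
    return merges
```

    The sequence of merge heights is non-decreasing, which is exactly the ultrametric property that makes the result drawable as a dendrogram.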

  20. Twitter Strategies for Web-Based Surveying: Descriptive Analysis From the International Concussion Study.

    Science.gov (United States)

    Hendricks, Sharief; Düking, Peter; Mellalieu, Stephen D

    2016-09-01

    Social media provides researchers with an efficient means to reach and engage with a large and diverse audience. Twitter allows for virtual social interaction among a network of users, enabling researchers to recruit and administer surveys using snowball sampling. Although using Twitter to administer surveys for research is not new, strategies to improve response rates are yet to be reported. To compare the potential and actual reach of 2 Twitter accounts that administered a Web-based concussion survey to rugby players and trainers using 2 distinct Twitter-targeting strategies. Furthermore, the study sought to determine the likelihood of receiving a retweet based on the time of day and day of the week of posting. A survey based on previous concussion research was exported to the Web-based survey website SurveyMonkey. The survey comprised 2 questionnaires, one for players and one for those involved in the game (eg, coaches and athletic trainers). The Web-based survey was administered using 2 existing Twitter accounts, each executing a distinct targeting strategy. A list of potential Twitter accounts to target was drawn up, together with a list of predesigned tweets. The list of accounts to target was divided into 'High-Profile' and 'Low-Profile', based on each account's capacity to attract publicity and high social interaction. The potential reach (number of followers of the targeted account) and actual reach (number of retweets received by each post) of the 2 strategies were compared. The number of retweets received by each account was further analyzed to understand the most likely time of day, and day of the week, a retweet would be received. The number of retweets received by a Twitter account decreased by 72% when using the 'high-profile strategy' compared with the 'low-profile strategy' (incidence rate ratio (IRR) 0.28, 95% confidence interval (CI) 0.21-0.37, P<.001) and 6 PM to 11:59 PM (IRR 1.48, 95% CI 1

  1. Strategies of molecular imprinting-based fluorescence sensors for chemical and biological analysis.

    Science.gov (United States)

    Yang, Qian; Li, Jinhua; Wang, Xiaoyan; Peng, Hailong; Xiong, Hua; Chen, Lingxin

    2018-07-30

    One pressing concern today is to construct sensors that can withstand various disturbances for highly selective and sensitive detection of trace analytes in complicated samples. Molecularly imprinted polymers (MIPs) with tailor-made binding sites are preferred as recognition elements in sensors for effective target detection, and fluorescence measurement assists in highly sensitive detection and user-friendly control. Accordingly, molecular imprinting-based fluorescence sensors (MI-FL sensors) have attracted great research interest in many fields, such as chemical and biological analysis. Herein, we comprehensively review recent advances in MI-FL sensor construction and applications, giving insights into sensing principles and signal transduction mechanisms, focusing on general construction strategies for intrinsically fluorescent or nonfluorescent analytes and improvement strategies for sensing performance, particularly sensitivity. Construction strategies are overviewed, mainly including the traditional indirect methods of competitive binding against pre-bound fluorescent indicators, employment of fluorescent functional monomers and embedding of fluorescent substances, and novel rational designs of hierarchical architectures (core-shell/hollow and mesoporous structures), post-imprinting modification, and ratiometric fluorescence detection. Furthermore, MI-FL sensor-based microdevices are discussed, involving micromotors, test strips and microfluidics, which are more portable for rapid point-of-care detection and in-field diagnosis. Finally, the current challenges and future perspectives of MI-FL sensors are proposed. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Effect Of Constructivist-Based Teaching Strategy On Academic ...

    African Journals Online (AJOL)

    Research reports indicate that this negative attitude was caused mainly by teachers' conventional (lecture) method of teaching integrated science. Research reports on the effectiveness of constructivist-based teaching strategies reveal that the strategy enhanced students' academic performance. In view of this, this study ...

  3. Alpha Matting with KL-Divergence Based Sparse Sampling.

    Science.gov (United States)

    Karacan, Levent; Erdem, Aykut; Erdem, Erkut

    2017-06-22

    In this paper, we present a new sampling-based alpha matting approach for the accurate estimation of foreground and background layers of an image. Previous sampling-based methods typically rely on certain heuristics in collecting representative samples from known regions, and thus their performance deteriorates if the underlying assumptions are not satisfied. To alleviate this, we take an entirely new approach and formulate sampling as a sparse subset selection problem where we propose to pick a small set of candidate samples that best explains the unknown pixels. Moreover, we describe a new dissimilarity measure for comparing two samples which is based on the KL-divergence between the distributions of features extracted in the vicinity of the samples. The proposed framework is general and could be easily extended to video matting by additionally taking temporal information into account in the sampling process. Evaluation on standard benchmark datasets for image and video matting demonstrates that our approach provides more accurate results compared to the state-of-the-art methods.
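
    A simplified version of such a KL-based dissimilarity fits a univariate Gaussian to the feature values around each sample and symmetrizes the closed-form Gaussian KL divergence. This is a sketch of the idea under that Gaussian assumption, not the paper's exact measure (which operates on richer feature distributions).

```python
import math

def gaussian_kl(m1, s1, m2, s2):
    """KL(N(m1, s1^2) || N(m2, s2^2)) in closed form."""
    return math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5

def symmetric_kl(a, b):
    """Symmetrized KL between Gaussians fitted to two 1-D feature samples."""
    def fit(x):
        m = sum(x) / len(x)
        v = sum((t - m) ** 2 for t in x) / len(x)
        return m, math.sqrt(v) + 1e-12  # guard against zero variance
    m1, s1 = fit(a)
    m2, s2 = fit(b)
    return gaussian_kl(m1, s1, m2, s2) + gaussian_kl(m2, s2, m1, s1)
```

    Two samples drawn from the same neighborhood score near zero; the score grows as their local feature distributions drift apart.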

  4. Planning schistosomiasis control: investigation of alternative sampling strategies for Schistosoma mansoni to target mass drug administration of praziquantel in East Africa.

    Science.gov (United States)

    Sturrock, Hugh J W; Gething, Pete W; Ashton, Ruth A; Kolaczinski, Jan H; Kabatereine, Narcis B; Brooker, Simon

    2011-09-01

    In schistosomiasis control, there is a need to geographically target treatment to populations at high risk of morbidity. This paper evaluates alternative sampling strategies for surveys of Schistosoma mansoni to target mass drug administration in Kenya and Ethiopia. Two main designs are considered: lot quality assurance sampling (LQAS) of children from all schools; and a geostatistical design that samples a subset of schools and uses semi-variogram analysis and spatial interpolation to predict prevalence in the remaining unsurveyed schools. Computerized simulations are used to investigate the performance of the sampling strategies in correctly classifying schools according to treatment needs and their cost-effectiveness in identifying high prevalence schools. LQAS performs better than geostatistical sampling in correctly classifying schools, but at a higher cost per high-prevalence school correctly classified. It is suggested that the optimal surveying strategy for S. mansoni needs to take into account the goals of the control programme and the financial and drug resources available.
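
    The core of an LQAS design is a binomial decision rule: sample n children per school and classify the school as high prevalence when the count of positives reaches a threshold d. The operating characteristic of that rule can be computed directly; the particular n and d below are illustrative, not the values used in the study.

```python
from math import comb

def prob_classified_high(n, d, p):
    """Probability that LQAS classifies a school as high prevalence:
    P(X >= d) with X ~ Binomial(n, p), where n children are sampled
    and d is the decision threshold of positive cases."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(d, n + 1))
```

    Tabulating this over a range of true prevalences p gives the operating characteristic curve used to choose (n, d) for acceptable misclassification risks on both sides of the treatment threshold.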

  5. Strategy community development based on local resources

    Science.gov (United States)

    Meirinawati; Prabawati, I.; Pradana, G. W.

    2018-01-01

    The problems of developing regions are largely economic and are often caused by a region's inability to respond to changes in economic conditions; hence the need for community development programs to solve them. Improving community welfare requires efforts matched to the real conditions and needs of each region. Community development based on local resources is a very important process, because it builds human resource capability for the optimal utilization of local resource potential. A strategy is therefore needed for community development based on local resources. The community development strategies are as follows: (1) "Eight Line Equalization Plus", which explains the urgency of rural industrialization; (2) village development is more successful when the strategies combined are tailored to regional conditions; (3) facilitators position themselves as planners, supervisors, information providers, motivators, connectors, and evaluators.

  6. Control Strategies for the DAB Based PV Interface System.

    Directory of Open Access Journals (Sweden)

    Hadi M El-Helw

    Full Text Available This paper presents an interface system based on the Dual Active Bridge (DAB) converter for photovoltaic (PV) arrays. Two control strategies are proposed for the DAB converter to harvest the maximum power from the PV array. The first strategy is based on a simple PI controller to regulate the terminal PV voltage through the phase shift angle of the DAB converter. The Perturb and Observe (P&O) Maximum Power Point Tracking (MPPT) technique is utilized to set the reference of the PV terminal voltage. The second strategy presented in this paper employs an Artificial Neural Network (ANN) to directly set the phase shift angle of the DAB converter that results in harvesting maximum power. This feed-forward strategy overcomes the stability issues of the feedback strategy. The proposed PV interface systems are modeled and simulated using MATLAB/SIMULINK and the EMTDC/PSCAD software packages. The simulation results reveal accurate and fast response of the proposed systems. The dynamic performance of the proposed feed-forward strategy outdoes that of the feedback strategy in terms of accuracy and response time. Moreover, an experimental prototype is built to test and validate the proposed PV interface system.
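
    The P&O MPPT rule mentioned above perturbs the voltage reference and keeps moving in the same direction while measured power rises, reversing otherwise. A minimal sketch follows; the step size and the toy power curve in the usage are assumptions for illustration, not the paper's converter model.

```python
def po_step(v, p, v_prev, p_prev, dv=0.1):
    """One perturb-and-observe iteration: keep perturbing the PV voltage
    reference in the same direction while power rises, reverse otherwise.
    Returns the next voltage reference."""
    if p >= p_prev:
        return v + (dv if v >= v_prev else -dv)
    return v + (-dv if v >= v_prev else dv)
```

    Run in closed loop against a power curve, the reference ratchets toward the maximum power point and then oscillates around it with amplitude set by the perturbation step dv.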

  7. Model For Marketing Strategy Decision Based On Multicriteria Decicion Making: A Case Study In Batik Madura Industry

    Science.gov (United States)

    Anna, I. D.; Cahyadi, I.; Yakin, A.

    2018-01-01

    Selection of a marketing strategy is a prominent competitive advantage for small and medium enterprise business development. The selection process is a multiple-criteria decision-making problem, which includes evaluation of various attributes or criteria in the process of strategy formulation. The objective of this paper is to develop a model for the selection of a marketing strategy in the Batik Madura industry. The current study proposes an integrated approach based on the analytic network process (ANP) and the technique for order preference by similarity to ideal solution (TOPSIS) to determine the best strategy for Batik Madura marketing problems. Based on the results of a group decision-making technique, this study selected fourteen criteria, including consistency, cost, trend following, customer loyalty, business volume, uniqueness, manpower, customer numbers, promotion, branding, business network, outlet location, credibility, and innovation as Batik Madura marketing strategy evaluation criteria. A survey questionnaire developed from a literature review was distributed to a sample frame of Batik Madura SMEs in Pamekasan. In the decision procedure step, expert evaluators were asked to establish the decision matrix by comparing the marketing strategy alternatives under each of the individual criteria. Then, considerations obtained from the ANP and TOPSIS methods were applied to build the specific criteria constraints and the range of the launch strategy in the model. The model in this study demonstrates that, under the current business situation, the straight-focus marketing strategy is the best marketing strategy for Batik Madura SMEs in Pamekasan.
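
    The TOPSIS step ranks alternatives by relative closeness to an ideal solution. The sketch below implements the standard vector-normalized variant; the toy decision matrix and equal weights are invented for illustration and are unrelated to the study's fourteen criteria (whose weights would come from the ANP step).

```python
import math

def topsis(matrix, weights, benefit):
    """Closeness-to-ideal scores for a decision matrix.

    matrix[i][j]: score of alternative i on criterion j.
    benefit[j]: True for criteria to maximize (e.g. customer loyalty),
    False for criteria to minimize (e.g. cost).
    """
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) if benefit[j]
            else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_plus = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_minus = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores
```

    The alternative with the highest score is closest to the ideal and farthest from the anti-ideal, and is the one recommended.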

  8. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in the stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without the burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT)-estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of the RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS), which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of the RT-estimator decays much faster than that of RDS with time. The newly developed RW based estimators (RL- and RT-estimators) avoid the burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time.
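
    The regeneration idea can be illustrated on a toy graph: cut the walk into tours at each return to an anchor node and reweight visits by 1/degree, an RDS-style ratio estimator. This is a simplified sketch of the tour mechanism, not the paper's RT-estimator (which aggregates a super-node and includes further corrections); the graph, function values, and seed are invented.

```python
import random

def tour_estimate(adj, f, anchor, n_tours, rng):
    """Estimate the average of f over nodes with a random walk that
    regenerates each time it returns to `anchor`. Visits are weighted
    by 1/degree to undo the degree bias of the walk's stationary
    distribution (a ratio estimator, as in respondent-driven sampling).
    """
    num = den = 0.0
    v, tours = anchor, 0
    while tours < n_tours:
        v = rng.choice(adj[v])
        num += f[v] / len(adj[v])
        den += 1.0 / len(adj[v])
        if v == anchor:
            tours += 1  # a tour ends: the walk has regenerated
    return num / den
```

    Because tours are independent and identically distributed, no burn-in is discarded and confidence intervals can be built from per-tour sums.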

  9. SDP Policy Iteration-Based Energy Management Strategy Using Traffic Information for Commuter Hybrid Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Xiaohong Jiao

    2014-07-01

    Full Text Available This paper demonstrates an energy management method using traffic information for commuter hybrid electric vehicles. A control strategy based on stochastic dynamic programming (SDP) is developed, which minimizes on average the equivalent fuel consumption, while satisfying the battery charge-sustaining constraints and the overall vehicle power demand for drivability. First, according to the sample information of the traffic speed profiles, the regular route is divided into several segments and the statistical characteristics of the different segments are constructed from gathered data on averaged vehicle speeds. Then, the energy management problem is formulated as a stochastic nonlinear and constrained optimal control problem, and a modified policy iteration algorithm is utilized to generate a time-invariant state-dependent power split strategy. Finally, simulation results over several driving cycles are presented to demonstrate the effectiveness of the proposed energy management strategy.
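
    Policy iteration alternates policy evaluation with greedy improvement until the policy stops changing. The sketch below runs it on an invented two-state battery toy (low/high charge, motor-only vs engine actions); the costs, transitions, and discount factor are illustrative assumptions, far simpler than the paper's constrained SDP formulation.

```python
def policy_iteration(P, c, gamma=0.9):
    """Minimum-cost policy iteration.

    P[a][s][t]: transition probability from state s to t under action a.
    c[a][s]: stage cost of taking action a in state s.
    Returns the optimal (policy, value function) for the discounted cost.
    """
    n = len(P[0])
    policy = [0] * n
    while True:
        # policy evaluation by repeated backup (converges for gamma < 1)
        V = [0.0] * n
        for _ in range(500):
            V = [c[policy[s]][s]
                 + gamma * sum(P[policy[s]][s][t] * V[t] for t in range(n))
                 for s in range(n)]
        # greedy policy improvement against the evaluated values
        new = [min(range(len(P)),
                   key=lambda a: c[a][s]
                   + gamma * sum(P[a][s][t] * V[t] for t in range(n)))
               for s in range(n)]
        if new == policy:
            return policy, V
        policy = new
```

    In the toy below the optimal split is intuitive: run the engine when the battery is low (avoiding the deep-discharge penalty) and the motor when it is high.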

  10. Establishing the ACORN National Practitioner Database: Strategies to Recruit Practitioners to a National Practice-Based Research Network.

    Science.gov (United States)

    Adams, Jon; Steel, Amie; Moore, Craig; Amorin-Woods, Lyndon; Sibbritt, David

    2016-10-01

    The purpose of this paper is to report on the recruitment and promotion strategies employed by the Australian Chiropractic Research Network (ACORN) project aimed at helping recruit a substantial national sample of participants and to describe the features of our practice-based research network (PBRN) design that may provide key insights to others looking to establish a similar network or draw on the ACORN project to conduct sub-studies. The ACORN project followed a multifaceted recruitment and promotion strategy drawing on distinct branding, a practitioner-focused promotion campaign, and a strategically designed questionnaire and distribution/recruitment approach to attract sufficient participation from the ranks of registered chiropractors across Australia. From the 4684 chiropractors registered at the time of recruitment, the project achieved a database response rate of 36% (n = 1680), resulting in a large, nationally representative sample across age, gender, and location. This sample constitutes the largest proportional coverage of participants from any voluntary national PBRN across any single health care profession. It does appear that a number of key promotional and recruitment features of the ACORN project may have helped establish the high response rate for the PBRN, which constitutes an important sustainable resource for future national and international efforts to grow the chiropractic evidence base and research capacity. Further rigorous enquiry is needed to help evaluate the direct contribution of specific promotional and recruitment strategies in attaining high response rates from practitioner populations who may be invited to participate in future PBRNs. Copyright © 2016. Published by Elsevier Inc.

  11. Catch, effort and sampling strategies in the highly variable sardine fisheries around East Java, Indonesia.

    NARCIS (Netherlands)

    Pet, J.S.; Densen, van W.L.T.; Machiels, M.A.M.; Sukkel, M.; Setyohady, D.; Tumuljadi, A.

    1997-01-01

    Temporal and spatial patterns in the fishery for Sardinella spp. around East Java, Indonesia, were studied in an attempt to develop an efficient catch and effort sampling strategy for this highly variable fishery. The inter-annual and monthly variation in catch, effort and catch per unit of effort

  12. Affect-Aware Adaptive Tutoring Based on Human-Automation Etiquette Strategies.

    Science.gov (United States)

    Yang, Euijung; Dorneich, Michael C

    2018-06-01

    We investigated adapting the interaction style of intelligent tutoring system (ITS) feedback based on human-automation etiquette strategies. Most ITSs adapt the content difficulty level, adapt the feedback timing, or provide extra content when they detect cognitive or affective decrements. Our previous work demonstrated that changing the interaction style via different feedback etiquette strategies has differential effects on students' motivation, confidence, satisfaction, and performance. The best etiquette strategy was also determined by user frustration. Based on these findings, a rule set was developed that systemically selected the proper etiquette strategy to address one of four learning factors (motivation, confidence, satisfaction, and performance) under two different levels of user frustration. We explored whether etiquette strategy selection based on this rule set (systematic) or random changes in etiquette strategy for a given level of frustration affected the four learning factors. Participants solved mathematics problems under different frustration conditions with feedback that adapted dynamic changes in etiquette strategies either systematically or randomly. The results demonstrated that feedback with etiquette strategies chosen systematically via the rule set could selectively target and improve motivation, confidence, satisfaction, and performance more than changing etiquette strategies randomly. The systematic adaptation was effective no matter the level of frustration for the participant. If computer tutors can vary the interaction style to effectively mitigate negative emotions, then ITS designers would have one more mechanism in which to design affect-aware adaptations that provide the proper responses in situations where human emotions affect the ability to learn.

  13. Sampling strategies for subsampled segmented EPI PRF thermometry in MR guided high intensity focused ultrasound

    Science.gov (United States)

    Odéen, Henrik; Todd, Nick; Diakite, Mahamadou; Minalga, Emilee; Payne, Allison; Parker, Dennis L.

    2014-01-01

    Purpose: To investigate k-space subsampling strategies to achieve fast, large field-of-view (FOV) temperature monitoring using segmented echo planar imaging (EPI) proton resonance frequency shift thermometry for MR guided high intensity focused ultrasound (MRgHIFU) applications. Methods: Five different k-space sampling approaches were investigated, varying sample spacing (equally vs nonequally spaced within the echo train), sampling density (variable sampling density in zero, one, and two dimensions), and utilizing sequential or centric sampling. Three of the schemes utilized sequential sampling with the sampling density varied in zero, one, and two dimensions, to investigate sampling the k-space center more frequently. Two of the schemes utilized centric sampling to acquire the k-space center with a longer echo time for improved phase measurements, and vary the sampling density in zero and two dimensions, respectively. Phantom experiments and a theoretical point spread function analysis were performed to investigate their performance. Variable density sampling in zero and two dimensions was also implemented in a non-EPI GRE pulse sequence for comparison. All subsampled data were reconstructed with a previously described temporally constrained reconstruction (TCR) algorithm. Results: The accuracy of each sampling strategy in measuring the temperature rise in the HIFU focal spot was measured in terms of the root-mean-square-error (RMSE) compared to fully sampled “truth.” For the schemes utilizing sequential sampling, the accuracy was found to improve with the dimensionality of the variable density sampling, giving values of 0.65 °C, 0.49 °C, and 0.35 °C for density variation in zero, one, and two dimensions, respectively. The schemes utilizing centric sampling were found to underestimate the temperature rise, with RMSE values of 1.05 °C and 1.31 °C, for variable density sampling in zero and two dimensions, respectively. Similar subsampling schemes
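
    The accuracy metric above (RMSE of the reconstructed temperature against the fully sampled "truth") can be sketched generically as follows; the function and variable names are ours, not from the paper:

```python
import numpy as np

def temperature_rmse(t_recon, t_full):
    """Root-mean-square error of a reconstructed temperature map against
    the fully sampled "truth" map, in the map's own units (degrees C here)."""
    t_recon = np.asarray(t_recon, dtype=float)
    t_full = np.asarray(t_full, dtype=float)
    return float(np.sqrt(np.mean((t_recon - t_full) ** 2)))
```

    In the study, such a value would be computed over the HIFU focal-spot voxels for each subsampling scheme and compared across schemes.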

  14. IPTV program recommendation based on combination strategies

    Directory of Open Access Journals (Sweden)

    Li Hao

    2018-01-01

    Full Text Available As a new interactive service technology, IPTV has been studied extensively in the field of TV program recommendation, but the sparsity of the user-program rating matrix and the cold-start problem are bottlenecks for accurate program recommendation. In this paper, a flexible combination of two recommendation strategies is proposed, which addresses the sparsity and cold-start problems as well as the issue of user interest changing over time. A content-based filtering section and a collaborative filtering section were implemented according to the two combination strategies, which effectively solved the cold-start and sparsity problems and the problem of user interest changing over time. The experimental results showed that the combinational recommendation system with optimal parameters outperforms either combination strategy used individually, as well as no combination strategy at all: the reduction in MAE is in the range [2.7%, 3%], and the increases in precision and recall are in the ranges [13.8%, 95.5%] and [0, 97.8%], respectively.
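
    The kind of cold-start-aware blending of a content-based section with a collaborative section described above can be sketched as follows; the linear weighting rule and its threshold are our own illustrative assumptions, not the paper's combination strategies:

```python
import numpy as np

def blended_score(content_score, collab_score, n_ratings, full_trust_at=5):
    """Blend a content-based score with a collaborative-filtering score.

    With few ratings (cold start) the collaborative score is unreliable,
    so the weight shifts toward content-based filtering; as ratings
    accumulate, the collaborative score dominates.
    """
    w = min(n_ratings / full_trust_at, 1.0)   # 0 = pure content, 1 = pure CF
    return (1.0 - w) * content_score + w * collab_score

def mean_absolute_error(predicted, actual):
    """MAE, the accuracy metric reported in the abstract."""
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.abs(predicted - actual).mean())
```

    A time-decay factor on older ratings would be one simple way to also capture drifting user interest, but that extension is not shown here.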

  15. Sample Based Unit Liter Dose Estimates

    International Nuclear Information System (INIS)

    JENSEN, L.

    2000-01-01

    The Tank Waste Characterization Program has taken many core samples, grab samples, and auger samples from the single-shell and double-shell tanks during the past 10 years. Consequently, the amount of sample data available has increased, both in terms of the quantity of sample results and the number of tanks characterized. More and better data are available than when the current radiological and toxicological source terms used in the Basis for Interim Operation (BIO) (FDH 1999a) and the Final Safety Analysis Report (FSAR) (FDH 1999b) were developed. The Nuclear Safety and Licensing (NS and L) organization wants to use the new data to upgrade the radiological and toxicological source terms used in the BIO and FSAR. The NS and L organization requested assistance in producing a statistically based process for developing the source terms. This report describes the statistical techniques used and the assumptions made to support the development of a new radiological source term for liquid and solid wastes stored in single-shell and double-shell tanks. The results given in this report are a revision of similar results given in an earlier version of the document (Jensen and Wilmarth 1999). The main difference between the results in this document and the earlier version is that the dose conversion factors (DCF) for converting μCi/g or μCi/L to Sv/L (sieverts per liter) have changed. There are now two DCFs, one based on ICRP-68 and one based on ICRP-71 (Brevick 2000)

  16. Control strategies for VSC-based HVDC transmission system

    DEFF Research Database (Denmark)

    Stan, Ana-Irina; Stroe, Daniel Ioan; Silva, Rodrigo Da

    2011-01-01

    Throughout this paper the modeling and control of VSC-based HVDC systems are investigated and described. Two different control methods capable of controlling such systems are proposed. Both developed control strategies are implemented in the dq synchronous reference frame. In order to analyze the behavior of the developed VSC-based HVDC transmission system, two study cases are carried out using MATLAB/Simulink. The results obtained from simulations show acceptable performance of the proposed strategies when changes in the reference parameters are considered. The active power flow between...

  17. Field screening sampling and analysis strategy and methodology for the 183-H Solar Evaporation Basins: Phase 2, Soils

    International Nuclear Information System (INIS)

    Antipas, A.; Hopkins, A.M.; Wasemiller, M.A.; McCain, R.G.

    1996-01-01

    This document provides a sampling/analytical strategy and methodology for Resource Conservation and Recovery Act (RCRA) closure of the 183-H Solar Evaporation Basins within the boundaries and requirements identified in the initial Phase II Sampling and Analysis Plan for RCRA Closure of the 183-H Solar Evaporation Basins

  18. Chemometric classification of casework arson samples based on gasoline content.

    Science.gov (United States)

    Sinkov, Nikolai A; Sandercock, P Mark L; Harynuk, James J

    2014-02-01

    Detection and identification of ignitable liquids (ILs) in arson debris is a critical part of arson investigations. The challenge of this task is due to the complex and unpredictable chemical nature of arson debris, which also contains pyrolysis products from the fire. ILs, most commonly gasoline, are complex chemical mixtures containing hundreds of compounds that will be consumed or otherwise weathered by the fire to varying extents depending on factors such as temperature, air flow, the surface on which IL was placed, etc. While methods such as ASTM E-1618 are effective, data interpretation can be a costly bottleneck in the analytical process for some laboratories. In this study, we address this issue through the application of chemometric tools. Prior to the application of chemometric tools such as PLS-DA and SIMCA, issues of chromatographic alignment and variable selection need to be addressed. Here we use an alignment strategy based on a ladder consisting of perdeuterated n-alkanes. Variable selection and model optimization were automated using a hybrid backward elimination (BE) and forward selection (FS) approach guided by the cluster resolution (CR) metric. In this work, we demonstrate the automated construction, optimization, and application of chemometric tools to casework arson data. The resulting PLS-DA and SIMCA classification models, trained with 165 training set samples, provided classification of 55 validation set samples based on gasoline content with 100% specificity and sensitivity. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
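
    The alkane-ladder alignment step can be sketched as a piecewise-linear mapping of observed retention times onto a reference axis defined by the perdeuterated n-alkane ladder; this illustrates the idea only and is not the authors' alignment algorithm:

```python
import numpy as np

def align_retention_times(rt_observed, ladder_observed, ladder_reference):
    """Map observed retention times onto a reference time scale using the
    retention times of a perdeuterated n-alkane ladder measured in the same
    run (piecewise-linear interpolation between ladder points)."""
    return np.interp(rt_observed, ladder_observed, ladder_reference)
```

    Aligning all chromatograms to a common axis this way lets the same variable (time bin) refer to the same compound across samples before PLS-DA/SIMCA modeling.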

  19. Effects of Direct Fuel Injection Strategies on Cycle-by-Cycle Variability in a Gasoline Homogeneous Charge Compression Ignition Engine: Sample Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Jacek Hunicz

    2015-01-01

    Full Text Available In this study we summarize and analyze experimental observations of cyclic variability in homogeneous charge compression ignition (HCCI combustion in a single-cylinder gasoline engine. The engine was configured with negative valve overlap (NVO to trap residual gases from prior cycles and thus enable auto-ignition in successive cycles. Correlations were developed between different fuel injection strategies and cycle average combustion and work output profiles. Hypothesized physical mechanisms based on these correlations were then compared with trends in cycle-by-cycle predictability as revealed by sample entropy. The results of these comparisons help to clarify how fuel injection strategy can interact with prior cycle effects to affect combustion stability and so contribute to design control methods for HCCI engines.
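
    Sample entropy, the cycle-to-cycle predictability measure used above, can be sketched with a standard implementation; m = 2 and a tolerance of 0.2 standard deviations are common textbook defaults, not necessarily the settings used in the paper:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series: the negative log of the
    conditional probability that sequences matching for m points also match
    for m + 1 points. Lower values mean a more regular, predictable series."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()  # tolerance as a fraction of the series' std dev

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        total = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to every template
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            total += int(np.sum(dist <= tol)) - 1  # exclude the self-match
        return total

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

    Applied to a sequence of per-cycle combustion metrics (e.g., indicated work per cycle), higher SampEn indicates greater cycle-by-cycle unpredictability.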

  20. Sampling strategies and stopping criteria for stochastic dual dynamic programming: a case study in long-term hydrothermal scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Homem-de-Mello, Tito [University of Illinois at Chicago, Department of Mechanical and Industrial Engineering, Chicago, IL (United States); Matos, Vitor L. de; Finardi, Erlon C. [Universidade Federal de Santa Catarina, LabPlan - Laboratorio de Planejamento de Sistemas de Energia Eletrica, Florianopolis (Brazil)

    2011-03-15

    The long-term hydrothermal scheduling is one of the most important problems to be solved in the power systems area. This problem aims to obtain an optimal policy, under water (energy) resources uncertainty, for hydro and thermal plants over a multi-annual planning horizon. It is natural to model the problem as a multi-stage stochastic program, a class of models for which algorithms have been developed. The original stochastic process is represented by a finite scenario tree and, because of the large number of stages, a sampling-based method such as the Stochastic Dual Dynamic Programming (SDDP) algorithm is required. The purpose of this paper is two-fold. Firstly, we study the application of two alternative sampling strategies to the standard Monte Carlo - namely, Latin hypercube sampling and randomized quasi-Monte Carlo - for the generation of scenario trees, as well as for the sampling of scenarios that is part of the SDDP algorithm. Secondly, we discuss the formulation of stopping criteria for the optimization algorithm in terms of statistical hypothesis tests, which allows us to propose an alternative criterion that is more robust than that originally proposed for the SDDP. We test these ideas on a problem associated with the whole Brazilian power system, with a three-year planning horizon. (orig.)
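
    Latin hypercube sampling, one of the two alternatives to standard Monte Carlo studied here, stratifies each input dimension so that every stratum is sampled exactly once. A minimal sketch on the unit hypercube (our own illustration, not the authors' implementation):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Latin hypercube sample on [0, 1)^n_dims: each axis is divided into
    n_samples equal strata and exactly one point lands in each stratum."""
    rng = np.random.default_rng(seed)
    # One uniform draw inside each stratum, per dimension...
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    # ...then permute the strata independently per dimension to decouple the axes.
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])
    return u
```

    Compared with plain Monte Carlo, this guarantees even marginal coverage of each uncertain dimension with far fewer scenarios, which is the appeal for scenario-tree generation.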

  1. School-Based and Community-Based Gun Safety Educational Strategies for Injury Prevention.

    Science.gov (United States)

    Holly, Cheryl; Porter, Sallie; Kamienski, Mary; Lim, Aubrianne

    2018-05-01

    Nearly 1,300 children in the United States die because of firearm-related injury each year and another 5,790 survive gunshot wounds, making the prevention of firearm-related unintentional injury to children of vital importance to families, health professionals, and policy makers. To systematically review the evidence on school-based and community-based gun safety programs for children aged 3 to 18 years. Systematic review. Twelve databases were searched from their earliest records to December 2016. Interventional and analytic studies were sought, including randomized controlled trials, quasi-experimental studies, as well as before-and-after studies or cohort studies with or without a control that involved an intervention. The low level of evidence, heterogeneity of studies, and lack of consistent outcome measures precluded a pooled estimate of results. A best evidence synthesis was performed. Results support the premise that programs using either knowledge-based or active learning strategies or a combination of these may be insufficient for teaching gun safety skills to children. Gun safety programs do not improve the likelihood that children will not handle firearms in an unsupervised situation. Stronger research designs with larger samples are needed to determine the most effective way to transfer the use of the gun safety skills outside the training session and enable stronger conclusions to be drawn.

  2. Soil sampling and analytical strategies for mapping fallout in nuclear emergencies based on the Fukushima Dai-ichi Nuclear Power Plant accident

    International Nuclear Information System (INIS)

    Onda, Yuichi; Kato, Hiroaki; Hoshi, Masaharu; Takahashi, Yoshio; Nguyen, Minh-Long

    2015-01-01

    The Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in extensive radioactive contamination of the environment via deposited radionuclides such as radiocesium and 131 I. Evaluating the extent and level of environmental contamination is critical to protecting citizens in affected areas and to planning decontamination efforts. However, a standardized soil sampling protocol is needed in such emergencies to facilitate the collection of large, tractable samples for measuring gamma-emitting radionuclides. In this study, we developed an emergency soil sampling protocol based on preliminary sampling from the FDNPP accident-affected area. We also present the results of a preliminary experiment aimed at evaluating the influence of various procedures (e.g., mixing, number of samples) on measured radioactivity. Results show that sample mixing strongly affects measured radioactivity in soil samples. Furthermore, for homogenization, shaking the plastic sample container at least 150 times or disaggregating the soil by hand-rolling in a disposable plastic bag is required. Finally, we determined that five soil samples within a 3 m × 3 m area are the minimum number required for reducing measurement uncertainty in the emergency soil sampling protocol proposed here. - Highlights: • Emergency soil sampling protocol was proposed for nuclear hazards. • Various sampling procedures were tested and evaluated in the Fukushima area. • Soil sample mixing procedure was of key importance for measured radioactivity. • Minimum number of samples was determined for reducing measurement uncertainty

  3. Design Strategies for Aptamer-Based Biosensors

    Science.gov (United States)

    Han, Kun; Liang, Zhiqiang; Zhou, Nandi

    2010-01-01

    Aptamers have been widely used as recognition elements for biosensor construction, especially in the detection of proteins or small molecule targets, and regarded as promising alternatives for antibodies in bioassay areas. In this review, we present an overview of reported design strategies for the fabrication of biosensors and classify them into four basic modes: target-induced structure switching mode, sandwich or sandwich-like mode, target-induced dissociation/displacement mode and competitive replacement mode. In view of the unprecedented advantages brought about by aptamers and smart design strategies, aptamer-based biosensors are expected to be one of the most promising devices in bioassay related applications. PMID:22399891

  4. Strategy2D: Turn-based Strategy Video Game Engine for Mobile Devices

    OpenAIRE

    Calvo Villazón, Javier

    2014-01-01

    A multi-platform video game engine for the development of turn-based strategy games for mobile devices. Developed in C++ within the Cocos2d-x framework, it provides a scalable and configurable tool for the creation of this type of game.

  5. Autonomous spatially adaptive sampling in experiments based on curvature, statistical error and sample spacing with applications in LDA measurements

    Science.gov (United States)

    Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.

    2015-06-01

    Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest or insufficient sample density at important features, or both. A new adaptive sampling technique is presented directing sample collection in proportion to local information content, capturing adequately the short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy on the basis of signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined uncertainty level overall while improving local accuracy for important features. The potential of the proposed method has been further demonstrated on the basis of Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
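
    The curvature criterion can be sketched as a sampling-density weight proportional to the magnitude of the local second derivative; this illustrates only one of the method's criteria (the statistical-error and sample-spacing terms are omitted), and the function is our own construction:

```python
import numpy as np

def curvature_density(x, y, floor=1e-6):
    """Relative sampling density proportional to |d2y/dx2|, so that sharply
    curved regions of the signal receive more samples than flat ones.
    A small floor keeps low-interest regions from being starved entirely."""
    d2 = np.gradient(np.gradient(y, x), x)  # numerical second derivative
    w = np.abs(d2) + floor
    return w / w.sum()  # normalized weights over the current sample locations
```

    In an iterative scheme, new measurement locations would be drawn in proportion to these weights and the weights recomputed as the signal estimate improves.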

  6. The association between implementation strategy use and the uptake of hepatitis C treatment in a national sample.

    Science.gov (United States)

    Rogal, Shari S; Yakovchenko, Vera; Waltz, Thomas J; Powell, Byron J; Kirchner, JoAnn E; Proctor, Enola K; Gonzalez, Rachel; Park, Angela; Ross, David; Morgan, Timothy R; Chartier, Maggie; Chinman, Matthew J

    2017-05-11

    Hepatitis C virus (HCV) is a common and highly morbid illness. New medications that have much higher cure rates have become the new evidence-based practice in the field. Understanding the implementation of these new medications nationally provides an opportunity to advance the understanding of the role of implementation strategies in clinical outcomes on a large scale. The Expert Recommendations for Implementing Change (ERIC) study defined discrete implementation strategies and clustered these strategies into groups. The present evaluation assessed the use of these strategies and clusters in the context of HCV treatment across the US Department of Veterans Affairs (VA), Veterans Health Administration, the largest provider of HCV care nationally. A 73-item survey was developed and sent to all VA sites treating HCV via electronic survey, to assess whether or not a site used each ERIC-defined implementation strategy related to employing the new HCV medication in 2014. VA national data regarding the number of Veterans starting on the new HCV medications at each site were collected. The associations between treatment starts and the number and type of implementation strategies were assessed. A total of 80 (62%) sites responded. Respondents endorsed an average of 25 ± 14 strategies. The number of treatment starts was positively and significantly correlated with the total number of strategies endorsed (r = 0.43). Sites in the highest quartile of treatment starts endorsed significantly more strategies than the 15 endorsed by sites in the lowest quartile. There were significant differences in the types of strategies endorsed by sites in the highest and lowest quartiles of treatment starts. Four of the 10 top strategies for sites in the top quartile had significant correlations with treatment starts, compared to only 1 of the 10 top strategies in the bottom quartile sites. Overall, only 3 of the top 15 most frequently used strategies were associated with treatment. These results suggest that sites that used a greater number of implementation

  7. Rule-based energy management strategies for hybrid vehicles

    NARCIS (Netherlands)

    Hofman, T.; Druten, van R.M.; Serrarens, A.F.A.; Steinbuch, M.

    2007-01-01

    Int. J. of Electric and Hybrid Vehicles (IJEHV), The highest control layer of a (hybrid) vehicular drive train is termed the Energy Management Strategy (EMS). In this paper an overview of different control methods is given and a new rule-based EMS is introduced based on the combination of Rule-Based

  8. A New Treatment Strategy for Inactivating Algae in Ballast Water Based on Multi-Trial Injections of Chlorine.

    Science.gov (United States)

    Sun, Jinyang; Wang, Junsheng; Pan, Xinxiang; Yuan, Haichao

    2015-06-09

    Ships' ballast water can carry aquatic organisms into foreign ecosystems. In our previous studies, a concept using ion exchange membrane electrolysis to treat ballast water was proven. Building on the developed ballast water treatment system, a new strategy for inactivating algae is proposed. In the new strategy, multi-trial injection of small doses of electrolytic products is applied to inactivate algae. To demonstrate the performance of the new strategy, contrast experiments between the new strategy and routine processes were conducted. Four algae species, including Chlorella vulgaris, Platymonas subcordiformis, Prorocentrum micans and Karenia mikimotoi, were chosen as samples. Different experimental parameters were studied, including the number of injections and the doses of electrolytic products. Compared with the conventional one-trial injection method, mortality rate time (MRT) and available chlorine concentration can be reduced by up to about 84% and 40%, respectively, under the new strategy. The proposed approach has great potential in practical ballast water treatment. Furthermore, the strategy is also helpful for deeper insight into the mechanism of algal tolerance.

  9. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    Science.gov (United States)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of dose shows average global γ (3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity
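
    The closed-form idea behind APM can be illustrated with a toy linear(ized) dose model under Gaussian input uncertainty, where the dose mean and covariance follow without any scenario sampling; the matrices and numbers below are arbitrary stand-ins, not the paper's pencil-beam model:

```python
import numpy as np

rng = np.random.default_rng(42)
D = rng.random((4, 3))            # toy dose-influence matrix (4 voxels, 3 inputs)
mu = np.array([1.0, 0.5, 2.0])    # mean of the uncertain inputs
Sigma = 0.01 * np.eye(3)          # input covariance (e.g. range/setup errors)

# Closed-form propagation: for d = D x with x ~ N(mu, Sigma),
# E[d] = D mu and Cov[d] = D Sigma D^T -- no scenario sampling needed.
mean_dose = D @ mu
cov_dose = D @ Sigma @ D.T
std_dose = np.sqrt(np.diag(cov_dose))

# Sampling-based benchmark (the slower route the paper compares against;
# 5000 samples matches the benchmark size quoted in the abstract).
x_samples = rng.multivariate_normal(mu, Sigma, size=5000)
mc_mean = (x_samples @ D.T).mean(axis=0)
mc_std = (x_samples @ D.T).std(axis=0)
```

    For a nonlinear dose model such as a pencil-beam algorithm, the closed-form moments require the Gaussian approximations described in the paper rather than this simple matrix identity.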

  10. Practical experiences with an extended screening strategy for genetically modified organisms (GMOs) in real-life samples.

    Science.gov (United States)

    Scholtens, Ingrid; Laurensse, Emile; Molenaar, Bonnie; Zaaijer, Stephanie; Gaballo, Heidi; Boleij, Peter; Bak, Arno; Kok, Esther

    2013-09-25

    Nowadays most animal feed products imported into Europe have a GMO (genetically modified organism) label. This means that they contain European Union (EU)-authorized GMOs. For enforcement of these labeling requirements, it is necessary, with the rising number of EU-authorized GMOs, to perform an increasing number of analyses. In addition to this, it is necessary to test products for the potential presence of EU-unauthorized GMOs. Analysis for EU-authorized and -unauthorized GMOs in animal feed has thus become laborious and expensive. Initial screening steps may reduce the number of GMO identification methods that need to be applied, but with the increasing diversity also screening with GMO elements has become more complex. For the present study, the application of an informative detailed 24-element screening and subsequent identification strategy was applied in 50 animal feed samples. Almost all feed samples were labeled as containing GMO-derived materials. The main goal of the study was therefore to investigate if a detailed screening strategy would reduce the number of subsequent identification analyses. An additional goal was to test the samples in this way for the potential presence of EU-unauthorized GMOs. Finally, to test the robustness of the approach, eight of the samples were tested in a concise interlaboratory study. No significant differences were found between the results of the two laboratories.

  11. Digital Content Strategies

    OpenAIRE

    Halbheer, Daniel; Stahl, Florian; Koenigsberg, Oded; Lehmann, Donald R

    2013-01-01

    This paper studies content strategies for online publishers of digital information goods. It examines sampling strategies and compares their performance to paid content and free content strategies. A sampling strategy, where some of the content is offered for free and consumers are charged for access to the rest, is known as a "metered model" in the newspaper industry. We analyze optimal decisions concerning the size of the sample and the price of the paid content when sampling serves the dua...

  12. Spatiotemporally Representative and Cost-Efficient Sampling Design for Validation Activities in Wanglang Experimental Site

    Directory of Open Access Journals (Sweden)

    Gaofei Yin

    2017-11-01

    Full Text Available Spatiotemporally representative Elementary Sampling Units (ESUs) are required for capturing the temporal variations in surface spatial heterogeneity through field measurements. Since inaccessibility often coexists with heterogeneity, a cost-efficient sampling design is mandatory. We proposed a sampling strategy to generate spatiotemporally representative and cost-efficient ESUs based on the conditioned Latin hypercube sampling scheme. The proposed strategy was constrained by multi-temporal Normalized Difference Vegetation Index (NDVI) imagery, and the ESUs were limited to a sampling-feasible region established based on accessibility criteria. A novel criterion based on the Overlapping Area (OA) between the NDVI frequency distribution histogram from the sampled ESUs and that from the entire study area was used to assess the sampling efficiency. A case study in Wanglang National Nature Reserve in China showed that the proposed strategy improves the spatiotemporal representativeness of sampling (mean annual OA = 74.7%) compared to the single-temporally constrained (OA = 68.7%) and random sampling (OA = 63.1%) strategies. The introduction of the feasible-region constraint significantly reduces labour-intensive in-situ characterization requirements at the expense of about a 9% loss in the spatiotemporal representativeness of the sampling. Our study will support the validation activities in the Wanglang experimental site, providing a benchmark for locating the nodes of automatic observation systems (e.g., LAINet) which need a spatially distributed and temporally fixed sampling design.
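
    The Overlapping Area criterion can be sketched as the intersection of the two normalized NDVI histograms; the bin count and value range below are our own assumptions, as the paper's binning is not given here:

```python
import numpy as np

def overlapping_area(sampled_ndvi, area_ndvi, bins=20, value_range=(-1.0, 1.0)):
    """Overlap (0..1) between the normalized NDVI histogram of the sampled
    ESUs and that of the entire study area; 1.0 means the sample reproduces
    the area's NDVI distribution exactly."""
    h_sample, edges = np.histogram(sampled_ndvi, bins=bins, range=value_range)
    h_area, _ = np.histogram(area_ndvi, bins=edges)
    p_sample = h_sample / h_sample.sum()
    p_area = h_area / h_area.sum()
    # Sum of the bin-wise minima = area shared by the two histograms.
    return float(np.minimum(p_sample, p_area).sum())
```

    Averaging this value over the multi-temporal NDVI images yields the mean annual OA used to rank sampling strategies in the abstract.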

  13. Contamination of apple orchard soils and fruit trees with copper-based fungicides: sampling aspects.

    Science.gov (United States)

    Wang, Quanying; Liu, Jingshuang; Liu, Qiang

    2015-01-01

    Accumulations of copper in orchard soils and fruit trees due to the application of Cu-based fungicides have become research hotspots. However, information about the sampling strategies, which can affect the accuracy of the following research results, is lacking. This study aimed to determine some sampling considerations when Cu accumulations in the soils and fruit trees of apple orchards are studied. The study was conducted in three apple orchards from different sites. Each orchard included two different histories of Cu-based fungicides usage, varying from 3 to 28 years. Soil samples were collected from different locations varying with the distances from tree trunk to the canopy drip line. Fruits and leaves from the middle heights of tree canopy at two locations (outer canopy and inner canopy) were collected. The variation in total soil Cu concentrations between orchards was much greater than the variation within orchards. Total soil Cu concentrations had a tendency to increase with the increasing history of Cu-based fungicides usage. Moreover, total soil Cu concentrations had the lowest values at the canopy drip line, while the highest values were found at the half distances between the trunk and the canopy drip line. Additionally, Cu concentrations of leaves and fruits from the outer parts of the canopy were significantly higher than from the inner parts. Based on the findings of this study, not only the between-orchard variation but also the within-orchard variation should be taken into consideration when conducting future soil and tree sampling in apple orchards.

  14. Application of the risk-based strategy to the Hanford tank waste organic-nitrate safety issue

    International Nuclear Information System (INIS)

    Hunter, V.L.; Colson, S.D.; Ferryman, T.; Gephart, R.E.; Heasler, P.; Scheele, R.D.

    1997-12-01

    This report describes the results from application of the Risk-Based Decision Management Approach for Justifying Characterization of Hanford Tank Waste to the organic-nitrate safety issue in Hanford single-shell tanks (SSTs). Existing chemical and physical models were used, taking advantage of the most current (mid-1997) sampling and analysis data. The purpose of this study is to make specific recommendations for planning characterization to help ensure the safety of each SST as it relates to the organic-nitrate safety issue. An additional objective is to demonstrate the viability of the Risk-Based Strategy for addressing Hanford tank waste safety issues

  15. Application of the risk-based strategy to the Hanford tank waste organic-nitrate safety issue

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, V.L.; Colson, S.D.; Ferryman, T.; Gephart, R.E.; Heasler, P.; Scheele, R.D.

    1997-12-01

    This report describes the results from application of the Risk-Based Decision Management Approach for Justifying Characterization of Hanford Tank Waste to the organic-nitrate safety issue in Hanford single-shell tanks (SSTs). Existing chemical and physical models were used, taking advantage of the most current (mid-1997) sampling and analysis data. The purpose of this study is to make specific recommendations for planning characterization to help ensure the safety of each SST as it relates to the organic-nitrate safety issue. An additional objective is to demonstrate the viability of the Risk-Based Strategy for addressing Hanford tank waste safety issues.

  16. Serum sample containing endogenous antibodies interfering with multiple hormone immunoassays. Laboratory strategies to detect interference

    Directory of Open Access Journals (Sweden)

    Elena García-González

    2016-04-01

    Full Text Available Objectives: Endogenous antibodies (EA) may interfere with immunoassays, causing erroneous results for hormone analyses. As (in most cases) this interference arises from the assay format and most immunoassays, even from different manufacturers, are constructed in a similar way, it is possible for a single type of EA to interfere with different immunoassays. Here we describe the case of a patient whose serum sample contains EA that interfere with several hormone tests. We also discuss the strategies deployed to detect interference. Subjects and methods: Over a period of four years, a 30-year-old man was subjected to a plethora of laboratory and imaging diagnostic procedures as a consequence of elevated hormone results, mainly of pituitary origin, which did not correlate with the overall clinical picture. Results: Once analytical interference was suspected, the best laboratory approaches to investigate it were sample reanalysis on an alternative platform and sample incubation with antibody blocking tubes. Construction of an in-house ‘nonsense’ sandwich assay was also a valuable strategy to confirm interference. In contrast, serial sample dilutions were of no value in our case, while polyethylene glycol (PEG) precipitation gave inconclusive results, probably due to the use of inappropriate PEG concentrations for several of the tests assayed. Conclusions: Clinicians and laboratorians must be aware of the drawbacks of immunometric assays, and alert to the possibility of EA interference when results do not fit the clinical pattern. Keywords: Endogenous antibodies, Immunoassay, Interference, Pituitary hormones, Case report

  17. A Web-Based Respondent Driven Sampling Pilot Targeting Young People at Risk for Chlamydia Trachomatis in Social and Sexual Networks with Testing : A Use Evaluation

    NARCIS (Netherlands)

    Theunissen, Kevin; Hoebe, Christian; Kok, Gerjo; Crutzen, Rik; Kara-Zaïtri, Chakib; de Vries, Nanne; van Bergen, Jan; Hamilton, Robert; van der Sande, Marianne; Dukers-Muijrers, Nicole

    BACKGROUND: With the aim of targeting high-risk hidden heterosexual young people for Chlamydia trachomatis (CT) testing, an innovative web-based screening strategy using Respondent Driven Sampling (RDS) and home-based CT testing, was developed, piloted and evaluated. METHODS: Two STI clinic nurses

  18. Collaboration During the NASA ABoVE Airborne SAR Campaign: Sampling Strategies Used by NGEE Arctic and Other Partners in Alaska and Western Canada

    Science.gov (United States)

    Wullschleger, S. D.; Charsley-Groffman, L.; Baltzer, J. L.; Berg, A. A.; Griffith, P. C.; Jafarov, E. E.; Marsh, P.; Miller, C. E.; Schaefer, K. M.; Siqueira, P.; Wilson, C. J.; Kasischke, E. S.

    2017-12-01

    There is considerable interest in using L- and P-band Synthetic Aperture Radar (SAR) data to monitor variations in aboveground woody biomass, soil moisture, and permafrost conditions in high-latitude ecosystems. Such information is useful for quantifying spatial heterogeneity in surface and subsurface properties, and for model development and evaluation. To conduct these studies, it is desirable that field studies share a common sampling strategy so that the data from multiple sites can be combined and used to analyze variations in conditions across different landscape geomorphologies and vegetation types. In 2015, NASA launched the decade-long Arctic-Boreal Vulnerability Experiment (ABoVE) to study the sensitivity and resilience of these ecosystems to disturbance and environmental change. NASA is able to leverage its remote sensing strengths to collect airborne and satellite observations to capture important ecosystem properties and dynamics across large spatial scales. A critical component of this effort includes collection of ground-based data that can be used to analyze, calibrate and validate remote sensing products. ABoVE researchers at a large number of sites located in important Arctic and boreal ecosystems in Alaska and western Canada are following common design protocols and strategies for measuring soil moisture, thaw depth, biomass, and wetland inundation. Here we elaborate on those sampling strategies as used in the 2017 summer SAR campaign and address the sampling design and measurement protocols for supporting the ABoVE aerial activities. Plot size, transect length, and distribution of replicates across the landscape systematically allowed investigators to optimally sample a site for soil moisture, thaw depth, and organic layer thickness. Specific examples and data sets are described for the Department of Energy's Next-Generation Ecosystem Experiments (NGEE Arctic) project field sites near Nome and Barrow, Alaska. Future airborne and satellite

  19. Sampling of temporal networks: Methods and biases

    Science.gov (United States)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter

    2017-11-01

    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
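
The strategy this abstract singles out, uniform sampling of nodes, can be sketched on a toy contact list. The event-list representation and function name below are illustrative assumptions, not code from the paper:

```python
import random

def uniform_node_sample(events, fraction, seed=0):
    """Subsample a temporal network by keeping a uniform random
    fraction of its nodes; an event (u, v, t) survives only if both
    endpoints were kept."""
    nodes = {n for u, v, _ in events for n in (u, v)}
    rng = random.Random(seed)
    k = max(1, int(fraction * len(nodes)))
    kept = set(rng.sample(sorted(nodes), k))
    return [(u, v, t) for u, v, t in events if u in kept and v in kept]

# Toy contact sequence: (node, node, timestamp)
events = [(1, 2, 0), (2, 3, 1), (1, 3, 2), (3, 4, 3), (4, 5, 4)]
sub = uniform_node_sample(events, fraction=0.6)
print(len(sub), "of", len(events), "events kept")
```

Statistics such as link activity or epidemic size can then be computed on `sub` and compared against the full event list to quantify the sampling bias.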

  20. Image-Based Method for Determining Better Walking Strategies for Hexapods

    Directory of Open Access Journals (Sweden)

    Kazi Mostafa

    2015-05-01

    Full Text Available An intelligent walking strategy is vital for multi-legged robots possessing no a priori information about the environment when traversing across discontinuous terrain. Six-legged robots outperform other multi-legged robots in static and dynamic stability. However, hexapods require careful planning to traverse across discontinuous terrain. A hexapod walking strategy can be accomplished using a vision-based navigation system to identify the surrounding environment. This paper presents an image-based technique to achieve better walking strategies for a hexapod walking on a special terrain containing irregular, restricted regions. The properties of the restricted regions were acquired beforehand by using reliable surveillance means. Moreover, simplified forward gaits, better rotational gaits, and adaptive gait selection strategies for walking on discontinuous terrain were proposed. The hexapod can effectively switch the gait sequences and types according to the environment involved. The boundary of standing zones can be successfully labelled by applying greyscale erosion with a structuring element similar in shape and size to the foot tip of the hexapod. The experimental results demonstrated that the proposed image-based technique significantly improved the walking strategies of hexapods traversing on discontinuous terrain.

  1. Toward a Common Understanding of Research-Based Instructional Strategies

    Science.gov (United States)

    Goodwin, Deborah; Webb, Mary Ann

    2014-01-01

    A review of available books, articles and on-line resources which deal with "Research-Based Instructional Strategies" will produce a plethora of materials which promote the effectiveness of these strategies on student achievement. Also, a perusal of classroom instruction and teacher evaluation instruments will reveal that many of the…

  2. Access and completion of a Web-based treatment in a population-based sample of tornado-affected adolescents.

    Science.gov (United States)

    Price, Matthew; Yuen, Erica K; Davidson, Tatiana M; Hubel, Grace; Ruggiero, Kenneth J

    2015-08-01

    Although Web-based treatments have significant potential to assess and treat difficult-to-reach populations, such as trauma-exposed adolescents, the extent that such treatments are accessed and used is unclear. The present study evaluated the proportion of adolescents who accessed and completed a Web-based treatment for postdisaster mental health symptoms. Correlates of access and completion were examined. A sample of 2,000 adolescents living in tornado-affected communities was assessed via structured telephone interview and invited to a Web-based treatment. The modular treatment addressed symptoms of posttraumatic stress disorder, depression, and alcohol and tobacco use. Participants were randomized to experimental or control conditions after accessing the site. Overall access for the intervention was 35.8%. Module completion for those who accessed ranged from 52.8% to 85.6%. Adolescents with parents who used the Internet to obtain health-related information were more likely to access the treatment. Adolescent males were less likely to access the treatment. Future work is needed to identify strategies to further increase the reach of Web-based treatments to provide clinical services in a postdisaster context. (c) 2015 APA, all rights reserved.

  3. Image Sampling with Quasicrystals

    Directory of Open Access Journals (Sweden)

    Mark Grundland

    2009-07-01

    Full Text Available We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.
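
A minimal sketch of the cut-and-project construction mentioned above, assuming the textbook strip projection from the square lattice Z^2 with slope set by the golden ratio (the variable names and window choice are ours):

```python
import math

PHI = (1 + math.sqrt(5)) / 2          # golden ratio
THETA = math.atan2(1, PHI)            # physical line of slope 1/phi

def fibonacci_chain(n_max):
    """1D cut-and-project quasicrystal: keep the lattice points of Z^2
    whose projection onto the internal axis falls inside a window as
    wide as the projected unit cell, and record their projection onto
    the physical axis as sample sites."""
    c, s = math.cos(THETA), math.sin(THETA)
    window = c + s                    # projected width of a unit cell
    sites = []
    for p in range(-n_max, n_max + 1):
        for q in range(-n_max, n_max + 1):
            if 0.0 <= -p * s + q * c < window:
                sites.append(p * c + q * s)
    return sorted(sites)

sites = fibonacci_chain(30)
interior = sites[5:-5]                # drop edge-truncated ends
gaps = sorted({round(b - a, 6) for a, b in zip(interior, interior[1:])})
print(gaps[1] / gaps[0])              # two gap lengths, ratio close to phi
```

The resulting point set is non-periodic yet uniformly discrete and relatively dense, the properties the abstract highlights for evenly spread sample sites.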

  4. A Web-Based Respondent Driven Sampling Pilot Targeting Young People at Risk for Chlamydia Trachomatis in Social and Sexual Networks with Testing: A Use Evaluation

    NARCIS (Netherlands)

    Theunissen, Kevin; Hoebe, Christian; Kok, Gerjo; Crutzen, Rik; Kara-Zaïtri, Chakib; de Vries, Nanne; van Bergen, Jan; Hamilton, Robert; van der Sande, Marianne; Dukers-Muijrers, Nicole

    2015-01-01

    With the aim of targeting high-risk hidden heterosexual young people for Chlamydia trachomatis (CT) testing, an innovative web-based screening strategy using Respondent Driven Sampling (RDS) and home-based CT testing, was developed, piloted and evaluated. Two STI clinic nurses encouraged 37 CT

  5. Healthy Aging Promotion through Neuroscientific Information-Based Strategies.

    Science.gov (United States)

    Seinfeld, Sofia; Sanchez-Vives, Maria V

    2015-09-28

    To ensure the well-being of a rapidly growing elderly population, it is fundamental to find strategies to foster healthy brain aging. With this intention, we designed a program of scientific-based lectures aimed at dissemination by established neuroscientists about brain function, brain plasticity and how lifestyle influences the brain. We also carried out a pilot study on the impact of the lectures on attendees. The objective was to provide information to elderly people in order to encourage them to identify unhealthy and healthy daily habits, and more importantly, to promote behavioral changes towards healthy brain aging. Here we report on our experience. In order to determine the impact of the lectures in the daily routine of the attendees, we asked them to fill out questionnaires. Preliminary results indicate that neuroscientific information-based strategies can be a useful method to have a positive impact on the lives of the elderly, increase their awareness of how to improve brain function and promote positive lifestyle modifications. Furthermore, based on self-reported data, we also found that through this strategy it is possible to promote behavioral changes related to nutrition, sleep, and realization of physical and cognitively stimulating activities. Finally, based on the results obtained, the importance of promoting self-efficacy and the empowerment of the older populations is highlighted.

  6. Healthy Aging Promotion through Neuroscientific Information-Based Strategies

    Directory of Open Access Journals (Sweden)

    Sofia Seinfeld

    2015-09-01

    Full Text Available To ensure the well-being of a rapidly growing elderly population, it is fundamental to find strategies to foster healthy brain aging. With this intention, we designed a program of scientific-based lectures aimed at dissemination by established neuroscientists about brain function, brain plasticity and how lifestyle influences the brain. We also carried out a pilot study on the impact of the lectures on attendees. The objective was to provide information to elderly people in order to encourage them to identify unhealthy and healthy daily habits, and more importantly, to promote behavioral changes towards healthy brain aging. Here we report on our experience. In order to determine the impact of the lectures in the daily routine of the attendees, we asked them to fill out questionnaires. Preliminary results indicate that neuroscientific information-based strategies can be a useful method to have a positive impact on the lives of the elderly, increase their awareness of how to improve brain function and promote positive lifestyle modifications. Furthermore, based on self-reported data, we also found that through this strategy it is possible to promote behavioral changes related to nutrition, sleep, and realization of physical and cognitively stimulating activities. Finally, based on the results obtained, the importance of promoting self-efficacy and the empowerment of the older populations is highlighted.

  7. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay

    International Nuclear Information System (INIS)

    Wang Na; Wu Zhi-Hai; Peng Li

    2014-01-01

    In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, the algebra graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive the necessary and sufficient conditions guaranteeing heterogeneous multi-agent systems to asymptotically achieve the stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results. (interdisciplinary physics and related areas of science and technology)
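
As a hedged illustration of sampled-data consensus, the sketch below simplifies to identical first-order agents with a zero-order hold between sampling instants; it is not the heterogeneous protocol analyzed in the paper:

```python
def sampled_consensus(x0, neighbors, h, steps):
    """First-order consensus under periodic sampling: each agent moves
    toward its neighbors' states as sampled at the last sampling
    instant (zero-order hold), x[k+1] = x[k] - h * L * x[k]."""
    x = list(x0)
    for _ in range(steps):
        sampled = list(x)              # states held between samples
        x = [xi + h * sum(sampled[j] - xi for j in neighbors[i])
             for i, xi in enumerate(sampled)]
    return x

# Ring of four agents
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
x = sampled_consensus([1.0, 3.0, -2.0, 6.0], neighbors, h=0.1, steps=200)
print(x)  # all states converge to the initial average, 2.0
```

With a symmetric communication graph the state sum is preserved, so the agents agree on the average of the initial states provided the sampling period `h` is small enough for stability.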

  8. Strategies for reward-based crowdfunding campaigns

    Directory of Open Access Journals (Sweden)

    Sascha Kraus

    2016-01-01

    Full Text Available Crowdfunding represents an alternative way of funding entrepreneurial ventures – and is attracting a high amount of interest in research as well as practice. Against this background, this paper analyzes reward-based crowdfunding campaign strategies and their communication tools. To do this, 446 crowdfunding projects were gathered and empirically analyzed. Three different paths of successful crowdfunding projects could be identified and are described in detail. Practical implications of crowdfunding strategies are derived, and are dependent on the required sales effort and the project added value. The terms communicator, networker and self-runner are created for this crowdfunding strategy and filled with practical examples. This paper contributes to the literature in different ways: first, it sheds more light on the developing concept of crowdfunding, with an overview of current academic discussions on crowdfunding. Furthermore, the analysis of success factors for crowdfunding initiatives adds to an emerging area of research and allows entrepreneurs to extract best practice examples for increasing the probability of successful crowdfunding projects under consideration of the key influencing factors of communication.

  9. Improvements to robotics-inspired conformational sampling in rosetta.

    Directory of Open Access Journals (Sweden)

    Amelie Stein

    Full Text Available To accurately predict protein conformations in atomic detail, a computational method must be capable of sampling models sufficiently close to the native structure. All-atom sampling is difficult because of the vast number of possible conformations and extremely rugged energy landscapes. Here, we test three sampling strategies to address these difficulties: conformational diversification, intensification of torsion and omega-angle sampling and parameter annealing. We evaluate these strategies in the context of the robotics-based kinematic closure (KIC) method for local conformational sampling in Rosetta on an established benchmark set of 45 12-residue protein segments without regular secondary structure. We quantify performance as the fraction of sub-Angstrom models generated. While improvements with individual strategies are only modest, the combination of intensification and annealing strategies into a new "next-generation KIC" method yields a four-fold increase over standard KIC in the median percentage of sub-Angstrom models across the dataset. Such improvements enable progress on more difficult problems, as demonstrated on longer segments, several of which could not be accurately remodeled with previous methods. Given its improved sampling capability, next-generation KIC should allow advances in other applications such as local conformational remodeling of multiple segments simultaneously, flexible backbone sequence design, and development of more accurate energy functions.

  10. Regional MLEM reconstruction strategy for PET-based treatment verification in ion beam radiotherapy

    International Nuclear Information System (INIS)

    Gianoli, Chiara; Riboldi, Marco; Fattori, Giovanni; Baselli, Giuseppe; Baroni, Guido; Bauer, Julia; Debus, Jürgen; Parodi, Katia; De Bernardi, Elisabetta

    2014-01-01

    In ion beam radiotherapy, PET-based treatment verification provides a consistency check of the delivered treatment with respect to a simulation based on the treatment planning. In this work the region-based MLEM reconstruction algorithm is proposed as a new evaluation strategy in PET-based treatment verification. The comparative evaluation is based on reconstructed PET images in selected regions, which are automatically identified on the expected PET images according to homogeneity in activity values. The strategy was tested on numerical and physical phantoms, simulating mismatches between the planned and measured β + activity distributions. The region-based MLEM reconstruction was demonstrated to be robust against noise, and the sensitivity of the strategy was comparable to three voxel units, corresponding to 6 mm, in numerical phantoms. The robustness of the region-based MLEM evaluation outperformed the voxel-based strategies. The potential of the proposed strategy was also retrospectively assessed on patient data and further clinical validation is envisioned. (paper)
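
The underlying MLEM update can be sketched on a toy two-pixel system. The region-based restriction described in the abstract is omitted here, and the system matrix is an invented example:

```python
def mlem(A, y, n_iter):
    """Plain MLEM iteration, lam <- lam * A^T(y / (A lam)) / A^T 1,
    written with lists for a tiny dense system matrix A."""
    n_pix = len(A[0])
    lam = [1.0] * n_pix                                 # flat start image
    sens = [sum(row[j] for row in A) for j in range(n_pix)]
    for _ in range(n_iter):
        proj = [sum(a * l for a, l in zip(row, lam)) for row in A]
        ratio = [yi / p if p > 0 else 0.0 for yi, p in zip(y, proj)]
        back = [sum(row[j] * r for row, r in zip(A, ratio))
                for j in range(n_pix)]
        lam = [l * b / s if s > 0 else 0.0
               for l, b, s in zip(lam, back, sens)]
    return lam

# Two-pixel phantom seen by two projection "rays"
A = [[1.0, 0.0], [1.0, 1.0]]                            # system matrix
truth = [2.0, 3.0]
y = [sum(a * t for a, t in zip(row, truth)) for row in A]  # noiseless data
est = mlem(A, y, n_iter=200)
print(est)  # converges toward the true activities [2.0, 3.0]
```

The region-based variant of the paper applies the same multiplicative update to aggregated homogeneous regions rather than to individual voxels.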

  11. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed.
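
A sketch of the cutpoint method credited above with the order-of-magnitude speedup over sequential search; the toy distribution and table size `m` are arbitrary choices for illustration:

```python
import random

def build_cutpoints(probs, m):
    """Cutpoint method: cut[j] is the first index whose cumulative
    probability exceeds j/m, so the sequential CDF search for a
    uniform draw u in [j/m, (j+1)/m) can start near its answer
    instead of at index 0."""
    cdf, total = [], 0.0
    for p in probs:
        total += p
        cdf.append(total)
    cut, i = [], 0
    for j in range(m):
        while cdf[i] <= j / m:
            i += 1
        cut.append(i)
    return cdf, cut

def draw(cdf, cut, m, rng):
    u = rng.random()
    i = cut[min(int(u * m), m - 1)]   # jump to the precomputed start
    while i < len(cdf) - 1 and cdf[i] < u:
        i += 1
    return i

rng = random.Random(42)
probs = [0.1, 0.05, 0.4, 0.25, 0.2]
cdf, cut = build_cutpoints(probs, m=8)
draws = [draw(cdf, cut, 8, rng) for _ in range(20000)]
print(draws.count(2) / 20000)  # empirical frequency near probs[2] = 0.4
```

The per-draw search length shrinks as the table size `m` grows, which is where the speedup over plain sequential CDF scanning comes from.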

  12. How to Handle Speciose Clades? Mass Taxon-Sampling as a Strategy towards Illuminating the Natural History of Campanula (Campanuloideae)

    Science.gov (United States)

    Mansion, Guilhem; Parolly, Gerald; Crowl, Andrew A.; Mavrodiev, Evgeny; Cellinese, Nico; Oganesian, Marine; Fraunhofer, Katharina; Kamari, Georgia; Phitos, Dimitrios; Haberle, Rosemarie; Akaydin, Galip; Ikinci, Nursel; Raus, Thomas; Borsch, Thomas

    2012-01-01

    Background Speciose clades usually harbor species with a broad spectrum of adaptive strategies and complex distribution patterns, and thus constitute ideal systems to disentangle biotic and abiotic causes underlying species diversification. The delimitation of such study systems to test evolutionary hypotheses is difficult because they often rely on artificial genus concepts as starting points. One of the most prominent examples is the bellflower genus Campanula with some 420 species, but up to 600 species when including all lineages to which Campanula is paraphyletic. We generated a large alignment of petD group II intron sequences to include more than 70% of described species as a reference. By comparison with partial data sets we could then assess the impact of selective taxon sampling strategies on phylogenetic reconstruction and subsequent evolutionary conclusions. Methodology/Principal Findings Phylogenetic analyses based on maximum parsimony (PAUP, PRAP), Bayesian inference (MrBayes), and maximum likelihood (RAxML) were first carried out on the large reference data set (D680). Parameters including tree topology, branch support, and age estimates, were then compared to those obtained from smaller data sets resulting from “classification-guided” (D088) and “phylogeny-guided sampling” (D101). Analyses of D088 failed to fully recover the phylogenetic diversity in Campanula, whereas D101 inferred significantly different branch support and age estimates. Conclusions/Significance A short genomic region with high phylogenetic utility allowed us to easily generate a comprehensive phylogenetic framework for the speciose Campanula clade. Our approach recovered 17 well-supported and circumscribed sub-lineages. As knowledge of these will be instrumental for developing more specific evolutionary hypotheses and guiding future research, we highlight the predictive value of a mass taxon-sampling strategy as a first essential step towards illuminating the detailed evolutionary

  13. A survey on OFDM channel estimation techniques based on denoising strategies

    Directory of Open Access Journals (Sweden)

    Pallaviram Sure

    2017-04-01

    Full Text Available Channel estimation forms the heart of any orthogonal frequency division multiplexing (OFDM) based wireless communication receiver. Frequency domain pilot aided channel estimation techniques are either least squares (LS) based or minimum mean square error (MMSE) based. LS based techniques are computationally less complex. Unlike MMSE ones, they do not require a priori knowledge of channel statistics (KCS). However, the mean square error (MSE) performance of the channel estimator incorporating MMSE based techniques is better compared to that obtained with the incorporation of LS based techniques. To enhance the MSE performance using LS based techniques, a variety of denoising strategies have been developed in the literature, which are applied to the LS-estimated channel impulse response (CIR). The advantage of denoising-threshold-based LS techniques is that they do not require KCS but still render near optimal MMSE performance similar to MMSE based techniques. In this paper, a detailed survey on various existing denoising strategies, with a comparative discussion of these strategies, is presented.
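
The threshold-based denoising idea surveyed here can be sketched as follows, assuming a toy 8-subcarrier channel, a naive DFT, and an arbitrary threshold value chosen for illustration:

```python
import cmath

def dft(x, inverse=False):
    """Naive O(n^2) DFT, adequate for a toy illustration."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def ls_denoise(h_ls, threshold):
    """Threshold-based denoising of an LS channel estimate: transform
    to the channel impulse response, zero the taps below the threshold
    (assumed noise-only), and transform back. No channel statistics
    are needed, the advantage noted in the survey."""
    cir = dft(h_ls, inverse=True)
    cir = [tap if abs(tap) >= threshold else 0.0 for tap in cir]
    return dft(cir)

# Toy 8-subcarrier channel with two true taps, plus a small perturbation
true_cir = [1.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0]
h_true = dft(true_cir)
noisy = [h + 0.05 * (-1) ** k for k, h in enumerate(h_true)]
h_den = ls_denoise(noisy, threshold=0.2)
mse_noisy = sum(abs(a - b) ** 2 for a, b in zip(noisy, h_true)) / 8
mse_den = sum(abs(a - b) ** 2 for a, b in zip(h_den, h_true)) / 8
print(mse_den < mse_noisy)  # denoising lowers the estimation error
```

In a real receiver the threshold is set from an estimate of the noise level, and the significant taps are confined to the cyclic-prefix length; both refinements are omitted here.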

  14. Positive versus Negative Communication Strategies in Task-Based Learning

    Science.gov (United States)

    Rohani, Siti

    2013-01-01

    This study aimed at describing how the implementation of Task-Based Learning (TBL) would shape or change students' use of oral communication strategies. Students' problems and strategies to solve the problems during the implementation of TBL were also explored. The study was a mixed method, employing both quantitative and qualitative analysis…

  15. Reinforcement Learning–Based Energy Management Strategy for a Hybrid Electric Tracked Vehicle

    Directory of Open Access Journals (Sweden)

    Teng Liu

    2015-07-01

    Full Text Available This paper presents a reinforcement learning (RL)-based energy management strategy for a hybrid electric tracked vehicle. A control-oriented model of the powertrain and vehicle dynamics is first established. According to the sample information of the experimental driving schedule, statistical characteristics at various velocities are determined by extracting the transition probability matrix of the power request. Two RL-based algorithms, namely Q-learning and Dyna algorithms, are applied to generate optimal control solutions. The two algorithms are simulated on the same driving schedule, and the simulation results are compared to clarify the merits and demerits of these algorithms. Although the Q-learning algorithm is faster (3 h) than the Dyna algorithm (7 h), its fuel consumption is 1.7% higher than that of the Dyna algorithm. Furthermore, the Dyna algorithm registers approximately the same fuel consumption as the dynamic programming–based global optimal solution. The computational cost of the Dyna algorithm is substantially lower than that of the stochastic dynamic programming.
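
The Q-learning update used by the first of the two algorithms can be sketched on a stand-in problem; the chain MDP below replaces the paper's powertrain model and is purely illustrative:

```python
import random

def q_learning(n_states, actions, step, episodes,
               alpha=0.5, gamma=0.9, eps=0.2, seed=1):
    """Tabular Q-learning: after each transition, nudge Q(s, a) toward
    the observed reward plus the discounted best value of the next
    state, with epsilon-greedy exploration."""
    rng = random.Random(seed)
    Q = [[0.0] * len(actions) for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:                     # until terminal state
            a = (rng.randrange(len(actions)) if rng.random() < eps
                 else max(range(len(actions)), key=lambda i: Q[s][i]))
            s2, r = step(s, actions[a])
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

# Chain of 5 states; action 1 advances at cost 1, action 0 idles at cost 2
def step(s, a):
    return (min(s + 1, 4), -1.0) if a == 1 else (s, -2.0)

Q = q_learning(5, [0, 1], step, episodes=200)
policy = [max(range(2), key=lambda i: Q[s][i]) for s in range(4)]
print(policy)  # the learned greedy policy always advances
```

The Dyna variant compared in the paper augments the same update with planning steps drawn from a learned transition model, trading computation per sample for better final policies.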

  16. Improving snow density estimation for mapping SWE with Lidar snow depth: assessment of uncertainty in modeled density and field sampling strategies in NASA SnowEx

    Science.gov (United States)

    Raleigh, M. S.; Smyth, E.; Small, E. E.

    2017-12-01

    The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.

  17. The Effect of Using a Proposed Teaching Strategy Based on the Selective Thinking on Students' Acquisition Concepts in Mathematics

    Science.gov (United States)

    Qudah, Ahmad Hassan

    2016-01-01

    This study aimed to identify the effect of using a proposed teaching strategy based on selective thinking on the acquisition of mathematical concepts by Classroom Teacher students at Al- al- Bayt University. The sample of the study consisted of (74) students, equally distributed into a control group and an experimental group. The selective thinking…

  18. A Systematic Review of Promising Strategies of Faith-Based Cancer Education and Lifestyle Interventions Among Racial/Ethnic Minority Groups.

    Science.gov (United States)

    Hou, Su-I; Cao, Xian

    2017-09-13

    Church-based interventions have been used to reach racial/ethnic minorities. In order to develop effective programs, we conducted a comprehensive systematic review of faith-based cancer prevention studies (2005~2016) to examine characteristics and promising strategies. Combination terms "church or faith-based or religion," "intervention or program," and "cancer education or lifestyle" were used in searching the five major databases: CINAHL; ERIC; Health Technology Assessments; MEDLINE; and PsycInfo. A total of 20 studies met study criteria. CDC's Community Guide was used to analyze and review group interventions. Analyses were organized by two racial groups: African American (AA) and Latino/Hispanic American groups. Results showed most studies reviewed focused on breast cancer alone or in combination with other cancers. Studies of Latino/Hispanic groups targeted more on uninsured, Medicare, or Medicaid individuals, whereas AA studies generally did not include specific insurance criteria. The sample sizes of the AA studies were generally larger. The majority of these studies reviewed used pre-post, posttest only with control group, or quasi-experience designs. The Health Belief Model was the most commonly used theory in both groups. Community-based participatory research and empowerment/ecological frameworks were also used frequently in the Latino/Hispanic studies. Small media and group education were the top two most popular intervention strategies in both groups. Although one-on-one strategy was used in some Latino studies, neither group used reducing client out-of-pocket costs strategy. Client reminders could also be used more in both groups as well. The current review showed church-based cancer education programs were effective in changing knowledge, but not always screening utilization. Results show faith-based cancer educational interventions are promising. To maximize intervention impact, future studies might consider using stronger study designs, incorporating a

  19. [Progress in the spectral library based protein identification strategy].

    Science.gov (United States)

    Yu, Derui; Ma, Jie; Xie, Zengyan; Bai, Mingze; Zhu, Yunping; Shu, Kunxian

    2018-04-25

As mass spectrometry (MS)-based proteomics has developed rapidly, MS data have grown exponentially, and it is a great challenge to develop fast, accurate and reproducible methods to identify peptides and proteins. Nowadays, spectral library searching has become a mature strategy for protein identification from tandem mass spectra in proteomics. It searches the experimental spectra against a collection of confidently identified MS/MS spectra that have been observed previously, fully utilizing peak abundances, peaks from non-canonical fragment ions, and other spectral features. This review provides a comprehensive overview of the implementation of the spectral library search strategy and its two key steps, spectral library construction and library searching, and discusses the progress and challenges of the strategy.

  20. What Successful Science Teachers Do: 75 Research-Based Strategies

    Science.gov (United States)

    Glasgow, Neal A.; Cheyne, Michele; Yerrick, Randy K.

    2010-01-01

The experience and science expertise of these award-winning authors make this easy-to-use guide a teacher's treasure trove. This latest addition to the popular What Successful Teachers Do series describes 75 research-based strategies and outlines best practices for inquiry-oriented science. Each strategy includes a brief description of the…

  1. A new modeling strategy for third-order fast high-performance liquid chromatographic data with fluorescence detection. Quantitation of fluoroquinolones in water samples.

    Science.gov (United States)

    Alcaráz, Mirta R; Bortolato, Santiago A; Goicoechea, Héctor C; Olivieri, Alejandro C

    2015-03-01

    Matrix augmentation is regularly employed in extended multivariate curve resolution-alternating least-squares (MCR-ALS), as applied to analytical calibration based on second- and third-order data. However, this highly useful concept has almost no correspondence in parallel factor analysis (PARAFAC) of third-order data. In the present work, we propose a strategy to process third-order chromatographic data with matrix fluorescence detection, based on an Augmented PARAFAC model. The latter involves decomposition of a three-way data array augmented along the elution time mode with data for the calibration samples and for each of the test samples. A set of excitation-emission fluorescence matrices, measured at different chromatographic elution times for drinking water samples, containing three fluoroquinolones and uncalibrated interferences, were evaluated using this approach. Augmented PARAFAC exploits the second-order advantage, even in the presence of significant changes in chromatographic profiles from run to run. The obtained relative errors of prediction were ca. 10 % for ofloxacin, ciprofloxacin, and danofloxacin, with a significant enhancement in analytical figures of merit in comparison with previous reports. The results are compared with those furnished by MCR-ALS.

  2. Integrating the Theory of Sampling into Underground Mine Grade Control Strategies

    Directory of Open Access Journals (Sweden)

    Simon C. Dominy

    2018-05-01

Full Text Available Grade control in underground mines aims to deliver quality tonnes to the process plant via the accurate definition of ore and waste. It comprises a decision-making process including data collection and interpretation; local estimation; development and mining supervision; ore and waste destination tracking; and stockpile management. The foundation of any grade control programme is that of high-quality samples collected in a geological context. The requirement for quality samples has long been recognised, where they should be representative and fit-for-purpose. Once a sampling error is introduced, it propagates through all subsequent processes contributing to data uncertainty, which leads to poor decisions and financial loss. Proper application of the Theory of Sampling reduces errors during sample collection, preparation, and assaying. To achieve quality, sampling techniques must minimise delimitation, extraction, and preparation errors. Underground sampling methods include linear (chip and channel), grab (broken rock), and drill-based samples. Grade control staff should be well-trained and motivated, and operating staff should understand the critical need for grade control. Sampling must always be undertaken with a strong focus on safety and alternatives sought if the risk to humans is high. A quality control/quality assurance programme must be implemented, particularly when samples contribute to a reserve estimate. This paper assesses grade control sampling with emphasis on underground gold operations and presents recommendations for optimal practice through the application of the Theory of Sampling.

  3. Rats track odour trails accurately using a multi-layered strategy with near-optimal sampling.

    Science.gov (United States)

    Khan, Adil Ghani; Sarangi, Manaswini; Bhalla, Upinder Singh

    2012-02-28

    Tracking odour trails is a crucial behaviour for many animals, often leading to food, mates or away from danger. It is an excellent example of active sampling, where the animal itself controls how to sense the environment. Here we show that rats can track odour trails accurately with near-optimal sampling. We trained rats to follow odour trails drawn on paper spooled through a treadmill. By recording local field potentials (LFPs) from the olfactory bulb, and sniffing rates, we find that sniffing but not LFPs differ between tracking and non-tracking conditions. Rats can track odours within ~1 cm, and this accuracy is degraded when one nostril is closed. Moreover, they show path prediction on encountering a fork, wide 'casting' sweeps on encountering a gap and detection of reappearance of the trail in 1-2 sniffs. We suggest that rats use a multi-layered strategy, and achieve efficient sampling and high accuracy in this complex task.

  4. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    Science.gov (United States)

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

Nowadays, studies of the distribution of metallic elements in biological samples are among the most important issues. Many articles are dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging metallic elements in various kinds of biological samples. However, this literature lacks articles that review calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize analytical calibration in the (bio)imaging/mapping of metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, this work aims to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Sampling strategies to improve passive optical remote sensing of river bathymetry

    Science.gov (United States)

    Legleiter, Carl; Overstreet, Brandon; Kinzel, Paul J.

    2018-01-01

    Passive optical remote sensing of river bathymetry involves establishing a relation between depth and reflectance that can be applied throughout an image to produce a depth map. Building upon the Optimal Band Ratio Analysis (OBRA) framework, we introduce sampling strategies for constructing calibration data sets that lead to strong relationships between an image-derived quantity and depth across a range of depths. Progressively excluding observations that exceed a series of cutoff depths from the calibration process improved the accuracy of depth estimates and allowed the maximum detectable depth ($d_{max}$) to be inferred directly from an image. Depth retrieval in two distinct rivers also was enhanced by a stratified version of OBRA that partitions field measurements into a series of depth bins to avoid biases associated with under-representation of shallow areas in typical field data sets. In the shallower, clearer of the two rivers, including the deepest field observations in the calibration data set did not compromise depth retrieval accuracy, suggesting that $d_{max}$ was not exceeded and the reach could be mapped without gaps. Conversely, in the deeper and more turbid stream, progressive truncation of input depths yielded a plausible estimate of $d_{max}$ consistent with theoretical calculations based on field measurements of light attenuation by the water column. This result implied that the entire channel, including pools, could not be mapped remotely. However, truncation improved the accuracy of depth estimates in areas shallower than $d_{max}$, which comprise the majority of the channel and are of primary interest for many habitat-oriented applications.
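
The band-ratio regression and progressive depth truncation described above can be sketched as follows. This is a hedged toy illustration of the OBRA idea, not the authors' implementation: the three-band reflectances are synthetic (simple exponential attenuation plus an offset), and `r_squared`, `obra`, and the cutoff handling are illustrative names.

```python
import math
from itertools import combinations

def r_squared(x, y):
    # coefficient of determination for a simple linear regression of y on x
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

def obra(depths, spectra, cutoff=None):
    # Optimal Band Ratio Analysis sketch: for every band pair (i, j),
    # regress depth against X = ln(R_i / R_j); keep the highest-R^2 pair.
    if cutoff is not None:  # progressive truncation at a cutoff depth
        kept = [(d, s) for d, s in zip(depths, spectra) if d <= cutoff]
        depths, spectra = [d for d, _ in kept], [s for _, s in kept]
    best = None
    for i, j in combinations(range(len(spectra[0])), 2):
        x = [math.log(s[i] / s[j]) for s in spectra]
        r2 = r_squared(x, depths)
        if best is None or r2 > best[0]:
            best = (r2, (i, j))
    return best

# synthetic 3-band reflectances: band 0 attenuates fastest with depth
depths = [0.2, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
spectra = [[math.exp(-1.2 * d) + 0.05, 0.30, math.exp(-0.3 * d) + 0.05]
           for d in depths]
r2_all, pair = obra(depths, spectra)
r2_shallow, _ = obra(depths, spectra, cutoff=1.5)
```

In the paper's workflow, repeating the truncated fit over a series of cutoff depths and watching where performance stops improving is what yields the inferred maximum detectable depth.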

  6. A Story-Based Simulation for Teaching Sampling Distributions

    Science.gov (United States)

    Turner, Stephen; Dabney, Alan R.

    2015-01-01

    Statistical inference relies heavily on the concept of sampling distributions. However, sampling distributions are difficult to teach. We present a series of short animations that are story-based, with associated assessments. We hope that our contribution can be useful as a tool to teach sampling distributions in the introductory statistics…

  7. Limited-sampling strategy models for estimating the pharmacokinetic parameters of 4-methylaminoantipyrine, an active metabolite of dipyrone

    Directory of Open Access Journals (Sweden)

    Suarez-Kurtz G.

    2001-01-01

Full Text Available Bioanalytical data from a bioequivalence study were used to develop limited-sampling strategy (LSS) models for estimating the area under the plasma concentration versus time curve (AUC) and the peak plasma concentration (Cmax) of 4-methylaminoantipyrine (MAA), an active metabolite of dipyrone. Twelve healthy adult male volunteers received single 600 mg oral doses of dipyrone in two formulations at a 7-day interval in a randomized, crossover protocol. Plasma concentrations of MAA (N = 336), measured by HPLC, were used to develop LSS models. Linear regression analysis and a "jack-knife" validation procedure revealed that the AUC0-∞ and the Cmax of MAA can be accurately predicted (R²>0.95, bias 0.85 of the AUC0-∞ or Cmax for the other formulation. LSS models based on three sampling points (1.5, 4 and 24 h), but using different coefficients for AUC0-∞ and Cmax, predicted the individual values of both parameters for the enrolled volunteers (R²>0.88, bias = -0.65 and -0.37%, precision = 4.3 and 7.4%) as well as for plasma concentration data sets generated by simulation (R²>0.88, bias = -1.9 and 8.5%, precision = 5.2 and 8.7%). Bioequivalence assessment of the dipyrone formulations based on the 90% confidence interval of log-transformed AUC0-∞ and Cmax provided similar results when either the best-estimated or the LSS-derived metrics were used.
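
The core of an LSS model like the one above is a multiple linear regression of AUC on the concentrations at a few sampling times. The sketch below illustrates that mechanic only; all concentrations and the generating rule are invented (they are not the study's data), and `fit_lss`/`predict` are illustrative names.

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting (for the normal equations)
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_lss(conc_rows, auc_values):
    # least-squares fit of AUC ~ intercept + C(1.5h) + C(4h) + C(24h)
    X = [[1.0] + row for row in conc_rows]
    p = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * y for r, y in zip(X, auc_values)) for i in range(p)]
    return solve(XtX, Xty)

def predict(coefs, row):
    return coefs[0] + sum(c * v for c, v in zip(coefs[1:], row))

# invented concentrations (C1.5h, C4h, C24h); AUCs generated from a known
# linear rule so the recovered coefficients are easy to check
rows = [[10.2, 7.1, 1.2], [12.5, 8.0, 1.5], [9.0, 6.2, 0.9],
        [11.1, 7.7, 1.3], [8.4, 5.9, 0.8], [13.0, 8.6, 1.7]]
aucs = [5 + 6 * a + 3 * b + 10 * c for a, b, c in rows]
coefs = fit_lss(rows, aucs)
auc_new = predict(coefs, [10.0, 7.0, 1.1])
```

In practice the coefficients come from rich training profiles, and validation (e.g. the "jack-knife" above) checks bias and precision on held-out subjects.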

  8. A Cost-Constrained Sampling Strategy in Support of LAI Product Validation in Mountainous Areas

    Directory of Open Access Journals (Sweden)

    Gaofei Yin

    2016-08-01

Full Text Available Increasing attention is being paid to leaf area index (LAI) retrieval in mountainous areas. Mountainous areas present extreme topographic variability, and are characterized by more spatial heterogeneity and inaccessibility compared with flat terrain. It is difficult to collect representative ground-truth measurements, and the validation of LAI in mountainous areas is still problematic. A cost-constrained sampling strategy (CSS) in support of LAI validation is presented in this study. To account for the influence of rugged terrain on implementation cost, a cost-objective function was incorporated into the traditional conditioned Latin hypercube (CLH) sampling strategy. A case study in Hailuogou, Sichuan province, China was used to assess the efficiency of CSS. Normalized difference vegetation index (NDVI), land cover type, and slope were selected as auxiliary variables to represent the variability of LAI in the study area. Results show that CSS can satisfactorily capture the variability across the site extent, while minimizing field efforts. One appealing feature of CSS is that the compromise between representativeness and implementation cost can be regulated according to actual surface heterogeneity and budget constraints, which makes CSS flexible. Although the proposed method was only validated for the auxiliary variables rather than the LAI measurements, it serves as a starting point for establishing the locations of field plots and facilitates the preparation of field campaigns in mountainous areas.
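
The representativeness-versus-cost trade-off can be illustrated with a much simpler stand-in for the paper's conditioned Latin hypercube: pick one plot per NDVI stratum, scoring each candidate by its distance to the stratum centre plus a weighted access cost derived from slope. Everything below (the scoring rule, the weights, the synthetic NDVI/slope values) is an invented sketch, not the authors' algorithm.

```python
import random

def select_samples(candidates, n_strata, cost_weight):
    # candidates: list of (ndvi, slope) tuples; one pick per NDVI stratum
    ndvis = sorted(c[0] for c in candidates)
    # equal-frequency NDVI stratum edges
    edges = [ndvis[int(k * (len(ndvis) - 1) / n_strata)]
             for k in range(n_strata + 1)]
    chosen = []
    for k in range(n_strata):
        lo, hi = edges[k], edges[k + 1]
        stratum = [c for c in candidates if lo <= c[0] <= hi]
        mid = (lo + hi) / 2
        # score = representativeness term + weighted access-cost term
        chosen.append(min(stratum,
                          key=lambda c: abs(c[0] - mid) + cost_weight * c[1]))
    return chosen

random.seed(0)
candidates = [(random.random(), random.uniform(0.0, 40.0)) for _ in range(200)]
plots_costly = select_samples(candidates, 5, cost_weight=0.0)   # ignore cost
plots_cheap = select_samples(candidates, 5, cost_weight=0.05)   # penalize slope

def mean_slope(plots):
    return sum(p[1] for p in plots) / len(plots)
```

Raising `cost_weight` plays the role of the paper's regulated compromise: the selected plots drift toward gentler (cheaper) slopes at some cost in representativeness.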

  9. The generalization ability of online SVM classification based on Markov sampling.

    Science.gov (United States)

    Xu, Jie; Yan Tang, Yuan; Zou, Bin; Xu, Zongben; Li, Luoqing; Lu, Yang

    2015-03-01

In this paper, we consider online support vector machine (SVM) classification learning algorithms with uniformly ergodic Markov chain (u.e.M.c.) samples. We establish the bound on the misclassification error of an online SVM classification algorithm with u.e.M.c. samples based on reproducing kernel Hilbert spaces and obtain a satisfactory convergence rate. We also introduce a novel online SVM classification algorithm based on Markov sampling, and present numerical studies on the learning ability of online SVM classification based on Markov sampling on benchmark repositories. The numerical studies show that the learning performance of the online SVM classification algorithm based on Markov sampling is better than that of classical online SVM classification based on random sampling when the training sample size is larger.
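
To make the random-versus-Markov sampling contrast concrete, here is a hedged toy sketch, not the paper's algorithm: an online linear SVM trained with Pegasos-style updates, fed either by i.i.d. draws or by a simple Markov-chain sampler that prefers a label different from the previous draw (a crude stand-in for the paper's u.e.M.c. construction). The data are synthetic Gaussian clusters.

```python
import random

def pegasos(stream, dim=2, lam=0.01):
    # online linear SVM via the Pegasos stochastic subgradient update
    w = [0.0] * dim
    for t, (x, y) in enumerate(stream, start=1):
        eta = 1.0 / (lam * t)
        margin = y * sum(wi * xi for wi, xi in zip(w, x))
        w = [(1.0 - eta * lam) * wi for wi in w]   # regularization shrink
        if margin < 1:                              # hinge-loss subgradient
            w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

def markov_stream(data, n):
    # simple chain: resample until the label switches (most of the time)
    x, y = random.choice(data)
    out = [(x, y)]
    for _ in range(n - 1):
        while True:
            nx, ny = random.choice(data)
            if ny != y or random.random() < 0.2:
                break
        x, y = nx, ny
        out.append((x, y))
    return out

def accuracy(w, data):
    hits = sum(1 for x, y in data
               if y * sum(wi * xi for wi, xi in zip(w, x)) > 0)
    return hits / len(data)

random.seed(1)
def make_point():
    y = 1 if random.random() < 0.5 else -1
    c = 2.0 * y   # class centres at (2, 2) and (-2, -2)
    return ([random.gauss(c, 0.7), random.gauss(c, 0.7)], y)

data = [make_point() for _ in range(400)]
w_iid = pegasos([random.choice(data) for _ in range(500)])
w_mc = pegasos(markov_stream(data, 500))
```

Both streams learn this easy problem; the paper's point is the sharper generalization analysis and empirical gains of the Markov sampler on harder benchmarks.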

  10. POSITIVE VERSUS NEGATIVE COMMUNICATION STRATEGIES IN TASK-BASED LEARNING

    Directory of Open Access Journals (Sweden)

    Siti Rohani

    2013-07-01

Full Text Available This study aimed at describing how the implementation of Task Based Learning (TBL) would shape or change students' use of oral communication strategies. Students' problems, and their strategies to solve those problems during the implementation of TBL, were also explored. The study used a mixed-methods design, employing both quantitative and qualitative analysis through multiple methods: questionnaire, interviews, focus group discussion, learning journals, and classroom observation. Participants were 26 second-year students of the State Polytechnic of Malang. Data collection was conducted for one semester. Findings show linguistic and non-linguistic problems encountered by students during the one-semester implementation of TBL. Students also showed increased use of positive strategies but reduced use of negative strategies after the implementation of TBL.

  11. On-capillary sample cleanup method for the electrophoretic determination of carbohydrates in juice samples.

    Science.gov (United States)

    Morales-Cid, Gabriel; Simonet, Bartolomé M; Cárdenas, Soledad; Valcárcel, Miguel

    2007-05-01

On many occasions, sample treatment is a critical step in electrophoretic analysis. As an alternative to batch procedures, in this work a new strategy is presented with a view to developing an on-capillary sample cleanup method. This strategy is based on partially filling the capillary with carboxylated single-walled carbon nanotubes (c-SWNTs). The nanoparticles retain interferences from the matrix, allowing the determination and quantification of carbohydrates (viz. glucose, maltose and fructose). The precision of the method for the analysis of real samples ranged from 5.3 to 6.4%. The proposed method was compared with a method based on batch filtration of the juice sample through diatomaceous earth and subsequent electrophoretic determination; this method was also validated in this work. The RSD for the latter method ranged from 5.1 to 6%. The results obtained by both methods were statistically comparable, demonstrating the accuracy and effectiveness of the proposed methods. Electrophoretic separation of carbohydrates was achieved using 200 mM borate solution as a buffer at pH 9.5 and applying 15 kV. During separation, the capillary temperature was kept constant at 40°C. For the on-capillary cleanup method, a solution containing 50 mg/L of c-SWNTs prepared in 300 mM borate solution at pH 9.5 was introduced into the capillary for 60 s just before sample introduction. For the electrophoretic analysis of samples cleaned in batch with diatomaceous earth, it is also recommended to introduce into the capillary, just before the sample, a 300 mM borate solution, as it enhances the sensitivity and electrophoretic resolution.

  12. Universal quantum dot-based sandwich-like immunoassay strategy for rapid and ultrasensitive detection of small molecules using portable and reusable optofluidic nano-biosensing platform

    International Nuclear Information System (INIS)

    Zhou, Liping; Zhu, Anna; Lou, Xuening; Song, Dan; Yang, Rong; Shi, Hanchang; Long, Feng

    2016-01-01

A universal sandwich-like immunoassay strategy based on a quantum-dot immunoprobe (QD-labeled anti-mouse IgG antibody) was developed for rapid and ultrasensitive detection of small molecules. A portable and reusable optofluidic nano-biosensing platform was applied to investigate the sandwich-like immunoassay mechanism and format for small molecules, as well as the binding kinetics between the QD immunoprobe and the anti-small-molecule antibody. A two-step immunoassay method, involving pre-incubation of a mixture of different concentrations of the small molecule and the anti-small-molecule antibody, and subsequent introduction of the QD immunoprobe into the optofluidic cell, was used for small molecule determination. Compared with the one-step immunoassay method, the two-step method yields a higher fluorescence signal and a higher sensitivity index, thus improving the nano-biosensing performance. Based on the proposed strategy, two model targets, microcystin-LR (MC-LR) and bisphenol A (BPA), were detected with high sensitivity, rapidity, and ease of use. A higher concentration of small molecules in the sample led to less anti-small-molecule antibody bound to the antigen-carrier protein conjugate immobilized on the sensor surface, and hence fewer QD immunoprobes bound to the antibody; this lowered the fluorescence signal detected by the nano-biosensing platform. Under optimal operating conditions, MC-LR and BPA exhibited limits of detection of 0.003 and 0.04 μg/L, respectively. These LODs were better than those of an indirect competitive immunoassay for small molecules using a Cy5.5-labeled anti-small-molecule antibody. The proposed QD-based sandwich-like immunoassay strategy was evaluated in spiked water samples, and showed good recovery, precision and accuracy without complicated sample pretreatments. All these results demonstrate that the new detection strategy could be readily applied to other trace small molecules in real water samples.

  13. Universal quantum dot-based sandwich-like immunoassay strategy for rapid and ultrasensitive detection of small molecules using portable and reusable optofluidic nano-biosensing platform

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Liping; Zhu, Anna; Lou, Xuening; Song, Dan; Yang, Rong [School of Environment and Natural Resources, Renmin University of China, Beijing (China); Shi, Hanchang [School of Environment, Tsinghua University, Beijing (China); Long, Feng, E-mail: longf04@ruc.edu.cn [School of Environment and Natural Resources, Renmin University of China, Beijing (China)

    2016-01-28

  14. Control Strategy of Active Power Filter Based on Modular Multilevel Converter

    Science.gov (United States)

    Xie, Xifeng

    2018-03-01

To improve the capacity, voltage-withstand capability and equivalent switching frequency of the active power filter (APF), a control strategy for an APF based on the Modular Multilevel Converter (MMC) is presented. In this control strategy, indirect current control is used to decouple the active and reactive currents; a voltage-balancing control stabilizes the sub-module capacitor voltages; and predictive current control tracks the harmonic currents. As a result, harmonic currents are suppressed and power quality is improved. Finally, a simulation model of the MMC-based active power filter controller is established in Matlab/Simulink; the simulation shows that the proposed strategy is feasible and correct.
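
The basic APF idea behind the controller above, extract the harmonic content of a distorted load current and inject its negative so the grid supplies only the fundamental, can be shown numerically. This is a toy signal-level sketch (invented waveform, no MMC or controller dynamics):

```python
import math

N = 1000
t = [k / N for k in range(N)]   # one fundamental period, normalized

# distorted load current: fundamental plus 5th and 7th harmonics
load = [math.sin(2 * math.pi * tk)
        + 0.30 * math.sin(2 * math.pi * 5 * tk)
        + 0.15 * math.sin(2 * math.pi * 7 * tk) for tk in t]

# extract the fundamental by correlation with sin/cos at the base frequency
a = 2 / N * sum(i * math.sin(2 * math.pi * tk) for i, tk in zip(load, t))
b = 2 / N * sum(i * math.cos(2 * math.pi * tk) for i, tk in zip(load, t))
fundamental = [a * math.sin(2 * math.pi * tk) + b * math.cos(2 * math.pi * tk)
               for tk in t]

harmonic = [i - f for i, f in zip(load, fundamental)]  # APF injection reference
grid = [i - h for i, h in zip(load, harmonic)]         # grid current after APF

def rms(sig):
    return math.sqrt(sum(s * s for s in sig) / len(sig))

distortion_before = rms(harmonic) / rms(fundamental)
```

A real MMC APF does this extraction and injection in closed loop (here via the predictive current controller), but the compensation target is the same harmonic residue.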

  15. Lifetime Prevalence of Suicide Attempts Among Sexual Minority Adults by Study Sampling Strategies: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Hottes, Travis Salway; Bogaert, Laura; Rhodes, Anne E; Brennan, David J; Gesink, Dionne

    2016-05-01

    Previous reviews have demonstrated a higher risk of suicide attempts for lesbian, gay, and bisexual (LGB) persons (sexual minorities), compared with heterosexual groups, but these were restricted to general population studies, thereby excluding individuals sampled through LGB community venues. Each sampling strategy, however, has particular methodological strengths and limitations. For instance, general population probability studies have defined sampling frames but are prone to information bias associated with underreporting of LGB identities. By contrast, LGB community surveys may support disclosure of sexuality but overrepresent individuals with strong LGB community attachment. To reassess the burden of suicide-related behavior among LGB adults, directly comparing estimates derived from population- versus LGB community-based samples. In 2014, we searched MEDLINE, EMBASE, PsycInfo, CINAHL, and Scopus databases for articles addressing suicide-related behavior (ideation, attempts) among sexual minorities. We selected quantitative studies of sexual minority adults conducted in nonclinical settings in the United States, Canada, Europe, Australia, and New Zealand. Random effects meta-analysis and meta-regression assessed for a difference in prevalence of suicide-related behavior by sample type, adjusted for study or sample-level variables, including context (year, country), methods (medium, response rate), and subgroup characteristics (age, gender, sexual minority construct). We examined residual heterogeneity by using τ(2). We pooled 30 cross-sectional studies, including 21,201 sexual minority adults, generating the following lifetime prevalence estimates of suicide attempts: 4% (95% confidence interval [CI] = 3%, 5%) for heterosexual respondents to population surveys, 11% (95% CI = 8%, 15%) for LGB respondents to population surveys, and 20% (95% CI = 18%, 22%) for LGB respondents to community surveys (Figure 1). The difference in LGB estimates by sample
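
The pooling machinery behind estimates like those above is a random-effects meta-analysis. The sketch below shows the standard DerSimonian-Laird calculation on invented event counts and sample sizes (not the review's data); `pooled_prevalence` is an illustrative name.

```python
import math

def pooled_prevalence(events, totals):
    p = [e / n for e, n in zip(events, totals)]            # study proportions
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]    # within-study var
    w = [1.0 / vi for vi in v]
    fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)  # fixed-effect mean
    # DerSimonian-Laird estimate of between-study variance tau^2
    q = sum(wi * (pi - fixed) ** 2 for wi, pi in zip(w, p))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(p) - 1)) / c)
    # random-effects weights and pooled estimate with 95% CI
    w_re = [1.0 / (vi + tau2) for vi in v]
    est = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return est, (est - 1.96 * se, est + 1.96 * se)

events = [12, 45, 30, 22, 90]        # invented lifetime-attempt counts
totals = [150, 300, 180, 200, 520]   # invented study sizes
est, ci = pooled_prevalence(events, totals)
```

Meta-regression, as used in the review, extends this by modelling study-level covariates (such as sample type) as predictors of the study estimates.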

  16. Book Review: Deployment Psychology: Evidence-based strategies ...

    African Journals Online (AJOL)

    Book Review: Deployment Psychology: Evidence-based strategies to promote mental health in the Military. AB Adler, PD Bliese, CA Castro. Abstract. Washington, DC: American Psychological Association 2011 294 pages ISBN-13: 978-1-4338-0881-4. Full Text: EMAIL FREE FULL TEXT EMAIL FREE FULL TEXT

  17. Combined experimental and statistical strategy for mass spectrometry based serum protein profiling for diagnosis of breast cancer

    DEFF Research Database (Denmark)

    Callesen, Anne Kjærgaard; Vach, Werner; Jørgensen, Per E

    2008-01-01

    it in a well-described breast cancer case-control study. A rigorous sample collection protocol ensured high quality specimen and reduced bias from preanalytical factors. Preoperative serum samples obtained from 48 breast cancer patients and 28 controls were used to generate MALDI MS protein profiles. A total...... and controls. A diagnostic rule based on these 72 mass values was constructed and exhibited a cross-validated sensitivity and specificity of approximately 85% for the detection of breast cancer. With this method, it was possible to distinguish early stage cancers from controls without major loss of sensitivity...... and specificity. We conclude that optimized serum sample handling and mass spectrometry data acquisition strategies in combination with statistical analysis provide a viable platform for serum protein profiling in cancer diagnosis....

  18. Sample Based Unit Liter Dose Estimates

    International Nuclear Information System (INIS)

    JENSEN, L.

    1999-01-01

The Tank Waste Characterization Program has taken many core samples, grab samples, and auger samples from the single-shell and double-shell tanks during the past 10 years. Consequently, the amount of sample data available has increased, both in terms of the quantity of sample results and the number of tanks characterized. More and better data are available than when the current radiological and toxicological source terms used in the Basis for Interim Operation (BIO) (FDH 1999) and the Final Safety Analysis Report (FSAR) (FDH 1999) were developed. The Nuclear Safety and Licensing (NS and L) organization wants to use the new data to upgrade the radiological and toxicological source terms used in the BIO and FSAR, and requested assistance in developing a statistically based process for developing the source terms. This report describes the statistical techniques used and the assumptions made to support the development of a new radiological source term for liquid and solid wastes stored in single-shell and double-shell tanks
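
One plausible ingredient of a statistically based source term, shown here only as a hedged illustration and not as the report's actual method, is a one-sided 95% upper confidence limit on the mean analyte concentration from the sample data. The concentrations and the t quantile below are illustrative values.

```python
import math
from statistics import mean, stdev

def ucl95(samples, t_crit):
    # one-sided 95% UCL on the mean: x_bar + t * s / sqrt(n)
    n = len(samples)
    return mean(samples) + t_crit * stdev(samples) / math.sqrt(n)

conc = [1.2, 0.9, 1.5, 1.1, 1.3, 1.0, 1.4, 1.2]   # invented, e.g. uCi/L
limit = ucl95(conc, t_crit=1.895)                  # t(0.95, df = 7)
```

A bounding concentration like this, rather than the plain mean, is the kind of conservative statistic safety analyses typically carry forward into a source term.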

  19. A DC Microgrid Coordinated Control Strategy Based on Integrator Current-Sharing

    DEFF Research Database (Denmark)

    Gao, Liyuan; Liu, Yao; Ren, Huisong

    2017-01-01

    The DC microgrid has become a new trend for microgrid study with the advantages of high reliability, simple control and low losses. With regard to the drawbacks of the traditional droop control strategies, an improved DC droop control strategy based on integrator current-sharing is introduced....... In the strategy, the principle of eliminating deviation through an integrator is used, constructing the current-sharing term in order to make the power-sharing between different distributed generation (DG) units uniform and reasonable, which can reduce the circulating current between DG units. Furthermore......, at the system coordinated control level, a hierarchical/droop control strategy based on the DC bus voltage is proposed. In the strategy, the operation modes of the AC main network and micro-sources are determined through detecting the DC voltage variation, which can ensure the power balance of the DC microgrid...
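
The effect of the integrator current-sharing term can be seen in a toy discrete-time simulation. All parameters below are invented and the model is deliberately minimal (two Thevenin sources with unequal droop-plus-line resistance feeding one resistive load), so this is a sketch of the principle, not the paper's controller.

```python
def simulate(ki, steps=2000, dt=1e-3):
    v_ref = 48.0                 # droop reference voltage
    r = [0.55, 0.90]             # droop + line resistance per unit (unequal)
    r_load = 4.0
    integ = [0.0, 0.0]           # integrator current-sharing terms
    i = [0.0, 0.0]
    for _ in range(steps):
        # bus voltage of the two sources in parallel with the load
        num = sum((v_ref + integ[k]) / r[k] for k in range(2))
        v_bus = r_load * num / (1.0 + r_load * sum(1.0 / rk for rk in r))
        i = [(v_ref + integ[k] - v_bus) / r[k] for k in range(2)]
        # each integrator drives its unit's current toward the average
        i_avg = sum(i) / 2.0
        integ = [integ[k] + ki * dt * (i_avg - i[k]) for k in range(2)]
    return i

i_droop = simulate(ki=0.0)    # plain droop: unequal current sharing
i_shared = simulate(ki=50.0)  # integrator term equalizes the currents
```

With `ki = 0` the unequal resistances leave the two currents mismatched (circulating current); the integrator term has its only equilibrium at equal currents, which is the deviation-elimination principle the abstract describes.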

  20. IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment

    Directory of Open Access Journals (Sweden)

    Shyamal Dalapati

    2017-12-01

Full Text Available Cross entropy is one of the best ways to calculate the divergence of a variable from a prior variable. We define a new cross entropy measure under the interval neutrosophic set (INS) environment, which we call the IN-cross entropy measure, and prove its basic properties. We also develop a weighted IN-cross entropy measure and investigate its basic properties. Based on the weighted IN-cross entropy measure, we develop a novel strategy for multi-attribute group decision making (MAGDM) under an interval neutrosophic environment. The proposed MAGDM strategy is compared with the existing cross entropy measure based strategy in the literature under the interval neutrosophic set environment. Finally, an illustrative example of a multi-attribute group decision making problem is solved to show the feasibility, validity and efficiency of the proposed MAGDM strategy.

  1. Evaluation of 5-FU pharmacokinetics in cancer patients with DPD deficiency using a Bayesian limited sampling strategy

    NARCIS (Netherlands)

    Van Kuilenburg, A.; Hausler, P.; Schalhorn, A.; Tanck, M.; Proost, J.H.; Terborg, C.; Behnke, D.; Schwabe, W.; Jabschinsky, K.; Maring, J.G.

    Aims: Dihydropyrimidine dehydrogenase (DPD) is the initial enzyme in the catabolism of 5-fluorouracil (5FU) and DPD deficiency is an important pharmacogenetic syndrome. The main purpose of this study was to develop a limited sampling strategy to evaluate the pharmacokinetics of 5FU and to detect

  2. Evaluation of cognitive loads imposed by traditional paper-based and innovative computer-based instructional strategies.

    Science.gov (United States)

    Khalil, Mohammed K; Mansour, Mahmoud M; Wilhite, Dewey R

    2010-01-01

    Strategies of presenting instructional information affect the type of cognitive load imposed on the learner's working memory. Effective instruction reduces extraneous (ineffective) cognitive load and promotes germane (effective) cognitive load. Eighty first-year students from two veterinary schools completed a two-section questionnaire that evaluated their perspectives on the educational value of a computer-based instructional program. They compared the difference between cognitive loads imposed by paper-based and computer-based instructional strategies used to teach the anatomy of the canine skeleton. Section I included 17 closed-ended items, rated on a five-point Likert scale, that assessed the use of graphics, content, and the learning process. Section II included a nine-point mental effort rating scale to measure the level of difficulty of instruction; students were asked to indicate the amount of mental effort invested in the learning task using both paper-based and computer-based presentation formats. The closed-ended data were expressed as means and standard deviations. A paired t test with an alpha level of 0.05 was used to determine the overall mean difference between the two presentation formats. Students positively evaluated their experience with the computer-based instructional program with a mean score of 4.69 (SD=0.53) for use of graphics, 4.70 (SD=0.56) for instructional content, and 4.45 (SD=0.67) for the learning process. The mean difference of mental effort (1.50) between the two presentation formats was significant, t=8.26, p≤.0001, df=76, for two-tailed distribution. Consistent with cognitive load theory, innovative computer-based instructional strategies decrease extraneous cognitive load compared with traditional paper-based instructional strategies.
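
    The paired t test used above can be sketched as follows; the function is a generic textbook implementation, and the mental-effort ratings are made-up illustrations, not the study's data.

```python
import math

def paired_t(a, b):
    """Paired t test: t statistic and degrees of freedom for matched ratings."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance of differences
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1

# Illustrative 9-point mental-effort ratings (paper vs. computer format)
paper    = [7, 6, 8, 5, 7, 6, 7, 8]
computer = [5, 4, 6, 4, 6, 5, 5, 6]
t, df = paired_t(paper, computer)
```

    A positive t here would indicate higher mental effort for the paper-based format, matching the direction of the study's reported difference.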

  3. Determination of total concentration of chemically labeled metabolites as a means of metabolome sample normalization and sample loading optimization in mass spectrometry-based metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2012-12-18

    For mass spectrometry (MS)-based metabolomics, it is important to use the same amount of starting materials from each sample to compare the metabolome changes in two or more comparative samples. Unfortunately, for biological samples, the total amount or concentration of metabolites is difficult to determine. In this work, we report a general approach of determining the total concentration of metabolites based on the use of chemical labeling to attach a UV absorbent to the metabolites to be analyzed, followed by rapid step-gradient liquid chromatography (LC) UV detection of the labeled metabolites. It is shown that quantification of the total labeled analytes in a biological sample facilitates the preparation of an appropriate amount of starting materials for MS analysis as well as the optimization of the sample loading amount to a mass spectrometer for achieving optimal detectability. As an example, dansylation chemistry was used to label the amine- and phenol-containing metabolites in human urine samples. LC-UV quantification of the labeled metabolites could be optimally performed at the detection wavelength of 338 nm. A calibration curve established from the analysis of a mixture of 17 labeled amino acid standards was found to have the same slope as that from the analysis of the labeled urinary metabolites, suggesting that the labeled amino acid standard calibration curve could be used to determine the total concentration of the labeled urinary metabolites. A workflow incorporating this LC-UV metabolite quantification strategy was then developed in which all individual urine samples were first labeled with (12)C-dansylation and the concentration of each sample was determined by LC-UV. The volumes of urine samples taken for producing the pooled urine standard were adjusted to ensure an equal amount of labeled urine metabolites from each sample was used for the pooling. The pooled urine standard was then labeled with (13)C-dansylation. Equal amounts of the (12)C
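
    The calibration-curve step described above (fitting labeled standards, then reading off the total concentration of labeled urinary metabolites) can be sketched with an ordinary least-squares line; all numbers below are hypothetical illustrations, not the paper's measurements.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Illustrative calibration: total labeled-standard concentration (mM)
# vs. UV absorbance at 338 nm
conc = [0.5, 1.0, 2.0, 4.0]
absb = [0.11, 0.20, 0.41, 0.80]
slope, intercept = linear_fit(conc, absb)

# Total concentration of an unknown labeled sample from its absorbance
unknown_abs = 0.30
total_conc = (unknown_abs - intercept) / slope
```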

  4. Strategy Development of Community Base Tourism in Tidung Island, Jakarta

    Directory of Open Access Journals (Sweden)

    Dhian Tyas Untari

    2018-05-01

    Full Text Available The aim of this study is to establish a community-based tourism development strategy on Tidung Island. The researchers used a strategy management matrix, with tourism entrepreneurs and tourists as the observation unit; the analysis unit is the company, i.e., the decision makers who are highly influential in the company itself, including those related to human resources, finance, production, and marketing. The eigenfactor score is used as the weighting input for the data from the questionnaire results. From the questionnaires, a score is obtained as the average given by the respondents for each key success factor; in the input stage the researchers used the IFAS/EFAS matrices, and in the strategy formulation stage they used the recommendations from the Grand Strategy Matrix output, which are then implemented in the development of community-based tourism on Tidung Island. The Grand Strategy Matrix chart shows that, in outline, Tidung Island tourism falls into the weak category, with the challenges-and-weaknesses quadrant much greater than the strengths and opportunities. Thus the strategies that can be pursued are: improving tourism governance by maximizing the tourism development programs of DKI Jakarta Province; encouraging the Provincial Government of DKI Jakarta to allocate funds and attention to alternative tourism, such as the marine tourism located in Kepulauan Seribu; maximizing the community service activities of higher education as a medium of knowledge transfer to the Tidung Island community; and improving the modes of transportation and increasing the frequency of ferries between Jakarta and Pulau Tidung.

  5. Group recommendation strategies based on collaborative filtering

    OpenAIRE

    Ricardo de Melo Queiroz, Sérgio

    2003-01-01

    Ricardo de Melo Queiroz, Sérgio; de Assis Tenório Carvalho, Francisco. Group recommendation strategies based on collaborative filtering. 2003. Master's dissertation, Graduate Program in Computer Science, Universidade Federal de Pernambuco, Recife, 2003.

  6. Focusing and non-focusing modulation strategies for the improvement of on-line two-dimensional hydrophilic interaction chromatography × reversed phase profiling of complex food samples.

    Science.gov (United States)

    Montero, Lidia; Ibáñez, Elena; Russo, Mariateresa; Rastrelli, Luca; Cifuentes, Alejandro; Herrero, Miguel

    2017-09-08

    Comprehensive two-dimensional liquid chromatography (LC × LC) is ever gaining interest in food analysis, as often, food-related samples are too complex to be analyzed through one-dimensional approaches. The use of hydrophilic interaction chromatography (HILIC) combined with reversed phase (RP) separations has already been demonstrated as a very orthogonal combination, which allows attaining increased resolving power. However, this coupling encompasses different analytical challenges, mainly related to the important solvent strength mismatch between the two dimensions, besides those common to every LC × LC method. In the present contribution, different strategies are proposed and compared to further increase HILIC × RP method performance for the analysis of complex food samples, using licorice as a model sample. The influence of different parameters in non-focusing modulation methods based on sampling loops, as well as under focusing modulation, through the use of trapping columns in the interface and through active modulation procedures are studied in order to produce resolving power and sensitivity gains. Although the use of a dilution strategy using sampling loops as well as the highest possible first dimension sampling rate allowed significant improvements on resolution, focusing modulation produced significant gains also in peak capacity and sensitivity. Overall, the obtained results demonstrate the great applicability and potential that active modulation may have for the analysis of complex food samples, such as licorice, by HILIC × RP. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from a test of hypothesis, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones.
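
    A minimal sketch of the sampling-based screening idea: use random samples to compute a correlation-type sensitivity for each input and compare it against an acceptance limit. The 2/sqrt(n) limit and the toy response function are assumptions for illustration, not the paper's exact CDF-based or mean-response measures.

```python
import math
import random

random.seed(42)

def corr(x, y):
    """Sample correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Random samples of three inputs; only x1 drives the response strongly
n = 500
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [random.gauss(0, 1) for _ in range(n)]
x3 = [random.gauss(0, 1) for _ in range(n)]
y = [5 * a + 0.01 * b + random.gauss(0, 0.5) for a, b, _ in zip(x1, x2, x3)]

# Acceptance limit: roughly 2/sqrt(n) at a 5% significance level
limit = 2 / math.sqrt(n)
sens = {name: abs(corr(x, y)) for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]}
significant = [name for name, s in sens.items() if s > limit]
```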

  8. Decision Tree and Survey Development for Support in Agricultural Sampling Strategies during Nuclear and Radiological Emergencies

    International Nuclear Information System (INIS)

    Yi, Amelia Lee Zhi; Dercon, Gerd

    2017-01-01

    In the event of a severe nuclear or radiological accident, the release of radionuclides results in contamination of land surfaces, affecting agricultural and food resources. Speedy accumulation of information and guidance on decision making is essential for enhancing the ability of stakeholders to devise immediate countermeasures. Support tools such as decision trees and sampling protocols allow for a swift response by governmental bodies and assist in proper management of the situation. While such tools exist, they focus mainly on protecting public well-being rather than on food safety management strategies. Consideration of the latter is necessary, as it has long-term implications, especially for agriculturally dependent Member States. However, it is a research gap that remains to be filled.

  9. Limited sampling strategies drawn within 3 hours postdose poorly predict mycophenolic acid area-under-the-curve after enteric-coated mycophenolate sodium.

    NARCIS (Netherlands)

    Winter, B.C. de; Gelder, T. van; Mathôt, R.A.A.; Glander, P.; Tedesco-Silva, H.; Hilbrands, L.B.; Budde, K.; Hest, R.M. van

    2009-01-01

    Previous studies predicted that limited sampling strategies (LSS) for estimation of mycophenolic acid (MPA) area-under-the-curve (AUC(0-12)) after ingestion of enteric-coated mycophenolate sodium (EC-MPS) using a clinically feasible sampling scheme may have poor predictive performance. Failure of

  10. Strategies for the AFM-based manipulation of silver nanowires on a flat surface

    Science.gov (United States)

    Liu, Hong-Zhi; Wu, Sen; Zhang, Jun-Ming; Bai, Hui-Tian; Jin, Fan; Pang, Hai; Hu, Xiao-Dong

    2017-09-01

    Silver nanowires (Ag NWs) are a promising material for building various sensors and devices at the nanoscale. However, the fast and precise placement of individual Ag NWs is still a challenge today. Atomic force microscopy (AFM) has been widely used to manipulate nanoparticles, yet this technology encounters many difficulties when being applied to the movement of Ag NWs as well as other soft one-dimensional (1D) materials, since the samples are easily distorted or even broken due to friction and adhesion on the substrate. In this paper, two novel manipulation strategies based on the parallel pushing method are presented. This method applies a group of short parallel pushing vectors (PPVs) to the Ag NW along its longitudinal direction. Identical and proportional vectors are respectively proposed to translate and rotate the Ag NWs with a straight-line configuration. The rotation strategy is also applied to straighten flexed Ag NWs. The finite element method simulation is introduced to analyse the behaviour of the Ag NWs as well as to optimize the parameter setting of the PPVs. Experiments are carried out to confirm the efficiency of the presented strategies. By comprehensive application of the new strategies, four Ag NWs are continuously assembled in a rectangular pattern. This study improves the controllability of the position and configuration of Ag NWs on a flat substrate. It also indicates the practicability of automatic nanofabrication using common AFMs.

  11. WLAN Fingerprint Indoor Positioning Strategy Based on Implicit Crowdsourcing and Semi-Supervised Learning

    Directory of Open Access Journals (Sweden)

    Chunjing Song

    2017-11-01

    Full Text Available Wireless local area network (WLAN fingerprint positioning is an indoor localization technique with high accuracy and low hardware requirements. However, collecting received signal strength (RSS samples for the fingerprint database is time-consuming and labor-intensive, hindering the use of this technique. The popular crowdsourcing sampling technique has been introduced to reduce the workload of sample collection, but has two challenges: one is the heterogeneity of devices, which can significantly affect the positioning accuracy; the other is the requirement of users’ intervention in traditional crowdsourcing, which reduces the practicality of the system. In response to these challenges, we have proposed a new WLAN indoor positioning strategy, which incorporates a new preprocessing method for RSS samples, the implicit crowdsourcing sampling technique, and a semi-supervised learning algorithm. First, implicit crowdsourcing does not require users’ intervention. The acquisition program silently collects unlabeled samples, the RSS samples, without information about the position. Secondly, to cope with the heterogeneity of devices, the preprocessing method maps all the RSS values of samples to a uniform range and discretizes them. Finally, by using a large number of unlabeled samples with some labeled samples, Co-Forest, the introduced semi-supervised learning algorithm, creates and repeatedly refines a random forest ensemble classifier that performs well for location estimation. The results of experiments conducted in a real indoor environment show that the proposed strategy reduces the demand for large quantities of labeled samples and achieves good positioning accuracy.
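
    The preprocessing step described above, mapping all RSS values of the samples to a uniform range and discretizing them, might look like the following sketch; the exact mapping and bin count used by the authors are not specified here, so both are assumptions.

```python
def preprocess_rss(samples, n_bins=10):
    """Map all RSS values (dBm) to [0, 1] and discretize into integer bins,
    reducing scale differences between heterogeneous devices."""
    flat = [v for s in samples for v in s]
    lo, hi = min(flat), max(flat)
    span = hi - lo or 1.0
    def bin_of(v):
        u = (v - lo) / span                      # map to [0, 1]
        return min(int(u * n_bins), n_bins - 1)  # discretize
    return [[bin_of(v) for v in s] for s in samples]

# Two devices reporting different absolute RSS scales for the same APs
device_a = [-40.0, -60.0, -80.0]
device_b = [-45.0, -65.0, -85.0]
bins = preprocess_rss([device_a, device_b], n_bins=5)
```

    After preprocessing, the two devices' readings fall into identical bins, which is the point of the normalization: fingerprints become comparable across hardware.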

  12. A plasmonic colorimetric strategy for visual miRNA detection based on hybridization chain reaction

    Science.gov (United States)

    Miao, Jie; Wang, Jingsheng; Guo, Jinyang; Gao, Huiguang; Han, Kun; Jiang, Chengmin; Miao, Peng

    2016-08-01

    In this work, a novel colorimetric strategy for miRNA analysis is proposed based on hybridization chain reaction (HCR)-mediated localized surface plasmon resonance (LSPR) variation of silver nanoparticles (AgNPs). miRNA in the sample to be tested is able to release HCR initiator from a solid interface to AgNPs colloid system by toehold exchange-mediated strand displacement, which then triggers the consumption of fuel strands with single-stranded tails for HCR. The final produced long nicked double-stranded DNA loses the ability to protect AgNPs from salt-induced aggregation. The stability variation of the colloid system can then be monitored by recording corresponding UV-vis spectrum and initial miRNA level is thus determined. This sensing system involves only four DNA strands which is quite simple. The practical utility is confirmed to be excellent by employing different biological samples.

  13. Renewal strategy and community based organisations in community ...

    African Journals Online (AJOL)

    Renewal strategy and community based organisations in community ... the local population and resources to do that which the governments had failed to do. ... country with a view to reducing poverty and developmental imbalance in Nigeria.

  14. Reproducibility of R-fMRI metrics on the impact of different strategies for multiple comparison correction and sample sizes.

    Science.gov (United States)

    Chen, Xiao; Lu, Bin; Yan, Chao-Gan

    2018-01-01

    Concerns regarding the reproducibility of resting-state functional magnetic resonance imaging (R-fMRI) findings have been raised. Little is known about how to operationally define R-fMRI reproducibility and to what extent it is affected by multiple comparison correction strategies and sample size. We comprehensively assessed two aspects of reproducibility, test-retest reliability and replicability, on widely used R-fMRI metrics in both between-subject contrasts of sex differences and within-subject comparisons of eyes-open and eyes-closed (EOEC) conditions. We noted that a permutation test with Threshold-Free Cluster Enhancement (TFCE), a strict multiple comparison correction strategy, reached the best balance between the family-wise error rate (under 5%) and test-retest reliability/replicability (e.g., 0.68 for test-retest reliability and 0.25 for replicability of the amplitude of low-frequency fluctuations (ALFF) for between-subject sex differences, and 0.49 for replicability of ALFF for within-subject EOEC differences). Although R-fMRI indices attained moderate reliabilities, they replicated poorly in distinct datasets (replicability < 0.3 for between-subject sex differences, < 0.5 for within-subject EOEC differences). By randomly drawing different sample sizes from a single site, we found that reliability, sensitivity and positive predictive value (PPV) rose as sample size increased. Small sample sizes (e.g., < 80 [40 per group]) not only minimized power (sensitivity < 2%), but also decreased the likelihood that significant results reflect "true" effects (PPV < 0.26) in sex differences. Our findings have implications for how to select multiple comparison correction strategies and highlight the importance of sufficiently large sample sizes in R-fMRI studies to enhance reproducibility. Hum Brain Mapp 39:300-318, 2018. © 2017 Wiley Periodicals, Inc.
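
    As a toy illustration of the nonparametric machinery underlying such corrections, the following is an exact two-sample permutation test on a single scalar metric (without TFCE or cluster enhancement, which operate on whole statistical maps); the data are invented.

```python
from itertools import combinations

def exact_perm_test(group_a, group_b):
    """Exact two-sample permutation test on the difference of means
    (one-sided: labelings with mean(a) - mean(b) >= observed)."""
    pooled = group_a + group_b
    k = len(group_a)
    observed = sum(group_a) / k - sum(group_b) / len(group_b)
    count = total = 0
    for idx in combinations(range(len(pooled)), k):
        a = [pooled[i] for i in idx]
        b = [pooled[i] for i in range(len(pooled)) if i not in idx]
        diff = sum(a) / len(a) - sum(b) / len(b)
        if diff >= observed:
            count += 1
        total += 1
    return count / total

# Toy metric values for two clearly separated groups
p = exact_perm_test([5.1, 5.4, 5.0, 5.3], [1.2, 1.0, 1.4, 1.1])
```

    With four subjects per group there are C(8,4) = 70 relabelings, and only the observed one achieves the maximal mean difference, so p = 1/70.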

  15. Strategies for Fostering the Efficacy of School-Based Management ...

    African Journals Online (AJOL)

    The study examined community participation in School-Based Management Committees (SBMC), the challenges hindering participation, and strategies for fostering the efficacy of the School-Based Management Committee. A total of 340 schools were selected from the population of 2543 public primary schools in ...

  16. Evaluating sampling strategy for DNA barcoding study of coastal and inland halo-tolerant Poaceae and Chenopodiaceae: A case study for increased sample size.

    Science.gov (United States)

    Yao, Peng-Cheng; Gao, Hai-Yan; Wei, Ya-Nan; Zhang, Jian-Hang; Chen, Xiao-Yong; Li, Hong-Qing

    2017-01-01

    Environmental conditions in coastal salt marsh habitats have led to the development of specialist genetic adaptations. We evaluated six DNA barcode loci of the 53 species of Poaceae and 15 species of Chenopodiaceae from China's coastal salt marsh area and inland area. Our results indicate that the optimum DNA barcode was ITS for coastal salt-tolerant Poaceae and matK for the Chenopodiaceae. Sampling strategies for ten common species of Poaceae and Chenopodiaceae were analyzed according to the optimum barcode. We found that by increasing the number of samples collected from the coastal salt marsh area on the basis of inland samples, the number of haplotypes of Arundinella hirta, Digitaria ciliaris, Eleusine indica, Imperata cylindrica, Setaria viridis, and Chenopodium glaucum increased, with a principal coordinate plot clearly showing increased distribution points. The results of a Mann-Whitney test showed that for Digitaria ciliaris, Eleusine indica, Imperata cylindrica, and Setaria viridis, the distribution of intraspecific genetic distances was significantly different when samples from the coastal salt marsh area were included (P < 0.05). For Imperata cylindrica and Chenopodium album, the average intraspecific distance tended to reach stability. These results indicate that the sample size for DNA barcoding of globally distributed species should be increased to 11-15.
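
    The haplotype-accumulation idea, counting how many distinct haplotypes appear as samples are added, can be sketched as follows; the toy "sequences" are placeholders, not real barcode data.

```python
def haplotype_accumulation(sequences):
    """Number of distinct haplotypes observed after each added sample."""
    seen, curve = set(), []
    for seq in sequences:
        seen.add(seq)
        curve.append(len(seen))
    return curve

# Hypothetical barcode haplotypes: inland samples first, then coastal ones
inland  = ["AAT", "AAT", "AAG", "AAT"]
coastal = ["ACT", "AAG", "ACT", "ACC"]
curve = haplotype_accumulation(inland + coastal)
```

    If the curve keeps rising when coastal samples are appended, the inland-only sample size was too small to capture the species' haplotype diversity, which is the pattern the study reports.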

  17. The Effect of Reading Involvement through Open-Ended Strategy vs. Fill-in- the- Blanks Strategy on Young EFL Learners’ Reading Comprehension Ability

    Directory of Open Access Journals (Sweden)

    Rita Salehi Sepehr

    2014-11-01

    Full Text Available The present study investigated the extent to which an instructional framework integrating strategy instruction (an open-ended strategy and a fill-in-the-blanks strategy) with motivation support affected reading outcomes for young EFL learners. The central area of exploration was a comparison among three approaches to reading instruction: first, a fill-in-the-blanks strategy intervention; second, an open-ended strategy intervention; and last, a control group which received conventional reading strategies. The participants were sampled from a group of seventy-seven pre-intermediate EFL learners in a language school in Tehran, Iran, using a convenience sampling technique. For measurement, the researchers administered the PET and a reading comprehension test (CELT), along with a reading-strategy-based test, to quantify the participants' level of knowledge as well as their degree of achievement before and after instruction. The results of the present study indicated that the experimental groups had a significant improvement over the control group. Also, the level of learners' reading engagement during classroom work mediated the instructional effects on reading outcomes. The results of this study can benefit both EFL and ESL teachers in teaching reading comprehension using students' critical minds as well as their critical involvement in reading tasks.

  18. A Regional PD Strategy for EPR Systems: Evidence-Based IT Development

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2006-01-01

    One of the five regions in Denmark has initiated a remarkable and alternative strategy for the development of Electronic Patient Record (EPR) systems. This strategy is driven by Participatory Design (PD) experiments and based on evidence of positive effects on clinical practice when using EPR systems. We present this PD strategy and our related research on evidence-based IT development. We report from a newly completed PD experiment with EPR in the region, conducted through a close collaboration comprising a neurological stroke unit, the region's EPR unit, the vendor, and the authors.

  19. PROXY-BASED PATCHING STREAM TRANSMISSION STRATEGY IN MOBILE STREAMING MEDIA SYSTEM

    Institute of Scientific and Technical Information of China (English)

    Liao Jianxin; Lei Zhengxiong; Ma Xutao; Zhu Xiaomin

    2006-01-01

    A mobile transmission strategy, PMPatching (Proxy-based Mobile Patching), is proposed for proxy-based mobile streaming media systems in Wideband Code Division Multiple Access (WCDMA) networks. Performance of the whole system can be improved by using a patching stream to transmit the anterior part of the suffix that has already been played back, and by batching all demands for the suffix that arrive during the prefix period and the patching-stream transmission threshold period. Experimental results show that this strategy can efficiently reduce the average network transmission cost and the number of channels consumed at the central streaming media server.

  20. Answer Extraction Based on Merging Score Strategy of Hot Terms

    Institute of Scientific and Technical Information of China (English)

    LE Juan; ZHANG Chunxia; NIU Zhendong

    2016-01-01

    Answer extraction (AE) is one of the key technologies in developing open-domain Question & Answer (Q&A) systems. Its task is to yield the highest score for the expected answer based on an effective answer score strategy. We introduce an answer extraction method using a Merging score strategy (MSS) based on hot terms. The hot terms are defined according to their lexical and syntactic features to highlight the role of the question terms. To cope with the syntactic diversity of the corpus, we propose four improved candidate answer score algorithms. Each of them is based on the lexical function of hot terms and their syntactic relationships with the candidate answers. Two independent corpus score algorithms are proposed to tap the role of the corpus in ranking the candidate answers. Six algorithms are adopted in MSS to tap the complementary action among the corpus, the candidate answers and the questions. Experiments demonstrate the effectiveness of the proposed strategy.

  1. Aptamer Recognition Induced Target-Bridged Strategy for Proteins Detection Based on Magnetic Chitosan and Silver/Chitosan Nanoparticles Using Surface-Enhanced Raman Spectroscopy.

    Science.gov (United States)

    He, Jincan; Li, Gongke; Hu, Yuling

    2015-11-03

    Poor selectivity and biocompatibility remain problems in applying surface-enhanced Raman spectroscopy (SERS) to the direct detection of proteins, owing to the similar spectra of most proteins and to overlapping Raman bands in complex mixtures. To solve these problems, an aptamer recognition induced target-bridged strategy based on magnetic chitosan (MCS) and silver/chitosan nanoparticles (Ag@CS NPs) using SERS was developed for the detection of proteins, benefiting from the specific affinity of aptamers and the biocompatibility of chitosan (CS). In this process, MCS modified with one aptamer (or antibody) worked as the capture probe through one affinity binding site of the protein. Ag@CS NPs modified with the other aptamer and encapsulating Raman reporter molecules were used as SERS sensing probes targeting the other binding site of the protein. The sandwich complexes of aptamer (antibody)/protein/aptamer were easily separated from biological samples with a magnet, and the concentration of the protein was indirectly reflected by the intensity variation of the SERS signal of the Raman reporter molecules. To explore the universality of the strategy, three different kinds of proteins, including thrombin, platelet-derived growth factor BB (PDGF BB) and immunoglobulin E (IgE), were investigated. The major advantages of this aptamer recognition induced target-bridged strategy are convenient operation with a magnet, stable signal expression resulting from preventing loss of reporter molecules with the help of the CS shell, and the avoidance of the slow diffusion-limited kinetics problems that occur on a solid substrate. To demonstrate the feasibility of the proposed strategy, the method was applied to the detection of PDGF BB in clinical samples. The limit of detection (LOD) for PDGF BB was estimated to be 3.2 pg/mL. The results obtained from the human serum of healthy persons and cancer patients using the proposed strategy showed good agreement with those of the ELISA method, but with a wider linear range, more convenient operation, and lower cost. The proposed

  2. Exact Sampling and Decoding in High-Order Hidden Markov Models

    NARCIS (Netherlands)

    Carter, S.; Dymetman, M.; Bouchard, G.

    2012-01-01

    We present a method for exact optimization and sampling from high order Hidden Markov Models (HMMs), which are generally handled by approximation techniques. Motivated by adaptive rejection sampling and heuristic search, we propose a strategy based on sequentially refining a lower-order language

  3. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, the establishment of (biobank) collections of adequately processed tissue samples, suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for the standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for the generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical

  4. Sampling strategies to capture single-cell heterogeneity

    OpenAIRE

    Satwik Rajaram; Louise E. Heinrich; John D. Gordan; Jayant Avva; Kathy M. Bonness; Agnieszka K. Witkiewicz; James S. Malter; Chloe E. Atreya; Robert S. Warren; Lani F. Wu; Steven J. Altschuler

    2017-01-01

    Advances in single-cell technologies have highlighted the prevalence and biological significance of cellular heterogeneity. A critical question is how to design experiments that faithfully capture the true range of heterogeneity from samples of cellular populations. Here, we develop a data-driven approach, illustrated in the context of image data, that estimates the sampling depth required for prospective investigations of single-cell heterogeneity from an existing collection of samples. ...

  5. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

    Sample size and computational uncertainty were varied in order to investigate the sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate an LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties for 10 n-sample replicates was adopted as the convergence criterion for the method. An uncertainty of 75 pcm on the reactor k-eff was estimated by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found, for the example under investigation, that it is preferable to double the sample size rather than to double the number of particles followed by the Monte Carlo process in the MCNPX code. (author)
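
    The sampling-based propagation itself can be sketched in a few lines: draw random inputs, push them through a model, and report the output spread. The linear response function here is a stand-in for illustration, not the MCNPX model, and the numbers are invented.

```python
import math
import random

random.seed(7)

def propagate(n, model, mu, sigma):
    """Propagate a 1-sigma input uncertainty through `model`
    with n random samples; return the output standard deviation."""
    out = [model(random.gauss(mu, sigma)) for _ in range(n)]
    m = sum(out) / n
    return math.sqrt(sum((y - m) ** 2 for y in out) / (n - 1))

# Hypothetical linear response: output sd should approach 2 * 0.05 = 0.1
model = lambda r: 2.0 * r
sd = propagate(2000, model, mu=1.0, sigma=0.05)
```

    The spread of this estimate shrinks roughly as 1/sqrt(2n), which is why, for a fixed budget, enlarging the sample can beat refining each individual evaluation.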

  6. LC-MS analysis of the plasma metabolome–a novel sample preparation strategy

    DEFF Research Database (Denmark)

    Skov, Kasper; Hadrup, Niels; Smedsgaard, Jørn

    2015-01-01

    Blood plasma is a well-known body fluid often analyzed in studies on the effects of toxic compounds, as physiological or chemically induced changes in the mammalian body are reflected in the plasma metabolome. Sample preparation prior to LC-MS based analysis of the plasma metabolome is a challenge...... as plasma contains compounds with very different properties. Besides proteins, which usually are precipitated with organic solvent, phospholipids are known to cause ion suppression in electrospray mass spectrometry. We have compared two different sample preparation techniques prior to LC-qTOF analysis...... of plasma samples: the first is protein precipitation; the second is protein precipitation followed by solid phase extraction with sub-fractionation into three sub-samples: a phospholipid, a lipid and a polar sub-fraction. Molecular feature extraction of the data files from LC-qTOF analysis of the samples...

  7. Development of improved space sampling strategies for ocean chemical properties: Total carbon dioxide and dissolved nitrate

    Science.gov (United States)

    Goyet, Catherine; Davis, Daniel; Peltzer, Edward T.; Brewer, Peter G.

    1995-01-01

    Large-scale ocean observing programs such as the Joint Global Ocean Flux Study (JGOFS) and the World Ocean Circulation Experiment (WOCE) must today face the problem of designing an adequate sampling strategy. For ocean chemical variables, the goals and observing technologies are quite different from those for ocean physical variables (temperature, salinity, pressure). We have recently acquired data on ocean CO2 properties on WOCE cruises P16c and P17c that are sufficiently dense to test for sampling redundancy. We use linear and quadratic interpolation methods on the sampled field to investigate the minimum number of samples required to define the deep ocean total inorganic carbon (TCO2) field within the limits of experimental accuracy (+/- 4 micromol/kg). Within the limits of current measurements, these lines were oversampled in the deep ocean. Should the precision of the measurement be improved, a denser sampling pattern may be desirable in the future. This approach rationalizes the efficient use of resources for field work and for estimating the gridded TCO2 fields needed to constrain geochemical models.
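The redundancy test described above can be sketched with linear interpolation: withhold every other sample, reconstruct it from its neighbours, and check the worst error against the ±4 micromol/kg accuracy limit. The depth profile below is synthetic (a smooth, nearly linear TCO2 trend assumed for illustration), not the WOCE cruise data.

```python
def max_interpolation_error(depths, values):
    """Drop every other interior sample, reconstruct it by linear
    interpolation from its retained neighbours, and return the largest
    absolute reconstruction error."""
    worst = 0.0
    for i in range(1, len(depths) - 1, 2):
        # linear interpolation between the two retained neighbours
        t = (depths[i] - depths[i - 1]) / (depths[i + 1] - depths[i - 1])
        predicted = values[i - 1] + t * (values[i + 1] - values[i - 1])
        worst = max(worst, abs(predicted - values[i]))
    return worst

# synthetic deep-ocean TCO2 profile (micromol/kg), smooth in depth (m)
depths = [1000 + 250 * k for k in range(9)]
tco2 = [2300 + 0.01 * (d - 1000) + 1e-6 * (d - 1000) ** 2 for d in depths]

err = max_interpolation_error(depths, tco2)
oversampled = err < 4.0  # within the stated +/-4 micromol/kg accuracy
```

If the worst reconstruction error stays inside the measurement accuracy, the withheld samples carried no extra information, which is the sense in which a line is "oversampled".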

  8. Finding needles in a haystack: a methodology for identifying and sampling community-based youth smoking cessation programs.

    Science.gov (United States)

    Emery, Sherry; Lee, Jungwha; Curry, Susan J; Johnson, Tim; Sporer, Amy K; Mermelstein, Robin; Flay, Brian; Warnecke, Richard

    2010-02-01

    Surveys of community-based programs are difficult to conduct when there is virtually no information about the number or locations of the programs of interest. This article describes the methodology used by the Helping Young Smokers Quit (HYSQ) initiative to identify and profile community-based youth smoking cessation programs in the absence of a defined sample frame. We developed a two-stage sampling design, with counties as the first-stage probability sampling units. The second stage used snowball sampling to saturation, to identify individuals who administered youth smoking cessation programs across three economic sectors in each county. Multivariate analyses modeled the relationship between program screening, eligibility, and response rates and economic sector and stratification criteria. Cumulative logit models analyzed the relationship between the number of contacts in a county and the number of programs screened, eligible, or profiled in a county. The snowball process yielded 9,983 unique and traceable contacts. Urban and high-income counties yielded significantly more screened program administrators; urban counties produced significantly more eligible programs, but there was no significant association between the county characteristics and program response rate. There is a positive relationship between the number of informants initially located and the number of programs screened, eligible, and profiled in a county. Our strategy to identify youth tobacco cessation programs could be used to create a sample frame for other nonprofit organizations that are difficult to identify due to a lack of existing directories, lists, or other traditional sample frames.

  9. Intelligent sampling for the measurement of structured surfaces

    International Nuclear Information System (INIS)

    Wang, J; Jiang, X; Blunt, L A; Scott, P J; Leach, R K

    2012-01-01

    Uniform sampling in metrology has known drawbacks such as coherent spectral aliasing and a lack of efficiency in terms of measuring time and data storage. The requirement for intelligent sampling strategies has been outlined over recent years, particularly where the measurement of structured surfaces is concerned. Most of the present research on intelligent sampling has focused on dimensional metrology using coordinate-measuring machines with little reported on the area of surface metrology. In the research reported here, potential intelligent sampling strategies for surface topography measurement of structured surfaces are investigated by using numerical simulation and experimental verification. The methods include the jittered uniform method, low-discrepancy pattern sampling and several adaptive methods which originate from computer graphics, coordinate metrology and previous research by the authors. By combining the use of advanced reconstruction methods and feature-based characterization techniques, the measurement performance of the sampling methods is studied using case studies. The advantages, stability and feasibility of these techniques for practical measurements are discussed. (paper)
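Of the intelligent sampling strategies listed in this record, the jittered uniform method is the simplest to illustrate: one random sample within each regular cell, which breaks the coherent spectral aliasing of a strict grid while keeping near-uniform coverage. A minimal 1-D sketch (function name and parameters are ours, not from the paper):

```python
import random

def jittered_uniform(n_points, length, seed=0):
    """Jittered uniform sampling: draw exactly one random point inside
    each of n equal-width cells covering [0, length)."""
    rng = random.Random(seed)
    cell = length / n_points
    return [i * cell + rng.uniform(0.0, cell) for i in range(n_points)]

# 16 samples over a unit-length profile; one sample per cell [i/16, (i+1)/16]
pts = jittered_uniform(16, 1.0)
```

The same idea extends to 2-D surface topography by jittering within grid squares instead of intervals.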

  10. Identifying Multiple Levels of Discussion-Based Teaching Strategies for Constructing Scientific Models

    Science.gov (United States)

    Williams, Grant; Clement, John

    2015-01-01

    This study sought to identify specific types of discussion-based strategies that two successful high school physics teachers using a model-based approach utilized in attempting to foster students' construction of explanatory models for scientific concepts. We found evidence that, in addition to previously documented dialogical strategies that…

  11. A novel PMT test system based on waveform sampling

    Science.gov (United States)

    Yin, S.; Ma, L.; Ning, Z.; Qian, S.; Wang, Y.; Jiang, X.; Wang, Z.; Yu, B.; Gao, F.; Zhu, Y.; Wang, Z.

    2018-01-01

    Compared with a traditional test system based on a QDC, a TDC and a scaler, a test system based on waveform sampling is constructed for signal sampling of the 8" R5912 and the 20" R12860 Hamamatsu PMTs at different light levels, from single to multiple photoelectrons. In order to achieve high throughput and to reduce the dead time in data processing, the data acquisition software based on LabVIEW is developed and runs with a parallel mechanism. The analysis algorithm is realized in LabVIEW and the spectra of charge, amplitude, signal width and rising time are analyzed offline. The results from the charge-to-digital converter, the time-to-digital converter and waveform sampling are compared and discussed in detail.

  12. Development and pilot study of a marketing strategy for primary care/internet-based depression prevention intervention for adolescents (the CATCH-IT intervention).

    Science.gov (United States)

    Van Voorhees, Benjamin W; Watson, Natalie; Bridges, John F P; Fogel, Joshua; Galas, Jill; Kramer, Clarke; Connery, Marc; McGill, Ann; Marko, Monika; Cardenas, Alonso; Landsback, Josephine; Dmochowska, Karoline; Kuwabara, Sachiko A; Ellis, Justin; Prochaska, Micah; Bell, Carl

    2010-01-01

    Adolescent depression is both common and burdensome, and while evidence-based strategies have been developed to prevent adolescent depression, participation in such interventions remains extremely low, with less than 3% of at-risk individuals participating. To promote participation in evidence-based preventive strategies, a rigorous marketing strategy is needed to translate research into practice. To develop and pilot a rigorous marketing strategy for engaging at-risk individuals with an Internet-based depression prevention intervention in primary care targeting key attitudes and beliefs. A marketing design group was constituted to develop a marketing strategy based on the principles of targeting, positioning/competitor analysis, decision analysis, and promotion/distribution and incorporating contemporary models of behavior change. We evaluated the formative quality of the intervention and observed the fielding experience for prevention using a pilot study (observational) design. The marketing plan focused on "resiliency building" rather than "depression intervention" and was relayed by office staff and the Internet site. Twelve practices successfully implemented the intervention and recruited a diverse sample of adolescents with > 30% of all those with positive screens and > 80% of those eligible after phone assessment enrolling in the study with a cost of $58 per enrollee. Adolescent motivation for depression prevention (1-10 scale) increased from a baseline mean value of 7.45 (SD = 2.05) to 8.07 poststudy (SD = 1.33) (P = .048). Marketing strategies for preventive interventions for mental disorders can be developed and successfully introduced and marketed in primary care.

  13. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked over classical navigation algorithms, like wall following and obstacle avoidance, in scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution has a higher efficiency in communication resource usage than the classical discrete-time strategy with the same accuracy.
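The efficiency gain of event-based over discrete-time control comes from transmitting only when the state has changed meaningfully. A common trigger rule of this kind is send-on-delta, sketched below; the threshold value and the heading trace are invented for illustration, and this is a generic rule, not necessarily the exact trigger used in the paper.

```python
def send_on_delta(states, threshold):
    """Event-based triggering: transmit a new measurement over the RF
    link only when it deviates from the last transmitted value by more
    than the threshold; otherwise the controller reuses the old value."""
    sent = []
    last = None
    for x in states:
        if last is None or abs(x - last) > threshold:
            sent.append(x)
            last = x
    return sent

# a slowly drifting robot heading (rad): 12 periodic samples, few events
heading = [0.0, 0.01, 0.02, 0.5, 0.51, 0.52, 1.1, 1.12, 1.13, 1.13, 1.14, 1.8]
events = send_on_delta(heading, threshold=0.2)
# only 4 of the 12 samples trigger a transmission
```

With the same controller accuracy class, the periodic scheme would send all 12 samples, which is the communication saving the record reports.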

  14. Domain-based Teaching Strategy for Intelligent Tutoring System Based on Generic Rules

    Science.gov (United States)

    Kseibat, Dawod; Mansour, Ali; Adjei, Osei; Phillips, Paul

    In this paper we present a framework for selecting the proper instructional strategy for a given teaching material based on its attributes. The new approach is based on a flexible design by means of generic rules. The framework was adapted in an Intelligent Tutoring System to teach Modern Standard Arabic to adult English-speaking learners with no prior knowledge of Arabic.

  15. Integrative review of implementation strategies for translation of research-based evidence by nurses.

    Science.gov (United States)

    Wuchner, Staci S

    2014-01-01

    The purpose of this review was to synthesize and critique experimental and/or quasi-experimental research that has evaluated implementation strategies for translation of research-based evidence into nursing practice. Successfully implementing evidence-based research can improve patient outcomes. Identifying successful implementation strategies is imperative to move research-based evidence into practice. As implementation science gains popularity, it is essential to understand the strategies that most effectively translate research-based evidence into practice. The review used the CINAHL and MEDLINE (Ovid) databases. Articles were included if they were experimental and/or quasi-experimental research designs, were written in English, and measured nursing compliance with translation of research-based evidence. An independent review was performed to select and critique the included articles. A wide array of interventions was evaluated, including visual cues, audit and feedback, educational meetings and materials, reminders, outreach, and leadership involvement. Because of the complex multimodal nature of the interventions and the variety of research topics, comparison across interventions was difficult. Many difficulties exist in determining which implementation strategies are most effective for translation of research-based evidence into practice by nurses. With these limited findings, further research is warranted to determine which implementation strategies most successfully translate research-based evidence into practice.

  16. Engineering-Based Problem Solving Strategies in AP Calculus: An Investigation into High School Student Performance on Related Rate Free-Response Problems

    Science.gov (United States)

    Thieken, John

    2012-01-01

    A sample of 127 high school Advanced Placement (AP) Calculus students from two schools was utilized to study the effects of an engineering design-based problem solving strategy on student performance with AP style Related Rate questions and changes in conceptions, beliefs, and influences. The research design followed a treatment-control multiple…

  17. A nested-PCR strategy for molecular diagnosis of mollicutes in uncultured biological samples from cows with vulvovaginitis.

    Science.gov (United States)

    Voltarelli, Daniele Cristina; de Alcântara, Brígida Kussumoto; Lunardi, Michele; Alfieri, Alice Fernandes; de Arruda Leme, Raquel; Alfieri, Amauri Alcindo

    2018-01-01

    Bacteria classified in the Mycoplasma (M. bovis and M. bovigenitalium) and Ureaplasma (U. diversum) genera are associated with granular vulvovaginitis, which affects heifers and cows of reproductive age. The traditional means for detection and speciation of mollicutes from clinical samples have been culture and serology. However, challenges experienced with these laboratory methods have hampered assessment of their impact in pathogenesis and epidemiology in cattle worldwide. The aim of this study was to develop a PCR strategy to detect and primarily discriminate between the main species of mollicutes associated with reproductive disorders of cattle in uncultured clinical samples. In order to amplify the 16S-23S rRNA internal transcribed spacer region of the genome, a consensual and species-specific nested-PCR assay was developed to identify and discriminate between the main species of mollicutes. In addition, 31 vaginal swab samples from affected dairy and beef cows were investigated. This nested-PCR strategy was successfully employed in the diagnosis of single and mixed mollicute infections of diseased cows from cattle herds in Brazil. The developed system enabled the rapid and unambiguous identification of the main mollicute species known to be associated with this cattle reproductive disorder through differential amplification of partial fragments of the ITS region of mollicute genomes. The development of rapid and sensitive tools for mollicute detection and discrimination without the need for previous cultures or sequencing of PCR products is a high priority for accurate diagnosis in animal health. Therefore, the PCR strategy described herein may be helpful for diagnosis of this class of bacteria in genital swabs submitted to veterinary diagnostic laboratories, without demanding expertise in mycoplasma culture and identification. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Adaptive Rate Sampling and Filtering Based on Level Crossing Sampling

    Directory of Open Access Journals (Sweden)

    Saeed Mian Qaisar

    2009-01-01

    The recent sophistications in the areas of mobile systems and sensor networks demand more and more processing resources. In order to maintain system autonomy, energy saving is becoming one of the most difficult industrial challenges in mobile computing. Most efforts to achieve this goal are focused on improving embedded systems design and battery technology, but very few studies target exploiting the input signal's time-varying nature. This paper aims to achieve power efficiency by intelligently adapting the processing activity to the input signal's local characteristics. It is done by completely rethinking the processing chain, adopting a non-conventional sampling scheme and adaptive rate filtering. The proposed approach, based on the LCSS (Level Crossing Sampling Scheme), presents two filtering techniques able to adapt their sampling rate and filter order by online analysis of the input signal variations. Indeed, the principle is to intelligently exploit the signal's local characteristics, which are usually never considered, to filter only the relevant signal parts by employing the relevant order filters. This idea leads towards a drastic gain in computational efficiency, and hence in processing power, when compared to classical techniques.
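The level-crossing principle above is easy to sketch: instead of sampling on a clock, a sample is captured whenever the signal crosses one of a set of predefined amplitude levels, so an idle signal generates no processing activity at all. The signal, levels, and function below are our own toy construction, not the paper's implementation.

```python
import math

def level_crossing_samples(signal, times, levels):
    """Level-crossing sampling (LCSS): capture a sample whenever the
    signal crosses one of the predefined amplitude levels, so sampling
    activity follows the signal's local variations."""
    samples = []
    for i in range(1, len(signal)):
        lo, hi = sorted((signal[i - 1], signal[i]))
        # a level is crossed if it lies between consecutive values
        if any(lo < lv <= hi for lv in levels):
            samples.append((times[i], signal[i]))
    return samples

times = [k * 0.01 for k in range(200)]
# idle first half, then one full sine period in the second half
signal = [0.0] * 100 + [math.sin(2 * math.pi * (t - 1.0)) for t in times[100:]]
samples = level_crossing_samples(signal, times, levels=[-0.5, 0.5])
# all captured samples fall in the active half; the idle half yields none
```

With two levels at ±0.5, the idle half produces zero samples while the sine period produces four crossings, which is the activity-dependent behaviour that saves processing power.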

  19. Best control strategy for unified power quality conditioner (UPQC) based on simulation

    Energy Technology Data Exchange (ETDEWEB)

    Shayanfar, H.A. [Iran Univ. of Science and Technology, Tehran (Iran, Islamic Republic of). Dept. of Electrical Engineering; Tabatabaei, N.M. [Azarbaijan Univ. of Tarbiat Moallem, Tabriz (Iran, Islamic Republic of). Dept. of Electrical Engineering; Mokhtarpour, A. [Islamic Azad Univ., Tabriz (Iran, Islamic Republic of). Dept. of Electrical Engineering

    2007-07-01

    Electronic devices used in both industry and residences need high-quality energy to work properly. Unified Power Quality Conditioners (UPQC) solve any power quality problems faced by these devices. Three new control strategies for UPQCs were presented and their operation was investigated and compared using the MATLAB Simulink simulation software package. A UPQC consists of a Shunt-Active Filter and a Series Active Filter with a common direct current link to compensate for any source currents and delivered voltage to the load. As such, it isolates the utility from current quality problems associated with load. It also isolates the load from the voltage quality problems of the utility. In the first control strategy, the Parallel Active Filter (PAF) and Series Active Filter (SAF) are based on the Fourier transform theory. In the second control strategy, the Parallel Active Filter is based on the power quality theory and the Series Active Filter is based on the Fourier transform theory. In the third control strategy, the Parallel Active Filter is based on Fourier transform theory and the Series Active Filter is based on positive sequence detection. Operating the PAF using these methods compensates for reactive power and current harmonics, while operating the SAF compensates for imbalances, voltage harmonics and positive and zero sequences of utility voltages. MATLAB simulation software was used to explain the compensation resolution and speed of the 3 new control strategies. According to simulation test results, it was concluded that the best compensation speed and resolution can be obtained using the third control strategy. 7 refs., 2 tabs., 24 figs.

  20. Nature-based strategies for improving urban health and safety

    Science.gov (United States)

    Michelle C. Kondo; Eugenia C. South; Charles C. Branas

    2015-01-01

    Place-based programs are being noticed as key opportunities to prevent disease and promote public health and safety for populations at-large. As one key type of place-based intervention, nature-based and green space strategies can play an especially large role in improving health and safety for dwellers in urban environments such as US legacy cities that lack nature...

  1. Prevalence of behavior changing strategies in fitness video games: theory-based content analysis.

    Science.gov (United States)

    Lyons, Elizabeth Jane; Hatkevich, Claire

    2013-05-07

    Fitness video games are popular, but little is known about their content. Because many contain interactive tools that mimic behavioral strategies from weight loss intervention programs, it is possible that differences in content could affect player physical activity and/or weight outcomes. There is a need for a better understanding of what behavioral strategies are currently available in fitness games and how they are implemented. The purpose of this study was to investigate the prevalence of evidence-based behavioral strategies across fitness video games available for home use. Games available for consoles that used camera-based controllers were also contrasted with games available for a console that used handheld motion controllers. Fitness games (N=18) available for three home consoles were systematically identified and play-tested by 2 trained coders for at least 3 hours each. In cases of multiple games from one series, only the most recently released game was included. The Sony PlayStation 3 and Microsoft Xbox360 were the two camera-based consoles, and the Nintendo Wii was the handheld motion controller console. A coding list based on a taxonomy of behavioral strategies was used to begin coding. Codes were refined in an iterative process based on data found during play-testing. The most prevalent behavioral strategies were modeling (17/18), specific performance feedback (17/18), reinforcement (16/18), caloric expenditure feedback (15/18), and guided practice (15/18). All games included some kind of feedback on performance accuracy, exercise frequency, and/or fitness progress. Action planning (scheduling future workouts) was the least prevalent of the included strategies (4/18). Twelve games included some kind of social integration, with nine of them providing options for real-time multiplayer sessions. Only two games did not feature any kind of reward. Games for the camera-based consoles (mean 12.89, SD 2.71) included a greater number of strategies than those...

  2. Draft evaluation of the frequency for gas sampling for the high burnup confirmatory data project

    Energy Technology Data Exchange (ETDEWEB)

    Stockman, Christine T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Alsaed, Halim A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bryan, Charles R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-26

    This report fulfills the M3 milestone M3FT-15SN0802041, “Draft Evaluation of the Frequency for Gas Sampling for the High Burn-up Storage Demonstration Project” under Work Package FT-15SN080204, “ST Field Demonstration Support – SNL”. This report provides a technically based gas sampling frequency strategy for the High Burnup (HBU) Confirmatory Data Project. The evaluation of: 1) the types and magnitudes of gases that could be present in the project cask and, 2) the degradation mechanisms that could change gas compositions culminates in an adaptive gas sampling frequency strategy. This adaptive strategy is compared against the sampling frequency that has been developed based on operational considerations. Gas sampling will provide information on the presence of residual water (and byproducts associated with its reactions and decomposition) and breach of cladding, which could inform the decision of when to open the project cask.

  3. Efficient learning strategy of Chinese characters based on network approach.

    Directory of Open Access Journals (Sweden)

    Xiaoyong Yan

    We develop an efficient learning strategy for Chinese characters based on the network of hierarchical structural relations between Chinese characters. A more efficient strategy is one that learns the same number of useful Chinese characters with less effort or time. We construct a node-weighted network of Chinese characters, where character usage frequencies are used as node weights. Using this hierarchical node-weighted network, we propose a new learning method, the distributed node weight (DNW) strategy, which is based on a new measure of node importance that considers both the weight of a node and its location in the network's hierarchical structure. Chinese character learning strategies, particularly their learning order, are analyzed as dynamical processes over the network. We compare the efficiency of three theoretical learning methods and two commonly used methods from mainstream Chinese textbooks, one for Chinese elementary school students and the other for students learning Chinese as a second language. We find that the DNW method significantly outperforms the others, implying that the efficiency of the current learning methods of major textbooks can be greatly improved.
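The core idea (rank characters by a score that combines their own usage frequency with their position in the component hierarchy) can be illustrated with a toy network. The five characters, their frequencies, and the additive scoring rule below are our own simplification for illustration; they are not the paper's DNW formula or data.

```python
# toy hierarchy: each character lists its structural components
components = {
    "木": [], "林": ["木"], "森": ["木"],
    "口": [], "回": ["口"],
}
# invented usage frequencies acting as node weights
freq = {"木": 50, "林": 30, "森": 5, "口": 80, "回": 20}

def distributed_weight(ch):
    """Simplified importance score: a node's own frequency plus the
    frequency of every character built from it, so weight is
    distributed down to structural components."""
    return freq[ch] + sum(f for c, f in freq.items() if ch in components[c])

# learn high-importance characters first; components naturally rank
# above the rarer compounds that depend on them
order = sorted(freq, key=distributed_weight, reverse=True)
```

On this toy network, 口 and 木 rank first because the compounds 回, 林 and 森 pass their weight down to them, so learners meet components before the characters built from them.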

  4. Reproductive Strategy and Sexual Conflict: Slow Life History Strategy Inhibits Negative Androcentrism

    Directory of Open Access Journals (Sweden)

    Paul R. Gladden

    2013-11-01

    Recent findings indicate that a slow Life History (LH) strategy factor is associated with increased levels of Executive Functioning (EF), increased emotional intelligence, decreased levels of sexually coercive behaviors, and decreased levels of negative ethnocentrism. Based on these findings, as well as generative theory, we predicted that slow LH strategy should inhibit negative androcentrism (bias against women). A sample of undergraduates responded to a battery of questionnaires measuring various facets of their LH strategy (e.g., sociosexual orientation, mating effort, mate value, psychopathy, executive functioning, and emotional intelligence) and various convergent measures of negative androcentrism. A structural model that the data fit well indicated that a latent protective LH strategy trait predicted decreased negative androcentrism. This trait fully mediated the relationship between participant biological sex and androcentrism. We suggest that slow LH strategy may inhibit negative attitudes toward women because of relatively decreased intrasexual competition and intersexual conflict among slow LH strategists. DOI: 10.2458/azu_jmmss.v4i1.17774

  5. Cost-effectiveness and harm-benefit analyses of risk-based screening strategies for breast cancer.

    Directory of Open Access Journals (Sweden)

    Ester Vilaprinyo

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: 1) to perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and 2) to estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained by combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies.
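The incremental cost-effectiveness comparison used to select optimal strategies can be sketched as follows: order strategies by cost, drop dominated ones (costlier with no effectiveness gain), and compute each survivor's ICER against the previous survivor. The three strategies and all their cost/QALY numbers below are invented for illustration, not values from the study.

```python
def icer_frontier(strategies):
    """Incremental cost-effectiveness along the efficient frontier:
    order strategies by cost, drop dominated ones, and compute each
    survivor's ICER relative to the previous non-dominated strategy."""
    ordered = sorted(strategies, key=lambda s: s["cost"])
    frontier = [ordered[0]]
    for s in ordered[1:]:
        prev = frontier[-1]
        if s["qaly"] > prev["qaly"]:  # dominated strategies are dropped
            s = dict(s, icer=(s["cost"] - prev["cost"])
                             / (s["qaly"] - prev["qaly"]))
            frontier.append(s)
    return frontier

# illustrative per-woman costs and QALYs (hypothetical numbers)
strategies = [
    {"name": "uniform biennial", "cost": 100.0, "qaly": 10.0},
    {"name": "risk-based",       "cost": 95.0,  "qaly": 10.5},
    {"name": "annual for all",   "cost": 150.0, "qaly": 10.8},
]
frontier = icer_frontier(strategies)
```

In this toy example the uniform strategy is dominated (more costly, fewer QALYs than the risk-based one), mirroring the record's finding that uniform screening yields a much lower benefit for a given cost.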

  6. High-resolution monitoring of marine protists based on an observation strategy integrating automated on-board filtration and molecular analyses

    Science.gov (United States)

    Metfies, Katja; Schroeder, Friedhelm; Hessel, Johanna; Wollschläger, Jochen; Micheller, Sebastian; Wolf, Christian; Kilias, Estelle; Sprong, Pim; Neuhaus, Stefan; Frickenhaus, Stephan; Petersen, Wilhelm

    2016-11-01

    Information on recent biomass distribution and biogeography of photosynthetic marine protists with adequate temporal and spatial resolution is urgently needed to better understand the consequences of environmental change for marine ecosystems. Here we introduce and review a molecular-based observation strategy for high-resolution assessment of these protists in space and time. It is the result of extensive technology developments, adaptations and evaluations which are documented in a number of different publications, and the results of the recently completed field testing which are introduced in this paper. The observation strategy is organized at four different levels. At level 1, samples are collected at high spatiotemporal resolution using the remotely controlled automated filtration system AUTOFIM. Resulting samples can either be preserved for later laboratory analyses, or directly subjected to molecular surveillance of key species aboard the ship via an automated biosensor system or quantitative polymerase chain reaction (level 2). Preserved samples are analyzed at the next observational levels in the laboratory (levels 3 and 4). At level 3 this involves molecular fingerprinting methods for a quick and reliable overview of differences in protist community composition. Finally, selected samples can be used to generate a detailed analysis of taxonomic protist composition via the latest next generation sequencing technology (NGS) at level 4. An overall integrated dataset of the results based on the different analyses provides comprehensive information on the diversity and biogeography of protists, including all related size classes. At the same time the cost of the observation is optimized with respect to analysis effort and time.

  7. Control charts for location based on different sampling schemes

    NARCIS (Netherlands)

    Mehmood, R.; Riaz, M.; Does, R.J.M.M.

    2013-01-01

    Control charts are the most important statistical process control tool for monitoring variations in a process. A number of articles are available in the literature for the X̄ control chart based on simple random sampling, ranked set sampling, median-ranked set sampling (MRSS), extreme-ranked set
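The X̄ chart under simple random sampling that this record starts from can be sketched in a few lines: the centre line is the grand mean of the subgroup means, and the 3-sigma limits scale the within-subgroup spread by the square root of the subgroup size. This is a plain illustration with simulated data, omitting the c4 bias-correction factor used in practice.

```python
import math
import random
import statistics

def xbar_chart_limits(subgroups):
    """Shewhart X-bar chart from simple random subgroups: the centre
    line is the grand mean; the 3-sigma limits use the average
    within-subgroup standard deviation divided by sqrt(n)."""
    n = len(subgroups[0])
    means = [statistics.mean(g) for g in subgroups]
    center = statistics.mean(means)
    sigma = statistics.mean(statistics.stdev(g) for g in subgroups)
    half_width = 3.0 * sigma / math.sqrt(n)
    return center, center - half_width, center + half_width

# 20 in-control subgroups of size 4 from a process at mean 10, sd 0.5
rng = random.Random(1)
subgroups = [[rng.gauss(10.0, 0.5) for _ in range(4)] for _ in range(20)]
center, lcl, ucl = xbar_chart_limits(subgroups)

# a subgroup from a shifted process falls outside the control limits
shifted_mean = statistics.mean([12.0, 12.1, 11.9, 12.2])
out_of_control = shifted_mean > ucl
```

Ranked-set and median-ranked-set schemes change only how the subgroup observations are selected; the limit construction stays the same, which is why the sampling scheme can sharpen the chart without altering its form.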

  8. Individual and pen-based oral fluid sampling: A welfare-friendly sampling method for group-housed gestating sows.

    Science.gov (United States)

    Pol, Françoise; Dorenlor, Virginie; Eono, Florent; Eudier, Solveig; Eveno, Eric; Liégard-Vanhecke, Dorine; Rose, Nicolas; Fablet, Christelle

    2017-11-01

    The aims of this study were to assess the feasibility of individual and pen-based oral fluid sampling (OFS) in 35 pig herds with group-housed sows, compare these methods to blood sampling, and assess the factors influencing the success of sampling. Individual samples were collected from at least 30 sows per herd. Pen-based OFS was performed using devices placed in at least three pens for 45min. Information related to the farm, the sows, and their living conditions were collected. Factors significantly associated with the duration of sampling and the chewing behaviour of sows were identified by logistic regression. Individual OFS took 2min 42s on average; the type of floor, swab size, and operator were associated with a sampling time >2min. Pen-based OFS was obtained from 112 devices (62.2%). The type of floor, parity, pen-level activity, and type of feeding were associated with chewing behaviour. Pen activity was associated with the latency to interact with the device. The type of floor, gestation stage, parity, group size, and latency to interact with the device were associated with a chewing time >10min. After 15, 30 and 45min of pen-based OFS, 48%, 60% and 65% of the sows were lying down, respectively. The time spent after the beginning of sampling, genetic type, and time elapsed since the last meal were associated with 50% of the sows lying down at one time point. The mean time to blood sample the sows was 1min 16s and 2min 52s if the number of operators required was considered in the sampling time estimation. The genetic type, parity, and type of floor were significantly associated with a sampling time higher than 1min 30s. This study shows that individual OFS is easy to perform in group-housed sows by a single operator, even though straw-bedded animals take longer to sample than animals housed on slatted floors, and suggests some guidelines to optimise pen-based OFS success. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Optimization of Decision-Making for Spatial Sampling in the North China Plain, Based on Remote-Sensing a Priori Knowledge

    Science.gov (United States)

    Feng, J.; Bai, L.; Liu, S.; Su, X.; Hu, H.

    2012-07-01

In this paper, MODIS remote sensing data (low-cost, timely, and of moderate/low spatial resolution) over the North China Plain (NCP) study region were first used in a mixed-pixel spectral decomposition to extract a useful regionalized indicator parameter (RIP), namely the fraction of winter wheat planting area in each pixel, as the regionalized indicator variable (RIV) of spatial sampling, from the initially selected indicators. The RIV values were then analysed spatially, yielding the spatial structure characteristics (i.e., spatial correlation and variation) of the NCP, which were further processed to obtain scale-fitting, valid a priori knowledge for spatial sampling. Subsequently, building on the idea of rationally integrating probability-based and model-based sampling techniques and effectively utilizing the obtained a priori knowledge, spatial sampling models and design schemes, together with their optimization and optimal selection, were developed; these provide a scientific basis for improving and optimizing the existing spatial sampling schemes of large-scale cropland remote sensing monitoring. Additionally, through an adaptive analysis and decision strategy, the optimal local spatial prediction and the gridded system of extrapolation results implemented an adaptive reporting pattern of spatial sampling in accordance with the report-covering units, in order to satisfy the actual needs of sampling surveys.
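The mixed-pixel decomposition step described above can be illustrated with a minimal linear spectral unmixing sketch. The endmember spectra, band values, and sum-to-one weighting below are all invented for illustration; operational MODIS unmixing would use calibrated endmembers and a properly constrained solver.

```python
import numpy as np

# Hypothetical endmember spectra: rows are spectral bands, columns are
# endmembers (e.g. winter wheat, bare soil, water). Values are made up.
E = np.array([
    [0.05, 0.25, 0.02],
    [0.08, 0.30, 0.03],
    [0.45, 0.35, 0.02],
    [0.50, 0.40, 0.01],
])

def unmix(pixel, endmembers):
    """Estimate per-pixel endmember fractions by least squares, with the
    sum-to-one constraint appended as a heavily weighted extra row."""
    w = 1e3  # weight enforcing fractions summing to 1
    A = np.vstack([endmembers, w * np.ones(endmembers.shape[1])])
    b = np.append(pixel, w)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(f, 0.0, 1.0)  # crude nonnegativity fix

# Synthetic mixed pixel: 60% wheat, 40% soil.
true_f = np.array([0.6, 0.4, 0.0])
pixel = E @ true_f
fractions = unmix(pixel, E)
```

For a consistent synthetic pixel the least-squares solution recovers the generating fractions; real imagery would add noise and require nonnegativity handled properly rather than by clipping.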

  10. Citizen-based Strategies to Improve Community Security: Working ...

    International Development Research Centre (IDRC) Digital Library (Canada)

Citizen-based Strategies to Improve Community Security: Working with Vulnerable Populations to Address Urban Violence in Medellin.

  11. Evaluating sampling strategy for DNA barcoding study of coastal and inland halo-tolerant Poaceae and Chenopodiaceae: A case study for increased sample size.

    Directory of Open Access Journals (Sweden)

    Peng-Cheng Yao

Full Text Available Environmental conditions in coastal salt marsh habitats have led to the development of specialist genetic adaptations. We evaluated six DNA barcode loci of the 53 species of Poaceae and 15 species of Chenopodiaceae from China's coastal salt marsh area and inland area. Our results indicate that the optimum DNA barcode was ITS for coastal salt-tolerant Poaceae and matK for the Chenopodiaceae. Sampling strategies for ten common species of Poaceae and Chenopodiaceae were analyzed according to the optimum barcode. We found that by increasing the number of samples collected from the coastal salt marsh area on the basis of inland samples, the number of haplotypes of Arundinella hirta, Digitaria ciliaris, Eleusine indica, Imperata cylindrica, Setaria viridis, and Chenopodium glaucum increased, with a principal coordinate plot clearly showing increased distribution points. The results of a Mann-Whitney test showed that for Digitaria ciliaris, Eleusine indica, Imperata cylindrica, and Setaria viridis, the distribution of intraspecific genetic distances was significantly different when samples from the coastal salt marsh area were included (P < 0.01). These results suggest that increasing the sample size in specialist habitats can improve measurements of intraspecific genetic diversity, and will have a positive effect on the application of DNA barcodes in widely distributed species. The results of random sampling showed that when the sample size reached 11 for Chloris virgata, Chenopodium glaucum, and Dysphania ambrosioides, 13 for Setaria viridis, and 15 for Eleusine indica, Imperata cylindrica and Chenopodium album, the average intraspecific distance tended to reach stability. These results indicate that the sample size for DNA barcoding of widely distributed species should be increased to 11-15.
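The sample-size stabilisation analysis above amounts to resampling: draw random subsets of increasing size and watch the spread of the mean intraspecific distance shrink. A toy sketch with synthetic stand-in distances (not real barcode data) is:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "genetic distances": pairwise distances among 20 synthetic points.
pts = rng.normal(size=(20, 3))
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)

def resample_distance(D, k, n_rep=500):
    """Mean and spread of the average pairwise distance over random
    subsamples of k individuals."""
    means = []
    for _ in range(n_rep):
        idx = rng.choice(len(D), size=k, replace=False)
        sub = D[np.ix_(idx, idx)]
        means.append(sub[np.triu_indices(k, 1)].mean())
    return float(np.mean(means)), float(np.std(means))

m5, s5 = resample_distance(D, 5)
m15, s15 = resample_distance(D, 15)  # spread shrinks as k grows
```

The estimate "stabilises" in the sense that the between-subsample spread decreases with subsample size, which is the behaviour the study reports around 11-15 individuals.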

  12. An evaluation of oligonucleotide-based therapeutic strategies for polyQ diseases

    Directory of Open Access Journals (Sweden)

    Fiszer Agnieszka

    2012-03-01

Full Text Available Abstract Background RNA interference (RNAi) and antisense strategies provide experimental therapeutic agents for numerous diseases, including polyglutamine (polyQ) disorders caused by CAG repeat expansion. We compared the potential of different oligonucleotide-based strategies for silencing the genes responsible for several polyQ diseases, including Huntington's disease and two spinocerebellar ataxias, type 1 and type 3. The strategies included nonallele-selective gene silencing, gene replacement, allele-selective SNP targeting and CAG repeat targeting. Results Using patient-derived cell culture models of polyQ diseases, we tested various siRNA and antisense reagents and assessed their silencing efficiency and allele selectivity. We showed considerable allele discrimination by several SNP-targeting siRNAs based on a weak G-G or G-U pairing with the normal allele and strong G-C pairing with the mutant allele at the site of RISC-induced cleavage. Among the CAG repeat targeting reagents, the strongest allele discrimination is achieved by miRNA-like functioning reagents that bind to their targets and inhibit their translation without substantial target cleavage. A morpholino analog also discriminates well between the mutant and normal alleles, but its efficient delivery to cells at low effective concentration remains a challenge. Conclusions Using three cellular models of polyQ diseases and the same experimental setup, we directly compared the performance of different oligonucleotide-based treatment strategies that are currently under development. Based on the results obtained by us and others, we discussed the advantages and drawbacks of these strategies, considering them from several different perspectives. The strategy aimed at nonallele-selective inhibition of causative gene expression by targeting a specific sequence of the implicated gene is the easiest to implement, but relevant benefits are still uncertain. The gene replacement strategy that

  13. Intelligent fault recognition strategy based on adaptive optimized multiple centers

    Science.gov (United States)

    Zheng, Bo; Li, Yan-Feng; Huang, Hong-Zhong

    2018-06-01

For recognition methods based on a single optimized center, one important issue is that data with a nonlinear separatrix cannot be recognized accurately. To solve this problem, a novel recognition strategy based on adaptively optimized multiple centers is proposed in this paper. The strategy recognizes data sets with a nonlinear separatrix by using multiple centers. Meanwhile, priority levels are introduced into the multi-objective optimization, covering recognition accuracy, the quantity of optimized centers, and the distance relationship. According to the characteristics of the data, the priority levels are adjusted to adapt the quantity of optimized centers while preserving the original accuracy. The proposed method is compared with other methods, including the support vector machine (SVM), neural network, and Bayesian classifier. The results demonstrate that the proposed strategy has the same or even better recognition ability on data with different distribution characteristics.
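The core idea, several centers per class so that nonlinearly separated data can still be recognized by nearest-center distance, can be sketched as follows. The k-means center placement and the XOR-style synthetic data are illustrative stand-ins, not the paper's actual adaptive optimization.

```python
import numpy as np

rng = np.random.default_rng(1)

def place_centers(X, k, iters=50):
    """Place k centers in one class with plain k-means (farthest-point init)."""
    centers = [X[rng.integers(len(X))]]
    while len(centers) < k:
        d = ((X[:, None] - np.array(centers)[None]) ** 2).sum(-1).min(1)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        lbl = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(lbl == j):
                centers[j] = X[lbl == j].mean(0)
    return centers

def predict(centers_by_class, X):
    """Assign each point to the class owning its nearest center."""
    classes = sorted(centers_by_class)
    dist = np.stack([((X[:, None] - centers_by_class[c][None]) ** 2)
                     .sum(-1).min(1) for c in classes], axis=1)
    return np.array(classes)[dist.argmin(1)]

# XOR-style layout: one center per class would misclassify heavily,
# two centers per class separate it cleanly.
X0 = np.vstack([rng.normal([0, 0], 0.3, (50, 2)), rng.normal([4, 4], 0.3, (50, 2))])
X1 = np.vstack([rng.normal([0, 4], 0.3, (50, 2)), rng.normal([4, 0], 0.3, (50, 2))])
X, y = np.vstack([X0, X1]), np.array([0] * 100 + [1] * 100)
model = {c: place_centers(X[y == c], k=2) for c in (0, 1)}
accuracy = (predict(model, X) == y).mean()
```

With one center per class this layout collapses to near-chance accuracy, which is exactly the single-center limitation the abstract describes.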

  14. Sustained Attention Across the Life Span in a Sample of 10,000: Dissociating Ability and Strategy.

    Science.gov (United States)

    Fortenbaugh, Francesca C; DeGutis, Joseph; Germine, Laura; Wilmer, Jeremy B; Grosso, Mallory; Russo, Kathryn; Esterman, Michael

    2015-09-01

    Normal and abnormal differences in sustained visual attention have long been of interest to scientists, educators, and clinicians. Still lacking, however, is a clear understanding of how sustained visual attention varies across the broad sweep of the human life span. In the present study, we filled this gap in two ways. First, using an unprecedentedly large 10,430-person sample, we modeled age-related differences with substantially greater precision than have prior efforts. Second, using the recently developed gradual-onset continuous performance test (gradCPT), we parsed sustained-attention performance over the life span into its ability and strategy components. We found that after the age of 15 years, the strategy and ability trajectories saliently diverge. Strategy becomes monotonically more conservative with age, whereas ability peaks in the early 40s and is followed by a gradual decline in older adults. These observed life-span trajectories for sustained attention are distinct from results of other life-span studies focusing on fluid and crystallized intelligence. © The Author(s) 2015.
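In signal-detection terms, "ability" and "strategy" are commonly operationalized as sensitivity (d') and response criterion (c), both computable from hit and false-alarm rates. A minimal sketch (the rates below are made up, not from the gradCPT study):

```python
from statistics import NormalDist

def dprime_criterion(hit_rate, fa_rate):
    """Signal-detection sensitivity (d') and criterion (c) from hit and
    false-alarm rates. Rates are clipped away from 0/1 to keep z finite."""
    clip = lambda p: min(max(p, 1e-3), 1 - 1e-3)
    z = NormalDist().inv_cdf
    zh, zf = z(clip(hit_rate)), z(clip(fa_rate))
    return zh - zf, -(zh + zf) / 2

d, c = dprime_criterion(0.9, 0.2)  # hypothetical observer
```

A more conservative strategy raises c while leaving d' unchanged, which is how the two life-span trajectories can diverge.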

  15. Do biological-based strategies hold promise to biofouling control in MBRs?

    KAUST Repository

    Malaeb, Lilian

    2013-10-01

Biofouling in membrane bioreactors (MBRs) remains a primary challenge for their wider application, despite the growing acceptance of MBRs worldwide. Research studies on membrane fouling are extensive in the literature, with more than 200 publications on MBR fouling in the last 3 years; yet, improvements in practice on biofouling control and management have been remarkably slow. Commonly applied cleaning methods are only partially effective and membrane replacement often becomes frequent. The reason for the slow advancement in successful control of biofouling is largely attributed to the complex interactions of the biological compounds involved and the lack of representative-for-practice experimental approaches to evaluate potential control strategies. Biofouling is driven by microorganisms and their associated extracellular polymeric substances (EPS) and microbial products. Microorganisms and their products combine to form matrices that are commonly treated as a black box in conventional control approaches. Biological-based antifouling strategies seem to be a promising constituent of an effective integrated control approach since they target the essence of biofouling problems. However, biological-based strategies are in their developmental phase and several questions should be addressed to set a roadmap for translating existing and new information into sustainable and effective control techniques. This paper investigates membrane biofouling in MBRs from the microbiological perspective to evaluate the potential of biological-based strategies in offering viable control alternatives. Limitations of available control methods highlight the importance of an integrated anti-fouling approach including biological strategies. Successful development of these strategies requires detailed characterization of microorganisms and EPS through the proper selection of analytical tools and assembly of results. Existing microbiological/EPS studies reveal a number of

  16. Model-based optimization strategy of chiller driven liquid desiccant dehumidifier with genetic algorithm

    International Nuclear Information System (INIS)

    Wang, Xinli; Cai, Wenjian; Lu, Jiangang; Sun, Youxian; Zhao, Lei

    2015-01-01

This study presents a model-based optimization strategy for an actual chiller-driven dehumidifier of a liquid desiccant dehumidification system operating with lithium chloride solution. By analyzing the characteristics of the components, energy predictive models for the components in the dehumidifier are developed. To minimize energy usage while maintaining the outlet air conditions at the pre-specified set-points, an optimization problem is formulated with an objective function and constraints reflecting mechanical limitations and component interactions. A model-based optimization strategy using a genetic algorithm is proposed to obtain the optimal set-points for desiccant solution temperature and flow rate, minimizing the energy usage in the dehumidifier. Experimental studies on an actual system are carried out to compare energy consumption between the proposed optimization and conventional strategies. The results demonstrate that energy consumption using the proposed optimization strategy can be reduced by 12.2% in dehumidifier operation. - Highlights: • Presents a model-based optimization strategy for energy saving in LDDS. • Energy predictive models for components in the dehumidifier are developed. • The optimization strategy is applied and tested in an actual LDDS. • The optimization strategy achieves energy savings of about 12% during operation
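The optimization step, a genetic algorithm searching set-points within mechanical limits to minimize predicted energy use, can be sketched as below. The quadratic energy surrogate and its minimum at (18 °C, 0.8 kg/s) are invented placeholders for the paper's predictive component models.

```python
import random

random.seed(0)

# Assumed toy surrogate: energy use as a function of desiccant solution
# temperature T (deg C) and flow rate F (kg/s); minimum at (18, 0.8).
def energy(T, F):
    return 0.5 * (T - 18) ** 2 + 40 * (F - 0.8) ** 2

BOUNDS = [(10.0, 30.0), (0.2, 2.0)]  # assumed mechanical limits on (T, F)

def ga(fitness, bounds, pop=40, gens=60, mut=0.2):
    """Elitist real-valued GA: keep the best half, breed the rest by
    averaging crossover plus Gaussian mutation clipped to bounds."""
    P = [[random.uniform(*b) for b in bounds] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda x: fitness(*x))
        elite = P[: pop // 2]
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            child = [(u + v) / 2 for u, v in zip(a, b)]  # crossover
            for i, (lo, hi) in enumerate(bounds):        # mutation
                if random.random() < mut:
                    child[i] += random.gauss(0, 0.1 * (hi - lo))
                    child[i] = min(max(child[i], lo), hi)
            children.append(child)
        P = elite + children
    return min(P, key=lambda x: fitness(*x))

T_opt, F_opt = ga(energy, BOUNDS)
```

Elitism guarantees the best set-point found is never lost, so the returned pair converges toward the surrogate's minimum within the bounds.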

  17. Metacognitive strategies in learning sight-singing

    Directory of Open Access Journals (Sweden)

    Bogunović Blanka

    2012-01-01

Full Text Available This paper presents part of a wider study based on interdisciplinary research of sight-singing (psychology and music education). Our intention was to join psychological knowledge of cognitive processes on the one hand, and the practical approach of music teachers, based on methods, techniques and procedures of mastering sight-reading-singing skills, on the other. We aimed: 1. to determine the kinds and levels of strategies that music students use in the cognitive processes involved during sight-singing; 2. to explore strategies of problem solving when difficulties appear; 3. to investigate the self-evaluation perspectives of students; and 4. to relate students' learning experience to the strategies used. The sample consisted of 89 music students from higher music education in Belgrade and The Hague. They filled in a questionnaire based on self-reports, covering general data about their music education background, different issues of sight-singing such as planning, problem solving, monitoring and evaluation of outcomes, and three melodic examples written in different musical styles. Results showed that the strategies used during sight-singing can be roughly sorted into three groups that differ according to the 'key accent' given: cognitive, intuitive and no-strategy. The music cognitive strategies involved cover three levels of musical organization and representation: (a) relying on smaller chunks of the musical piece, referring to existing knowledge and learning experience; (b) leaning on a slightly 'bigger picture' of familiar patterns; and (c) mental representation of melodic/rhythmic/harmonic structures. When faced with a problem, half of the students employed analytic approaches. Comparisons between sub-samples showed, for example, that future performing musicians more often used 'tone-to-tone' thinking and 'bottom-up' strategies in approaching musical structure, while music theory students had better insight into the whole and used

  18. Theory of sampling: four critical success factors before analysis.

    Science.gov (United States)

    Wagner, Claas; Esbensen, Kim H

    2015-01-01

    Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.

  19. Developmental Changes in the Consideration of Sample Diversity in Inductive Reasoning

    Science.gov (United States)

    Rhodes, Marjorie; Gelman, Susan A.; Brickman, Daniel

    2008-01-01

    Determining whether a sample provides a good basis for broader generalizations is a basic challenge of inductive reasoning. Adults apply a diversity-based strategy to this challenge, expecting diverse samples to be a better basis for generalization than homogeneous samples. For example, adults expect that a property shared by two diverse mammals…

  20. The effects of strength-based versus deficit-based self-regulated learning strategies on students' effort intentions

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Van Yperen, Nico W.

    In two randomized experiments, one conducted online (n = 174) and one in the classroom (n = 267), we tested the effects of two types of self-regulated learning (SRL) strategies on students’ intentions to put effort into professional development activities: strength-based SRL strategies (i.e.,

  1. Sequential ensemble-based optimal design for parameter estimation

    Energy Technology Data Exchange (ETDEWEB)

Man, Jun [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou, China]; Zhang, Jiangjiang [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou, China]; Li, Weixuan [Pacific Northwest National Laboratory, Richland, Washington, USA]; Zeng, Lingzao [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou, China]; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside, California, USA]

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
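The ranking of candidate measurements by expected information gain can be sketched for a single parameter under Gaussian assumptions, where the Shannon entropy difference reduces to half the log ratio of prior to posterior variance, with the posterior variance following the Kalman update. The sensitivities and noise variances below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Prior parameter ensemble (one parameter, many members).
theta = rng.normal(1.0, 0.5, size=500)

# Hypothetical candidate measurements: each is a linear function of theta
# with its own sensitivity h and observation-noise variance r.
candidates = {"obs_A": (2.0, 0.1), "obs_B": (0.5, 0.1)}

def entropy_difference(theta, h, r):
    """Expected Shannon-entropy reduction (nats) under a Gaussian
    approximation: 0.5 * ln(prior_var / posterior_var)."""
    p = theta.var()
    k = p * h / (h * h * p + r)   # Kalman gain
    post = (1 - k * h) * p        # updated (posterior) variance
    return 0.5 * np.log(p / post)

scores = {name: entropy_difference(theta, h, r)
          for name, (h, r) in candidates.items()}
best = max(scores, key=scores.get)  # most informative candidate
```

The candidate with the larger sensitivity wins here because it shrinks the posterior variance more; the paper's DFS and relative-entropy metrics rank candidates by the same kind of ensemble-derived statistics.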

  2. Effectiveness and cost of recruitment strategies for a community-based randomised controlled trial among rainwater drinkers.

    Science.gov (United States)

    Rodrigo, Shelly; Sinclair, Martha; Cunliffe, David; Leder, Karin

    2009-07-16

    Community-based recruitment is challenging particularly if the sampling frame is not easily defined as in the case of people who drink rainwater. Strategies for contacting participants must be carefully considered to maximise generalisability and minimise bias of the results. This paper assesses the recruitment strategies for a 1-year double-blinded randomised trial on drinking untreated rainwater. The effectiveness of the recruitment strategies and associated costs are described. Community recruitment of households from Adelaide, Australia occurred from February to July 2007 using four methods: electoral roll mail-out, approaches to schools and community groups, newspaper advertising, and other media involvement. Word of mouth communication was also assessed. A total of 810 callers were screened, with 53.5% eligible. Of those who were eligible and sent further information, 76.7% were willing to participate in the study and 75.1% were enrolled. The target for recruitment was 300 households, and this was achieved. The mail-out was the most effective method with respect to number of households randomised, while recruitment via schools had the highest yield (57.3%) and was the most cost effective when considering cost per household randomised (AUD$147.20). Yield and cost effectiveness were lowest for media advertising. The use of electoral roll mail-out and advertising via schools were effective in reaching households using untreated rainwater for drinking. Employing multiple strategies enabled success in achieving the recruitment target. In countries where electoral roll extracts are available to researchers, this method is likely to have a high yield for recruitment into community-based epidemiological studies.

  3. Prevalence of Behavior Changing Strategies in Fitness Video Games: Theory-Based Content Analysis

    Science.gov (United States)

    Hatkevich, Claire

    2013-01-01

    Background Fitness video games are popular, but little is known about their content. Because many contain interactive tools that mimic behavioral strategies from weight loss intervention programs, it is possible that differences in content could affect player physical activity and/or weight outcomes. There is a need for a better understanding of what behavioral strategies are currently available in fitness games and how they are implemented. Objective The purpose of this study was to investigate the prevalence of evidence-based behavioral strategies across fitness video games available for home use. Games available for consoles that used camera-based controllers were also contrasted with games available for a console that used handheld motion controllers. Methods Fitness games (N=18) available for three home consoles were systematically identified and play-tested by 2 trained coders for at least 3 hours each. In cases of multiple games from one series, only the most recently released game was included. The Sony PlayStation 3 and Microsoft Xbox360 were the two camera-based consoles, and the Nintendo Wii was the handheld motion controller console. A coding list based on a taxonomy of behavioral strategies was used to begin coding. Codes were refined in an iterative process based on data found during play-testing. Results The most prevalent behavioral strategies were modeling (17/18), specific performance feedback (17/18), reinforcement (16/18), caloric expenditure feedback (15/18), and guided practice (15/18). All games included some kind of feedback on performance accuracy, exercise frequency, and/or fitness progress. Action planning (scheduling future workouts) was the least prevalent of the included strategies (4/18). Twelve games included some kind of social integration, with nine of them providing options for real-time multiplayer sessions. Only two games did not feature any kind of reward. Games for the camera-based consoles (mean 12.89, SD 2.71) included a

  4. The impact of university-based incubation support on the innovation strategy of academic spin-offs

    OpenAIRE

    Soetanto, Danny Prabowo; Jack, Sarah Louise

    2016-01-01

    This paper develops understanding about how incubation support and innovation strategy can determine the performance of academic spin-offs. Using a sample of spin-offs from the United Kingdom, the Netherlands and Norway, we analyse the potential moderating effect of incubation support (networking and entrepreneurial support) on innovation strategy effectiveness. The empirical results demonstrate: (1) a technology and market exploitation strategy has a stronger and more positive effect on the ...

  5. Fate of organic microcontaminants in wastewater treatment and river systems: An uncertainty assessment in view of sampling strategy, and compound consumption rate and degradability.

    Science.gov (United States)

    Aymerich, I; Acuña, V; Ort, C; Rodríguez-Roda, I; Corominas, Ll

    2017-11-15

The growing awareness of the relevance of organic microcontaminants in the environment has led to a growing number of studies on the attenuation of these compounds in wastewater treatment plants (WWTPs) and rivers. However, the effects of the sampling strategies (frequency and duration of composite samples) on the attenuation estimates are largely unknown. Our goal was to assess how the frequency and duration of composite samples influence the uncertainty of attenuation estimates in WWTPs and rivers. Furthermore, we also assessed how compound consumption rate and degradability influence uncertainty. The assessment was conducted by simulating the integrated wastewater system of Puigcerdà (NE Iberian Peninsula) using a sewer pattern generator and a coupled model of WWTP and river. Results showed that the sampling strategy is especially critical at the influent of the WWTP, particularly when the number of toilet flushes containing the compound of interest is small (≤100 flushes containing the compound per day), and less critical at the effluent of the WWTP and in the river due to the mixing effects of the WWTP. For example, at the WWTP, when evaluating a compound that is present in 50 pulses per day using a 15-min sampling frequency to collect a 24-h composite sample, the attenuation uncertainty can range from 94% (0% degradability) to 9% (90% degradability). The estimation of attenuation in rivers is less critical than in WWTPs, as the attenuation uncertainty was lower than 10% for all evaluated scenarios. Interestingly, the errors in the estimates of attenuation are usually lower than those of loads for most sampling strategies and compound characteristics (e.g. consumption and degradability), although the opposite occurs for compounds with low consumption and inappropriate sampling strategies at the WWTP. Hence, when designing a sampling campaign, one should consider the influence of compounds' consumption and degradability as well as the desired level of accuracy in
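The influence of pulse count on load-estimate uncertainty can be reproduced with a small Monte Carlo sketch: short pulses either fall into a grab sample or are missed entirely, so a compound present in few toilet flushes per day yields a much noisier 24-h composite estimate. The pulse width, grab frequency, and counts below are illustrative assumptions, not the study's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(42)

def load_error_sd(pulses_per_day, samples_per_day=96, n_days=300):
    """Relative spread of the daily-load estimate when a 24-h composite is
    assembled from instantaneous grabs at a fixed frequency (96/day =
    every 15 min). Each pulse is a ~1-min plug a grab catches or misses."""
    pulse_width = 1 / (24 * 60)                        # fraction of a day
    p_catch = min(1.0, pulse_width * samples_per_day)  # chance a plug is hit
    estimates = []
    for _ in range(n_days):
        caught = rng.binomial(pulses_per_day, p_catch)
        estimates.append(caught / (pulses_per_day * p_catch))  # unbiased
    return float(np.std(estimates))

sd_few = load_error_sd(10)    # compound in only a few flushes per day
sd_many = load_error_sd(500)  # widely consumed compound
```

The relative spread scales roughly as the inverse square root of the expected number of pulses caught, which is why the influent of a small catchment is the critical sampling point.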

  6. Development and Pilot Study of a Marketing Strategy for Primary Care/Internet–Based Depression Prevention Intervention for Adolescents (The CATCH-IT Intervention)

    Science.gov (United States)

    Watson, Natalie; Bridges, John F. P.; Fogel, Joshua; Galas, Jill; Kramer, Clarke; Connery, Marc; McGill, Ann; Marko, Monika; Cardenas, Alonso; Landsback, Josephine; Dmochowska, Karoline; Kuwabara, Sachiko A.; Ellis, Justin; Prochaska, Micah; Bell, Carl

    2010-01-01

    Background: Adolescent depression is both common and burdensome, and while evidence-based strategies have been developed to prevent adolescent depression, participation in such interventions remains extremely low, with less than 3% of at-risk individuals participating. To promote participation in evidence-based preventive strategies, a rigorous marketing strategy is needed to translate research into practice. Objective: To develop and pilot a rigorous marketing strategy for engaging at-risk individuals with an Internet-based depression prevention intervention in primary care targeting key attitudes and beliefs. Method: A marketing design group was constituted to develop a marketing strategy based on the principles of targeting, positioning/competitor analysis, decision analysis, and promotion/distribution and incorporating contemporary models of behavior change. We evaluated the formative quality of the intervention and observed the fielding experience for prevention using a pilot study (observational) design. Results: The marketing plan focused on “resiliency building” rather than “depression intervention” and was relayed by office staff and the Internet site. Twelve practices successfully implemented the intervention and recruited a diverse sample of adolescents with > 30% of all those with positive screens and > 80% of those eligible after phone assessment enrolling in the study with a cost of $58 per enrollee. Adolescent motivation for depression prevention (1–10 scale) increased from a baseline mean value of 7.45 (SD = 2.05) to 8.07 poststudy (SD = 1.33) (P = .048). Conclusions: Marketing strategies for preventive interventions for mental disorders can be developed and successfully introduced and marketed in primary care. PMID:20944776

  7. Teacher’s Voice on Metacognitive Strategy Based Instruction Using Audio Visual Aids for Listening

    Directory of Open Access Journals (Sweden)

    Salasiah Salasiah

    2018-02-01

Full Text Available The paper primarily stresses exploring the teacher’s voice toward the application of a metacognitive strategy with audio-visual aids in improving listening comprehension. The metacognitive strategy model applied in the study was inspired by the Vandergrift and Tafaghodtari (2010) instructional model; its procedure was modified and applied with audio-visual aids for improving listening comprehension. The study’s setting was SMA Negeri 2 Parepare, South Sulawesi Province, Indonesia. The population of the research was the teachers of English in the tenth grade at SMAN 2. The sample was taken using a random sampling technique. The data were collected through in-depth interviews during the research, recorded, and analyzed using qualitative analysis. This study explored the teacher’s response toward the modified model of the metacognitive strategy with audio-visual aids in the listening class, covering positive and negative responses toward the strategy applied during the teaching of listening. The results showed that this strategy helped the teacher a lot in teaching listening comprehension, as the procedure has systematic steps toward students’ listening comprehension. It also eases the teaching of listening by empowering audio-visual aids such as videos taken from YouTube.

  8. A national evaluation of community-based mental health strategies in Finland.

    Science.gov (United States)

    Vähäniemi, Anu; Warwick-Smith, Katja; Hätönen, Heli; Välimäki, Maritta

    2018-02-01

High-quality mental health care requires written strategies to set a vision for the future, yet, there is limited systematic information available on the monitoring and evaluation of such strategies. The aim of this nationwide study is to evaluate local mental health strategies in community-based mental health services provided by municipalities. Mental health strategy documents were gathered through an online search and an e-mail survey of the local authorities of all Finnish mainland municipalities (n = 320). Out of 320 municipalities, documents for 129 municipalities (63 documents) were included in the study. The documents obtained (n = 63) were evaluated against the World Health Organization checklist for mental health strategies and policies; the process, operations and content of the documents were assessed against 31 indicators in the checklist. Out of 320 Finnish municipalities, 40% (n = 129) had a mental health strategy document available and 33% (n = 104) had a document that was either in preparation or being updated. In these documents, priorities, targets and activities were clearly described. Nearly all (99%) of the documents suggested a commitment to preventative work, and 89% mentioned a dedication to developing community-based care. The key shortfalls identified were the lack of consideration of human rights (0%), the limited consideration of research (5%) and the lack of financial planning (28%) to successfully execute the plans. Of the documents obtained, 60% covered both mental health and substance abuse issues. This study contributes to the limited evidence base on health care strategy evaluations. Further research is needed to understand the potential impact of policy analysis. © The Author(s) 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  9. Evaluation of different frontier-based multi-robot exploration strategies

    Directory of Open Access Journals (Sweden)

    Benkrid Abdenour

    2016-01-01

Full Text Available In this paper, we focus on the problem of exploring an unknown environment with a team of mobile robots. The main objective is to compare four different coordination strategies based on the frontier concept (boundaries between unexplored and explored open areas) and to analyze their performance in terms of assignment quality, overall exploration time and computational complexity. In order to provide a suitable qualitative study, we used three optimization criteria. Each strategy has been implemented and tested extensively in computerized simulation.
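The simplest frontier-based coordination strategy of the kind the paper compares is greedy nearest-frontier assignment; the toy sketch below illustrates the idea. Robot names, coordinates and the greedy rule are illustrative assumptions, not taken from the paper.

```python
import math

# Toy sketch of greedy nearest-frontier assignment: each robot is assigned,
# in turn, the closest unclaimed frontier cell (Euclidean distance).
def greedy_frontier_assignment(robots, frontiers):
    remaining = list(frontiers)
    assignment = {}
    for name, pos in robots.items():
        if not remaining:
            break
        best = min(remaining, key=lambda f: math.dist(pos, f))
        assignment[name] = best
        remaining.remove(best)  # claimed: avoid sending two robots to one frontier
    return assignment

robots = {"r1": (0.0, 0.0), "r2": (5.0, 5.0)}
frontiers = [(1.0, 0.0), (4.0, 5.0), (9.0, 9.0)]
print(greedy_frontier_assignment(robots, frontiers))  # r1 -> (1.0, 0.0), r2 -> (4.0, 5.0)
```

Removing claimed frontiers from the pool is what distinguishes a coordinated strategy from each robot independently chasing its own nearest frontier.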

  10. Barriers and Strategies to Engaging Our Community-Based Preceptors.

    Science.gov (United States)

    Graziano, Scott C; McKenzie, Margaret L; Abbott, Jodi F; Buery-Joyner, Samantha D; Craig, LaTasha B; Dalrymple, John L; Forstein, David A; Hampton, Brittany S; Page-Ramsey, Sarah M; Pradhan, Archana; Wolf, Abigail; Hopkins, Laura

    2018-03-26

    This article, from the "To the Point" series that is prepared by the Association of Professors of Gynecology and Obstetrics Undergraduate Medical Education Committee, is a review of commonly cited barriers to recruiting and retaining community-based preceptors in undergraduate medical education and potential strategies to overcome them. Community-based preceptors have traditionally served as volunteer, nonsalaried faculty, with academic institutions relying on intrinsic teaching rewards to sustain this model. However, increasing numbers of learners, the burdens of incorporating the electronic medical record in practice, and increasing demands for clinical productivity are making recruitment and retention of community-based preceptors more challenging. General challenges to engaging preceptors, as well as those unique to women's health, are discussed. Potential solutions are reviewed, including alternative recruitment strategies, faculty development to emphasize efficient teaching practices in the ambulatory setting, offers of online educational resources, and opportunities to incorporate students in value-added roles. Through examples cited in this review, clerkship directors and medical school administrators should have a solid foundation to actively engage their community-based preceptors.

  11. Weight Management Preferences in a Non-Treatment Seeking Sample

    Directory of Open Access Journals (Sweden)

    Victoria B. Barry

    2013-12-01

Full Text Available Background: Obesity is a serious public health issue in the United States, with the CDC reporting that most adult Americans are now either overweight or obese. Little is known about the comparative acceptability of available weight management approaches in non-treatment-seeking samples. Method: This report presents preliminary survey data collected from an online sample on weight management preferences for 8 different weight management strategies, including a proposed incentive-based program. Participants were 72 individuals (15 men, 55 women and 2 transgendered individuals) who self-reported being overweight or obese, or who currently self-reported a normal weight but had attempted to lose weight in the past. Results: ANOVA and pairwise comparisons indicated clear preferences for certain treatments over others in the full sample; most notably, the most popular option in our sample for managing weight was to diet and exercise without professional assistance. Several differences in preference between the three weight groups were also observed. Conclusions: Dieting and exercising without any professional assistance is the most highly endorsed weight management option among all groups. Overweight and obese individuals may find self-management strategies for weight loss less attractive than normal weight individuals, but still prefer them to other alternatives. This has implications for the development and dissemination of empirically based self-management strategies for weight.

  12. Language-based communication strategies that support person-centered communication with persons with dementia.

    Science.gov (United States)

    Savundranayagam, Marie Y; Moore-Nielsen, Kelsey

    2015-10-01

    There are many recommended language-based strategies for effective communication with persons with dementia. What is unknown is whether effective language-based strategies are also person centered. Accordingly, the objective of this study was to examine whether language-based strategies for effective communication with persons with dementia overlapped with the following indicators of person-centered communication: recognition, negotiation, facilitation, and validation. Conversations (N = 46) between staff-resident dyads were audio-recorded during routine care tasks over 12 weeks. Staff utterances were coded twice, using language-based and person-centered categories. There were 21 language-based categories and 4 person-centered categories. There were 5,800 utterances transcribed: 2,409 without indicators, 1,699 coded as language or person centered, and 1,692 overlapping utterances. For recognition, 26% of utterances were greetings, 21% were affirmations, 13% were questions (yes/no and open-ended), and 15% involved rephrasing. Questions (yes/no, choice, and open-ended) comprised 74% of utterances that were coded as negotiation. A similar pattern was observed for utterances coded as facilitation where 51% of utterances coded as facilitation were yes/no questions, open-ended questions, and choice questions. However, 21% of facilitative utterances were affirmations and 13% involved rephrasing. Finally, 89% of utterances coded as validation were affirmations. The findings identify specific language-based strategies that support person-centered communication. However, between 1 and 4, out of a possible 21 language-based strategies, overlapped with at least 10% of utterances coded as each person-centered indicator. This finding suggests that staff need training to use more diverse language strategies that support personhood of residents with dementia.

  13. Optimizing sampling strategy for radiocarbon dating of Holocene fluvial systems in a vertically aggrading setting

    International Nuclear Information System (INIS)

    Toernqvist, T.E.; Dijk, G.J. Van

    1993-01-01

The authors address the question of how to determine the period of activity (sedimentation) of fossil (Holocene) fluvial systems in vertically aggrading environments. The available data base consists of almost 100 ¹⁴C ages from the Rhine-Meuse delta. Radiocarbon samples from the tops of lithostratigraphically correlative organic beds underneath overbank deposits (sample type 1) yield consistent ages, indicating a synchronous onset of overbank deposition over distances of at least up to 20 km along channel belts. Similarly, ¹⁴C ages from the base of organic residual channel fills (sample type 3) generally indicate a clear termination of within-channel sedimentation. In contrast, ¹⁴C ages from the base of organic beds overlying overbank deposits (sample type 2), commonly assumed to represent the end of fluvial sedimentation, show a large scatter reaching up to 1000 ¹⁴C years. It is concluded that a combination of sample types 1 and 3 generally yields a satisfactory delimitation of the period of activity of a fossil fluvial system. 30 refs., 11 figs., 4 tabs

  14. Optimal Electricity Charge Strategy Based on Price Elasticity of Demand for Users

    Science.gov (United States)

    Li, Xin; Xu, Daidai; Zang, Chuanzhi

Price elasticity is very important for the prediction of electricity demand. This paper establishes price elasticity coefficients for electricity demand in both single-period and inter-temporal settings. A charging strategy is then established based on these coefficients. To evaluate the proposed strategy, simulations using the two elasticity coefficients are carried out based on historical data from a certain region.
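A single-period elasticity coefficient of the kind the paper builds on is conventionally estimated with the midpoint (arc) formula; a minimal sketch with illustrative numbers (the function name and data are assumptions, not from the paper):

```python
# Midpoint (arc) price elasticity of demand: percentage change in quantity
# divided by percentage change in price, each relative to the midpoint.
def arc_elasticity(p0, q0, p1, q1):
    dq = (q1 - q0) / ((q0 + q1) / 2)
    dp = (p1 - p0) / ((p0 + p1) / 2)
    return dq / dp

# Illustrative: demand falls from 120 to 100 MWh as price rises
# from 0.10 to 0.12 $/kWh.
e = arc_elasticity(0.10, 120, 0.12, 100)
print(round(e, 3))  # negative: demand decreases as price increases
```

An elasticity near -1 means revenue is roughly insensitive to small price changes, which is exactly why such coefficients matter when designing a charging strategy.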

  15. Actual distribution of Cronobacter spp. in industrial batches of powdered infant formula and consequences for performance of sampling strategies.

    Science.gov (United States)

    Jongenburger, I; Reij, M W; Boer, E P J; Gorris, L G M; Zwietering, M H

    2011-11-15

The actual spatial distribution of microorganisms within a batch of food influences the results of sampling for microbiological testing when this distribution is non-homogeneous. In the case of pathogens being non-homogeneously distributed, it markedly influences public health risk. This study investigated the spatial distribution of Cronobacter spp. in powdered infant formula (PIF) at industrial batch scale for both a recalled batch as well as a reference batch. Additionally, local spatial occurrence of clusters of Cronobacter cells was assessed, as well as the performance of typical sampling strategies to determine the presence of the microorganisms. The concentration of Cronobacter spp. was assessed over the course of the filling time of each batch, by taking samples of 333 g using the most probable number (MPN) enrichment technique. The occurrence of clusters of Cronobacter spp. cells was investigated by plate counting. From the recalled batch, 415 MPN samples were drawn. The expected heterogeneous distribution of Cronobacter spp. could be quantified from these samples, which showed no detectable level (detection limit of -2.52 log CFU/g) in 58% of samples, whilst in the remainder concentrations were found to be between -2.52 and 2.75 log CFU/g. The estimated average concentration in the recalled batch was -2.78 log CFU/g, with a standard deviation of 1.10 log CFU/g. The estimated average concentration in the reference batch was -4.41 log CFU/g, with 99% of the 93 samples being below the detection limit. In the recalled batch, clusters of cells occurred sporadically in 8 out of 2290 samples of 1 g taken. The two largest clusters contained 123 (2.09 log CFU/g) and 560 (2.75 log CFU/g) cells. Various sampling strategies were evaluated for the recalled batch. Taking more and smaller samples while keeping the total sampling weight constant considerably improved the performance of the sampling plans to detect such a type of contaminated batch. Compared to random sampling
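Why more, smaller samples help can be illustrated with a small Monte Carlo sketch. The log-normal parameters below follow the batch estimates reported in the abstract, but the independence and Poisson assumptions are simplifications of ours, not the paper's model:

```python
import math
import random

# Illustrative Monte Carlo: per-sample log10 concentrations are drawn from a
# normal distribution (mean -2.78, sd 1.10 log CFU/g, the recalled-batch
# estimates above); a sample of weight w grams is "positive" if it contains
# at least one cell, with Poisson probability 1 - exp(-c * w).
def detection_probability(n_samples, sample_weight_g, trials=20000, seed=1):
    rng = random.Random(seed)
    detected = 0
    for _ in range(trials):
        for _ in range(n_samples):
            c = 10 ** rng.gauss(-2.78, 1.10)  # CFU/g at this sampling spot
            if rng.random() < 1 - math.exp(-c * sample_weight_g):
                detected += 1
                break
    return detected / trials

# Same total sampling weight (330 g): many small samples vs few large ones.
print(detection_probability(30, 11))  # 30 samples of 11 g
print(detection_probability(10, 33))  # 10 samples of 33 g
```

Under this simplified model, splitting the same total weight over more independent locations raises the chance that at least one sample lands on a contaminated spot of the heterogeneous batch.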

  16. Combining Electrochemical Sensors with Miniaturized Sample Preparation for Rapid Detection in Clinical Samples

    Science.gov (United States)

    Bunyakul, Natinan; Baeumner, Antje J.

    2015-01-01

Clinical analyses benefit world-wide from rapid and reliable diagnostic tests. New tests are sought with greatest demand not only for new analytes, but also to reduce costs, complexity and lengthy analysis times of current techniques. Among the myriad of possibilities available today to develop new test systems, amperometric biosensors are prominent players—best represented by the ubiquitous amperometric-based glucose sensors. Electrochemical approaches in general require little and often enough only simple hardware components, are rugged and yet provide low limits of detection. They thus offer many of the desirable attributes for point-of-care/point-of-need tests. This review focuses on investigating the important integration of sample preparation with (primarily electrochemical) biosensors. Sample clean up requirements, miniaturized sample preparation strategies, and their potential integration with sensors will be discussed, focusing on clinical sample analyses. PMID:25558994

  17. Adult Professional Development: Can Brain-Based Teaching Strategies Increase Learning Effectiveness?

    Science.gov (United States)

    Tilton, Wendy

    2011-01-01

    Brain-based teaching strategies, compared to facilitative student-centered teaching strategies, were employed with 62 real estate professionals in a quasi-mixed-methods study. Participants attended a 2-day proprietary real estate continuing education course. Both the experimental and control groups received the same facilitative instruction, as…

  18. RF Sub-sampling Receiver Architecture based on Milieu Adapting Techniques

    DEFF Research Database (Denmark)

    Behjou, Nastaran; Larsen, Torben; Jensen, Ole Kiel

    2012-01-01

    A novel sub-sampling based architecture is proposed which has the ability of reducing the problem of image distortion and improving the signal to noise ratio significantly. The technique is based on sensing the environment and adapting the sampling rate of the receiver to the best possible...

  19. A belief-based evolutionarily stable strategy

    OpenAIRE

    Deng, Xinyang; Wang, Zhen; Liu, Qi; Deng, Yong; Mahadevan, Sankaran

    2014-01-01

    As an equilibrium refinement of the Nash equilibrium, evolutionarily stable strategy (ESS) is a key concept in evolutionary game theory and has attracted growing interest. An ESS can be either a pure strategy or a mixed strategy. Even though the randomness is allowed in mixed strategy, the selection probability of pure strategy in a mixed strategy may fluctuate due to the impact of many factors. The fluctuation can lead to more uncertainty. In this paper, such uncertainty involved in mixed st...

  20. Research on motor braking-based DYC strategy for distributed electric vehicle

    Science.gov (United States)

    Zhang, Jingming; Liao, Weijie; Chen, Lei; Cui, Shumei

    2017-08-01

In order to bring into full play the advantages of motor braking and enhance the handling stability of distributed electric vehicles, a motor braking-based direct yaw moment control (DYC) strategy was proposed. This strategy could identify whether a vehicle has understeered or oversteered, calculate the direct yaw moment required for vehicle steering correction by taking the corrected yaw velocity deviation and slip-angle deviation as control variables, and exert motor braking moment on the target wheels to perform correction in the manner of differential braking. Finally, a combined simulation platform was set up to validate the proposed motor braking control strategy. As shown by the results, the motor braking-based DYC strategy timely adjusted the motor braking moment and hydraulic braking moment on the target wheels, and corrected the steering deviation and sideslip of the vehicle in an unstable state, improving the handling stability.

  1. Educational strategies for teaching evidence-based practice to undergraduate health students: systematic review.

    Science.gov (United States)

    Kyriakoulis, Konstantinos; Patelarou, Athina; Laliotis, Aggelos; Wan, Andrew C; Matalliotakis, Michail; Tsiou, Chrysoula; Patelarou, Evridiki

    2016-01-01

The aim of this systematic review was to find the best teaching strategies for teaching evidence-based practice (EBP) to undergraduate health students that have been adopted over the last years in healthcare institutions worldwide. The authors carried out a systematic, comprehensive bibliographic search using the Medline database for the years 2005 to March 2015 (updated in March 2016). Search terms used were chosen from the USNLM Institutes of Health list of MeSH (Medical Subject Headings), and free-text key terms were used as well. Selected articles were measured based on the inclusion criteria of this study and initially compared in terms of titles or abstracts. Finally, articles relevant to the subject of this review were retrieved in full text. Critical appraisal was done to determine the effects of each strategy for teaching evidence-based medicine (EBM). Twenty articles were included in the review. The majority of the studies sampled medical students (n=13) and only a few were conducted among nursing (n=2), pharmacy (n=2), physiotherapy/therapy (n=1), dentistry (n=1), or mixed disciplines (n=1) students. Studies evaluated a variety of educational interventions of varying duration, frequency and format (lectures, tutorials, workshops, conferences, journal clubs, and online sessions), or combinations of these to teach EBP. We categorized interventions into single interventions covering a workshop, conference, lecture, journal club, or e-learning, and multifaceted interventions where a combination of strategies had been assessed. Seven studies reported an overall increase in all EBP domains, indicating higher EBP competence, and two studies focused on the database-searching skill. The following were deduced from the above analysis: a multifaceted approach may be best suited when teaching EBM to health students; the use of technology to promote EBP through mobile devices, simulation, and the web is on the rise; and the duration of the interventions, varying from some hours to even months, was

  2. Educational strategies for teaching evidence-based practice to undergraduate health students: systematic review

    Directory of Open Access Journals (Sweden)

    Konstantinos Kyriakoulis

    2016-09-01

Full Text Available Purpose The aim of this systematic review was to find the best teaching strategies for teaching evidence-based practice (EBP) to undergraduate health students that have been adopted over the last years in healthcare institutions worldwide. Methods The authors carried out a systematic, comprehensive bibliographic search using the Medline database for the years 2005 to March 2015 (updated in March 2016). Search terms used were chosen from the USNLM Institutes of Health list of MeSH (Medical Subject Headings), and free-text key terms were used as well. Selected articles were measured based on the inclusion criteria of this study and initially compared in terms of titles or abstracts. Finally, articles relevant to the subject of this review were retrieved in full text. Critical appraisal was done to determine the effects of each strategy for teaching evidence-based medicine (EBM). Results Twenty articles were included in the review. The majority of the studies sampled medical students (n=13) and only a few were conducted among nursing (n=2), pharmacy (n=2), physiotherapy/therapy (n=1), dentistry (n=1), or mixed disciplines (n=1) students. Studies evaluated a variety of educational interventions of varying duration, frequency and format (lectures, tutorials, workshops, conferences, journal clubs, and online sessions), or combinations of these to teach EBP. We categorized interventions into single interventions covering a workshop, conference, lecture, journal club, or e-learning, and multifaceted interventions where a combination of strategies had been assessed. Seven studies reported an overall increase in all EBP domains, indicating higher EBP competence, and two studies focused on the database-searching skill.
Conclusion The following were deduced from the above analysis: a multifaceted approach may be best suited when teaching EBM to health students; the use of technology to promote EBP through mobile devices, simulation, and the web is on the rise; and the duration of the interventions

  3. Noninvasive Strategy Based on Real-Time in Vivo Cataluminescence Monitoring for Clinical Breath Analysis.

    Science.gov (United States)

    Zhang, Runkun; Huang, Wanting; Li, Gongke; Hu, Yufei

    2017-03-21

The development of noninvasive methods for real-time in vivo analysis is of great significance, providing powerful tools for medical research and clinical diagnosis. In the present work, we describe a new strategy based on cataluminescence (CTL) for real-time in vivo clinical breath analysis. To illustrate this strategy, a homemade real-time CTL monitoring system, characterized by coupling an online sampling device with a CTL sensor for sevoflurane (SVF), was designed, and a real-time in vivo method for the monitoring of SVF in exhaled breath was proposed. The accuracy of the method was evaluated by analyzing real exhaled breath samples, and the results were compared with those obtained by GC/MS; the measured data obtained by the two methods were in good agreement. Subsequently, the method was applied to real-time monitoring of SVF in exhaled breath from rat models of the control group to investigate elimination pharmacokinetics. To further probe the potential of the method for clinical application, the elimination pharmacokinetics of SVF from rat models of the control group, liver fibrosis group, alcohol liver group, and nonalcoholic fatty liver group were monitored. The raw pharmacokinetic data of the different groups were normalized and subsequently subjected to linear discriminant analysis (LDA). These data were transformed to canonical scores, which were well clustered with a classification accuracy of 100%; the overall accuracy of the leave-one-out cross-validation procedure was 88%, indicating the potential of the method for liver disease diagnosis. Our strategy undoubtedly opens a new door for real-time clinical analysis in a pain-free and noninvasive way and also points to a promising development direction for CTL.

  4. Automated Prediction of Catalytic Mechanism and Rate Law Using Graph-Based Reaction Path Sampling.

    Science.gov (United States)

    Habershon, Scott

    2016-04-12

In a recent article [J. Chem. Phys. 2015, 143, 094106], we introduced a novel graph-based sampling scheme which can be used to generate chemical reaction paths in many-atom systems in an efficient and highly automated manner. The main goal of this work is to demonstrate how this approach, when combined with direct kinetic modeling, can be used to determine the mechanism and phenomenological rate law of a complex catalytic cycle, namely cobalt-catalyzed hydroformylation of ethene. Our graph-based sampling scheme generates 31 unique chemical products and 32 unique chemical reaction pathways; these sampled structures and reaction paths enable automated construction of a kinetic network model of the catalytic system when combined with density functional theory (DFT) calculations of free energies and resultant transition-state theory rate constants. Direct simulations of this kinetic network across a range of initial reactant concentrations enables determination of both the reaction mechanism and the associated rate law in an automated fashion, without the need for either presupposing a mechanism or making steady-state approximations in kinetic analysis. Most importantly, we find that the reaction mechanism which emerges from these simulations is exactly that originally proposed by Heck and Breslow; furthermore, the simulated rate law is also consistent with previous experimental and computational studies, exhibiting a complex dependence on carbon monoxide pressure. While the inherent errors of using DFT simulations to model chemical reactivity limit the quantitative accuracy of our calculated rates, this work confirms that our automated simulation strategy enables direct analysis of catalytic mechanisms from first principles.

  5. TU-AB-BRC-11: Moving a GPU-OpenCL-Based Monte Carlo (MC) Dose Engine Towards Routine Clinical Use: Automatic Beam Commissioning and Efficient Source Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Z; Folkerts, M; Jiang, S; Jia, X [UT Southwestern Medical Ctr, Dallas, TX (United States); Li, Y [Beihang University, Beijing (China)

    2016-06-15

Purpose: We have previously developed a GPU-OpenCL-based MC dose engine named goMC with a built-in analytical linac beam model. To move goMC towards routine clinical use, we have developed an automatic beam-commissioning method and an efficient source sampling strategy to facilitate dose calculations for real treatment plans. Methods: Our commissioning method automatically adjusts the relative weights among the sub-sources through an optimization process minimizing the discrepancies between calculated dose and measurements. Six models built for Varian TrueBeam linac photon beams (6MV, 10MV, 15MV, 18MV, 6MVFFF, 10MVFFF) were commissioned using measurement data acquired at our institution. To facilitate dose calculations for real treatment plans, we employed an inverse sampling method to efficiently incorporate MLC leaf-sequencing into source sampling. Specifically, instead of sampling source particles control-point by control-point and rejecting the particles blocked by the MLC, we assigned a control-point index to each sampled source particle, according to the MLC leaf-open duration of each control-point at the pixel where the particle intersects the iso-center plane. Results: Our auto-commissioning method decreased the distance-to-agreement (DTA) of depth dose at build-up regions by 36.2% on average, bringing it within 1 mm. Lateral profiles were better matched for all beams, with the biggest improvement found at 15MV, for which the root-mean-square difference was reduced from 1.44% to 0.50%. Maximum differences of output factors were reduced to less than 0.7% for all beams, with the largest decrease, from 1.70% to 0.37%, found at 10FFF. Our new sampling strategy was tested on a head-and-neck VMAT patient case. Achieving clinically acceptable accuracy, the new strategy could reduce the required history number by a factor of ∼2.8 at a given statistical uncertainty level and hence achieve a similar speed-up factor. Conclusion: Our studies have demonstrated the feasibility and effectiveness of
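The inverse-sampling idea described in the Methods, assigning each particle a control-point index with probability proportional to leaf-open duration rather than rejecting blocked particles, can be sketched as sampling from a cumulative distribution. This is a hedged illustration: the durations and function names are hypothetical, not goMC's implementation.

```python
import bisect
import itertools
import random

# Build a CDF over control points from (hypothetical) leaf-open durations,
# then sample control-point indices by inverting it with a binary search.
def build_cdf(open_durations):
    total = sum(open_durations)
    return list(itertools.accumulate(d / total for d in open_durations))

def sample_control_point(cdf, rng):
    i = bisect.bisect_right(cdf, rng.random())
    return min(i, len(cdf) - 1)  # clamp against floating-point edge cases

durations = [0.0, 2.0, 6.0, 2.0]   # control point 0 fully blocked
cdf = build_cdf(durations)
rng = random.Random(0)
counts = [0, 0, 0, 0]
for _ in range(10000):
    counts[sample_control_point(cdf, rng)] += 1
print(counts)  # no particles at the blocked control point; ~60% at index 2
```

Because blocked control points get zero weight, no sampling effort is wasted on particles that would have been rejected, which is where the speed-up comes from.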

  6. Do generic strategies impact performance in higher educational institutions? A SEM-based investigation

    OpenAIRE

    Ahlam Mohammad Alzoubi; Okechukwu Lawrence Emeagwali

    2016-01-01

    This study set out to initiate an investigation into the linkage between generic strategy and performance in higher educational institutions and the moderating effect of institution-type. Using structural equation modeling (SEM), it examined the responses of a stratified sample of academics and administrative staff (n= 333) randomly selected from eight universities in northern Cyprus. Findings suggest that while there is a weak effect of differentiation strategy on performance, a strong effec...

  7. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) to various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's linear systematic samplin...
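The baseline scheme being extended, linear systematic sampling with multiple random starts, can be sketched in a few lines. This is a minimal illustration assuming the population size is a multiple of the sampling interval; it does not implement the BSS or MSS variants.

```python
import random

# Linear systematic sampling with m random starts (in the spirit of
# Gautschi, 1957): choose m distinct starts in [0, k), then take every
# k-th unit from each start; the union of the m systematic samples is
# the final sample.
def multiple_start_systematic_sample(population, k, m, seed=None):
    rng = random.Random(seed)
    starts = rng.sample(range(k), m)      # m distinct random starts
    sample = []
    for s in starts:
        sample.extend(population[s::k])   # every k-th unit from this start
    return sample

units = list(range(100))                  # population of N = 100 units
picked = multiple_start_systematic_sample(units, k=20, m=2, seed=42)
print(sorted(picked))                     # 2 starts x 5 units each = 10 units
```

Using several independent starts, unlike the single start of ordinary systematic sampling, makes an unbiased variance estimate possible, which is the motivation for these schemes.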

  8. Advances in paper-based sample pretreatment for point-of-care testing.

    Science.gov (United States)

    Tang, Rui Hua; Yang, Hui; Choi, Jane Ru; Gong, Yan; Feng, Shang Sheng; Pingguan-Murphy, Belinda; Huang, Qing Sheng; Shi, Jun Ling; Mei, Qi Bing; Xu, Feng

    2017-06-01

In recent years, paper-based point-of-care testing (POCT) has been widely used in medical diagnostics, food safety and environmental monitoring. However, high-cost, time-consuming and equipment-dependent sample pretreatment techniques are generally required for raw sample processing, which are impractical for low-resource and disease-endemic areas. Therefore, there is an escalating demand for a cost-effective, simple and portable pretreatment technique to be coupled with the commonly used paper-based assays (e.g. lateral flow assays) in POCT. In this review, we focus on the importance of using paper as a platform for sample pretreatment. We first discuss the beneficial use of paper for sample pretreatment, including sample collection and storage, separation, extraction, and concentration. We then highlight the working principle and fabrication of each sample pretreatment device, the existing challenges and the future perspectives for developing paper-based sample pretreatment techniques.

  9. Optimal Search Strategy of Robotic Assembly Based on Neural Vibration Learning

    Directory of Open Access Journals (Sweden)

    Lejla Banjanovic-Mehmedovic

    2011-01-01

Full Text Available This paper presents the implementation of an optimal search strategy (OSS) in verification of an assembly process based on neural vibration learning. The application problem is the complex robot assembly of miniature parts in the example of mating the gears of one multistage planetary speed reducer. Assembly of the tube over the planetary gears was identified as the most difficult part of the overall assembly. The favourable influence of vibration and rotation movement on compensation of tolerance was also observed. With the proposed neural-network-based learning algorithm, it is possible to find an extended scope of the vibration state parameter. Using an optimal search strategy based on the minimal distance path between vibration parameter stage sets (amplitudes and frequencies of the robot gripper's vibration) and a recovery parameter algorithm, we can improve the robot assembly behaviour, that is, allow the fastest possible way of mating. We have verified by using simulation programs that the search strategy is suitable for situations of unexpected events due to uncertainties.

  10. User-centric Query Refinement and Processing Using Granularity Based Strategies

    NARCIS (Netherlands)

    Zeng, Y.; Zhong, N.; Wang, Y.; Qin, Y.; Huang, Z.; Zhou, H; Yao, Y; van Harmelen, F.A.H.

    2011-01-01

    Under the context of large-scale scientific literatures, this paper provides a user-centric approach for refining and processing incomplete or vague query based on cognitive- and granularity-based strategies. From the viewpoints of user interests retention and granular information processing, we

  11. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Stirpe, D.; Picard, R.R.

    1985-01-01

    Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, we have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for uncertainty parameters of measurement, the simulation results support the conclusions: (1) previously used conservative approximations can be expensive because they lead to larger sample sizes than needed; and (2) the optimal verification strategy, as well as the falsification strategy, are highly dependent on the underlying uncertainty parameters of the measurement instruments. 1 ref., 3 figs
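For the pure attributes case with perfect identification, the detection probability the simulations estimate reduces to a hypergeometric tail, which makes the sample-size calculation easy to sketch. This is an idealization that ignores the measurement uncertainties the paper's simulation addresses; numbers are illustrative.

```python
from math import comb

# Probability of catching at least one of r falsified items in a random
# sample of n drawn without replacement from a population of N items.
def detection_prob(N, r, n):
    return 1 - comb(N - r, n) / comb(N, n)

def required_sample_size(N, r, target):
    """Smallest n with detection probability >= target."""
    for n in range(1, N + 1):
        if detection_prob(N, r, n) >= target:
            return n
    return N

# Illustrative: 1000 items, 20 falsified, 95% desired detection probability.
n = required_sample_size(1000, 20, 0.95)
print(n, round(detection_prob(1000, 20, n), 3))
```

Comparing this idealized n with the simulated attributes/variables sample sizes shows how much the conservative approximations criticized in the abstract can inflate the required sample.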

  12. Plant-based anti-HIV-1 strategies: vaccine molecules and antiviral approaches.

    Science.gov (United States)

    Scotti, Nunzia; Buonaguro, Luigi; Tornesello, Maria Lina; Cardi, Teodoro; Buonaguro, Franco Maria

    2010-08-01

    The introduction of highly active antiretroviral therapy has drastically changed HIV infection from an acute, very deadly, to a chronic, long-lasting, mild disease. However, this requires continuous care management, which is difficult to implement worldwide, especially in developing countries. Sky-rocketing costs of HIV-positive subjects and the limited success of preventive recommendations mean that a vaccine is urgently needed, which could be the only effective strategy for the real control of the AIDS pandemic. To be effective, vaccination will need to be accessible, affordable and directed against multiple antigens. Plant-based vaccines, which are easy to produce and administer, and require no cold chain for their heat stability are, in principle, suited to such a strategy. More recently, it has been shown that even highly immunogenic, enveloped plant-based vaccines can be produced at a competitive and more efficient rate than conventional strategies. The high variability of HIV epitopes and the need to stimulate both humoral neutralizing antibodies and cellular immunity suggest the importance of using the plant system: it offers a wide range of possible strategies, from single-epitope to multicomponent vaccines, modulators of the immune response (adjuvants) and preventive molecules (microbicides), either alone or in association with plant-derived monoclonal antibodies, besides the potential use of the latter as therapeutic agents. Furthermore, plant-based anti-HIV strategies can be administered not only parenterally but also by the more convenient and safer oral route, which is a more suitable approach for possible mass vaccination.

  13. Dried blood spots of pooled samples for RHD gene screening in blood donors of mixed ancestry.

    Science.gov (United States)

    Silva-Malta, M C F; Araujo, N C Fidélis; Vieira, O V Neves; Schmidt, L Cayres; Gonçalves, P de Cassia; Martins, M Lobato

    2015-10-01

    In this study, we present a strategy for RHD gene screening based on real-time polymerase chain reaction (PCR) using dried blood spots of pooled samples. Molecular analysis of blood donors may be used to detect RHD variants among presumed D-negative individuals. RHD genotyping of pooled samples is a strategy for testing a large number of samples at a more reasonable cost. RHD gene detection based on real-time PCR using dried blood spots of pooled samples was standardised and used to evaluate 1550 Brazilian blood donors phenotyped as RhD-negative. Positive results were re-evaluated by retesting single samples using real-time PCR and conventional multiplex PCR to amplify five RHD-specific exons. PCR with sequence-specific primers was used to amplify the RHDψ allele. We devised a strategy for RHD gene screening using dried blood spots of pooled samples, with five samples per pool. Among 1550 serologically D-negative blood donors, 58 (3.74%) had the RHD gene. The non-functional RHDψ allele was detected in 47 samples (3.02%). The present method is a promising strategy for detecting the RHD gene among presumed RhD-negative blood donors, particularly in populations with African ancestry. © 2015 British Blood Transfusion Society.
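    The cost saving from two-stage pooling can be checked with a quick Monte Carlo sketch (hypothetical code, not part of the study; the donor count, pool size, and positive rate are taken from the figures above). Each pool gets one PCR reaction, and only positive pools are resolved sample by sample:

```python
import random

def expected_tests(n_samples, pool_size, prevalence, trials=2000, seed=1):
    """Monte Carlo estimate of PCR reactions needed with two-stage pooling:
    every pool is tested once; each positive pool is retested sample by sample."""
    random.seed(seed)
    total = 0
    for _ in range(trials):
        statuses = [random.random() < prevalence for _ in range(n_samples)]
        tests = 0
        for i in range(0, n_samples, pool_size):
            pool = statuses[i:i + pool_size]
            tests += 1                      # one reaction for the whole pool
            if any(pool):
                tests += len(pool)          # retest each member individually
        total += tests
    return total / trials

# Roughly the study's scale: 1550 donors, pools of 5, ~3.7% RHD-positive.
avg = expected_tests(1550, 5, 0.037)
print(f"about {avg:.0f} reactions instead of 1550 individual tests")
```

    With a ~3.7% positive rate, roughly 310 pool reactions plus a few hundred confirmatory reactions replace 1550 individual tests, which is where the "more reasonable cost" comes from.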

  14. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs; (ii) generation of samples from uncertain analysis inputs; (iii) propagation of sampled inputs through an analysis; (iv) presentation of uncertainty analysis results; and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.
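    To make the workflow concrete, here is a minimal, self-contained sketch (a toy model, not any analysis from the review) of three of the listed steps: generating samples for uncertain inputs, propagating them through a model, and computing a rank-transformation (Spearman) sensitivity measure for each input:

```python
import random

def rank(xs):
    """1-based ranks of a sequence, the building block of rank transformations."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def spearman(xs, ys):
    """Rank (Spearman) correlation: Pearson correlation computed on the ranks."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    m = (n + 1) / 2  # mean rank
    num = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    den = (sum((a - m) ** 2 for a in rx) * sum((b - m) ** 2 for b in ry)) ** 0.5
    return num / den

# Sample the uncertain inputs, propagate them through a toy model
# y = 3*x1 + 0.1*x2, and rank the inputs by how strongly they drive y.
random.seed(0)
x1 = [random.random() for _ in range(200)]
x2 = [random.random() for _ in range(200)]
y = [3 * a + 0.1 * b for a, b in zip(x1, x2)]
print(spearman(x1, y), spearman(x2, y))  # x1 dominates, x2 is nearly inert
```

    The same scatter of sampled inputs against outputs also feeds the scatterplot examination and regression-based procedures listed above.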

  15. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.

  16. [Global brain metastases management strategy: a multidisciplinary-based approach].

    Science.gov (United States)

    Métellus, P; Tallet, A; Dhermain, F; Reyns, N; Carpentier, A; Spano, J-P; Azria, D; Noël, G; Barlési, F; Taillibert, S; Le Rhun, É

    2015-02-01

    Brain metastases management has evolved over the last fifteen years and may use varying strategies, including more or less aggressive treatments, sometimes combined, leading to an improvement in patients' survival and quality of life. The therapeutic decision is subject to multidisciplinary analysis, taking into account established prognostic factors, including the patient's general condition, extracerebral disease status, and the clinical and radiological presentation of lesions. In this article, we propose a management strategy based on the state of current knowledge and available therapeutic resources. Copyright © 2015. Published by Elsevier SAS.

  17. The Recent Developments in Sample Preparation for Mass Spectrometry-Based Metabolomics.

    Science.gov (United States)

    Gong, Zhi-Gang; Hu, Jing; Wu, Xi; Xu, Yong-Jiang

    2017-07-04

    Metabolomics is a critical member in systems biology. Although great progress has been achieved in metabolomics, there are still some problems in sample preparation, data processing and data interpretation. In this review, we intend to explore the roles, challenges and trends in sample preparation for mass spectrometry- (MS-) based metabolomics. The newly emerged sample preparation methods were also critically examined, including laser microdissection, in vivo sampling, dried blood spot, microwave, ultrasound and enzyme-assisted extraction, as well as microextraction techniques. Finally, we provide some conclusions and perspectives for sample preparation in MS-based metabolomics.

  18. Use of Research-Based Instructional Strategies: How to Avoid Faculty Quitting

    Science.gov (United States)

    Wieman, Carl; Deslauriers, Louis; Gilley, Brett

    2013-01-01

    We have examined the teaching practices of faculty members who adopted research-based instructional strategies (RBIS) as part of the Carl Wieman Science Education Initiative (CWSEI) at the University of British Columbia (UBC). Of the 70 that adopted such strategies with the support of the CWSEI program, only one subsequently stopped using these…

  19. Improving Reading Instruction through Research-Based Instructional Strategies

    Science.gov (United States)

    Nash, Vickie Lynn

    2010-01-01

    The diverse population of students in grades 1- 3 at a suburban elementary school has created a challenge for teachers when differentiating instruction in reading. The purpose of this doctoral project study was to explore the lived experiences of these teachers as they have acquired research-based instructional strategies in reading that support…

  20. A hybrid computational strategy to address WGS variant analysis in >5000 samples.

    Science.gov (United States)

    Huang, Zhuoyi; Rustagi, Navin; Veeraraghavan, Narayanan; Carroll, Andrew; Gibbs, Richard; Boerwinkle, Eric; Venkata, Manjunath Gorentla; Yu, Fuli

    2016-09-10

    The decreasing costs of sequencing are driving the need for cost-effective and real-time variant calling of whole genome sequencing data. The scale of these projects is far beyond the capacity of the computing resources typically available to research labs. Other infrastructures, such as the AWS cloud environment and supercomputers, also have limitations that make large-scale joint variant calling infeasible: infrastructure-specific variant calling strategies either fail to scale up to large datasets or abandon joint calling altogether. We present a high-throughput framework including multiple variant callers for single nucleotide variant (SNV) calling, which leverages a hybrid computing infrastructure consisting of the AWS cloud, supercomputers and local high-performance computing infrastructures. We present a novel binning approach for large-scale joint variant calling and imputation which can scale up to over 10,000 samples while producing SNV callsets with high sensitivity and specificity. As a proof of principle, we present results of analysis on the Cohorts for Heart And Aging Research in Genomic Epidemiology (CHARGE) WGS freeze 3 dataset, in which joint calling, imputation and phasing of over 5300 whole genome samples was produced in under 6 weeks using four state-of-the-art callers: SNPTools, GATK-HaplotypeCaller, GATK-UnifiedGenotyper and GotCloud. We used Amazon AWS, a 4000-core in-house cluster at Baylor College of Medicine, the IBM PowerPC Blue BioU at Rice and Rhea at Oak Ridge National Laboratory (ORNL) for the computation. AWS was used for joint calling of 180 TB of BAM files, and the ORNL and Rice supercomputers were used for the imputation and phasing step. All other steps were carried out on the local compute cluster. The entire operation used 5.2 million core hours and transferred only a total of 6 TB of data across the platforms. Even with increasing sizes of whole genome datasets, ensemble joint calling of SNVs for low

  1. Strategies for adding adaptive learning mechanisms to rule-based diagnostic expert systems

    Science.gov (United States)

    Stclair, D. C.; Sabharwal, C. L.; Bond, W. E.; Hacke, Keith

    1988-01-01

    Rule-based diagnostic expert systems can be used to perform many of the diagnostic chores necessary in today's complex space systems. These expert systems typically take a set of symptoms as input and produce diagnostic advice as output. The primary objective of such expert systems is to provide accurate and comprehensive advice which can be used to help return the space system in question to nominal operation. The development and maintenance of diagnostic expert systems is time and labor intensive, since the services of both knowledge engineer(s) and domain expert(s) are required. The use of adaptive learning mechanisms to incrementally evaluate and refine rules promises to reduce both the time and labor costs associated with such systems. This paper describes the basic adaptive learning mechanisms of strengthening, weakening, generalization, discrimination, and discovery. Next, basic strategies are discussed for adding these learning mechanisms to rule-based diagnostic expert systems. These strategies support the incremental evaluation and refinement of rules in the knowledge base by comparing the set of advice given by the expert system (A) with the correct diagnosis (C). Techniques are described for selecting those rules in the knowledge base which should participate in adaptive learning. The strategies presented may be used with a wide variety of learning algorithms. Further, these strategies are applicable to a large number of rule-based diagnostic expert systems. They may be used to provide either immediate or deferred updating of the knowledge base.

  2. Agent Based Simulation of Group Emotions Evolution and Strategy Intervention in Extreme Events

    Directory of Open Access Journals (Sweden)

    Bo Li

    2014-01-01

    Full Text Available Agent-based simulation has become a prominent approach to computational modeling and analysis of public emergency management in social science research. Group emotion evolution, information diffusion, and collective behavior selection make studies of extreme incidents a complex system problem, which requires new methods for incident management and strategy evaluation. This paper studies group emotion evolution and intervention strategy effectiveness using an agent-based simulation method. Employing a computational experimentation methodology, we model group emotion evolution as a complex system and test the effects of three strategies. In addition, an events-chain model is proposed to capture the cumulative influence of temporally successive events. Each strategy is examined through three simulation experiments, including two made-up scenarios and a real case study. We show how the various strategies could impact group emotion evolution in terms of complex emergence and cumulative emotional influence in extreme events. This paper also provides an effective method for using agent-based simulation to study complex collective behavior evolution problems in the extreme incident, emergency, and security domains.

  3. The Search for Suitable Strategy: Threat-Based and Capabilities-Based Strategies in a Complex World

    Science.gov (United States)

    2016-05-26

    the 1973 Arab-Israeli War show that the true path to suitable strategy is a measure of forethought and theoretical planning exercises to shape habits of thought and identify risks or shortcomings inherent in a chosen...political direction to the military instrument of power, while military strategy links military means to those...

  4. Performance of local information-based link prediction: a sampling perspective

    Science.gov (United States)

    Zhao, Jichang; Feng, Xu; Dong, Li; Liang, Xiao; Xu, Ke

    2012-08-01

    Link prediction is pervasively employed to uncover the missing links in snapshots of real-world networks, which are usually obtained through different kinds of sampling methods. In the previous literature, in order to evaluate prediction performance, the known edges in the sampled snapshot are divided into the training set and the probe set randomly, without considering the underlying sampling approach. However, different sampling methods might lead to different missing links, especially for biased ones. For this reason, random partition-based evaluation of performance is no longer convincing once the sampling method is taken into account. In this paper, we re-evaluate the performance of local information-based link predictions through a division of the training set and the probe set that is governed by the sampling method. Interestingly, we find that each prediction approach performs unevenly across different sampling methods. Moreover, most of these predictions perform weakly when the sampling method is biased, which indicates that the performance of these methods may have been overestimated in prior works.

  5. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    Directory of Open Access Journals (Sweden)

    Young Soo Suh

    2009-04-01

    Full Text Available This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD), sensor data are transmitted to the estimator node if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than time-triggered sampling in some situations, especially in terms of network bandwidth usage. However, it cannot detect packet dropouts, because data transmission and reception do not use the periodical time-stamp mechanism found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme, called modified SOD, in which sensor data are sent when either the change of the sensor output exceeds a given threshold or more than a given interval has elapsed since the last transmission. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen.
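    The modified-SOD trigger described above is simple enough to sketch in a few lines (an illustrative toy with made-up threshold and time-out values, not the authors' implementation):

```python
def make_modified_sod(delta, max_interval):
    """Decision function for the modified send-on-delta scheme: transmit when
    the sensor value has moved more than `delta` since the last transmission,
    or when more than `max_interval` time units have elapsed since it."""
    state = {"last_value": None, "last_time": None}

    def should_send(t, value):
        if state["last_value"] is None:
            send = True  # always transmit the first sample
        else:
            send = (abs(value - state["last_value"]) > delta
                    or t - state["last_time"] > max_interval)
        if send:
            state["last_value"], state["last_time"] = value, t
        return send

    return should_send

# A slowly drifting signal: the delta condition alone never fires, but the
# time-out forces periodic transmissions.
sod = make_modified_sod(delta=1.0, max_interval=5)
sent = [t for t in range(20) if sod(t, 0.05 * t)]
print(sent)  # → [0, 6, 12, 18]
```

    On such a signal, plain SOD would stay silent forever and the estimator could not distinguish silence from packet loss; the time-out condition guarantees a transmission at least every `max_interval` time units, which is what makes dropouts detectable.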

  6. Sampling Key Populations for HIV Surveillance: Results From Eight Cross-Sectional Studies Using Respondent-Driven Sampling and Venue-Based Snowball Sampling.

    Science.gov (United States)

    Rao, Amrita; Stahlman, Shauna; Hargreaves, James; Weir, Sharon; Edwards, Jessie; Rice, Brian; Kochelani, Duncan; Mavimbela, Mpumelelo; Baral, Stefan

    2017-10-20

    In using regularly collected or existing surveillance data to characterize engagement in human immunodeficiency virus (HIV) services among marginalized populations, differences in sampling methods may produce different pictures of the target population and may therefore result in different priorities for response. The objective of this study was to use existing data to evaluate the sample distribution of eight studies of female sex workers (FSW) and men who have sex with men (MSM), who were recruited using different sampling approaches in two locations within Sub-Saharan Africa: Manzini, Swaziland and Yaoundé, Cameroon. MSM and FSW participants were recruited using either respondent-driven sampling (RDS) or venue-based snowball sampling. Recruitment took place between 2011 and 2016. Participants at each study site were administered a face-to-face survey to assess sociodemographics, along with the prevalence of self-reported HIV status, frequency of HIV testing, stigma, and other HIV-related characteristics. Crude and RDS-adjusted prevalence estimates were calculated. Crude prevalence estimates from the venue-based snowball samples were compared with the overlap of the RDS-adjusted prevalence estimates, between both FSW and MSM in Cameroon and Swaziland. RDS samples tended to be younger (MSM aged 18-21 years in Swaziland: 47.6% [139/310] in RDS vs 24.3% [42/173] in Snowball, in Cameroon: 47.9% [99/306] in RDS vs 20.1% [52/259] in Snowball; FSW aged 18-21 years in Swaziland 42.5% [82/325] in RDS vs 8.0% [20/249] in Snowball; in Cameroon 15.6% [75/576] in RDS vs 8.1% [25/306] in Snowball). They were less educated (MSM: primary school completed or less in Swaziland 42.6% [109/310] in RDS vs 4.0% [7/173] in Snowball, in Cameroon 46.2% [138/306] in RDS vs 14.3% [37/259] in Snowball; FSW: primary school completed or less in Swaziland 86.6% [281/325] in RDS vs 23.9% [59/247] in Snowball, in Cameroon 87.4% [520/576] in RDS vs 77.5% [238/307] in Snowball) than the snowball

  7. Replication Strategy for Spatiotemporal Data Based on Distributed Caching System.

    Science.gov (United States)

    Xiong, Lian; Yang, Liu; Tao, Yang; Xu, Juan; Zhao, Lun

    2018-01-14

    The replica strategy in distributed cache can effectively reduce user access delay and improve system performance. However, developing a replica strategy suitable for varied application scenarios is still quite challenging, owing to differences in user access behavior and preferences. In this paper, a replication strategy for spatiotemporal data (RSSD) based on a distributed caching system is proposed. By taking advantage of the spatiotemporal locality and correlation of user access, RSSD mines high popularity and associated files from historical user access information, and then generates replicas and selects appropriate cache node for placement. Experimental results show that the RSSD algorithm is simple and efficient, and succeeds in significantly reducing user access delay.
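    As a highly simplified illustration of the popularity-mining step (the log format and file names here are hypothetical, and the real RSSD additionally exploits spatiotemporal correlation between files), one could count historical accesses and replicate the hottest files:

```python
from collections import Counter

def pick_replica_candidates(access_log, top_k):
    """Toy sketch of popularity mining: count how often each file appears in
    the historical access log and select the top_k hottest for replication."""
    freq = Counter(name for _, name in access_log)  # entries are (time, file)
    return [name for name, _ in freq.most_common(top_k)]

# Hypothetical access history for spatiotemporal tiles.
log = [(1, "tile_a"), (2, "tile_b"), (3, "tile_a"),
       (4, "tile_c"), (5, "tile_a"), (6, "tile_b")]
print(pick_replica_candidates(log, 2))  # → ['tile_a', 'tile_b']
```

    A real strategy would then place those replicas on cache nodes close to the users who generated the accesses, which is where the delay reduction comes from.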

  8. Effectiveness and cost of recruitment strategies for a community-based randomised controlled trial among rainwater drinkers

    Directory of Open Access Journals (Sweden)

    Cunliffe David

    2009-07-01

    Full Text Available Abstract Background Community-based recruitment is challenging, particularly if the sampling frame is not easily defined, as in the case of people who drink rainwater. Strategies for contacting participants must be carefully considered to maximise generalisability and minimise bias of the results. This paper assesses the recruitment strategies for a 1-year double-blinded randomised trial on drinking untreated rainwater. The effectiveness of the recruitment strategies and associated costs are described. Methods Community recruitment of households from Adelaide, Australia occurred from February to July 2007 using four methods: electoral roll mail-out, approaches to schools and community groups, newspaper advertising, and other media involvement. Word of mouth communication was also assessed. Results A total of 810 callers were screened, with 53.5% eligible. Of those who were eligible and sent further information, 76.7% were willing to participate in the study and 75.1% were enrolled. The target for recruitment was 300 households, and this was achieved. The mail-out was the most effective method with respect to the number of households randomised, while recruitment via schools had the highest yield (57.3%) and was the most cost-effective when considering cost per household randomised (AUD$147.20). Yield and cost-effectiveness were lowest for media advertising. Conclusion The use of electoral roll mail-out and advertising via schools were effective in reaching households using untreated rainwater for drinking. Employing multiple strategies enabled success in achieving the recruitment target. In countries where electoral roll extracts are available to researchers, this method is likely to have a high yield for recruitment into community-based epidemiological studies.

  9. Bias of shear wave elasticity measurements in thin layer samples and a simple correction strategy.

    Science.gov (United States)

    Mo, Jianqiang; Xu, Hao; Qiang, Bo; Giambini, Hugo; Kinnick, Randall; An, Kai-Nan; Chen, Shigao; Luo, Zongping

    2016-01-01

    Shear wave elastography (SWE) is an emerging technique for measuring biological tissue stiffness. However, the application of SWE in thin-layer tissues is limited by bias due to the influence of geometry on the measured shear wave speed. In this study, we investigated the bias of Young's modulus measured by SWE in thin-layer gelatin-agar phantoms, and compared the results with finite element method and Lamb wave model simulations. The results indicated that the Young's modulus measured by SWE decreased continuously as the sample thickness decreased, and this effect was more significant at smaller thicknesses. We propose a new empirical formula which can conveniently correct the bias without the need for complicated mathematical modeling. In summary, we confirmed the nonlinear relation between thickness and Young's modulus measured by SWE in thin-layer samples, and offer a simple and practical correction strategy which is convenient for clinicians to use.

  10. An integrated strategy for in vivo metabolite profiling using high-resolution mass spectrometry based data processing techniques

    International Nuclear Information System (INIS)

    Guo, Jian; Zhang, Minli; Elmore, Charles S.; Vishwanathan, Karthick

    2013-01-01

    Graphical abstract: -- Highlights: •Profiling the metabolites of model compounds in rats using high-resolution mass spectrometry-based data processing techniques. •Demonstrating an integrated strategy for in vivo metabolite profiling using data mining tools. •Unusual metabolites generated via thiazole-ring opening were characterized based on processed LC–MS data. -- Abstract: An ongoing challenge of drug metabolite profiling is to detect and identify unknown or low-level metabolites in complex biological matrices. Here we present a generic strategy for metabolite detection using multiple accurate-mass-based data processing tools via the analysis of rat samples of two model drug candidates, AZD6280 and AZ12488024. First, the isotopic pattern recognition function proved to be highly effective in the detection of metabolites derived from [14C]-AZD6280, which possesses a distinct isotopic pattern. The metabolites revealed using this approach were in excellent qualitative correlation with those observed in radiochromatograms. Second, the effectiveness of accurate-mass-based untargeted data mining tools, such as background subtraction, mass defect filtering, or a data mining package (MZmine) used for metabolomic analysis, in the detection of metabolites of [14C]-AZ12488024 in rat urine, feces, bile and plasma samples was examined, and a total of 33 metabolites of AZ12488024 were detected. Among them, at least 16 metabolites were only detected with the aid of the data mining packages and not via radiochromatograms. New metabolic pathways, such as S-oxidation and thiomethylation reactions occurring on the thiazole ring, were proposed based on the processed data. The results of these experiments also demonstrated that accurate-mass-based mass defect filtering (MDF) and the data mining techniques used in metabolomics are complementary and can be valuable tools for delineating low-level metabolites in complex matrices. Furthermore, the application of distinct multiple data

  11. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

    Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and potent compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies, in computational fragment-based drug design. In particular, this review compares the specifications and advantages of experimental and computational FBDD, and discusses limitations and future prospects.

  12. Assessment of Peer-Based and Structural Strategies for Increasing ...

    African Journals Online (AJOL)

    Assessment of Peer-Based and Structural Strategies for Increasing Male Participation in an Antenatal Setting in Lilongwe, Malawi. SM Mphonda, NE Rosenberg, E Kamanga, I Mofolo, G Mwale, E Boa, M Mwale, F Martinson, I Hoffman, MC Hosseinipour ...

  13. Cell-based therapeutic strategies for multiple sclerosis

    DEFF Research Database (Denmark)

    Scolding, Neil J; Pasquini, Marcelo; Reingold, Stephen C

    2017-01-01

    and none directly promotes repair. Cell-based therapies, including immunoablation followed by autologous haematopoietic stem cell transplantation, mesenchymal and related stem cell transplantation, pharmacologic manipulation of endogenous stem cells to enhance their reparative capabilities......, and transplantation of oligodendrocyte progenitor cells, have generated substantial interest as novel therapeutic strategies for immune modulation, neuroprotection, or repair of the damaged central nervous system in multiple sclerosis. Each approach has potential advantages but also safety concerns and unresolved...

  14. Decision making in advanced otosclerosis: an evidence-based strategy

    NARCIS (Netherlands)

    Merkus, P.; van Loon, M.C.; Smit, C.F.G.M.; Smits, J.C.M.; de Cock, A.F.C.; Hensen, E.F.

    2011-01-01

    Objectives/Hypothesis: To propose an evidence-based strategy for the management of patients with advanced otosclerosis accompanied by severe to profound hearing loss. Study Design: Systematic review of the literature and development of treatment guidelines. Methods: A systematic review was conducted

  15. Fundraising Strategies Developed by MBA Students in Project-Based Learning Courses

    Science.gov (United States)

    Arantes do Amaral, Joao Alberto; Petroni, Liége Mariel; Hess, Aurélio

    2016-01-01

    The ability to raise funds is a skill that most modern project managers need. While a good deal of literature exists on the strategies NGOs employ to raise funds for their operations, less attention has been paid to the strategies used by students involved in Project-Based Learning courses that often partner with NGOs. Fundraising is an important…

  16. Risk-Based Sampling: I Don't Want to Weight in Vain.

    Science.gov (United States)

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
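    For intuition, the two-asset case of mean-variance optimization has a closed form, and a tiny sketch (with made-up variances and covariance, not data from the article) shows how the "optimal" weights compare with the equal-allocation heuristic discussed above:

```python
def min_variance_weights(var1, var2, cov):
    """Closed-form minimum-variance weights for a two-asset portfolio."""
    w1 = (var2 - cov) / (var1 + var2 - 2 * cov)
    return w1, 1 - w1

def portfolio_variance(w1, w2, var1, var2, cov):
    """Variance of a two-asset portfolio with weights w1, w2."""
    return w1 ** 2 * var1 + w2 ** 2 * var2 + 2 * w1 * w2 * cov

# Hypothetical risk parameters for two producers' lots.
var1, var2, cov = 0.04, 0.09, 0.006
w1, w2 = min_variance_weights(var1, var2, cov)
opt = portfolio_variance(w1, w2, var1, var2, cov)
naive = portfolio_variance(0.5, 0.5, var1, var2, cov)
print(w1, opt, naive)  # optimal weights beat the 50/50 split in variance
```

    The article's point is that `var1`, `var2`, and `cov` must themselves be estimated from data; when those estimates are noisy, the "optimal" weights move with the estimation error and can underperform the simple equal split out of sample.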

  17. Bidding strategy in pay-as-bid markets based on supplier-market interaction analysis

    International Nuclear Information System (INIS)

    Bigdeli, N.; Afshar, K.; Fotuhi-Firuzabad, M.

    2010-01-01

    In this paper, a new bidding strategy for pay-as-bid market suppliers is introduced. This method is based on a systematic analysis of the interactions between the market and suppliers via several market indices, as well as forecasting of important indices by artificial neural networks. The proposed method also considers the practical limitations in the system and deals closely with incomplete information handling. Next, a strategic bidding approach is proposed for optimal bidding by suppliers. The paper focuses on the experimental situation of the Iran electricity market as a pay-as-bid market, and a sample generating company with several generating units from this market is considered as the benchmark. The results of applying this approach to the generating company demonstrate the good performance of the proposed method.

  18. FPGA Techniques Based New Hybrid Modulation Strategies for Voltage Source Inverters

    Science.gov (United States)

    Sudha, L. U.; Baskaran, J.; Elankurisil, S. A.

    2015-01-01

    This paper presents three different hybrid modulation strategies suitable for single-phase voltage source inverters. The proposed method is formulated using fundamental switching and carrier-based pulse width modulation methods. Its main aim is to optimize a specific performance criterion, such as minimization of the total harmonic distortion (THD), lower-order harmonics, switching losses, and heat losses. Thus, the harmonic pollution in the power system will be reduced and the power quality will be augmented, with a better harmonic profile for a target fundamental output voltage. The proposed modulation strategies are simulated in MATLAB r2010a and implemented in a Xilinx Spartan 3E-500 FG 320 FPGA processor. The feasibility of these modulation strategies is authenticated through simulation and experimental results. PMID:25821852

  19. Sampling strategies for millipedes (Diplopoda), centipedes ...

    African Journals Online (AJOL)

    At present considerable effort is being made to document and describe invertebrate diversity as part of numerous biodiversity conservation research projects. In order to determine diversity, rapid and effective sampling and estimation procedures are required and these need to be standardized for a particular group of ...

  20. Cattle farmers’ perceptions of risk and risk management strategies

    DEFF Research Database (Denmark)

    Bishu, Kinfe G.; O'Reilly, Seamus; Lahiff, Edward

    2018-01-01

    This study analyzes cattle farmers’ perceptions of risk and risk management strategies in Tigray, Northern Ethiopia. We use survey data from a sample of 356 farmers based on multistage random sampling. Factor analysis is employed to classify scores of risk and management strategies, and multiple … utilization were perceived as the most important strategies for managing risks. Livestock disease and labor shortage were perceived as less of a risk by farmers who adopted the practice of zero grazing compared to other farmers, pointing to the potential of this practice for risk reduction. We find strong … evidence that farmers engage in multiple risk management practices in order to reduce losses from cattle morbidity and mortality. The results suggest that government strategies that aim at reducing farmers’ risk need to be tailored to specific farm and farmer characteristics. Findings from this study have …

  1. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  2. Replication Strategy for Spatiotemporal Data Based on Distributed Caching System

    Science.gov (United States)

    Xiong, Lian; Tao, Yang; Xu, Juan; Zhao, Lun

    2018-01-01

    The replica strategy in distributed cache can effectively reduce user access delay and improve system performance. However, developing a replica strategy suitable for varied application scenarios is still quite challenging, owing to differences in user access behavior and preferences. In this paper, a replication strategy for spatiotemporal data (RSSD) based on a distributed caching system is proposed. By taking advantage of the spatiotemporal locality and correlation of user access, RSSD mines high popularity and associated files from historical user access information, and then generates replicas and selects appropriate cache node for placement. Experimental results show that the RSSD algorithm is simple and efficient, and succeeds in significantly reducing user access delay. PMID:29342897
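The mining step described above can be sketched in a few lines. This is an illustrative toy, not the RSSD algorithm: the access-log format, the top-k rule and the co-occurrence threshold are invented for the example.

```python
from collections import Counter
from itertools import combinations

def mine_replicas(access_log, top_k=2, min_cooccur=2):
    """access_log: list of (user, [files accessed in one session]).
    Returns the most popular files (extra-replica candidates) and the
    file pairs accessed together often enough to be co-located."""
    popularity = Counter()
    cooccur = Counter()
    for _user, files in access_log:
        popularity.update(files)
        for a, b in combinations(sorted(set(files)), 2):
            cooccur[(a, b)] += 1
    hot = [f for f, _ in popularity.most_common(top_k)]
    associated = {pair for pair, n in cooccur.items() if n >= min_cooccur}
    return hot, associated

log = [
    ("u1", ["map_A", "map_B"]),
    ("u2", ["map_A", "map_B"]),
    ("u3", ["map_A", "map_C"]),
]
hot, assoc = mine_replicas(log)
print(hot)    # high-popularity files get extra replicas
print(assoc)  # frequently co-accessed pairs are placed on the same cache node
```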

  3. Listening Strategy Use and Influential Factors in Web-Based Computer Assisted Language Learning

    Science.gov (United States)

    Chen, L.; Zhang, R.; Liu, C.

    2014-01-01

    This study investigates second and foreign language (L2) learners' listening strategy use and factors that influence their strategy use in a Web-based computer assisted language learning (CALL) system. A strategy inventory, a factor questionnaire and a standardized listening test were used to collect data from a group of 82 Chinese students…

  4. Strategy-Driven Exploration for Rule-Based Models of Biochemical Systems with Porgy

    OpenAIRE

    Andrei , Oana; Fernández , Maribel; Kirchner , Hélène; Pinaud , Bruno

    2016-01-01

    This paper presents Porgy – an interactive visual environment for rule-based modelling of biochemical systems. We model molecules and molecule interactions as port graphs and port graph rewrite rules, respectively. We use rewriting strategies to control which rules to apply, and where and when to apply them. Our main contributions to rule-based modelling of biochemical systems lie in the strategy language and the associated visual and interactive features offered by Porgy. These features facilitate …

  5. Investment Strategy Based on Aviation Accidents: Are there abnormal returns?

    Directory of Open Access Journals (Sweden)

    Marcos Rosa Costa

    2013-06-01

    Full Text Available This article investigates whether an investment strategy based on aviation accidents can generate abnormal returns. We performed an event study considering all the aviation accidents with more than 10 fatalities in the period from 1998 to 2009 and the stock market performance of the respective airlines and aircraft manufacturers in the days after the event. The tests performed were based on the model of Campbell, Lo & MacKinlay (1997) for the definition of abnormal returns, by means of linear regression between the firms’ stock returns and the return of a market portfolio used as a benchmark. This enabled projecting the expected future returns of the airlines and aircraft makers, for comparison with the observed returns after each event. The result obtained suggests that an investment strategy based on aviation accidents is feasible because abnormal returns can be obtained in the period immediately following an aviation disaster.
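The market-model event study described above can be sketched as follows. The data are synthetic and the window lengths are arbitrary; the regression step follows Campbell, Lo & MacKinlay's approach only loosely.

```python
import numpy as np

rng = np.random.default_rng(0)
market = rng.normal(0.0005, 0.01, 140)          # 140 daily market returns
stock = 0.0002 + 1.2 * market + rng.normal(0, 0.005, 140)
stock[120:] -= 0.03 / 20                        # simulated post-event drift

est_m, est_s = market[:120], stock[:120]        # estimation window
# OLS market model: stock_t = alpha + beta * market_t + eps_t
beta, alpha = np.polyfit(est_m, est_s, 1)

event_m, event_s = market[120:], stock[120:]    # 20-day event window
abnormal = event_s - (alpha + beta * event_m)   # actual minus predicted
car = abnormal.sum()                            # cumulative abnormal return
print(f"alpha={alpha:.5f} beta={beta:.3f} CAR={car:.4f}")
```

A significance test on CAR (e.g. against its estimated standard error) would complete the event study.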

  6. On incomplete sampling under birth-death models and connections to the sampling-based coalescent.

    Science.gov (United States)

    Stadler, Tanja

    2009-11-07

    The constant rate birth-death process is used as a stochastic model for many biological systems, for example phylogenies or disease transmission. As the biological data are usually not fully available, it is crucial to understand the effect of incomplete sampling. In this paper, we analyze the constant rate birth-death process with incomplete sampling. We derive the density of the bifurcation events for trees on n leaves which evolved under this birth-death-sampling process. This density is used for calculating prior distributions in Bayesian inference programs and for efficiently simulating trees. We show that the birth-death-sampling process can be interpreted as a birth-death process with reduced rates and complete sampling. This shows that joint inference of birth rate, death rate and sampling probability is not possible. The birth-death-sampling process is compared to the sampling-based population genetics model, the coalescent. It is shown that despite many similarities between these two models, the distribution of bifurcation times remains different even in the case of very large population sizes. We illustrate these findings on a hepatitis C virus dataset from Egypt. We show that the transmission time estimates are significantly different; the widely used Gamma statistic even changes its sign from negative to positive when switching from the coalescent to the birth-death process.
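A forward simulation makes the birth-death-sampling setup concrete. This is an illustrative sketch, not the paper's analytical derivation: the rates, time horizon and sampling probability rho are arbitrary, and only lineage counts are tracked (no tree structure).

```python
import random

def birth_death(t_max, birth=1.0, death=0.5, seed=42):
    """Gillespie simulation of a constant-rate birth-death process started
    from one lineage; returns the number of lineages surviving to t_max."""
    rng = random.Random(seed)
    t, n = 0.0, 1
    while n > 0:
        rate = n * (birth + death)          # total event rate
        t += rng.expovariate(rate)          # time to next birth or death
        if t >= t_max:
            return n
        n += 1 if rng.random() < birth / (birth + death) else -1
    return 0                                # clade went extinct

def sample_tips(n, rho, seed=7):
    """Incomplete sampling: each surviving lineage is kept with probability rho."""
    rng = random.Random(seed)
    return sum(rng.random() < rho for _ in range(n))

survivors = birth_death(t_max=5.0)
sampled = sample_tips(survivors, rho=0.5)
print(survivors, sampled)
```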

  7. The Impact of a Strategies-Based Instruction on Iranian EAP Students’ Reading Strategy Use: Developing Strategic EAP Readers

    Directory of Open Access Journals (Sweden)

    Seyyed Hossein Kashef

    2014-01-01

    Full Text Available Underperformance of students in EAP reading comprehension has been an issue of concern for teachers, syllabus designers, and curriculum developers in general and for EAP practitioners in particular. In spite of the fact that considerable efforts have been made to improve reading comprehension of students through strategies instruction over the past decades, EAP students have not benefited much from learning strategies. Thus, this study intended to investigate the impact of a Strategies-Based Instruction (SBI) on undergraduate students’ reading strategy use in an EAP context. Taking an instructional model from the strategies taxonomy of Oxford (1990, 2001), it was assumed that, in contrast to conventional EAP reading methods, SBI would be more effective in encouraging reading strategy use and as a result developing reading comprehension of EAP students through encouraging the use of effective strategies and skills. To do so, 80 freshman undergraduate students in two intact classes were chosen as the participants of this study. After administration of a pre-test, treatment (22 sessions, 2 sessions per week), and a post-test, the collected data were analyzed using a t-test to examine the effect of the proposed method of instruction. The results of the analysis showed that the teaching intervention had a significant effect on students’ reading strategy use. The findings have implications for teachers encouraging effective reading comprehension instruction through the use of strategies in EAP teaching contexts.

  8. Strategies for monitoring the emerging polar organic contaminants in water with emphasis on integrative passive sampling.

    Science.gov (United States)

    Söderström, Hanna; Lindberg, Richard H; Fick, Jerker

    2009-01-16

    Although polar organic contaminants (POCs) such as pharmaceuticals are considered among today's most emerging contaminants, few of them are regulated or included in ongoing monitoring programs. However, the growing concern among the public and researchers, together with the new legislation within the European Union, the Registration, Evaluation and Authorisation of Chemicals (REACH) system, will increase the future need for simple, low-cost strategies for monitoring and risk assessment of POCs in aquatic environments. In this article, we review the advantages and shortcomings of traditional and novel sampling techniques available for monitoring the emerging POCs in water. The benefits and drawbacks of using active and biological sampling are discussed and the principles of organic passive samplers (PS) presented. A detailed overview of the types of polar organic PS available, their classes of target compounds and fields of application is given, and the considerations involved in using them, such as environmental effects and quality control, are discussed. The usefulness of biological sampling of POCs in water was found to be limited. Polar organic PS were considered the only available, but nevertheless efficient, alternative to active water sampling due to their simplicity, low cost, lack of need for a power supply or maintenance, and the ability to collect time-integrative samples with one sample collection. However, polar organic PS need to be further developed before they can be used as standard in water quality monitoring programs.

  9. Sample-Based Extreme Learning Machine with Missing Data

    Directory of Open Access Journals (Sweden)

    Hang Gao

    2015-01-01

    Full Text Available Extreme learning machine (ELM) has been extensively studied in the machine learning community during the last few decades due to its high efficiency and its unification of classification, regression, and so forth. Despite these merits, existing ELM algorithms cannot efficiently handle the issue of missing data, which is relatively common in practical applications. The problem of missing data is commonly handled by imputation (i.e., replacing missing values with substituted values according to available information). However, imputation methods are not always effective. In this paper, we propose a sample-based learning framework to address this issue. Based on this framework, we develop two sample-based ELM algorithms, for classification and regression, respectively. Comprehensive experiments have been conducted on synthetic data sets, UCI benchmark data sets, and a real-world fingerprint image data set. As indicated, without introducing extra computational complexity, the proposed algorithms achieve more accurate and stable learning than other state-of-the-art ones, especially in the case of higher missing ratios.
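For context, a minimal ELM for regression looks like this: the hidden layer is random and fixed, and only the output weights are solved by least squares. This is the generic ELM, not the paper's sample-based variant; all sizes and the toy target function are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def elm_fit(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights by least squares
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])
model = elm_fit(X, y)
rmse = np.sqrt(np.mean((elm_predict(model, X) - y) ** 2))
print(f"training RMSE = {rmse:.4f}")
```

The single least-squares solve is what gives ELM its speed; the sample-based variant changes how training samples with missing features enter this solve.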

  10. Optimum Performance Enhancing Strategies of the Gas Turbine Based on the Effective Temperatures

    Directory of Open Access Journals (Sweden)

    Ibrahim Thamir K.

    2016-01-01

    Full Text Available Gas turbines (GT) have come to play a significant role in distributed energy systems due to their multi-fuel capability, compact size, low environmental impact and reduced cost. Nevertheless, the low electrical efficiency, typically about 30% (LHV), is an important obstruction to the development of GT plants. New strategies are designed for the GT plant to increase the overall performance, based on operational modeling and optimization of GT power plants. The effect of the enhancing strategies on the GT power plant’s performance (with intercooler, two-shaft, reheat and regenerative configurations) is evaluated against a real GT power plant. An analysis based on thermodynamics has been carried out on the modified cycle configurations. The results then show the effect of the ambient and turbine inlet temperatures on the performance of the GT plants, used to select an optimum strategy for the GT. The performance model code to compare the strategies of the GT plants was developed in MATLAB. The results show that the best thermal efficiency occurs in the intercooler-regenerative-reheated GT strategy (IRHGT); it decreased from 51.5% to 48% when the ambient temperature increased from 273 to 327 K. Furthermore, the thermal efficiency of the GT for the strategies without regeneration increased by about 3.3%, while the thermal efficiency for the strategies with regeneration increased by about 22%, with increasing turbine inlet temperature. The lowest thermal efficiency occurs in the IHGT strategy, while the highest occurs in the IRHGT strategy. However, the power output variation is more significant at higher values of the turbine inlet temperature. The simulation model gives results consistent with the Baiji GT plant. The extensive modeling performed in this study reveals that the ambient temperature and turbine inlet temperature strongly influence the performance of the GT plant.
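The qualitative trend reported above can be reproduced with textbook ideal air-standard Brayton-cycle formulas (cold-air assumptions, gamma = 1.4). This back-of-the-envelope sketch is far simpler than the paper's MATLAB model; the pressure ratio and temperatures are arbitrary.

```python
GAMMA = 1.4

def eta_simple(r):
    """Ideal simple-cycle efficiency depends only on the pressure ratio r."""
    return 1.0 - r ** (-(GAMMA - 1) / GAMMA)

def eta_regenerative(r, t_ambient, t_inlet):
    """Ideal regenerative cycle: efficiency rises with turbine inlet temperature."""
    return 1.0 - (t_ambient / t_inlet) * r ** ((GAMMA - 1) / GAMMA)

r = 10.0
for t_in in (1200.0, 1400.0, 1600.0):   # turbine inlet temperature, K
    print(t_in, round(eta_simple(r), 3),
          round(eta_regenerative(r, 300.0, t_in), 3))
```

The simple-cycle efficiency is flat in turbine inlet temperature, while the regenerative cycle improves with it, mirroring the 3.3% vs. 22% gains quoted in the abstract.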

  11. A fault-tolerant strategy based on SMC for current-controlled converters

    Science.gov (United States)

    Azer, Peter M.; Marei, Mostafa I.; Sattar, Ahmed A.

    2018-05-01

    The sliding mode control (SMC) is used to control variable structure systems such as power electronics converters. This paper presents a fault-tolerant strategy based on the SMC for current-controlled AC-DC converters. The proposed SMC is based on three sliding surfaces for the three legs of the AC-DC converter. Two sliding surfaces are assigned to control the phase currents since the input three-phase currents are balanced. Hence, the third sliding surface is considered as an extra degree of freedom which is utilised to control the neutral voltage. This action is utilised to enhance the performance of the converter during open-switch faults. The proposed fault-tolerant strategy is based on allocating the sliding surface of the faulty leg to control the neutral voltage. Consequently, the current waveform is improved. The behaviour of the current-controlled converter during different types of open-switch faults is analysed. Double switch faults include three cases: two upper switch fault; upper and lower switch fault at different legs; and two switches of the same leg. The dynamic performance of the proposed system is evaluated during healthy and open-switch fault operations. Simulation results exhibit the various merits of the proposed SMC-based fault-tolerant strategy.

  12. Development Strategies for Traditional Markets Based on Local Wisdom to Alleviate Poverty in Bali

    Directory of Open Access Journals (Sweden)

    I Putu Gde Sukaatmadja

    2015-08-01

    Full Text Available The aim of this research is to formulate a development strategy for traditional markets based on local wisdom to alleviate poverty. The study was conducted in Bali Province. The sample was determined by non-proportionate random sampling; it consisted of 18 heads of traditional markets in Bali. The data were processed by SWOT analysis with an Internal-External Matrix (IE Matrix). The results show that the opportunities consist of per-capita income, inflation growth, traditional market revitalisation policy, stabilisation, social awareness, community appreciation, community taste changes, and adoption of information technology; the threats were the existence of modern markets and population growth. The future strengths are product diversity, local product differentiation, product quality, merchandise layout, price flexibility, and strategic location, whereas the weaknesses are parking area availability, transaction process, promotion, cooperation with the tourism industry, market cleanliness, public facility availability, and consumer services. Based on the SWOT analysis, the business position of traditional markets is in cell 1, which further becomes the basis for the development strategy, i.e. a “Growth and Maintenance Strategy”.
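The IE-Matrix placement mentioned above follows from weighted internal (IFE) and external (EFE) factor scores. The sketch below shows the mechanics only; the weights and ratings are invented for illustration and are not the study's survey data.

```python
def weighted_score(factors):
    """factors: list of (weight, rating 1-4); weights must sum to 1."""
    assert abs(sum(w for w, _ in factors) - 1.0) < 1e-9
    return sum(w * r for w, r in factors)

def ie_cell(ife, efe):
    """Map total scores (1.0-4.0) onto the 3x3 IE matrix; cell 1 = grow and build."""
    col = 0 if ife >= 3.0 else (1 if ife >= 2.0 else 2)
    row = 0 if efe >= 3.0 else (1 if efe >= 2.0 else 2)
    return row * 3 + col + 1

ife = weighted_score([(0.4, 4), (0.3, 3), (0.3, 3)])   # hypothetical internal factors
efe = weighted_score([(0.5, 4), (0.5, 3)])             # hypothetical external factors
print(ife, efe, ie_cell(ife, efe))
```

Strong scores on both axes land in cell 1, the "grow and maintain" region reported in the abstract.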

  13. Evaluation of sampling strategies to estimate crown biomass

    Science.gov (United States)

    Krishna P Poudel; Hailemariam Temesgen; Andrew N Gray

    2015-01-01

    Depending on tree and site characteristics crown biomass accounts for a significant portion of the total aboveground biomass in the tree. Crown biomass estimation is useful for different purposes including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire...

  14. Biosensor-based diagnostics of contaminated groundwater: assessment and remediation strategy

    International Nuclear Information System (INIS)

    Bhattacharyya, Jessica; Read, David; Amos, Sean; Dooley, Stephen; Killham, Kenneth; Paton, Graeme I.

    2005-01-01

    Shallow groundwater beneath a former airfield site in southern England has been heavily contaminated with a wide range of chlorinated solvents. The feasibility of using bacterial biosensors to complement chemical analysis and enable cost-effective, focused sampling has been assessed as part of a site evaluation programme. Five different biosensors, three metabolic (Vibrio fischeri, Pseudomonas fluorescens 10568 and Escherichia coli HB101) and two catabolic (Pseudomonas putida TVA8 and E. coli DH5α), were employed to identify areas where the availability and toxicity of pollutants are of most immediate environmental concern. The biosensors showed different sensitivities to each other and to the groundwater samples tested. There was generally good agreement with the chemical analyses. The potential efficacy of remediation strategies was explored by coupling sample manipulation with biosensor tests. Manipulation involved sparging and charcoal treatment procedures to simulate remediative engineering solutions. Sparging was sufficient at most locations. - Luminescent bacteria complement chemical analysis and support remediation technology

  15. Sample pre-concentration with high enrichment factors at a fixed location in paper-based microfluidic devices.

    Science.gov (United States)

    Yeh, Shih-Hao; Chou, Kuang-Hua; Yang, Ruey-Jen

    2016-03-07

    The lack of sensitivity is a major problem among microfluidic paper-based analytical devices (μPADs) for early disease detection and diagnosis. Accordingly, the present study presents a method for improving the enrichment factor of low-concentration biomarkers by using shallow paper-based channels realized through a double-sided wax-printing process. In addition, the enrichment factor is further enhanced by exploiting the ion concentration polarization (ICP) effect on the cathodic side of the nanoporous membrane, in which a stationary sample plug is obtained. The occurrence of ICP on the shallow-channel μPAD is confirmed by measuring the current-voltage response as the external voltage is increased from 0 to 210 V (or the field strength from 0 to 1.05 × 10⁴ V m⁻¹) over 600 s. In addition, to the best of our knowledge, the electroosmotic flow (EOF) speed on the μPAD fabricated with a wax channel is measured for the first time using a current monitoring method. The experimental results show that for a fluorescein sample, the concentration factor is increased from 130-fold in a conventional full-thickness paper channel to 944-fold in the proposed shallow channel. Furthermore, for a fluorescein isothiocyanate-labeled bovine serum albumin (FITC-BSA) sample, the proposed shallow-channel μPAD achieves an 835-fold improvement in the concentration factor. The concentration technique presented here provides a novel strategy for enhancing the detection sensitivity of μPAD applications.

  16. Finding metastabilities in reversible Markov chains based on incomplete sampling

    Directory of Open Access Journals (Sweden)

    Fackeldey Konstantin

    2017-01-01

    Full Text Available In order to fully characterize the state-transition behaviour of finite Markov chains one needs to provide the corresponding transition matrix P. In many applications, such as molecular simulation and drug design, the entries of the transition matrix P are estimated by generating realizations of the Markov chain and determining the one-step conditional probability Pij for a transition from state i to state j. This sampling can be computationally very demanding. Therefore, it is a good idea to reduce the sampling effort. The main purpose of this paper is to design a sampling strategy which provides a partial sampling of only a subset of the rows of such a matrix P. Our proposed approach fits very well to stochastic processes stemming from simulation of molecular systems or random walks on graphs, and it differs from matrix completion approaches, which try to approximate the transition matrix by using a low-rank assumption. It will be shown how Markov chains can be analyzed on the basis of a partial sampling. More precisely: first, we estimate the stationary distribution from a partially given matrix P. Second, we estimate the infinitesimal generator Q of P on the basis of this stationary distribution. Third, from the generator we compute the leading invariant subspace, which should be identical to the leading invariant subspace of P. Fourth, we apply Robust Perron Cluster Analysis (PCCA+) in order to identify metastabilities using this subspace.
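Two of the steps above (stationary distribution, leading invariant subspace) can be illustrated on a toy chain. Here the full 4-state matrix is invented and fully known, unlike the partially sampled P in the paper, and the sign split of the second eigenvector stands in for what PCCA+ does in general.

```python
import numpy as np

P = np.array([
    [0.90, 0.08, 0.01, 0.01],
    [0.08, 0.90, 0.01, 0.01],
    [0.01, 0.01, 0.90, 0.08],
    [0.01, 0.01, 0.08, 0.90],
])  # two tightly coupled blocks {0,1} and {2,3} with rare cross-transitions

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# A second eigenvalue close to 1 signals metastability; the sign structure
# of the corresponding eigenvector splits the states into metastable sets.
# (This P happens to be symmetric, so eigh applies.)
w, v = np.linalg.eigh(P)       # eigenvalues in ascending order
second = v[:, -2]              # eigenvector for the second-largest eigenvalue
clusters = second > 0
print(pi, w[-2], clusters)
```

For this matrix the second eigenvalue is 0.96, i.e. switching between the two blocks is the slow process.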

  17. Strategies for satellite-based monitoring of CO2 from distributed area and point sources

    Science.gov (United States)

    Schwandner, Florian M.; Miller, Charles E.; Duren, Riley M.; Natraj, Vijay; Eldering, Annmarie; Gunson, Michael R.; Crisp, David

    2014-05-01

    Atmospheric CO2 budgets are controlled by the strengths, as well as the spatial and temporal variabilities of CO2 sources and sinks. Natural CO2 sources and sinks are dominated by the vast areas of the oceans and the terrestrial biosphere. In contrast, anthropogenic and geogenic CO2 sources are dominated by distributed area and point sources, which may constitute as much as 70% of anthropogenic (e.g., Duren & Miller, 2012), and over 80% of geogenic emissions (Burton et al., 2013). Comprehensive assessments of CO2 budgets necessitate robust and highly accurate satellite remote sensing strategies that address the competing and often conflicting requirements for sampling over disparate space and time scales. Spatial variability: The spatial distribution of anthropogenic sources is dominated by patterns of production, storage, transport and use. In contrast, geogenic variability is almost entirely controlled by endogenic geological processes, except where surface gas permeability is modulated by soil moisture. Satellite remote sensing solutions will thus have to vary greatly in spatial coverage and resolution to address distributed area sources and point sources alike. Temporal variability: While biogenic sources are dominated by diurnal and seasonal patterns, anthropogenic sources fluctuate over a greater variety of time scales from diurnal, weekly and seasonal cycles, driven by both economic and climatic factors. Geogenic sources typically vary in time scales of days to months (geogenic sources sensu stricto are not fossil fuels but volcanoes, hydrothermal and metamorphic sources). Current ground-based monitoring networks for anthropogenic and geogenic sources record data on minute- to weekly temporal scales. Satellite remote sensing solutions would have to capture temporal variability through revisit frequency or point-and-stare strategies. Space-based remote sensing offers the potential of global coverage by a single sensor. However, no single combination of orbit …

  18. A general parallelization strategy for random path based geostatistical simulation methods

    Science.gov (United States)

    Mariethoz, Grégoire

    2010-07-01

    The size of simulation grids used for numerical models has increased by many orders of magnitude in the past years, and this trend is likely to continue. Efficient pixel-based geostatistical simulation algorithms have been developed, but for very large grids and complex spatial models, the computational burden remains heavy. As cluster computers become widely available, using parallel strategies is a natural step for increasing the usable grid size and the complexity of the models. These strategies must profit from the possibilities offered by machines with a large number of processors. On such machines, the bottleneck is often the communication time between processors. We present a strategy distributing grid nodes among all available processors while minimizing communication and latency times. It consists of centralizing the simulation on a master processor that calls the other slave processors as if they were functions, each simulating one node at a time. The key is to decouple the sending and the receiving operations to avoid synchronization. Centralization allows having a conflict management system ensuring that nodes being simulated simultaneously do not interfere in terms of neighborhood. The strategy is computationally efficient and is versatile enough to be applicable to all random path based simulation methods.
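A schematic version of the master/worker scheme with the neighborhood conflict rule is sketched below. It uses Python threads instead of a cluster, and the per-node "simulation" is a placeholder random draw rather than a real kriging kernel; grid size, neighborhood radius and worker count are arbitrary.

```python
import random
from concurrent.futures import ThreadPoolExecutor

GRID, RADIUS = 20, 2
rng = random.Random(0)

def simulate_node(node):
    return node, rng.gauss(0.0, 1.0)   # stand-in for a conditional simulation draw

path = [(x, y) for x in range(GRID) for y in range(GRID)]
rng.shuffle(path)                      # random simulation path

values, in_flight, pending = {}, set(), list(path)

def conflicts(node):
    """Conflict rule: no two nodes within RADIUS may be simulated at once."""
    return any(abs(node[0] - b[0]) <= RADIUS and abs(node[1] - b[1]) <= RADIUS
               for b in in_flight)

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = []
    while pending or futures:
        # Master dispatches every pending node whose neighborhood is free.
        for n in [n for n in pending if not conflicts(n)]:
            pending.remove(n)
            in_flight.add(n)
            futures.append(pool.submit(simulate_node, n))
        # Collect one finished result before dispatching again (decoupled I/O).
        node, val = futures.pop(0).result()
        in_flight.discard(node)
        values[node] = val

print(len(values))   # every grid node simulated exactly once
```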

  19. Effectiveness of Gross Model-Based Emotion Regulation Strategies Training on Anger Reduction in Drug-Dependent Individuals and its Sustainability in Follow-up.

    Science.gov (United States)

    Massah, Omid; Sohrabi, Faramarz; A'azami, Yousef; Doostian, Younes; Farhoudian, Ali; Daneshmand, Reza

    2016-03-01

    Emotion plays an important role in adapting to life changes and stressful events. Difficulty regulating emotions is one of the problems drug abusers often face, and teaching these individuals to express and manage their emotions can be effective in improving their difficult circumstances. The present study aimed to determine the effectiveness of Gross model-based emotion regulation strategies training on anger reduction in drug-dependent individuals. The study had a quasi-experimental design wherein pretest-posttest evaluations were applied using a control group. The population under study included addicts attending Marivan's methadone maintenance therapy centers in 2012 - 2013. Convenience sampling was used to select 30 substance-dependent individuals undergoing maintenance treatment, who were then randomly assigned to the experiment and control groups. The experiment group received its training in eight two-hour sessions. Data were analyzed using analysis of covariance and paired t-test. There was a significant reduction in anger symptoms of drug-dependent individuals after the Gross model-based emotion regulation training (ERT) (P < 0.05) relative to the control group, which did not receive the emotion regulation strategies training. Based on the results of this study, we may conclude that Gross model-based emotion regulation strategies training can be applied alongside other therapies to treat drug abusers undergoing rehabilitation.

  20. Model-based comparison of strategies for reduction of stormwater micropollutant emissions

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Sharma, Anitha Kumari; Mikkelsen, Peter Steen

    to improve the recipient quality by reducing the fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene) to natural waters. MP sources were identified by using GIS land usage data. When comparing the different control strategies, the integrated model showed the greater benefits … Strategies for reduction of micropollutant (MP) emissions from stormwater systems require the comparison of different scenarios including source control, end-of-pipe treatment, or their combination. Dynamic integrated models can be important tools for this comparison, as they can integrate … the limited data provided by monitoring campaigns and evaluate the performance of different strategies based on model simulation results. This study presents an example where an integrated dynamic model, in combination with stormwater quality measurements, was used to evaluate 6 different strategies …

  1. Target-induced formation of gold amalgamation on DNA-based sensing platform for electrochemical monitoring of mercury ion coupling with cycling signal amplification strategy

    International Nuclear Information System (INIS)

    Chen, Jinfeng; Tang, Juan; Zhou, Jun; Zhang, Lan; Chen, Guonan; Tang, Dianping

    2014-01-01

    Graphical abstract: -- Highlights: • We report a new electrochemical sensing protocol for the detection of mercury ion. • Gold amalgamation on a DNA-based sensing platform was used as nanocatalyst. • The signal was amplified by a cycling signal amplification strategy. -- Abstract: Heavy metal ion pollution poses severe risks to human health and the environment because of the likelihood of bioaccumulation and toxicity. Driven by the requirement to monitor trace-level mercury ion (Hg²⁺), herein we construct a new DNA-based sensor for sensitive electrochemical monitoring of Hg²⁺ by coupling target-induced formation of gold amalgamation on a DNA-based sensing platform with gold amalgamation-catalyzed cycling signal amplification. The sensor was simply prepared by covalent conjugation of aminated poly-T(25) oligonucleotide onto the glassy carbon electrode by typical carbodiimide coupling. Upon introduction of the target analyte, Hg²⁺ was intercalated into the DNA polyion complex membrane based on T–Hg²⁺–T coordination chemistry. The chelated Hg²⁺ could induce the formation of gold amalgamation, which could catalyze the reduction of p-nitrophenol with the aid of NaBH₄ and Ru(NH₃)₆³⁺ for cycling signal amplification. Experimental results indicated that the electronic signal of our system increased with increasing Hg²⁺ level in the sample, with a detection limit of 0.02 nM and a dynamic range of up to 1000 nM Hg²⁺. The strategy afforded exquisite selectivity for Hg²⁺ against other environmentally related metal ions. In addition, the methodology was evaluated for the analysis of Hg²⁺ in spiked tap-water samples, and the recovery was 87.9–113.8%.

  2. Developing Action Plans Based on Strategy

    DEFF Research Database (Denmark)

    Carstensen, Peter H.; Vinter, Otto

    2018-01-01

    The authors have performed a thorough study of the change strategy literature that is the foundation for the 10 overall change strategies defined in ISO/IEC 33014. The authors then identified eight aspects that should be considered when developing the concrete actions for executing the strategy. …

  3. The Effect of Value, Size and Momentum Strategies on Excess Returns in Indonesia

    Directory of Open Access Journals (Sweden)

    Gleny Gleny

    2014-11-01

    Full Text Available The purpose of this research is to investigate the impact of the strategies used by investors in Indonesia, such as the value, size and momentum strategies. The sample is monthly data on 100 non-financial individual stocks which fulfil the requirements, from July 2006 to December 2010, with a 12-month holding period. This research also uses the ARCH method to test for heteroscedasticity and the VIF method to test for multicollinearity. The outcome from this research is that the value strategy based on the book-to-market ratio, the size strategy based on market capitalization, and the momentum strategy based on the past six months' price are not significant in Indonesia. This may be due to the depreciation of the Indonesian currency and the crisis years. In addition, Indonesia is one of the emerging markets in Asia, so some information is difficult to obtain, which makes the market imperfect.

  4. A basic condition-based maintenance strategy for air-cooled turbine generators

    International Nuclear Information System (INIS)

    Laird, T.; Griffith, G.; Hoof, M.

    2005-01-01

    This paper discusses the methods of using condition-based maintenance (CBM) for turbine generators. Even though it is focused on the maintenance strategy for air-cooled generators, all types of power producers can realize benefits from a better maintenance strategy at lower costs. A reliable assessment of the actual unit condition requires detailed knowledge of the unit design, operational weaknesses, cost of maintenance and operational capabilities. (author)

  5. Sensor-Based Model Driven Control Strategy for Precision Irrigation

    Directory of Open Access Journals (Sweden)

    Camilo Lozoya

    2016-01-01

    Full Text Available Improving the efficiency of agricultural irrigation systems contributes substantially to sustainable water management. This improvement can be achieved through an automated irrigation system that includes a real-time control strategy based on the water, soil, and crop relationship. This paper presents a model-driven control strategy applied to an irrigation system in order to make efficient use of water for large crop fields, that is, applying the correct amount of water in the correct place at the right moment. The proposed model uses a predictive algorithm that senses soil moisture and weather variables to determine the optimal amount of water required by the crop. This approach is evaluated against a traditional irrigation system based on empirically defined time periods and against a basic soil moisture control system. Results indicate that the use of model predictive control in an irrigation system achieves higher efficiency and significantly reduces water consumption.
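
    The water-balance logic described above can be sketched as a one-step predictive irrigation decision. This is a minimal illustration under assumed names and an assumed simple soil water balance; the paper's actual crop-soil model is more detailed:

```python
# Hypothetical sketch of a model-predictive irrigation decision (names and the
# one-step water balance are illustrative assumptions, not the paper's model).

def predict_moisture(moisture_mm, irrigation_mm, rain_mm, et_mm):
    """One-step soil water balance: stored water plus inputs minus
    evapotranspiration losses (all quantities in mm of water)."""
    return moisture_mm + irrigation_mm + rain_mm - et_mm

def irrigation_needed(moisture_mm, target_mm, rain_forecast_mm, et_forecast_mm):
    """Smallest irrigation dose that keeps the predicted moisture at or
    above the target level (never negative)."""
    deficit = target_mm - predict_moisture(moisture_mm, 0.0,
                                           rain_forecast_mm, et_forecast_mm)
    return max(0.0, deficit)

# Example: 18 mm stored, 25 mm target, 2 mm rain and 5 mm ET forecast
dose = irrigation_needed(18.0, 25.0, 2.0, 5.0)  # 25 - (18 + 2 - 5) = 10 mm
```

    Applying only the predicted deficit, rather than a fixed schedule, is what yields the water savings reported above.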

  6. Base-of-the-pyramid global strategy

    Directory of Open Access Journals (Sweden)

    Boşcor, D.

    2010-12-01

    Full Text Available Global strategies for MNCs should focus on customers in emerging and developing markets instead of customers in developed economies. The “base-of-the-pyramid segment” comprises 4 billion people in the world. In order to be successful, companies will be required to form unconventional partnerships, with entities ranging from local governments to non-profit organizations, to gain the community’s trust and understand the environmental, infrastructure and political issues that may affect business. Being able to provide affordable, high-quality products and services in this market segment often means new approaches to marketing: new packaging and pricing structures, and the use of unfamiliar distribution structures.

  7. Do Bodybuilders Use Evidence-Based Nutrition Strategies to Manipulate Physique?

    Directory of Open Access Journals (Sweden)

    Lachlan Mitchell

    2017-09-01

    Full Text Available Competitive bodybuilders undergo strict dietary and training practices to achieve an extremely lean and muscular physique. The purpose of this study was to identify and describe different dietary strategies used by bodybuilders, their rationale, and the sources of information from which these strategies are gathered. In-depth interviews were conducted with seven experienced (10.4 ± 3.4 years of bodybuilding experience) male, natural bodybuilders. Participants were asked about training, dietary and supplement practices, and information resources for bodybuilding strategies. Interviews were transcribed verbatim and analyzed using qualitative content analysis. During the off-season, energy intake was higher and less restricted than during the in-season to aid in muscle hypertrophy. There was a focus on high protein intake with adequate carbohydrate to permit high training loads. To create an energy deficit and loss of fat mass, energy intake was gradually and progressively reduced during the in-season via a reduction in carbohydrate and fat intake. The rationale for weekly higher carbohydrate refeed days was to offset declines in metabolic rate and fatigue, while in the final “peak week” before competition, the reasoning for fluid and sodium manipulation and carbohydrate loading was to enhance the appearance of leanness and vascularity. Other bodybuilders, coaches and the internet were significant sources of information. Despite the common perception of extreme, non-evidence-based regimens, these bodybuilders reported predominantly using strategies which are recognized as evidence-based, developed over many years of experience. Additionally, novel strategies such as weekly refeed days to enhance fat loss, and sodium and fluid manipulation, warrant further investigation to evaluate their efficacy and safety.

  8. External calibration strategy for trace element quantification in botanical samples by LA-ICP-MS using filter paper

    International Nuclear Information System (INIS)

    Nunes, Matheus A.G.; Voss, Mônica; Corazza, Gabriela; Flores, Erico M.M.; Dressler, Valderi L.

    2016-01-01

    The use of reference solutions dispersed on filter paper discs is proposed for the first time as an external calibration strategy for matrix matching and determination of As, Cd, Co, Cr, Cu, Mn, Ni, Pb, Sr, V and Zn in plants by laser ablation-inductively coupled plasma mass spectrometry (LA-ICP-MS). The procedure is based on the use of filter paper discs as a support for aqueous reference solutions, which are then evaporated, resulting in solid standards with concentrations of up to 250 μg g −1 of each element. Filter paper is proposed for matrix-matched calibration standards because of the similarity of this material to botanical samples with regard to carbon concentration and its distribution throughout both matrices. These characteristics allowed the use of 13 C as an internal standard (IS) during analysis by LA-ICP-MS. Parameters such as analyte signal normalization with 13 C, carrier gas flow rate, laser energy, spot size and calibration range were monitored. The calibration procedure using solution deposition on filter paper discs improved precision when 13 C was used as the IS. Method precision was calculated from the analysis of a certified reference material (CRM) of botanical matrix, considering the RSD obtained for 5 line scans, and was lower than 20%. The accuracy of the LA-ICP-MS determinations was evaluated by analysis of four CRM pellets of botanical composition, as well as by comparison with results obtained by ICP-MS using solution nebulization after microwave-assisted digestion. Plant samples of unknown elemental composition were analyzed by the proposed LA method, and good agreement was obtained with the results of solution analysis. Limits of detection (LOD) for LA-ICP-MS were established by the ablation of 10 lines on a filter paper disc containing 40 μL of 5% HNO 3 (v v −1 ) as the calibration blank. Values ranged from 0.05 to 0.81 μg g −1 . Overall, the use of filter paper as support for dried
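
    The detection-limit figures quoted above (0.05 to 0.81 μg g −1) follow from a blank-based definition; the sketch below uses the common 3σ criterion with invented numbers, since the record does not give the raw blank signals or sensitivities:

```python
import statistics

# Common 3-sigma detection-limit estimate: three times the standard deviation
# of repeated blank measurements divided by the calibration sensitivity.
# The blank signals and sensitivity below are invented for illustration.

def limit_of_detection(blank_signals, sensitivity):
    """LOD = 3 * s_blank / slope of the calibration curve."""
    return 3.0 * statistics.stdev(blank_signals) / sensitivity

lod = limit_of_detection([10.0, 12.0, 14.0], 30.0)  # 3 * 2.0 / 30 = 0.2
```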

  9. External calibration strategy for trace element quantification in botanical samples by LA-ICP-MS using filter paper

    Energy Technology Data Exchange (ETDEWEB)

    Nunes, Matheus A.G.; Voss, Mônica; Corazza, Gabriela; Flores, Erico M.M.; Dressler, Valderi L., E-mail: vdressler@gmail.com

    2016-01-28

    The use of reference solutions dispersed on filter paper discs is proposed for the first time as an external calibration strategy for matrix matching and determination of As, Cd, Co, Cr, Cu, Mn, Ni, Pb, Sr, V and Zn in plants by laser ablation-inductively coupled plasma mass spectrometry (LA-ICP-MS). The procedure is based on the use of filter paper discs as a support for aqueous reference solutions, which are then evaporated, resulting in solid standards with concentrations of up to 250 μg g −1 of each element. Filter paper is proposed for matrix-matched calibration standards because of the similarity of this material to botanical samples with regard to carbon concentration and its distribution throughout both matrices. These characteristics allowed the use of 13 C as an internal standard (IS) during analysis by LA-ICP-MS. Parameters such as analyte signal normalization with 13 C, carrier gas flow rate, laser energy, spot size and calibration range were monitored. The calibration procedure using solution deposition on filter paper discs improved precision when 13 C was used as the IS. Method precision was calculated from the analysis of a certified reference material (CRM) of botanical matrix, considering the RSD obtained for 5 line scans, and was lower than 20%. The accuracy of the LA-ICP-MS determinations was evaluated by analysis of four CRM pellets of botanical composition, as well as by comparison with results obtained by ICP-MS using solution nebulization after microwave-assisted digestion. Plant samples of unknown elemental composition were analyzed by the proposed LA method, and good agreement was obtained with the results of solution analysis. Limits of detection (LOD) for LA-ICP-MS were established by the ablation of 10 lines on a filter paper disc containing 40 μL of 5% HNO 3 (v v −1 ) as the calibration blank. Values ranged from 0.05 to 0.81 μg g −1 . Overall, the use of filter

  10. Inferring the demographic history from DNA sequences: An importance sampling approach based on non-homogeneous processes.

    Science.gov (United States)

    Ait Kaci Azzou, S; Larribe, F; Froda, S

    2016-10-01

    In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of non-homogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history. Copyright © 2016. Published by Elsevier Inc.
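
    The waiting-time generation step mentioned above can be sketched for a piecewise constant intensity: invert the cumulative hazard of the non-homogeneous Poisson process against an Exp(1) draw. The function below is an illustrative reconstruction, not the authors' implementation:

```python
import math
import random

# Sketch: first event time of a non-homogeneous Poisson process whose rate is
# piecewise constant, as when coalescent waiting times are drawn under a
# stepwise population-size function. Names are illustrative.

def nhpp_waiting_time(rates, breakpoints, rng=random):
    """rates[i] applies on [breakpoints[i], breakpoints[i+1]); the last rate
    extends to infinity. Inverts the cumulative hazard against an Exp(1) draw."""
    target = -math.log(1.0 - rng.random())  # Exp(1) hazard level to reach
    t = breakpoints[0]
    for i, rate in enumerate(rates):
        end = breakpoints[i + 1] if i + 1 < len(breakpoints) else math.inf
        hazard = rate * (end - t)
        if hazard >= target:  # event falls inside this piece
            return t + target / rate
        target -= hazard
        t = end
    return t
```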

  11. Research and Engagement Strategies for Young Adult Immigrants Without Documentation: Lessons Learned Through Community Partnership.

    Science.gov (United States)

    Raymond-Flesch, Marissa; Siemons, Rachel; Brindis, Claire D

    2016-01-01

    Limited research has focused on undocumented immigrants' health and access to care. This paper describes participant engagement strategies used to investigate the health needs of immigrants eligible for Deferred Action for Childhood Arrivals (DACA). Community-based strategies engaged advocates and undocumented Californians in study design and recruitment. Outreach in diverse settings, social media, and participant-driven sampling recruited 61 DACA-eligible focus group participants. Social media, community-based organizations (CBOs), family members, advocacy groups, and participant-driven sampling were the most successful recruitment strategies. Participants felt engaging in research was instrumental for sharing their concerns with health care providers and policymakers, noteworthy in light of their previously identified fears and mistrust of government officials. Using multiple culturally responsive strategies including participant-driven sampling, engagement with CBOs, and use of social media, those eligible for DACA eagerly engage as research participants. Educating researchers and institutional review boards (IRBs) about legal and safety concerns can improve research engagement.

  12. Data splitting for artificial neural networks using SOM-based stratified sampling.

    Science.gov (United States)

    May, R J; Maier, H R; Dandy, G C

    2010-03-01

    Data splitting is an important consideration during artificial neural network (ANN) development where hold-out cross-validation is commonly employed to ensure generalization. Even for a moderate sample size, the sampling methodology used for data splitting can have a significant effect on the quality of the subsets used for training, testing and validating an ANN. Poor data splitting can result in inaccurate and highly variable model performance; however, the choice of sampling methodology is rarely given due consideration by ANN modellers. Increased confidence in the sampling is of paramount importance, since the hold-out sampling is generally performed only once during ANN development. This paper considers the variability in the quality of subsets that are obtained using different data splitting approaches. A novel approach to stratified sampling, based on Neyman sampling of the self-organizing map (SOM), is developed, with several guidelines identified for setting the SOM size and sample allocation in order to minimize the bias and variance in the datasets. Using an example ANN function approximation task, the SOM-based approach is evaluated in comparison to random sampling, DUPLEX, systematic stratified sampling, and trial-and-error sampling to minimize the statistical differences between data sets. Of these approaches, DUPLEX is found to provide benchmark performance with good model performance, with no variability. The results show that the SOM-based approach also reliably generates high-quality samples and can therefore be used with greater confidence than other approaches, especially in the case of non-uniform datasets, with the benefit of scalability to perform data splitting on large datasets. Copyright 2009 Elsevier Ltd. All rights reserved.
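
    The Neyman allocation rule underlying the SOM-based splitting (sample each stratum in proportion to its size times its standard deviation) can be sketched as follows; the largest-remainder rounding is an illustrative choice, not necessarily the authors':

```python
# Neyman allocation: n_h proportional to N_h * s_h across strata (here, SOM
# units), rounded so the counts sum exactly to the requested total.

def neyman_allocation(sizes, stds, total_samples):
    weights = [n * s for n, s in zip(sizes, stds)]
    scale = sum(weights)
    raw = [total_samples * w / scale for w in weights]
    counts = [int(r) for r in raw]
    # hand the leftover samples to the largest fractional remainders
    order = sorted(range(len(raw)), key=lambda i: raw[i] - counts[i], reverse=True)
    for i in order[: total_samples - sum(counts)]:
        counts[i] += 1
    return counts

# A large but homogeneous stratum can receive fewer samples than an equally
# sized, highly variable one:
alloc = neyman_allocation([100, 100], [1.0, 3.0], 8)  # -> [2, 6]
```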

  13. Effectiveness comparison of partially executed t-way test suites generated by existing strategies

    Science.gov (United States)

    Othman, Rozmie R.; Ahmad, Mohd Zamri Zahir; Ali, Mohd Shaiful Aziz Rashid; Zakaria, Hasneeza Liza; Rahman, Md. Mostafijur

    2015-05-01

    Consuming 40 to 50 percent of software development cost, software testing is one of the most resource-consuming activities in the software development lifecycle. To ensure an acceptable level of quality and reliability of a typical software product, it is desirable to test every possible combination of input data under various configurations. Due to the combinatorial explosion problem, exhaustive testing is practically impossible. Resource constraints, costing factors as well as strict time-to-market deadlines are among the main factors that inhibit such consideration. Earlier work suggests that a sampling strategy (i.e. one based on t-way parameter interaction, called t-way testing) can be effective in reducing the number of test cases without affecting the fault detection capability. However, for a very large system, even a t-way strategy will produce a large test suite that needs to be executed. In the end, only part of the planned test suite can be executed in order to meet the aforementioned constraints. Here, test engineers need to measure the effectiveness of a partially executed test suite in order to assess the risk they have to take. Motivated by this problem, this paper presents an effectiveness comparison of partially executed t-way test suites generated by existing strategies using the tuples coverage method. With it, test engineers can predict the effectiveness of the testing process if only part of the original test cases are executed.
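
    The tuples coverage measure used here can be sketched directly: count how many of all possible t-way value combinations the executed portion of the suite still covers. The code below shows the idea for arbitrary t with small illustrative domains:

```python
from itertools import combinations, product

# Tuples coverage: fraction of all t-way parameter-value combinations covered
# by the executed subset of a test suite. Domains and tests are illustrative.

def tuple_coverage(domains, executed_tests, t=2):
    """domains: one list of values per parameter; executed_tests: tuples of
    one value per parameter that were actually run."""
    param_sets = list(combinations(range(len(domains)), t))
    total = sum(
        len(list(product(*(domains[p] for p in ps)))) for ps in param_sets
    )
    covered = set()
    for test in executed_tests:
        for ps in param_sets:
            covered.add((ps, tuple(test[p] for p in ps)))
    return len(covered) / total

# Executing only 2 tests of a full pairwise suite covers half the pairs here:
partial = tuple_coverage([[0, 1]] * 3, [(0, 0, 0), (1, 1, 1)])  # -> 0.5
```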

  14. Novel therapeutic strategies against AIDS progression based on the ...

    African Journals Online (AJOL)

    Novel therapeutic strategies against AIDS progression based on the pathogenic effects of HIV-1 and Vpr proteins. Ahmed A Azad. Abstract. No Abstract. Discovery and Innovation Vol. 17, 2005: 52-60.

  15. How do small rural food-processing firms compete? A resource-based approach to competitive strategies

    Directory of Open Access Journals (Sweden)

    S. FORSMAN

    2008-12-01

    Full Text Available The study was concerned with the competitive strategies of small food-processing firms in rural Finland and their ability to achieve and maintain a competitively advantaged position relative to larger food companies in the dynamic and mature food market. Competitive strategies were approached from the resource-based view (RBV), which emphasises internal firm factors as sources of competitive advantage and long-term success. As a strategic choice, differentiation was specifically considered. The main objective was to explain the relationships between resources, competitive advantage and firm success. To understand the ambiguous nature of resources in the small-scale food production context, the study introduced a distinction between strategic resources and basic resources and the strategic relationship between them. The empirical part of the study was based on quantitative analyses of survey data collected from 238 small (fewer than 20 persons) food-processing firms in rural Finland. The sample firms represented different branches of the food industry and 39% of them operated in connection with a farm. The linkage between resources, competitive advantage and firm success was investigated by means of cluster analysis, mean comparisons and LISREL modelling. The results demonstrated that there are some typical features relating to small-scale food production in Finland. The results also revealed that small-scale, rural food-processing firms do not constitute a homogeneous group of their own, but that different strategies among small firms can be identified as well. The analyses proved that a linkage between resources, competitive advantage and firm success can be identified, which is consistent with resource-based logic. However, according to the findings, following a particular strategy does not automatically ensure that a firm will achieve success. The analysis also showed that strategic resources and basic resources are strongly interlinked

  16. Coordinated control strategy for hybrid wind farms with DFIG-based and PMSG-based wind farms during network unbalance

    DEFF Research Database (Denmark)

    Yao, Jun; Liu, Ruikuo; Zhou, Te

    2017-01-01

    This paper investigates the coordinated control strategy for a hybrid wind farm with doubly fed induction generator (DFIG)-based and direct-driven permanent-magnet synchronous generator (PMSG)-based wind farms during network unbalance. The negative-sequence current output capabilities of DFIG...... to the controllable operating regions, a targets selection scheme for each control unit is proposed to improve the stability of the hybrid wind farms containing both DFIG-based and PMSG-based wind farms during network unbalance, especially to avoid DFIG-based wind farm tripping from connected power grid under severe...... grid voltage unbalance conditions. Finally, the proposed coordinated control strategy is validated by the simulation results of a 30-MW-DFIG-based wind farm and a 30-MW-PMSG-based wind farm under different operation conditions and experimental results on a laboratory-scale experimental rig under severe...

  17. Examining Generic Competitive Strategy Types in U.S. and European Markets

    OpenAIRE

    Susan P Douglas; Dong Kee Rhee

    1989-01-01

    Identification of generic competitive strategy types has recently attracted considerable attention. Most of this research has, however, focused on competitive strategy of U.S. businesses in their domestic market. The present study extends these findings to markets outside the United States, and more specifically Europe, based on a sample of industrial businesses drawn from the PIMS database. Similar dimensions underlying competitive strategy, and similar generic types are found among business...

  18. Direct and long-term detection of gene doping in conventional blood samples.

    Science.gov (United States)

    Beiter, T; Zimmermann, M; Fragasso, A; Hudemann, J; Niess, A M; Bitzer, M; Lauer, U M; Simon, P

    2011-03-01

    The misuse of somatic gene therapy for the purpose of enhancing athletic performance is perceived as a coming threat to the world of sports and categorized as 'gene doping'. This article describes a direct detection approach for gene doping that gives a clear yes-or-no answer based on the presence or absence of transgenic DNA in peripheral blood samples. By exploiting a priming strategy to specifically amplify intronless DNA sequences, we developed PCR protocols allowing the detection of very small amounts of transgenic DNA in genomic DNA samples to screen for six prime candidate genes. Our detection strategy was verified in a mouse model, giving positive signals from minute amounts (20 μl) of blood samples for up to 56 days following intramuscular adeno-associated virus-mediated gene transfer, one of the most likely candidate vector systems to be misused for gene doping. To make our detection strategy amenable for routine testing, we implemented a robust sample preparation and processing protocol that allows cost-efficient analysis of small human blood volumes (200 μl) with high specificity and reproducibility. The practicability and reliability of our detection strategy was validated by a screening approach including 327 blood samples taken from professional and recreational athletes under field conditions.

  19. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

    Full Text Available Rapid progress in intelligent sensing technology creates new interest in the analysis and design of non-conventional sampling schemes. The investigation of event-based sampling according to the integral criterion is presented in this paper. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined. The related works in adaptive sampling are summarized. The analytical closed-form formulas for the evaluation of the mean rate of event-based traffic, and the asymptotic integral sampling effectiveness, are derived. The simulation results verifying the analytical formulas are reported. The effectiveness of the integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness for common signals, which model the state evolution of dynamic systems in time, is exemplified.
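
    The pure send-on-delta scheme that this work extends is simple to state in code: report a sample only when the signal has moved by at least delta since the last report. The sketch below is illustrative; the paper analyzes the integral variant of this rule:

```python
# Pure send-on-delta (level-crossing) sampling: emit an event only when the
# signal deviates from the last reported value by at least `delta`.

def send_on_delta(samples, delta):
    """Return the indices of reported samples (the first is always reported)."""
    if not samples:
        return []
    reported = [0]
    last = samples[0]
    for i, x in enumerate(samples[1:], start=1):
        if abs(x - last) >= delta:
            reported.append(i)
            last = x
    return reported

events = send_on_delta([0.0, 0.2, 0.6, 0.7, 1.3], 0.5)  # -> [0, 2, 4]
```

    A slowly varying signal therefore generates few events, which is the source of the traffic reduction the mean-rate formulas quantify.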

  20. Frequency Control Strategy for Black Starts via PMSG-Based Wind Power Generation

    Directory of Open Access Journals (Sweden)

    Yi Tang

    2017-03-01

    Full Text Available The use of wind power generation (WPG as a source for black starts will significantly enhance the resiliency of power systems and shorten their recovery time from blackouts. Given that frequency stability is the most serious issue during the initial recovery period, virtual inertia control can enable wind turbines to provide frequency support to an external system. In this study, a general procedure of WPG participating in black starts is presented, and the key issues are discussed. The adaptability of existing virtual inertia control strategies is analyzed, and improvement work is performed. A new coordinated frequency control strategy is proposed based on the presented improvement work. A local power network with a permanent-magnet synchronous generator (PMSG-based wind farm is modeled and used to verify the effectiveness of the strategy.
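
    The virtual inertia idea referred to above can be sketched as a supplementary power term proportional to the rate of change of frequency, plus a droop term on the deviation. The gains and nominal frequency below are illustrative assumptions, not the paper's tuned values:

```python
# Virtual inertia sketch: extra active-power command (per unit) from df/dt
# and frequency-deviation terms, emulating synchronous-machine inertia.
# Gains and the 50 Hz nominal frequency are illustrative assumptions.

def virtual_inertia_power(freq, freq_prev, dt, f_nominal=50.0,
                          k_inertia=10.0, k_droop=20.0):
    dfdt = (freq - freq_prev) / dt
    return -k_inertia * dfdt - k_droop * (freq - f_nominal)

# Frequency sagging from 50.0 to 49.9 Hz over 0.1 s calls for extra power:
boost = virtual_inertia_power(49.9, 50.0, 0.1)  # about 12.0 here
```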

  1. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya; Amato, Nancy M.

    2012-01-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented

  2. Organizational Strategy and Business Environment Effects Based on a Computation Method

    Science.gov (United States)

    Reklitis, Panagiotis; Konstantopoulos, Nikolaos; Trivellas, Panagiotis

    2007-12-01

    According to many researchers of organizational theory, a great number of the problems encountered by manufacturing firms are due to their ineffectiveness in responding to significant changes in their external environment and aligning their competitive strategy accordingly. From this point of view, the pursuit of the appropriate generic strategy is vital for firms facing a dynamic and highly competitive environment. In the present paper, we adopt Porter's typology to operationalise organizational strategy (cost leadership, innovative and marketing differentiation, and focus) considering changes in the external business environment (dynamism, complexity and munificence). Although the simulation of social events is quite a difficult task, since there are so many considerations (not all well understood) involved, in the present study we developed a dynamic system based on the conceptual framework of strategy-environment associations.

  3. QoE-based transmission strategies for multi-user wireless information and power transfer

    Directory of Open Access Journals (Sweden)

    Taehun Jung

    2015-12-01

    Full Text Available One solution to the problem of supplying energy to wireless networks is wireless power transfer. One such technology, wireless power transfer enabled by electromagnetic radiation, will change traditional wireless networks. In this paper, we investigate transmission strategies for multi-user wireless information and power transfer. We consider a multi-user multiple-input multiple-output (MIMO) channel that includes one base station (BS) and two user terminals (UT), consisting of one energy harvesting (EH) receiver and one information decoding (ID) receiver. Our system provides transmission strategies that can be executed and implemented in practical scenarios. The paper then analyzes the rate–energy (R–E) pairs of our strategies and compares them to those of the theoretically optimal strategy. Furthermore, we propose a QoE-based mode selection algorithm by mapping the R–E pair to utility functions.
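
    The mode-selection step described above reduces to picking the strategy whose rate-energy pair maximizes a utility function; the utility used below is a placeholder, since the record does not specify the paper's mapping:

```python
# QoE-based mode selection sketch: each strategy yields a rate-energy (R-E)
# pair; choose the one with the highest utility. The utility is illustrative.

def select_mode(re_pairs, utility):
    """Return the (rate, energy) pair maximizing utility(rate, energy)."""
    return max(re_pairs, key=lambda re: utility(re[0], re[1]))

modes = [(1.0, 0.2), (0.5, 0.9), (0.8, 0.5)]
best = select_mode(modes, lambda r, e: r + 0.5 * e)  # -> (1.0, 0.2)
```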

  4. Marine anthropogenic radiotracers in the Southern Hemisphere: New sampling and analytical strategies

    Science.gov (United States)

    Levy, I.; Povinec, P. P.; Aoyama, M.; Hirose, K.; Sanchez-Cabeza, J. A.; Comanducci, J.-F.; Gastaud, J.; Eriksson, M.; Hamajima, Y.; Kim, C. S.; Komura, K.; Osvath, I.; Roos, P.; Yim, S. A.

    2011-04-01

    The Japan Agency for Marine Earth Science and Technology conducted in 2003-2004 the Blue Earth Global Expedition (BEAGLE2003) around the Southern Hemisphere Oceans, which was a rare opportunity to collect many seawater samples for anthropogenic radionuclide studies. We describe here sampling and analytical methodologies based on radiochemical separations of Cs and Pu from seawater, as well as radiometric and mass spectrometry measurements. Several laboratories took part in radionuclide analyses using different techniques. The intercomparison exercises and analyses of certified reference materials showed a reasonable agreement between the participating laboratories. The obtained data on the distribution of 137Cs and plutonium isotopes in seawater represent the most comprehensive results available for the Southern Hemisphere Oceans.

  5. The Viking X ray fluorescence experiment - Sampling strategies and laboratory simulations. [Mars soil sampling

    Science.gov (United States)

    Baird, A. K.; Castro, A. J.; Clark, B. C.; Toulmin, P., III; Rose, H., Jr.; Keil, K.; Gooding, J. L.

    1977-01-01

    Ten samples of Mars regolith material (six on Viking Lander 1 and four on Viking Lander 2) have been delivered to the X ray fluorescence spectrometers as of March 31, 1977. An additional six samples at least are planned for acquisition in the remaining Extended Mission (to January 1979) for each lander. All samples acquired are Martian fines from the near surface (less than 6-cm depth) of the landing sites except the latest on Viking Lander 1, which is fine material from the bottom of a trench dug to a depth of 25 cm. Several attempts on each lander to acquire fresh rock material (in pebble sizes) for analysis have yielded only cemented surface crustal material (duricrust). Laboratory simulation and experimentation are required both for mission planning of sampling and for interpretation of data returned from Mars. This paper is concerned with the rationale for sample site selections, surface sampler operations, and the supportive laboratory studies needed to interpret X ray results from Mars.

  6. A comparative research of different ensemble surrogate models based on set pair analysis for the DNAPL-contaminated aquifer remediation strategy optimization

    Science.gov (United States)

    Hou, Zeyu; Lu, Wenxi; Xue, Haibo; Lin, Jin

    2017-08-01

    Surrogate-based simulation-optimization is an effective technique for optimizing the surfactant-enhanced aquifer remediation (SEAR) strategy for clearing DNAPLs. The performance of the surrogate model, which is used to replace the simulation model in order to reduce the computational burden, is the key to such research. However, previous studies are generally based on a stand-alone surrogate model, and rarely attempt to improve the approximation accuracy of the surrogate model sufficiently by combining various methods. In this regard, we present set pair analysis (SPA) as a new method to build an ensemble surrogate (ES) model, and conducted a comparative study to select a better ES modeling pattern for SEAR strategy optimization problems. Surrogate models were developed using a radial basis function artificial neural network (RBFANN), support vector regression (SVR), and Kriging. One ES model assembles the RBFANN, SVR and Kriging models using set pair weights according to their performance, and the other assembles several Kriging models (Kriging being the best of the three surrogate modeling methods) built with different training sample datasets. Finally, an optimization model, in which the ES model was embedded, was established to obtain the optimal remediation strategy. The results showed that the residuals of the outputs between the best ES model and the simulation model for 100 testing samples were lower than 1.5%. Using an ES model instead of the simulation model was critical for considerably reducing the computation time of the simulation-optimization process while maintaining high computation accuracy.
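
    The first ES pattern described above (weighting individual surrogates by their performance) can be sketched with a simple inverse-error weighting; the paper derives its weights via set pair analysis, so the rule below is only an illustrative stand-in:

```python
# Ensemble surrogate sketch: combine several surrogates' predictions with
# weights tied to validation performance (inverse-error weights here stand in
# for the paper's set-pair-analysis weights).

def ensemble_weights(validation_errors):
    """Normalized weights, inversely proportional to each model's error."""
    inv = [1.0 / e for e in validation_errors]
    s = sum(inv)
    return [w / s for w in inv]

def ensemble_predict(predictions, weights):
    """Weighted average of the individual surrogate predictions."""
    return sum(p * w for p, w in zip(predictions, weights))

w = ensemble_weights([1.0, 1.0, 2.0])        # -> [0.4, 0.4, 0.2]
y = ensemble_predict([10.0, 20.0, 30.0], w)  # -> 18.0
```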

  7. Strategies to Enhance Online Learning Teams. Team Assessment and Diagnostics Instrument and Agent-based Modeling

    Science.gov (United States)

    2010-08-12

    Strategies to Enhance Online Learning Teams: Team Assessment and Diagnostics Instrument and Agent-based Modeling. Tristan E. Johnson, Ph.D. Report date: August 2010; dates covered: 00-00-2010 to 00-00-2010.

  8. Improved Control Strategy for DFIG-based Wind Energy Conversion System during Grid Voltage Disturbances

    DEFF Research Database (Denmark)

    Zhu, Rongwu

    electromagnetic torque during grid faults. Therefore, the virtual damping flux based strategy not only can help the DFIG achieve the LVRT requirement, but also can reduce the mechanical stress on the drive train. On the other hand, on the basis of the decaying characteristic of the stator flux, the passive...... flux based active damping strategy and the stator series resistance based passive damping strategy can help the DFIG to fulfill the LVRT requirement, and improve the DFIG performances. Besides the previous active and passive damping strategies, the modified power converter and DFIG configurations...... of the stator voltage can cause the transient stator flux, and then the transient stator flux may be enlarged due to the effects of the initial value. The amplitude of the transient flux is determined by both the instant and depth of stator voltage variation, and the decaying characteristic of the transient...

  9. Influence of prior knowledge of exercise duration on pacing strategies during game-based activities.

    Science.gov (United States)

    Gabbett, Tim J; Walker, Ben; Walker, Shane

    2015-04-01

    To investigate the influence of prior knowledge of exercise duration on the pacing strategies employed during game-based activities. Twelve semiprofessional team-sport athletes (mean ± SD age 22.8 ± 2.1 y) participated in this study. Players performed 3 small-sided games in random order. In one condition (Control), players were informed that they would play the small-sided game for 12 min and then completed the 12-min game. In a 2nd condition (Deception), players were told that they would play the small-sided game for 6 minutes, but after completing the 6-min game, they were asked to complete another 6 min. In a 3rd condition (Unknown), players were not told how long they would be required to play the small-sided game, but the activity was terminated after 12 min. Movement was recorded using a GPS unit sampling at 10 Hz. Post hoc inspection of video footage was undertaken to count the number of possessions and the number and quality of disposals. Higher initial intensities were observed in the Deception (130.6 ± 3.3 m/min) and Unknown (129.3 ± 2.4 m/min) conditions than the Control condition (123.3 ± 3.4 m/min). Greater amounts of high-speed running occurred during the initial phases of the Deception condition, and more low-speed activity occurred during the Unknown condition. A moderately greater number of total skill involvements occurred in the Unknown condition than the Control condition. These findings suggest that during game-based activities, players alter their pacing strategy based on the anticipated endpoint of the exercise bout.

  10. Public Communication Strategies for Public Engagement with RFT

    International Nuclear Information System (INIS)

    Kim, H. S.; Cho, S. K.; Kim, H. D.

    2006-12-01

    The objective of this study is to provide a discussion of how to improve public understanding of products made using radiation technology (RFT) by investigating the perception of nuclear power, radiation, and such products. The study conducted a telephone survey of a national sample of 800 adults. The sample was selected using probabilistic quota sampling based on sex, age, and region. Different communication strategies are required for different types of products using RFT, and the government and related institutes should proceed cautiously in promoting the radiation technology industry.
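Quota sampling of the kind described can be sketched as follows. The population frame, attribute values, and quota numbers below are invented for illustration; only the mechanism (randomly filling per-stratum quotas) reflects the survey design.

```python
import random
from collections import Counter

# Hypothetical population frame with sex/age/region attributes.
random.seed(1)
population = [
    {"id": i,
     "sex": random.choice(["M", "F"]),
     "age": random.choice(["20s", "30s", "40s", "50+"]),
     "region": random.choice(["north", "south"])}
    for i in range(10_000)
]

def quota_sample(frame, key, quotas):
    """Randomly fill per-stratum quotas (quota sampling on one attribute)."""
    shuffled = frame[:]
    random.shuffle(shuffled)
    filled = Counter()
    sample = []
    for person in shuffled:
        stratum = person[key]
        if filled[stratum] < quotas.get(stratum, 0):
            sample.append(person)
            filled[stratum] += 1
    return sample

quotas = {"M": 400, "F": 400}   # 800 adults, split evenly by sex here
sample = quota_sample(population, "sex", quotas)
```

A full design would nest quotas over sex, age, and region jointly; one attribute is shown to keep the sketch short.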

  11. Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

    Science.gov (United States)

    Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...
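One textbook quantity underlying such sampling models (not necessarily one of the four models compared in the record) is the probability of detecting at least one insect when counts per sample follow a Poisson distribution. A minimal sketch, with invented densities and sample sizes:

```python
import math

def detection_probability(density, n_samples, sample_size):
    """P(detect >= 1 insect) when counts per sample unit are
    Poisson(density * sample_size) and n_samples independent units are drawn.
    A simple stand-in for a full stored-grain sampling model."""
    mean_per_sample = density * sample_size
    p_empty_one = math.exp(-mean_per_sample)      # Poisson P(0) for one sample
    return 1.0 - p_empty_one ** n_samples

# Detection probability rises with sampling effort at a fixed pest density.
p10 = detection_probability(density=0.5, n_samples=10, sample_size=1.0)
p20 = detection_probability(density=0.5, n_samples=20, sample_size=1.0)
```

Species biology (aggregation, behavior) is exactly what breaks the Poisson assumption, which is why the record compares several statistical models.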

  12. Soft magnetic properties of bulk amorphous Co-based samples

    International Nuclear Information System (INIS)

    Fuezer, J.; Bednarcik, J.; Kollar, P.

    2006-01-01

    Ball milling of melt-spun ribbons and subsequent compaction of the resulting powders in the supercooled liquid region were used to prepare disc-shaped bulk amorphous Co-based samples. Several bulk samples were prepared by hot compaction with subsequent heat treatment (500 deg C - 575 deg C). The influence of the consolidation temperature and follow-up heat treatment on the magnetic properties of the bulk samples was investigated. The final heat treatment leads to a decrease of the coercivity to values between 7.5 and 9 A/m (Authors)

  13. An estimator-based distributed voltage-predictive control strategy for ac islanded microgrids

    DEFF Research Database (Denmark)

    Wang, Yanbo; Chen, Zhe; Wang, Xiongfei

    2015-01-01

    This paper presents an estimator-based voltage predictive control strategy for AC islanded microgrids, which is able to perform voltage control without any communication facilities. The proposed control strategy is composed of a network voltage estimator and a voltage predictive controller for each...... and has a good capability to reject uncertain perturbations of islanded microgrids....

  14. Operations Strategy Development in Project-based Production – a building contractor implements Lean

    DEFF Research Database (Denmark)

    Koch, Christian; Friis, Ole Uhrskov

    2015-01-01

    Purpose: To study how operations strategy innovation occurs in project-based production and organisation. Design/methodology/approach: A longitudinal case study encompassing the processes at the company headquarters and in two projects using Lean. Findings: The operations strategy development com...

  15. Towards a Community-Based Dementia Care Strategy: A Perspective from Quebec.

    Science.gov (United States)

    Godard-Sebillotte, Claire; Vedel, Isabelle; Bergman, Howard

    Morton-Chang et al. highlighted in their article the key strategic pillars of a community-based dementia care strategy: put "people first," support informal caregiving and enable "ground up" innovation and change. In our commentary, we draw upon our experience as authors of the Quebec Alzheimer Plan and evaluators of its implementation by the Quebec Ministry of Health and Social Services (MSSS). To us, a sustainable dementia care strategy entails a patient-centred approach, grounded in primary care, caring for persons with dementia at every stage of the disease. Implementation of such a strategy requires an ongoing effort to allow innovation adoption by clinicians and organizations.

  16. Using Load Balancing to Scalably Parallelize Sampling-Based Motion Planning Algorithms

    KAUST Repository

    Fidel, Adam; Jacobs, Sam Ade; Sharma, Shishir; Amato, Nancy M.; Rauchwerger, Lawrence

    2014-01-01

    Motion planning, which is the problem of computing feasible paths in an environment for a movable object, has applications in many domains ranging from robotics, to intelligent CAD, to protein folding. The best methods for solving this PSPACE-hard problem are so-called sampling-based planners. Recent work introduced uniform spatial subdivision techniques for parallelizing sampling-based motion planning algorithms that scaled well. However, such methods are prone to load imbalance, as planning time depends on region characteristics and, for most problems, the heterogeneity of the subproblems increases as the number of processors increases. In this work, we introduce two techniques to address load imbalance in the parallelization of sampling-based motion planning algorithms: an adaptive work stealing approach and bulk-synchronous redistribution. We show that applying these techniques to representatives of the two major classes of parallel sampling-based motion planning algorithms, probabilistic roadmaps and rapidly-exploring random trees, results in a more scalable and load-balanced computation on more than 3,000 cores. © 2014 IEEE.
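The work-stealing idea can be illustrated with a tiny sequential simulation (not the paper's distributed implementation): each spatial region gets a task queue of uneven length, and an idle worker steals half of the longest queue. Queue sizes and the round-based scheduling are invented for illustration.

```python
from collections import deque
import random

random.seed(42)

# Eight workers, one task queue per spatial region; queue lengths are very
# uneven, modeling the load imbalance the paper addresses.
regions = [deque(range(random.randint(0, 40))) for _ in range(8)]

def run_with_stealing(queues):
    """Round-based simulation: each round every worker processes one task,
    and an idle worker first steals half of the longest queue."""
    queues = [deque(q) for q in queues]
    rounds = 0
    while any(queues):
        for q in queues:
            if not q:                     # idle: steal from the longest queue
                victim = max(queues, key=len)
                for _ in range(len(victim) // 2):
                    q.append(victim.pop())
            if q:
                q.popleft()               # process one task this round
        rounds += 1
    return rounds

rounds_needed = run_with_stealing(regions)
total_tasks = sum(len(q) for q in regions)
```

With stealing, the number of rounds approaches the ideal `total_tasks / n_workers` instead of the length of the single longest queue.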

  17. Using Load Balancing to Scalably Parallelize Sampling-Based Motion Planning Algorithms

    KAUST Repository

    Fidel, Adam

    2014-05-01

    Motion planning, which is the problem of computing feasible paths in an environment for a movable object, has applications in many domains ranging from robotics, to intelligent CAD, to protein folding. The best methods for solving this PSPACE-hard problem are so-called sampling-based planners. Recent work introduced uniform spatial subdivision techniques for parallelizing sampling-based motion planning algorithms that scaled well. However, such methods are prone to load imbalance, as planning time depends on region characteristics and, for most problems, the heterogeneity of the subproblems increases as the number of processors increases. In this work, we introduce two techniques to address load imbalance in the parallelization of sampling-based motion planning algorithms: an adaptive work stealing approach and bulk-synchronous redistribution. We show that applying these techniques to representatives of the two major classes of parallel sampling-based motion planning algorithms, probabilistic roadmaps and rapidly-exploring random trees, results in a more scalable and load-balanced computation on more than 3,000 cores. © 2014 IEEE.

  18. Energy Management Strategies based on efficiency map for Fuel Cell Hybrid Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Feroldi, Diego; Serra, Maria; Riera, Jordi [Institut de Robotica i Informatica Industrial (CSIC-UPC), C. Llorens i Artigas 4, 08028 Barcelona (Spain)

    2009-05-15

    The addition of a fast auxiliary power source such as a supercapacitor bank in fuel cell-based vehicles has great potential because it permits a significant reduction of the hydrogen consumption and an improvement of the vehicle efficiency. The Energy Management Strategies, which command the power split between the power sources in the hybrid arrangement to fulfil the power requirement, play a fundamental role in achieving this objective. In this work, three strategies based on knowledge of the fuel cell efficiency map are proposed. These strategies are attractive due to the relative simplicity of their real-time implementation and their good performance. The strategies are tested both in a simulation environment and in an experimental setup using a 1.2-kW PEM fuel cell. The results, in terms of hydrogen consumption, are compared with an optimal case, which is assessed through an advantageous technique also introduced in this work, and with a pure fuel cell vehicle as well. This comparison reveals high efficiency and good performance, allowing hydrogen savings of up to 26% in urban scenarios. (author)
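An efficiency-map strategy of this kind can be sketched as follows. The fuel-cell efficiency curve, power levels, and state-of-charge thresholds below are invented; the sketch only illustrates the general idea of holding the fuel cell near its most efficient operating point while the supercapacitor absorbs or supplies the difference.

```python
import numpy as np

# Hypothetical fuel-cell efficiency map (power in W -> efficiency), shaped
# like a typical PEMFC curve with a maximum at partial load.
p_grid = np.linspace(100, 1200, 12)
eff_map = 0.55 - 0.25 * ((p_grid - 500) / 700) ** 2

def split_power(p_demand, soc, soc_min=0.4, soc_max=0.8):
    """Efficiency-map strategy sketch: run the fuel cell at (or near) its most
    efficient point; the supercapacitor covers the difference, subject to
    state-of-charge limits."""
    p_best = p_grid[np.argmax(eff_map)]
    if soc <= soc_min:           # SC depleted: FC must also recharge it
        p_fc = min(max(p_demand, p_best), p_grid[-1])
    elif soc >= soc_max:         # SC full: never exceed demand
        p_fc = min(p_best, p_demand)
    else:
        p_fc = p_best
    return p_fc, p_demand - p_fc   # (fuel cell power, supercapacitor power)

p_fc, p_sc = split_power(p_demand=900.0, soc=0.6)
```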

  19. English Language Learners' Strategies for Reading Computer-Based Texts at Home and in School

    Science.gov (United States)

    Park, Ho-Ryong; Kim, Deoksoon

    2016-01-01

    This study investigated four elementary-level English language learners' (ELLs') use of strategies for reading computer-based texts at home and in school. The ELLs in this study were in the fourth and fifth grades in a public elementary school. We identify the ELLs' strategies for reading computer-based texts in home and school environments. We…

  20. Model-based control strategies for systems with constraints of the program type

    Science.gov (United States)

    Jarzębowska, Elżbieta

    2006-08-01

    The paper presents a model-based tracking control strategy for constrained mechanical systems. The constraints we consider can be material and non-material, the latter referred to as program constraints. The program constraint equations represent tasks imposed on system motions; they can be differential equations of order higher than one or two, and may be non-integrable. The tracking control strategy relies upon two dynamic models: a reference model, which is a dynamic model of a system with arbitrary-order differential constraints, and a dynamic control model. The reference model serves as a motion planner, which generates inputs to the dynamic control model. It is based upon the generalized program motion equations (GPME) method. The method makes it possible to combine material and program constraints and merge them both into the motion equations. Lagrange's equations with multipliers are a special case of the GPME, since they apply to systems with first-order constraints. Our tracking strategy, referred to as a model reference program motion tracking control strategy, enables tracking of any program motion predefined by the program constraints. It extends "trajectory tracking" to "program motion tracking". We also demonstrate that our tracking strategy can be extended to hybrid program motion/force tracking.
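To fix notation, a program constraint of order p and the classical first-order special case can be written as below. The symbols (g, A, b, Q, lambda) are generic textbook notation, not necessarily the paper's:

```latex
% Program constraint of order p imposed on the motion q(t):
g\bigl(t,\, q,\, \dot q,\, \ddot q,\, \dots,\, q^{(p)}\bigr) = 0 .

% First-order (classical) special case, handled by Lagrange's equations
% with multipliers:
\frac{\mathrm{d}}{\mathrm{d}t}\frac{\partial L}{\partial \dot q}
  - \frac{\partial L}{\partial q} = Q + A^{\mathsf{T}}(q, t)\,\lambda ,
\qquad A(q, t)\,\dot q + b(q, t) = 0 .
```

The GPME method generalizes the second line to constraints of the first form, where p may exceed two and the equations need not be integrable.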

  1. Incentive-compatible demand-side management for smart grids based on review strategies

    Science.gov (United States)

    Xu, Jie; van der Schaar, Mihaela

    2015-12-01

    Demand-side load management is able to significantly improve the energy efficiency of smart grids. Since the electricity production cost depends on the aggregate energy usage of multiple consumers, an important incentive problem emerges: self-interested consumers want to increase their own utilities by consuming more than the socially optimal amount of energy during peak hours since the increased cost is shared among the entire set of consumers. To incentivize self-interested consumers to take the socially optimal scheduling actions, we design a new class of protocols based on review strategies. These strategies work as follows: first, a review stage takes place in which a statistical test is performed based on the daily prices of the previous billing cycle to determine whether or not the other consumers schedule their electricity loads in a socially optimal way. If the test fails, the consumers trigger a punishment phase in which, for a certain time, they adjust their energy scheduling in such a way that everybody in the consumer set is punished due to an increased price. Using a carefully designed protocol based on such review strategies, consumers then have incentives to take the socially optimal load scheduling to avoid entering this punishment phase. We rigorously characterize the impact of deploying protocols based on review strategies on the system's as well as the users' performance and determine the optimal design (optimal billing cycle, punishment length, etc.) for various smart grid deployment scenarios. Even though this paper considers a simplified smart grid model, our analysis provides important and useful insights for designing incentive-compatible demand-side management schemes based on aggregate energy usage information in a variety of practical scenarios.
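The review-stage test can be sketched in a few lines. The price series, baseline, and tolerance below are invented; the sketch only shows the mechanism of comparing a billing cycle's daily prices against the socially optimal baseline and triggering the punishment phase on failure.

```python
import statistics

def review_phase_triggers(daily_prices, baseline, threshold):
    """Review-strategy sketch: at the end of a billing cycle, test whether
    the mean daily price exceeds the socially optimal baseline by more than
    a tolerance; if so, consumers enter the punishment phase."""
    return statistics.mean(daily_prices) > baseline + threshold

# Cooperative cycle: prices fluctuate around the socially optimal baseline.
ok_cycle = [10.1, 9.8, 10.0, 10.2, 9.9]
# Deviating cycle: over-consumption in peak hours has raised prices.
bad_cycle = [10.9, 11.2, 11.0, 10.8, 11.1]

punish_ok = review_phase_triggers(ok_cycle, baseline=10.0, threshold=0.5)
punish_bad = review_phase_triggers(bad_cycle, baseline=10.0, threshold=0.5)
```

The paper's analysis then chooses the billing-cycle length, test threshold, and punishment duration so that deviating is never profitable in expectation.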

  2. Parameters Design for Logarithmic Quantizer Based on Zoom Strategy

    Directory of Open Access Journals (Sweden)

    Jingjing Yan

    2017-01-01

    Full Text Available This paper is concerned with the problem of designing suitable parameters for a logarithmic quantizer such that the closed-loop system is asymptotically convergent. Based on the zoom strategy, we propose two methods for quantizer parameter design which ensure that the state of the closed-loop system is confined to the invariant sets after certain moments. We then show that the quantizer is unsaturated, and thus that the quantization errors are bounded under the time-varying logarithmic quantization strategy. On that basis, we establish that the closed-loop system is asymptotically convergent. A benchmark example is given to show the usefulness of the proposed methods, and the comparison results are illustrated.
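A standard static logarithmic quantizer, the building block such zoom strategies rescale online, can be sketched as follows. The levels and density parameter are the usual textbook construction, not the paper's specific design; in a zoom strategy the scale `u0` would additionally be adjusted over time.

```python
import math

def log_quantize(v, u0=1.0, rho=0.5):
    """Logarithmic quantizer with levels u_i = rho**i * u0 and sector bound
    |q(v) - v| <= delta * |v|, where delta = (1 - rho)/(1 + rho).
    (Zoom strategies rescale u0 online; u0 is fixed in this sketch.)"""
    if v == 0:
        return 0.0
    delta = (1 - rho) / (1 + rho)
    sign = 1.0 if v > 0 else -1.0
    i = math.ceil(math.log(abs(v) * (1 + delta) / u0) / math.log(rho))
    return sign * (rho ** i) * u0

delta = (1 - 0.5) / (1 + 0.5)   # quantizer density parameter, here 1/3
# Relative quantization error stays within the sector bound delta.
errs = [abs(log_quantize(v) - v) / v for v in (0.03, 0.7, 2.5, 9.0)]
```

The bounded relative error is what makes the unsaturation argument in the paper possible: as the state shrinks, the quantization error shrinks proportionally.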

  3. Behavioral Contexts, Food-Choice Coping Strategies, and Dietary Quality of a Multiethnic Sample of Employed Parents

    Science.gov (United States)

    Blake, Christine E.; Wethington, Elaine; Farrell, Tracy J.; Bisogni, Carole A.; Devine, Carol M.

    2012-01-01

    Employed parents’ work and family conditions provide behavioral contexts for their food choices. Relationships between employed parents’ food-choice coping strategies, behavioral contexts, and dietary quality were evaluated. Data on work and family conditions, sociodemographic characteristics, eating behavior, and dietary intake from two 24-hour dietary recalls were collected in a random sample cross-sectional pilot telephone survey in the fall of 2006. Black, white, and Latino employed mothers (n=25) and fathers (n=25) were recruited from a low/moderate income urban area in upstate New York. Hierarchical cluster analysis (Ward’s method) identified three clusters of parents differing in use of food-choice coping strategies (ie, Individualized Eating, Missing Meals, and Home Cooking). Cluster sociodemographic, work, and family characteristics were compared using χ2 and Fisher’s exact tests. Cluster differences in dietary quality (Healthy Eating Index 2005) were analyzed using analysis of variance. Clusters differed significantly (P≤0.05) on food-choice coping strategies, dietary quality, and behavioral contexts (ie, work schedule, marital status, partner’s employment, and number of children). Individualized Eating and Missing Meals clusters were characterized by nonstandard work hours, having a working partner, single parenthood and with family meals away from home, grabbing quick food instead of a meal, using convenience entrées at home, and missing meals or individualized eating. The Home Cooking cluster included considerably more married fathers with nonemployed spouses and more home-cooked family meals. Food-choice coping strategies affecting dietary quality reflect parents’ work and family conditions. Nutritional guidance and family policy needs to consider these important behavioral contexts for family nutrition and health. PMID:21338739

  4. Research Strategies in Science-based Start-ups

    DEFF Research Database (Denmark)

    Valentin, Finn; Dahlgren, Johan Henrich; Lund Jensen, Rasmus

    Although biotech start-ups fail or succeed based on their research, few attempts have been made to examine if and how they strategize in this core of their activity. Popular views on Dedicated Biotech Firms (DBFs) see the inherent uncertainty of research as defying notions of strategizing, directing...... We develop a contingency view on complex problem solving which structures the argument into three steps: 1) characterising the problem architectures addressed by different types of DBFs; 2) testing and confirming that DBFs form requisite research strategies, by which we refer to problem solving approaches......

  5. Crisis Intervention Strategies for School-Based Helpers. Second Edition.

    Science.gov (United States)

    Fairchild, Thomas N., Ed.

    School-based helpers are helping professionals who work within educational settings and whose training and primary responsibility is to promote the mental health of students. Few resource materials provide these helpers with needed information and practical strategies--this text tries to meet that need. The 12 chapters here cover a wide range of…

  6. Identification of proteomic biomarkers predicting prostate cancer aggressiveness and lethality despite biopsy-sampling error

    OpenAIRE

    Shipitsin, M; Small, C; Choudhury, S; Giladi, E; Friedlander, S; Nardone, J; Hussain, S; Hurley, A D; Ernst, C; Huang, Y E; Chang, H; Nifong, T P; Rimm, D L; Dunyak, J; Loda, M

    2014-01-01

    Background: Key challenges of biopsy-based determination of prostate cancer aggressiveness include tumour heterogeneity, biopsy-sampling error, and variations in biopsy interpretation. The resulting uncertainty in risk assessment leads to significant overtreatment, with associated costs and morbidity. We developed a performance-based strategy to identify protein biomarkers predictive of prostate cancer aggressiveness and lethality regardless of biopsy-sampling variation. Methods: Prostatectom...

  7. Nonlinear control strategy based on using a shape-tunable neural controller

    Energy Technology Data Exchange (ETDEWEB)

    Chen, C.; Peng, S. [Feng Chia Univ, Taichung (Taiwan, Province of China). Department of chemical Engineering; Chang, W. [Feng Chia Univ, Taichung (Taiwan, Province of China). Department of Automatic Control

    1997-08-01

    In this paper, a nonlinear control strategy based on a shape-tunable neural network is developed for adaptive control of nonlinear processes. Based on the steepest descent method, a learning algorithm is derived that enables the neural controller to adjust its output range automatically. This novel feature of automatic output-range adjustment gives the neural controller more flexibility and capability, and therefore the scaling procedure, which is usually unavoidable for conventional fixed-shape neural controllers, becomes unnecessary. The advantages and effectiveness of the proposed nonlinear control strategy are demonstrated through the challenging problem of controlling an open-loop unstable nonlinear continuous stirred tank reactor (CSTR). 14 refs., 11 figs.
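The core idea, an output-range parameter adapted by steepest descent so that no manual output scaling is needed, can be sketched on a toy plant. The first-order plant, controller form `u = A*tanh(e)`, and all gains below are invented for illustration and are far simpler than the paper's CSTR example.

```python
import math

# Toy stable first-order plant (invented, stands in for the process).
def plant(y, u):
    return 0.9 * y + 0.2 * u

A, lr = 0.1, 0.2          # output-range parameter and learning rate
y, setpoint = 0.0, 1.0
for _ in range(3000):
    e = setpoint - y
    u = A * math.tanh(e)  # bounded controller; A sets the output range
    y = plant(y, u)
    # Steepest descent on J = 0.5 * e_next**2 w.r.t. A (dy/du = 0.2 here),
    # so the controller grows its own output range as long as error persists.
    e_next = setpoint - y
    A += lr * e_next * 0.2 * math.tanh(e)

final_error = abs(setpoint - y)
```

The range parameter `A` grows from its small initial value until the controller can actually reach the setpoint, which is the "automatic output range adjustment" the abstract describes.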

  8. A strategy learning model for autonomous agents based on classification

    Directory of Open Access Journals (Sweden)

    Śnieżyński Bartłomiej

    2015-09-01

    Full Text Available In this paper we propose a strategy learning model for autonomous agents based on classification. In the literature, the most commonly used learning method in agent-based systems is reinforcement learning. In our opinion, classification can be considered a good alternative. This type of supervised learning can be used to generate a classifier that allows the agent to choose an appropriate action for execution. Experimental results show that this model can be successfully applied for strategy generation even if rewards are delayed. We compare the efficiency of the proposed model and reinforcement learning using the farmer-pest domain and configurations of various complexity. In complex environments, supervised learning can improve the performance of agents much faster than reinforcement learning. If an appropriate knowledge representation is used, the learned knowledge may be analyzed by humans, which allows tracking of the learning process.
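The classification-based pipeline can be sketched in miniature: gather experience, label each state with its best-observed action, and let a classifier pick actions. The toy "farmer-pest"-flavored environment, reward rule, and nearest-state classifier below are all invented; the paper uses richer configurations and standard supervised learners.

```python
import random

random.seed(3)

# Toy environment (invented): state = pest level 0..4; the correct action
# is to spray (1) iff the pest level is at least 2.
def reward(state, action):
    return 1 if action == (1 if state >= 2 else 0) else 0

# 1) Gather experience with a random exploration policy.
experience = [(s, a, reward(s, a))
              for s, a in ((random.randint(0, 4), random.randint(0, 1))
                           for _ in range(200))]

# 2) Build a training set: label each state with its best-observed action.
best = {}
for s, a, r in experience:
    if r > best.get(s, (None, -1))[1]:
        best[s] = (a, r)
training = [(s, a) for s, (a, r) in best.items()]

# 3) "Classifier": the nearest labeled state decides the action
# (a stand-in for any supervised learner, e.g. a decision tree).
def policy(state):
    s_near, a = min(training, key=lambda sa: abs(sa[0] - state))
    return a

accuracy = sum(policy(s) == (1 if s >= 2 else 0) for s in range(5)) / 5
```

Because the training set is explicit (state, action) pairs, the learned strategy stays human-readable, which is the interpretability point the abstract makes.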

  9. How does observation uncertainty influence which stream water samples are most informative for model calibration?

    Science.gov (United States)

    Wang, Ling; van Meerveld, Ilja; Seibert, Jan

    2016-04-01

    Streamflow isotope samples taken during rainfall-runoff events are very useful for multi-criteria model calibration because they can help decrease parameter uncertainty and improve internal model consistency. However, the number of samples that can be collected and analysed is often restricted by practical and financial constraints. It is, therefore, important to choose an appropriate sampling strategy and to obtain samples that have the highest information content for model calibration. We used the Birkenes hydrochemical model and synthetic rainfall, streamflow and isotope data to explore which samples are most informative for model calibration. Starting with error-free observations, we investigated how many samples are needed to obtain a certain model fit. Based on different parameter sets, representing different catchments, and different rainfall events, we also determined which sampling times provide the most informative data for model calibration. Our results show that simulation performance for models calibrated with the isotopic data from two intelligently selected samples was comparable to simulations based on isotopic data for all 100 time steps. The models calibrated with the intelligently selected samples also performed better than the model calibrations with two benchmark sampling strategies (random selection and selection based on hydrologic information). Surprisingly, samples on the rising limb and at the peak were less informative than expected and, generally, samples taken at the end of the event were most informative. The timing of the most informative samples depends on the proportion of different flow components (baseflow, slow response flow, fast response flow and overflow). For events dominated by baseflow and slow response flow, samples taken at the end of the event after the fast response flow has ended were most informative; when the fast response flow was dominant, samples taken near the peak were most informative. However when overflow

  10. Automated contouring error detection based on supervised geometric attribute distribution models for radiation therapy: A general strategy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Hsin-Chen; Tan, Jun; Dolly, Steven; Kavanaugh, James; Harold Li, H.; Altman, Michael; Gay, Hiram; Thorstad, Wade L.; Mutic, Sasa; Li, Hua, E-mail: huli@radonc.wustl.edu [Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States); Anastasio, Mark A. [Department of Biomedical Engineering, Washington University, St. Louis, Missouri 63110 (United States); Low, Daniel A. [Department of Radiation Oncology, University of California Los Angeles, Los Angeles, California 90095 (United States)

    2015-02-15

    were separately employed to test the effectiveness of the proposed contouring error detection strategy. Results: An evaluation tool was implemented to illustrate how the proposed strategy automatically detects the radiation therapy contouring errors for a given patient and provides 3D graphical visualization of error detection results as well. The contouring error detection results were achieved with an average sensitivity of 0.954/0.906 and an average specificity of 0.901/0.909 on the centroid/volume related contouring errors of all the tested samples. As for the detection results on structural shape related contouring errors, an average sensitivity of 0.816 and an average specificity of 0.94 on all the tested samples were obtained. The promising results indicated the feasibility of the proposed strategy for the detection of contouring errors with low false detection rate. Conclusions: The proposed strategy can reliably identify contouring errors based upon inter- and intrastructural constraints derived from clinically approved contours. It holds great potential for improving the radiation therapy workflow. ROC and box plot analyses allow for analytically tuning of the system parameters to satisfy clinical requirements. Future work will focus on the improvement of strategy reliability by utilizing more training sets and additional geometric attribute constraints.
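The flavor of a geometric-attribute check can be sketched for a single attribute. The volumes, structure, and 3-sigma band below are invented; the paper learns distributions over many inter- and intrastructural attributes rather than a single univariate band.

```python
import statistics

# Learn a normal range for one geometric attribute (structure volume, cm^3)
# from clinically approved contours; values are invented for illustration.
approved_volumes = [31.2, 29.8, 30.5, 32.0, 30.9, 29.5, 31.7, 30.2]

mu = statistics.mean(approved_volumes)
sigma = statistics.stdev(approved_volumes)

def is_contour_error(volume, k=3.0):
    """Flag a candidate contour whose volume falls outside mean +/- k*stdev."""
    return abs(volume - mu) > k * sigma

flag_normal = is_contour_error(30.8)
flag_error = is_contour_error(45.0)   # e.g. contoured on the wrong structure
```

Sensitivity/specificity trade-offs of the kind reported in the abstract come from tuning thresholds such as `k` via ROC analysis.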

  11. Automated contouring error detection based on supervised geometric attribute distribution models for radiation therapy: A general strategy

    International Nuclear Information System (INIS)

    Chen, Hsin-Chen; Tan, Jun; Dolly, Steven; Kavanaugh, James; Harold Li, H.; Altman, Michael; Gay, Hiram; Thorstad, Wade L.; Mutic, Sasa; Li, Hua; Anastasio, Mark A.; Low, Daniel A.

    2015-01-01

    were separately employed to test the effectiveness of the proposed contouring error detection strategy. Results: An evaluation tool was implemented to illustrate how the proposed strategy automatically detects the radiation therapy contouring errors for a given patient and provides 3D graphical visualization of error detection results as well. The contouring error detection results were achieved with an average sensitivity of 0.954/0.906 and an average specificity of 0.901/0.909 on the centroid/volume related contouring errors of all the tested samples. As for the detection results on structural shape related contouring errors, an average sensitivity of 0.816 and an average specificity of 0.94 on all the tested samples were obtained. The promising results indicated the feasibility of the proposed strategy for the detection of contouring errors with low false detection rate. Conclusions: The proposed strategy can reliably identify contouring errors based upon inter- and intrastructural constraints derived from clinically approved contours. It holds great potential for improving the radiation therapy workflow. ROC and box plot analyses allow for analytically tuning of the system parameters to satisfy clinical requirements. Future work will focus on the improvement of strategy reliability by utilizing more training sets and additional geometric attribute constraints

  12. Using machine learning to accelerate sampling-based inversion

    Science.gov (United States)

    Valentine, A. P.; Sambridge, M.

    2017-12-01

    In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high, and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods, such as the Neighbourhood Algorithm, and which bridges the gap between prior- and posterior-sampling frameworks.
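The surrogate idea can be sketched with a tiny Gaussian-process regression: fit the GP to a handful of exact forward evaluations, then query its posterior mean instead of the expensive forward operator during sampling. The `forward` function, kernel, and length scale below are illustrative choices, not the authors' setup.

```python
import numpy as np

def forward(m):                 # stand-in for an expensive forward simulation
    return np.sin(2 * m) + 0.3 * m

def rbf(a, b, ell=0.5):
    """Squared-exponential (RBF) covariance between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

m_train = np.linspace(0, 3, 8)      # models already evaluated exactly
d_train = forward(m_train)

K = rbf(m_train, m_train) + 1e-8 * np.eye(len(m_train))  # jitter for stability
alpha = np.linalg.solve(K, d_train)

def surrogate(m):
    """GP posterior mean: the cheap replacement queried during sampling."""
    return rbf(np.atleast_1d(m), m_train) @ alpha

m_test = np.linspace(0.2, 2.8, 50)
max_err = np.max(np.abs(surrogate(m_test) - forward(m_test)))
```

Refinement during inversion, as described in the abstract, amounts to appending newly evaluated models to `m_train` and re-solving for `alpha`.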

  13. The effect of problem-based and lecture-based instructional strategies on learner problem solving performance, problem solving processes, and attitudes

    Science.gov (United States)

    Visser, Yusra Laila

    This study compared the effect of lecture-based instruction to that of problem-based instruction on learner performance (on near-transfer and far-transfer problems), problem solving processes (reasoning strategy usage and reasoning efficiency), and attitudes (overall motivation and learner confidence) in a Genetics course. The study also analyzed the effect of self-regulatory skills and prior-academic achievement on performance for both instructional strategies. Sixty 11th grade students at a public math and science academy were assigned to either a lecture-based instructional strategy or a problem-based instructional strategy. Both treatment groups received 18 weeks of Genetics instruction through the assigned instructional strategy. In terms of problem solving performance, results revealed that the lecture-based group performed significantly better on near-transfer post-test problems. The problem-based group performed significantly better on far-transfer post-test problems. In addition, results indicated the learners in the lecture-based instructional treatment were significantly more likely to employ data-driven reasoning in the solving of problems, whereas learners in the problem-based instructional treatment were significantly more likely to employ hypothesis-driven reasoning in problem solving. No significant differences in reasoning efficiency were uncovered between treatment groups. Preliminary analysis of the motivation data suggested that there were no significant differences in motivation between treatment groups. However, a post-research exploratory analysis suggests that overall motivation was significantly higher in the lecture-based instructional treatment than in the problem-based instructional treatment. Learner confidence was significantly higher in the lecture-based group than in the problem-based group. 
A significant positive correlation was detected between self-regulatory skills scores and problem solving performance scores in the problem-based

  14. Classification of amyotrophic lateral sclerosis disease based on convolutional neural network and reinforcement sample learning algorithm.

    Science.gov (United States)

    Sengur, Abdulkadir; Akbulut, Yaman; Guo, Yanhui; Bajaj, Varun

    2017-12-01

    Electromyogram (EMG) signals contain useful information about neuromuscular diseases such as amyotrophic lateral sclerosis (ALS). ALS is a well-known neurodegenerative disease that progressively degenerates the motor neurons. In this paper, we propose a deep learning based method for efficient classification of ALS and normal EMG signals. Spectrogram, continuous wavelet transform (CWT), and smoothed pseudo Wigner-Ville distribution (SPWVD) have been employed for time-frequency (T-F) representation of EMG signals. A convolutional neural network (CNN) is employed to classify these features; the architecture consists of two convolution layers, two pooling layers, a fully connected layer, and a loss function layer. The CNN is trained with the reinforcement sample learning strategy. The efficiency of the proposed implementation is tested on a publicly available EMG dataset containing 89 ALS and 133 normal EMG signals recorded at a 24 kHz sampling frequency. Experimental results show 96.80% accuracy. The obtained results are also compared with other methods, demonstrating the superiority of the proposed method.
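    The record does not reproduce the paper's CNN pipeline, but its first step, a time-frequency representation, is easy to illustrate. Below is a minimal NumPy sketch of a magnitude spectrogram (a Hann-windowed short-time Fourier transform); the window length, hop size, and synthetic test signal are illustrative assumptions, not values from the paper.

```python
import numpy as np

def spectrogram(signal, fs, win_len=256, hop=128):
    """Magnitude spectrogram via a Hann-windowed short-time Fourier transform."""
    window = np.hanning(win_len)
    n_frames = 1 + (len(signal) - win_len) // hop
    frames = np.stack([signal[i * hop : i * hop + win_len] * window
                       for i in range(n_frames)])
    spec = np.abs(np.fft.rfft(frames, axis=1)).T   # shape: (freq_bins, n_frames)
    freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
    return spec, freqs

# Synthetic "EMG-like" signal: a 100 Hz component in noise, 24 kHz sampling
fs = 24_000
t = np.arange(fs) / fs                     # one second of data
x = np.sin(2 * np.pi * 100 * t) + 0.5 * np.random.default_rng(0).standard_normal(fs)
spec, freqs = spectrogram(x, fs)
print(spec.shape)   # (129, 186): 129 frequency bins x 186 time frames
```

    A T-F image like `spec` (or its CWT/SPWVD analogues) is what a CNN would then consume as a 2-D input.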

  15. Development strategy and conceptual design of China Lead-based Research Reactor

    International Nuclear Information System (INIS)

    Wu, Yican; Bai, Yunqing; Song, Yong; Huang, Qunying; Zhao, Zhumin; Hu, Liqin

    2016-01-01

    Highlights: • China LEAd-based Reactor (CLEAR), proposed by the Institute of Nuclear Energy Safety Technology (INEST), is selected as the ADS reference reactor. • The Chinese ADS development program consists of three stages; during the first stage, a 10 MWth lead-based research reactor named CLEAR-I will be built with subcritical and critical dual-mode operation capability for validation of ADS transmutation system and lead-cooled fast reactor technology. • Major design principles of CLEAR-I are oriented at technology feasibility, safety reliability, experiment flexibility and technology continuity. Following the development strategy and design principles, CLEAR-I design options and conceptual design scenarios are presented. - Abstract: The Chinese Academy of Sciences (CAS) launched an engineering project in 2011 to develop an Accelerator Driven System (ADS) for nuclear waste transmutation, and China LEAd-based Reactor (CLEAR), proposed by the Institute of Nuclear Energy Safety Technology (INEST), is selected as the ADS reference reactor. In this paper, the development strategy and conceptual design of the China Lead-based Research Reactor are proposed. The Chinese ADS development program consists of three stages; during the first stage, a 10 MWth lead-based research reactor named CLEAR-I will be built with subcritical and critical dual-mode operation capability for validation of ADS transmutation system and lead-cooled fast reactor technology. Major design principles of CLEAR-I are oriented at technology feasibility, safety reliability, experiment flexibility and technology continuity. Following the development strategy and design principles, CLEAR-I design options and conceptual design scenarios are presented.

  16. Sampling guidelines for oral fluid-based surveys of group-housed animals.

    Science.gov (United States)

    Rotolo, Marisa L; Sun, Yaxuan; Wang, Chong; Giménez-Lirola, Luis; Baum, David H; Gauger, Phillip C; Harmon, Karen M; Hoogland, Marlin; Main, Rodger; Zimmerman, Jeffrey J

    2017-09-01

    Formulas and software for calculating sample size for surveys based on individual animal samples are readily available. However, sample size formulas are not available for oral fluids and other aggregate samples that are increasingly used in production settings. Therefore, the objective of this study was to develop sampling guidelines for oral fluid-based porcine reproductive and respiratory syndrome virus (PRRSV) surveys in commercial swine farms. Oral fluid samples were collected in 9 weekly samplings from all pens in 3 barns on one production site beginning shortly after placement of weaned pigs. Samples (n=972) were tested by real-time reverse-transcription PCR (RT-rtPCR) and the binary results analyzed using a piecewise exponential survival model for interval-censored, time-to-event data with misclassification. Thereafter, simulation studies were used to study the barn-level probability of PRRSV detection as a function of sample size, sample allocation (simple random sampling vs fixed spatial sampling), assay diagnostic sensitivity and specificity, and pen-level prevalence. These studies provided estimates of the probability of detection by sample size and within-barn prevalence. Detection using fixed spatial sampling was as good as, or better than, simple random sampling. Sampling multiple barns on a site increased the probability of detection with the number of barns sampled. These results are relevant to PRRSV control or elimination projects at the herd, regional, or national levels, but the results are also broadly applicable to contagious pathogens of swine for which oral fluid tests of equivalent performance are available. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
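    The record does not give the survival-model simulation itself, but its central quantity, barn-level probability of detection as a function of sample size, can be sketched with a much simpler binomial model. The prevalence and sensitivity values below are illustrative assumptions, and the independence assumption is a simplification of the paper's spatially correlated, misclassification-aware model.

```python
def detection_probability(n, prevalence, sensitivity):
    """P(at least one true-positive result) when n aggregate samples are drawn
    independently and a sample from a positive pen tests positive with
    probability `sensitivity`. Assumes perfect specificity for simplicity."""
    p_hit = prevalence * sensitivity   # chance one sample yields a positive test
    return 1.0 - (1.0 - p_hit) ** n

# Detection probability rises with sample size at a fixed 10% pen-level prevalence:
for n in (5, 10, 20, 30):
    print(n, round(detection_probability(n, prevalence=0.10, sensitivity=0.95), 3))
```

    Inverting this relationship (choosing the smallest n that reaches a target probability) is the usual way such curves are turned into sampling guidelines.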

  17. Failure Probability Calculation Method Using Kriging Metamodel-based Importance Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seunggyu [Korea Aerospace Research Institue, Daejeon (Korea, Republic of); Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-05-15

    The kernel density was determined based on sampling points obtained in a Markov chain simulation and was taken as the importance sampling function. A Kriging metamodel was constructed in more detail in the vicinity of the limit state. The failure probability was calculated by importance sampling performed on the Kriging metamodel. A pre-existing method was modified to obtain more sampling points for the kernel density in the vicinity of the limit state. A stable numerical method was proposed to find the parameter of the kernel density. To assess the completeness of the Kriging metamodel, the possibility of changes in the calculated failure probability due to the uncertainty of the Kriging metamodel was evaluated.
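    The Kriging and MCMC machinery is not shown in this record, but the underlying importance-sampling estimate of a small failure probability can be sketched. The example below replaces the Kriging surrogate and the MCMC-derived kernel density with a known linear limit state and a Gaussian importance density centered at the design point; all of those choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy limit state: failure when g(x) < 0, with x ~ N(0, 1).
# Here g(x) = 3 - x, so the exact failure probability is P(X > 3) ~ 1.35e-3.
def g(x):
    return 3.0 - x

# Importance sampling with the density shifted to the design point x* = 3
# (a stand-in for the paper's kernel-density importance sampling function).
n = 100_000
x_is = rng.normal(loc=3.0, size=n)               # samples from the IS density
# weight = phi(x) / phi(x - 3) for the standard normal pdf phi
log_w = -0.5 * x_is**2 + 0.5 * (x_is - 3.0) ** 2
pf_is = np.mean((g(x_is) < 0) * np.exp(log_w))
print(pf_is)   # close to the exact tail probability 1 - Phi(3) ~ 1.35e-3
```

    Crude Monte Carlo would need millions of samples to see this event; shifting the sampling density toward the limit state is what makes the estimate cheap.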

  18. Evaluation Strategy

    DEFF Research Database (Denmark)

    Coto Chotto, Mayela; Wentzer, Helle; Dirckinck-Holmfeld, Lone

    2009-01-01

    The paper presents an evaluation strategy based on deliberate ideals and principles of dialogue design. The evaluation strategy is grounded in experiential phenomenology, taking the point of departure for design and evaluation processes in the experienced practitioners themselves. The article presents the evaluation strategy and methodology of the research project Making Online Path to Enter new Markets (MOPEM), an EU research project with partners from educational institutions of technology and business in five European countries.

  19. Adaptive list sequential sampling method for population-based observational studies

    NARCIS (Netherlands)

    Hof, Michel H.; Ravelli, Anita C. J.; Zwinderman, Aeilko H.

    2014-01-01

    In population-based observational studies, non-participation and delayed response to the invitation to participate are complications that often arise during the recruitment of a sample. When both are not properly dealt with, the composition of the sample can be different from the desired

  20. Systems consultation: protocol for a novel implementation strategy designed to promote evidence-based practice in primary care

    OpenAIRE

    Quanbeck, Andrew; Brown, Randall T; E Zgierska, Aleksandra; A Johnson, Roberta; Robinson, James M; Jacobson, Nora

    2016-01-01

    Background Adoption of evidence-based practices takes place at a glacial pace in healthcare. This research will pilot test an innovative implementation strategy, systems consultation, intended to speed the adoption of evidence-based practice in primary care. The strategy is based on tenets of systems engineering and has been extensively tested in addiction treatment. Three innovations have been included in the strategy: translation of a clinical practice guideline into a checklist-based im...

  1. Students’ Reading Comprehension Performance with Emotional Literacy-Based Strategy Intervention

    Directory of Open Access Journals (Sweden)

    Yusfarina Mohd Yussof

    2013-07-01

    Full Text Available An effective reading comprehension process demands a strategy to enhance the cognitive ability to digest text information in the effort to elicit meaning contextually. In addition, the role of emotions also influences the efficacy of this process, especially in narrative text comprehension. This quasi-experimental study aims to observe students’ performance in the Reading Comprehension Test resulting from the Emotional Literacy-Based Reading Comprehension Strategy (ELBRCS), which is a combination of cognitive and affective strategies. This study involved 90 students: 45 students were clustered in the Experimental Group and received the ELBRCS intervention, and the remaining 45 students were placed in the Control Group and underwent the conventional strategy (the prevalent classroom method). The students’ reading comprehension performance was measured using the Reading Comprehension Test (RCT). The findings show that the experimental group received a higher score than the control group on the RCT. The intervention successfully raised students’ reading comprehension from literal comprehension to higher levels of comprehension, i.e. the inferential, evaluative and appreciative levels, as indicated by Barrett’s Taxonomy.

  2. Developing trading strategies based on fractal finance: An application of MF-DFA in the context of Islamic equities

    Science.gov (United States)

    Dewandaru, Ginanjar; Masih, Rumi; Bacha, Obiyathulla Ismath; Masih, A. Mansur. M.

    2015-11-01

    We provide a new contribution to trading strategies by using multi-fractal de-trended fluctuation analysis (MF-DFA), imported from econophysics, to complement various momentum strategies. The method provides a single measure that can capture both persistency and anti-persistency in stock prices, accounting for multifractality. This study uses a sample of Islamic stocks listed in the U.S. Dow Jones Islamic market over a 16-year sample period starting in 1996. The findings show that the MF-DFA strategy produces monthly excess returns of 6.12%, outperforming various other momentum strategies. Even though the risk of the MF-DFA strategy may be relatively higher, it can still produce a Sharpe ratio of 0.164, which is substantially higher than that of the other strategies. When we control for the MF-DFA factor with the other factors, its pure factor return is still able to yield a monthly excess return of 1.35%. Finally, we combine the momentum and MF-DFA strategies in proportions of 90/10, 80/20, and 70/30, and by doing so we demonstrate that the MF-DFA measure can boost the total monthly excess returns as well as the Sharpe ratio. The value added is non-linear, which implies that the additional returns are associated with lower incremental risk.
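    MF-DFA generalizes ordinary detrended fluctuation analysis (DFA) over a range of moments q; the q = 2 special case already conveys the persistency measure the strategy exploits. Below is a minimal NumPy sketch of that special case; the scales, series length, and synthetic return series are illustrative assumptions, not the paper's configuration or trading rule.

```python
import numpy as np

def dfa(series, scales):
    """Detrended fluctuation analysis (the q = 2 special case of MF-DFA).
    Returns F(s) for each scale; the slope of log F(s) vs log s estimates
    the Hurst exponent H (H ~ 0.5: no persistency, H > 0.5: persistency)."""
    profile = np.cumsum(series - np.mean(series))
    fluct = []
    for s in scales:
        n_seg = len(profile) // s
        f2 = []
        for seg in profile[: n_seg * s].reshape(n_seg, s):
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrending
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.array(fluct)

rng = np.random.default_rng(1)
returns = rng.standard_normal(4096)            # uncorrelated noise: H ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
F = dfa(returns, scales)
H = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(H, 2))   # near 0.5 for uncorrelated returns
```

    The multifractal extension repeats this with q-th order averages of the segment fluctuations, giving a spectrum of exponents rather than a single H.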

  3. Target-induced formation of gold amalgamation on DNA-based sensing platform for electrochemical monitoring of mercury ion coupling with cycling signal amplification strategy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jinfeng; Tang, Juan; Zhou, Jun; Zhang, Lan; Chen, Guonan; Tang, Dianping, E-mail: dianping.tang@fzu.edu.cn

    2014-01-31

    Graphical abstract: -- Highlights: •We report a new electrochemical sensing protocol for the detection of mercury ion. •Gold amalgamation on a DNA-based sensing platform was used as the nanocatalyst. •The signal was amplified by a cycling signal amplification strategy. -- Abstract: Heavy metal ion pollution poses severe risks to human health and the environment because of the likelihood of bioaccumulation and toxicity. Driven by the requirement to monitor trace-level mercury ion (Hg{sup 2+}), herein we construct a new DNA-based sensor for sensitive electrochemical monitoring of Hg{sup 2+} by coupling target-induced formation of gold amalgamation on a DNA-based sensing platform with a gold amalgamation-catalyzed cycling signal amplification strategy. The sensor was simply prepared by covalent conjugation of aminated poly-T{sub (25)} oligonucleotide onto the glassy carbon electrode by typical carbodiimide coupling. Upon introduction of the target analyte, the Hg{sup 2+} ion was intercalated into the DNA polyion complex membrane based on T–Hg{sup 2+}–T coordination chemistry. The chelated Hg{sup 2+} ion could induce the formation of gold amalgamation, which could catalyze the reduction of p-nitrophenol with the aid of NaBH{sub 4} and Ru(NH{sub 3}){sub 6}{sup 3+} for cycling signal amplification. Experimental results indicated that the electronic signal of our system increased with increasing Hg{sup 2+} level in the sample, with a detection limit of 0.02 nM and a dynamic range of up to 1000 nM Hg{sup 2+}. The strategy afforded exquisite selectivity for Hg{sup 2+} against other environmentally relevant metal ions. In addition, the methodology was evaluated for the analysis of Hg{sup 2+} in spiked tap-water samples, and the recovery was 87.9–113.8%.

  4. Creating infrastructure supportive of evidence-based nursing practice: leadership strategies.

    Science.gov (United States)

    Newhouse, Robin P

    2007-01-01

    Nursing leadership is the cornerstone of successful evidence-based practice (EBP) programs within health care organizations. The key to success is a strategic approach to building an EBP infrastructure, with allocation of appropriate human and material resources. This article indicates the organizational infrastructure that enables evidence-based nursing practice and strategies for leaders to enhance evidence-based practice using "the conceptual model for considering the determinants of diffusion, dissemination, and implementation of innovations in health service delivery and organization." Enabling EBP within organizations is important for promoting positive outcomes for nurses and patients. Fostering EBP is not a static or immediate outcome, but a long-term developmental process within organizations. Implementation requires multiple strategies to cultivate a culture of inquiry where nurses generate and answer important questions to guide practice. Organizations that can enable the culture and build infrastructure to help nurses develop EBP competencies will produce a professional environment that will result in both personal growth for their staff and improvements in quality that would not otherwise be possible.

  5. Influence of faith-based organisations on HIV prevention strategies ...

    African Journals Online (AJOL)

    2017-09-03

    Sep 3, 2017 ... Keywords: Faith-based organisations, HIV prevention strategies, systematic review. ... 2017;17(3): 753-761. https://dx.doi.org/10.4314/ahs.v17i3.18. Introduction. HIV (Human ... checked, and citations in key papers were hand searched. ... that answered our research question: What is the influence of ...

  6. Strategy and space

    DEFF Research Database (Denmark)

    Jensen, Per Anker

    2011-01-01

    The article is based on results from a research project on space strategies and building values, which included a major case study of the development of facilities for the Danish Broadcasting Corporation over time. The focus is to identify how different space strategies have been implemented in different periods and how these strategies can be related to the general conditions of the corporation. The strategic uncertainty of the corporation is investigated as a main determining factor for changes in space strategy, based on theories of the relations between strategy and place. These theories include that corporations follow one of three generic space strategies: incrementalism, standardization, and value-based strategy. Among the conclusions is that space strategies mostly change between incremental and value-based strategies, but one period of standardization was identified...

  7. Stochastic bounded consensus tracking of leader-follower multi-agent systems with measurement noises based on sampled data with general sampling delay

    International Nuclear Information System (INIS)

    Wu Zhi-Hai; Peng Li; Xie Lin-Bo; Wen Ji-Wei

    2013-01-01

    In this paper we provide a unified framework for consensus tracking of leader-follower multi-agent systems with measurement noises based on sampled data with a general sampling delay. First, a stochastic bounded consensus tracking protocol based on sampled data with a general sampling delay is presented by employing the delay decomposition technique. Then, necessary and sufficient conditions are derived for guaranteeing leader-follower multi-agent systems with measurement noises and a time-varying reference state to achieve mean square bounded consensus tracking. The obtained results cover no sampling delay, a small sampling delay and a large sampling delay as three special cases. Last, simulations are provided to demonstrate the effectiveness of the theoretical results. (interdisciplinary physics and related areas of science and technology)
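    The protocol and its delay decomposition are not reproduced in this record, but the basic sampled-data leader-follower tracking idea can be sketched in discrete time. The graph, pinning gains, sampling period, and noise-free, delay-free dynamics below are illustrative assumptions (the paper's setting includes measurement noises and a general sampling delay).

```python
import numpy as np

h = 0.1                                   # sampling period (assumed)
steps = 300
x0 = 0.0                                  # leader's time-varying reference state
followers = np.array([5.0, -3.0, 2.0, 8.0])

# Ring communication graph among the four followers; only followers 0 and 3
# receive the leader's state directly (pinning gains b).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
b = np.array([1.0, 0.0, 0.0, 1.0])

for _ in range(steps):
    x0 += 0.05 * h                        # leader drifts slowly
    u = A @ followers - A.sum(axis=1) * followers + b * (x0 - followers)
    followers = followers + h * u         # sampled-data update

err = np.abs(followers - x0).max()
print(round(err, 3))   # small bounded tracking lag (about 0.15 here)
```

    Because the reference keeps moving, the followers reach bounded tracking (a small steady lag) rather than exact consensus, which mirrors the "mean square bounded consensus tracking" notion in the abstract.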

  8. Advances in stable isotope assisted labeling strategies with information science.

    Science.gov (United States)

    Kigawa, Takanori

    2017-08-15

    Stable-isotope (SI) labeling of proteins is an essential technique to investigate their structures, interactions or dynamics by nuclear magnetic resonance (NMR) spectroscopy. The assignment of the main-chain signals, which is the fundamental first step in these analyses, is usually achieved by a sequential assignment method based on triple resonance experiments. Independently of the triple resonance experiment-based sequential assignment, amino acid-selective SI labeling is beneficial for discriminating the amino acid type of each signal; therefore, it is especially useful for the signal assignment of difficult targets. Various combinatorial selective labeling schemes have been developed as more sophisticated labeling strategies. In these strategies, amino acids are represented by combinations of SI labeled samples, rather than simply assigning one amino acid to one SI labeled sample as in the case of conventional amino acid-selective labeling. These strategies have proven to be useful for NMR analyses of difficult proteins, such as those in large complex systems, in living cells, attached or integrated into membranes, or with poor solubility. In this review, recent advances in stable isotope assisted labeling strategies will be discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
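    The combinatorial schemes mentioned above can be illustrated with a toy encoding: instead of dedicating one labeled sample to one amino acid, each amino-acid type is assigned a unique presence/absence pattern across k samples. The amino-acid list and k below are illustrative assumptions, not a published labeling scheme.

```python
from itertools import product

# Toy combinatorial selective-labeling scheme: each amino-acid type gets a
# unique labeled/unlabeled (1/0) pattern across k samples, so the set of
# samples in which a signal appears identifies the residue type.
# k samples can encode up to 2**k - 1 types (all-zero means no signal at all).
amino_acids = ["Ala", "Gly", "Leu", "Lys", "Ser", "Thr", "Val"]   # assumed list
k = 3                                          # number of labeled samples

patterns = [p for p in product([0, 1], repeat=k) if any(p)]
assert len(amino_acids) <= len(patterns)       # 7 types fit into 2**3 - 1 codes
code = dict(zip(amino_acids, patterns))

def identify(observed):
    """Map an observed presence/absence pattern back to the amino-acid type."""
    inverse = {pattern: aa for aa, pattern in code.items()}
    return inverse.get(tuple(observed))

print(code["Leu"])            # (0, 1, 1): labeled in the second and third samples
print(identify((0, 1, 1)))    # Leu
```

    The economy is the point: k samples discriminate up to 2**k - 1 residue types, versus k types for conventional one-type-per-sample selective labeling.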

  9. Preschool Boys' Development of Emotional Self-regulation Strategies in a Sample At-risk for Behavior Problems

    Science.gov (United States)

    Supplee, Lauren H.; Skuban, Emily Moye; Trentacosta, Christopher J.; Shaw, Daniel S.; Stoltz, Emilee

    2011-01-01

    Little longitudinal research has been conducted on changes in children's emotional self-regulation strategy (SRS) use after infancy, particularly for children at risk. The current study examined changes in boys' emotional SRS from toddlerhood through preschool. Repeated observational assessments using delay of gratification tasks at ages 2, 3, and 4 were examined with both variable- and person-oriented analyses in a low-income sample of boys (N = 117) at-risk for early problem behavior. Results were consistent with theory on emotional SRS development in young children. Children initially used more emotion-focused SRS (e.g., comfort seeking) and transitioned to greater use of planful SRS (e.g., distraction) by age 4. Person-oriented analysis using trajectory analysis found similar patterns from 2–4, with small groups of boys showing delayed movement away from emotion-focused strategies or delay in the onset of regular use of distraction. The results provide a foundation for future research to examine the development of SRS in low-income young children. PMID:21675542

  10. An Organization-Based View of Strategy and an Integrative Framework of Strategic Management

    OpenAIRE

    Li, Xin

    2017-01-01

    Just as scholars distinguish two types of a firm’s external environment, i.e., competitive and institutional, we make a distinction between two types of a firm’s internal environment, i.e., resources and organization. Based on this distinction, we propose an organization-based view of strategy (OBV), not only as a label to unite various organization-related issues within the strategy field, but also as a fourth research paradigm to supplement the three existing paradigms, i.e., industry- or comp...

  11. Strategy for complete NMR assignment of disordered proteins with highly repetitive sequences based on resolution-enhanced 5D experiments

    Energy Technology Data Exchange (ETDEWEB)

    Motackova, Veronika; Novacek, Jiri [Masaryk University, Faculty of Science, National Centre for Biomolecular Research (Czech Republic); Zawadzka-Kazimierczuk, Anna; Kazimierczuk, Krzysztof [University of Warsaw, Faculty of Chemistry (Poland); Zidek, Lukas, E-mail: lzidek@chemi.muni.c [Masaryk University, Faculty of Science, National Centre for Biomolecular Research (Czech Republic); Sanderova, Hana; Krasny, Libor [Academy of Sciences of the Czech Republic, Laboratory of Molecular Genetics of Bacteria and Department of Bacteriology, Institute of Microbiology (Czech Republic); Kozminski, Wiktor [University of Warsaw, Faculty of Chemistry (Poland); Sklenar, Vladimir [Masaryk University, Faculty of Science, National Centre for Biomolecular Research (Czech Republic)

    2010-11-15

    A strategy for complete backbone and side-chain resonance assignment of disordered proteins with highly repetitive sequences is presented. The protocol is based on three resolution-enhanced NMR experiments: 5D HN(CA)CONH provides sequential connectivity, 5D HabCabCONH is utilized to identify amino acid types, and 5D HC(CC-TOCSY)CONH is used to assign the side-chain resonances. The improved resolution was achieved by a combination of high dimensionality and long evolution times, allowed by non-uniform sampling in the indirect dimensions. Random distribution of the data points and Sparse Multidimensional Fourier Transform processing were used. Successful application of the assignment procedure to a particularly difficult protein, the {delta} subunit of RNA polymerase from Bacillus subtilis, is shown to prove the efficiency of the strategy. The studied protein contains a disordered C-terminal region of 81 amino acids with a highly repetitive sequence. While the conventional assignment methods completely failed due to very small differences in chemical shifts, the presented strategy provided a complete backbone and side-chain assignment.
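    The non-uniform sampling idea, recording only a randomly distributed subset of the indirect-dimension evolution-time grid, can be sketched in a few lines. The grid size and sampling density below are illustrative assumptions, not the acquisition parameters of the study.

```python
import random

# Non-uniform sampling (NUS) sketch: instead of acquiring every point of a
# 64 x 64 indirect-dimension evolution-time grid, record a randomly
# distributed 12.5% subset, which frees measurement time to extend the
# maximal evolution times (and hence the resolution).
random.seed(7)
grid = [(t1, t2) for t1 in range(64) for t2 in range(64)]
sampled = random.sample(grid, k=len(grid) // 8)

print(len(grid), len(sampled))   # 4096 512
```

    In 5D experiments the same idea applies across four indirect dimensions, where full Cartesian sampling would be hopelessly slow.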

  12. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.

  13. A decentralized receptance-based damage detection strategy for wireless smart sensors

    International Nuclear Information System (INIS)

    Jang, Shinae; Spencer Jr, Billie F; Sim, Sung-Han

    2012-01-01

    Various structural health monitoring strategies have been proposed recently that can be implemented in the decentralized computing environment intrinsic to wireless smart sensor networks (WSSN). Many are based on changes in the experimentally determined flexibility matrix for the structure under consideration. However, the flexibility matrix contains only static information; much richer information is available by considering the dynamic flexibility, or receptance, of the structure. Recently, the stochastic dynamic damage locating vector (SDDLV) method was proposed based on changes of dynamic flexibility matrices employing centrally collected output-only measurements. This paper investigates the potential of the SDDLV method for implementation on a network of wireless smart sensors, where a decentralized, hierarchical, in-network processing approach is used to address issues of scalability of the SDDLV algorithm. Two approaches to aggregate results are proposed that provide robust estimates of damage locations. The efficacy of the developed strategy is first verified using wired sensors emulating a wireless sensor network. Subsequently, the decentralized damage detection strategy is implemented on MEMSIC’s Imote2 smart sensor platform and validated experimentally on a laboratory scale truss bridge. (paper)

  14. A Novel Marketing Strategy based on Information Technology

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiao

    2012-01-01

    Keywords: marketing, electronic data interchange, internet data center, electronic ordering system. Abstract: Marketing is the process of performing market research, selling products and/or services to customers, and promoting them via advertising to further enhance sales. It generates the strategy that underlies sales techniques, business communication, and business development. Information technology is the acquisition, processing, storage and dissemination of vocal, pictorial, textual and numerical information by a microelectronics-based combination of computing and telecommunications.

  15. Community based rehabilitation: a strategy for peace-building

    OpenAIRE

    Hodgson Jennifer; Koros Michael; Boyce William

    2002-01-01

    Abstract Background: Certain features of peace-building distinguish it from peacekeeping, and make it an appropriate strategy in dealing with vertical conflict and low intensity conflict. However, some theorists suggest that attempts, through peace-building, to impose liberal values upon non-democratic cultures are misguided and lack an ethical basis. Discussion: We have been investigating the peace-building properties of community based approaches to disability in a number of countries. This p...

  16. Effectiveness of a School-Based Yoga Program on Adolescent Mental Health, Stress Coping Strategies, and Attitudes toward Violence: Findings from a High-Risk Sample

    Science.gov (United States)

    Frank, Jennifer L.; Bose, Bidyut; Schrobenhauser-Clonan, Alex

    2014-01-01

    This study aimed to assess the effectiveness of a universal yoga-based social-emotional wellness promotion program, Transformative Life Skills, on indicators of adolescent emotional distress, prosocial behavior, and attitudes toward violence in a high-risk sample. Participants included 49 students attending an alternative education school in an…

  17. DEVELOPING A TECHNOLOGY-BASED BUSINESS STRATEGY FOR THE INTERNATIONAL BUSINESS OF TELKOM SA

    OpenAIRE

    M. John; A.J. Buys

    2012-01-01

    ENGLISH ABSTRACT: This study was aimed at developing a technology-based business strategy for Telkom’s international business. Deregulation, competition and demand for converging voice, data and video in the telecommunication market were the driving forces behind this study. Without a proper strategy, Telkom will not be able to withstand the new competition. As the initial step in strategy formulation, Telkom’s strategic goals were identified. A SWOT analysis was conducted and a stra...

  18. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
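    The design effect the authors report can be made concrete with a toy simulation: compare the variance of a mean estimated from short random-walk (chain-referral) samples against simple random samples on a graph whose node attribute is spatially clustered. The ring graph, sample sizes, and attribute below are illustrative assumptions, and the unweighted walk is a caricature of RDS, not the NSM algorithm itself.

```python
import random

# Toy design-effect (DE) calculation: DE = Var(estimator under a network
# sampling design) / Var(estimator under simple random sampling). The graph
# is a ring of 200 nodes whose 0/1 attribute is spatially clustered, which
# inflates the variance of short random-walk samples.
random.seed(42)
N, n, reps = 200, 20, 2000
attr = [1 if i < N // 2 else 0 for i in range(N)]    # clustered attribute

def srs_mean():
    return sum(attr[i] for i in random.sample(range(N), n)) / n

def walk_mean():
    i = random.randrange(N)
    total = 0
    for _ in range(n):                    # unweighted random walk on the ring
        total += attr[i]
        i = (i + random.choice([-1, 1])) % N
    return total / n

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

de = variance([walk_mean() for _ in range(reps)]) / variance([srs_mean() for _ in range(reps)])
print(round(de, 1))   # well above 1: the walk needs many more interviews than SRS
```

    A DE near 1, as reported for NSM, means a network sample is nearly as precise per interview as simple random sampling; the walk here illustrates why naive chain-referral designs fall far short of that.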

  19. DEA-BASED INVESTMENT STRATEGY AND ITS APPLICATION IN THE CROATIAN STOCK MARKET

    Directory of Open Access Journals (Sweden)

    Margareta Gardijan

    2012-12-01

    Full Text Available This paper describes a DEA-based investment strategy for constructing a stock portfolio in the Croatian stock market. The relative efficiency of the DMUs, which are in this case selected stocks from the Zagreb Stock Exchange, is obtained from the output-oriented CCR and BCC models. The set of inputs consists of risk measures, namely return variance, Value at Risk (VaR) and the beta coefficient (β), while monthly return represents an output. Following the "efficiency scores" obtained from the models, we construct a portfolio of DEA-efficient stocks (DEA-portfolio). This portfolio can be modified over time according to changes in the DMUs' efficiency scores. By comparing the returns of the DEA-portfolio and the market return during the given time period, the applicability of the investment strategy based on DEA methodology as a strategy for achieving superior returns is estimated.

  20. Effects of cooperative learning strategy on undergraduate kinesiology students' learning styles.

    Science.gov (United States)

    Meeuwsen, Harry J; King, George A; Pederson, Rockie

    2005-10-01

    A growing body of research supports cooperative learning as an effective teaching strategy. A specific cooperative learning strategy, Team-based Learning, was applied to a convenience sample of four undergraduate sophomore-level motor behavior courses over four semesters from Fall 2002 to Spring 2004 to examine whether this strategy would affect students' learning styles. The data from the Grasha-Reichmann Student Learning Style Scales indicated that this teaching strategy was associated with a significant decrease in the negative Avoidant and Dependent learning styles and an improvement in the positive Participant learning style.