WorldWideScience

Sample records for research sample consisted

  1. A Consistent System for Coding Laboratory Samples

    Science.gov (United States)

    Sih, John C.

    1996-07-01

    A formal laboratory coding system is presented to keep track of laboratory samples. Preliminary useful information regarding the sample (origin and history) is gained without consulting a research notebook. Since this system uses and retains the same research notebook page number for each new experiment (reaction), finding and distinguishing products (samples) of the same or different reactions becomes an easy task. Using this system multiple products generated from a single reaction can be identified and classified in a uniform fashion. Samples can be stored and filed according to stage and degree of purification, e.g. crude reaction mixtures, recrystallized samples, chromatographed or distilled products.

  2. Validation of consistency of Mendelian sampling variance.

    Science.gov (United States)

    Tyrisevä, A-M; Fikse, W F; Mäntysaari, E A; Jakobsen, J; Aamand, G P; Dürr, J; Lidauer, M H

    2018-03-01

    Experiences from international sire evaluation indicate that the multiple-trait across-country evaluation method is sensitive to changes in genetic variance over time. Top bulls from birth year classes with inflated genetic variance will benefit, hampering reliable ranking of bulls. However, none of the methods available today enable countries to validate their national evaluation models for heterogeneity of genetic variance. We describe a new validation method to fill this gap comprising the following steps: estimating within-year genetic variances using Mendelian sampling and its prediction error variance, fitting a weighted linear regression between the estimates and the years under study, identifying possible outliers, and defining a 95% empirical confidence interval for a possible trend in the estimates. We tested the specificity and sensitivity of the proposed validation method with simulated data using a real data structure. Moderate (M) and small (S) size populations were simulated under 3 scenarios: a control with homogeneous variance and 2 scenarios with yearly increases in phenotypic variance of 2 and 10%, respectively. Results showed that the new method was able to estimate genetic variance accurately enough to detect bias in genetic variance. Under the control scenario, the trend in genetic variance was practically zero in setting M. Testing cows with an average birth year class size of more than 43,000 in setting M showed that tolerance values are needed for both the trend and the outlier tests to detect only cases with a practical effect in larger data sets. Regardless of the magnitude (yearly increases in phenotypic variance of 2 or 10%) of the generated trend, it deviated statistically significantly from zero in all data replicates for both cows and bulls in setting M. In setting S with a mean of 27 bulls in a year class, the sampling error and thus the probability of a false-positive result clearly increased. Still, overall estimated genetic
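
    The core of the validation procedure above is the weighted regression of within-year variance estimates on birth year. The snippet below is a minimal sketch of that single step in Python with hypothetical numbers; in practice the weights would come from the prediction error variances of the Mendelian sampling estimates, and the outlier test and the empirical 95% confidence interval obtained from simulation replicates are not shown.

    ```python
    import numpy as np

    def variance_trend(years, var_estimates, weights):
        """Weighted linear regression of within-year genetic variance estimates
        on birth year; weights would normally be inverse prediction error
        variances. Returns (intercept, slope); the slope is the yearly trend."""
        X = np.column_stack([np.ones(len(years)), years])
        W = np.diag(weights)
        # Solve the weighted normal equations (X' W X) b = X' W y
        return np.linalg.solve(X.T @ W @ X, X.T @ W @ np.asarray(var_estimates))

    # Hypothetical data: ten birth-year classes with roughly constant variance
    years = np.arange(2005, 2015)
    estimates = np.array([1.02, 0.99, 1.05, 1.01, 0.98, 1.03, 1.00, 1.04, 0.97, 1.02])
    intercept, slope = variance_trend(years, estimates, np.ones_like(estimates))
    print(f"estimated yearly change in genetic variance: {slope:.4f}")
    ```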

  3. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two-times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.

  4. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGM's expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  5. Underwater Sediment Sampling Research

    Science.gov (United States)

    2017-01-01

    ...impacted sediments was found to be directly related to the concentration of crude oil detected in the sediment pore waters. Applying this mathematical... The USCG R&D Center sought to develop a bench top system to determine the amount of total... scattered. The approach here is to sample the interstitial water between the grains of sand and attempt to determine the amount of oil in and on...

  6. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
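
    As a rough illustration of the kind of correlated sampling described above, the sketch below uses the standard moment relations between a lognormal vector and its underlying normal distribution to draw positive, correlated parameters with a prescribed mean and covariance. The numbers and function name are hypothetical, and the weighted sampling used to accelerate convergence for extremely uncertain parameters is not shown.

    ```python
    import numpy as np

    def sample_correlated_lognormal(mean, cov, n, seed=None):
        """Draw n vectors of correlated, inherently positive parameters whose
        mean vector and covariance matrix match the given first two moments,
        using the exact lognormal <-> normal moment relations."""
        mean = np.asarray(mean, dtype=float)
        cov = np.asarray(cov, dtype=float)
        # Log-space (normal) covariance and mean implied by the target moments
        sigma_ln = np.log1p(cov / np.outer(mean, mean))
        mu_ln = np.log(mean) - 0.5 * np.diag(sigma_ln)
        rng = np.random.default_rng(seed)
        return np.exp(rng.multivariate_normal(mu_ln, sigma_ln, size=n))

    # Hypothetical case: two parameters with 30% and 50% relative uncertainty,
    # correlation coefficient 0.8
    m = np.array([1.0, 2.0])
    c = np.array([[0.09, 0.24],
                  [0.24, 1.00]])
    draws = sample_correlated_lognormal(m, c, n=100_000, seed=1)
    print(draws.mean(axis=0))  # should be close to m
    ```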

  7. Range-efficient consistent sampling and locality-sensitive hashing for polygons

    DEFF Research Database (Denmark)

    Gudmundsson, Joachim; Pagh, Rasmus

    2017-01-01

    Locality-sensitive hashing (LSH) is a fundamental technique for similarity search and similarity estimation in high-dimensional spaces. The basic idea is that similar objects should produce hash collisions with probability significantly larger than objects with low similarity. We consider LSH for...... or union of a set of preprocessed polygons. Curiously, our consistent sampling method uses transformation to a geometric problem....

  8. Factor Structure, Internal Consistency, and Screening Sensitivity of the GARS-2 in a Developmental Disabilities Sample

    OpenAIRE

    Martin A. Volker; Elissa H. Dua; Christopher Lopata; Marcus L. Thomeer; Jennifer A. Toomey; Audrey M. Smerbeck; Jonathan D. Rodgers; Joshua R. Popkin; Andrew T. Nelson; Gloria K. Lee

    2016-01-01

    The Gilliam Autism Rating Scale-Second Edition (GARS-2) is a widely used screening instrument that assists in the identification and diagnosis of autism. The purpose of this study was to examine the factor structure, internal consistency, and screening sensitivity of the GARS-2 using ratings from special education teaching staff for a sample of 240 individuals with autism or other significant developmental disabilities. Exploratory factor analysis yielded a correlated three-factor solution si...

  9. Reliability and consistency of a validated sun exposure questionnaire in a population-based Danish sample.

    Science.gov (United States)

    Køster, B; Søndergaard, J; Nielsen, J B; Olsen, A; Bentzen, J

    2018-06-01

    An important feature of questionnaire validation is reliability. To be able to measure a given concept by questionnaire validly, the reliability needs to be high. The objectives of this study were to examine reliability of attitude and knowledge and behavioral consistency of sunburn in a developed questionnaire for monitoring and evaluating population sun-related behavior. Sun-related behavior, attitude, and knowledge were measured weekly by a questionnaire in the summer of 2013 among 664 Danes. Reliability was tested in a test-retest design. Consistency of behavioral information was tested similarly in a questionnaire adapted to measure behavior throughout the summer. The response rates for questionnaires 1, 2 and 3 were high and dropout was not dependent on demographic characteristics. There was at least 73% agreement between sunburns in the measurement week and the entire summer, and a possible sunburn underestimation in questionnaires summarizing the entire summer. The participants underestimated their outdoor exposure in the evaluation covering the entire summer as compared to the measurement week. The reliability of scales measuring attitude and knowledge was high for the majority of scales, while consistency in protection behavior was low. To our knowledge, this is the first study to report reliability for a completely validated questionnaire on sun-related behavior in a national random population-based sample. Further, we show that attitude and knowledge questions confirmed their validity with good reliability, while consistency of protection behavior in general and in a week's measurement was low.

  10. A general framework for thermodynamically consistent parameterization and efficient sampling of enzymatic reactions.

    Directory of Open Access Journals (Sweden)

    Pedro Saa

    2015-04-01

    Kinetic models provide the means to understand and predict the dynamic behaviour of enzymes upon different perturbations. Despite their obvious advantages, classical parameterizations require large amounts of data to fit their parameters. Particularly, enzymes displaying complex reaction and regulatory (allosteric) mechanisms require a great number of parameters and are therefore often represented by approximate formulae, thereby facilitating the fitting but ignoring many real kinetic behaviours. Here, we show that full exploration of the plausible kinetic space for any enzyme can be achieved using sampling strategies provided a thermodynamically feasible parameterization is used. To this end, we developed a General Reaction Assembly and Sampling Platform (GRASP) capable of consistently parameterizing and sampling accurate kinetic models using minimal reference data. The framework integrates the generalized MWC model and the elementary reaction formalism. By formulating the appropriate thermodynamic constraints, our framework enables parameterization of any oligomeric enzyme kinetics without sacrificing complexity or using simplifying assumptions. This thermodynamically safe parameterization relies on the definition of a reference state upon which feasible parameter sets can be efficiently sampled. Uniform sampling of the kinetics space enabled dissecting enzyme catalysis and revealing the impact of thermodynamics on reaction kinetics. Our analysis distinguished three reaction elasticity regions for common biochemical reactions: a steep linear region (0 > ΔGr > -2 kJ/mol), a transition region (-2 > ΔGr > -20 kJ/mol) and a constant elasticity region (ΔGr < -20 kJ/mol). We also applied this framework to model more complex kinetic behaviours such as the monomeric cooperativity of the mammalian glucokinase and the ultrasensitive response of the phosphoenolpyruvate carboxylase of Escherichia coli. In both cases, our approach described appropriately not only

  11. Factor Structure, Internal Consistency, and Screening Sensitivity of the GARS-2 in a Developmental Disabilities Sample

    Directory of Open Access Journals (Sweden)

    Martin A. Volker

    2016-01-01

    The Gilliam Autism Rating Scale-Second Edition (GARS-2) is a widely used screening instrument that assists in the identification and diagnosis of autism. The purpose of this study was to examine the factor structure, internal consistency, and screening sensitivity of the GARS-2 using ratings from special education teaching staff for a sample of 240 individuals with autism or other significant developmental disabilities. Exploratory factor analysis yielded a correlated three-factor solution similar to that found in 2005 by Lecavalier for the original GARS. Though the three factors appeared to be reasonably consistent with the intended constructs of the three GARS-2 subscales, the analysis indicated that more than a third of the GARS-2 items were assigned to the wrong subscale. Internal consistency estimates met or exceeded standards for screening and were generally higher than those in previous studies. Screening sensitivity was .65 and specificity was .81 for the Autism Index using a cut score of 85. Based on these findings, recommendations are made for instrument revision.
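
    The sensitivity and specificity quoted above follow from a simple two-by-two classification of screening results against diagnostic status. A minimal sketch, assuming that Autism Index scores at or above the cut score of 85 count as screening positive (the direction of the cut is an assumption here):

    ```python
    import numpy as np

    def screening_accuracy(index_scores, is_case, cut_score=85):
        """Sensitivity and specificity of a screening index at a given cut score.
        Scores at or above the cut score are treated as screening positive."""
        scores = np.asarray(index_scores)
        truth = np.asarray(is_case, dtype=bool)
        positive = scores >= cut_score
        sensitivity = positive[truth].mean()       # true positives / all true cases
        specificity = (~positive[~truth]).mean()   # true negatives / all non-cases
        return sensitivity, specificity
    ```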

  12. Reliability and consistency of a validated sun exposure questionnaire in a population-based Danish sample

    Directory of Open Access Journals (Sweden)

    B. Køster

    2018-06-01

    An important feature of questionnaire validation is reliability. To be able to measure a given concept by questionnaire validly, the reliability needs to be high. The objectives of this study were to examine reliability of attitude and knowledge and behavioral consistency of sunburn in a developed questionnaire for monitoring and evaluating population sun-related behavior. Sun-related behavior, attitude, and knowledge were measured weekly by a questionnaire in the summer of 2013 among 664 Danes. Reliability was tested in a test-retest design. Consistency of behavioral information was tested similarly in a questionnaire adapted to measure behavior throughout the summer. The response rates for questionnaires 1, 2 and 3 were high and dropout was not dependent on demographic characteristics. There was at least 73% agreement between sunburns in the measurement week and the entire summer, and a possible sunburn underestimation in questionnaires summarizing the entire summer. The participants underestimated their outdoor exposure in the evaluation covering the entire summer as compared to the measurement week. The reliability of scales measuring attitude and knowledge was high for the majority of scales, while consistency in protection behavior was low. To our knowledge, this is the first study to report reliability for a completely validated questionnaire on sun-related behavior in a national random population-based sample. Further, we show that attitude and knowledge questions confirmed their validity with good reliability, while consistency of protection behavior in general and in a week's measurement was low. Keywords: Questionnaire, Validation, Reliability, Skin cancer, Prevention, Ultraviolet radiation

  13. The Consistency of Isotopologues of Ambient Atmospheric Nitric Acid in Passively Collected Samples

    Science.gov (United States)

    Bell, M. D.; Sickman, J. O.; Bytnerowicz, A.; Padgett, P.; Allen, E. B.

    2012-12-01

    Anthropogenic sources of nitrogen oxides have previously been shown to have distinctive isotopic signatures of oxygen and nitrogen. Nylon filters are currently used in passive sampling arrays to measure ambient atmospheric nitric acid concentrations and estimate deposition rates. This experiment measured the ability of nylon filters to consistently collect isotopologues of atmospheric nitric acid in the same ratios as they are present in the atmosphere. Samplers were deployed in continuous stirred tank reactors (CSTR) and at field sites across a nitrogen deposition gradient in Southern California. Filters were exposed over a four week period with individual filters being subjected to 1-4 week exposure times. Extracted nitric acid was measured for δ18O and δ15N ratios and compared for consistency based on length of exposure and amount of HNO3 collected. Filters within the CSTRs collected HNO3 at a consistent rate in both high and low concentration chambers. After two weeks of exposure, the mean δ18O values were within 0.5‰ of the δ18O of the source HNO3 solution. The mean of all weekly exposures was within 0.5‰ of the δ15N of the source solution, but after three weeks, the mean δ15N of adsorbed HNO3 was within 0.2‰. As the length of the exposure increased, the variability of measured delta values decreased for both elements. The field samplers collected HNO3 consistent with previously measured values along a deposition gradient. The mean δ18O at high deposition sites was 52.2‰ compared to 35.7‰ at the low deposition sites. Mean δ15N values were similar at all sites across the deposition gradient. Due to precipitation events occurring during the exposure period, the δ15N and δ18O of nitric acid were highly variable at all field sites. At single sites, changes in δ15N and δ18O were negatively correlated, consistent with two-source mixing dynamics, but the slope of the regressions differed between high and low deposition sites. Anthropogenic

  14. Secondary electron emission and self-consistent charge transport in semi-insulating samples

    Energy Technology Data Exchange (ETDEWEB)

    Fitting, H.-J. [Institute of Physics, University of Rostock, Universitaetsplatz 3, D-18051 Rostock (Germany); Touzin, M. [Unite Materiaux et Transformations, UMR CNRS 8207, Universite de Lille 1, F-59655 Villeneuve d' Ascq (France)

    2011-08-15

    Electron beam induced self-consistent charge transport and secondary electron emission (SEE) in insulators are described by means of an electron-hole flight-drift model (FDM), now extended by a certain intrinsic conductivity (c), and are implemented by an iterative computer simulation. Ballistic secondary electrons (SE) and holes, their attenuation to drifting charge carriers, and their recombination, trapping, and field- and temperature-dependent detrapping are included. As a main result, the time-dependent "true" secondary electron emission rate δ(t) released from the target material and based on ballistic electrons, and the spatial distributions of currents j(x,t), charges ρ(x,t), field F(x,t), and potential V(x,t) are obtained, where V_0 = V(0,t) represents the surface potential. The intrinsic electronic conductivity limits the charging process and leads to a conduction sample current to the support. In that case the steady-state total SE yield will be fixed below unity, i.e., σ = η + δ < 1.

  15. Quota sampling in internet research: practical issues.

    Science.gov (United States)

    Im, Eun-Ok; Chee, Wonshik

    2011-07-01

    Quota sampling has been suggested as a potentially good method for Internet-based research and has been used by several researchers working with Internet samples. However, very little is known about the issues or concerns in using a quota sampling method in Internet research. The purpose of this article was to present the practical issues using quota sampling in an Internet-based study. During the Internet study, the research team recorded all recruitment issues that arose and made written notes indicating the possible reasons for the problems. In addition, biweekly team discussions were conducted for which written records were kept. Overall, quota sampling was effective in ensuring that an adequate number of midlife women were recruited from the targeted ethnic groups. However, during the study process, we encountered the following practical issues using quota sampling: (1) difficulty reaching out to women in lower socioeconomic classes, (2) difficulty ensuring authenticity of participants' identities, (3) participants giving inconsistent answers for the screening questions versus the Internet survey questions, (4) potential problems with a question on socioeconomic status, (5) resentment toward the research project and/or researchers because of rejection, and (6) a longer time and more expense than anticipated.

  16. Conducting Clinical Research Using Crowdsourced Convenience Samples.

    Science.gov (United States)

    Chandler, Jesse; Shapiro, Danielle

    2016-01-01

    Crowdsourcing has had a dramatic impact on the speed and scale at which scientific research can be conducted. Clinical scientists have particularly benefited from readily available research study participants and streamlined recruiting and payment systems afforded by Amazon Mechanical Turk (MTurk), a popular labor market for crowdsourcing workers. MTurk has been used in this capacity for more than five years. The popularity and novelty of the platform have spurred numerous methodological investigations, making it the most studied nonprobability sample available to researchers. This article summarizes what is known about MTurk sample composition and data quality with an emphasis on findings relevant to clinical psychological research. It then addresses methodological issues with using MTurk--many of which are common to other nonprobability samples but unfamiliar to clinical science researchers--and suggests concrete steps to avoid these issues or minimize their impact.

  17. Research results: preserving newborn blood samples.

    Science.gov (United States)

    Lewis, Michelle Huckaby; Scheurer, Michael E; Green, Robert C; McGuire, Amy L

    2012-11-07

    Retention and use, without explicit parental permission, of residual dried blood samples from newborn screening has generated public controversy over concerns about violations of family privacy rights and loss of parental autonomy. The public debate about this issue has included little discussion about the destruction of a potentially valuable public resource that can be used for research that may yield improvements in public health. The research community must advocate for policies and infrastructure that promote retention of residual dried blood samples and their use in biomedical research.

  18. Environmental sample banking-research and methodology

    International Nuclear Information System (INIS)

    Becker, D.A.

    1976-01-01

    The National Bureau of Standards (NBS), in cooperation with the Environmental Protection Agency and the National Science Foundation, is engaged in a research program establishing methodology for environmental sample banking. This program is aimed toward evaluating the feasibility of a National Environmental Specimen Bank (NESB). The capability for retrospective chemical analyses to evaluate changes in our environment would provide useful information. Much of this information could not be obtained using data from previously analyzed samples. However, to assure validity for these stored samples, they must be sampled, processed and stored under rigorously evaluated, controlled and documented conditions. The program currently under way in the NBS Analytical Chemistry Division has 3 main components. The first is an extensive survey of available literature concerning problems of contamination, losses and storage. The components of interest include trace elements, pesticides, other trace organics (PCBs, plasticizers, etc.), radionuclides and microbiological species. The second component is an experimental evaluation of contamination and losses during sampling and sample handling. Of particular interest here is research into container cleaning methodology for trace elements, with respect to adsorption, desorption, leaching and partial dissolution by various sample matrices. The third component of this program is an evaluation of existing methodology for long-term sample storage

  19. Sampling bias in climate-conflict research

    Science.gov (United States)

    Adams, Courtland; Ide, Tobias; Barnett, Jon; Detges, Adrien

    2018-03-01

    Critics have argued that the evidence of an association between climate change and conflict is flawed because the research relies on a dependent variable sampling strategy [1-4]. Similarly, it has been hypothesized that convenience of access biases the sample of cases studied (the 'streetlight effect' [5]). This also gives rise to claims that the climate-conflict literature stigmatizes some places as being more 'naturally' violent [6-8]. Yet there has been no proof of such sampling patterns. Here we test whether climate-conflict research is based on such a biased sample through a systematic review of the literature. We demonstrate that research on climate change and violent conflict suffers from a streetlight effect. Further, studies which focus on a small number of cases in particular are strongly informed by cases where there has been conflict, do not sample on the independent variables (climate impact or risk), and hence tend to find some association between these two variables. These biases mean that research on climate change and conflict primarily focuses on a few accessible regions, overstates the links between both phenomena and cannot explain peaceful outcomes from climate change. This could result in maladaptive responses in those places that are stigmatized as being inherently more prone to climate-induced violence.

  20. Demystifying Theoretical Sampling in Grounded Theory Research

    Directory of Open Access Journals (Sweden)

    Jenna Breckenridge, BSc (Hons), Ph.D. Candidate

    2009-06-01

    Theoretical sampling is a central tenet of classic grounded theory and is essential to the development and refinement of a theory that is ‘grounded’ in data. While many authors appear to share concurrent definitions of theoretical sampling, the ways in which the process is actually executed remain largely elusive and inconsistent. As such, employing and describing the theoretical sampling process can present a particular challenge to novice researchers embarking upon their first grounded theory study. This article has been written in response to the challenges faced by the first author whilst writing a grounded theory proposal. It is intended to clarify theoretical sampling for new grounded theory researchers, offering some insight into the practicalities of selecting and employing a theoretical sampling strategy. It demonstrates that the credibility of a theory cannot be dissociated from the process by which it has been generated and seeks to encourage and challenge researchers to approach theoretical sampling in a way that is apposite to the core principles of the classic grounded theory methodology.

  1. Consistency between Research and Clinical Diagnoses of Autism among Boys and Girls with Fragile X Syndrome

    Science.gov (United States)

    Klusek, J.; Martin, G. E.; Losh, M.

    2014-01-01

    Background: Prior research suggests that 60-74% of males and 16-45% of females with fragile X syndrome (FXS) meet criteria for autism spectrum disorder (ASD) in research settings. However, relatively little is known about the rates of clinical diagnoses in FXS and whether such diagnoses are consistent with those performed in a research setting…

  2. The Sampling Issues in Quantitative Research

    Science.gov (United States)

    Delice, Ali

    2010-01-01

    A concern for generalization dominates quantitative research. For generalizability and repeatability, identification of sample size is essential. The present study investigates 90 qualitative master's theses submitted for the Primary and Secondary School Science and Mathematics Education Departments, Mathematic Education Discipline in 10…

  3. Consistent Estimation of Continuous-Time Signals from Nonlinear Transformations of Noisy Samples,

    Science.gov (United States)

    1980-03-10

    t, then h_n is given by (5) (with W = n) and represents the Szasz operator. Theorem 3.0, while guaranteeing mean-square consistency of the estimate S_W(t), provides no bounds on the rate of convergence. We shall derive such bounds for linear systems h_W corresponding to the class of generalized Szasz operators [6] (see below) and to the Bernstein operator. While the Szasz operator (5) can be generated as in Proposition 3.0, the class of generalized

  4. Research progress of MRI in preoperative evaluation of pituitary adenoma's consistency

    International Nuclear Information System (INIS)

    Lu Yiping; Yin Bo; Geng Daoying

    2013-01-01

    As the most common primary disease of the pituitary fossa, pituitary adenoma ranks third in incidence among primary brain tumors. Resectable pituitary adenomas are removed by one of two surgical approaches: trans-sphenoidal endoscopic surgery or craniotomy. The choice of approach depends on the size, invasive extent, and consistency of the tumor. Trans-sphenoidal endoscopic surgery is better suited to tumors of soft consistency, which are easier to remove, while craniotomy is more suitable for firm ones. Preoperative evaluation of tumor consistency can therefore help in selecting the best surgical approach and treatment. MRI is not only an ideal method for showing brain structure but can also be used to evaluate tumor consistency. This review describes the mechanisms underlying the differing consistency of pituitary adenomas and the research progress in evaluating consistency preoperatively. (authors)

  5. Risky, aggressive, or emotional driving: addressing the need for consistent communication in research.

    Science.gov (United States)

    Dula, Chris S; Geller, E Scott

    2003-01-01

    Researchers agree that a consistent definition for aggressive driving is lacking. Such definitional ambiguity in the literature impedes the accumulation of accurate and precise information, and prevents researchers from communicating clearly about findings and implications for future research directions. This dramatically slows progress in understanding the causes and maintenance factors of aggressive driving. This article critiques prevailing definitions of driver aggression and generates a definition that, if used consistently, can improve the utility of future research. Pertinent driving behaviors have been variably labeled in the literature as risky, aggressive, or road rage. The authors suggest that the term "road rage" be eliminated from research because it has been used inconsistently and has little probability of being clarified and applied consistently. Instead, driving behaviors that endanger or have the potential to endanger others should be considered as lying on a behavioral spectrum of dangerous driving. Three dimensions of dangerous driving are delineated: (a) intentional acts of aggression toward others, (b) negative emotions experienced while driving, and (c) risk-taking. The adoption of a standardized definition for aggressive driving should spark researchers to use more explicit operational definitions that are consistent with theoretical foundations. The use of consistent and unambiguous operational definitions will increase the precision of measurement in research and enhance authors' ability to communicate clearly about findings and conclusions. As this occurs over time, industry will reap benefits from more carefully conducted research. Such benefits may include the development of more valid and reliable means of selecting safe professional drivers, conducting accurate risk assessments, and creating preventative and remedial dangerous driving safety programs.

  6. A sample design for globally consistent biomass estimation using lidar data from the Geoscience Laser Altimeter System (GLAS)

    Science.gov (United States)

    Sean P. Healey; Paul L. Patterson; Sassan S. Saatchi; Michael A. Lefsky; Andrew J. Lister; Elizabeth A. Freeman

    2012-01-01

    Lidar height data collected by the Geoscience Laser Altimeter System (GLAS) from 2002 to 2008 have the potential to form the basis of a globally consistent sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full waveform "shots," which have been shown to be strongly correlated with aboveground forest...

  7. Factorial Validity and Internal Consistency of Malaysian Adapted Depression Anxiety Stress Scale - 21 in an Adolescent Sample

    OpenAIRE

    Hairul Anuar Hashim; Freddy Golok; Rosmatunisah Ali

    2011-01-01

    Background: A psychometrically sound measurement instrument is a fundamental requirement across a broad range of research areas. In negative affect research, the Depression Anxiety Stress Scale (DASS) has been identified as a psychometrically sound instrument to measure depression, anxiety and stress, especially the 21-item version. However, its psychometric properties in adolescents have been less consistent. Objectives: Thus, the present study sought to examine the factorial validity and internal c...

  8. Sampling knowledge: the hermeneutics of snowball sampling in qualitative research

    OpenAIRE

    Noy, Chaim

    2008-01-01

    During the past two decades we have witnessed a rather impressive growth of theoretical innovations and conceptual revisions of epistemological and methodological approaches within constructivist-qualitative quarters of the social sciences. Methodological discussions have commonly addressed a variety of methods for collecting and analyzing empirical material, yet the critical grounds upon which these were reformulated have rarely been extended to embrace sampling concepts and procedures. The ...

  9. An approach based on HPLC-fingerprint and chemometrics to quality consistency evaluation of Matricaria chamomilla L. commercial samples

    Directory of Open Access Journals (Sweden)

    Agnieszka Viapiana

    2016-10-01

    Chamomile has been used as an herbal medication since ancient times and is still popular because it contains various bioactive phytochemicals that could provide therapeutic effects. In this study, a simple and reliable HPLC method was developed to evaluate the quality consistency of nineteen chamomile samples through establishing a chromatographic fingerprint, quantification of phenolic compounds and determination of antioxidant activity. For fingerprint analysis, 12 peaks were selected as the common peaks to evaluate the similarities of commercial samples of chamomile obtained from different manufacturers. A similarity analysis was performed to assess the similarity/dissimilarity of chamomile samples, where values varied from 0.868 to 0.990, indicating that samples from different manufacturers were consistent. Additionally, simultaneous quantification of five phenolic acids (gallic, caffeic, syringic, p-coumaric, ferulic) and four flavonoids (rutin, myricetin, quercetin and kaempferol) was performed to interpret the quality consistency. In quantitative analysis, the nine individual phenolic compounds showed good regression (r > 0.9975). Inter- and intra-day precisions for all analysed compounds expressed as relative standard deviation (CV) ranged from 0.05% to 3.12%. Since flavonoids and other polyphenols are commonly recognised as natural antioxidants, the antioxidant activity of chamomile samples was evaluated using 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging activity and ferric reducing/antioxidant power (FRAP) assay. Correlation analysis was used to assess the relationship between antioxidant activity and phenolic composition, and multivariate analyses (PCA and HCA) were applied to distinguish chamomile samples. Results shown in the study indicate high similarity of chamomile samples among them, widely spread in the market and commonly used by people as infusions or teas, as well as that there were no statistically significant
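
    Similarity analysis of chromatographic fingerprints is commonly computed as a correlation or cosine (congruence) coefficient between the vectors of common-peak responses; the abstract does not state which variant was used, so the sketch below shows cosine similarity as one plausible reading, with hypothetical peak areas.

    ```python
    import numpy as np

    def fingerprint_similarity(sample_peaks, reference_peaks):
        """Cosine (congruence) similarity between two chromatographic
        fingerprints, each given as a vector of the common-peak areas."""
        a = np.asarray(sample_peaks, dtype=float)
        b = np.asarray(reference_peaks, dtype=float)
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical peak-area vectors for the 12 common peaks
    sample = np.array([4.1, 2.0, 7.3, 1.1, 0.9, 3.2, 5.5, 2.8, 1.7, 0.6, 4.4, 3.0])
    reference = np.array([4.0, 2.2, 7.0, 1.0, 1.0, 3.0, 5.8, 2.6, 1.8, 0.5, 4.2, 3.1])
    print(round(fingerprint_similarity(sample, reference), 3))
    ```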

  10. An Approach Based on HPLC-Fingerprint and Chemometrics to Quality Consistency Evaluation of Matricaria chamomilla L. Commercial Samples

    Science.gov (United States)

    Viapiana, Agnieszka; Struck-Lewicka, Wiktoria; Konieczynski, Pawel; Wesolowski, Marek; Kaliszan, Roman

    2016-01-01

    Chamomile has been used as an herbal medication since ancient times and is still popular because it contains various bioactive phytochemicals that could provide therapeutic effects. In this study, a simple and reliable HPLC method was developed to evaluate the quality consistency of nineteen chamomile samples through establishing a chromatographic fingerprint, quantification of phenolic compounds and determination of antioxidant activity. For fingerprint analysis, 12 peaks were selected as the common peaks to evaluate the similarities of commercial samples of chamomile obtained from different manufacturers. A similarity analysis was performed to assess the similarity/dissimilarity of chamomile samples, where values varied from 0.868 to 0.990, indicating that samples from different manufacturers were consistent. Additionally, simultaneous quantification of five phenolic acids (gallic, caffeic, syringic, p-coumaric, ferulic) and four flavonoids (rutin, myricetin, quercetin and kaempferol) was performed to interpret the quality consistency. In quantitative analysis, the nine individual phenolic compounds showed good regression (r > 0.9975). Inter- and intra-day precisions for all analyzed compounds expressed as relative standard deviation (CV) ranged from 0.05% to 3.12%. Since flavonoids and other polyphenols are commonly recognized as natural antioxidants, the antioxidant activity of chamomile samples was evaluated using 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging activity and ferric reducing/antioxidant power (FRAP) assay. Correlation analysis was used to assess the relationship between antioxidant activity and phenolic composition, and multivariate analyses (PCA and HCA) were applied to distinguish chamomile samples. Results shown in the study indicate high similarity of chamomile samples among them, widely spread in the market and commonly used by people as infusions or teas, as well as that there were no statistically significant differences among

  11. An Approach Based on HPLC-Fingerprint and Chemometrics to Quality Consistency Evaluation of Matricaria chamomilla L. Commercial Samples

    OpenAIRE

    Viapiana, Agnieszka; Struck-Lewicka, Wiktoria; Konieczynski, Pawel; Wesolowski, Marek; Kaliszan, Roman

    2016-01-01

    Chamomile has been used as an herbal medication since ancient times and is still popular because it contains various bioactive phytochemicals that could provide therapeutic effects. In this study, a simple and reliable HPLC method was developed to evaluate the quality consistency of nineteen chamomile samples through establishing a chromatographic fingerprint, quantification of phenolic compounds and determination of antioxidant activity. For fingerprint analysis, 12 peaks were selected as th...

  12. VLE measurements using a static cell vapor phase manual sampling method accompanied with an empirical data consistency test

    International Nuclear Information System (INIS)

    Freitag, Joerg; Kosuge, Hitoshi; Schmelzer, Juergen P.; Kato, Satoru

    2015-01-01

    Highlights: • We use a new, simple static cell vapor phase manual sampling method (SCVMS) for VLE (x, y, T) measurement. • The method is applied to non-azeotropic, asymmetric and two-liquid phase forming azeotropic binaries. • The method is confirmed by a data consistency test, i.e., a plot of the polarity exclusion factor vs. pressure. • The consistency test reveals that with the new SCVMS method accurate VLE near ambient temperature can be measured. • Moreover, the consistency test confirms that the effect of air in the SCVMS system is negligible. - Abstract: A new static cell vapor phase manual sampling (SCVMS) method is used for the simple measurement of constant temperature x, y (vapor + liquid) equilibria (VLE). The method was applied to the VLE measurements of the (methanol + water) binary at T/K = (283.2, 298.2, 308.2 and 322.9), the asymmetric (acetone + 1-butanol) binary at T/K = (283.2, 295.2, 308.2 and 324.2) and the two-liquid phase forming azeotropic (water + 1-butanol) binary at T/K = (283.2 and 298.2). The accuracy of the experimental data was confirmed by a data consistency test, that is, an empirical plot of the polarity exclusion factor, β, vs. the system pressure, P. The SCVMS data are accurate, because the VLE data converge to the same lnβ vs. lnP straight line determined from a conventional distillation-still method and a headspace gas chromatography method

  13. Consistently low prevalence of syphilis among female sex workers in Jinan, China: findings from two consecutive respondent driven sampling surveys.

    Directory of Open Access Journals (Sweden)

    Meizhen Liao

    BACKGROUND: Routine surveillance using convenience sampling found low prevalence of HIV and syphilis among female sex workers in China. Two consecutive surveys using respondent driven sampling were conducted in 2008 and 2009 to examine the prevalence of HIV and syphilis among female sex workers in Jinan, China. METHODS: A face-to-face interview was conducted to collect demographic, behavioral and service utilization information using a structured questionnaire. Blood samples were drawn for serological tests of HIV-1 antibody and syphilis antibody. The Respondent Driven Sampling Analysis Tool was used to generate population-level estimates. RESULTS: In 2008 and 2009, 363 and 432 subjects were recruited and surveyed, respectively. Prevalence of syphilis was 2.8% in 2008 and 2.2% in 2009, while no HIV case was found in either year. Results are comparable to those from the routine sentinel surveillance system in the city. Only 60.8% of subjects in 2008 and 48.3% in 2009 reported consistent condom use with clients during the past month. Over 50% of subjects had not been covered by any HIV-related services in the past year, with only 15.6% of subjects in 2008 and 13.1% in 2009 ever tested for HIV. CONCLUSIONS: Despite the low prevalence of syphilis and HIV, risk behaviors are common. Targeted interventions to promote safe sex and utilization of existing intervention services are still needed to keep the epidemic from growing.
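
    Population-level estimates from respondent-driven sampling are typically weighted by the inverse of each respondent's reported network size. The abstract names the Respondent Driven Sampling Analysis Tool; the sketch below shows the simpler RDS-II estimator only as an illustration of that weighting idea, not as the tool's actual algorithm, and the inputs are hypothetical.

    ```python
    import numpy as np

    def rds_ii_prevalence(outcome, degree):
        """Degree-weighted (RDS-II) prevalence estimate: each respondent is
        weighted by the inverse of her reported personal network size."""
        y = np.asarray(outcome, dtype=float)   # 1 = positive test, 0 = negative
        d = np.asarray(degree, dtype=float)    # self-reported network size
        w = 1.0 / d
        return float(np.sum(w * y) / np.sum(w))
    ```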

  14. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research.

    Science.gov (United States)

    Palinkas, Lawrence A; Horwitz, Sarah M; Green, Carla A; Wisdom, Jennifer P; Duan, Naihua; Hoagwood, Kimberly

    2015-09-01

    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research.

  15. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research

    Science.gov (United States)

    Palinkas, Lawrence A.; Horwitz, Sarah M.; Green, Carla A.; Wisdom, Jennifer P.; Duan, Naihua; Hoagwood, Kimberly

    2013-01-01

    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research. PMID:24193818

  16. Two-Sample Two-Stage Least Squares (TSTSLS) estimates of earnings mobility: how consistent are they?

    Directory of Open Access Journals (Sweden)

    John Jerrim

    2016-08-01

    Academics and policymakers have shown great interest in cross-national comparisons of intergenerational earnings mobility. However, producing consistent and comparable estimates of earnings mobility is not a trivial task. In most countries researchers are unable to observe earnings information for two generations. They are thus forced to rely upon imputed data from different surveys instead. This paper builds upon previous work by considering the consistency of the intergenerational correlation (ρ) as well as the elasticity (β), how this changes when using a range of different instrumental (imputer) variables, and highlighting an important but infrequently discussed measurement issue. Our key finding is that, while TSTSLS estimates of β and ρ are both likely to be inconsistent, the magnitude of this problem is much greater for the former than it is for the latter. We conclude by offering advice on estimating earnings mobility using this methodology.
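
    Two-sample two-stage least squares imputes the unobserved parental earnings from instruments (commonly parental education or occupation) observed in an auxiliary sample, then regresses offspring earnings on the imputed values. The sketch below is a bare-bones illustration under those assumptions; variable names are hypothetical, and standard errors, the correlation ρ, and the measurement issue discussed in the paper are omitted.

    ```python
    import numpy as np

    def tstsls_elasticity(z_aux, father_log_earnings, z_main, son_log_earnings):
        """Two-sample two-stage least squares sketch.
        Stage 1 (auxiliary sample): regress fathers' log earnings on instruments Z.
        Stage 2 (main sample): regress sons' log earnings on fathers' log earnings
        predicted from the same instruments. Returns the elasticity estimate."""
        Z1 = np.column_stack([np.ones(len(z_aux)), z_aux])
        gamma, *_ = np.linalg.lstsq(Z1, father_log_earnings, rcond=None)
        Z2 = np.column_stack([np.ones(len(z_main)), z_main])
        imputed_father = Z2 @ gamma
        X = np.column_stack([np.ones(len(imputed_father)), imputed_father])
        beta, *_ = np.linalg.lstsq(X, son_log_earnings, rcond=None)
        return beta[1]
    ```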

  17. The ethical use of existing samples for genome research.

    Science.gov (United States)

    Bathe, Oliver F; McGuire, Amy L

    2009-10-01

    Modern biobanking efforts consist of prospective collections of tissues linked to clinical data for patients who have given informed consent for the research use of their specimens and data, including their DNA. In such efforts, patient autonomy and privacy are well respected because of the prospective nature of the informed consent process. However, one of the richest sources of tissue for research continues to be the millions of archived samples collected by pathology departments during normal clinical care or for research purposes without specific consent for future research or genetic analysis. Because specific consent was not obtained a priori, issues related to individual privacy and autonomy are much more complicated. A framework for accessing these existing samples and related clinical data for research is presented. Archival tissues may be accessed only when there is a reasonable likelihood of generating beneficial and scientifically valid information. To minimize risks, databases containing information related to the tissue and to clinical data should be coded, no personally identifying phenotypic information should be included, and access should be restricted to bona fide researchers for legitimate research purposes. These precautions, if implemented appropriately, should ensure that the research use of archival tissue and data are no more than minimal risk. A waiver of the requirement for informed consent would then be justified if reconsent is shown to be impracticable. A waiver of consent should not be granted, however, if there is a significant risk to privacy, if the proposed research use is inconsistent with the original consent (where there is one), or if the potential harm from a privacy breach is considerable.

  18. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from the cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
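
    As a concrete illustration of two of the probability methods listed above, the sketch below contrasts simple random sampling with proportionally allocated stratified sampling; the data structures and function names are illustrative, not drawn from the article.

    ```python
    import numpy as np

    def simple_random_sample(ids, n, seed=None):
        """Every element has an equal, independent chance of selection."""
        rng = np.random.default_rng(seed)
        return rng.choice(ids, size=n, replace=False)

    def stratified_sample(ids, strata, n, seed=None):
        """Proportional allocation: sample within each stratum in proportion
        to the stratum's share of the population."""
        rng = np.random.default_rng(seed)
        ids, strata = np.asarray(ids), np.asarray(strata)
        picks = []
        for s in np.unique(strata):
            members = ids[strata == s]
            k = max(1, round(n * len(members) / len(ids)))
            picks.append(rng.choice(members, size=k, replace=False))
        return np.concatenate(picks)
    ```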

  19. Turbidity Threshold sampling in watershed research

    Science.gov (United States)

    Rand Eads; Jack Lewis

    2003-01-01

    Abstract - When monitoring suspended sediment for watershed research, reliable and accurate results may be a higher priority than in other settings. Timing and frequency of data collection are the most important factors influencing the accuracy of suspended sediment load estimates, and, in most watersheds, suspended sediment transport is dominated by a few, large...

  20. An overview of coefficient alpha and a reliability matrix for estimating adequacy of internal consistency coefficients with psychological research measures.

    Science.gov (United States)

    Ponterotto, Joseph G; Ruckdeschel, Daniel E

    2007-12-01

    The present article addresses issues in reliability assessment that are often neglected in psychological research such as acceptable levels of internal consistency for research purposes, factors affecting the magnitude of coefficient alpha (α), and considerations for interpreting α within the research context. A new reliability matrix anchored in classical test theory is introduced to help researchers judge adequacy of internal consistency coefficients with research measures. Guidelines and cautions in applying the matrix are provided.
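
    For reference, coefficient alpha for k items is k/(k-1) multiplied by one minus the ratio of the summed item variances to the variance of the total score. A minimal sketch of that calculation, assuming a matrix with one row per respondent and one column per item:

    ```python
    import numpy as np

    def cronbach_alpha(item_scores):
        """Coefficient alpha for a respondents-by-items score matrix:
        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        X = np.asarray(item_scores, dtype=float)
        k = X.shape[1]
        item_variances = X.var(axis=0, ddof=1).sum()
        total_variance = X.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_variances / total_variance)
    ```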

  1. Analysis of the research sample collections of Uppsala biobank.

    Science.gov (United States)

    Engelmark, Malin T; Beskow, Anna H

    2014-10-01

    Uppsala Biobank is the joint and only biobank organization of the two principals, Uppsala University and Uppsala University Hospital. Biobanks are required to have updated registries on sample collection composition and management in order to fulfill legal regulations. We report here the results from the first comprehensive and overall analysis of the 131 research sample collections organized in the biobank. The results show that the median of the number of samples in the collections was 700 and that the number of samples varied from less than 500 to over one million. Blood samples, such as whole blood, serum, and plasma, were included in the vast majority, 84.0%, of the research sample collections. Also, as much as 95.5% of the newly collected samples within healthcare included blood samples, which further supports the concept that blood samples have fundamental importance for medical research. Tissue samples were also commonly used and occurred in 39.7% of the research sample collections, often combined with other types of samples. In total, 96.9% of the 131 sample collections included samples collected for healthcare, showing the importance of healthcare as a research infrastructure. Of the collections that had accessed existing samples from healthcare, as much as 96.3% included tissue samples from the Department of Pathology, which shows the importance of pathology samples as a resource for medical research. Analysis of different research areas shows that the most common of known public health diseases are covered. Collections that had generated the most publications, up to over 300, contained a large number of samples collected systematically and repeatedly over many years. More knowledge about existing biobank materials, together with public registries on sample collections, will support research collaborations, improve transparency, and bring us closer to the goals of biobanks, which is to save and prolong human lives and improve health and quality of life.

  2. Implementing Data Definition Consistency for Emergency Department Operations Benchmarking and Research.

    Science.gov (United States)

    Yiadom, Maame Yaa A B; Scheulen, James; McWade, Conor M; Augustine, James J

    2016-07-01

    The objective was to obtain a commitment to adopt a common set of definitions for emergency department (ED) demographic, clinical process, and performance metrics among the ED Benchmarking Alliance (EDBA), ED Operations Study Group (EDOSG), and Academy of Academic Administrators of Emergency Medicine (AAAEM) by 2017. A retrospective cross-sectional analysis of available data from three ED operations benchmarking organizations supported a negotiation to use a set of common metrics with identical definitions. During a 1.5-day meeting, structured according to social change theories of information exchange, self-interest, and interdependence, common definitions were identified and negotiated using the EDBA's published definitions as a start for discussion. Methods of process analysis theory were used in the 8 weeks following the meeting to achieve official consensus on definitions. These two lists were submitted to the organizations' leadership for implementation approval. A total of 374 unique measures were identified, of which 57 (15%) were shared by at least two organizations. Fourteen (4%) were common to all three organizations. In addition to agreement on definitions for the 14 measures used by all three organizations, agreement was reached on universal definitions for 17 of the 57 measures shared by at least two organizations. The negotiation outcome was a list of 31 measures with universal definitions to be adopted by each organization by 2017. The use of negotiation, social change, and process analysis theories achieved the adoption of universal definitions among the EDBA, EDOSG, and AAAEM. This will impact performance benchmarking for nearly half of US EDs. It initiates a formal commitment to utilize standardized metrics, and it transitions consistency in reporting ED operations metrics from consensus to implementation. This work advances our ability to more accurately characterize variation in ED care delivery models, resource utilization, and performance. In

  3. Sampling in Qualitative Research: Rationale, Issues, and Methods

    OpenAIRE

    LUBORSKY, MARK R.; RUBINSTEIN, ROBERT L.

    1995-01-01

    In gerontology the most recognized and elaborate discourse about sampling is generally thought to be in quantitative research associated with survey research and medical research. But sampling has long been a central concern in social and humanistic inquiry, albeit in a different guise suited to different goals. There is a need for more explicit discussion of qualitative sampling issues. This article will outline the guiding principles and rationales, features, and practices of sampli...

  4. Estimators of internal consistency in health research: the use of the alpha coefficient

    OpenAIRE

    Cascaes da Silva, Fraciele; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil.; Gonçalves, Elizandra; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil.; Valdivia Arancibia, Beatriz Angélica; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil; Graziele Bento, Salma; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil.; da Silva Castro, Thiago Luis; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil; Soleman Hernandez, Salma Stephany; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil; da Silva, Rudney; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil

    2015-01-01

    Academic production has increased in the area of health, increasingly demanding high quality in publications of great impact. One of the ways to consider quality is through methods that increase the consistency of data analysis, such as reliability which, depending on the type of data, can be evaluated by different coefficients, especially the alpha coefficient. Based on this, the present review systematically gathers scientific articles produced in the last five years, which in a methodologi...
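    As a quick illustration of the coefficient discussed in this record, the sketch below computes Cronbach's alpha from a respondents-by-items score matrix; the simulated five-item scale and all numbers are invented for demonstration and are not taken from the reviewed articles.

```python
# Minimal sketch (not from the article): Cronbach's alpha for a k-item scale
# from an (n_respondents x k_items) score matrix, using NumPy.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: rows = respondents, columns = items of one scale."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # per-item variance
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))                     # shared trait
    items = latent + rng.normal(scale=0.8, size=(200, 5))  # 5 correlated items
    print(f"alpha = {cronbach_alpha(items):.2f}")          # typically around 0.85-0.9 here
```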

  5. THE USE OF RANKING SAMPLING METHOD WITHIN MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2011-01-01

    Full Text Available Marketing and statistical literature available to practitioners provides a wide range of sampling methods that can be implemented in the context of marketing research. The ranking sampling method is based on dividing the general population into several strata, namely into several subdivisions which are relatively homogenous regarding a certain characteristic. The sample is then composed by selecting, from each stratum, a certain number of components (which can be proportional or non-proportional to the size of the stratum) until the pre-established volume of the sample is reached. Using ranking sampling within marketing research requires the determination of some relevant statistical indicators - average, dispersion, sampling error etc. To that end, the paper contains a case study which illustrates the actual approach used in order to apply the ranking sample method within a marketing research study made by a company which provides Internet connection services, on a particular category of customers – small and medium enterprises.
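    The statistical indicators mentioned above (average, dispersion, sampling error) can be illustrated with a short sketch of proportional stratified selection; the strata, population sizes and sample volume below are hypothetical and are not drawn from the case study in the article.

```python
# Minimal sketch (illustrative, not from the article): proportional stratified
# ("ranking") sampling with the usual stratified estimates of the mean,
# its sampling error, and per-stratum dispersion.
import random
import statistics

population = {                       # hypothetical strata of SME customers
    "micro":  [random.gauss(20, 5) for _ in range(600)],
    "small":  [random.gauss(45, 8) for _ in range(300)],
    "medium": [random.gauss(90, 15) for _ in range(100)],
}
N = sum(len(units) for units in population.values())
n = 100                              # pre-established sample volume

mean_st, var_st = 0.0, 0.0
for name, units in population.items():
    N_h = len(units)
    n_h = round(n * N_h / N)                     # proportional allocation
    sample_h = random.sample(units, n_h)
    ybar_h = statistics.mean(sample_h)
    s2_h = statistics.variance(sample_h)         # stratum dispersion
    W_h = N_h / N
    mean_st += W_h * ybar_h
    var_st += (W_h ** 2) * (s2_h / n_h) * (1 - n_h / N_h)  # with finite population correction
    print(f"{name}: n_h={n_h}, mean={ybar_h:.1f}, s2={s2_h:.1f}")

print(f"stratified mean = {mean_st:.1f}, sampling error (SE) = {var_st ** 0.5:.2f}")
```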

  6. Internal consistency, test-retest reliability and measurement error of the self-report version of the social skills rating system in a sample of Australian adolescents.

    Directory of Open Access Journals (Sweden)

    Sharmila Vaz

    Full Text Available The social skills rating system (SSRS) is used to assess social skills and competence in children and adolescents. While its characteristics based on United States samples (US) are published, corresponding Australian figures are unavailable. Using a 4-week retest design, we examined the internal consistency, retest reliability and measurement error (ME) of the SSRS secondary student form (SSF) in a sample of Year 7 students (N = 187), from five randomly selected public schools in Perth, Western Australia. Internal consistency (IC) of the total scale and most subscale scores (except empathy) on the frequency rating scale was adequate to permit independent use. On the importance rating scale, most IC estimates for girls fell below the benchmark. Test-retest estimates of the total scale and subscales were insufficient to permit reliable use. ME of the total scale score (frequency rating) for boys was equivalent to the US estimate, while that for girls was lower than the US error. ME of the total scale score (importance rating) was larger than the error using the frequency rating scale. The study finding supports the idea of using multiple informants (e.g. teacher and parent reports), not just student as recommended in the manual. Future research needs to substantiate the clinical meaningfulness of the MEs calculated in this study by corroborating them against the respective Minimum Clinically Important Difference (MCID).

  7. Internal consistency, test-retest reliability and measurement error of the self-report version of the social skills rating system in a sample of Australian adolescents.

    Science.gov (United States)

    Vaz, Sharmila; Parsons, Richard; Passmore, Anne Elizabeth; Andreou, Pantelis; Falkmer, Torbjörn

    2013-01-01

    The social skills rating system (SSRS) is used to assess social skills and competence in children and adolescents. While its characteristics based on United States samples (US) are published, corresponding Australian figures are unavailable. Using a 4-week retest design, we examined the internal consistency, retest reliability and measurement error (ME) of the SSRS secondary student form (SSF) in a sample of Year 7 students (N = 187), from five randomly selected public schools in Perth, Western Australia. Internal consistency (IC) of the total scale and most subscale scores (except empathy) on the frequency rating scale was adequate to permit independent use. On the importance rating scale, most IC estimates for girls fell below the benchmark. Test-retest estimates of the total scale and subscales were insufficient to permit reliable use. ME of the total scale score (frequency rating) for boys was equivalent to the US estimate, while that for girls was lower than the US error. ME of the total scale score (importance rating) was larger than the error using the frequency rating scale. The study finding supports the idea of using multiple informants (e.g. teacher and parent reports), not just student as recommended in the manual. Future research needs to substantiate the clinical meaningfulness of the MEs calculated in this study by corroborating them against the respective Minimum Clinically Important Difference (MCID).
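    For readers unfamiliar with the quantities reported here, the sketch below shows one common way to obtain a retest coefficient and a measurement error (SEM) from two measurement occasions; it uses invented data and a simple Pearson-based reliability, not the authors' analysis of the Perth SSRS sample.

```python
# Minimal sketch (illustrative, not the authors' computation): test-retest
# reliability and measurement error for a total scale score from two occasions.
import numpy as np

def retest_stats(test: np.ndarray, retest: np.ndarray):
    test, retest = np.asarray(test, float), np.asarray(retest, float)
    r = np.corrcoef(test, retest)[0, 1]            # retest coefficient (Pearson)
    pooled_sd = np.sqrt((test.var(ddof=1) + retest.var(ddof=1)) / 2)
    sem_from_r = pooled_sd * np.sqrt(1 - r)        # SEM via reliability
    sem_from_diff = np.std(retest - test, ddof=1) / np.sqrt(2)  # SEM via difference scores
    return r, sem_from_r, sem_from_diff

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_score = rng.normal(50, 10, size=187)               # hypothetical N = 187
    t1 = true_score + rng.normal(0, 4, size=187)            # occasion 1 with noise
    t2 = true_score + rng.normal(0, 4, size=187)            # occasion 2 with noise
    r, sem_r, sem_d = retest_stats(t1, t2)
    print(f"retest r = {r:.2f}, SEM = {sem_r:.2f} (via r), {sem_d:.2f} (via differences)")
```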

  8. [Sampling in qualitative research: basic principles and some controversies].

    Science.gov (United States)

    Martínez-Salgado, Carolina

    2012-03-01

    This paper presents the rationale for the choice of participants in qualitative research in contrast with the principles of probability sampling in epidemiological research. For a better understanding of the differences, the concepts of nomothetic and idiographic generalizability, as well as those of transferability and reflexivity, are proposed. Fundamentals of the main types of sampling commonly used in qualitative research, and the meaning of the concept of saturation, are mentioned. Finally, some reflections are presented on the controversies that have arisen in recent years regarding the various paradigmatic perspectives from which qualitative research can be conducted, their possibilities of combination with epidemiological research, and some implications for the study of health issues.

  9. Sampling in Qualitative Research: Improving the Quality of ...

    African Journals Online (AJOL)

    Sampling consideration in qualitative research is very important, yet in practice this appears not to be given the prominence and the rigour it deserves among Higher Education researchers. Accordingly, the quality of research outcomes in Higher Education has suffered from low utilisation. This has motivated the production ...

  10. The Work Role Functioning Questionnaire v2.0 Showed Consistent Factor Structure Across Six Working Samples

    NARCIS (Netherlands)

    Abma, F.I.; Bültmann, U.; Amick III, B.C.; Arends, I.; Dorland, P.A.; Flach, P.A.; van der Klink, J.J.L.; van de Ven, H.A.; Bjørner, J.B.

    2017-01-01

    Objective The Work Role Functioning Questionnaire v2.0 (WRFQ) is an outcome measure linking a person's health to the ability to meet work demands in the twenty-first century. We aimed to examine the construct validity of the WRFQ in a heterogeneous set of working samples in the Netherlands with

  11. An historically consistent and broadly applicable MRV system based on LiDAR sampling and Landsat time-series

    Science.gov (United States)

    W. Cohen; H. Andersen; S. Healey; G. Moisen; T. Schroeder; C. Woodall; G. Domke; Z. Yang; S. Stehman; R. Kennedy; C. Woodcock; Z. Zhu; J. Vogelmann; D. Steinwand; C. Huang

    2014-01-01

    The authors are developing a REDD+ MRV system that tests different biomass estimation frameworks and components. Design-based inference from a costly field plot network was compared to sampling with LiDAR strips and a smaller set of plots in combination with Landsat for disturbance monitoring. Biomass estimation uncertainties associated with these different data sets...

  12. The Work Role Functioning Questionnaire v2.0 Showed Consistent Factor Structure Across Six Working Samples

    DEFF Research Database (Denmark)

    Abma, Femke I.; Bültmann, Ute; Amick, Benjamin C.

    2017-01-01

    Objective: The Work Role Functioning Questionnaire v2.0 (WRFQ) is an outcome measure linking a person's health to the ability to meet work demands in the twenty-first century. We aimed to examine the construct validity of the WRFQ in a heterogeneous set of working samples in the Netherlands...

  13. Sampling in epidemiological research: issues, hazards and pitfalls

    Science.gov (United States)

    Tyrer, Stephen; Heyman, Bob

    2016-01-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research. PMID:27087985

  14. Convenience samples and caregiving research: how generalizable are the findings?

    Science.gov (United States)

    Pruchno, Rachel A; Brill, Jonathan E; Shands, Yvonne; Gordon, Judith R; Genderson, Maureen Wilson; Rose, Miriam; Cartwright, Francine

    2008-12-01

    We contrast characteristics of respondents recruited using convenience strategies with those of respondents recruited by random digit dial (RDD) methods. We compare sample variances, means, and interrelationships among variables generated from the convenience and RDD samples. Women aged 50 to 64 who work full time and provide care to a community-dwelling older person were recruited using either RDD (N = 55) or convenience methods (N = 87). Telephone interviews were conducted using reliable, valid measures of demographics, characteristics of the care recipient, help provided to the care recipient, evaluations of caregiver-care recipient relationship, and outcomes common to caregiving research. Convenience and RDD samples had similar variances on 68.4% of the examined variables. We found significant mean differences for 63% of the variables examined. Bivariate correlations suggest that one would reach different conclusions using the convenience and RDD sample data sets. Researchers should use convenience samples cautiously, as they may have limited generalizability.

  15. Accounting for Diversity in Suicide Research: Sampling and Sample Reporting Practices in the United States.

    Science.gov (United States)

    Cha, Christine B; Tezanos, Katherine M; Peros, Olivia M; Ng, Mei Yi; Ribeiro, Jessica D; Nock, Matthew K; Franklin, Joseph C

    2018-04-01

    Research on suicidal thoughts and behaviors (STB) has identified many risk factors, but whether these findings generalize to diverse populations remains unclear. We review longitudinal studies on STB risk factors over the past 50 years in the United States and evaluate the methodological practices of sampling and reporting sample characteristics. We found that articles frequently reported participant age and sex, less frequently reported participant race and ethnicity, and rarely reported participant veteran status or lesbian, gay, bisexual, and transgender status. Sample reporting practices modestly and inconsistently improved over time. Finally, articles predominantly featured White, non-Hispanic, young adult samples. © 2017 The American Association of Suicidology.

  16. Research Paper Prevalence of enuresis in a community sample of ...

    African Journals Online (AJOL)

    Research suggests a higher prevalence of coexisting behavioural disorders, particularly Attention-Deficit Hyperactivity Disorder (ADHD), among children with enuresis in comparison to the general population. Studies generally have consisted of participants attending general paediatric medical clinics as opposed to ...

  17. Sample size in psychological research over the past 30 years.

    Science.gov (United States)

    Marszalek, Jacob M; Barber, Carolyn; Kohlhart, Julie; Holmes, Cooper B

    2011-04-01

    The American Psychological Association (APA) Task Force on Statistical Inference was formed in 1996 in response to a growing body of research demonstrating methodological issues that threatened the credibility of psychological research, and made recommendations to address them. One issue was the small, even dramatically inadequate, size of samples used in studies published by leading journals. The present study assessed the progress made since the Task Force's final report in 1999. Sample sizes reported in four leading APA journals in 1955, 1977, 1995, and 2006 were compared using nonparametric statistics, while data from the last two waves were fit to a hierarchical generalized linear growth model for more in-depth analysis. Overall, results indicate that the recommendations for increasing sample sizes have not been integrated in core psychological research, although results slightly vary by field. This and other implications are discussed in the context of current methodological critique and practice.

  18. Chemical and Metallurgy Research (CMR) Sample Tracking System Design Document

    International Nuclear Information System (INIS)

    Bargelski, C. J.; Berrett, D. E.

    1998-01-01

    The purpose of this document is to describe the system architecture of the Chemical and Metallurgy Research (CMR) Sample Tracking System at Los Alamos National Laboratory. In the course of the document, observations are made concerning the objectives, constraints and limitations, technical approaches, and technical deliverables

  19. Sixteen-item Anxiety Sensitivity Index: Confirmatory factor analytic evidence, internal consistency, and construct validity in a young adult sample from the Netherlands

    NARCIS (Netherlands)

    Vujanovic, Anka A.; Arrindell, Willem A.; Bernstein, Amit; Norton, Peter J.; Zvolensky, Michael J.

    The present investigation examined the factor structure, internal consistency, and construct validity of the 16-item Anxiety Sensitivity Index (ASI; Reiss, Peterson, Gursky, & McNally, 1986) in a young adult sample (n = 420) from the Netherlands. Confirmatory factor analysis was used to comparatively

  20. RANKED SET SAMPLING FOR ECOLOGICAL RESEARCH: ACCOUNTING FOR THE TOTAL COSTS OF SAMPLING

    Science.gov (United States)

    Researchers aim to design environmental studies that optimize precision and allow for generalization of results, while keeping the costs of associated field and laboratory work at a reasonable level. Ranked set sampling is one method to potentially increase precision and reduce ...
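    Since this abstract is truncated, the sketch below only illustrates the general idea of balanced ranked set sampling, in which units are ranked within small sets by an inexpensive auxiliary judgment and only one unit per set receives the costly measurement; the population, set size and cost functions are invented.

```python
# Minimal sketch (generic, not from the abstract): balanced ranked set sampling.
# Units are ranked within each set by a cheap auxiliary variable; only the unit
# holding the designated rank is measured with the expensive method.
import random

def ranked_set_sample(population, set_size, cycles, cheap, expensive):
    measurements = []
    for _ in range(cycles):
        for rank in range(set_size):                 # one set per target rank
            candidates = random.sample(population, set_size)
            candidates.sort(key=cheap)               # rank by the cheap proxy only
            measurements.append(expensive(candidates[rank]))
    return measurements

if __name__ == "__main__":
    random.seed(0)
    # Hypothetical field units: (visual biomass score, true biomass)
    units = [(x + random.gauss(0, 1), x) for x in
             [random.uniform(5, 50) for _ in range(2000)]]
    rss = ranked_set_sample(units, set_size=3, cycles=10,
                            cheap=lambda u: u[0],        # quick visual score
                            expensive=lambda u: u[1])    # costly lab measurement
    srs = [u[1] for u in random.sample(units, len(rss))]  # simple random sample of same size
    print(f"RSS mean = {sum(rss)/len(rss):.1f}, SRS mean = {sum(srs)/len(srs):.1f}")
```

    Both estimators are unbiased for the population mean, but the ranked set estimate typically has smaller variance for the same number of expensive measurements, which is the cost argument made in the abstract.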

  1. Sample Identification at Scale - Implementing IGSN in a Research Agency

    Science.gov (United States)

    Klump, J. F.; Golodoniuc, P.; Wyborn, L. A.; Devaraju, A.; Fraser, R.

    2015-12-01

    Earth sciences are largely observational and rely on natural samples, types of which vary significantly between science disciplines. Sharing and referencing of samples in scientific literature and across the Web requires the use of globally unique identifiers, which are essential for disambiguation. This practice is very common in other fields, e.g. the ISBN in publishing and the DOI in scientific literature. In Earth sciences, however, this is still often done in an ad-hoc manner without the use of unique identifiers. The International Geo Sample Number (IGSN) system provides a persistent, globally unique label for identifying environmental samples. As an IGSN allocating agency, CSIRO implements the IGSN registration service at the organisational scale with contributions from multiple research groups. The Capricorn Distal Footprints project is one of the first pioneers and early adopters of the technology in Australia. For this project, IGSN provides a mechanism for identification of new and legacy samples, as well as derived sub-samples. It will ensure transparency and reproducibility in various geochemical sampling campaigns that will involve a diversity of sampling methods. Hence, diverse geochemical and isotopic results can be linked back to the parent sample, particularly where multiple children of that sample have also been analysed. The IGSN integration for this project is still in its early stages and requires further consultation on the governance mechanisms that need to be put in place to allow efficient collaboration within CSIRO and with collaborating partners on the project, including naming conventions, service interfaces, etc. In this work, we present the results of the initial implementation of IGSN in the context of the Capricorn Distal Footprints project. This study has so far demonstrated the effectiveness of the proposed approach, while maintaining the flexibility to adapt to various media types, which is critical in the context of a multi-disciplinary project.

  2. Samples and data accessibility in research biobanks: an explorative survey

    Directory of Open Access Journals (Sweden)

    Marco Capocasa

    2016-02-01

    Full Text Available Biobanks, which contain human biological samples and/or data, provide a crucial contribution to the progress of biomedical research. However, the effective and efficient use of biobank resources depends on their accessibility. In fact, making bio-resources promptly accessible to everybody may increase the benefits for society. Furthermore, optimizing their use and ensuring their quality will promote scientific creativity and, in general, contribute to the progress of bio-medical research. Although this has become a rather common belief, several laboratories are still secretive and continue to withhold samples and data. In this study, we conducted a questionnaire-based survey in order to investigate sample and data accessibility in research biobanks operating all over the world. The survey involved a total of 46 biobanks. Most of them gave permission to access their samples (95.7%) and data (85.4%), but free and unconditioned accessibility seemed not to be common practice. The analysis of the guidelines regarding the accessibility to resources of the biobanks that responded to the survey highlights three issues: (i) the request for applicants to explain what they would like to do with the resources requested; (ii) the role of funding, public or private, in the establishment of fruitful collaborations between biobanks and research labs; (iii) the request of co-authorship in order to give access to their data. These results suggest that economic and academic aspects are involved in determining the extent of sample and data sharing stored in biobanks. As a second step of this study, we investigated the reasons behind the high diversity of requirements to access biobank resources. The analysis of informative answers suggested that the different modalities of resource accessibility seem to be largely influenced by both social context and legislation of the countries where the biobanks operate.

  3. Solar System Samples for Research, Education, and Public Outreach

    Science.gov (United States)

    Allen, J.; Luckey, M.; McInturff, B.; Kascak, A.; Tobola, K.; Galindo, C.; Allen, C.

    2011-01-01

    In the next two years, during the NASA Year of the Solar System, spacecraft from NASA and our international partners will encounter a comet, orbit asteroid 4 Vesta, continue to explore Mars with rovers, and launch robotic explorers to the Moon and Mars. We have pieces of all these worlds in our laboratories, and their continued study provides incredibly valuable "ground truth" to complement space exploration missions. Extensive information about these unique materials, as well as actual lunar samples and meteorites, is available for display and education. The Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation, and distribution of samples for research, education, and public outreach.

  4. Research-Grade 3D Virtual Astromaterials Samples: Novel Visualization of NASA's Apollo Lunar Samples and Antarctic Meteorite Samples to Benefit Curation, Research, and Education

    Science.gov (United States)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K. R.; Zeigler, R. A.; Righter, K.; Hanna, R. D.; Ketcham, R. A.

    2017-01-01

    NASA's vast and growing collections of astromaterials are both scientifically and culturally significant, requiring unique preservation strategies that need to be recurrently updated to contemporary technological capabilities and increasing accessibility demands. New technologies have made it possible to advance documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. Our interdisciplinary team has developed a method to create 3D Virtual Astromaterials Samples (VAS) of the existing collections of Apollo Lunar Samples and Antarctic Meteorites. Research-grade 3D VAS will virtually put these samples in the hands of researchers and educators worldwide, increasing accessibility and visibility of these significant collections. With new sample return missions on the horizon, it is of primary importance to develop advanced curation standards for documentation and visualization methodologies.

  5. Series: Practical guidance to qualitative research. Part 3: Sampling, data collection and analysis.

    Science.gov (United States)

    Moser, Albine; Korstjens, Irene

    2018-12-01

    In the course of our supervisory work over the years, we have noticed that qualitative research tends to evoke a lot of questions and worries, so-called frequently asked questions (FAQs). This series of four articles intends to provide novice researchers with practical guidance for conducting high-quality qualitative research in primary care. By 'novice' we mean Master's students and junior researchers, as well as experienced quantitative researchers who are engaging in qualitative research for the first time. This series addresses their questions and provides researchers, readers, reviewers and editors with references to criteria and tools for judging the quality of qualitative research papers. The second article focused on context, research questions and designs, and referred to publications for further reading. This third article addresses FAQs about sampling, data collection and analysis. The data collection plan needs to be broadly defined and open at first, and become flexible during data collection. Sampling strategies should be chosen in such a way that they yield rich information and are consistent with the methodological approach used. Data saturation determines sample size and will be different for each study. The most commonly used data collection methods are participant observation, face-to-face in-depth interviews and focus group discussions. Analyses in ethnographic, phenomenological, grounded theory, and content analysis studies yield different narrative findings: a detailed description of a culture, the essence of the lived experience, a theory, and a descriptive summary, respectively. The fourth and final article will focus on trustworthiness and publishing qualitative research.

  6. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data. Sampling provides an up-to-date treat

  7. Informatics-guided procurement of patient samples for biomarker discovery projects in cancer research.

    Science.gov (United States)

    Suh, K Stephen; Remache, Yvonne K; Patel, Jalpa S; Chen, Steve H; Haystrand, Russell; Ford, Peggy; Shaikh, Anadil M; Wang, Jian; Goy, Andre H

    2009-02-01

    Modern cancer research for biomarker discovery programs requires solving several tasks that are directly involved with patient sample procurement. One requirement is to construct a highly efficient workflow on the clinical side for the procurement to generate a consistent supply of high quality samples for research. This undertaking needs a network of interdepartmental collaboration and participation at various levels, including physical human interactions, information technology implementations and a bioinformatics tool that is highly effective and user-friendly to busy clinicians and researchers associated with the sample procurement. Collegial participation that is sequential but continual from one department to another demands dedicated bioinformatics software coordinating between the institutional clinic and the tissue repository facility. Participants in the process include admissions, the consenting process, phlebotomy, the surgery center and pathology. During these multiple-step procedures, clinical data are collected for detailed analytical endpoints to supplement the logistics of defining and validating the discovery of biomarkers.

  8. Reliability of attitude and knowledge items and behavioral consistency in the validated sun exposure questionnaire in a Danish population based sample

    DEFF Research Database (Denmark)

    Køster, Brian; Søndergaard, Jens; Nielsen, Jesper Bo

    2018-01-01

    An important feature of questionnaire validation is reliability. To be able to measure a given concept by questionnaire validly, the reliability needs to be high. The objectives of this study were to examine reliability of attitude and knowledge and behavioral consistency of sunburn in a developed questionnaire for monitoring and evaluating population sun-related behavior. Sun-related behavior, attitude and knowledge were measured weekly by a questionnaire in the summer of 2013 among 664 Danes. Reliability was tested in a test-retest design. Consistency of behavioral information was tested similarly... in protection behavior was low. To our knowledge, this is the first study to report reliability for a completely validated questionnaire on sun-related behavior in a national random population based sample. Further, we show that attitude and knowledge questions confirmed their validity with good reliability...

  9. Towards the harmonization between National Forest Inventory and Forest Condition Monitoring. Consistency of plot allocation and effect of tree selection methods on sample statistics in Italy.

    Science.gov (United States)

    Gasparini, Patrizia; Di Cosmo, Lucio; Cenni, Enrico; Pompei, Enrico; Ferretti, Marco

    2013-07-01

    In the frame of a process aiming at harmonizing National Forest Inventory (NFI) and ICP Forests Level I Forest Condition Monitoring (FCM) in Italy, we investigated (a) the long-term consistency between FCM sample points (a subsample of the first NFI, 1985, NFI_1) and recent forest area estimates (after the second NFI, 2005, NFI_2) and (b) the effect of tree selection method (tree-based or plot-based) on sample composition and defoliation statistics. The two investigations were carried out on 261 and 252 FCM sites, respectively. Results show that some individual forest categories (larch and stone pine, Norway spruce, other coniferous, beech, temperate oaks and cork oak forests) are over-represented and others (hornbeam and hophornbeam, other deciduous broadleaved and holm oak forests) are under-represented in the FCM sample. This is probably due to a change in forest cover, which has increased by 1,559,200 ha from 1985 to 2005. In the case of a shift from a tree-based to a plot-based selection method, 3,130 (46.7%) of the original 6,703 sample trees will be abandoned, and 1,473 new trees will be selected. The balance between exclusion of former sample trees and inclusion of new ones will be particularly unfavourable for conifers (with only 16.4% of excluded trees replaced by new ones) and less for deciduous broadleaves (with 63.5% of excluded trees replaced). The total number of tree species surveyed will not be impacted, while the number of trees per species will, and the resulting (plot-based) sample composition will have a much larger frequency of deciduous broadleaved trees. The newly selected trees have, in general, smaller diameter at breast height (DBH) and defoliation scores. Given the larger rate of turnover, the deciduous broadleaved part of the sample will be more impacted. Our results suggest that both a revision of the FCM network to account for forest area change and a plot-based approach to permit statistical inference and avoid bias in the tree sample

  10. Hair MDMA samples are consistent with reported ecstasy use: findings from a study investigating effects of ecstasy on mood and memory.

    Science.gov (United States)

    Scholey, A B; Owen, L; Gates, J; Rodgers, J; Buchanan, T; Ling, J; Heffernan, T; Swan, P; Stough, C; Parrott, A C

    2011-01-01

    Our group has conducted several Internet investigations into the biobehavioural effects of self-reported recreational use of MDMA (3,4-methylenedioxymethamphetamine or Ecstasy) and other psychosocial drugs. Here we report a new study examining the relationship between self-reported Ecstasy use and traces of MDMA found in hair samples. In a laboratory setting, 49 undergraduate volunteers performed an Internet-based assessment which included mood scales and the University of East London Drug Use Questionnaire, which asks for history and current drug use. They also provided a hair sample for determination of exposure to MDMA over the previous month. Self-report of Ecstasy use and presence in hair samples were consistent (p happiness and higher self-reported stress. Self-reported Ecstasy use, but not presence in hair, was also associated with decreased tension. Different psychoactive drugs can influence long-term mood and cognition in complex and dynamically interactive ways. Here we have shown a good correspondence between self-report and objective assessment of exposure to MDMA. These data suggest that the Internet has potentially high utility as a useful medium to complement traditional laboratory studies into the sequelae of recreational drug use. Copyright © 2010 S. Karger AG, Basel.

  11. The Index to Marine and Lacustrine Geological Samples: Improving Sample Accessibility and Enabling Current and Future Research

    Science.gov (United States)

    Moore, C.

    2011-12-01

    The Index to Marine and Lacustrine Geological Samples is a community designed and maintained resource enabling researchers to locate and request sea floor and lakebed geologic samples archived by partner institutions. Conceived in the dawn of the digital age by representatives from U.S. academic and government marine core repositories and the NOAA National Geophysical Data Center (NGDC) at a 1977 meeting convened by the National Science Foundation (NSF), the Index is based on core concepts of community oversight, common vocabularies, consistent metadata and a shared interface. Form and content of underlying vocabularies and metadata continue to evolve according to the needs of the community, as do supporting technologies and access methodologies. The Curators Consortium, now international in scope, meets at partner institutions biennially to share ideas and discuss best practices. NGDC serves the group by providing database access and maintenance, a list server, digitizing support and long-term archival of sample metadata, data and imagery. Over three decades, participating curators have performed the herculean task of creating and contributing metadata for over 195,000 sea floor and lakebed cores, grabs, and dredges archived in their collections. Some partners use the Index for primary web access to their collections while others use it to increase exposure of more in-depth institutional systems. The Index is currently a geospatially-enabled relational database, publicly accessible via Web Feature and Web Map Services, and text- and ArcGIS map-based web interfaces. To provide as much knowledge as possible about each sample, the Index includes curatorial contact information and links to related data, information and images; 1) at participating institutions, 2) in the NGDC archive, and 3) at sites such as the Rolling Deck to Repository (R2R) and the System for Earth Sample Registration (SESAR). Over 34,000 International GeoSample Numbers (IGSNs) linking to SESAR are

  12. Types of non-probabilistic sampling used in marketing research. „Snowball” sampling

    OpenAIRE

    Manuela Rozalia Gabor

    2007-01-01

    A significant way of investigating a firm's market is statistical sampling. The sampling typology comprises probabilistic and non-probabilistic models of gathering information, and this paper provides thorough information on network sampling, known as "snowball" sampling. This type of sampling enables the survey of the forms in which decision power occurs within an organisation and of the interpersonal relation network governing a certain collectivity, such as a certain consumer panel. The snowball s...

  13. A nanocomposite consisting of graphene oxide and Fe3O4 magnetic nanoparticles for the extraction of flavonoids from tea, wine and urine samples

    International Nuclear Information System (INIS)

    Wu, Jianrong; Xiao, Deli; Peng, Jun; Wang, Cuixia; Zhang, Chan; He, Jia; Zhao, Hongyan; He, Hua

    2015-01-01

    We describe a single-step solvothermal method for the preparation of nanocomposites consisting of graphene oxide and Fe3O4 nanoparticles (GO/Fe3O4). This material is shown to be useful as a magnetic sorbent for the extraction of flavonoids from green tea, red wine, and urine samples. The nanocomposite takes advantage of the high surface area of GO and the magnetic phase separation feature of the magnetic sorbent. The nanocomposite is recyclable and was applied to the extraction of flavonoids prior to their determination by HPLC. The effects of the amount of surfactant, the pH value of the sample solution, the extraction time, and the desorption conditions on the extraction efficiency, as well as the regeneration conditions, were optimized. The limits of detection for luteolin, quercetin and kaempferol range from 0.2 to 0.5 ng∙mL-1 in urine, from 3.0 to 6.0 ng∙mL-1 in green tea, and from 1.0 to 2.5 ng∙mL-1 in red wine. The recoveries are between 82.0 and 101.4%, with relative standard deviations of <9.3%. (author)

  14. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications are possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.

  15. CHANG-ES. IX. Radio scale heights and scale lengths of a consistent sample of 13 spiral galaxies seen edge-on and their correlations

    Science.gov (United States)

    Krause, Marita; Irwin, Judith; Wiegert, Theresa; Miskolczi, Arpad; Damas-Segovia, Ancor; Beck, Rainer; Li, Jiang-Tao; Heald, George; Müller, Peter; Stein, Yelena; Rand, Richard J.; Heesen, Volker; Walterbos, Rene A. M.; Dettmar, Ralf-Jürgen; Vargas, Carlos J.; English, Jayanne; Murphy, Eric J.

    2018-03-01

    Aim. The vertical halo scale height is a crucial parameter to understand the transport of cosmic-ray electrons (CRE) and their energy loss mechanisms in spiral galaxies. Until now, the radio scale height could only be determined for a few edge-on galaxies because of missing sensitivity at high resolution. Methods: We developed a sophisticated method for the scale height determination of edge-on galaxies. With this we determined the scale heights and radial scale lengths for a sample of 13 galaxies from the CHANG-ES radio continuum survey in two frequency bands. Results: The sample average values for the radio scale heights of the halo are 1.1 ± 0.3 kpc in C-band and 1.4 ± 0.7 kpc in L-band. From the frequency dependence analysis of the halo scale heights we found that the wind velocities (estimated using the adiabatic loss time) are above the escape velocity. We found that the halo scale heights increase linearly with the radio diameters. In order to exclude the diameter dependence, we defined a normalized scale height h˜ which is quite similar for all sample galaxies at both frequency bands and does not depend on the star formation rate or the magnetic field strength. However, h˜ shows a tight anticorrelation with the mass surface density. Conclusions: The sample galaxies with smaller scale lengths are more spherical in the radio emission, while those with larger scale lengths are flatter. The radio scale height depends mainly on the radio diameter of the galaxy. The sample galaxies are consistent with an escape-dominated radio halo with convective cosmic ray propagation, indicating that galactic winds are a widespread phenomenon in spiral galaxies. While a higher star formation rate or star formation surface density does not lead to a higher wind velocity, we found for the first time observational evidence of a gravitational deceleration of CRE outflow, e.g. a lowering of the wind velocity from the galactic disk.

  16. Series: Practical guidance to qualitative research. Part 3: Sampling, data collection and analysis

    Science.gov (United States)

    Moser, Albine; Korstjens, Irene

    2018-01-01

    Abstract In the course of our supervisory work over the years, we have noticed that qualitative research tends to evoke a lot of questions and worries, so-called frequently asked questions (FAQs). This series of four articles intends to provide novice researchers with practical guidance for conducting high-quality qualitative research in primary care. By ‘novice’ we mean Master’s students and junior researchers, as well as experienced quantitative researchers who are engaging in qualitative research for the first time. This series addresses their questions and provides researchers, readers, reviewers and editors with references to criteria and tools for judging the quality of qualitative research papers. The second article focused on context, research questions and designs, and referred to publications for further reading. This third article addresses FAQs about sampling, data collection and analysis. The data collection plan needs to be broadly defined and open at first, and become flexible during data collection. Sampling strategies should be chosen in such a way that they yield rich information and are consistent with the methodological approach used. Data saturation determines sample size and will be different for each study. The most commonly used data collection methods are participant observation, face-to-face in-depth interviews and focus group discussions. Analyses in ethnographic, phenomenological, grounded theory, and content analysis studies yield different narrative findings: a detailed description of a culture, the essence of the lived experience, a theory, and a descriptive summary, respectively. The fourth and final article will focus on trustworthiness and publishing qualitative research. PMID:29199486

  17. Report of special study meeting on 'Atomic energy research aiming at consistent nuclear fuel cycle', fiscal year 1992

    International Nuclear Information System (INIS)

    Nishina, Kojiro; Nishihara, Hideaki; Mishima, Kaichiro

    1994-12-01

    This meeting was held on March 4, 1993. Thirty years have elapsed since the first power generation with the JPDR and the initial criticality of the KUR, and 20 years since the initial criticality of the KUCA. Researchers in universities have contributed greatly to research and education in atomic energy, but the prospect of leading the world in this field hereafter is very uncertain. This study meeting was held to seek ways to make a proper contribution. In the meeting, lectures were given on Japanese policy on the nuclear fuel cycle, the present state of upstream and downstream research in Japan, the experimental plan in NUCEF, the present state of research on TRU decay heat data and TRU nuclear data, the present state of the experimental research at the KUCA and the FCA, the present state of research on heat removal from high conversion LWRs and the KUR, the present state of research on radioactive waste treatment, and the present state of TRU chemical research. The record of the holding of this study meeting is added. (K.I.)

  18. [Practical aspects regarding sample size in clinical research].

    Science.gov (United States)

    Vega Ramos, B; Peraza Yanes, O; Herrera Correa, G; Saldívar Toraya, S

    1996-01-01

    Knowledge of the right sample size lets us be sure whether the published results in medical papers had a suitable design and a proper conclusion according to the statistical analysis. To estimate the sample size we must consider the type I error, type II error, variance, the size of the effect, and the significance and power of the test. To decide which mathematical formula will be used, we must define what kind of study we have, that is, whether it is a prevalence study, a study of mean values, or a comparative one. In this paper we explain some basic topics of statistics and describe four simple examples of sample size estimation.
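    As a worked example of the quantities listed above (type I and type II error, variance, effect size, power), the sketch below computes the sample size per group for comparing two means with the standard normal-approximation formula; the clinical numbers are invented for illustration.

```python
# Minimal sketch (illustrative): sample size per group for comparing two means,
# using the usual normal-approximation formula with alpha, power, variance and
# the effect size (delta) named in the abstract.
from math import ceil
from statistics import NormalDist

def n_per_group(delta: float, sigma: float, alpha: float = 0.05, power: float = 0.80) -> int:
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # controls the type I error (two-sided)
    z_beta = NormalDist().inv_cdf(power)            # power = 1 - type II error
    return ceil(2 * ((z_alpha + z_beta) ** 2) * sigma ** 2 / delta ** 2)

# Example: detect a 5 mmHg difference with SD 12 mmHg, alpha 0.05, power 0.80
print(n_per_group(delta=5, sigma=12))   # about 91 participants per group
```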

  19. Research of pneumatic control transmission system for small irradiation samples

    International Nuclear Information System (INIS)

    Bai Zhongxiong; Zhang Haibing; Rong Ru; Zhang Tao

    2008-01-01

    In order to reduce the absorbed dose to the operator, pneumatic control has been adopted to realize the rapid transmission of small irradiation samples. The on/off state of the pneumatic circuit and the transport directions of the rapid transmission system are controlled by the electrical control part. The main program initializes the system, detects the position of the manual/automatic change-over switch, and calls the corresponding subprogram to achieve automatic or manual operation. The automatic subprogram performs the automatic sample transmission; the manual subprogram handles the deflation and the back and forth movement of the radiation samples. This paper introduces in detail the implementation of the system, in terms of both hardware and software design. (authors)

  20. Northeast Cooperative Research Study Fleet (SF) Program Biological Sampling Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Northeast Cooperative Research Study Fleet (SF) Program partners with a subset of commercial fishermen to collect high quality, high resolution, haul by haul...

  1. Research Note Pilot survey to assess sample size for herbaceous ...

    African Journals Online (AJOL)

    A pilot survey to determine sub-sample size (number of point observations per plot) for herbaceous species composition assessments, using a wheel-point apparatus applying the nearest-plant method, was conducted. Three plots differing in species composition on the Zululand coastal plain were selected, and on each plot ...

  2. Consistent Treatment of Variables and Causation Poses a Challenge for Behavioral Research Methods: A Commentary on Nesselroade and Molenaar (2016).

    Science.gov (United States)

    Markus, Keith A

    2016-01-01

    Nesselroade and Molenaar presented the idiographic filter as a proposal for analyzing lawful regularities in behavioral research. The proposal highlights an inconsistency that poses a challenge for behavioral research more generally. One can distinguish a broadly Humean approach from a broadly non-Humean approach as they relate to variables and to causation. Nesselroade and Molenaar rejected a Humean approach to latent variables that characterizes them as nothing more than summaries of their manifest indicators. By contrast, they tacitly accepted a Humean approach to causes characterized as nothing more than summaries of their manifest causal effects. A non-Humean treatment of variables coupled with a Humean treatment of causation creates a theoretical tension within their proposal. For example, one can interpret the same model elements as simultaneously representing both variables and causes. Future refinement of the idiographic filter proposal to address this tension could follow any of a number of strategies.

  3. Sample geometry as critical factor for stability research

    NARCIS (Netherlands)

    Klerk, W.P.C. de; Boers, M.N.

    2003-01-01

    Stability research on gun propellants has been widely performed by microcalorimetry since the 1980s. TNO Prins Maurits Laboratory has had broad experience since the early 1970s. In the past, many studies were performed to investigate the influence of oxygen, humidity, etc. Less attention was

  4. Understanding Sample Surveys: Selective Learning about Social Science Research Methods

    Science.gov (United States)

    Currin-Percival, Mary; Johnson, Martin

    2010-01-01

    We investigate differences in what students learn about survey methodology in a class on public opinion presented in two critically different ways: with the inclusion or exclusion of an original research project using a random-digit-dial telephone survey. Using a quasi-experimental design and data obtained from pretests and posttests in two public…

  5. Funding Medical Research Projects: Taking into Account Referees' Severity and Consistency through Many-Faceted Rasch Modeling of Projects' Scores.

    Science.gov (United States)

    Tesio, Luigi; Simone, Anna; Grzeda, Mariuzs T; Ponzio, Michela; Dati, Gabriele; Zaratin, Paola; Perucca, Laura; Battaglia, Mario A

    2015-01-01

    The funding policy of research projects often relies on scores assigned by a panel of experts (referees). The non-linear nature of raw scores and the severity and inconsistency of individual raters may generate unfair numeric project rankings. Rasch measurement (many-facets version, MFRM) provides a valid alternative to raw scoring. MFRM was applied to the scores achieved by 75 research projects on multiple sclerosis sent in response to a previous annual call by FISM-Italian Foundation for Multiple Sclerosis. This allowed us to simulate, a posteriori, the impact of MFRM on the funding scenario. The applications were each scored by 2 to 4 independent referees (total = 131) on a 10-item, 0-3 rating scale called FISM-ProQual-P. The rotation plan assured "connection" of all pairs of projects through at least 1 shared referee. The questionnaire fulfilled satisfactorily the stringent criteria of Rasch measurement for psychometric quality (unidimensionality, reliability and data-model fit). Arbitrarily, 2 acceptability thresholds were set at a raw score of 21/30 and at the equivalent Rasch measure of 61.5/100, respectively. When the cut-off was switched from score to measure, 8 out of 18 acceptable projects had to be rejected, while 15 rejected projects became eligible for funding. Some referees, of various severity, were grossly inconsistent (z-std fit indexes less than -1.9 or greater than 1.9). The FISM-ProQual-P questionnaire seems a valid and reliable scale. MFRM may help the decision-making process for allocating funds, not only to MS research projects but also in other fields. In repeated assessment exercises it can help the selection of reliable referees. Their severity can be steadily calibrated, thus obviating the need to connect them with other referees assessing the same projects.
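    To make the model concrete, the sketch below evaluates the core many-facet Rasch (rating-scale) equation for a single 0-3 item, showing how referees of different severity produce different expected raw ratings for the same project measure; the threshold and severity values are invented and do not reproduce the FISM calibration.

```python
# Minimal sketch (illustrative, not the FISM analysis): category probabilities
# and expected score under a many-facet Rasch (rating scale) model
#   log(P_k / P_{k-1}) = B_project - C_referee - D_item - F_k
# for a 0-3 item, showing how referee severity shifts raw ratings while the
# project measure B stays the same.
import math

def category_probs(B, C, D, thresholds):
    """Return P(score = 0..m) for project measure B, referee severity C,
    item difficulty D and rating-scale thresholds F_1..F_m (all in logits)."""
    numerators = [1.0]
    cumulative = 0.0
    for F_k in thresholds:
        cumulative += B - C - D - F_k
        numerators.append(math.exp(cumulative))
    total = sum(numerators)
    return [p / total for p in numerators]

def expected_score(B, C, D, thresholds):
    return sum(k * p for k, p in enumerate(category_probs(B, C, D, thresholds)))

thresholds = [-1.5, 0.0, 1.5]        # hypothetical thresholds for the 0-3 scale
B_project, D_item = 0.8, 0.0
for severity in (-0.5, 0.0, 1.0):    # lenient, average, severe referee
    print(f"severity {severity:+.1f}: expected rating = "
          f"{expected_score(B_project, severity, D_item, thresholds):.2f}")
```

    Calibrating the severities C and using the common measure B for ranking is, in essence, how an MFRM analysis removes the advantage a project gains from drawing lenient referees.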

  6. (I Can’t Get No) Saturation: A Simulation and Guidelines for Minimum Sample Sizes in Qualitative Research

    NARCIS (Netherlands)

    van Rijnsoever, F.J.

    2015-01-01

    This paper explores the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the

  7. (I Can’t Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research

    NARCIS (Netherlands)

    van Rijnsoever, Frank J.

    2017-01-01

    I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in
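    A minimal simulation in the spirit of this conceptualization is sketched below: each information source holds a random subset of the population's codes, and the run counts how many randomly drawn sources are needed before every code has appeared at least once; all parameters are invented and are not the paper's simulation settings.

```python
# Minimal sketch (parameters invented): a population of information sources,
# each holding a random subset of codes; theoretical saturation is reached once
# every code has been observed in at least one sampled source.
import random

def sources_to_saturation(n_codes=30, n_sources=500, codes_per_source=8, seed=None):
    rng = random.Random(seed)
    population = [set(rng.sample(range(n_codes), codes_per_source))
                  for _ in range(n_sources)]
    rng.shuffle(population)
    seen, n_used = set(), 0
    for source in population:
        n_used += 1
        seen |= source
        if len(seen) == n_codes:                  # all codes observed: saturation
            return n_used
    return None                                   # saturation not reached

runs = [sources_to_saturation(seed=i) for i in range(200)]
reached = [r for r in runs if r is not None]
print(f"median sample size at saturation: {sorted(reached)[len(reached) // 2]}")
```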

  8. Proteomic analysis of tissue samples in translational breast cancer research

    DEFF Research Database (Denmark)

    Gromov, Pavel; Moreira, José; Gromova, Irina

    2014-01-01

    In the last decade, many proteomic technologies have been applied, with varying success, to the study of tissue samples of breast carcinoma for protein expression profiling in order to discover protein biomarkers/signatures suitable for: characterization and subtyping of tumors; early diagnosis...... the translation of basic discoveries into the daily breast cancer clinical practice. In particular, we address major issues in experimental design by reviewing the strengths and weaknesses of current proteomic strategies in the context of the analysis of human breast tissue specimens....

  9. Research on test of product based on spatial sampling criteria and variable step sampling mechanism

    Science.gov (United States)

    Li, Ruihong; Han, Yueping

    2014-09-01

    This paper presents an effective approach for online testing of the assembly structures inside products, using a multiple-views technique and an X-ray digital radiography system based on spatial sampling criteria and a variable-step sampling mechanism. Although there may be several objects inside one product to be tested, for each object there is a maximal rotary step within which the least structural size to be tested is predictable. In the offline learning process, the object is rotated by this step and imaged repeatedly until a complete cycle is finished, so that an image sequence is obtained that includes the full structural information needed for recognition. The maximal rotary step is restricted by the least structural size and the inherent resolution of the imaging system. During the online inspection process, the program first finds the optimum solutions for all the different target parts in the standard sequence, i.e., finds their exact angles in one cycle. Because the sizes of most other targets in the product are larger than that of the least structure, the paper adopts a variable step-size sampling mechanism that rotates the product by specific angles, with different steps for the different objects inside the product, and then performs matching. Experimental results show that the variable step-size method can greatly save time compared with the traditional fixed-step inspection method while the recognition accuracy is guaranteed.

  10. Views of female breast cancer patients who donated biologic samples regarding storage and use of samples for genetic research.

    Science.gov (United States)

    Kaphingst, K A; Janoff, J M; Harris, L N; Emmons, K M

    2006-05-01

    Although social and ethical issues related to the storage and use of biologic specimens for genetic research have been discussed extensively in the medical literature, few empiric data exist describing patients' views. This qualitative study explored the views of 26 female breast cancer patients who had consented to donate blood or tissue samples for breast cancer research. Participants generally did not expect personal benefits from research and had few unprompted concerns. Few participants had concerns about use of samples for studies not planned at the time of consent. Some participants did express concerns about insurance or employment discrimination, while others believed that current privacy protections might actually slow breast cancer research. Participants were generally more interested in receiving individual genetic test results from research studies than aggregate results. Most participants did not want individual results of uncertain clinical significance, although others believed that they should be able to receive such information. These data examined the range of participants' views regarding the storage and use of biologic samples. Further research with different and diverse patient populations is critical to establishing an appropriate balance between protecting the rights of human subjects in genetic research and allowing research to progress.

  11. Descriptions of sampling practices within five approaches to qualitative research in education and the health sciences

    OpenAIRE

    Guetterman, Timothy C.

    2015-01-01

    Although recommendations exist for determining qualitative sample sizes, the literature appears to contain few instances of research on the topic. Practical guidance is needed for determining sample sizes to conduct rigorous qualitative research, to develop proposals, and to budget resources. The purpose of this article is to describe qualitative sample size and sampling practices within published studies in education and the health sciences by research design: case study, ethnography, ground...

  12. Delineating sampling procedures: Pedagogical significance of analysing sampling descriptions and their justifications in TESL experimental research reports

    Directory of Open Access Journals (Sweden)

    Jason Miin-Hwa Lim

    2011-04-01

    Full Text Available Teaching second language learners how to write research reports constitutes a crucial component in programmes on English for Specific Purposes (ESP) in institutions of higher learning. One of the rhetorical segments in research reports that merit attention has to do with the descriptions and justifications of sampling procedures. This genre-based study looks into sampling delineations in the Method-related sections of research articles on the teaching of English as a second language (TESL) written by expert writers and published in eight reputed international refereed journals. Using Swales’s (1990 & 2004) framework, I conducted a quantitative analysis of the rhetorical steps and a qualitative investigation into the language resources employed in delineating sampling procedures. This investigation has considerable relevance to ESP students and instructors as it has yielded pertinent findings on how samples can be appropriately described to meet the expectations of dissertation examiners, reviewers, and supervisors. The findings of this study have furnished insights into how supervisors and instructors can possibly teach novice writers ways of using specific linguistic mechanisms to lucidly describe and convincingly justify the sampling procedures in the Method sections of experimental research reports.

  13. Descriptions of Sampling Practices Within Five Approaches to Qualitative Research in Education and the Health Sciences

    Directory of Open Access Journals (Sweden)

    Timothy C. Guetterman

    2015-05-01

    Full Text Available Although recommendations exist for determining qualitative sample sizes, the literature appears to contain few instances of research on the topic. Practical guidance is needed for determining sample sizes to conduct rigorous qualitative research, to develop proposals, and to budget resources. The purpose of this article is to describe qualitative sample size and sampling practices within published studies in education and the health sciences by research design: case study, ethnography, grounded theory methodology, narrative inquiry, and phenomenology. I analyzed the 51 most highly cited studies using predetermined content categories and noteworthy sampling characteristics that emerged. In brief, the findings revealed a mean sample size of 87. Less than half of the studies identified a sampling strategy. I include a description of findings by approach and recommendations for sampling to assist methodologists, reviewers, program officers, graduate students, and other qualitative researchers in understanding qualitative sampling practices in recent studies. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1502256

  14. Improving consistency in findings from pharmacoepidemiological studies: The IMI-PROTECT (Pharmacoepidemiological research on outcomes of therapeutics by a European consortium) project

    NARCIS (Netherlands)

    De Groot, Mark C.H.; Schlienger, Raymond; Reynolds, Robert; Gardarsdottir, Helga; Juhaeri, Juhaeri; Hesse, Ulrik; Gasse, Christiane; Rottenkolber, Marietta; Schuerch, Markus; Kurz, Xavier; Klungel, Olaf H.

    2013-01-01

    Background: Pharmacoepidemiological (PE) research should provide consistent, reliable and reproducible results to contribute to the benefit-risk assessment of medicines. IMI-PROTECT aims to identify sources of methodological variations in PE studies using a common protocol and analysis plan across

  15. COMET strongly supported the development and implementation of medium-term topical research roadmaps consistent with the ALLIANCE Strategic Research Agenda.

    Science.gov (United States)

    Garnier-Laplace, J; Vandenhove, H; Beresford, N; Muikku, M; Real, A

    2018-03-01

    The ALLIANCE Strategic Research Agenda (SRA), initiated by the STAR Network of Excellence and integrated in the research strategy implemented by the COMET consortium, defines a long-term vision of the needs for, and implementation of, research in radioecology. This reference document, reflecting views from many stakeholder groups and researchers, serves as an input to those responsible for defining EU research call topics through the ALLIANCE SRA statement delivered each year to the EJP-CONCERT (2015-2020). This statement highlights a focused number of priorities for funding. Research in radioecology and related sciences is justified by various drivers, such as policy changes, scientific advances and knowledge gaps, radiological risk perception by the public, and a growing awareness of interconnections between human and ecosystem health. The SRA is being complemented by topical roadmaps that have been initiated by the COMET EC-funded project, with the help and endorsement of the ALLIANCE. The strategy underlying roadmap development is driven by the need for improved mechanistic understanding across radioecology. By meeting this need, we can provide fit-for-purpose human and environmental impact/risk assessments in support of the protection of man and the environment in interaction with society and for the three exposure situations defined by the ICRP (i.e., planned, existing and emergency). Within the framework of the EJP-CONCERT, the development of a joint roadmap is under discussion among all the European research platforms and will highlight the major research needs for the whole radiation protection field and how these are likely to be addressed by 2030.

  16. Sampling frequency of ciliated protozoan microfauna for seasonal distribution research in marine ecosystems.

    Science.gov (United States)

    Xu, Henglong; Yong, Jiang; Xu, Guangjian

    2015-12-30

    Sampling frequency is important to obtain sufficient information for temporal research of microfauna. To determine an optimal strategy for exploring the seasonal variation in ciliated protozoa, a dataset from the Yellow Sea, northern China was studied. Samples were collected with 24 (biweekly), 12 (monthly), 8 (bimonthly per season) and 4 (seasonally) sampling events. Compared to the 24 samplings (100%), the 12-, 8- and 4-samplings recovered 94%, 94%, and 78% of the total species, respectively. To reveal the seasonal distribution, the 8-sampling regime may recover >75% of the seasonal variance, while the traditional 4-sampling regime explained considerably less of it. The biotic data showed stronger correlations with seasonal variables (e.g., temperature, salinity) in combination with nutrients. It is suggested that eight sampling events per year may be an optimal sampling strategy for ciliated protozoan seasonal research in marine ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.
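
    The comparison above boils down to a simple calculation: what fraction of the species found by the full biweekly schedule is still detected when only a subset of the sampling events is kept. The sketch below illustrates that calculation on a simulated presence/absence matrix; the community size and detection probabilities are invented for illustration and are not the Yellow Sea data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical presence/absence matrix: 24 biweekly sampling events x 120 species.
    # Rarer species have lower detection probabilities, mimicking a real community.
    detect_prob = rng.uniform(0.02, 0.6, size=120)
    events = rng.random((24, 120)) < detect_prob

    def recovery(event_indices):
        """Fraction of species found by the full 24-event survey that a reduced
        schedule (a subset of the events) still detects at least once."""
        total = events.any(axis=0).sum()
        subset = events[event_indices].any(axis=0).sum()
        return subset / total

    schedules = {
        "biweekly (24 events)": list(range(24)),
        "monthly (12 events)": list(range(0, 24, 2)),
        "two per season (8 events)": list(range(0, 24, 3)),
        "seasonal (4 events)": list(range(0, 24, 6)),
    }
    for name, idx in schedules.items():
        print(f"{name:28s} recovers {recovery(idx):.0%} of species")
    ```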

  17. Consistency of genetic inheritance mode and heritability patterns of triglyceride vs. high density lipoprotein cholesterol ratio in two Taiwanese family samples

    Directory of Open Access Journals (Sweden)

    Yang Chi-Yu

    2003-04-01

    Full Text Available Abstract Background Triglyceride/HDL cholesterol ratio (TG/HDL-C) is considered a risk factor for cardiovascular events. Genetic components were important in controlling the variation in western countries, but the mode of inheritance and family aggregation patterns were still unknown among Asian-Pacific countries. This study, based on families recruited from the community and from a hospital, aimed to investigate the mode of inheritance, heritability and shared environmental factors in controlling TG/HDL-C. Results Two populations, one from community-based families (n = 988; 894 parent-offspring and 453 sibling pairs) and the other from hospital-based families (n = 1313; 76 parent-offspring and 52 sibling pairs), were sampled. The hospital-based families had a higher mean age than the community-based families (54.7 vs. 34.0). Log-transformed TG/HDL-C values, adjusted for age, gender and body mass index, were used for the genetic analyses. Significant parent-offspring and sibling correlations were also found in both samples. The parent-offspring correlation coefficient was higher in the hospital-based families than in the community-based families. Genetic heritability was higher in the community-based families (0.338 ± 0.114, p = 0.002), whereas the common shared environmental factor was higher in the hospital-based families (0.203 ± 0.042). Conclusion Variations of TG/HDL-C in the normal range are likely to be influenced by multiple factors, including environmental and genetic components. A stronger genetic contribution was demonstrated in the younger community-based families than in the older hospital-based families.

  18. Internal consistency, concurrent validity, and discriminant validity of a measure of public support for policies for active living in transportation (PAL-T) in a population-based sample of adults.

    Science.gov (United States)

    Fuller, Daniel; Gauvin, Lise; Fournier, Michel; Kestens, Yan; Daniel, Mark; Morency, Patrick; Drouin, Louis

    2012-04-01

    Active living is a broad conceptualization of physical activity that incorporates domains of exercise; recreational, household, and occupational activities; and active transportation. Policy makers develop and implement a variety of transportation policies that can influence choices about how to travel from one location to another. In making such decisions, policy makers act in part in response to public opinion or support for proposed policies. Measures of the public's support for policies aimed at promoting active transportation can inform researchers and policy makers. This study examined the internal consistency and the concurrent and discriminant validity of a newly developed measure of the public's support for policies for active living in transportation (PAL-T). A series of 17 items representing potential policies for promoting active transportation was generated. Two samples of participants (n = 2,001 and n = 2,502) from Montreal, Canada, were recruited via random digit dialling. Analyses were conducted on the combined data set (n = 4,503). Participants were aged 18 through 94 years (58% female). The concurrent and discriminant validity of the PAL-T was assessed by examining relationships with physical activity and smoking. To explore the usability of the PAL-T, predicted scale scores were compared to the summed values of responses. Results showed that the internal consistency of the PAL-T was 0.70. Multilevel regression demonstrated no relationship between the PAL-T and smoking status (p > 0.05) but a significant relationship with utilitarian walking. Measuring public opinion can inform policy makers and support advocacy efforts aimed at making built environments more suitable for active transportation while allowing researchers to examine the antecedents and consequences of public support for policies.
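
    The internal-consistency figure reported above (0.70) is a Cronbach's alpha. A minimal sketch of that computation follows, using simulated Likert-type responses; the 17-item structure mirrors the PAL-T, but the data, scale range, and effect sizes are invented for illustration and are not the Montreal survey data.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Hypothetical responses: 200 respondents x 17 policy-support items on a 1-5 scale,
    # driven by a single latent level of support plus item-specific noise.
    rng = np.random.default_rng(1)
    support = rng.normal(size=(200, 1))
    items = np.clip(np.rint(3 + support + rng.normal(scale=1.0, size=(200, 17))), 1, 5)

    print(f"alpha = {cronbach_alpha(items):.2f}")
    ```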

  19. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys.

  20. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile
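
    The response, cooperation, refusal, and contact rates quoted above follow the American Association for Public Opinion Research definitions. A simplified sketch of that bookkeeping is shown below; the completed- and partial-interview counts come from the abstract, while the refusal, non-contact, and other-eligible tallies are assumed values, and unknown-eligibility cases (which the full AAPOR formulas also account for) are ignored.

    ```python
    # Hypothetical call-outcome tallies for a random-digit-dial mobile phone survey.
    # Completed and partial interview counts are from the abstract; the rest are assumed.
    completes    = 9_469
    partials     = 3_547
    refusals     = 2_800      # assumed
    non_contacts = 15_000     # assumed (no answer, voicemail, always busy, ...)
    other_elig   = 500        # assumed (eligible but not interviewed for other reasons)

    eligible = completes + partials + refusals + non_contacts + other_elig

    response_rate    = completes / eligible
    cooperation_rate = completes / (completes + partials + refusals + other_elig)
    refusal_rate     = refusals / eligible
    contact_rate     = (completes + partials + refusals + other_elig) / eligible

    print(f"response    {response_rate:.0%}")
    print(f"cooperation {cooperation_rate:.0%}")
    print(f"refusal     {refusal_rate:.0%}")
    print(f"contact     {contact_rate:.0%}")
    ```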

  1. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit

  2. Connecting Research to Teaching: Using Data to Motivate the Use of Empirical Sampling Distributions

    Science.gov (United States)

    Lee, Hollylynne S.; Starling, Tina T.; Gonzalez, Marggie D.

    2014-01-01

    Research shows that students often struggle with understanding empirical sampling distributions. Using hands-on and technology-based models and simulations of problems generated by real data helps students begin to make connections between repeated sampling, sample size, distribution, variation, and center. A task to assist teachers in implementing…
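
    A short simulation makes the connections described above (repeated sampling, sample size, variation, and center) concrete. The sketch below draws repeated samples from a skewed stand-in population and summarizes the empirical sampling distribution of the mean; the population and the sample sizes are arbitrary choices for illustration, not the article's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    population = rng.exponential(scale=10.0, size=100_000)   # skewed stand-in for "real data"

    def sampling_distribution_of_mean(n, reps=5_000):
        """Means of `reps` repeated random samples of size n drawn from the population."""
        samples = rng.choice(population, size=(reps, n), replace=True)
        return samples.mean(axis=1)

    for n in (5, 30, 100):
        means = sampling_distribution_of_mean(n)
        # The center stays near the population mean; the spread (standard error) shrinks with n.
        print(f"n={n:3d}: mean of sample means={means.mean():6.2f}, SD of sample means={means.std(ddof=1):5.2f}")
    ```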

  3. Attitude of A Sample of Iranian Researchers toward The Future of Stem Cell Research.

    Science.gov (United States)

    Lotfipanah, Mahdi; Azadeh, Fereydoon; Totonchi, Mehdi; Omani-Samani, Reza

    2018-10-01

    Stem cells, which have unlimited proliferation potential as well as differentiation potency, are considered a promising future treatment for incurable diseases. The aim of the present study is to evaluate the future trend of stem cell research from researchers' viewpoints. This was a cross-sectional descriptive study of researchers involved in stem cell research at Royan Institute. We designed a questionnaire using a qualitative study based on expert opinion and a literature review. Content validity was assessed using three rounds of the Delphi method with experts. Face validity was undertaken by a Persian literature expert and a graphics designer. The questionnaire was distributed among 150 researchers involved in stem cell studies in Royan Institute biology laboratories. We collected 138 completed questionnaires. The mean age of participants was 31.13 ± 5.8 years; most (60.9%) were female. Participants (76.1%) considered the budget to be the most important issue in stem cell research, 79.7% saw a need for financial support from the government, and 77.5% felt that charities could contribute substantially to stem cell research. A total of 90.6% of participants stated that stem cell research should lead to commercial use, which could in turn support future research (86.2%). According to 92.8% of the participants, the aim of stem cell research is to improve the health status of society. At present, among cell types, the greatest importance was attached to cord blood and adult stem cells. Researchers emphasized the importance of mesenchymal stem cells (MSCs) rather than hematopoietic stem cells (HSCs, 57.73%). Cancer was given the highest priority as a direction for stem cell research, whereas skin research was given the least. Regenerative medicine is considered the future of stem cell research, with emphasis on the application of these cells, especially in cancer treatment. Copyright© by Royan Institute. All rights reserved.

  4. (I Can’t Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research

    Science.gov (United States)

    2017-01-01

    I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: “random chance,” which is based on probability sampling, “minimal information,” which yields at least one new code per sampling step, and “maximum information,” which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario. PMID:28746358

  5. (I Can't Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research.

    Science.gov (United States)

    van Rijnsoever, Frank J

    2017-01-01

    I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: "random chance," which is based on probability sampling, "minimal information," which yields at least one new code per sampling step, and "maximum information," which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario.
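
    The "random chance" scenario described above can be reproduced with a few lines of simulation: information sources are drawn at random and sampling stops once every code in the population has been observed at least once. The sketch below does exactly that for a hypothetical population; the number of codes and their observation probabilities are invented, and the minimal- and maximum-information scenarios from the article are not implemented here.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    n_codes = 30
    code_probs = rng.uniform(0.05, 0.5, size=n_codes)   # probability each source yields a code

    def sources_until_saturation(max_sources=1_000):
        """Draw sources at random until every code has been observed at least once."""
        seen = np.zeros(n_codes, dtype=bool)
        for k in range(1, max_sources + 1):
            seen |= rng.random(n_codes) < code_probs
            if seen.all():
                return k
        return max_sources

    sizes = [sources_until_saturation() for _ in range(2_000)]
    print(f"median sample size to saturation: {int(np.median(sizes))}")
    print(f"90th percentile: {int(np.percentile(sizes, 90))}")
    ```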

  6. Research recruitment: A marketing framework to improve sample representativeness in health research.

    Science.gov (United States)

    Howcutt, Sarah J; Barnett, Anna L; Barbosa-Boucas, Sofia; Smith, Lesley A

    2018-04-01

    This discussion paper proposes a five-part theoretical framework to inform recruitment strategies. The framework is based on a marketing model of consumer decision-making. Respondents in surveys are typically healthier than non-respondents, which has an impact on the availability of information about those most in need. Previous research has identified response patterns, provided theories about why people participate in research and evaluated different recruitment strategies. Social marketing has been applied successfully to recruitment and promotes focus on the needs of the participant, but little attention has been paid to the periods before and after participant-researcher contact (during advertising and following completion of studies). We propose a new model which conceptualises participation as a decision involving motivation, perception of information, attitude formation, integration of intention and action, and finally evaluation and sharing of experience. Discussion paper. This discussion paper presents a critical review. No literature was excluded on the basis of date, and the included citations span the years 1981-2017. The proposed framework suggests that researchers could engage a broader demographic if they shape research design and advertising to perform functions that participants are seeking to achieve. The framework provides a novel and useful conceptualisation of recruitment which could help to inform public engagement in research design, researcher training and research policy. This framework challenges researchers to investigate the goals of the potential participants when designing a study's advertising and procedures. © 2017 John Wiley & Sons Ltd.

  7. Sampling in interview-based qualitative research: A theoretical and practical guide

    OpenAIRE

    Robinson, Oliver

    2014-01-01

    Sampling is central to the practice of qualitative methods, but compared with data collection and analysis, its processes are discussed relatively little. A four-point approach to sampling in qualitative interview-based research is presented and critically discussed in this article, which integrates theory and process for the following: (1) Defining a sample universe, by way of specifying inclusion and exclusion criteria for potential participation; (2) Deciding upon a sample size, through th...

  8. Samples in applied psychology: over a decade of research in review.

    Science.gov (United States)

    Shen, Winny; Kiger, Thomas B; Davies, Stacy E; Rasch, Rena L; Simon, Kara M; Ones, Deniz S

    2011-09-01

    This study examines sample characteristics of articles published in Journal of Applied Psychology (JAP) from 1995 to 2008. At the individual level, the overall median sample size over the period examined was approximately 173, which is generally adequate for detecting the average magnitude of effects of primary interest to researchers who publish in JAP. Samples using higher units of analyses (e.g., teams, departments/work units, and organizations) had lower median sample sizes (Mdn ≈ 65), yet were arguably robust given typical multilevel design choices of JAP authors despite the practical constraints of collecting data at higher units of analysis. A substantial proportion of studies used student samples (~40%); surprisingly, median sample sizes for student samples were smaller than working adult samples. Samples were more commonly occupationally homogeneous (~70%) than occupationally heterogeneous. U.S. and English-speaking participants made up the vast majority of samples, whereas Middle Eastern, African, and Latin American samples were largely unrepresented. On the basis of study results, recommendations are provided for authors, editors, and readers, which converge on 3 themes: (a) appropriateness and match between sample characteristics and research questions, (b) careful consideration of statistical power, and (c) the increased popularity of quantitative synthesis. Implications are discussed in terms of theory building, generalizability of research findings, and statistical power to detect effects. PsycINFO Database Record (c) 2011 APA, all rights reserved

  9. Neutron activation analysis of bulk samples from Chinese ancient porcelain to provenance research

    International Nuclear Information System (INIS)

    Jian Zhu; Wentao Hao; Jianming Zhen; Tongxiu Zhen; Glascock, M.D.

    2013-01-01

    Neutron activation analysis (NAA) is an important technique to determine the provenance of ancient ceramics. The most common technique used for preparing ancient samples for NAA is to grind them into a powder and then encapsulate them before neutron irradiation. Unfortunately, ceramic materials are typically very hard, making it a challenge to grind them into a powder. In this study we utilize bulk porcelain samples cut from ancient shards. The bulk samples are irradiated by neutrons alongside samples that have been conventionally ground into a powder. The NAA results for both the bulk samples and the powders are compared and shown to provide equivalent information regarding their chemical composition. Multivariate statistical methods were also applied to the analytical data to check their consistency. The findings suggest that NAA results are less dependent on the state of the porcelain sample, and thus bulk samples cut from shards may be used to effectively determine their provenance. (author)

  10. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

    Full Text Available The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy for estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of probabilistic methods which can be used within marketing research and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When the simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.
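
    The procedure the paper implements with random number tables and Microsoft Excel, selecting units so that every member of the frame has an equal chance of inclusion, can be sketched in a few lines of Python; the frame of 5,000 IDs and the sample size of 200 are hypothetical values used only to show the mechanics.

    ```python
    import random

    # Hypothetical sampling frame: 5,000 customer IDs numbered 1..5000.
    frame = list(range(1, 5_001))

    random.seed(42)                       # fixed seed so the draw can be reproduced and audited
    sample = random.sample(frame, k=200)  # every unit has an equal chance of selection, no repeats

    print(sorted(sample)[:10])            # first few selected IDs
    ```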

  11. Breaking Free of Sample Size Dogma to Perform Innovative Translational Research

    Science.gov (United States)

    Bacchetti, Peter; Deeks, Steven G.; McCune, Joseph M.

    2011-01-01

    Innovative clinical and translational research is often delayed or prevented by reviewers’ expectations that any study performed in humans must be shown in advance to have high statistical power. This supposed requirement is not justifiable and is contradicted by the reality that increasing sample size produces diminishing marginal returns. Studies of new ideas often must start small (sometimes even with an N of 1) because of cost and feasibility concerns, and recent statistical work shows that small sample sizes for such research can produce more projected scientific value per dollar spent than larger sample sizes. Renouncing false dogma about sample size would remove a serious barrier to innovation and translation. PMID:21677197
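
    The diminishing marginal returns mentioned above are easy to see by computing power across a range of sample sizes. The sketch below uses the standard normal approximation for a two-sided, two-sample comparison of means; the effect size, standard deviation, and alpha are illustrative choices, not values from the article.

    ```python
    from scipy.stats import norm

    def power_two_sample(n_per_group, delta=0.5, sigma=1.0, alpha=0.05):
        """Approximate power of a two-sided two-sample test of means (normal approximation)."""
        se = sigma * (2.0 / n_per_group) ** 0.5
        z_crit = norm.ppf(1 - alpha / 2)
        return norm.cdf(delta / se - z_crit)

    # Each doubling of n buys less and less additional power.
    for n in (10, 20, 40, 80, 160, 320):
        print(f"n per group = {n:3d}: power = {power_two_sample(n):.2f}")
    ```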

  12. [Formal sample size calculation and its limited validity in animal studies of medical basic research].

    Science.gov (United States)

    Mayer, B; Muche, R

    2013-01-01

    Animal studies are highly relevant for basic medical research, although their usage is discussed controversially in public. Thus, an optimal sample size for these projects should be aimed at from a biometrical point of view. Statistical sample size calculation is usually the appropriate methodology in planning medical research projects. However, required information is often not valid or only available during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.

  13. Experience-Sampling Research Methods and Their Potential for Education Research

    Science.gov (United States)

    Zirkel, Sabrina; Garcia, Julie A.; Murphy, Mary C.

    2015-01-01

    Experience-sampling methods (ESM) enable us to learn about individuals' lives in context by measuring participants' feelings, thoughts, actions, context, and/or activities as they go about their daily lives. By capturing experience, affect, and action "in the moment" and with repeated measures, ESM approaches allow researchers…

  14. Approaches to sampling and case selection in qualitative research: examples in the geography of health.

    Science.gov (United States)

    Curtis, S; Gesler, W; Smith, G; Washburn, S

    2000-04-01

    This paper focuses on the question of sampling (or selection of cases) in qualitative research. Although the literature includes some very useful discussions of qualitative sampling strategies, the question of sampling often seems to receive less attention in methodological discussion than questions of how data is collected or is analysed. Decisions about sampling are likely to be important in many qualitative studies (although it may not be an issue in some research). There are varying accounts of the principles applicable to sampling or case selection. Those who espouse 'theoretical sampling', based on a 'grounded theory' approach, are in some ways opposed to those who promote forms of 'purposive sampling' suitable for research informed by an existing body of social theory. Diversity also results from the many different methods for drawing purposive samples which are applicable to qualitative research. We explore the value of a framework suggested by Miles and Huberman [Miles, M., Huberman, A., 1994. Qualitative Data Analysis, Sage, London], to evaluate the sampling strategies employed in three examples of research by the authors. Our examples comprise three studies which respectively involve selection of: 'healing places'; rural places which incorporated national anti-malarial policies; young male interviewees, identified as either chronically ill or disabled. The examples are used to show how in these three studies the (sometimes conflicting) requirements of the different criteria were resolved, as well as the potential and constraints placed on the research by the selection decisions which were made. We also consider how far the criteria Miles and Huberman suggest seem helpful for planning 'sample' selection in qualitative research.

  15. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, work sampling methods used are diverse making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002-2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002-2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.

  16. Feasibility studies on large sample neutron activation analysis using a low power research reactor

    International Nuclear Information System (INIS)

    Gyampo, O.

    2008-06-01

    Instrumental neutron activation analysis (INAA) using the Ghana Research Reactor-1 (GHARR-1) can be directly applied to samples with masses in grams. Sample weights were in the range of 0.5 g to 5 g. Therefore, the representativeness of the sample is improved, as well as the sensitivity. Irradiation of samples was done using a low power research reactor. The correction for the neutron self-shielding within the sample is determined from measurement of the neutron flux depression just outside the sample. Correction for gamma ray self-attenuation in the sample was performed via linear attenuation coefficients derived from transmission measurements. Quantitative and qualitative analysis of data were done using gamma ray spectrometry (HPGe detector). The results of this study on the possibilities of large sample NAA using a miniature neutron source reactor (MNSR) show clearly that the Ghana Research Reactor-1 (GHARR-1) at the National Nuclear Research Institute (NNRI) can be used for analyses of samples of up to 5 grams (5 g) using the pneumatic transfer systems.
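
    The gamma-ray self-attenuation correction described above is commonly derived from a transmission measurement: the transmitted fraction gives the linear attenuation coefficient, which in turn gives the correction factor. The sketch below shows that chain of arithmetic for the simplest case of a uniform slab sample; the slab geometry, transmission value, and thickness are assumptions for illustration and are not the GHARR-1 procedure.

    ```python
    import math

    def attenuation_coefficient(transmission: float, thickness_cm: float) -> float:
        """Linear attenuation coefficient mu (1/cm) from a transmission measurement
        T = I/I0 through a sample of known thickness: T = exp(-mu * t)."""
        return -math.log(transmission) / thickness_cm

    def self_attenuation_correction(mu: float, thickness_cm: float) -> float:
        """Multiplicative correction for gamma-ray self-attenuation in a uniform slab:
        the emitted gammas are attenuated on average by (1 - exp(-mu*t)) / (mu*t)."""
        x = mu * thickness_cm
        return x / (1.0 - math.exp(-x))

    # Hypothetical example: 60% of gammas transmitted through a 2 cm thick sample.
    mu = attenuation_coefficient(0.60, 2.0)
    corr = self_attenuation_correction(mu, 2.0)
    print(f"mu = {mu:.3f} 1/cm, self-attenuation correction factor = {corr:.3f}")
    ```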

  17. JSC Advanced Curation: Research and Development for Current Collections and Future Sample Return Mission Demands

    Science.gov (United States)

    Fries, M. D.; Allen, C. C.; Calaway, M. J.; Evans, C. A.; Stansbery, E. K.

    2015-01-01

    Curation of NASA's astromaterials sample collections is a demanding and evolving activity that supports valuable science from NASA missions for generations, long after the samples are returned to Earth. For example, NASA continues to loan hundreds of Apollo program samples to investigators every year and those samples are often analyzed using instruments that did not exist at the time of the Apollo missions themselves. The samples are curated in a manner that minimizes overall contamination, enabling clean, new high-sensitivity measurements and new science results over 40 years after their return to Earth. As our exploration of the Solar System progresses, upcoming and future NASA sample return missions will return new samples with stringent contamination control, sample environmental control, and Planetary Protection requirements. Therefore, an essential element of a healthy astromaterials curation program is a research and development (R&D) effort that characterizes and employs new technologies to maintain current collections and enable new missions - an Advanced Curation effort. JSC's Astromaterials Acquisition & Curation Office is continually performing Advanced Curation research, identifying and defining knowledge gaps about research, development, and validation/verification topics that are critical to support current and future NASA astromaterials sample collections. The following are highlighted knowledge gaps and research opportunities.

  18. Research on stored biological samples: views of African American and White American cancer patients.

    Science.gov (United States)

    Pentz, Rebecca D; Billot, Laurent; Wendler, David

    2006-04-01

    Proposals on consent for research with biological samples should be informed by empirical studies of individuals' views. Studies to date queried mostly white research subjects. The aim of this study was to compare the views of two groups of patients: cancer patients at a university clinic (Winship Cancer Institute at Emory Healthcare) and cancer patients at an inner city county hospital (Grady) who were given the option of tissue banking. Overall, 315/452 (70%) patients completed the survey. The Grady cohort was 86% African American; the Winship cohort was 82% White. The vast majority (95%) of individuals in both cohorts agreed to provide a biological sample for future research. Both cohorts were willing for their samples to be used to study cancer and other diseases, including Alzheimer disease. Few participants preferred to control the disease to be studied (10%) or wished to be contacted again for consent for each future research project (11%). In our sample, almost all clinical patients, regardless of site of care, ethnicity or socioeconomic status, were willing to provide a biological sample for research purposes and allow investigators to determine the research to be done without contacting the patients again. These findings support the recommendation to offer individuals a simplified consent with a one-time binary choice whether to provide biological samples for future research. Copyright 2006 Wiley-Liss, Inc.

  19. Life Science Research Sample Transfer Technology for On Orbit Analysis, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — With retirement of the space shuttle program, microgravity researchers can no longer count on bringing experiment samples back to earth for post-flight analysis....

  20. A Systematic Review of Gay, Lesbian, and Bisexual Research Samples in Couple and Family Therapy Journals.

    Science.gov (United States)

    Hartwell, Erica E; Serovich, Julianne M; Reed, Sandra J; Boisvert, Danielle; Falbo, Teresa

    2017-07-01

    The purpose of this study is to review samples from research on gay, lesbian, and bisexual (GLB) issues and to evaluate the suitability of this body of research to support affirmative and evidence-based practice with GLB clients. The authors systematically reviewed the sampling methodology and sample composition of GLB-related research. All original, quantitative articles focusing on GLB issues published in couple and family therapy (CFT)-related journals since 1975 were coded (n = 153). Results suggest that within the GLB literature base there is some evidence of heterocentrism as well as neglect of issues of class, race, and gender. Suggestions are provided to improve the diversity and representativeness of samples, and thus the clinical implications, of GLB-related research in the CFT literature. © 2017 American Association for Marriage and Family Therapy.

  1. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
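
    The randomization tests proposed above replace the random-sampling assumption with random reshuffling of group labels. A minimal sketch of such a test for a difference in means follows; the two "ward" samples are invented convenience-sample data used only to show the mechanics.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def permutation_test(group_a, group_b, n_perm=10_000):
        """Two-sided randomization test for a difference in means.
        The p-value is the proportion of label reshuffles that produce a
        difference at least as extreme as the observed one."""
        observed = abs(np.mean(group_a) - np.mean(group_b))
        pooled = np.concatenate([group_a, group_b])
        n_a = len(group_a)
        count = 0
        for _ in range(n_perm):
            rng.shuffle(pooled)
            diff = abs(pooled[:n_a].mean() - pooled[n_a:].mean())
            count += diff >= observed
        return (count + 1) / (n_perm + 1)

    # Hypothetical convenience-sample data: symptom scores in two hospital wards.
    ward_1 = np.array([12, 15, 14, 10, 13, 18, 11, 16], dtype=float)
    ward_2 = np.array([17, 19, 14, 20, 16, 21, 18, 15], dtype=float)
    print(f"p = {permutation_test(ward_1, ward_2):.3f}")
    ```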

  2. Sampling Methods and the Accredited Population in Athletic Training Education Research

    Science.gov (United States)

    Carr, W. David; Volberding, Jennifer

    2009-01-01

    Context: We describe methods of sampling the widely-studied, yet poorly defined, population of accredited athletic training education programs (ATEPs). Objective: There are two purposes to this study: first, to describe the incidence and types of sampling methods used in athletic training education research, and second, to clearly define the…

  3. Research and application of sampling and analysis method of sodium aerosol

    International Nuclear Information System (INIS)

    Yu Xiaochen; Guo Qingzhou; Wen Ximeng

    1998-01-01

    A method for sampling and analysis of sodium aerosol is investigated. Vacuum sampling technology is used in the sampling process, and the analysis methods adopted are volumetric analysis and atomic absorption. When the absolute content of sodium is in the range of 0.1 mg to 1.0 mg, the deviation of results between volumetric analysis and atomic absorption is less than 2%. The method has been applied successfully in a sodium aerosol removal device. The analysis range, accuracy and precision can meet the requirements for researching sodium aerosol.

  4. Utilization of the National Inpatient Sample for abdominal aortic aneurysm research.

    Science.gov (United States)

    Dua, Anahita; Ali, Fadwa; Traudt, Elizabeth; Desai, Sapan S

    2017-10-01

    Large administrative databases, including the Medicare database by the Centers for Medicare and Medicaid Services, the National Surgical Quality Improvement Project database sponsored by the American College of Surgeons, and the National Inpatient Sample, have been used by major public health agencies for years. More recently, medical researchers have turned to database research to power studies on diseases that are noted to be relatively scarce. This study aimed to review and discuss the utilization of the National Inpatient Sample for abdominal aortic aneurysm research, inclusive of its advantages, disadvantages, and best practices. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Publications and geothermal sample library facilities of the Earth Science Laboratory, University of Utah Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Phillip M.; Ruth, Kathryn A.; Langton, David R.; Bullett, Michael J.

    1990-03-30

    The Earth Science Laboratory of the University of Utah Research Institute has been involved in research in geothermal exploration and development for the past eleven years. Our work has resulted in the publication of nearly 500 reports, which are listed in this document. Over the years, we have collected drill chip and core samples from more than 180 drill holes in geothermal areas, and most of these samples are available to others for research, exploration and similar purposes. We hope that scientists and engineers involved in industrial geothermal development will find our technology transfer and service efforts helpful.

  6. Comparing two sampling methods to engage hard-to-reach communities in research priority setting.

    Science.gov (United States)

    Valerio, Melissa A; Rodriguez, Natalia; Winkler, Paula; Lopez, Jaime; Dennison, Meagen; Liang, Yuanyuan; Turner, Barbara J

    2016-10-28

    Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: 1) snowball sampling, a chain-referral method or 2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention as well as priorities from both communities' stakeholders on mean ratings of their ideas based on importance and feasibility for implementation in their community. Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85 %) consented, 52 (95 %) attended the first meeting, and 36 (65 %) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90 %) consented, 36 (58 %) attended the first meeting, and 26 (42 %) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P < 0.05). Despite differing recruitment strategies, stakeholders from the two communities identified largely similar ideas for research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045) which was higher for the purposive/convenience sampling group and for city improvements/transportation services (P = 0.004) which was higher for the snowball sampling group. In each of the two similar hard-to-reach communities, a community advisory board partnered with researchers

  7. Comparing two sampling methods to engage hard-to-reach communities in research priority setting

    Directory of Open Access Journals (Sweden)

    Melissa A. Valerio

    2016-10-01

    Full Text Available Abstract Background Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. Methods In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: 1) snowball sampling, a chain-referral method, or 2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention as well as priorities from both communities’ stakeholders on mean ratings of their ideas based on importance and feasibility for implementation in their community. Results Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85 %) consented, 52 (95 %) attended the first meeting, and 36 (65 %) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90 %) consented, 36 (58 %) attended the first meeting, and 26 (42 %) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P < 0.05). Despite differing recruitment strategies, stakeholders from the two communities identified largely similar ideas for research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045), which was higher for the purposive/convenience sampling group, and for city improvements/transportation services (P = 0.004), which was higher for the snowball sampling group.

  8. Consistency of the MLE under mixture models

    OpenAIRE

    Chen, Jiahua

    2016-01-01

    The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...

  9. Improving the quality of biomarker discovery research: the right samples and enough of them.

    Science.gov (United States)

    Pepe, Margaret S; Li, Christopher I; Feng, Ziding

    2015-06-01

    Biomarker discovery research has yielded few biomarkers that validate for clinical use. A contributing factor may be poor study designs. The goal in discovery research is to identify a subset of potentially useful markers from a large set of candidates assayed on case and control samples. We recommend the PRoBE design for selecting samples. We propose sample size calculations that require specifying: (i) a definition for biomarker performance; (ii) the proportion of useful markers the study should identify (Discovery Power); and (iii) the tolerable number of useless markers amongst those identified (False Leads Expected, FLE). We apply the methodology to a study of 9,000 candidate biomarkers for risk of colon cancer recurrence where a useful biomarker has positive predictive value ≥ 30%. We find that 40 patients with recurrence and 160 without recurrence suffice to filter out 98% of useless markers (2% FLE) while identifying 95% of useful biomarkers (95% Discovery Power). Alternative methods for sample size calculation required more assumptions. Biomarker discovery research should utilize quality biospecimen repositories and include sample sizes that enable markers meeting prespecified performance characteristics for well-defined clinical applications to be identified. The scientific rigor of discovery research should be improved. ©2015 American Association for Cancer Research.

  10. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  11. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
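
    For a continuous outcome in a two-group design, the five components listed above collapse into a single textbook formula. The sketch below computes the per-group sample size from an assumed effect size, standard deviation, alpha, and power using the normal approximation; all input values are hypothetical and the formula is the generic one, not a calculation from the article.

    ```python
    import math
    from scipy.stats import norm

    def n_per_group(delta, sigma, alpha=0.05, power=0.80):
        """A priori sample size per group for a two-sided comparison of two means,
        using the standard normal-approximation formula."""
        z_alpha = norm.ppf(1 - alpha / 2)
        z_beta = norm.ppf(power)
        return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

    # Hypothetical inputs: continuous outcome, parallel two-group design,
    # expected difference of 5 units, standard deviation of 12 units.
    print(n_per_group(delta=5, sigma=12))   # -> 91 per group
    ```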

  12. Reproducibility of preclinical animal research improves with heterogeneity of study samples

    Science.gov (United States)

    Vogt, Lucile; Sena, Emily S.; Würbel, Hanno

    2018-01-01

    Single-laboratory studies conducted under highly standardized conditions are the gold standard in preclinical animal research. Using simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer, we compared the accuracy of effect size estimates between single-laboratory and multi-laboratory study designs. Single-laboratory studies generally failed to predict effect size accurately, and larger sample sizes rendered effect size estimates even less accurate. By contrast, multi-laboratory designs including as few as 2 to 4 laboratories increased coverage probability by up to 42 percentage points without a need for larger sample sizes. These findings demonstrate that within-study standardization is a major cause of poor reproducibility. More representative study samples are required to improve the external validity and reproducibility of preclinical animal research and to prevent wasting animals and resources for inconclusive research. PMID:29470495
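
    The design comparison described above can be mimicked with a small simulation: a fixed total sample is either run in one laboratory or split across several, with the true effect allowed to vary between laboratories. The sketch below reports the root-mean-square error of the pooled effect estimate under each design; all parameter values (effect size, between-laboratory heterogeneity, sample sizes) are invented for illustration and are not the authors' simulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    true_effect    = 0.5    # population-average treatment effect
    between_lab_sd = 0.3    # how much the true effect varies across laboratories
    within_sd      = 1.0    # outcome SD within a laboratory
    total_per_arm  = 12     # total animals per arm, split across the participating labs

    def study_estimate(n_labs):
        """Pooled treatment-effect estimate from a study run in `n_labs` laboratories,
        keeping the total number of animals roughly fixed."""
        per_lab = max(total_per_arm // n_labs, 2)
        lab_effects = rng.normal(true_effect, between_lab_sd, size=n_labs)
        diffs = []
        for eff in lab_effects:
            treated = rng.normal(eff, within_sd, size=per_lab)
            control = rng.normal(0.0, within_sd, size=per_lab)
            diffs.append(treated.mean() - control.mean())
        return float(np.mean(diffs))

    for n_labs in (1, 2, 4):
        estimates = np.array([study_estimate(n_labs) for _ in range(5_000)])
        rmse = np.sqrt(np.mean((estimates - true_effect) ** 2))
        print(f"{n_labs} lab(s): RMSE of the pooled effect estimate = {rmse:.3f}")
    ```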

  13. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
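
    Clustering coded qualitative data as described above amounts to grouping participants by their binary code profiles. The sketch below runs hierarchical clustering on a simulated 50-participant, 12-code matrix using a Jaccard distance; the data and the choice of two clusters are illustrative assumptions, and K-means or latent class analysis could be substituted, as the article discusses.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(6)

    # Hypothetical coded interviews: 50 participants x 12 binary codes
    # (1 = the code was applied to that participant's transcript).
    profile_a = rng.random((25, 12)) < np.array([.8] * 6 + [.1] * 6)   # one latent profile
    profile_b = rng.random((25, 12)) < np.array([.1] * 6 + [.8] * 6)   # a contrasting profile
    codes = np.vstack([profile_a, profile_b]).astype(int)

    # Hierarchical clustering on a binary-appropriate distance (Jaccard),
    # then cut the dendrogram into two clusters.
    distances = pdist(codes, metric="jaccard")
    tree = linkage(distances, method="average")
    labels = fcluster(tree, t=2, criterion="maxclust")

    print(np.bincount(labels)[1:])   # number of participants assigned to each cluster
    ```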

  14. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  15. Twenty-year trends of authorship and sampling in applied biomechanics research.

    Science.gov (United States)

    Knudson, Duane

    2012-02-01

    This study documented the trends in authorship and sampling in applied biomechanics research published in the Journal of Applied Biomechanics and ISBS Proceedings. Original research articles of the 1989, 1994, 1999, 2004, and 2009 volumes of these serials were reviewed, excluding reviews, modeling papers, technical notes, and editorials. Compared to the 1989 volumes, the mean number of authors per paper significantly increased in the 2009 volumes (by 35% and 100% for the two serials, respectively), along with increased rates of hyperauthorship and a decline in rates of single authorship. Sample sizes varied widely across papers and did not appear to have changed since 1989.

  16. IMPROVEMENT OF METHODS FOR HYDROBIOLOGICAL RESEARCH AND MODIFICATION OF STANDARD TOOLS FOR SAMPLE COLLECTION

    Directory of Open Access Journals (Sweden)

    M. M. Aligadjiev

    2015-01-01

    Full Text Available Aim. The paper discusses the improvement of hydrobiological research methods by modifying tools for collecting plankton and benthic samples. Methods. In order to improve the standard methods of hydrobiological research, we have developed tools for sampling zooplankton and the benthic environment of the Caspian Sea. Results. Long-term practice of collecting hydrobiological samples in the Caspian Sea shows that the sampling tools used to collect hydrobiological material need to be modernized. With the introduction of the invasive Azov and Black Sea comb jelly Mnemiopsis leidyi A. Agassiz into the Caspian Sea, there is a need to collect plankton samples without disturbing their integrity. Tools for collecting benthic fauna do not always give a complete picture of the state of benthic ecosystems because of the lack of visual site selection for sampling. Moreover, while sampling by dredge there is a probable loss of samples, especially in areas with difficult terrain. Conclusion. We propose to modify a small model of the Upstein net (applied in shallow water) to collect zooplankton samples with an upper inverted cone, which will significantly improve the catchability of the net in the Caspian Sea. The bottom sampler can be improved by installing a video camera for visual inspection of the bottom topography and by using sensors to determine the tilt of the dredge and the position of the valves of the bucket.

  17. Are samples drawn from Mechanical Turk valid for research on political ideology?

    Directory of Open Access Journals (Sweden)

    Scott Clifford

    2015-12-01

    Full Text Available Amazon’s Mechanical Turk (MTurk is an increasingly popular tool for the recruitment of research subjects. While there has been much focus on the demographic differences between MTurk samples and the national public, we know little about whether liberals and conservatives recruited from MTurk share the same psychological dispositions as their counterparts in the mass public. In the absence of such evidence, some have argued that the selection process involved in joining MTurk invalidates the subject pool for studying questions central to political science. In this paper, we evaluate this claim by comparing a large MTurk sample to two benchmark national samples – one conducted online and one conducted face-to-face. We examine the personality and value-based motivations of political ideology across the three samples. All three samples produce substantively identical results with only minor variation in effect sizes. In short, liberals and conservatives in our MTurk sample closely mirror the psychological divisions of liberals and conservatives in the mass public, though MTurk liberals hold more characteristically liberal values and attitudes than liberals from representative samples. Overall, our results suggest that MTurk is a valid recruitment tool for psychological research on political ideology.

  18. Recruiting a representative sample in adherence research-The MALT multisite prospective cohort study experience.

    Science.gov (United States)

    Shemesh, Eyal; Mitchell, Jeffrey; Neighbors, Katie; Feist, Susan; Hawkins, Andre; Brown, Amanda; Wanrong, Yin; Anand, Ravinder; Stuber, Margaret L; Annunziato, Rachel A

    2017-12-01

    Medication adherence is an important determinant of transplant outcomes. Attempts to investigate adherence are frequently undermined by selection bias: It is very hard to recruit and retain non-adherent patients in research efforts. This manuscript presents recruitment strategies and results from the MALT (Medication Adherence in children who had a Liver Transplant) multisite prospective cohort study. MALT sites recruited 400 pediatric liver transplant patients who agreed to be followed for 2 years. The primary purpose was to determine whether a marker of adherence, the Medication Level Variability Index (MLVI), predicts rejection outcomes. The present manuscript describes methods used in MALT to ensure that a representative sample was recruited, and presents detailed recruitment results. MALT sites were able to recruit a nationally representative sample, as determined by a comparison between the MALT cohort and a national sample of transplant recipients. Strategies that helped ensure that the sample was representative included monitoring of the outcome measure in comparison with a national sample, drastically limiting patient burden, and specific recruitment methods. We discuss the importance of a representative sample in adherence research and recommend that future efforts to study adherence pay special attention to sample characteristics. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. Peripheral biomarkers revisited: integrative profiling of peripheral samples for psychiatric research.

    Science.gov (United States)

    Hayashi-Takagi, Akiko; Vawter, Marquis P; Iwamoto, Kazuya

    2014-06-15

    Peripheral samples, such as blood and skin, have been used for decades in psychiatric research as surrogates for central nervous system samples. Although the validity of the data obtained from peripheral samples has been questioned and other state-of-the-art techniques, such as human brain imaging, genomics, and induced pluripotent stem cells, seem to reduce the value of peripheral cells, accumulating evidence has suggested that revisiting peripheral samples is worthwhile. Here, we re-evaluate the utility of peripheral samples and argue that establishing an understanding of the common signaling and biological processes in the brain and peripheral samples is required for the validity of such models. First, we present an overview of the available types of peripheral cells and describe their advantages and disadvantages. We then briefly summarize the main achievements of omics studies, including epigenome, transcriptome, proteome, and metabolome analyses, as well as the main findings of functional cellular assays, the results of which imply that alterations in neurotransmission, metabolism, the cell cycle, and the immune system may be partially responsible for the pathophysiology of major psychiatric disorders such as schizophrenia. Finally, we discuss the future utility of peripheral samples for the development of biomarkers and tailor-made therapies, such as multimodal assays that are used as a battery of disease and trait pathways and that might be potent and complementary tools for use in psychiatric research. © 2013 Society of Biological Psychiatry. Published by Society of Biological Psychiatry. All rights reserved.

  20. Research on self-absorption corrections for laboratory γ spectral analysis of soil samples

    International Nuclear Information System (INIS)

    Tian Zining; Jia Mingyan; Li Huibin; Cheng Ziwei; Ju Lingjun; Shen Maoquan; Yang Xiaoyan; Yan Ling; Fen Tiancheng

    2010-01-01

    Based on the calibration results of point sources, the dimensions of the HPGe crystal were characterized. Linear attenuation coefficients and detection efficiencies were calculated for all kinds of samples, and the self-absorption correction function F(μ) was established for the φ75 mm x 25 mm sample geometry. A standard surface source was used to simulate sources at different heights in the soil sample, and the function ε(h), which relates detection efficiency to the height of the surface source, was determined. The detection efficiency of a calibration source can then be obtained by integration; the F(μ) functions established for soil samples are consistent with the results of MCNP calculations. Several φ75 mm x 25 mm soil samples, including samples from an atmospheric nuclear testing field, were measured with the HPGe spectrometer, and the function F(μ) was used to correct for self-absorption. F(μ) functions for soil samples of various dimensions can also be calculated with the MCNP code and the corresponding self-absorption corrections applied; to verify the calculated results, φ75 mm x 75 mm soil samples were measured. The technical method used to correct soil samples from unknown areas is also given. The surface-source correction method greatly improves the accuracy of gamma-spectrometric measurements and can be widely applied to environmental radioactivity investigations. (authors)
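
    As a hedged numerical sketch of how a height-dependent efficiency and a self-absorption factor combine, the snippet below averages an assumed ε(h) over the sample height and applies the textbook slab self-absorption factor (1 − e^(−μt))/(μt); neither the functional form nor the numbers are taken from the paper.

```python
# Hedged numerical sketch (not the authors' code): combines an assumed
# height-dependent detection efficiency eps(h) with the standard slab
# self-absorption factor (1 - exp(-mu*t)) / (mu*t) to estimate the effective
# full-sample efficiency of a 25 mm thick soil sample.
import numpy as np

t_cm = 2.5    # sample height for the phi 75 mm x 25 mm geometry
mu = 0.35     # assumed linear attenuation coefficient, 1/cm

def eps(h_cm):
    # assumed smooth fall-off of efficiency with source height above the detector
    return 0.05 * np.exp(-0.2 * h_cm)

heights = np.linspace(0.0, t_cm, 200)
height_averaged_eff = np.trapz(eps(heights), heights) / t_cm
self_absorption = (1.0 - np.exp(-mu * t_cm)) / (mu * t_cm)

print(f"height-averaged efficiency  : {height_averaged_eff:.4f}")
print(f"self-absorption factor F(mu): {self_absorption:.3f}")
print(f"effective efficiency        : {height_averaged_eff * self_absorption:.4f}")
```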

  1. Race and Research Methods Anxiety in an Undergraduate Sample: The Potential Effects of Self-Perception

    Science.gov (United States)

    Eckberg, Deborah A.

    2015-01-01

    This study explores race as a potential predictor of research methods anxiety among a sample of undergraduates. While differences in academic achievement based on race and ethnicity have been well documented, few studies have examined racial differences in anxiety with regard to specific subject matter in undergraduate curricula. This exploratory…

  2. Differences between Internet samples and conventional samples of men who have sex with men: implications for research and HIV interventions.

    Science.gov (United States)

    Ross, M W; Tikkanen, R; Månsson, S A

    2000-09-01

    The Internet is becoming a new erotic oasis for obtaining sex online or in person. We reviewed the literature on cybersex and compared differences in data from samples of homosexually active men obtained on identical questionnaires from a conventional written questionnaire, distributed through the mailing and contact lists of a large national gay organization in Sweden, and through the same organization's website and chat room. A total of 716 written questionnaires and 678 Internet questionnaires were obtained. The Internet sample was younger, more likely to live in small towns or cities, live with parents or a girlfriend, and have lower formal education. They were less likely to have previous sexual experience solely with other men (one in three of the Internet sample vs. one in 14 of the written sample defined themselves as bisexual) and more likely to visit erotic oases such as bathhouses, video clubs and erotic movie houses. They also visited Internet chat rooms more frequently (86% of the Internet sample vs. 50% of the written sample). One third of the Internet sample wanted the opportunity to talk with an expert about HIV compared with a quarter of the written sample. Sexual practices between the two samples were generally similar, although the Internet sample reported significantly less body contact, kissing, hugging, mutual masturbation, and more condom use for anal intercourse with steady partners. Over four times as many of the Internet sample reported sex with women in the past year as the written sample. These data indicate that Internet data collection is feasible and that this mode of data collection, despite the nonrandom and self-selected nature of both types of samples, is likely to be more significantly oriented toward the young, geographically more isolated, and more behaviorally and self-identified bisexual respondent than conventionally distributed written questionnaires.

  3. Ethics and law in research with human biological samples: a new approach.

    Science.gov (United States)

    Petrini, Carlo

    2014-01-01

    During the last century a large number of documents (regulations, ethical codes, treatises, declarations, conventions) were published on the subject of ethics and clinical trials, many of them focusing on the protection of research participants. More recently, various proposals have been put forward to relax some of the constraints imposed on research by these documents and regulations. It is important to distinguish between risks deriving from direct interventions on human subjects and other types of risk. In Italy, the Data Protection Authority has acted on the question of research using previously collected health data and biological samples, simplifying the procedures regarding informed consent. The new approach may be of help to other researchers working outside Italy.

  4. Genomic research with human samples. Points of view from scientists and research subjects about disclosure of results and risks of genomic research. Ethical and empirical approach.

    Science.gov (United States)

    Valle Mansilla, José Ignacio

    2011-01-01

    Biomedical researchers now often ask subjects to donate samples to be deposited in biobanks. This is not only of interest to researchers; patients and society as a whole can benefit from the improvements in diagnosis, treatment, and prevention that the advent of genomic medicine portends. However, there is a growing debate regarding the social and ethical implications of creating biobanks and using stored human tissue samples for genomic research. Our aim was to identify factors related to both scientists' and patients' preferences regarding the sort of information to convey to subjects about the results of the study and the risks related to genomic research. The method used was a survey addressed to 204 scientists and 279 donors from the U.S. and Spain. In this sample, researchers had already published genomic epidemiology studies, and research subjects had actually volunteered to donate a human sample for genomic research. Concerning the results, patients more frequently than scientists supported their right to know individual results from future genomic research. These differences were statistically significant after adjusting for the opportunity to receive genetic research results from the research in which they had previously participated and for their perception of risks regarding genetic information compared to other clinical data. A slight majority of researchers supported informing participants about individual genomic results only if the reliability and clinical validity of the information had been established. Men were more likely than women to believe that patients should be informed of research results even if these conditions were not met. Among patients, almost half would always prefer to be informed about individual results from future genomic research. The three main factors associated with higher support for non-limited access to individual results were: being from the US, having previously been offered individual information, and considering

  5. Gel dosimetry - a laser based 3D scanner for gel samples - research in India

    Energy Technology Data Exchange (ETDEWEB)

    Widmer, Johannes [Institut fuer Angewandte Photophysik, TU Dresden (Germany); Photonics Division, VIT University, Vellore, Tamil Nadu (India); Dhiviyaraj Kalaiselven, Senthil Kumar [Photonics Division, VIT University, Vellore, Tamil Nadu (India); Department of Therapeutic Radiology, University of Minnesota, Minneapolis (United States); James, Jebaseelan Samuel [Photonics Division, VIT University, Vellore, Tamil Nadu (India)

    2013-07-01

    A laser-based 3D scanner was developed to take tomography images of partly transparent samples. The scanner is optimized to characterize gel samples from spatially resolved dosimetry measurements. The device was designed so that it could be constructed in India. This gave me valuable insight into the scientific and technological environment of the country and required me to find my way through a quite different culture of research and commerce, within and beyond the scientific context of the university. The project was implemented during a nine-month stay at the Vellore Institute of Technology University in Vellore, Tamil Nadu, India, in co-operation with the Christian Medical College, Vellore, in 2006/07. It was conducted within the framework of existing research activities of the host university.

  6. Conducting Internet Research With the Transgender Population: Reaching Broad Samples and Collecting Valid Data

    OpenAIRE

    Miner, Michael H.; Bockting, Walter O.; Romine, Rebecca Swinburne; Raman, Sivakumaran

    2011-01-01

    Health research on transgender people has been hampered by the challenges inherent in studying a hard-to-reach, relatively small, and geographically dispersed population. The Internet has the potential to facilitate access to transgender samples large enough to permit examination of the diversity and syndemic health disparities found among this population. In this article, we describe the experiences of a team of investigators using the Internet to study HIV risk behaviors of transgender peop...

  7. Radiological air monitoring and sample analysis research and development progress report

    International Nuclear Information System (INIS)

    1992-01-01

    Sponsored by a Department of Energy (DOE) research and development grant, State of Idaho INEL Oversight Program (OP) personnel designed an independent air monitoring system that provides detection of priority airborne contaminants potentially migrating beyond INEL boundaries. Initial locations for off-site ambient air monitoring stations were chosen based on DOE and NOAA reports, Mesodif modeling, a review of the relevant literature, and communication with private contractors and experts in pertinent fields. Idaho State University (ISU) has initiated an Environmental Monitoring Program (EMP). The EMP provides an independent monitoring function as well as a training ground for students. Students learn research techniques dedicated to environmental studies and learn analytical skills and rules of compliance related to monitoring. ISU-EMP assisted OP in specific aspects of identifying optimum permanent monitoring station locations, and in selecting appropriate sample collection equipment for each station. The authorization to establish, prepare and install sampling devices on selected sites was obtained by OP personnel in conjunction with ISU-EMP personnel. All samples described in this program are collected by OP or ISU-EMP personnel and returned to ISU for analysis. This report summarizes the results of those samples collected and analyzed for radioactivity during 1992.

  8. Biological samples positioning device for irradiations on a radial channel at the nuclear research reactor

    International Nuclear Information System (INIS)

    Rodriguez Gual, Maritza; Mas Milian, Felix; Deppman, Airton; Pinto Coelho, Paulo Rogerio

    2010-01-01

    To meet the demand for an experimental device for positioning biological samples for irradiation on a radial channel of an operating nuclear research reactor, a device was constructed and commissioned that allows biological samples to be placed in and removed from the irradiation channels without interrupting the operation of the reactor. An economic evaluation was carried out by comparison with another type of device with the same functions. This work formed part of an international project between Cuba and Brazil that undertook the study of the damage induced by various types of ionizing radiation in DNA molecules. The proposed solution was tested experimentally, demonstrating the practical validity of the device. As a result of this work, the experimental device for the irradiation of biological samples has been installed and operating in radial beam hole No. 3 (BH3) of the Brazilian IEA-R1 research reactor for more than five years, in accordance with the requirements specified for the device. The designed device considerably increases the range of studies that can be conducted in this reactor. Its practical application to research taking place in that facility, in fields such as radiobiology and dosimetry, is immediate.

  9. Consistent Quantum Theory

    Science.gov (United States)

    Griffiths, Robert B.

    2001-11-01

    Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. Comprehensive account; written by one of the main figures in the field; paperback edition of successful work on philosophy of quantum mechanics.

  10. Validating internet research: a test of the psychometric equivalence of internet and in-person samples.

    Science.gov (United States)

    Meyerson, Paul; Tryon, Warren W

    2003-11-01

    This study evaluated the psychometric equivalency of Web-based research. The Sexual Boredom Scale was presented via the World-Wide Web along with five additional scales used to validate it. A subset of 533 participants that matched a previously published sample (Watt & Ewing, 1996) on age, gender, and race was identified. An 8 x 8 correlation matrix from the matched Internet sample was compared via structural equation modeling with a similar 8 x 8 correlation matrix from the previously published study. The Internet and previously published samples were psychometrically equivalent. Coefficient alpha values calculated on the matched Internet sample yielded reliability coefficients almost identical to those for the previously published sample. Factors such as computer administration and uncontrollable administration settings did not appear to affect the results. Demographic data indicated an overrepresentation of males by about 6% and Caucasians by about 13% relative to the U.S. Census (2000). A total of 2,230 participants were obtained in about 8 months without remuneration. These results suggest that data collection on the Web is (1) reliable, (2) valid, (3) reasonably representative, (4) cost effective, and (5) efficient.
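
    A minimal illustration of the reliability comparison reported above is sketched below: a small Cronbach's alpha function applied to two simulated item-response matrices standing in for the Internet and previously published samples; the data and item counts are invented, not the Sexual Boredom Scale data.

```python
# Minimal Cronbach's alpha computation, illustrating the kind of reliability
# comparison reported above; the item-response matrices here are simulated.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(size=(533, 1))                               # common trait
web_sample = latent + rng.normal(scale=0.8, size=(533, 10))      # 10 items
paper_sample = latent[:300] + rng.normal(scale=0.8, size=(300, 10))

print(f"alpha (web sample)  : {cronbach_alpha(web_sample):.2f}")
print(f"alpha (paper sample): {cronbach_alpha(paper_sample):.2f}")
```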

  11. RANKED SET SAMPLING FOR ECOLOGICAL RESEARCH: ACCOUNTING FOR THE TOTAL COSTS OF SAMPLING, BY MODE, CONQUEST, AND MARKER. (R825173)

    Science.gov (United States)

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  12. Research on pathogens at Great Lakes beaches: sampling, influential factors, and potential sources

    Science.gov (United States)

    ,

    2013-01-01

    The overall mission of this work is to provide science-based information and methods that will allow beach managers to more accurately make beach closure and advisory decisions, understand the sources and physical processes affecting beach contaminants, and understand how science-based information can be used to mitigate and restore beaches and protect the public. The U.S. Geological Survey (USGS), in collaboration with many Federal, State, and local agencies and universities, has conducted research on beach health issues in the Great Lakes Region for more than a decade. The work consists of four science elements that align with the USGS Beach Health Initiative Mission: real-time assessments of water quality; coastal processes; pathogens and source tracking; and data analysis, interpretation, and communication. The ongoing or completed research for the pathogens and source tracking topic is described in this fact sheet.

  13. Addressing Underrepresentation in Sex Work Research: Reflections on Designing a Purposeful Sampling Strategy.

    Science.gov (United States)

    Bungay, Vicky; Oliffe, John; Atchison, Chris

    2016-06-01

    Men, transgender people, and those working in off-street locales have historically been underrepresented in sex work health research. Failure to include all sections of sex worker populations precludes comprehensive understandings about a range of population health issues, including potential variations in the manifestation of such issues within and between population subgroups, which in turn can impede the development of effective services and interventions. In this article, we describe our attempts to define, determine, and recruit a purposeful sample for a qualitative study examining the interrelationships between sex workers' health and the working conditions in the Vancouver off-street sex industry. Detailed is our application of ethnographic mapping approaches to generate information about population diversity and work settings within distinct geographical boundaries. Bearing in mind the challenges and the overwhelming discrimination sex workers experience, we scope recommendations for safe and effective purposeful sampling inclusive of sex workers' heterogeneity. © The Author(s) 2015.

  14. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

    Problem solving of physics concepts through consistent argumentation can improve students' thinking skills and is important in science. The study aims to assess the consistency of students' argumentation about fluids. The population of this study consists of college students from PGRI Madiun, UIN Sunan Kalijaga Yogyakarta, and Lampung University. Using cluster random sampling, a sample of 145 students was obtained. The study used a descriptive survey method. Data were obtained through a multiple-choice test and reasoned interviews. The fluid problems were modified from [9] and [1]. The results show average rates of correct consistency, incorrect consistency, and inconsistency of 4.85%, 29.93%, and 65.23%, respectively. These data point to a lack of understanding of the fluid material; ideally, fully consistent argumentation supports an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies aimed at obtaining a positive change in the consistency of argumentation.

  15. Feasibility, internal consistency and covariates of TICS-m (telephone interview for cognitive status-modified) in a population-based sample: findings from the KORA-Age study.

    Science.gov (United States)

    Lacruz, Me; Emeny, Rt; Bickel, H; Linkohr, B; Ladwig, Kh

    2013-09-01

    Test the feasibility of the modified telephone interview for cognitive status (TICS-m) as a screening tool to detect cognitive impairment in a population-based sample of older subjects. Data were collected from 3,578 participants, age 65-94 years, of the KORA-Age study. We used analysis of covariance to test for significant sex, age and educational differences in raw TICS-m scores. Internal consistency was analysed by assessing Cronbach's alpha. Correction for education years was undertaken, and participants were divided into three subgroups following validated cut-offs. Finally, a logistic regression was performed to determine the impact of sex on cognition subgroups. Internal consistency of the TICS-m was 0.78. Study participants needed approximately 5.4 min to complete the interview. Lower raw TICS-m scores were associated with male sex, older age and lower education (all p values statistically significant). After correction for education years, 2,851 participants (79%) had a non-impaired cognitive status (score > 31). Male sex was independently associated with having a score equal to or below 27 and 31 (OR = 1.9, 95% CI 1.4-2.5 and OR = 1.5, 95% CI 1.2-1.7, respectively). The TICS-m is a feasible questionnaire for community-dwelling older adults with normal cognitive function or moderate cognitive impairment. Lower cognitive performance was associated with being a man, being older, and having fewer years of formal education. Copyright © 2012 John Wiley & Sons, Ltd.
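
    The snippet below sketches the kind of logistic regression reported above, estimating odds ratios (with 95% confidence intervals) for male sex on a binary cognitive-impairment indicator; the data frame is simulated and the variable names are placeholders, not the KORA-Age data or model specification.

```python
# Illustrative logistic regression of impairment status on sex and age,
# mirroring the type of odds-ratio analysis reported above; simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 3578
df = pd.DataFrame({"male": rng.integers(0, 2, n),
                   "age": rng.uniform(65, 94, n)})
logit_p = -2.0 + 0.6 * df["male"] + 0.03 * (df["age"] - 65)
df["impaired"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("impaired ~ male + age", data=df).fit(disp=0)
odds_ratios = np.exp(model.params)          # exponentiated coefficients = ORs
conf_int = np.exp(model.conf_int())         # 95% CIs on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```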

  16. Conducting Internet Research With the Transgender Population: Reaching Broad Samples and Collecting Valid Data.

    Science.gov (United States)

    Miner, Michael H; Bockting, Walter O; Romine, Rebecca Swinburne; Raman, Sivakumaran

    2012-05-01

    Health research on transgender people has been hampered by the challenges inherent in studying a hard-to-reach, relatively small, and geographically dispersed population. The Internet has the potential to facilitate access to transgender samples large enough to permit examination of the diversity and syndemic health disparities found among this population. In this article, we describe the experiences of a team of investigators using the Internet to study HIV risk behaviors of transgender people in the United States. We developed an online instrument, recruited participants exclusively via websites frequented by members of the target population, and collected data using online quantitative survey and qualitative synchronous and asynchronous interview methods. Our experiences indicate that the Internet environment presents the investigator with some unique challenges and that commonly expressed criticisms about Internet research (e.g., lack of generalizable samples, invalid study participants, and multiple participation by the same subject) can be overcome with careful method design, usability testing, and pilot testing. The importance of both usability and pilot testing are described with respect to participant engagement and retention and the quality of data obtained online.

  17. Conducting Internet Research With the Transgender Population: Reaching Broad Samples and Collecting Valid Data

    Science.gov (United States)

    Miner, Michael H.; Bockting, Walter O.; Romine, Rebecca Swinburne; Raman, Sivakumaran

    2013-01-01

    Health research on transgender people has been hampered by the challenges inherent in studying a hard-to-reach, relatively small, and geographically dispersed population. The Internet has the potential to facilitate access to transgender samples large enough to permit examination of the diversity and syndemic health disparities found among this population. In this article, we describe the experiences of a team of investigators using the Internet to study HIV risk behaviors of transgender people in the United States. We developed an online instrument, recruited participants exclusively via websites frequented by members of the target population, and collected data using online quantitative survey and qualitative synchronous and asynchronous interview methods. Our experiences indicate that the Internet environment presents the investigator with some unique challenges and that commonly expressed criticisms about Internet research (e.g., lack of generalizable samples, invalid study participants, and multiple participation by the same subject) can be overcome with careful method design, usability testing, and pilot testing. The importance of both usability and pilot testing are described with respect to participant engagement and retention and the quality of data obtained online. PMID:24031157

  18. Evaluation applications of instrument calibration research findings in psychology for very small samples

    Science.gov (United States)

    Fisher, W. P., Jr.; Petry, P.

    2016-11-01

    Many published research studies document item calibration invariance across samples using Rasch's probabilistic models for measurement. A new approach to outcomes evaluation for very small samples was employed for two workshop series focused on stress reduction and joyful living conducted for health system employees and caregivers since 2012. Rasch-calibrated self-report instruments measuring depression, anxiety and stress, and the joyful living effects of mindfulness behaviors were identified in peer-reviewed journal articles. Items from one instrument were modified for use with a US population, other items were simplified, and some new items were written. Participants provided ratings of their depression, anxiety and stress, and the effects of their mindfulness behaviors before and after each workshop series. The numbers of participants providing both pre- and post-workshop data were low (16 and 14). Analysis of these small data sets produces results showing that, with some exceptions, the item hierarchies defining the constructs retained the same invariant profiles they had exhibited in the published research (correlations (not disattenuated) ranged from 0.85 to 0.96). In addition, comparisons of the pre- and post-workshop measures for the three constructs showed substantively and statistically significant changes. Implications for program evaluation comparisons, quality improvement efforts, and the organization of communications concerning outcomes in clinical fields are explored.

  19. Reviewing the research methods literature: principles and strategies illustrated by a systematic overview of sampling in qualitative research.

    Science.gov (United States)

    Gentles, Stephen J; Charles, Cathy; Nicholas, David B; Ploeg, Jenny; McKibbon, K Ann

    2016-10-11

    Overviews of methods are potentially useful means to increase clarity and enhance collective understanding of specific methods topics that may be characterized by ambiguity, inconsistency, or a lack of comprehensiveness. This type of review represents a distinct literature synthesis method, although to date, its methodology remains relatively undeveloped despite several aspects that demand unique review procedures. The purpose of this paper is to initiate discussion about what a rigorous systematic approach to reviews of methods, referred to here as systematic methods overviews, might look like by providing tentative suggestions for approaching specific challenges likely to be encountered. The guidance offered here was derived from experience conducting a systematic methods overview on the topic of sampling in qualitative research. The guidance is organized into several principles that highlight specific objectives for this type of review given the common challenges that must be overcome to achieve them. Optional strategies for achieving each principle are also proposed, along with discussion of how they were successfully implemented in the overview on sampling. We describe seven paired principles and strategies that address the following aspects: delimiting the initial set of publications to consider, searching beyond standard bibliographic databases, searching without the availability of relevant metadata, selecting publications on purposeful conceptual grounds, defining concepts and other information to abstract iteratively, accounting for inconsistent terminology used to describe specific methods topics, and generating rigorous verifiable analytic interpretations. Since a broad aim in systematic methods overviews is to describe and interpret the relevant literature in qualitative terms, we suggest that iterative decision making at various stages of the review process, and a rigorous qualitative approach to analysis, are necessary features of this review type.

  20. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Subsequently the verification of consistency of these diagrams is needed in order to identify errors in requirements at the early stage of the development process. The verification of consistency is difficult due to a semi-formal nature of UML diagrams. We propose automatic verification of consistency of the series of UML diagrams originating from abstract models implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to generate automatically complete workflow applications from consistent and complete models developed from abstract models (e.g. Business Context Diagram). Therefore, our method can be used to check practicability (feasibility) of software architecture models.

  1. Bitcoin Meets Strong Consistency

    OpenAIRE

    Decker, Christian; Seidel, Jochen; Wattenhofer, Roger

    2014-01-01

    The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...

  2. Consistent classical supergravity theories

    International Nuclear Information System (INIS)

    Muller, M.

    1989-01-01

    This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included

  3. Consistency of orthodox gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)

    1997-01-01

    A recent proposal for quantizing gravity is investigated for self consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of their argument to more complex systems.

  4. Quasiparticles and thermodynamical consistency

    International Nuclear Information System (INIS)

    Shanenko, A.A.; Biro, T.S.; Toneev, V.D.

    2003-01-01

    A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are found in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)

  5. Determination of 7Be in soil samples by gamma spectrometry for erosion research

    International Nuclear Information System (INIS)

    Esquivel, Alexander D.; Kastner, Geraldo F.; Amaral, Angela M.; Monteiro, Roberto Pellacani G.; Moreira, Rubens M.

    2015-01-01

    Cosmogenic 7Be is a natural radiotracer produced in the stratosphere and troposphere that reaches the Earth's surface via wet and dry fallout; hence its measurement is very significant for research on soil erosion. The 7Be radioanalysis based on gamma spectrometry has been a routine methodology for decades, and although it is the reference procedure it is not free of analytical interference. 7Be is a β-γ emitting radionuclide (Eγ = 477.59 keV, T½ = 53.12 d) and, depending on the chemical profile of the soil, its determination is susceptible to interference from 228Ac (Eγ = 478.40 keV, T½ = 6.15 h). The aim of this work was to establish an analytical protocol for the determination of 7Be in soil samples from the Juatuba-MG region, in different sampling periods of the dry and rainy seasons, for erosion studies, and to establish methodologies for evaluating and correcting the level of 228Ac interference in 7Be activity measurements by gamma spectrometry. (author)
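
    One standard way to handle such a peak overlap, sketched below under assumed values, is to estimate the 228Ac contribution at 478 keV from an interference-free 228Ac line and subtract it; the emission probabilities and efficiencies used are placeholders and are not taken from this work.

```python
# Hedged sketch of an interference correction for a 478 keV peak: estimate the
# 228Ac contribution from another 228Ac line (here assumed to be 911 keV) and
# subtract it from the measured doublet. All numerical values are placeholders.
counts_478 = 1250.0     # net counts in the 477.6/478.4 keV doublet (example)
counts_911 = 3400.0     # net counts in the assumed 228Ac reference peak

p_478_ac, p_911_ac = 0.002, 0.258   # assumed 228Ac emission probabilities
eff_478, eff_911 = 0.045, 0.028     # assumed detector efficiencies at each energy

ac_contribution = counts_911 * (p_478_ac / p_911_ac) * (eff_478 / eff_911)
be7_counts = counts_478 - ac_contribution

print(f"estimated 228Ac counts at 478 keV: {ac_contribution:.1f}")
print(f"corrected 7Be net counts         : {be7_counts:.1f}")
```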

  6. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples

    Directory of Open Access Journals (Sweden)

    Rígel Licier

    2016-10-01

    Full Text Available The proper handling of samples to be analyzed by mass spectrometry (MS can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronchoalveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  7. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples.

    Science.gov (United States)

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-10-17

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronchoalveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  8. Experience sampling methodology in mental health research: new insights and technical developments.

    Science.gov (United States)

    Myin-Germeys, Inez; Kasanova, Zuzana; Vaessen, Thomas; Vachon, Hugo; Kirtley, Olivia; Viechtbauer, Wolfgang; Reininghaus, Ulrich

    2018-06-01

    In the mental health field, there is a growing awareness that the study of psychiatric symptoms in the context of everyday life, using experience sampling methodology (ESM), may provide a powerful and necessary addition to more conventional research approaches. ESM, a structured self-report diary technique, allows the investigation of experiences within, and in interaction with, the real-world context. This paper provides an overview of how zooming in on the micro-level of experience and behaviour using ESM adds new insights and additional perspectives to standard approaches. More specifically, it discusses how ESM: a) contributes to a deeper understanding of psychopathological phenomena, b) allows to capture variability over time, c) aids in identifying internal and situational determinants of variability in symptomatology, and d) enables a thorough investigation of the interaction between the person and his/her environment and of real-life social interactions. Next to improving assessment of psychopathology and its underlying mechanisms, ESM contributes to advancing and changing clinical practice by allowing a more fine-grained evaluation of treatment effects as well as by providing the opportunity for extending treatment beyond the clinical setting into real life with the development of ecological momentary interventions. Furthermore, this paper provides an overview of the technical details of setting up an ESM study in terms of design, questionnaire development and statistical approaches. Overall, although a number of considerations and challenges remain, ESM offers one of the best opportunities for personalized medicine in psychiatry, from both a research and a clinical perspective. © 2018 World Psychiatric Association.

  9. Double sampling with multiple imputation to answer large sample meta-research questions: Introduction and illustration by evaluating adherence to two simple CONSORT guidelines

    Directory of Open Access Journals (Sweden)

    Patrice L. Capers

    2015-03-01

    Full Text Available BACKGROUND: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high throughput methods (e.g., search heuristics, crowdsourcing) has improved feasibility of large meta-research questions, but possibly at the cost of accuracy. OBJECTIVE: To evaluate the use of double sampling combined with multiple imputation (DS+MI) to address meta-research questions, using as an example adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. METHODS: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT; human; abstract available; and English language (n=322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower rigor, higher throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher rigor, lower throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. RESULTS: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title=1.00, abstract=0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS+MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by Year: subsample RHITLO 1.050-1.174 vs. DS+MI 1.082-1.151). As evidence of improved accuracy, DS+MI coefficient estimates were closer to RHITLO than the large sample RLOTHI. CONCLUSIONS: Our results support our hypothesis that DS+MI would result in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of
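
    The sketch below illustrates the core DS+MI idea under simulated data: a high-rigor rating observed only in a random subsample is modeled from the high-throughput rating and then multiply imputed for the full corpus; the model, sample sizes, and error rates are assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of the double-sampling-plus-
# multiple-imputation idea: the high-rigor rating (RHITLO) is observed only
# in a small random subsample and is imputed in the large sample from the
# high-throughput rating (RLOTHI). All data below are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
N, n_sub, n_imputations = 20000, 500, 20

# high-throughput rating, available for every entry in the large sample
rlothi = rng.integers(0, 2, N)
# "true" high-rigor rating; the heuristic over-calls compliance in this toy setup
rhitlo_true = (rng.random(N) < np.where(rlothi == 1, 0.75, 0.05)).astype(int)
sub_idx = rng.choice(N, n_sub, replace=False)   # double-sampled subsample

# model the high-rigor rating from the low-rigor one using the subsample only
model = LogisticRegression().fit(rlothi[sub_idx].reshape(-1, 1), rhitlo_true[sub_idx])
p_compliant = model.predict_proba(rlothi.reshape(-1, 1))[:, 1]

# multiple imputation: draw the unobserved ratings repeatedly, keep observed ones
estimates = []
for _ in range(n_imputations):
    imputed = (rng.random(N) < p_compliant).astype(int)
    imputed[sub_idx] = rhitlo_true[sub_idx]
    estimates.append(imputed.mean())

print(f"pooled DS+MI compliance estimate: {np.mean(estimates):.3f}")
print(f"naive high-throughput estimate  : {rlothi.mean():.3f}")
print(f"true rate (known only in a simulation): {rhitlo_true.mean():.3f}")
```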

  10. Consumption of new psychoactive substances in a Spanish sample of research chemical users.

    Science.gov (United States)

    González, Débora; Ventura, Mireia; Caudevilla, Fernando; Torrens, Marta; Farre, Magi

    2013-07-01

    To determine the pattern of use of new psychoactive substances (NPSs) in a Spanish sample of research chemical (RC) users and to characterize in greater depth the RC user profile and the risk-reduction strategies employed. This study is a cross-sectional survey by means of a specific questionnaire. Recruitment was carried out at music festivals, at non-governmental organizations (NGOs), and through announcements on an online forum. Two RC user profiles were defined, according to whether they search for information through online forums. A total of 230 users participated. The most frequent RCs were hallucinogenic phenethylamines (2C-B 80.0%, 2C-I 39.6%) and cathinones (methylone 40.1%, mephedrone 35.2%). The most frequent combination of RC with other illegal drugs was with cannabis (68.6%) and 2C-B with MDMA (28.3%). Subjects who consult drug forums (group 1) use more RCs, obtain RCs via the Internet, and more frequently use risk-prevention strategies. Regarding the risk-reduction strategies in this group, users sought information concerning RCs before consuming them (100%), used precision scales to calculate dosage (72.3%), and analyzed the contents before consumption (68.8%). There is a specific RC user profile with extensive knowledge and consumption of substances, using different strategies to reduce the risks associated with their consumption. Copyright © 2013 John Wiley & Sons, Ltd.

  11. Consistency in PERT problems

    OpenAIRE

    Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan

    2016-01-01

    The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.

  12. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and Stakeholder Relations focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...

  13. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  14. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    Science.gov (United States)

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  15. The Rucio Consistency Service

    CERN Document Server

    Serfon, Cedric; The ATLAS collaboration

    2016-01-01

    One of the biggest challenges for a large-scale data management system is to ensure consistency between the global file catalog and what is physically present on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (aka Dark Data). The system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we present this system, explain its internals and give some results.
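
    A generic sketch of such a catalog-versus-storage comparison is shown below: files present on storage but absent from the catalog are flagged as dark data, and catalogued files missing from storage as lost; the file lists are invented and this is not the Rucio implementation or its API.

```python
# Generic illustration of a catalog/storage consistency check of the kind the
# service performs. The file lists are invented; this is not the Rucio code.
catalog = {"data/file_a.root", "data/file_b.root", "data/file_c.root"}
storage_dump = {"data/file_a.root", "data/file_b.root", "data/file_x.root"}

dark_data = storage_dump - catalog     # unregistered files, candidates for deletion
lost_files = catalog - storage_dump    # candidates for recovery / re-replication

print("dark data  :", sorted(dark_data))
print("lost files :", sorted(lost_files))
```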

  16. Is cosmology consistent?

    International Nuclear Information System (INIS)

    Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias

    2002-01-01

    We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman Alpha Forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest neutrino mass range 0.04-4.2 eV and the sharpest constraints to date on gravity waves which (together with preference for a slight red-tilt) favor 'small-field' inflation models

  17. Who Are We Studying? Sample Diversity in Teaching of Psychology Research

    Science.gov (United States)

    Richmond, Aaron S.; Broussard, Kristin A.; Sterns, Jillian L.; Sanders, Kristina K.; Shardy, Justin C.

    2015-01-01

    The purpose of the current study was to examine the sample diversity of empirical articles published in four premier teaching of psychology journals from 2008 to 2013. We investigated which demographic information was commonly reported, whether samples were ethnically representative, and whether gender was representative compared to National…

  18. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At present, nearly 40 PIXE subjects in various research fields are pursued here, and more than 50,000 samples have been analyzed to date. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been continuously carried out. In particular, a ''standard-free method for quantitative analysis'' made it possible to analyze infinitesimal samples, powdered samples and untreated bio samples, which could not be analyzed quantitatively in the past. The ''standard-free method'' and a ''powdered internal standard method'' made the target preparation process much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, free of any ambiguity arising from complicated target preparation processes. (author)

  19. The PowerAtlas: a power and sample size atlas for microarray experimental design and research

    Directory of Open Access Journals (Sweden)

    Wang Jelai

    2006-02-01

    Full Text Available Abstract Background Microarrays permit biologists to simultaneously measure the mRNA abundance of thousands of genes. An important issue facing investigators planning microarray experiments is how to estimate the sample size required for good statistical power. What is the projected sample size or number of replicate chips needed to address the multiple hypotheses with acceptable accuracy? Statistical methods exist for calculating power based upon a single hypothesis, using estimates of the variability in data from pilot studies. There is, however, a need for methods to estimate power and/or required sample sizes in situations where multiple hypotheses are being tested, such as in microarray experiments. In addition, investigators frequently do not have pilot data to estimate the sample sizes required for microarray studies. Results To address this challenge, we have developed a Microarray PowerAtlas. The atlas enables estimation of statistical power by allowing investigators to appropriately plan studies by building upon previous studies that have similar experimental characteristics. Currently, there are sample sizes and power estimates based on 632 experiments from the Gene Expression Omnibus (GEO). The PowerAtlas also permits investigators to upload their own pilot data and derive power and sample size estimates from these data. This resource will be updated regularly with new datasets from GEO and other databases such as The Nottingham Arabidopsis Stock Center (NASC). Conclusion This resource provides a valuable tool for investigators who are planning efficient microarray studies and estimating required sample sizes.
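    For orientation, the single-hypothesis calculation mentioned above can be written in a few lines using the textbook normal-approximation formulas for a two-group comparison. This is a generic sketch only, not the PowerAtlas methodology, and the effect size and replicate numbers are illustrative.

        # Generic normal-approximation power / sample size for a two-sample
        # comparison of means (standardized effect size d, n replicates per group).
        # Illustrative only; not the PowerAtlas multiple-hypothesis approach.
        from scipy.stats import norm

        def two_sample_power(d, n_per_group, alpha=0.05):
            z_crit = norm.ppf(1 - alpha / 2)
            return norm.cdf(abs(d) * (n_per_group / 2) ** 0.5 - z_crit)

        def n_per_group_for_power(d, power=0.80, alpha=0.05):
            z_crit, z_pow = norm.ppf(1 - alpha / 2), norm.ppf(power)
            return 2 * ((z_crit + z_pow) / d) ** 2

        print(round(two_sample_power(d=1.0, n_per_group=8), 2))   # ~0.52
        print(round(n_per_group_for_power(d=1.0), 1))             # ~15.7 chips per group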

  20. Sample size calculations for cluster randomised crossover trials in Australian and New Zealand intensive care research.

    Science.gov (United States)

    Arnup, Sarah J; McKenzie, Joanne E; Pilcher, David; Bellomo, Rinaldo; Forbes, Andrew B

    2018-06-01

    The cluster randomised crossover (CRXO) design provides an opportunity to conduct randomised controlled trials to evaluate low-risk interventions in the intensive care setting. Our aim is to provide a tutorial on how to perform a sample size calculation for a CRXO trial, focusing on the meaning of the elements required for the calculations, with application to intensive care trials. We use all-cause in-hospital mortality from the Australian and New Zealand Intensive Care Society Adult Patient Database clinical registry to illustrate the sample size calculations. We show sample size calculations for a two-intervention, two-period (12 months per period), cross-sectional CRXO trial. We provide the formulae, and examples of their use, to determine the number of intensive care units required to detect a risk ratio (RR) with a designated level of power between two interventions for trials in which the elements required for sample size calculations remain constant across all ICUs (unstratified design); and in which there are distinct groups (strata) of ICUs that differ importantly in the elements required for sample size calculations (stratified design). The CRXO design markedly reduces the sample size requirement compared with the parallel-group, cluster randomised design for the example cases. The stratified design further reduces the sample size requirement compared with the unstratified design. The CRXO design enables the evaluation of routinely used interventions that can bring about small, but important, improvements in patient care in the intensive care setting.
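    For orientation, the sketch below computes the number of clusters per arm for the simpler parallel-group cluster randomised design that the abstract uses as a comparator, applying the usual design effect 1 + (m - 1) x ICC to an individually randomised two-proportion calculation. It is not the CRXO formula developed in the paper (which also involves within- and between-period correlations and gives smaller requirements), and the mortality rates, cluster size and ICC below are illustrative assumptions.

        # Clusters per arm for a parallel-group cluster randomised trial comparing
        # two proportions (illustrative comparator calculation only; not the CRXO
        # formulae from the paper).
        from math import ceil
        from scipy.stats import norm

        def clusters_per_arm(p1, p2, m, icc, power=0.80, alpha=0.05):
            z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
            n_ind = z**2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2)**2
            design_effect = 1 + (m - 1) * icc
            return ceil(n_ind * design_effect / m)

        # e.g. detect a fall in in-hospital mortality from 10% to 8% with
        # 600 patients per ICU per period and an ICC of 0.01 (assumed values)
        print(clusters_per_arm(p1=0.10, p2=0.08, m=600, icc=0.01))  # 38 clusters per arm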

  1. On 'Consistent' Poverty

    OpenAIRE

    Rod Hick

    2012-01-01

    The measurement of poverty as ‘consistent’ poverty offers a solution to one of the primary problems of poverty measurement within Social Policy of the last three decades. Often treated as if they were synonymous, ‘indirect’ measures of poverty, such as low income measures, and ‘direct’ measures, such as indices of material deprivation, identify surprisingly different people as being poor. In response to this mismatch, a team of Irish researchers put forward a measure which identified responde...

  2. Recent developments in sample preparation and data pre-treatment in metabonomics research.

    Science.gov (United States)

    Li, Ning; Song, Yi peng; Tang, Huiru; Wang, Yulan

    2016-01-01

    Metabonomics is a powerful approach for biomarker discovery and an effective tool for pinpointing endpoint metabolic effects of external stimuli, such as pathogens and disease development. Due to its wide applications, metabonomics is required to deal with various biological samples of different properties. Hence sample preparation and corresponding data pre-treatment become important factors in ensuring validity of an investigation. In this review, we summarize some recent developments in metabonomics sample preparation and data-pretreatment procedures. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Community understanding of Respondent-Driven Sampling in a medical research setting in Uganda: importance for the use of RDS for public health research.

    Science.gov (United States)

    McCreesh, Nicky; Tarsh, Matilda Nadagire; Seeley, Janet; Katongole, Joseph; White, Richard G

    2013-01-01

    Respondent-driven sampling (RDS) is a widely-used variant of snowball sampling. Respondents are selected not from a sampling frame, but from a social network of existing members of the sample. Incentives are provided for participation and for the recruitment of others. Ethical and methodological criticisms have been raised about RDS. Our purpose was to evaluate whether these criticisms were justified. In this study RDS was used to recruit male household heads in rural Uganda. We investigated community members' understanding and experience of the method, and explored how these may have affected the quality of the RDS survey data. Our findings suggest that because participants recruit participants, the use of RDS in medical research may result in increased difficulties in gaining informed consent, and data collected using RDS may be particularly susceptible to bias due to differences in the understanding of key concepts between researchers and members of the community.

  4. Will Women Diagnosed with Breast Cancer Provide Biological Samples for Research Purposes?

    Directory of Open Access Journals (Sweden)

    Shelley A Harris

    Full Text Available Little is known about the response rates for biological sample donation and attitudes towards control recruitment, especially in younger women. The goals of this pilot study were to determine in women recently diagnosed with breast cancer, the proportion of cases willing to provide biological samples and for purposes of control recruitment, contact information for friends or colleagues.A population-based sample of breast cancer cases (n = 417, 25-74 years was recruited from the Ontario Cancer Registry in 2010 and self-administered questionnaires were completed to determine willingness to provide samples (spot or 24-hr urine, saliva, blood and contact information for friends/colleagues for control recruitment. Using Χ2 analyses of contingency tables we evaluated if these proportions varied by age group (<45 and 45+ and other factors such as ethnicity, education, income, body mass index (BMI, smoking status and alcohol consumption.Cases were willing to provide blood samples, by visiting a clinic (62% or by having a nurse visit the home (61%. Moreover, they would provide saliva (73%, and morning or 24-hr urine samples (66% and 52%. Younger cases (≤45 were 3 times (OR more likely more than older cases to agree to collect morning urine (95% CI: 1.15-8.35. Only 26% of cases indicated they would provide contact information of friends or work colleagues to act as controls. Educated cases were more likely to agree to provide samples, and cases who consumed alcohol were more willing to provide contact information. Ethnicity, income, BMI and smoking had little effect on response rates.Reasonable response rates for biological sample collection should be expected in future case controls studies in younger women, but other methods of control selection must be devised.

  5. Utilization of AHWR critical facility for research and development work on large sample NAA

    International Nuclear Information System (INIS)

    Acharya, R.; Dasari, K.B.; Pujari, P.K.; Swain, K.K.; Reddy, A.V.R.; Verma, S.K.; De, S.K.

    2014-01-01

    The graphite reflector position of the AHWR critical facility (CF) was utilized for analysis of large (g-kg scale) samples using internal mono-standard neutron activation analysis (IM-NAA). The reactor position was characterized by the cadmium ratio method using an In monitor for the total flux and the sub-cadmium to epithermal flux ratio (f). Large sample neutron activation analysis (LSNAA) work was carried out for samples of stainless steel, ancient and new clay potteries and dross. Large as well as non-standard geometry samples (1 g - 0.5 kg) were irradiated. Radioactive assay was carried out using high resolution gamma ray spectrometry. Concentration ratios obtained by IM-NAA were used for a provenance study of 30 clay potteries obtained from excavated Buddhist sites of AP, India. Concentrations of Au and Ag were determined in three large, not-so-homogeneous samples of dross. An X-Z rotary scanning unit has been installed for counting large and not-so-homogeneous samples. (author)

  6. Gay and Bisexual Men's Perceptions of the Donation and Use of Human Biological Samples for Research: A Qualitative Study.

    Directory of Open Access Journals (Sweden)

    Chris Patterson

    Full Text Available Human biological samples (biosamples) are increasingly important in diagnosing, treating and measuring the prevalence of illnesses. For the gay and bisexual population, biosample research is particularly important for measuring the prevalence of human immunodeficiency virus (HIV). By determining people's understandings of, and attitudes towards, the donation and use of biosamples, researchers can design studies to maximise acceptability and participation. In this study we examine gay and bisexual men's attitudes towards donating biosamples for HIV research. Semi-structured telephone interviews were conducted with 46 gay and bisexual men aged between 18 and 63, recruited in commercial gay scene venues in two Scottish cities. Interview transcripts were analysed thematically using the framework approach. Most men interviewed seemed to have given little prior consideration to the issues. Participants were largely supportive of donating tissue for medical research purposes, and often favourable towards samples being stored, reused and shared. Support was often conditional, with common concerns related to: informed consent; the protection of anonymity and confidentiality; the right to withdraw from research; and ownership of samples. Many participants were in favour of the storage and reuse of samples, but expressed concerns related to data security and potential misuse of samples, particularly by commercial organisations. The sensitivity of tissue collection varied between tissue types and collection contexts. Blood, urine, semen and bowel tissue were commonly identified as sensitive, and donating saliva as unlikely to cause discomfort. To our knowledge, this is the first in-depth study of gay and bisexual men's attitudes towards donating biosamples for HIV research. While most men in this study were supportive of donating tissue for research, some clear areas of concern were identified. We suggest that these minority concerns should be accounted

  7. The use of human samples obtained during medicolegal autopsies in research: An introduction to current conditions and initiatives in Japan.

    Science.gov (United States)

    Tsujimura-Ito, Takako; Inoue, Yusuke; Muto, Kaori; Yoshida, Ken-Ichi

    2017-04-01

    Background Leftover samples obtained during autopsies are extremely important basic materials for forensic research. However, there are no established practices for research-related use of obtained samples. Objective This study discusses good practice for the secondary use of samples collected during medicolegal autopsies. Methods A questionnaire was posted to all 76 departments of forensic medicine performing medicolegal autopsies in Japan, and 48 responses were received (response rate: 63.2%). As a secondary analysis, we surveyed information provided on department websites. Results Ethical reviews conducted when samples were to be used for research varied greatly among departments, with 21 (43.8%) departments reporting 'fundamentally, all cases are subject to review', eight (16.7%) reporting 'only some are subject to review' and 17 (39.6%) reporting 'none are subject to review'. Information made available on websites indicated that 11 departments had a statement of some type to bereaved families about the potential research use of human samples obtained during autopsies. Nine of these included a notice stating that bereaved families may revoke their consent for use. Several departments used an opt-out system. Conclusion There is no common practice in the field of legal medicine on the ethical use for medical research of leftover samples from medicolegal autopsies. The trust of not only bereaved families but also society in general is required for the scientific validity and social benefits of medical studies using leftover samples from medicolegal autopsies through the use of opt-out consenting and offline and online dissemination and public-relations activities.

  8. Series: Practical guidance to qualitative research : part 3: sampling, data collection and analysis

    NARCIS (Netherlands)

    Albine Moser; Irene Korstjens

    2017-01-01

    In the course of our supervisory work over the years, we have noticed that qualitative research tends to evoke a lot of questions and worries, so-called frequently asked questions (FAQs). This series of four articles intends to provide novice researchers with practical guidance for

  9. RESEARCH ON THE DEGREE OF SATURATION INVESTIGATION BY THE SAMPLING OF THE SAND FOR LIQUEFACTION

    Science.gov (United States)

    Fujii, Nao; Ohuchi, Masatoshi; Sakai, Katsuo; Nishigaki, Makoto

    A liquefaction countermeasure in which the liquefaction strength is enhanced by keeping the sand deposit in an unsaturated state is currently under study. The authors have suggested a simple method of verifying the persistence of residual air, using undisturbed samples and groundwater sampled at ordinary temperature, and have applied the method to ground adjacent to a viaduct pneumatic-caisson foundation where air that leaked during construction was considered to have been trapped. We demonstrate how to correct for the influence of the sampling pressure on the specimen as well as of the dissolved air, and study the precision of the resulting degree of saturation. As a result, it is shown that the residual air entrapped in the sand deposit can persist for as long as about 28 years.

  10. The feasible research with measuring radon for taking the soils sample

    International Nuclear Information System (INIS)

    Zeng Bing; Ge Liangquan; Liu Hefan; Li Yeqiang; Zhang Jinzhao; Song Xiao'an

    2010-01-01

    The mechanism by which radon separates from soil is explained. Through a designed experiment, the feasibility of measuring radon when taking soil samples is confirmed. The radon content and its distribution were determined indoors and outdoors by means of activated charcoal adsorption, active suction and track etching, together with 226 Ra. The paper indicates that measuring radon when taking soil samples is feasible, and that indoor data are more stable than outdoor data. The temperature, the humidity, the rainfall amount, the intensity and so on strongly influence the data. When taking a soil sample, rain should be avoided as far as possible, as should fault zones, belts of folded strata, regions of complex geologic structure, and so on. (authors)

  11. The prevalence of dementia in a Portuguese community sample: a 10/66 Dementia Research Group study.

    Science.gov (United States)

    Gonçalves-Pereira, Manuel; Cardoso, Ana; Verdelho, Ana; Alves da Silva, Joaquim; Caldas de Almeida, Manuel; Fernandes, Alexandra; Raminhos, Cátia; Ferri, Cleusa P; Prina, A Matthew; Prince, Martin; Xavier, Miguel

    2017-11-07

    Dementia imposes a high burden of disease worldwide. Recent epidemiological studies in European community samples are scarce. In Portugal, community prevalence data is very limited. The 10/66 Dementia Research Group (DRG) population-based research programmes are focused in low and middle income countries, where the assessments proved to be culture and education fair. We applied the 10/66 DRG prevalence survey methodology in Portugal, where levels of illiteracy in older populations are still high. A cross-sectional comprehensive one-phase survey was conducted of all residents aged 65 and over of two geographically defined catchment areas in Southern Portugal (one urban and one rural site). Nursing home residents were not included in the present study. Standardized 10/66 DRG assessments include a cognitive module, an informant interview and the Geriatric Mental State-AGECAT, providing data on dementia diagnosis and subtypes, mental disorders including depression, physical health, anthropometry, demographics, disability/functioning, health service utilization, care arrangements and caregiver strain. We interviewed 1405 old age participants (mean age 74.9, SD = 6.7 years; 55.5% women) after 313 (18.2%) refusals to participate. The prevalence rate for dementia in community-dwellers was 9.23% (95% CI 7.80-10.90) using the 10/66 DRG algorithm and 3.65% (95% CI 2.97-4.97) using DSM-IV criteria. Pure Alzheimer's disease was the most prevalent dementia subtype (41.9%). The prevalence of dementia was strongly age-dependent for both criteria, but there was no association with sex. Dementia prevalence was higher than previously reported in Portugal. The discrepancy between prevalence according to the 10/66 DRG algorithm and the DSM-IV criteria is consistent with that observed in less developed countries; this suggests potential underestimation using the latter approach, although relative validity of these two approaches remains to be confirmed in the European context. We

  12. The prevalence of dementia in a Portuguese community sample: a 10/66 Dementia Research Group study

    Directory of Open Access Journals (Sweden)

    Manuel Gonçalves-Pereira

    2017-11-01

    Full Text Available Abstract Background Dementia imposes a high burden of disease worldwide. Recent epidemiological studies in European community samples are scarce. In Portugal, community prevalence data is very limited. The 10/66 Dementia Research Group (DRG) population-based research programmes are focused in low and middle income countries, where the assessments proved to be culture and education fair. We applied the 10/66 DRG prevalence survey methodology in Portugal, where levels of illiteracy in older populations are still high. Methods A cross-sectional comprehensive one-phase survey was conducted of all residents aged 65 and over of two geographically defined catchment areas in Southern Portugal (one urban and one rural site). Nursing home residents were not included in the present study. Standardized 10/66 DRG assessments include a cognitive module, an informant interview and the Geriatric Mental State-AGECAT, providing data on dementia diagnosis and subtypes, mental disorders including depression, physical health, anthropometry, demographics, disability/functioning, health service utilization, care arrangements and caregiver strain. Results We interviewed 1405 old age participants (mean age 74.9, SD = 6.7 years; 55.5% women) after 313 (18.2%) refusals to participate. The prevalence rate for dementia in community-dwellers was 9.23% (95% CI 7.80–10.90) using the 10/66 DRG algorithm and 3.65% (95% CI 2.97–4.97) using DSM-IV criteria. Pure Alzheimer’s disease was the most prevalent dementia subtype (41.9%). The prevalence of dementia was strongly age-dependent for both criteria, but there was no association with sex. Conclusions Dementia prevalence was higher than previously reported in Portugal. The discrepancy between prevalence according to the 10/66 DRG algorithm and the DSM-IV criteria is consistent with that observed in less developed countries; this suggests potential underestimation using the latter approach, although relative validity of these two

  13. A New Method for Noninvasive Genetic Sampling of Saliva in Ecological Research.

    Directory of Open Access Journals (Sweden)

    Diana Lobo

    Full Text Available Noninvasive samples for genetic analyses have become essential to address ecological questions. Popular noninvasive samples such as faeces contain degraded DNA which may compromise genotyping success. Saliva is an excellent alternative DNA source but scarcity of suitable collection methods makes its use anecdotal in field ecological studies. We develop a noninvasive method of collection that combines baits and porous materials able to capture saliva. We report its potential in optimal conditions, using confined dogs and collecting saliva early after deposition. DNA concentration in saliva extracts was generally high (mean 14 ng/μl). We correctly identified individuals in 78% of samples conservatively using ten microsatellite loci, and 90% of samples using only eight loci. Consensus genotypes closely matched reference genotypes obtained from hair DNA (99% of identification successes and 91% of failures). Mean genotyping effort needed for identification using ten loci was 2.2 replicates. Genotyping errors occurred at a very low frequency (allelic dropout: 2.3%; false alleles: 1.5%). Individual identification success increased with duration of substrate handling inside the dog's mouth and the volume of saliva collected. Low identification success was associated with baits rich in DNA-oxidant polyphenols and DNA concentrations <1 ng/μl. The procedure performed at least as well as other noninvasive methods, and could advantageously allow detection of socially low-ranked individuals underrepresented in sources of DNA that are involved in marking behaviour (faeces or urine). Once adapted and refined, there is promise for this technique to allow potentially high rates of individual identification in ecological field studies requiring noninvasive sampling of wild vertebrates.

  14. Recent Research Status on the Microbes in the Radioactive Waste Disposal and Identification of Aerobic Microbes in a Groundwater Sampled from the KAERI Underground Research Tunnel(KURT)

    International Nuclear Information System (INIS)

    Baik, Min Hoon; Lee, Seung Yeop; Cho, Won Jin

    2006-11-01

    In this report, a comprehensive review is given of the research results and status concerning the various effects of microbes in radioactive waste disposal, including the definition and classification of microbes and research related to waste containers, engineered barriers, natural barriers, natural analogue studies, and radionuclide migration and retardation. Cultivation, isolation and classification of aerobic microbes found in groundwater sampled from the KAERI Underground Research Tunnel (KURT), located at the KAERI site, have been carried out, and over 20 microbes were found to be present in the groundwater. Microbial identification of the selected 10 major aerobic microbes was performed by 16S rDNA genetic analysis, and the identified microbes were characterized.

  15. A call to improve sampling methodology and reporting in young novice driver research.

    Science.gov (United States)

    Scott-Parker, B; Senserrick, T

    2017-02-01

    Young drivers continue to be over-represented in road crash fatalities despite a multitude of research, communication and intervention. Evidence-based improvement depends to a great extent upon research methodology quality and its reporting, with known limitations in the peer-review process. The aim of the current research was to review the scope of research methodologies applied in 'young driver' and 'teen driver' research and their reporting in four peer-review journals in the field between January 2006 and December 2013. In total, 806 articles were identified and assessed. Reporting omissions included participant gender (11% of papers), response rates (49%), retention rates (39%) and information regarding incentives (44%). Greater breadth and specific improvements in study designs and reporting are thereby identified as a means to further advance the field. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  16. Sample Selectivity and the Validity of International Student Achievement Tests in Economic Research. NBER Working Paper No. 15867

    Science.gov (United States)

    Hanushek, Eric A.; Woessmann, Ludger

    2010-01-01

    Critics of international student comparisons argue that results may be influenced by differences in the extent to which countries adequately sample their entire student populations. In this research note, we show that larger exclusion and non-response rates are related to better country average scores on international tests, as are larger…

  17. Research And Establishment Of The Analytical Procedure For/Of Sr-90 In Milk Samples

    International Nuclear Information System (INIS)

    Tran Thi Tuyet Mai; Duong Duc Thang; Nguyen Thi Linh; Bui Thi Anh Duong

    2014-01-01

    Sr-90 is an indicator for the transfer of radionuclides from the environment to humans. This work was set up to build a procedure for Sr-90 determination in the main popular foodstuffs, focusing on fresh milk. The aims were to establish the procedure for Sr-90, to assess the chemical yield, and to test samples of Vietnamese fresh milk; QA and QC for the procedure were carried out using an IAEA standard sample. The procedure for determination of Sr-90 in milk has been completed. The chemical recovery yields for Y-90 and Sr-90 were 46.76% ± 1.25% and 0.78 ± 0.086, respectively. The QA and QC programme was carried out using the reference material IAEA-373; the result agrees well with the certified value. Three reference samples were analysed with 15 measurements. The Sr-90 concentration after statistical processing was 3.69 Bq/kg with an uncertainty of 0.23 Bq/kg. The certified IAEA-154 value for Sr-90 (half-life 28.8 years) is 6.9 Bq/kg, with a 95% confidence interval of (6.0-8.0) Bq/kg at 31 August 1987. After correcting for decay, the activity at the time of analysis is 3.67 Bq/kg, meaning that the result of this work matches the IAEA certified value. Five Vietnamese fresh milk samples were analysed for Sr-90; the specific activity of Sr-90 in milk ranged from 0.032 to 0.041 Bq/l. (author)
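    The decay correction quoted above can be checked with the standard exponential decay law; the elapsed time used below is an assumption, since the exact analysis date is not stated in the abstract.

        # Decay-correct the certified IAEA-154 Sr-90 activity to the (assumed)
        # time of analysis using A = A0 * 2**(-t / T_half).
        def decay_correct(a0, t_half_years, elapsed_years):
            return a0 * 2 ** (-elapsed_years / t_half_years)

        a0 = 6.9        # Bq/kg, certified Sr-90 activity at 31 August 1987
        t_half = 28.8   # years, Sr-90 half-life
        print(round(decay_correct(a0, t_half, elapsed_years=26.2), 2))  # ~3.67 Bq/kg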

  18. Research on How to Remove Efficiently the Condensate Water of Sampling System

    International Nuclear Information System (INIS)

    Cho, SungHwan; Kim, MinSoo; Choi, HoYoung; In, WonHo

    2015-01-01

    Corrosion occurred in the measurement chamber inside the O2 and H2 analyzer, and thus measuring the concentrations of O2 and H2 was not possible. It was confirmed that the condensate water arises from the temperature difference produced when the internal gas of the disposal and degasifier tank is brought into the analyzer. Thus, a heating system was installed inside and outside the gas sampling panel to remove the condensate water generated in the analyzer and piping. For cases where condensate water is not removed by the heating system, a drain port was also installed in the gas sampling panel to collect the condensate water of the sampling system. It was verified that a large volume of condensate water was present in the pipe line during the purging process after installation of the manufactured goods. The condensate water was fully removed by the installed heating cable and drain port. The heating cable was operated constantly at a temperature of 80 to 90 deg. C, which allows precise measurement of the gas concentration and a longer maintenance interval by preventing the condensate water from forming. When installing instruments for measuring gas, such as an O2 and H2 analyzer, consideration of whether condensate water will be present due to the temperature difference between the measuring system and the analyzer is required.

  19. Research on How to Remove Efficiently the Condensate Water of Sampling System

    Energy Technology Data Exchange (ETDEWEB)

    Cho, SungHwan; Kim, MinSoo; Choi, HoYoung; In, WonHo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Corrosion occurred in the measurement chamber inside the O2 and H2 analyzer, and thus measuring the concentrations of O2 and H2 was not possible. It was confirmed that the condensate water arises from the temperature difference produced when the internal gas of the disposal and degasifier tank is brought into the analyzer. Thus, a heating system was installed inside and outside the gas sampling panel to remove the condensate water generated in the analyzer and piping. For cases where condensate water is not removed by the heating system, a drain port was also installed in the gas sampling panel to collect the condensate water of the sampling system. It was verified that a large volume of condensate water was present in the pipe line during the purging process after installation of the manufactured goods. The condensate water was fully removed by the installed heating cable and drain port. The heating cable was operated constantly at a temperature of 80 to 90 deg. C, which allows precise measurement of the gas concentration and a longer maintenance interval by preventing the condensate water from forming. When installing instruments for measuring gas, such as an O2 and H2 analyzer, consideration of whether condensate water will be present due to the temperature difference between the measuring system and the analyzer is required.

  20. Understanding active sampling strategies: Empirical approaches and implications for attention and decision research.

    Science.gov (United States)

    Gottlieb, Jacqueline

    2018-05-01

    In natural behavior we actively gather information using attention and active sensing behaviors (such as shifts of gaze) to sample relevant cues. However, while attention and decision making are naturally coordinated, in the laboratory they have been dissociated. Attention is studied independently of the actions it serves. Conversely, decision theories make the simplifying assumption that the relevant information is given, and do not attempt to describe how the decision maker may learn and implement active sampling policies. In this paper I review recent studies that address questions of attentional learning, cue validity and information seeking in humans and non-human primates. These studies suggest that learning a sampling policy involves large scale interactions between networks of attention and valuation, which implement these policies based on reward maximization, uncertainty reduction and the intrinsic utility of cognitive states. I discuss the importance of using such paradigms for formalizing the role of attention, as well as devising more realistic theories of decision making that capture a broader range of empirical observations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Impact of implementing ISO 9001:2008 standard on the Spanish Renal Research Network biobank sample transfer process.

    Science.gov (United States)

    Cortés, M Alicia; Irrazábal, Emanuel; García-Jerez, Andrea; Bohórquez-Magro, Lourdes; Luengo, Alicia; Ortiz-Arduán, Alberto; Calleros, Laura; Rodríguez-Puyol, Manuel

    2014-01-01

    Biobank certification ISO 9001:2008 aims to improve the management of processes performed. This has two objectives: customer satisfaction and continuous improvement. This paper presents the impact of certification ISO 9001:2008 on the sample transfer process in a Spanish biobank specialising in kidney patient samples. The biobank experienced a large increase in the number of samples between 2009 (12,582 vials) and 2010 (37,042 vials). The biobank of the Spanish Renal Research Network (REDinREN), located at the University of Alcalá, has implemented ISO standard 9001:2008 for the effective management of human material given to research centres. Using surveys, we analysed two periods in the “sample transfer” process. During the first period between 1-10-12 and 26-11-12 (8 weeks), minimal changes were made to correct isolated errors. In the second period, between 7-01-13 and 18-02-13 (6 weeks), we carried out general corrective actions. The identification of problems and implementation of corrective actions for certification allowed: a 70% reduction in the process execution time, a significant increase (200%) in the number of samples processed and a 25% improvement in the process. The increase in the number of samples processed was directly related to process improvement. The certification of ISO standard 9001:2008, obtained in July 2013, allowed an improvement of the REDinREN biobank processes to be achieved, which increased quality and customer satisfaction.

  2. Brief Report: Comparability of DSM-IV and DSM-5 ASD Research Samples

    Science.gov (United States)

    Mazefsky, C. A.; McPartland, J. C.; Gastgeb, H. Z.; Minshew, N. J.

    2013-01-01

    Diagnostic and Statistical Manual (DSM-5) criteria for ASD have been criticized for being too restrictive, especially for more cognitively-able individuals. It is unclear, however, if high-functioning individuals deemed eligible for research via standardized diagnostic assessments would meet DSM-5 criteria. This study investigated the impact of…

  3. NASA Johnson Space Center's Planetary Sample Analysis and Mission Science (PSAMS) Laboratory: A National Facility for Planetary Research

    Science.gov (United States)

    Draper, D. S.

    2016-01-01

    NASA Johnson Space Center's (JSC's) Astromaterials Research and Exploration Science (ARES) Division, part of the Exploration Integration and Science Directorate, houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists, most at no cost and on an informal basis. ARES has thus provided substantial leverage to many past and ongoing science projects at the national and international level. Here we propose to formalize that support via an ARES/JSC Planetary Sample Analysis and Mission Science Laboratory (PSAMS Lab). We maintain three major research capabilities: astromaterial sample analysis, planetary process simulation, and robotic-mission analog research. ARES scientists also support planning for eventual human exploration missions, including astronaut geological training. We outline our facility's capabilities and its potential service to the community at large which, taken together with longstanding ARES experience and expertise in curation and in applied mission science, make possible multi-disciplinary planetary research available at no other institution. Comprehensive campaigns incorporating sample data, experimental constraints, and mission science data can be conducted under one roof.

  4. Field Exploration and Life Detection Sampling for Planetary Analogue Research (FELDSPAR)

    Science.gov (United States)

    Gentry, D.; Stockton, A. M.; Amador, E. S.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z. A.; Jacobsen, M. B.; Kirby, J.; McCaig, H. C.; Murukesan, G.; Rennie, V.; Rader, E.; Schwieterman, E. W.; Stevens, A. H.; Sutton, S. A.; Tan, G.; Yin, C.; Cullen, D.; Geppert, W.

    2017-12-01

    Extraterrestrial studies are typically conducted on mg samples from cm-scale features, while landing sites are selected based on m to km-scale features. It is therefore critical to understand spatial distribution of organic molecules over scales from cm to the km, particularly in geological features that appear homogenous at m to km scales. This is addressed by FELDSPAR, a NASA-funded project that conducts field operations analogous to Mars sample return in its science, operations, and technology [1]. Here, we present recent findings from a 2016 and a 2017 campaign to multiple Martian analogue sites in Iceland. Icelandic volcanic regions are Mars analogues due to desiccation, low nutrient availability, temperature extremes [2], and are relatively young and isolated from anthropogenic contamination [3]. Operationally, many Icelandic analogue sites are remote enough to require that field expeditions address several sampling constraints that are also faced by robotic exploration [1, 2]. Four field sites were evaluated in this study. The Fimmvörðuháls lava field was formed by a basaltic effusive eruption associated with the 2010 Eyjafjallajökull eruption. Mælifellssandur is a recently deglaciated plain to the north of the Myrdalsjökull glacier. Holuhraun is a basaltic spatter and cinder cone formed by 2014 fissure eruptions just north of the Vatnajökull glacier. Dyngjusandur is a plain kept barren by repeated aeolian mechanical weathering. Samples were collected in nested triangular grids from 10 cm to the 1 km scale. We obtained overhead imagery at 1 m to 200 m elevation to create digital elevation models. In-field reflectance spectroscopy was obtained with an ASD spectrometer and chemical composition was measured by a Bruker handheld XRF. All sites chosen were homogeneous in apparent color, morphology, moisture, grain size, and reflectance spectra at all scales greater than 10 cm. Field lab ATP assays were conducted to monitor microbial habitation, and home

  5. Standardization of sample collection, isolation and analysis methods in extracellular vesicle research

    Directory of Open Access Journals (Sweden)

    Kenneth W. Witwer

    2013-05-01

    Full Text Available The emergence of publications on extracellular RNA (exRNA) and extracellular vesicles (EV) has highlighted the potential of these molecules and vehicles as biomarkers of disease and therapeutic targets. These findings have created a paradigm shift, most prominently in the field of oncology, prompting expanded interest in the field and dedication of funds for EV research. At the same time, understanding of EV subtypes, biogenesis, cargo and mechanisms of shuttling remains incomplete. The techniques that can be harnessed to address the many gaps in our current knowledge were the subject of a special workshop of the International Society for Extracellular Vesicles (ISEV) in New York City in October 2012. As part of the “ISEV Research Seminar: Analysis and Function of RNA in Extracellular Vesicles (evRNA)”, 6 round-table discussions were held to provide an evidence-based framework for isolation and analysis of EV, purification and analysis of associated RNA molecules, and molecular engineering of EV for therapeutic intervention. This article arises from the discussion of EV isolation and analysis at that meeting. The conclusions of the round table are supplemented with a review of published materials and our experience. Controversies and outstanding questions are identified that may inform future research and funding priorities. While we emphasize the need for standardization of specimen handling, appropriate normative controls, and isolation and analysis techniques to facilitate comparison of results, we also recognize that continual development and evaluation of techniques will be necessary as new knowledge is amassed. On many points, consensus has not yet been achieved and must be built through the reporting of well-controlled experiments.

  6. Dental Injuries in a Sample of Portuguese Militaries - A Preliminary Research.

    Science.gov (United States)

    Azevedo, Luís; Martins, David; Veiga, Nélio; Fine, Peter; Correia, André

    2018-05-23

    Traumatic dental and maxillofacial injuries are very common and appear to affect approximately 20-30% of the permanent dentition, often with serious psychological, economic, functional and esthetic consequences. Military personnel are among the highest-risk groups for orofacial trauma, not only because they are constantly engaged in physical activity (which increases the risk of traumatic injuries) but also because they are exposed to many risk factors. The aim of this study was to evaluate, in a sample of Portuguese military personnel, the prevalence of orofacial injuries, knowledge about first-aid procedures following a dental avulsion, and the use of mouthguards. An observational cross-sectional study was conducted in the forces of the Infantry Regiment n°14 of Viseu, Portugal. The study involved 122 members of the armed forces who were asked to complete a questionnaire enquiring about the occurrence of dental trauma, the use of mouthguards and their knowledge of first-aid management of dental avulsions. In our sample, 5.7% reported having experienced dental trauma: 2.5% had experienced an avulsion and 3.3% a dental fracture. All respondents who reported dental trauma stated that it was the only time they had experienced it. Within this group, 71.4% visited a dentist, but only one (20%) visited the dentist on the same day the trauma occurred. In addition, 21.3% mentioned that they had seen a dental trauma in at least one colleague during military trainings/operations. In the case of dental avulsion, the majority (54.9%) did not know how to act. The rate of mouthguard use was very low (6.4%). The main reason reported for not using a mouthguard was the belief that it is not necessary (53.3%); in addition, 31.1% did not know what a mouthguard was for. Prevention programs and promoting actions with this population are important reflections and

  7. An Interdisciplinary Method for the Visualization of Novel High-Resolution Precision Photography and Micro-XCT Data Sets of NASA's Apollo Lunar Samples and Antarctic Meteorite Samples to Create Combined Research-Grade 3D Virtual Samples for the Benefit of Astromaterials Collections Conservation, Curation, Scientific Research and Education

    Science.gov (United States)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Hanna, R. D.; Ketcham, R. A.

    2016-01-01

    New technologies make possible the advancement of documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. With increasing demands for accessibility to updated comprehensive data, and with new sample return missions on the horizon, it is of primary importance to develop new standards for contemporary documentation and visualization methodologies. Our interdisciplinary team has expertise in the fields of heritage conservation practices, professional photography, photogrammetry, imaging science, application engineering, data curation, geoscience, and astromaterials curation. Our objective is to create virtual 3D reconstructions of Apollo Lunar and Antarctic Meteorite samples that are a fusion of two state-of-the-art data sets: the interior view of the sample by collecting Micro-XCT data and the exterior view of the sample by collecting high-resolution precision photography data. These new data provide researchers an information-rich visualization of both compositional and textural information prior to any physical sub-sampling. Since January 2013 we have developed a process that resulted in the successful creation of the first image-based 3D reconstruction of an Apollo Lunar Sample correlated to a 3D reconstruction of the same sample's Micro- XCT data, illustrating that this technique is both operationally possible and functionally beneficial. In May of 2016 we began a 3-year research period during which we aim to produce Virtual Astromaterials Samples for 60 high-priority Apollo Lunar and Antarctic Meteorite samples and serve them on NASA's Astromaterials Acquisition and Curation website. Our research demonstrates that research-grade Virtual Astromaterials Samples are beneficial in preserving for posterity a precise 3D reconstruction of the sample prior to sub-sampling, which greatly improves documentation practices, provides unique and novel visualization of the sample's interior and

  8. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
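    In the same spirit as the approach described above, per-cell Shannon diversity and a mean-based hotspot threshold can be sketched as follows; the grid-cell counts and the simple threshold rule are illustrative and are not the authors' exact spatial frequency distribution method.

        # Minimal sketch: Shannon diversity per grid cell and a mean-based
        # hotspot designation. Counts are hypothetical, not survey data.
        import numpy as np

        def shannon(counts):
            counts = np.asarray(counts, dtype=float)
            p = counts[counts > 0] / counts.sum()
            return -(p * np.log(p)).sum()

        cells = {                       # species counts per grid cell, one year
            "A1": [12, 3, 0, 5, 1],
            "A2": [2, 2, 2, 2, 2],
            "B1": [30, 0, 0, 0, 0],
            "B2": [8, 7, 6, 0, 4],
        }
        h = {cell: shannon(c) for cell, c in cells.items()}
        threshold = np.mean(list(h.values()))
        hotspots = [cell for cell, v in h.items() if v >= threshold]
        print(h, "\nthreshold:", round(threshold, 2), "\nhotspots:", hotspots)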

  9. Consistency of color representation in smart phones.

    Science.gov (United States)

    Dain, Stephen J; Kwan, Benjamin; Wong, Leslie

    2016-03-01

    One of the barriers to the construction of consistent computer-based color vision tests has been the variety of monitors and computers. Consistency of color on a variety of screens has necessitated calibration of each setup individually. Color vision examination with a carefully controlled display has, as a consequence, been a laboratory rather than a clinical activity. Inevitably, smart phones have become a vehicle for color vision tests. They have the advantage that the processor and screen are associated and there are fewer models of smart phones than permutations of computers and monitors. Colorimetric consistency of display within a model may be a given. It may extend across models from the same manufacturer but is unlikely to extend between manufacturers especially where technologies vary. In this study, we measured the same set of colors in a JPEG file displayed on 11 samples of each of four models of smart phone (iPhone 4s, iPhone5, Samsung Galaxy S3, and Samsung Galaxy S4) using a Photo Research PR-730. The iPhones are white LED backlit LCD and the Samsung are OLEDs. The color gamut varies between models and comparison with sRGB space shows 61%, 85%, 117%, and 110%, respectively. The iPhones differ markedly from the Samsungs and from one another. This indicates that model-specific color lookup tables will be needed. Within each model, the primaries were quite consistent (despite the age of phone varying within each sample). The worst case in each model was the blue primary; the 95th percentile limits in the v' coordinate were ±0.008 for the iPhone 4 and ±0.004 for the other three models. The u'v' variation in white points was ±0.004 for the iPhone4 and ±0.002 for the others, although the spread of white points between models was u'v'±0.007. The differences are essentially the same for primaries at low luminance. The variation of colors intermediate between the primaries (e.g., red-purple, orange) mirror the variation in the primaries. The variation in
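    The u'v' coordinates quoted above are CIE 1976 uniform-chromaticity-scale values; the short sketch below converts tristimulus values (X, Y, Z), such as those reported by a spectroradiometer, to u'v'. The example white point is illustrative and is not a measurement from the study.

        # CIE 1976 UCS chromaticity from tristimulus values.
        def xyz_to_uv_prime(x, y, z):
            denom = x + 15 * y + 3 * z
            return 4 * x / denom, 9 * y / denom

        # e.g. a display white point near D65 (illustrative values)
        u, v = xyz_to_uv_prime(95.047, 100.0, 108.883)
        print(round(u, 4), round(v, 4))  # ~0.1978, ~0.4683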

  10. Creating a sampling frame for population-based veteran research: Representativeness and overlap of VA and Department of Defense databases

    OpenAIRE

    Donna L. Washington, MD, MPH; Su Sun, MPH; Mark Canning, BA

    2010-01-01

    Most veteran research is conducted in Department of Veterans Affairs (VA) healthcare settings, although most veterans obtain healthcare outside the VA. Our objective was to determine the adequacy and relative contributions of Veterans Health Administration (VHA), Veterans Benefits Administration (VBA), and Department of Defense (DOD) administrative databases for representing the U.S. veteran population, using as an example the creation of a sampling frame for the National Survey of Women Vete...

  11. Research

    African Journals Online (AJOL)

    abp

    2015-05-28

    May 28, 2015 ... The findings revealed a significant association between iron deficiency and anaemia. Therefore ... The sample was selected using a stratified two-stage cluster design consisting of 37 clusters, 18 in the .... deficiency in malaria endemic regions has multiple causes of which p.falciparum being one of the ...

  12. Comparative research of effectiveness of cellulose and fiberglass porous membrane carriers for bio sampling in veterinary and food industry monitoring

    Science.gov (United States)

    Gusev, Alexander; Vasyukova, Inna; Zakharova, Olga; Altabaeva, Yuliya; Saushkin, Nikolai; Samsonova, Jeanne; Kondakov, Sergey; Osipov, Alexander; Snegin, Eduard

    2017-11-01

    The aim of the proposed research is to study the applicability of fiberglass porous membrane materials in a new strip format for dried blood storage in food industry monitoring. A comparative analysis of cellulosic and fiberglass porous membrane materials was carried out for obtaining dried samples of serum or blood and for the possibility of further species-specific analysis. Blood samples of Sus scrofa were used to study the comparative effectiveness of cellulose and fiberglass porous membrane carriers for long-term biomaterial storage allowing further DNA detection by the real-time polymerase chain reaction (PCR) method. Scanning electron microscopy of the various membranes, native and with blood samples, indicates a fundamental difference in the form of the dried samples. Membranes based on cellulosic materials sorb the components of the biological fluid on the surface of the fibers of their structure, partially penetrating the cellulose fibers, whereas in the case of glass fiber membranes the components of the biological fluid dry out as films in the pores of the membrane between the structural filaments. This fundamental difference in retention mechanisms affects the rate of dissolution of the components of dry samples and increases the efficiency of sample desorption before subsequent analysis. Pig DNA was detected in every analyzed sample by the real-time PCR performed, and good preservation of the biomaterial on the glass fiber membranes was clearly demonstrated. Good preservation of the biomaterial on the test cards was revealed after storage for 4 days as well as for 1 hour.

  13. Small Body GN and C Research Report: G-SAMPLE - An In-Flight Dynamical Method for Identifying Sample Mass [External Release Version

    Science.gov (United States)

    Carson, John M., III; Bayard, David S.

    2006-01-01

    G-SAMPLE is an in-flight dynamical method for use by sample collection missions to identify the presence and quantity of collected sample material. The G-SAMPLE method implements a maximum-likelihood estimator to identify the collected sample mass, based on onboard force sensor measurements, thruster firings, and a dynamics model of the spacecraft. With G-SAMPLE, sample mass identification becomes a computation rather than an extra hardware requirement; the added cost of cameras or other sensors for sample mass detection is avoided. Realistic simulation examples are provided for a spacecraft configuration with a sample collection device mounted on the end of an extended boom. In one representative example, a 1000 gram sample mass is estimated to within 110 grams (95% confidence) under realistic assumptions of thruster profile error, spacecraft parameter uncertainty, and sensor noise. For convenience to future mission design, an overall sample-mass estimation error budget is developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
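    As a toy illustration of the underlying idea (and not the actual G-SAMPLE estimator), the sketch below recovers a collected sample mass by least squares from simulated thruster forces and noisy acceleration measurements under a simple F = m*a model; all numbers are made up.

        # Toy illustration only -- not the G-SAMPLE maximum-likelihood estimator.
        # Fit total mass from commanded forces and noisy accelerations, then
        # subtract the known dry spacecraft mass to get the sample mass.
        import numpy as np

        rng = np.random.default_rng(0)
        m_spacecraft = 500.0                         # kg, known before collection
        m_sample_true = 1.0                          # kg (1000 g), to be estimated
        forces = rng.uniform(5.0, 20.0, 200)         # N, commanded thruster forces
        accel = forces / (m_spacecraft + m_sample_true)
        accel = accel + rng.normal(0.0, 2e-5, accel.size)   # accelerometer noise

        # least-squares slope of a = (1/m) * F  ->  m = sum(F^2) / sum(F * a)
        m_hat = np.dot(forces, forces) / np.dot(forces, accel)
        print(f"estimated sample mass: {(m_hat - m_spacecraft) * 1000:.0f} g")  # near 1000 g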

  14. Public involvement in pharmacogenomics research: a national survey on public attitudes towards pharmacogenomics research and the willingness to donate DNA samples to a DNA bank in Japan.

    Science.gov (United States)

    Kobayashi, Eriko; Satoh, Nobunori

    2009-11-01

    To assess the attitudes of the Japanese general public towards pharmacogenomics research and a DNA bank for identifying genomic markers associated with ADRs, and their willingness to donate DNA samples, we conducted a national survey of 1,103 Japanese adults drawn from the general public, not a patient population. The response rate was 36.8%. The majority of the respondents showed a positive attitude towards pharmacogenomics research (81.0%) and a DNA bank (70.4%). Considering fictitious clinical situations such as taking medications and experiencing ADRs, willingness to donate DNA samples when experiencing ADRs (61.7%) was higher than when taking medications (45.3%). Older generations were significantly associated with a decreased willingness to donate (OR = 0.45, CI 0.28-0.72 for respondents in their 50s; OR = 0.49, CI 0.31-0.77 for those in their 60s). Positive attitudes towards pharmacogenomics research, a DNA bank, and blood/bone marrow/organ donation were significantly associated with an increased willingness. However, the respondents had the following concerns regarding a DNA bank: the confidentiality of their personal information, the manner by which research results would be utilized and, simply, the use of their own DNA for research. In order to attain public understanding and overcome these concerns, a process of public awareness should be put into place to emphasize the beneficial aspects of identifying genomic markers associated with ADRs and to address the concerns raised in our study. Further study is needed to assess the willingness of actual patients taking medications in real situations, since the respondents in our study were from the general public, not a patient population, and their willingness was assessed on the condition of assuming that they were patients taking medications.

  15. Community‐Based Participatory Research Skills and Training Needs in a Sample of Academic Researchers from a Clinical and Translational Science Center in the Northeast

    Science.gov (United States)

    DiGirolamo, Ann; Geller, Alan C.; Tendulkar, Shalini A.; Patil, Pratima; Hacker, Karen

    2012-01-01

    Abstract Purpose: To determine the community‐based participatory research (CBPR) training interests and needs of researchers interested in CBPR to inform efforts to build infrastructure for conducting community‐engaged research. Method: A 20‐item survey was completed by 127 academic health researchers at Harvard Medical School, Harvard School of Public Health, and Harvard affiliated hospitals. Results: Slightly more than half of the participants reported current or prior experience with CBPR (58 %). Across all levels of academic involvement, approximately half of the participants with CBPR experience reported lacking skills in research methods and dissemination, with even fewer reporting skills in training of community partners. Regardless of prior CBPR experience, about half of the respondents reported having training needs in funding, partnership development, evaluation, and dissemination of CBPR projects. Among those with CBPR experience, more than one‐third of the participants wanted a mentor in CBPR; however only 19 % were willing to act as a mentor. Conclusions: Despite having experience with CBPR, many respondents did not have the comprehensive package of CBPR skills, reporting a need for training in a variety of CBPR skill sets. Further, the apparent mismatch between the need for mentors and availability in this sample suggests an important area for development. Clin Trans Sci 2012; Volume #: 1–5 PMID:22686211

  16. Illite K-Ar dating of fault breccia samples from ONKALO underground research facility, Olkiluoto, Eurajoki, SW Finland

    International Nuclear Information System (INIS)

    Maenttaeri, I.; Mattila, J.; Zwingmann, H.; Todd, A.J.

    2007-08-01

    Illite K-Ar age determinations were done on five fault breccia samples from the ONKALO underground research facility, Olkiluoto, Eurajoki, SW Finland. The XRD, SEM, and TEM studies and K-Ar analyses were done at the John deLaeter Center for Mass Spectrometry at Curtin University, Perth, Western Australia. The <2 micron grain size fractions contain illite, chlorite, dickite, and quartz. All fractions had minor contaminant phases, consisting mainly of quartz, but traces of K-feldspar contamination could be identified in all samples. The authigenic illite shows variable K concentrations. The illite contents of the ONK-PL68 and ONK-PL87 samples are the lowest. The K-Ar ages for the <2 micron fractions vary from ∼0.55 Ga to 1.38 Ga. The sample ONK-PL68 yields a K-Ar age of 912 ± 18 Ma, corresponding to a Neoproterozoic-Tonian age. This age can be roughly linked temporally with late events related to the Sveconorwegian orogeny. Sample ONK-PL87 has a K-Ar age of 550 ± 11 Ma, corresponding to a Neoproterozoic - Lower Cambrian age. The samples ONK-PL522 and ONK-PL901, sampled from the storage hall fault, show identical K-Ar ages of 1385 ± 27 Ma and 1373 ± 27 Ma, respectively. These correspond to a Mesoproterozoic-Ectasian age related to Subjotnian or Postjotnian events. ONK-PL960 yields a K-Ar age of 1225 ± 24 Ma, corresponding to a Mesoproterozoic-Ectasian age. This age agrees well with the ages from Postjotnian diabase dykes in W Finland. The 2-3% detrital K-feldspar contamination in the clay fractions increases the apparent ages. Especially for the youngest sample, ONK-PL87, the effect may be geologically meaningful, as after correction the age clearly indicates Caledonian events. Moreover, the age for the low-K sample ONK-PL901 shifts to indicate a Postjotnian diabase age. (orig.)
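
    As a rough illustration of how ages of this kind follow from the measurements, the conventional K-Ar age equation converts a radiogenic 40Ar*/40K ratio into time using the total 40K decay constant and its electron-capture branch (the widely used Steiger and Jaeger 1977 constants appear below). This is a sketch of the textbook formula only, not the laboratory's data-reduction procedure, and the input ratio is hypothetical.

```python
import numpy as np

LAMBDA_TOTAL = 5.543e-10   # total 40K decay constant, 1/yr (Steiger & Jaeger, 1977)
LAMBDA_EC    = 0.581e-10   # electron-capture branch producing 40Ar, 1/yr

def k_ar_age_ma(ar40_rad_over_k40):
    """Conventional K-Ar age in Ma from the radiogenic 40Ar*/40K atomic ratio."""
    t_years = np.log(1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_rad_over_k40) / LAMBDA_TOTAL
    return t_years / 1e6

# A hypothetical ratio chosen to land near the ~912 Ma age quoted for ONK-PL68
print(round(k_ar_age_ma(0.0690), 1), "Ma")
```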

  17. X-Ray Micro-Computed Tomography of Apollo Samples as a Curation Technique Enabling Better Research

    Science.gov (United States)

    Ziegler, R. A.; Almeida, N. V.; Sykes, D.; Smith, C. L.

    2014-01-01

    X-ray micro-computed tomography (micro-CT) is a technique that has long been used to research meteorites, and recently it is becoming a more common tool for the curation of meteorites and Apollo samples. Micro-CT is ideally suited to the characterization of astromaterials in the curation process as it can provide textural and compositional information at small spatial resolution rapidly, nondestructively, and without compromising the cleanliness of the samples (e.g., samples can be scanned sealed in Teflon bags). These data can then inform scientists and curators when making and processing future sample requests for meteorites and Apollo samples. Here we present some preliminary results on micro-CT scans of four Apollo regolith breccias. Methods: Portions of four Apollo samples were used in this study: 14321, 15205, 15405, and 60639. All samples were 8-10 cm in their longest dimension and approximately equant. These samples were micro-CT scanned on the Nikon HMXST 225 System at the Natural History Museum in London. Scans were made at 205-220 kV, 135-160 microamps beam current, with an effective voxel size of 21-44 microns. Results: Initial examination of the data identifies a variety of mineral clasts (including sub-voxel FeNi metal grains) and lithic clasts within the regolith breccias. Textural information within some of the lithic clasts was also discernible. Of particular interest was a large basalt clast (approx. 1.3 cc) found within sample 60639, which appears to have a sub-ophitic texture. Additionally, internal void space, e.g., fractures and voids, is readily identifiable. Discussion: It is clear from the preliminary data that micro-CT analyses are able to identify important "new" clasts within the Apollo breccias, and to better characterize previously described clasts or igneous samples. For example, the 60639 basalt clast was previously believed to be quite small based on its approx. 0.5 sq cm exposure on the surface of the main mass

  18. Challenges in collecting clinical samples for research from pregnant women of South Asian origin: evidence from a UK study.

    Science.gov (United States)

    Neelotpol, Sharmind; Hay, Alastair W M; Jolly, A Jim; Woolridge, Mike W

    2016-08-31

    To recruit South Asian pregnant women living in the UK into a clinicoepidemiological study for the collection of lifestyle survey data and antenatal blood, and to retain the women for the later collection of cord blood and meconium samples from their babies for biochemical analysis. A longitudinal study recruiting pregnant women of South Asian and Caucasian origin living in the UK. Recruitment of the participants and collection of clinical samples and survey data took place at two sites within a single UK Northern Hospital Trust. Pregnant women of South Asian origin (study group, n=98) and of Caucasian origin (comparison group, n=38) living in Leeds, UK. Among the participants approached, 81% agreed to take part in the study when a 'direct approach' method was followed. The retention rate of the participants was a remarkable 93.4%. The main challenges in recruiting the ethnic minority participants were their cultural and religious conservatism, the language barrier, lack of interest, and a feeling of extra 'stress' from taking part in research. The chief investigator developed an innovative participant retention method linked to the women's cultural and religious practices. The method proved useful in retaining the participants for about 5 months and in enabling successful collection of clinical samples from the same mother-baby pairs. The collection of clinical samples and lifestyle data exceeded the calculated sample size required to give the study sufficient power. The numbers of samples obtained were: maternal blood (n=171), cord blood (n=38), meconium (n=176), lifestyle questionnaire data (n=136) and postnatal records (n=136). Recruitment and retention of participants, according to the calculated sample size, ensured sufficient power and success for a clinicoepidemiological study. Results suggest that development of trust and confidence between the participant and the researcher is the key to the success of a clinical and epidemiological study involving

  19. MCNPX calculations of dose rate distribution inside samples treated in the research gamma irradiating facility at CTEx

    Energy Technology Data Exchange (ETDEWEB)

    Rusin, Tiago; Rebello, Wilson F.; Vellozo, Sergio O.; Gomes, Renato G., E-mail: tiagorusin@ime.eb.b, E-mail: rebello@ime.eb.b, E-mail: vellozo@cbpf.b, E-mail: renatoguedes@ime.eb.b [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil). Dept. de Engenharia Nuclear; Vital, Helio C., E-mail: vital@ctex.eb.b [Centro Tecnologico do Exercito (CTEx), Rio de Janeiro, RJ (Brazil); Silva, Ademir X., E-mail: ademir@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear

    2011-07-01

    A cavity-type cesium-137 research irradiating facility at CTEx has been modeled using the Monte Carlo code MCNPX. The irradiator has been used daily in experiments to optimize the use of ionizing radiation for the conservation of many kinds of food and to improve materials properties. In order to correlate the effects of the treatment, average doses have been calculated for each irradiated sample, accounting for the measured dose rate distribution in the irradiating chambers. However, that approach is only approximate, being subject to significant systematic errors due to the heterogeneous internal structure of most samples, which can lead to large anisotropy in attenuation and Compton scattering properties across the media. Thus this work is aimed at further investigating such uncertainties by calculating the dose rate distribution inside the treated items so that a more accurate and representative estimate of the total absorbed dose can be determined for later use in the effects-versus-dose correlation curves. Samples of different simplified geometries and densities (spheres, cylinders, and parallelepipeds) have been modeled to evaluate internal dose rate distributions within the volume of the samples and the overall effect on the average dose. (author)

  20. MCNPX calculations of dose rate distribution inside samples treated in the research gamma irradiating facility at CTEx

    International Nuclear Information System (INIS)

    Rusin, Tiago; Rebello, Wilson F.; Vellozo, Sergio O.; Gomes, Renato G.; Silva, Ademir X.

    2011-01-01

    A cavity-type cesium-137 research irradiating facility at CTEx has been modeled using the Monte Carlo code MCNPX. The irradiator has been used daily in experiments to optimize the use of ionizing radiation for the conservation of many kinds of food and to improve materials properties. In order to correlate the effects of the treatment, average doses have been calculated for each irradiated sample, accounting for the measured dose rate distribution in the irradiating chambers. However, that approach is only approximate, being subject to significant systematic errors due to the heterogeneous internal structure of most samples, which can lead to large anisotropy in attenuation and Compton scattering properties across the media. Thus this work is aimed at further investigating such uncertainties by calculating the dose rate distribution inside the treated items so that a more accurate and representative estimate of the total absorbed dose can be determined for later use in the effects-versus-dose correlation curves. Samples of different simplified geometries and densities (spheres, cylinders, and parallelepipeds) have been modeled to evaluate internal dose rate distributions within the volume of the samples and the overall effect on the average dose. (author)
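
    The point of the two records above is that a single chamber dose rate is a poor surrogate for the dose actually absorbed throughout a heterogeneous item. As a minimal illustration of why volume averaging matters (a toy exponential-attenuation model, not an MCNPX transport calculation; all parameter values are hypothetical), one can average a depth-attenuated dose field over a spherical sample by sampling interior points.

```python
import numpy as np

def mean_dose_in_sphere(surface_dose_rate, mu, radius, n=200_000, seed=1):
    """Toy volume-averaged dose rate in a homogeneous sphere.

    Assumes the local dose rate falls off as exp(-mu * depth) from the surface;
    this only illustrates the averaging step, not photon transport.
    """
    rng = np.random.default_rng(seed)
    r = radius * rng.random(n) ** (1.0 / 3.0)   # radii of uniformly distributed interior points
    local = surface_dose_rate * np.exp(-mu * (radius - r))
    return local.mean()

# Hypothetical: 2 kGy/h at the surface, water-like mu = 0.09 1/cm, 5 cm radius
print(mean_dose_in_sphere(2.0, 0.09, 5.0))
```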

  1. Consistency of Trend Break Point Estimator with Underspecified Break Number

    Directory of Open Access Journals (Sweden)

    Jingjing Yang

    2017-01-01

    Full Text Available This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
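
    A minimal sketch of the kind of estimator the paper studies: for a single fitted break, the least-squares break date minimizes the residual sum of squares of a regression on an intercept, a linear trend, and a post-break trend shift. The model and the simulated two-break series below are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def estimate_trend_break(y, trim=0.15):
    """Least-squares estimate of a single trend-shift break date.

    Model: y_t = a + b*t + c*(t - T_b)*1[t > T_b] + e_t.
    Searches all admissible break dates and returns the one minimizing the SSR.
    """
    n = len(y)
    t = np.arange(n, dtype=float)
    best_ssr, best_tb = np.inf, None
    for tb in range(int(trim * n), int((1 - trim) * n)):
        shift = np.where(t > tb, t - tb, 0.0)
        X = np.column_stack([np.ones(n), t, shift])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        ssr = np.sum((y - X @ beta) ** 2)
        if ssr < best_ssr:
            best_ssr, best_tb = ssr, tb
    return best_tb

# Simulated series with two positive trend shifts but only one break fitted
rng = np.random.default_rng(2)
n, b1, b2 = 200, 60, 140
t = np.arange(n)
y = 0.02*t + 0.05*np.maximum(t - b1, 0) + 0.05*np.maximum(t - b2, 0) + rng.normal(0, 1, n)
print(estimate_trend_break(y))   # may land between the true breaks, per the paper
```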

  2. The Portuguese-language version of the Social Phobia and Anxiety Inventory: analysis of items and internal consistency in a Brazilian sample of 1,014 undergraduate students

    Directory of Open Access Journals (Sweden)

    Patrícia Picon

    2006-01-01

    Full Text Available OBJECTIVE: Theoretical and empirical analysis of the items and internal consistency of the Portuguese-language version of the Social Phobia and Anxiety Inventory (SPAI-Portuguese) and its subscales. METHODS: Social phobia experts conducted a content analysis of the 45 items of the SPAI-Portuguese, which was administered to a sample of 1,014 university students. Item discrimination was evaluated by Student's t test; interitem, mean and item-to-total correlations by Pearson coefficients; reliability was estimated by Cronbach's alpha. RESULTS: There was 100% agreement among the experts concerning the 45 items. On the SPAI-Portuguese, 43 items were discriminative (p < 0.05). Some items across the subscales showed coefficients below 0.2. Mean interitem correlations were 0.41 on the social phobia subscale, 0.32 on the agoraphobia subscale and 0.32 on the SPAI-Portuguese as a whole; item-to-total correlations were all above 0.3 (p < 0.001). Cronbach's alphas were 0.95 for the SPAI-Portuguese, 0.96 for the social phobia subscale and 0.85 for the agoraphobia subscale. CONCLUSION: The item content was related to the underlying constructs (social phobia and agoraphobia), with 43 discriminative items on the SPAI-Portuguese. The mean interitem correlations and alphas indicated the internal consistency of the SPAI-Portuguese and its subscales, as well as their multidimensionality. No item was discarded.
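
    The internal-consistency statistics reported above (interitem and corrected item-total correlations, Cronbach's alpha) can all be computed from a respondents-by-items score matrix. A minimal sketch with simulated Likert data follows; the data are hypothetical, not the SPAI-Portuguese responses.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_var / total_var)

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

# Hypothetical 1..7 Likert responses: 300 respondents x 10 items sharing one latent factor
rng = np.random.default_rng(3)
latent = rng.normal(size=(300, 1))
scores = np.clip(np.round(4 + latent + rng.normal(0, 1, (300, 10))), 1, 7)
print(round(cronbach_alpha(scores), 2), corrected_item_total(scores).round(2))
```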

  3. Public involvement in pharmacogenomics research: a national survey on patients' attitudes towards pharmacogenomics research and the willingness to donate DNA samples to a DNA bank in Japan.

    Science.gov (United States)

    Kobayashi, Eriko; Sakurada, Tomoya; Ueda, Shiro; Satoh, Nobunori

    2011-05-01

    To assess the attitude of Japanese patients towards pharmacogenomics research and a DNA bank for identifying genomic markers associated with adverse drug reactions (ADRs) and their willingness to donate DNA samples, we conducted a survey of 550 male and female patients. The majority of the respondents showed a positive attitude towards pharmacogenomics research (87.6%) and a DNA bank (75.1%). The willingness to donate DNA samples when experiencing severe ADRs (55.8%) was higher than when taking medications (40.4%). Positive attitudes towards a DNA bank and organ donation were significantly associated with an increased willingness to donate. Though the level of positive attitude in the patient population was higher than that in the general public in our former study (81.0 and 70.4%, respectively), the level of the willingness of patients to donate was 40.4% when taking medications and 55.8% when experiencing severe ADRs which was lower than that of the general public in our former study (45.3 and 61.7%). The results suggested that the level of true willingness in the patient population was lower than that of the general public considering the fictitious situation presented to the public (to suppose that they were patients receiving medication). It is important to assess the willingness of patients who are true potential donors, not the general public.

  4. Supplementing electronic health records through sample collection and patient diaries: A study set within a primary care research database.

    Science.gov (United States)

    Joseph, Rebecca M; Soames, Jamie; Wright, Mark; Sultana, Kirin; van Staa, Tjeerd P; Dixon, William G

    2018-02-01

    To describe a novel observational study that supplemented primary care electronic health record (EHR) data with sample collection and patient diaries. The study was set in primary care in England. A list of 3974 potentially eligible patients was compiled using data from the Clinical Practice Research Datalink. Interested general practices opted into the study then confirmed patient suitability and sent out postal invitations. Participants completed a drug-use diary and provided saliva samples to the research team to combine with EHR data. Of 252 practices contacted to participate, 66 (26%) mailed invitations to patients. Of the 3974 potentially eligible patients, 859 (22%) were at participating practices, and 526 (13%) were sent invitations. Of those invited, 117 (22%) consented to participate of whom 86 (74%) completed the study. We have confirmed the feasibility of supplementing EHR with data collected directly from patients. Although the present study successfully collected essential data from patients, it also underlined the requirement for improved engagement with both patients and general practitioners to support similar studies. © 2017 The Authors. Pharmacoepidemiology & Drug Safety published by John Wiley & Sons Ltd.

  5. Quality assurance program for determining the radioactivity in environmental samples at the Institute of Nuclear Energy Research in Taiwan

    International Nuclear Information System (INIS)

    Gone, J.K.; Wang, T.W.

    2000-01-01

    Interest in determining radioactivity in environmental samples has increased considerably in recent years, after the Chernobyl accident in 1986. Environmental monitoring programs have been set up in different countries to measure trace amounts of radionuclides in the environment, and the quality of the analytical results on these samples is important because of regulatory and safety concerns. A good quality assurance program is essential to provide accurate information for the regulatory body and environmentalists so that proper actions can be taken to protect the environment, and good analytical results are also important for scientists determining the transfer of radionuclides between environmental matrices. The Institute of Nuclear Energy Research (INER) in Taiwan has been working on radionuclide analysis in environmental samples for years, and its environmental media radioanalytical laboratory (EMRAL) has recently upgraded its quality assurance program to meet the requirements of the international standard ISO/IEC Guide 25. The general requirements of ISO/IEC Guide 25 have been adopted by the Chinese National Laboratory Accreditation (CNLA) of Taiwan, and CNLA is also a member of the International Laboratory Accreditation Cooperation (ILAC) and the Asia Pacific Laboratory Accreditation Cooperation (APLAC). This paper summarizes the quality assurance program of INER's EMRAL. It covers both management and technical sections. These sections have ensured the quality of INER's EMRAL, and they can be applied to different laboratories in the future. (author)

  6. Quality assurance program for determining the radioactivity in environmental samples at the Institute of Nuclear Energy Research in Taiwan

    Energy Technology Data Exchange (ETDEWEB)

    Gone, J.K. [TRR-II Project Team, Institute of Nuclear Energy Research, Taoyuan, Taiwan (China); Wang, T.W. [Division of Health Physics, Institute of Nuclear Energy Research, Taoyuan, Taiwan (China)

    2000-05-01

    Interest in determining radioactivity in environmental samples has increased considerably in recent years, after the Chernobyl accident in 1986. Environmental monitoring programs have been set up in different countries to measure trace amounts of radionuclides in the environment, and the quality of the analytical results on these samples is important because of regulatory and safety concerns. A good quality assurance program is essential to provide accurate information for the regulatory body and environmentalists so that proper actions can be taken to protect the environment, and good analytical results are also important for scientists determining the transfer of radionuclides between environmental matrices. The Institute of Nuclear Energy Research (INER) in Taiwan has been working on radionuclide analysis in environmental samples for years, and its environmental media radioanalytical laboratory (EMRAL) has recently upgraded its quality assurance program to meet the requirements of the international standard ISO/IEC Guide 25. The general requirements of ISO/IEC Guide 25 have been adopted by the Chinese National Laboratory Accreditation (CNLA) of Taiwan, and CNLA is also a member of the International Laboratory Accreditation Cooperation (ILAC) and the Asia Pacific Laboratory Accreditation Cooperation (APLAC). This paper summarizes the quality assurance program of INER's EMRAL. It covers both management and technical sections. These sections have ensured the quality of INER's EMRAL, and they can be applied to different laboratories in the future. (author)

  7. Collection, pre-treatment and analyses of Cs-137 and Tc-99 in marine samples at the Institute of Marine Research (IMR), Norway

    International Nuclear Information System (INIS)

    Heldal, H.E.

    2010-01-01

    Full text: The Institute of Marine Research (IMR) is an important contributor to the Norwegian marine monitoring programme RAME (Radioactivity in the Marine Environment). RAME is funded by the Ministry of the Environment and coordinated by the Norwegian Radiation Protection Authority (NRPA). Sample collection is performed from IMR's research vessels in the open sea areas of the North, Norwegian and Barents Seas and in Norwegian fjords. The samples consist of biota (fish and other marine organisms), sediments and seawater. Biota samples are frozen onboard the ship and transported to IMR, where the samples are subsequently ground up, freeze dried, homogenized and aliquoted into polyethylene counting boxes of appropriate size prior to analysis. Attempts are made to collect fillets from 25 fish for each sample of large fish such as cod, haddock, saithe, redfish and Greenland halibut. For smaller fish (e.g. blue whiting, polar cod, capelin and Atlantic herring) and other organisms such as amphipods, krill and deep-sea shrimps, a sample of 2-3 kg of each species is taken. These samples are ground up whole. Sediment samples are collected using a Smoegen boxcorer, from which both surface samples and cores are taken. The samples are frozen onboard the ship. While half-frozen, the cores are cut into slices of 1 or 2 cm thickness on board the ship, then frozen again and transported to IMR, where they are treated as described above for the biota samples. Large volumes (typically 50-200 L) of seawater are needed in order to get enough material for analysis. Pre-treatment of the samples in the field is therefore an advantage. Surface samples (5 m) of seawater are collected from a shipboard pump, while a CTD-rosette multi-bottle sampler with twelve 10 L bottles is used to collect seawater from depths below 5 meters. For the analysis of Cs-137, Cu2[Fe(CN)6]-impregnated cotton filters are used for the pre-concentration: one pre-filter without impregnation, two Cu2[Fe(CN)6]

  8. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

    When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may reflect only one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire...

  9. Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes

    Science.gov (United States)

    Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy

    2006-01-01

    We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…

  10. Domain Adaptation for Pedestrian Detection Based on Prediction Consistency

    Directory of Open Access Journals (Sweden)

    Yu Li-ping

    2014-01-01

    Full Text Available Pedestrian detection is an active area of research in computer vision. It remains a quite challenging problem in many applications where many factors cause a mismatch between the source dataset used to train the pedestrian detector and samples in the target scene. In this paper, we propose a novel domain adaptation model for merging plentiful source domain samples with scarce target domain samples to create a scene-specific pedestrian detector that performs as well as if rich target domain samples were present. Our approach combines a boosting-based learning algorithm with an entropy-based transferability measure, derived from the consistency of predictions with the source classifiers, to selectively choose the source domain samples showing positive transferability to the target domain. Experimental results show that our approach can improve the detection rate, especially when labeled data in the target scene are insufficient.

  11. Releasable activity and maximum permissible leakage rate within a transport cask of Tehran Research Reactor fuel samples

    Directory of Open Access Journals (Sweden)

    Rezaeian Mahdi

    2015-01-01

    Full Text Available Containment of a transport cask during both normal and accident conditions is important to the health and safety of the public and of the operators. Based on IAEA regulations, releasable activity and maximum permissible volumetric leakage rate within the cask containing fuel samples of Tehran Research Reactor enclosed in an irradiated capsule are calculated. The contributions to the total activity from the four sources of gas, volatile, fines, and corrosion products are treated separately. These calculations are necessary to identify an appropriate leak test that must be performed on the cask and the results can be utilized as the source term for dose evaluation in the safety assessment of the cask.

  12. Development of the methodology of sample preparation to X-ray diffractometry of clay minerals at Petrobras Research Center

    International Nuclear Information System (INIS)

    Alves, D.B.

    1987-01-01

    Various procedures can be used in the analysis of the clay mineral content of rocks by X-ray diffraction. This article describes the principal ones and discusses those adopted in the X-ray clay mineral laboratory of the PETROBRAS Research Center (CENPES) in Rio de Janeiro. It presents the methodology used and provides users with information about its application and limitations. The methodology has been developed to study polymineral samples. The aim is to identify clay mineral groups and to estimate their relative proportions. Of the four main steps of this analysis - separation and concentration of clay minerals, preparation of oriented specimens, X-ray irradiation under standard conditions and interpretation of X-ray diffraction patterns - only the first three are discussed here. Clay minerals occur mainly in the

  13. Comparison of culture based methods for the isolation of Clostridium difficile from stool samples in a research setting.

    Science.gov (United States)

    Lister, Michelle; Stevenson, Emma; Heeg, Daniela; Minton, Nigel P; Kuehne, Sarah A

    2014-08-01

    Effective isolation of Clostridium difficile from stool samples is important in the research setting, especially where low numbers of spores/vegetative cells may be present within a sample. In this study, three protocols for stool culture were investigated to find a sensitive, cost-effective and timely method of C. difficile isolation. For the initial enrichment step, the effectiveness of two different rich media, cycloserine-cefoxitin fructose broth (CCFB) and cycloserine-cefoxitin mannitol broth with taurocholate and lysozyme (CCMB-TAL), was compared. Four different selective solid media - cycloserine-cefoxitin fructose agar (CCFA), cycloserine-cefoxitin egg yolk agar (CCEY), ChromID C. difficile and tryptone soy agar (TSA) with 5% sheep's blood - were compared, with and without preceding broth enrichment. As a means to enable differentiation between C. difficile and other fecal flora, the effectiveness of including a pH indicator (1% Neutral Red) was also evaluated. The data indicated that CCFB is more sensitive than CCMB-TAL; however, the latter had a better recovery rate. A broth enrichment step had reduced sensitivity compared with direct plating. ChromID C. difficile showed the best recovery rate, whereas CCEY egg yolk agar was the most sensitive of the four. The addition of 1% Neutral Red did not produce sufficient colour change when added to CCEY egg yolk agar to be used as a differential medium. For a low-cost, timely and sensitive method of isolating C. difficile from stool samples, we recommend direct plating onto CCEY egg yolk agar after heat shock. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Analysis of Angolan human hair samples by the k0-NAA technique on the Dalat research reactor

    International Nuclear Information System (INIS)

    Lemos, P.C.D; Ho Manh Dung; Cao Dong Vu; Nguyen Thi Sy; Nguyen Mong Sinh

    2006-01-01

    Concentrations of trace elements in human hair differ between individuals according to factors of life and history such as occupation, sex, age, food, habits, social conditions and so on. Individual deviations in elemental concentrations also reflect the degree of exposure of the human body to environmental pollutants, food intake and metabolism. The k0-standardization method of neutron activation analysis (k0-NAA) on research reactors has been recommended by the WHO and IAEA as a main analytical technique, with the advantages of sensitivity, precision, accuracy, multi-element capability and routine use. This report presents the results of the determination of about 20 elements in 23 human hair samples collected from different places in Angola, using the k0-NAA technique on the Dalat nuclear research reactor. Accuracy of the method was ascertained by analysis of two human hair certified reference materials (CRMs), NIES-5 and GBW-09101; deviations of experimental from certified values were generally within 10%, and U-score values were mostly below 2. (author)

  15. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys

  16. Research supervision

    African Journals Online (AJOL)

    This gap in the training of nurse educators may result in low in- and output in the research ... Which factors influence the manner in which PG nursing students perceive the ... sample consisted of females (83.9%; n=47), with males representing only 16.1% (n=9). ... Winsett and Cashion [20] assert that a research method ...

  17. Choice, internal consistency, and rationality

    OpenAIRE

    Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu

    2010-01-01

    The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...

  18. Self-consistent quark bags

    International Nuclear Information System (INIS)

    Rafelski, J.

    1979-01-01

    After an introductory overview of the bag model the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the vivial approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI)

  19. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  20. Ethical issues in the export, storage and reuse of human biological samples in biomedical research: perspectives of key stakeholders in Ghana and Kenya.

    Science.gov (United States)

    Tindana, Paulina; Molyneux, Catherine S; Bull, Susan; Parker, Michael

    2014-10-18

    For many decades, access to human biological samples, such as cells, tissues, organs, blood, and sub-cellular materials such as DNA, for use in biomedical research, has been central in understanding the nature and transmission of diseases across the globe. However, the limitations of current ethical and regulatory frameworks in sub-Saharan Africa to govern the collection, export, storage and reuse of these samples have resulted in inconsistencies in practice and a number of ethical concerns for sample donors, researchers and research ethics committees. This paper examines stakeholders' perspectives of and responses to the ethical issues arising from these research practices. We employed a qualitative strategy of inquiry for this research including in-depth interviews and focus group discussions with key research stakeholders in Kenya (Nairobi and Kilifi), and Ghana (Accra and Navrongo). The stakeholders interviewed emphasised the compelling scientific importance of sample export, storage and reuse, and acknowledged the existence of some structures governing these research practices, but they also highlighted the pressing need for a number of practical ethical concerns to be addressed in order to ensure high standards of practice and to maintain public confidence in international research collaborations. These concerns relate to obtaining culturally appropriate consent for sample export and reuse, understanding cultural sensitivities around the use of blood samples, facilitating a degree of local control of samples and sustainable scientific capacity building. Drawing on these findings and existing literature, we argue that the ethical issues arising in practice need to be understood in the context of the interactions between host research institutions and local communities and between collaborating institutions. We propose a set of 'key points-to-consider' for research institutions, ethics committees and funding agencies to address these issues.

  1. Theoretical basis, application, reliability, and sample size estimates of a Meridian Energy Analysis Device for Traditional Chinese Medicine Research

    Directory of Open Access Journals (Sweden)

    Ming-Yen Tsai

    Full Text Available OBJECTIVES: The Meridian Energy Analysis Device is currently a popular tool in scientific research on meridian electrophysiology. In this field, it is generally believed that measuring the electrical conductivity of meridians provides information about the balance of bioenergy or Qi-blood in the body. METHODS AND RESULTS: This communication draws on original articles in the PubMed database from 1956 to 2014 and on the author's clinical experience. In this short communication, we provide clinical examples of Meridian Energy Analysis Device application, especially in the field of traditional Chinese medicine, discuss the reliability of the measurements, and put the values obtained into context by considering items of considerable variability and by estimating sample size. CONCLUSION: The Meridian Energy Analysis Device is making a valuable contribution to the diagnosis of Qi-blood dysfunction, which can be assessed from short-term and long-term meridian bioenergy recordings. It is one of the few methods that allow outpatient traditional Chinese medicine diagnosis, monitoring of progress, assessment of therapeutic effect and evaluation of patient prognosis. The holistic approaches underlying the practice of traditional Chinese medicine, and new trends in modern medicine toward the use of objective instruments, require in-depth knowledge of the mechanisms of meridian energy, and the Meridian Energy Analysis Device can feasibly be used for understanding and interpreting traditional Chinese medicine theory, especially in view of its expansion in Western countries.

  2. Creating a sampling frame for population-based veteran research: representativeness and overlap of VA and Department of Defense databases.

    Science.gov (United States)

    Washington, Donna L; Sun, Su; Canning, Mark

    2010-01-01

    Most veteran research is conducted in Department of Veterans Affairs (VA) healthcare settings, although most veterans obtain healthcare outside the VA. Our objective was to determine the adequacy and relative contributions of Veterans Health Administration (VHA), Veterans Benefits Administration (VBA), and Department of Defense (DOD) administrative databases for representing the U.S. veteran population, using as an example the creation of a sampling frame for the National Survey of Women Veterans. In 2008, we merged the VHA, VBA, and DOD databases. We identified the number of unique records both overall and from each database. The combined databases yielded 925,946 unique records, representing 51% of the 1,802,000 U.S. women veteran population. The DOD database included 30% of the population (with 8% overlap with other databases). The VHA enrollment database contributed an additional 20% unique women veterans (with 6% overlap with VBA databases). VBA databases contributed an additional 2% unique women veterans (beyond 10% overlap with other databases). Use of VBA and DOD databases substantially expands access to the population of veterans beyond those in VHA databases, regardless of VA use. Adoption of these additional databases would enhance the value and generalizability of a wide range of studies of both male and female veterans.

  3. Combining censored and uncensored data in a U-statistic: design and sample size implications for cell therapy research.

    Science.gov (United States)

    Moyé, Lemuel A; Lai, Dejian; Jing, Kaiyan; Baraniuk, Mary Sarah; Kwak, Minjung; Penn, Marc S; Wu, Colon O

    2011-01-01

    The assumptions that anchor large clinical trials are rooted in smaller, Phase II studies. In addition to specifying the target population, intervention delivery, and patient follow-up duration, physician-scientists who design these Phase II studies must select the appropriate response variables (endpoints). However, endpoint measures can be problematic. If the endpoint assesses the change in a continuous measure over time, then the occurrence of an intervening significant clinical event (SCE), such as death, can preclude the follow-up measurement. Finally, the ideal continuous endpoint measurement may be contraindicated in a fraction of the study patients, a circumstance that requires a less precise substitution in this subset of participants. A score function based on the U-statistic can address these issues of 1) intercurrent SCEs and 2) response variable ascertainments that use different measurements of different precision. The scoring statistic is easy to apply, clinically relevant, and provides flexibility for the investigators' prospective design decisions. Sample size and power formulations for this statistic are provided as functions of clinical event rates and effect size estimates that are easy for investigators to identify and discuss. Examples are provided from current cardiovascular cell therapy research.
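
    A minimal sketch of the kind of pairwise scoring rule described above (the thresholds and data layout are hypothetical, not the paper's exact statistic): each treatment-control pair is compared first on the intervening clinical event, and only otherwise on the continuous endpoint, with a coarser tie margin when a measurement came from the less precise substitute instrument.

```python
def pair_score(a, b, precise_margin=0.0, coarse_margin=5.0):
    """Score one treatment-control pair for a U-statistic style endpoint.

    a, b: dicts with keys 'event' (True if the clinical event occurred),
    'value' (continuous endpoint, None if unmeasured) and 'precise' (False if
    the value came from the less precise substitute measurement).
    Returns +1 if a is better, -1 if b is better, 0 for a tie; higher values
    are assumed better, and the margins are purely illustrative.
    """
    # 1) Compare on the significant clinical event first
    if a["event"] != b["event"]:
        return -1 if a["event"] else 1
    # 2) Otherwise compare on the continuous endpoint, if both are available
    if a["value"] is None or b["value"] is None:
        return 0
    margin = precise_margin if (a["precise"] and b["precise"]) else coarse_margin
    diff = a["value"] - b["value"]
    return 1 if diff > margin else (-1 if diff < -margin else 0)

def u_statistic(treated, controls):
    """Average pairwise score over all treatment-control pairs."""
    scores = [pair_score(a, b) for a in treated for b in controls]
    return sum(scores) / len(scores)

treated = [{"event": False, "value": 12.0, "precise": True},
           {"event": True,  "value": None, "precise": True}]
controls = [{"event": False, "value": 4.0, "precise": False},
            {"event": False, "value": 6.0, "precise": True}]
print(u_statistic(treated, controls))
```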

  4. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  5. The Principle of Energetic Consistency

    Science.gov (United States)

    Cohn, Stephen E.

    2009-01-01

    A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
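
    One compact way to write the principle stated above (a sketch using a conventional one-half factor in the energy; the paper's own notation may differ) is the identity below, where x is the state in energy variables, and \bar{x} and P are its conditional mean and covariance given the observations.

```latex
% Total energy in energy variables, and its conditional expectation given the observations:
E(x) = \tfrac{1}{2}\, x^{\mathsf T} x ,
\qquad
\mathbb{E}\!\left[ E(x) \mid \text{obs} \right]
  = \tfrac{1}{2}\left( \bar{x}^{\mathsf T}\bar{x} + \operatorname{tr} P \right).
% If the dynamics conserve E(x), then the energy of the conditional mean plus the
% total variance tr(P) (scaled here by 1/2) is constant between observations.
```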

  6. Consistent guiding center drift theories

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-04-01

    Various guiding-center drift theories are presented that are optimized in respect of consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)

  7. Weak consistency and strong paraconsistency

    Directory of Open Access Journals (Sweden)

    Gemma Robles

    2009-11-01

    Full Text Available In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and the absence of the ECQ ("E contradictione quodlibet") rule that allows us to conclude any well-formed formula from any contradiction. The aim of this paper is to explain concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them, and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.

  8. Consistent force fields for saccharides

    DEFF Research Database (Denmark)

    Rasmussen, Kjeld

    1999-01-01

    Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x... -anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field, which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field...

  9. Glass consistency and glass performance

    International Nuclear Information System (INIS)

    Plodinec, M.J.; Ramsey, W.G.

    1994-01-01

    Glass produced by the Defense Waste Processing Facility (DWPF) will have to be consistently more durable than a benchmark glass (evaluated using a short-term leach test), with high confidence. The DWPF has developed a Glass Product Control Program to comply with this specification. However, it is not clear what relevance product consistency has to long-term glass performance. In this report, the authors show that DWPF glass, produced in compliance with this specification, can be expected to effectively limit the release of soluble radionuclides to natural environments. The release of insoluble radionuclides to the environment, however, will be limited by their solubility, and not by glass durability

  10. Time-consistent actuarial valuations

    NARCIS (Netherlands)

    Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.

    2016-01-01

    Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an

  11. Dynamically consistent oil import tariffs

    International Nuclear Information System (INIS)

    Karp, L.; Newbery, D.M.

    1992-01-01

    The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustible resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and the resulting tariff is found to rise at the rate of interest. This tariff is found to have an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff is characterized and found to differ markedly from the time-inconsistent open-loop tariff. It is shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs

  12. Consistency of Network Traffic Repositories: An Overview

    NARCIS (Netherlands)

    Lastdrager, E.; Lastdrager, E.E.H.; Pras, Aiko

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  13. Consistency analysis of network traffic repositories

    NARCIS (Netherlands)

    Lastdrager, Elmer; Lastdrager, E.E.H.; Pras, Aiko

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  14. Do Health Systems Have Consistent Performance Across Locations and Is Consistency Associated With Higher Performance?

    Science.gov (United States)

    Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D

    This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.

  15. Consistently violating the non-Gaussian consistency relation

    International Nuclear Information System (INIS)

    Mooij, Sander; Palma, Gonzalo A.

    2015-01-01

    Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations

  16. Variability of carotid artery measurements on 3-Tesla MRI and its impact on sample size calculation for clinical research.

    Science.gov (United States)

    Syed, Mushabbar A; Oshinski, John N; Kitchen, Charles; Ali, Arshad; Charnigo, Richard J; Quyyumi, Arshed A

    2009-08-01

    Carotid MRI measurements are increasingly being employed in research studies for atherosclerosis imaging. The majority of carotid imaging studies use 1.5 T MRI. Our objective was to investigate intra-observer and inter-observer variability in carotid measurements using high-resolution 3 T MRI. We performed 3 T carotid MRI on 10 patients (age 56 ± 8 years, 7 male) with atherosclerosis risk factors and ultrasound intima-media thickness ≥0.6 mm. A total of 20 transverse images of both right and left carotid arteries were acquired using a T2-weighted black-blood sequence. The lumen and outer wall of the common carotid and internal carotid arteries were manually traced; vessel wall area, vessel wall volume, and average wall thickness measurements were then assessed for intra-observer and inter-observer variability. Pearson and intraclass correlations were used in these assessments, along with Bland-Altman plots. For inter-observer variability, Pearson correlations ranged from 0.936 to 0.996 and intraclass correlations from 0.927 to 0.991. For intra-observer variability, Pearson correlations ranged from 0.934 to 0.954 and intraclass correlations from 0.831 to 0.948. Calculations showed that inter-observer variability and other sources of error would inflate sample size requirements for a clinical trial by no more than 7.9%, indicating that 3 T MRI is nearly optimal in this respect. In patients with subclinical atherosclerosis, 3 T carotid MRI measurements are highly reproducible, which has important implications for clinical trial design.
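
    The quoted 7.9% inflation is consistent with the standard back-of-the-envelope rule that measurement error simply adds variance to the outcome, so the required sample size scales with the reciprocal of the measurement reliability (intraclass correlation). The sketch below only illustrates that generic rule under that assumption; it is not the calculation used in the paper.

```python
def sample_size_inflation(icc):
    """Factor by which required n grows when outcome reliability is icc.

    Assumes measurement error adds variance, so the total outcome variance is
    sigma_true^2 / icc and required n scales accordingly (a generic rule of thumb).
    """
    return 1.0 / icc

# An intraclass correlation of about 0.927 corresponds to roughly the 7.9%
# inflation quoted in the abstract: 1/0.927 - 1 = 0.079
print(round(sample_size_inflation(0.927) - 1, 3))
```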

  17. Does self-selection affect samples' representativeness in online surveys? An investigation in online video game research.

    Science.gov (United States)

    Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-07-07

    The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various games' scores, reported on the WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of sample of online surveys is warranted.

  18. Cognitive consistency and math-gender stereotypes in Singaporean children.

    Science.gov (United States)

    Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu

    2014-01-01

    In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    Inconsistent firewall/VPN (Virtual Private Network) rules impose a large maintenance cost. With the growth of multinational companies, SOHO offices, and e-government, the number of firewalls and VPNs will increase rapidly, and rule tables on stand-alone hosts or across the network will grow geometrically. Checking the consistency of rule tables manually is inadequate; a formal approach can define semantic consistency and provide a theoretical foundation for intelligent management of rule tables. In this paper, a set-theoretic formalization of host and network rules for automatic rule validation is proposed and a rule validation scheme is defined. The analysis shows the good performance of the method and demonstrates its potential for rule-table-based intelligent management.
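
    As an editorial sketch of the set-based consistency idea (the rule model below is hypothetical and far simpler than a real firewall/VPN rule table), two filtering rules are in conflict when their match sets overlap but their actions disagree:

        import ipaddress
        from dataclasses import dataclass

        @dataclass
        class Rule:
            src: str          # source prefix, e.g. "10.0.0.0/24"
            dports: tuple     # destination port range (lo, hi)
            action: str       # "accept" or "deny"

        def match_sets_overlap(r1, r2):
            """True if some packet could match both rules."""
            nets_overlap = ipaddress.ip_network(r1.src).overlaps(ipaddress.ip_network(r2.src))
            ports_overlap = max(r1.dports[0], r2.dports[0]) <= min(r1.dports[1], r2.dports[1])
            return nets_overlap and ports_overlap

        def find_conflicts(rules):
            """Pairs of rules whose match sets overlap but whose actions disagree."""
            return [(i, j) for i in range(len(rules)) for j in range(i + 1, len(rules))
                    if rules[i].action != rules[j].action and match_sets_overlap(rules[i], rules[j])]

        table = [
            Rule("10.0.0.0/24", (80, 80), "accept"),
            Rule("10.0.0.0/16", (1, 1024), "deny"),    # overlaps the first rule with a different action
        ]
        print(find_conflicts(table))                   # -> [(0, 1)]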

  20. Self-consistent radial sheath

    International Nuclear Information System (INIS)

    Hazeltine, R.D.

    1988-12-01

    The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E × B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig

  1. Lagrangian multiforms and multidimensional consistency

    Energy Technology Data Exchange (ETDEWEB)

    Lobb, Sarah; Nijhoff, Frank [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2009-10-30

    We show that well-chosen Lagrangians for a class of two-dimensional integrable lattice equations obey a closure relation when embedded in a higher dimensional lattice. On the basis of this property we formulate a Lagrangian description for such systems in terms of Lagrangian multiforms. We discuss the connection of this formalism with the notion of multidimensional consistency, and the role of the lattice from the point of view of the relevant variational principle.

  2. Consistency and Communication in Committees

    OpenAIRE

    Inga Deimen; Felix Ketelaar; Mark T. Le Quement

    2013-01-01

    This paper analyzes truthtelling incentives in pre-vote communication in heterogeneous committees. We generalize the classical Condorcet jury model by introducing a new informational structure that captures consistency of information. In contrast to the impossibility result shown by Coughlan (2000) for the classical model, full pooling of information followed by sincere voting is an equilibrium outcome of our model for a large set of parameter values implying the possibility of ex post confli...

  3. Deep Feature Consistent Variational Autoencoder

    OpenAIRE

    Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping

    2016-01-01

    We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...
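
    A minimal sketch of a deep-feature-consistency loss of the kind described, using a pre-trained VGG network as a fixed feature extractor; the layer selection and weighting are illustrative assumptions, not the paper's exact configuration, and the usual input normalization for VGG is omitted for brevity.

        import torch
        import torch.nn.functional as F
        from torchvision import models

        # Pre-trained VGG-19 used only as a fixed feature extractor (weights frozen).
        # Newer torchvision versions take a `weights=` argument instead of `pretrained=True`.
        vgg = models.vgg19(pretrained=True).features.eval()
        for p in vgg.parameters():
            p.requires_grad_(False)

        # Indices of the VGG feature maps to compare (illustrative choice).
        FEATURE_LAYERS = {3, 8, 17}

        def feature_consistency_loss(x, x_recon):
            """Sum of MSEs between hidden VGG activations of the input and the reconstruction."""
            loss = 0.0
            a, b = x, x_recon
            for i, layer in enumerate(vgg):
                a, b = layer(a), layer(b)
                if i in FEATURE_LAYERS:
                    loss = loss + F.mse_loss(a, b)
            return loss

        # Training-step sketch: total loss = VAE KL term + feature consistency on the reconstruction.
        # total = kl_divergence + feature_consistency_loss(images, reconstructions)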

  4. Consistency Analysis of Nearest Subspace Classifier

    OpenAIRE

    Wang, Yi

    2015-01-01

    The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...
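
    A compact NumPy sketch of the nearest subspace idea on synthetic data (here a centered, i.e. affine, variant): fit a low-dimensional subspace per class via SVD and assign a new point to the class whose subspace reconstructs it with the smallest residual.

        import numpy as np

        def fit_subspaces(X, y, dim=2):
            """Return per-class (mean, basis) where basis spans a dim-dimensional subspace."""
            subspaces = {}
            for c in np.unique(y):
                Xc = X[y == c]
                mu = Xc.mean(axis=0)
                _, _, Vt = np.linalg.svd(Xc - mu, full_matrices=False)
                subspaces[c] = (mu, Vt[:dim])          # top principal directions of the class
            return subspaces

        def predict(x, subspaces):
            """Assign x to the class whose subspace reconstructs it with least error."""
            best, best_err = None, np.inf
            for c, (mu, V) in subspaces.items():
                centered = x - mu
                proj = V.T @ (V @ centered)            # projection onto the class subspace
                err = np.linalg.norm(centered - proj)
                if err < best_err:
                    best, best_err = c, err
            return best

        # Tiny synthetic example with two well-separated classes.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(3, 1, (50, 5))])
        y = np.array([0] * 50 + [1] * 50)
        model = fit_subspaces(X, y)
        print(predict(rng.normal(3, 1, 5), model))     # expected: 1 (usually)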

  5. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  6. Spherical sampling

    CERN Document Server

    Freeden, Willi; Schreiner, Michael

    2018-01-01

    This book presents, in a consistent and unified overview, results and developments in the field of today's spherical sampling, particularly as arising in the mathematical geosciences. Although the book often refers to original contributions, the authors have made them accessible to (graduate) students and scientists not only from mathematics but also from the geosciences and geoengineering. Building a library of topics in spherical sampling theory, it shows how advances in this theory lead to new discoveries in mathematical, geodetic, and geophysical as well as other scientific branches such as neuro-medicine. A must-read for everybody working in the area of spherical sampling.

  7. Consistent ranking of volatility models

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger

    2006-01-01

    We show that the empirical ranking of volatility models can be inconsistent for the true ranking if the evaluation is based on a proxy for the population measure of volatility. For example, the substitution of a squared return for the conditional variance in the evaluation of ARCH-type models can...... variance in out-of-sample evaluations rather than the squared return. We derive the theoretical results in a general framework that is not specific to the comparison of volatility models. Similar problems can arise in comparisons of forecasting models whenever the predicted variable is a latent variable....
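
    A toy simulation, not drawn from the paper, that makes the proxy issue concrete: two volatility forecasts are scored against both the (simulated, hence known) true conditional variance and the squared-return proxy. The proxy-based losses are much larger and noisier, and the paper's point is that for certain loss functions this substitution can even reverse the true ranking.

        import numpy as np

        rng = np.random.default_rng(1)
        T = 2000
        # True conditional variance follows a persistent (GARCH-like) recursion.
        sigma2 = np.empty(T)
        r = np.empty(T)
        sigma2[0] = 1.0
        for t in range(T):
            r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
            if t + 1 < T:
                sigma2[t + 1] = 0.05 + 0.9 * sigma2[t] + 0.05 * r[t] ** 2

        # Two competing "forecasts": one close to the truth, one a flat benchmark.
        good = 0.95 * sigma2 + 0.05
        bad = np.full(T, sigma2.mean())

        def mse(forecast, target):
            return np.mean((forecast - target) ** 2)

        proxy = r ** 2                     # squared return: an unbiased but very noisy volatility proxy
        print("loss vs true variance :", mse(good, sigma2), mse(bad, sigma2))
        print("loss vs squared return:", mse(good, proxy), mse(bad, proxy))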

  8. Collaboration in research and the influential factors in Golestan University of Medical Sciences research projects (2005-2007): an academic sample from Iran.

    Science.gov (United States)

    Borghei, Afsaneh; Qorbani, Mostafa; Rezapour, Aziz; Majdzadeh, Reza; Nedjat, Saharnaz; Asayesh, Hamid; Mansourian, Morteza; Noroozi, Mahdi; Jahahgir, Fereydoon

    2013-08-01

    The number of Iranian articles published in ISI journals has increased significantly in recent years. Despite this quantitative progress, studies performed in Iran show low collaboration in research; therefore, we decided to evaluate collaboration in Golestan University of Medical Sciences (GOUMS) research projects. In this cross-sectional study, all GOUMS research projects that had received grants from the university between 2005 and 2007 were studied. Among the 107 research projects included in our study, 102 projects were evaluated and checklists were completed. The researcher's questionnaire was sent to the principal investigators (n=46) of the projects and eventually 40 questionnaires were collected. The review of 102 research proposals shows that 10 projects (9.8%) had been performed in collaboration with other organizations. Scientific outputs from these projects were greater than from projects confined to the university (98% compared to 68%; p=0.04). The total cost of the projects under study was a little more than 300,000 US$. In just 12 projects (11.8%) was a part of the cost provided by organizations outside the university. About 50% of researchers declared that they had chosen their research topic based on their "personal interest". Only 1 project was performed at the request of nongovernmental organizations, and 12 researchers reported no collaboration in their activities. This study shows that collaboration in GOUMS research projects is low. Moreover, collaborations with governmental and nongovernmental organizations are trivial. The scientific outputs of collaborative research projects are much greater than those of other projects.

  9. Maintaining consistency in distributed systems

    Science.gov (United States)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
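
    A minimal sketch of the first style mentioned above, mutual exclusion within a single program (it does not illustrate transactions or virtual synchrony): a shared counter is updated by several threads under a lock so that no increments are lost.

        import threading

        counter = 0
        lock = threading.Lock()

        def deposit(n):
            """Increment the shared counter n times under a mutex."""
            global counter
            for _ in range(n):
                with lock:                 # mutual exclusion: one thread at a time in the critical section
                    counter += 1

        threads = [threading.Thread(target=deposit, args=(100_000,)) for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        print(counter)                     # 400000: no lost updates thanks to the lock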

  10. Self-consistent asset pricing models

    Science.gov (United States)

    Malevergne, Y.; Sornette, D.

    2007-08-01

    self-consistency condition derives a risk-factor decomposition in the multi-factor case which is identical to the principal component analysis (PCA), thus providing a direct link between model-driven and data-driven constructions of risk factors. This correspondence shows that PCA will therefore suffer from the same limitations as the CAPM and its multi-factor generalization, namely lack of out-of-sample explanatory power and predictability. In the multi-period context, the self-consistency conditions force the betas to be time-dependent with specific constraints.
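
    Since the abstract ties the model-driven factor construction to principal component analysis of returns, here is a hedged sketch of the data-driven side on synthetic data: extracting the leading risk factors from a return matrix via PCA (this is generic PCA, not the authors' derivation).

        import numpy as np

        rng = np.random.default_rng(0)
        T, N = 500, 20                     # time periods, assets
        market = rng.normal(0, 0.01, T)    # one common driver of returns
        betas = rng.uniform(0.5, 1.5, N)
        returns = np.outer(market, betas) + rng.normal(0, 0.005, (T, N))

        # PCA via SVD of the demeaned return matrix.
        R = returns - returns.mean(axis=0)
        U, S, Vt = np.linalg.svd(R, full_matrices=False)
        explained = S**2 / np.sum(S**2)
        factors = U[:, :3] * S[:3]         # first three principal-component "risk factors"
        loadings = Vt[:3]                  # asset exposures to those factors

        print("variance explained by first 3 PCs:", explained[:3].round(3))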

  11. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our...... sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that —on the whole— our sampling technique compares favourably with other methods, especially in the case of exploratory research....

  12. Evaluating Temporal Consistency in Marine Biodiversity Hotspots

    OpenAIRE

    Piacenza, Susan E.; Thurman, Lindsey L.; Barner, Allison K.; Benkwitt, Cassandra E.; Boersma, Kate S.; Cerny-Chipman, Elizabeth B.; Ingeman, Kurt E.; Kindinger, Tye L.; Lindsley, Amy J.; Nelson, Jake; Reimer, Jessica N.; Rowe, Jennifer C.; Shen, Chenchen; Thompson, Kevin A.; Heppell, Selina S.

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monito...

  13. Ethical considerations in forensic genetics research on tissue samples collected post-mortem in Cape Town, South Africa.

    Science.gov (United States)

    Heathfield, Laura J; Maistry, Sairita; Martin, Lorna J; Ramesar, Raj; de Vries, Jantina

    2017-11-29

    The use of tissue collected at a forensic post-mortem for forensic genetics research purposes remains of ethical concern as the process involves obtaining informed consent from grieving family members. Two forensic genetics research studies using tissue collected from a forensic post-mortem were recently initiated at our institution and were the first of their kind to be conducted in Cape Town, South Africa. This article discusses some of the ethical challenges that were encountered in these research projects. Among these challenges was the adaptation of research workflows to fit in with an exceptionally busy service delivery that is operating with limited resources. Whilst seeking guidance from the literature regarding research on deceased populations, it was noted that next of kin of decedents are not formally recognised as a vulnerable group in the existing ethical and legal frameworks in South Africa. The authors recommend that research in the forensic mortuary setting is approached using guidance for vulnerable groups, and the benefit to risk standard needs to be strongly justified. Lastly, when planning forensic genetics research, consideration must be given to the potential of uncovering incidental findings, funding to validate these findings and the feedback of results to family members; the latter of which is recommended to occur through a genetic counsellor. It is hoped that these experiences will contribute towards a formal framework for conducting forensic genetic research in medico-legal mortuaries in South Africa.

  14. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
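
    The following sketch is not the ez-Segway mechanism itself; it only illustrates, on a hypothetical next-hop table, the kind of safety property such update schemes protect: a candidate forwarding state should be free of loops and blackholes before it is installed.

        def check_forwarding(next_hop, destination):
            """Return problems found in a next-hop table for one destination.

            next_hop maps switch -> next switch (or the destination); a missing entry is a
            blackhole, and a revisited switch is a forwarding loop.
            """
            problems = []
            for start in next_hop:
                seen, node = set(), start
                while node != destination:
                    if node in seen:
                        problems.append(f"loop reachable from {start}")
                        break
                    seen.add(node)
                    if node not in next_hop:
                        problems.append(f"blackhole at {node} (no rule)")
                        break
                    node = next_hop[node]
            return problems

        # Candidate update: s3's rule points back to s1, creating a loop.
        candidate = {"s1": "s2", "s2": "s3", "s3": "s1"}
        print(check_forwarding(candidate, destination="d"))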

  15. Archive of information about geological samples available for research from the Ohio State University Byrd Polar and Climate Research Center (BPCRC) Polar Rock Repository

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Polar Rock Repository (PRR) operated by the Byrd Polar and Climate Research Center (BPCRC) at the Ohio State University is a partner in the Index to Marine and...

  16. Report of the First Community Consultation on the Responsible Collection and Use of Samples for Genetic Research, September 25-26, 2000

    Energy Technology Data Exchange (ETDEWEB)

    Greenberg, Judith H.

    2002-05-22

    The First Community Consultation on the Responsible Collection and Use of Samples for Genetic Research was held in Bethesda, Maryland, on September 25-26, 2000. The consultation was convened by the National Institute of General Medical Sciences (NIGMS) of the National Institutes of Health (NIH). Approximately 120 individuals participated in the consultation, half from a broad range of communities and populations, and half from government. The participants shared their views and concerns about population- and community-based genetic research, expanding the focus of the meeting from the collection and use of blood or other tissue samples for genetic research to broader issues and concerns about the conduct of genetic research in general with populations and communities.

  17. Researching research

    DEFF Research Database (Denmark)

    Pais, Alexandre; Valero, Paola

    2012-01-01

    We discuss contemporary theories in mathematics education in order to do research on research. Our strategy consists of analysing discursively and ideologically recent key publications addressing the role of theory in mathematics education research. We examine how the field fabricates its object...... of research by deploying Foucault’s notion of bio-politics - mainly to address the object “learning” - and Žižek’s ideology critique - to address the object “mathematics”. These theories, which have already been used in the field to research teaching and learning, have a great potential to contribute...... to a reflexivity of research on its discourses and effects. Furthermore, they enable us to present a clear distinction between what has been called the sociopolitical turn in mathematics education research and what we call a positioning of mathematics education (research) practices in the Political....

  18. Ambient air sampling for radioactive air contaminants at Los Alamos National Laboratory: A large research and development facility

    International Nuclear Information System (INIS)

    Eberhart, C.F.

    1998-01-01

    This paper describes the ambient air sampling program for the collection, analysis, and reporting of radioactive air contaminants in and around Los Alamos National Laboratory (LANL). Particulate matter and water vapor are sampled continuously at more than 50 sites. These samples are collected every two weeks and then analyzed for tritium and for gross alpha, gross beta, and gamma ray radiation. The alpha, beta, and gamma measurements are used to detect unexpected radionuclide releases. Quarterly composites are analyzed for isotopes of uranium (234U, 235U, 238U), plutonium (238Pu, 239/240Pu), and americium (241Am). All data are stored in a relational database, with hard copies kept as the official records. Data used to determine environmental concentrations are validated and verified before being used in any calculations. This evaluation demonstrates that the sampling and analysis process can detect tritium, uranium, plutonium, and americium at levels much less than one percent of the public dose limit of 10 millirem. The isotopic results also indicate that, except for tritium, off-site concentrations of radionuclides potentially released from LANL are similar to typical background measurements.

  19. Barriers to publishing in biomedical journals perceived by a sample of French researchers: results of the DIAzePAM study.

    Science.gov (United States)

    Duracinsky, Martin; Lalanne, Christophe; Rous, Laurence; Dara, Aichata Fofana; Baudoin, Lesya; Pellet, Claire; Descamps, Alexandre; Péretz, Fabienne; Chassany, Olivier

    2017-07-10

    As publishing is essential but competitive for researchers, difficulties in writing and submitting medical articles to biomedical journals are disabling. The DIAzePAM (Difficultés des Auteurs à la Publication d'Articles Médicaux) survey aimed to assess the difficulties experienced by researchers in the AP-HP (Assistance Publique - Hôpitaux de Paris, i.e., Paris Hospitals Board, France), the largest public health institution in Europe, when preparing articles for biomedical journals. The survey also aimed to assess researchers' satisfaction and perceived needs. A 39-item electronic questionnaire based on qualitative interviews was addressed by e-mail to all researchers registered in the AP-HP SIGAPS (Système d'Interrogation, de Gestion et d'Analyse des Publications Scientifiques) bibliometric database. Between 28 May and 15 June 2015, 7766 researchers should have received and read the e-mail, and 1191 anonymously completed the questionnaire (<45 years of age: 63%; women: 55%; physician: 81%; with PhD or accreditation to direct research: 45%). 94% of respondents had published at least one article in the previous 2 years. 76% of respondents felt they were not publishing enough, mainly because of lack of time to write (79%) or submit (27%), limited skills in English (40%) or in writing (32%), and difficulty in starting writing (35%). 87% of respondents would accept technical support, especially in English reediting (79%), critical reediting (63%), formatting (52%), and/or writing (41%), to save time (92%) and increase high-impact-factor journal submission and acceptance (75%). 79% of respondents would appreciate funding support for their future publications, for English reediting (56%), medical writing (21%), or publication (38%) fees. They considered that this funding support could be covered by AP-HP (73%) and/or by the added financial value obtained by their department from previous publications (56%). The DIAzePAM survey highlights difficulties experienced by researchers preparing articles for biomedical journals, and details room for improvement.

  20. The Japanese Society of Pathology Guidelines on the handling of pathological tissue samples for genomic research: Standard operating procedures based on empirical analyses.

    Science.gov (United States)

    Kanai, Yae; Nishihara, Hiroshi; Miyagi, Yohei; Tsuruyama, Tatsuhiro; Taguchi, Kenichi; Katoh, Hiroto; Takeuchi, Tomoyo; Gotoh, Masahiro; Kuramoto, Junko; Arai, Eri; Ojima, Hidenori; Shibuya, Ayako; Yoshida, Teruhiko; Akahane, Toshiaki; Kasajima, Rika; Morita, Kei-Ichi; Inazawa, Johji; Sasaki, Takeshi; Fukayama, Masashi; Oda, Yoshinao

    2018-02-01

    Genome research using appropriately collected pathological tissue samples is expected to yield breakthroughs in the development of biomarkers and identification of therapeutic targets for diseases such as cancers. In this connection, the Japanese Society of Pathology (JSP) has developed "The JSP Guidelines on the Handling of Pathological Tissue Samples for Genomic Research" based on an abundance of data from empirical analyses of tissue samples collected and stored under various conditions. Tissue samples should be collected from appropriate sites within surgically resected specimens, without disturbing the features on which pathological diagnosis is based, while avoiding bleeding or necrotic foci. They should be collected as soon as possible after resection: at the latest within about 3 h of storage at 4°C. Preferably, snap-frozen samples should be stored in liquid nitrogen (about -180°C) until use. When intending to use genomic DNA extracted from formalin-fixed paraffin-embedded tissue, 10% neutral buffered formalin should be used. Insufficient fixation and overfixation must both be avoided. We hope that pathologists, clinicians, clinical laboratory technicians and biobank operators will come to master the handling of pathological tissue samples based on the standard operating procedures in these Guidelines to yield results that will assist in the realization of genomic medicine. © 2018 The Authors. Pathology International published by Japanese Society of Pathology and John Wiley & Sons Australia, Ltd.

  1. DOE/DOT Crude Oil Characterization Research Study, Task 2 Test Report on Evaluating Crude Oil Sampling and Analysis Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Ray [Allen Energy Services, Inc., Longview, TX (United States); Rudeen, David [GRAM, Inc., Albuquerque, NM (United States)

    2017-11-01

    The Crude Oil Characterization Research Study is designed to evaluate whether crude oils currently transported in North America, including those produced from "tight" formations, exhibit physical or chemical properties that are distinct from conventional crudes, and how these properties relate to combustion hazards which may be realized during transportation and handling.

  2. Assessment of the National Research Universal Reactor Proposed New Stack Sampling Probe Location for Compliance with ANSI/HPS N13.1-1999

    Energy Technology Data Exchange (ETDEWEB)

    Glissmeyer, John A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonio, Ernest J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Flaherty, Julia E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-02-29

    This document reports on a series of tests conducted to assess the proposed air sampling location for the National Research Universal reactor (NRU) complex exhaust stack, located in Chalk River, Ontario, Canada, with respect to the applicable criteria regarding the placement of an air sampling probe. Due to the age of the equipment in the existing monitoring system, and the increasing difficulty in acquiring replacement parts to maintain this equipment, a more up-to-date system is planned to replace the current effluent monitoring system, and a new monitoring location has been proposed. The new sampling probe should be located within the exhaust stack according to the criteria established by the American National Standards Institute/Health Physics Society (ANSI/HPS) N13.1-1999, Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stack and Ducts of Nuclear Facilities. These criteria address the capability of the sampling probe to extract a sample that represents the effluent stream. The internal Pacific Northwest National Laboratory (PNNL) project for this task was 65167, Atomic Energy Canada Ltd. Chalk River Effluent Duct Flow Qualification. The testing described in this document was guided by the Test Plan: Testing of the NRU Stack Air Sampling Position (TP-STMON-032).
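
    The qualification criteria in ANSI/HPS N13.1-1999 are largely expressed through the uniformity of velocity and tracer profiles over a traverse grid at the proposed probe location, commonly summarized as a coefficient of variation (COV). A small sketch of that calculation with made-up grid readings and an illustrative 20% acceptance threshold:

        import numpy as np

        def cov_percent(readings):
            """Coefficient of variation (%) of traverse-grid readings."""
            readings = np.asarray(readings, dtype=float)
            return 100.0 * readings.std(ddof=1) / readings.mean()

        # Hypothetical velocity readings (m/s) over a traverse grid at the proposed location.
        velocity_grid = [11.8, 12.1, 12.4, 11.9, 12.3, 12.0, 11.7, 12.2]
        cov = cov_percent(velocity_grid)
        print(f"velocity COV = {cov:.1f}%  ->  {'acceptable' if cov <= 20.0 else 'not acceptable'}")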

  3. Employee Satisfaction in Hospitals with Afilasyo; Sample of Training and Research Hospital of University of Mugla Sitki Kocman

    Directory of Open Access Journals (Sweden)

    Nazli Ülger

    2016-01-01

    Full Text Available Aim: To determine the factors that affect employee satisfaction, which plays an important role in providing qualified and efficient service, at Mugla Sitki Koçman University Training and Research Hospital, where affiliation is applied. Material and Method: A questionnaire was administered to Mugla Sitki Koçman University Training and Research Hospital employees. The data obtained from the questionnaires were transferred to SPSS for analysis. Reliability analysis, comparison of means, and one-way analysis of variance (ANOVA) were performed. Results: According to the results, every kind of verbal and physical mobbing affected the burnout dimensions of dissatisfaction, emotional exhaustion, and depersonalization. Additionally, these conditions had a direct effect on job satisfaction and working cooperation. Discussion: Burnout syndrome is present among health employees at different dimensions and levels. Since improving the conditions of health employees contributes directly to better patient services, it is suggested that attention be paid to improving their working conditions and welfare, highlighting the importance of employees to institutions and the community, and developing their social status.

  4. Barriers to publishing in biomedical journals perceived by a sample of French researchers: results of the DIAzePAM study

    Directory of Open Access Journals (Sweden)

    Martin Duracinsky

    2017-07-01

    Full Text Available Abstract Background As publishing is essential but competitive for researchers, difficulties in writing and submitting medical articles to biomedical journals are disabling. The DIAzePAM (Difficultés des Auteurs à la Publication d’Articles Médicaux) survey aimed to assess the difficulties experienced by researchers in the AP-HP (Assistance Publique – Hôpitaux de Paris, i.e., Paris Hospitals Board, France), the largest public health institution in Europe, when preparing articles for biomedical journals. The survey also aimed to assess researchers’ satisfaction and perceived needs. Methods A 39-item electronic questionnaire based on qualitative interviews was addressed by e-mail to all researchers registered in the AP-HP SIGAPS (Système d’Interrogation, de Gestion et d’Analyse des Publications Scientifiques) bibliometric database. Results Between 28 May and 15 June 2015, 7766 researchers should have received and read the e-mail, and 1191 anonymously completed the questionnaire (<45 years of age: 63%; women: 55%; physician: 81%; with PhD or Habilitation à Diriger des recherches, i.e., accreditation to direct research: 45%). 94% of respondents had published at least one article in the previous 2 years. 76% of respondents felt they were not publishing enough, mainly because of lack of time to write (79%) or submit (27%), limited skills in English (40%) or in writing (32%), and difficulty in starting writing (35%). 87% of respondents would accept technical support, especially in English reediting (79%), critical reediting (63%), formatting (52%), and/or writing (41%), to save time (92%) and increase high-impact-factor journal submission and acceptance (75%). 79% of respondents would appreciate funding support for their future publications, for English reediting (56%), medical writing (21%), or publication (38%) fees. They considered that this funding support could be covered by AP-HP (73%) and/or by the added financial value obtained by their

  5. Amostragem domiciliar contínua em estudos epidemiológicos e no ensino Continuous household sampling for epidemiological research and for teaching purposes

    Directory of Open Access Journals (Sweden)

    José da Rocha Carvalheiro

    1979-09-01

    Full Text Available A continuous system for surveying health conditions through household interviews, operating in Ribeirão Preto (SP) since 1974, is described, with comments on its advantages for investigating specific problems arising during this period as well as on its use in teaching. The use of an adequate populational-base survey is frequently impossible in epidemiological studies. Special studies are made among particular groups of individuals to investigate simultaneously the presence of both the factor and the disease. In these studies it is obviously important to use adequate sampling techniques. A system of continuous household sampling is described, designed to perform, simultaneously, epidemiological research, health system monitoring and to serve as a basis for courses on sampling techniques and epidemiological methods. In the municipality of Ribeirão Preto, S. Paulo, Brazil a household sampling system has been in operation since 1974, using a master sample of 8500 households. Every two weeks, 380 households are visited and information is gathered about diseases, accidents, and the use of health services. Special epidemiological research is introduced when necessary. Future development includes the use of standardized questionnaires and physical and laboratory examinations of the people interviewed.
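
    A toy sketch of the biweekly selection step described above, drawing 380 of the 8500 master-sample households without replacement; the real system's rotation scheme is certainly more structured than this simple random draw.

        import random

        MASTER_SAMPLE = list(range(1, 8501))   # household IDs in the master sample

        def draw_biweekly_sample(seed, size=380):
            """Simple random draw of households to visit in one two-week period."""
            rng = random.Random(seed)
            return sorted(rng.sample(MASTER_SAMPLE, size))

        visits = draw_biweekly_sample(seed=202401)
        print(len(visits), visits[:10])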

  6. [Current modalities and concepts on access and use of biospecimen samples and associated data for research from human biobanks].

    Science.gov (United States)

    Siddiqui, Roman; Semler, Sebastian Claudius

    2016-03-01

    It is accepted worldwide that biospecimen and data sharing (BDS) plays an essential role in the future of medical research, improving diagnostics and prognostics, e.g. through validated biomarkers. BDS is also pivotal to the development of new therapeutic treatments and to the improvement of population health. Human biobanks can add value here by providing biospecimens and/or associated data to researchers. An inspection of several examples of epidemiological as well as clinical/disease-oriented biobanks in Germany shows that internationally agreed best practice procedures (BPP) are being put in place for biospecimen and/or data access. In general, fair access requires a written application by the requesting scientist, which is then peer-reviewed for scientific and ethical validity by the biobank. Applied BPP take into account (i) patient education/agreement according to the informed consent model, (ii) privacy protection, (iii) intellectual property rights, (iv) the notification obligation for health-related findings (including incidental findings), and (v) the use of material transfer agreements (MTA) and data transfer agreements (DTA) for mutual legal security, as well as the avoidance of conflicts of interest and cost recovery/fee for service as a basis for the sustainability of the biobank. BPP are rooted in the self-regulation efforts of the life sciences and are supported by parent ethics committees in Germany. Central biobank registries displaying aggregated information on stored biospecimens and research foci are an important tool for making biobanks scattered across the country visible to each other and can thus promote access to hitherto unknown biospecimen and data resources.

  7. Materials and Life Science Experimental Facility at the Japan Proton Accelerator Research Complex III: Neutron Devices and Computational and Sample Environments

    Directory of Open Access Journals (Sweden)

    Kaoru Sakasai

    2017-08-01

    Full Text Available Neutron devices such as neutron detectors, optical devices including supermirror devices and 3He neutron spin filters, and choppers have been successfully developed and installed at the Materials and Life Science Experimental Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC), Tokai, Japan. Four software components of the MLF computational environment (instrument control, data acquisition, data analysis, and a database) have been developed and deployed at MLF. MLF also provides a wide variety of sample environment options, including high and low temperatures, high magnetic fields, and high pressures. This paper describes the current status of the neutron devices and of the computational and sample environments at MLF.

  8. On the Sampling

    OpenAIRE

    Güleda Doğan

    2017-01-01

    This editorial is on statistical sampling, which is one of the two most important reasons for editorial rejection from our journal, Turkish Librarianship. The stages of quantitative research, the stage at which we sample, the importance of sampling for research, deciding on sample size, and sampling methods are summarised briefly.

  9. The Enhancement of Consistency of Interpretation Skills on the Newton’s Laws Concept

    Directory of Open Access Journals (Sweden)

    Yudi Kurniawan

    2018-03-01

    Full Text Available Conceptual understanding is more important for students to attain than achievement alone, and interpretation skill is one aspect of conceptual understanding. The aim of this paper is to assess the consistency of students' interpretation skills and, at the same time, to measure the levels by which these skills increase. These variables were taught through common-sense Interactive Lecture Demonstrations (ILD). The method of this research is pre-experimental with a one-group pretest-posttest design, and the sample was taken by cluster random sampling. The results showed that 16% of the students had perfectly consistent interpretation skills, and that interpretation skill increased in 84% of the students, moving from not understanding to understanding. This finding could be used by future researchers to study other aspects of conceptual understanding.
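
    The level of increase in a one-group pretest-posttest design is often summarized as a normalized gain; a small sketch of that calculation on hypothetical pre/post scores (the study's own scoring may well differ):

        def normalized_gain(pre, post, max_score=100):
            """Hake-style normalized gain: fraction of the possible improvement achieved."""
            return (post - pre) / (max_score - pre)

        # Hypothetical class-average interpretation-skill scores (percent correct).
        pre, post = 42.0, 71.0
        print(f"normalized gain g = {normalized_gain(pre, post):.2f}")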

  10. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Chung, Young Ju; Jeong, Eui Sik; Lee, Sang Mi; Kang, Sang Hun; Cho, Seung Yeon; Kwon, Young Sik; Chung, Sang Wuk; Lee, Kyu Sung; Chun, Ki Hong; Kim, Nak Bae; Lee, Kil Yong; Yoon, Yoon Yeol; Chun, Sang Ki.

    1997-09-01

    This research report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples were analyzed quantitatively, and the accuracy and precision of the method were measured. Using airborne particulate matter and a biomonitor chosen as environmental indicators, trace elemental concentrations of samples collected monthly at urban and rural sites were determined, and then statistical calculations and factor analysis were carried out to investigate emission sources. Facilities for NAA were installed in the new HANARO reactor, and functional tests were performed for routine operation. In addition, a unified software code for NAA was developed to improve the accuracy, precision, and capabilities of the analytical processes. (author). 103 refs., 61 tabs., 19 figs

  11. Prevalence of Methicillin Resistant Staphylococcus aureus in Clinical Samples of Teerthankar Mahaveer Medical College Hospital and Research Centre (TMMCH & RC, Moradabad (UP, India

    Directory of Open Access Journals (Sweden)

    Bina Pani Gupta

    2017-06-01

    Full Text Available Staphylococcus aureus is an emerging and prevalent pathogen causing serious community- and hospital-associated infections, and methicillin-resistant S. aureus is now a large and expanding problem of concern in India. Among the different pathogens, S. aureus is studied for the prevalence of infections and drug-resistance behavior. The present study describes the prevalence of Staphylococcus aureus in clinical samples from TMU, Moradabad, India. In total, 450 cultures of S. aureus were isolated from different samples: 234 isolates from pus, 164 from blood, 15 from respiratory fluid samples, 33 from urine samples, and 4 from ear and nasal swabs. These strains were screened with the characteristic coagulase assay and were found to include both coagulase-positive and coagulase-negative isolates. Among the 450 Staphylococci isolates, 185 (41.11%) strains were coagulase positive and 265 (58.88%) were coagulase negative. A total of 142 (76.75%) of the coagulase-positive staphylococci strains showed resistance to methicillin and 202 (76.22%) coagulase-negative strains showed methicillin resistance. Methicillin resistance was consistent when coagulase-positive strains were tested against other antibiotics, but about 12.5% of coagulase-negative strains showed sensitivity to other antibiotics even though they were resistant to methicillin. It was determined that, on average, 85 (18.88%) Staphylococci strains were resistant.

  12. Strain Analysis in the Assessment of a Mouse Model of Cardiotoxicity due to Chemotherapy: Sample for Preclinical Research.

    Science.gov (United States)

    Rea, Domenica; Coppola, Carmela; Barbieri, Antonio; Monti, Maria Gaia; Misso, Gabriella; Palma, Giuseppe; Bimonte, Sabrina; Zarone, Mayra Rachele; Luciano, Antonio; Liccardo, Davide; Maiolino, Piera; Cittadini, Antonio; Ciliberto, Gennaro; Arra, Claudio; Maurea, Nicola

    2016-01-01

    In recent years, the development of more effective anticancer drugs has provided great benefits in patients' quality of life by improving both prognosis and disease-free survival. Nevertheless, the frequency and severity of side-effects, with particular reference to cardiac toxicity, have gained particular attention. The purpose of this study was to create a precise and sensitive preclinical model, able to identify early contractile dysfunction in mice treated with chemotherapy, through use of speckle-tracking echocardiography. We generated a mouse model of cardiotoxicity induced by doxorubicin. C57BL/6 mice were divided into two groups, treated for 7 days by intraperitoneal injections of placebo (vehicle) or doxorubicin (2.17 mg/kg), in order to characterize the cardiac phenotype in vivo. We demonstrated that doxorubicin caused early remodeling of the left ventricle: after two days of therapy, the radial, circumferential and strain rates were reduced respectively by 35%, 34%, and 39% (p-value ≤0.001). Moreover, histological analysis revealed that doxorubicin treatment increased fibrosis, cardiomyocyte diameter and apoptosis. In a murine model of doxorubicin-induced cardiac injury, we detected left ventricular dysfunction followed by alterations in conventional echocardiographic indices. Our study suggests that a change in strain could be an effective early marker of myocardial dysfunction for new anticancer treatments and, in preclinical studies, it might also be a valuable indicator for the assessment of activity of cardioprotective agents. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  13. Auxiliary variables in multiple imputation in regression with missing X: a warning against including too many in small sample research

    Directory of Open Access Journals (Sweden)

    Hardt Jochen

    2012-12-01

    Full Text Available Abstract Background Multiple imputation is becoming increasingly popular. Theoretical considerations as well as simulation studies have shown that the inclusion of auxiliary variables is generally of benefit. Methods A simulation study of a linear regression with a response Y and two predictors X1 and X2 was performed on data with n = 50, 100 and 200 using complete cases or multiple imputation with 0, 10, 20, 40 and 80 auxiliary variables. Mechanisms of missingness were either 100% MCAR or 50% MAR + 50% MCAR. Auxiliary variables had low (r = .10) vs. moderate (r = .50) correlations with the X's and Y. Results The inclusion of auxiliary variables can improve a multiple imputation model. However, inclusion of too many variables leads to downward bias of regression coefficients and decreases precision. When the correlations are low, inclusion of auxiliary variables is not useful. Conclusion More research on auxiliary variables in multiple imputation should be performed. A preliminary rule of thumb could be that the ratio of variables to cases with complete data should not go below 1:3.
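
    A hedged scikit-learn sketch of the kind of setup simulated: impute a partially missing predictor with and without an auxiliary column, then fit the regression on the completed data. The variable names and data are invented, and the imputer used here is not the software from the original simulations.

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        n = 100
        x1 = rng.normal(size=n)
        x2 = rng.normal(size=n)
        aux = 0.5 * x1 + rng.normal(scale=1.0, size=n)   # auxiliary variable correlated with x1
        y = 1.0 + 0.8 * x1 + 0.5 * x2 + rng.normal(scale=1.0, size=n)

        x1_missing = x1.copy()
        x1_missing[rng.random(n) < 0.5] = np.nan          # roughly half of x1 missing (MCAR here)

        def fit_with(data_cols):
            """Impute the stacked columns, then regress y on the completed x1 and x2."""
            data = np.column_stack(data_cols)
            completed = IterativeImputer(random_state=0).fit_transform(data)
            model = LinearRegression().fit(completed[:, :2], y)
            return model.coef_

        print("without auxiliary:", fit_with([x1_missing, x2]))
        print("with auxiliary   :", fit_with([x1_missing, x2, aux]))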

  14. A Research on the Responsibility of Accounting Professionals to Determine and Prevent Accounting Errors and Frauds: Edirne Sample

    Directory of Open Access Journals (Sweden)

    Semanur Adalı

    2017-09-01

    Full Text Available In this study, the ethical dimensions of accounting professionals related to accounting errors and frauds were examined. Firstly, general and technical information about accounting was provided; then, terminology on error, fraud and ethics in accounting was discussed. The study also included recent statistics about accounting errors and fraud as well as a literature review. As the research methodology, a questionnaire was distributed to 36 accounting professionals residing in the Edirne province of Turkey, and the collected data were entered into the SPSS package program for analysis. The study revealed several important results. Accounting professionals think that accounting chambers do not organize enough seminars/conferences on errors and fraud. They also believe that the supervision and disciplinary boards of professional accounting chambers fulfill their responsibilities only partially. The attitude of professional accounting chambers towards errors, fraud and ethics is considered neither strict nor lenient, yet most accounting professionals are aware of colleagues who have received disciplinary penalties. The most important and effective tool to prevent errors and fraud is indicated as external audit, but internal audit and internal control are valued as well. According to accounting professionals, most errors occur because of incorrect data received from clients and mistakes made during recording. Fraud is generally committed in order to obtain credit from banks and to benefit the organization by concealing the real situation of the firm. Finally, accounting professionals state that being honest, trustworthy and impartial is the basis of the accounting profession and that accountants must adhere to ethical rules.

  15. Synchrotron radiation. 4. Analyses of biological samples using synchrotron radiation. 3. Research on radiation damage to DNA using synchrotron radiation

    International Nuclear Information System (INIS)

    Takakura, Kaoru

    1998-01-01

    This review describes how synchrotron radiation (SR) is used to solve problems hitherto unresolved in radiation biology. Historically, the target substance of UV light in bacterial death was suggested to be nucleic acid in 1930. Research on radiation damage to DNA began around 1960 and has mainly used UV light, X-rays and γ-rays. Soft X-rays and vacuum UV, whose energies range from several eV to tens of keV, had not been used, since conventional UV and X-ray sources lack energy in this range; this is one of the reasons why the detailed processes leading to radiation-induced death, carcinogenicity and mutation have remained unknown. SR provides a wide, high-intensity energy range, from UV to hard X-rays, which is helpful for studying these unknown problems. SR studies began in the 1970s and include action spectrum studies and atomic target studies. In the former, the course of the effect, e.g., the mechanism of DNA double-strand breakage, can be elucidated. In the latter, photons of known energy can be directed at a specified atom, such as phosphorus in DNA, elucidating the precise physicochemical process of the breakage. The use of SR in these studies is considered to remain meaningful in the future. (K.H.) 62 refs.

  16. Research

    African Journals Online (AJOL)

    ebutamanya

    2015-03-02

    Mar 2, 2015 ... Joseph Daniels, Ruth Nduati, James Kiarie, Carey Farquhar .... or basic science research career (Socio-Behavioral Research, .... a research environment that supports knowledge sharing to develop research ...

  17. Sludge characterization: the role of physical consistency

    Energy Technology Data Exchange (ETDEWEB)

    Spinosa, Ludovico; Wichmann, Knut

    2003-07-01

    The physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated to fulfil regulatory requirements. Further, many analytical methods for sludge specify different procedures depending on whether a sample is liquid or not, or solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed with sludges, so the development of analytical procedures to define the boundary between liquid and paste-like behaviour (flowability) and between solid and paste-like behaviour (solidity) is of growing interest. Several devices can be used for evaluating flowability and solidity, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity measurements. (author)

  18. Technical Proposal for Loading 3000 Gallon Crude Oil Samples from Field Terminal to Sandia Pressurized Tanker to Support US DOE/DOT Crude Oil Characterization Research Study

    Energy Technology Data Exchange (ETDEWEB)

    Lord, David; Allen, Raymond

    2016-10-01

    Sandia National Laboratories is seeking access to crude oil samples for a research project evaluating crude oil combustion properties in large-scale tests at Sandia National Laboratories in Albuquerque, NM. Samples must be collected from a source location and transported to Albuquerque in a tanker that complies with all applicable regulations for transportation of crude oil over public roadways. Moreover, the samples must not gain or lose any components, to include dissolved gases, from the point of loading through the time of combustion at the Sandia testing facility. In order to achieve this, Sandia designed and is currently procuring a custom tanker that utilizes water displacement in order to achieve these performance requirements. The water displacement procedure is modeled after the GPA 2174 standard “Obtaining Liquid Hydrocarbons Samples for Analysis by Gas Chromatography” (GPA 2014) that is used routinely by crude oil analytical laboratories for capturing and testing condensates and “live” crude oils, though it is practiced at the liter scale in most applications. The Sandia testing requires 3,000 gallons of crude. As such, the water displacement method will be upscaled and implemented in a custom tanker. This report describes the loading process for acquiring a ~3,000 gallon crude oil sample from commercial process piping containing single phase liquid crude oil at nominally 50-100 psig. This document contains a general description of the process (Section 2), detailed loading procedure (Section 3) and associated oil testing protocols (Section 4).

  19. Implications and applications of systematic reviews for evidence-based dentistry and comparative effectiveness research: A sample study on antibiotics for oro-facial cellulitis treatment

    Directory of Open Access Journals (Sweden)

    Quyen Bach

    2015-01-01

    Full Text Available Introduction: Comparative effectiveness and efficacy research for analysis and practice (CEERAP) was performed to assess the effects of penicillin-based versus erythromycin-based antibiotic treatments in patients with skin and soft tissue infections (SSTIs), including cellulitis, impetigo, and erysipelas. Because SSTIs, especially orofacial cellulitis, are volatile infectious diseases of a life-threatening nature, research on the most efficacious remedies is necessary. Methods: The stringent bibliome yielded three systematic reviews, which were examined for quality of research synthesis protocol and clinical relevance. Results: The sample size of three rendered the statistical analyses and cumulative meta-analysis problematic. Conclusion: The systematic review outlined here should aid in increasing clinical awareness, improving patient health literacy, and promoting consensus on the best evidence base (BEB) to mitigate the threat of sepsis and potential death caused by cellulitis infections.

  20. Simple and rapid determination methods for low-level radioactive wastes generated from nuclear research facilities. Guidelines for determination of radioactive waste samples

    International Nuclear Information System (INIS)

    Kameo, Yutaka; Shimada, Asako; Ishimori, Ken-ichiro; Haraga, Tomoko; Katayama, Atsushi; Nakashima, Mikio; Hoshi, Akiko

    2009-10-01

    Analytical methods were developed for simple and rapid determination of U, Th, and several nuclides, which are selected as important nuclides for safety assessment of disposal of wastes generated from research facilities at Nuclear Science Research Institute and Oarai Research and Development Center. The present analytical methods were assumed to apply to solidified products made from miscellaneous wastes by plasma melting in the Advanced Volume Reduction Facilities. In order to establish a system to analyze the important nuclides in the solidified products at low cost and routinely, we have advanced the development of a high-efficiency non-destructive measurement technique for γ-ray emitting nuclides, simple and rapid methods for pretreatment of solidified product samples and subsequent radiochemical separations, and rapid determination methods for long-lived nuclides. In the present paper, we summarized the methods developed as guidelines for determination of radionuclides in the low-level solidified products. (author)

  1. Context matters: volunteer bias, small sample size, and the value of comparison groups in the assessment of research-based undergraduate introductory biology lab courses.

    Science.gov (United States)

    Brownell, Sara E; Kloser, Matthew J; Fukami, Tadashi; Shavelson, Richard J

    2013-01-01

    The shift from cookbook to authentic research-based lab courses in undergraduate biology necessitates the evaluation and assessment of these novel courses. Although the biology education community has made progress in this area, it is important that we interpret the effectiveness of these courses with caution and remain mindful of inherent limitations to our study designs that may impact internal and external validity. The specific context of a research study can have a dramatic impact on the conclusions. We present a case study of our own three-year investigation of the impact of a research-based introductory lab course, highlighting how volunteer students, a lack of a comparison group, and small sample sizes can be limitations of a study design that can affect the interpretation of the effectiveness of a course.

  2. Context Matters: Volunteer Bias, Small Sample Size, and the Value of Comparison Groups in the Assessment of Research-Based Undergraduate Introductory Biology Lab Courses

    Directory of Open Access Journals (Sweden)

    Sara E. Brownell

    2013-08-01

    Full Text Available The shift from cookbook to authentic research-based lab courses in undergraduate biology necessitates the evaluation and assessment of these novel courses. Although the biology education community has made progress in this area, it is important that we interpret the effectiveness of these courses with caution and remain mindful of inherent limitations to our study designs that may impact internal and external validity. The specific context of a research study can have a dramatic impact on the conclusions. We present a case study of our own three-year investigation of the impact of a research-based introductory lab course, highlighting how volunteer students, a lack of a comparison group, and small sample sizes can be limitations of a study design that can affect the interpretation of the effectiveness of a course.

  3. Research

    African Journals Online (AJOL)

    raoul

    2011-12-14

    Dec 14, 2011 ... Central African Field Epidemiology and Laboratory Training Program: building and .... consisting of the core public health courses in epidemiology, ... incentives, opportunities for professional and personal growth are some of ...

  4. Research

    African Journals Online (AJOL)

    A descriptive qualitative research design was used to determine whether participants ... simulation as a teaching method; a manikin offering effective learning; confidence ..... Tesch R. Qualitative Research: Analysis Types and Software Tools.

  5. Consistency of canonical formulation of Horava gravity

    International Nuclear Information System (INIS)

    Soo, Chopin

    2011-01-01

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  6. Consistency of canonical formulation of Horava gravity

    Energy Technology Data Exchange (ETDEWEB)

    Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)

    2011-09-22

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  7. Research

    African Journals Online (AJOL)

    research process, as part of which students must find and appraise evidence from research.[5] This highlights that teaching research methodology is inclined towards equipping students ... Students believed that evidence-based practice was vital, yet their understanding of the concept was restricted when compared with the.

  8. Iowa Geologic Sampling Points

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Point locations of geologic samples/files in the IGS repository. Types of samples include well cuttings, outcrop samples, cores, drillers logs, measured sections,...

  9. University of TX Bureau of Economic Geology's Core Research Centers: The Time is Right for Registering Physical Samples and Assigning IGSN's - Workflows, Stumbling Blocks, and Successes.

    Science.gov (United States)

    Averett, A.; DeJarnett, B. B.

    2016-12-01

    The University of Texas Bureau of Economic Geology (BEG) serves as the geological survey for Texas and operates three geological sample repositories that house well over 2 million boxes of geological samples (cores and cuttings) and an abundant amount of geoscience data (geophysical logs, thin sections, geochemical analyses, etc.). Material is accessible and searchable online, and it is publicly available to the geological community for research and education. Patrons access information about our collection by using our online core and log database (SQL format). BEG is currently undertaking a large project to: 1) improve the internal accuracy of metadata associated with the collection; 2) enhance the capabilities of the database for both BEG curators and researchers as well as our external patrons; and 3) ensure easy and efficient navigation for patrons through our online portal. As BEG undertakes this project, BEG is in the early stages of planning to export the metadata for its collection into SESAR (System for Earth Sample Registration) and have IGSN's (International GeoSample Numbers) assigned to its samples. Education regarding the value of IGSN's and an external registry (SESAR) has been crucial to receiving management support for the project because the concept and potential benefits of registering samples in a registry outside of the institution were not well-known prior to this project. Potential benefits such as increases in discoverability, repository recognition in publications, and interoperability were presented. The project was well-received by management, and BEG fully supports the effort to register our physical samples with SESAR. Since BEG is only in the initial phase of this project, any stumbling blocks, workflow issues, successes/failures, etc. can only be predicted at this point, but by mid-December, BEG expects to have several concrete issues to present in the session. Currently, our most pressing issue involves establishing the most

  10. AutoGNI, the Robot Under the Aircraft Floor: An Automated System for Sampling Giant Aerosol Particles by Impaction in the Free Airstream Outside a Research Aircraft

    Science.gov (United States)

    Jensen, J. B.; Schwenz, K.; Aquino, J.; Carnes, J.; Webster, C.; Munnerlyn, J.; Wissman, T.; Lugger, T.

    2017-12-01

    Giant sea-salt aerosol particles, also called Giant Cloud Condensation Nuclei (GCCN), have been proposed as a means of rapidly forming precipitation sized drizzle drops in warm marine clouds (e.g., Jensen and Nugent, 2017). Such rare particles are best sampled from aircraft in air below cloud base, where normal laser optical instruments have too low sample volume to give statistically significant samples of the large particle tail. An automated sampling system (the AutoGNI) has been built to operate from inside a pressurized aircraft. Under the aircraft floor, a pressurized vessel contains 32 custom-built polycarbonate microscope slides. Using robotics with 5 motor drives and 18 positioning switches, the AutoGNI can take slides from their holding cassettes, pass them onto a caddy in an airfoil that extends 200 mm outside the aircraft, where they are exposed in the free airstream, thus avoiding the usual problems with large particle losses in air intakes. Slides are typically exposed for 10-30 s in the marine boundary layer, giving sample volumes of about 100-300 L or more. Subsequently the slides are retracted into the pressure vessel, stored and transported for laboratory microscope image analysis, in order to derive size-distribution histograms. While the aircraft is flying, the AutoGNI system is remotely controlled from a laptop on the ground, using an encrypted commercial satellite connection to the NSF/NCAR GV research aircraft's main server, and onto the AutoGNI microprocessor. The sampling of such GCCN is becoming increasingly important in order to provide complete input data for model calculations of aerosol-cloud interactions and their feedbacks in climate prediction. The AutoGNI has so far been sampling sea-salt GCCN in the Magellan Strait during the 2016 ORCAS project and over the NW Pacific during the 2017 ARISTO project, both from the NSF/NCAR GV research aircraft. Sea-salt particle sizes of 1.4 - 32 μm dry diameter have been observed.

  11. Using the Perceptron Algorithm to Find Consistent Hypotheses

    OpenAIRE

    Anthony, M.; Shawe-Taylor, J.

    1993-01-01

    The perceptron learning algorithm yields quite naturally an algorithm for finding a linearly separable boolean function consistent with a sample of such a function. Using the idea of a specifying sample, we give a simple proof that this algorithm is not efficient, in general.
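    The record above concerns the perceptron update rule used to find a hypothesis consistent with a linearly separable boolean sample. As a rough illustration of that procedure (not the authors' code; the toy data, epoch limit, and zero initialization below are assumptions), a minimal Python sketch:

```python
import numpy as np

def perceptron_consistent(samples, labels, max_epochs=1000):
    """Search for a halfspace consistent with a linearly separable boolean sample.

    samples: (m, n) array of 0/1 inputs; labels: (m,) array of +1/-1 targets.
    Returns (weights, bias) classifying every example correctly, or None if no
    consistent hypothesis was found within max_epochs passes over the data.
    """
    X = np.asarray(samples, dtype=float)
    y = np.asarray(labels, dtype=float)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:   # misclassified (or on the boundary)
                w += yi * xi                    # standard perceptron update
                b += yi
                mistakes += 1
        if mistakes == 0:                       # consistent with every example
            return w, b
    return None

# Toy example: boolean AND is linearly separable.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [-1, -1, -1, 1]
print(perceptron_consistent(X, y))
```

    On a separable sample the loop eventually halts with a consistent hypothesis; the note cited above concerns how many updates that can take in the worst case.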

  12. El muestreo en investigación cualitativa: principios básicos y algunas controversias Sampling in qualitative research: basic principles and some controversies

    Directory of Open Access Journals (Sweden)

    Carolina Martínez-Salgado

    2012-03-01

    Full Text Available This paper presents the rationale for the choice of participants in qualitative research, in contrast with the principles that govern probability sampling in epidemiological research. For a better understanding of the differences, the concepts of nomothetic and idiographic generalizability, as well as those of transferability and reflexivity, are proposed. The fundamentals of the main types of sampling commonly used in qualitative research, the meaning of the concept of saturation, and some of the questions raised about it are discussed. Finally, some reflections are presented on the controversies that have arisen in recent years over the various paradigmatic perspectives from which qualitative research can be conducted today, its possibilities of combination with epidemiological research, and some implications for the study of health problems.

  13. Research

    African Journals Online (AJOL)

    abp

    2016-06-09

    Jun 9, 2016 ... trachoma by nurse data collectors supervised by ophthalmic supervisors using the WHO simplified clinical .... determining the sample size, we estimated the prevalence of ..... Their argument was that flies may breed on animal.

  14. Research

    African Journals Online (AJOL)

    abp

    2013-02-05

    Feb 5, 2013 ... experimental study (a pre and post-test interventional, one group), aimed at assessing the impact of health education on .... The study was quasi experimental using the ... using simple random sampling using a toss of coin.

  15. Research

    African Journals Online (AJOL)

    abp

    2017-09-22

    Sep 22, 2017 ... Prevalence of bovine brucellosis in slaughtered cattle and barriers to better protection of .... cows, not showing clinical mastitis, into 20ml sterile milk sample bottles. Tests and .... and ultimately treatment [39]. This implies that ...

  16. Research

    African Journals Online (AJOL)

    abp

    2017-10-18

    Oct 18, 2017 ... Key words: Risk sexual behavior, private college, multiple sexual partners, consistent condom use. Received: ... Introduction: Risk sexual practice among students from public universities/colleges is common in Ethiopia. However, little has ... partners is an important indicator of risk sexual behavior. Many.

  17. Research

    African Journals Online (AJOL)

    raoul

    2011-12-15

    Dec 15, 2011 ... This article is published as part of the supplement "Field .... doctors and laboratory scientists) train with veterinarians during ... The other 20% of the program is dedicated to didactics consisting of specialized short courses .... 15. http://www.afenet.net/english/publications/AFENET_Newsletter_Sept_2010.pdf.

  18. Research

    African Journals Online (AJOL)

    2014-05-06

    May 6, 2014 ... facilitate and support articulation between the ECT mid-level worker qualification and the professional B EMC degree. Methods. The researchers used an exploratory, sequential mixed-method design, which is characterised by a qualitative phase of research followed by a quantitative phase. This design is ...

  19. Research

    African Journals Online (AJOL)

    supports medical education and research at institutions in 12 ... (CBE). CapacityPlus, led by IntraHealth International, is the USAID-funded ... acquire public health, clinical, and/or research skills, usually through applied learning in a .... If students were evaluated, indicate the type of student (i.e. medical, dental, nursing, etc.) ...

  20. Research

    African Journals Online (AJOL)

    abp

    2017-01-24

    Jan 24, 2017 ... and the specific rotavirus VP4 (P-types) and VP7 (G-types) determined. Results: The .... Centre for Virus Research (CVR) of the Kenya Medical Research. Institute (KEMRI) ... rotavirus dsRNA was run on 10% polyacrylamide resolving gels using a large format .... What is known about this topic. •. Rotavirus is ...

  1. Consistent assignment of nursing staff to residents in nursing homes: a critical review of conceptual and methodological issues.

    Science.gov (United States)

    Roberts, Tonya; Nolet, Kimberly; Bowers, Barbara

    2015-06-01

    Consistent assignment of nursing staff to residents is promoted by a number of national organizations as a strategy for improving nursing home quality and is included in pay for performance schedules in several states. However, research has shown inconsistent effects of consistent assignment on quality outcomes. In order to advance the state of the science of research on consistent assignment and inform current practice and policy, a literature review was conducted to critique conceptual and methodological understandings of consistent assignment. Twenty original research reports of consistent assignment in nursing homes were found through a variety of search strategies. Consistent assignment was conceptualized and operationalized in multiple ways with little overlap from study to study. There was a lack of established methods to measure consistent assignment. Methodological limitations included a lack of control and statistical analyses of group differences in experimental-level studies, small sample sizes, lack of attention to confounds in multicomponent interventions, and outcomes that were not theoretically linked. Future research should focus on developing a conceptual understanding of consistent assignment focused on definition, measurement, and links to outcomes. To inform current policies, testing consistent assignment should include attention to contexts within and levels at which it is most effective. Published by Oxford University Press on behalf of the Gerontological Society of America 2013.

  2. Consistency of self-reported alcohol consumption on randomized and sequential alcohol purchase tasks

    Directory of Open Access Journals (Sweden)

    Michael Amlung

    2012-07-01

    Full Text Available Behavioral economic demand for addictive substances is commonly assessed via purchase tasks that measure estimated drug consumption at a range of prices. Purchase tasks typically use escalating prices in sequential order, which may influence performance by providing explicit price reference points. This study investigated the consistency of value preferences on two alcohol purchase tasks (APTs) that used either a randomized or sequential price order (price range: free to $30 per drink) in a sample of ninety-one young adult monthly drinkers. Randomization of prices significantly reduced relative response consistency (p < .01), although absolute consistency was high for both versions (>95%). Self-reported alcohol consumption across prices and indices of demand were highly similar across versions, although a few notable exceptions were found. These results suggest generally high consistency and overlapping performance between randomized and sequential price assessment. Implications for the behavioral economics literature and priorities for future research are discussed.
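    As loose context for the demand indices mentioned in this record, the sketch below computes two simple observed indices (intensity and breakpoint) and one crude agreement measure between a sequential and a randomized administration for a single hypothetical respondent. The price ladder, consumption values, and the agreement measure are illustrative assumptions, not the study's actual metrics.

```python
import numpy as np

# Hypothetical consumption (drinks purchased) at each price, for one respondent,
# from a sequential and a randomized version of an alcohol purchase task.
prices     = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 15.0, 30.0])
sequential = np.array([10, 9, 8, 6, 4, 2, 0, 0])
randomized = np.array([10, 9, 7, 6, 4, 1, 0, 0])

def demand_indices(consumption, prices):
    """Two simple observed demand indices: intensity and breakpoint."""
    intensity = consumption[0]                      # consumption when drinks are free
    zero = np.where(consumption == 0)[0]
    break_point = prices[zero[0]] if zero.size else np.inf   # first price with zero purchases
    return intensity, break_point

print(demand_indices(sequential, prices))   # (10, 15.0)
print(demand_indices(randomized, prices))   # (10, 15.0)

# One crude notion of agreement: the share of prices at which the two versions
# yield exactly the same reported consumption.
print(np.mean(sequential == randomized))    # 0.75
```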

  3. Consistency between GRUAN sondes, LBLRTM and IASI

    Directory of Open Access Journals (Sweden)

    X. Calbet

    2017-06-01

    Full Text Available Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Instrument (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.

  4. Consistency of the least weighted squares under heteroscedasticity

    Czech Academy of Sciences Publication Activity Database

    Víšek, Jan Ámos

    2011-01-01

    Roč. 2011, č. 47 (2011), s. 179-206 ISSN 0023-5954 Grant - others:GA UK(CZ) GA402/09/055 Institutional research plan: CEZ:AV0Z10750506 Keywords : Regression * Consistency * The least weighted squares * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-consistency of the least weighted squares under heteroscedasticity.pdf

  5. Rapid instrumental and separation methods for monitoring radionuclides in food and environmental samples. Final report on an IAEA co-ordinated research programme

    International Nuclear Information System (INIS)

    1995-01-01

    The Co-ordinated Research Programme (CRP) on Rapid Instrumental and Separation Methods for Monitoring Radionuclides in Food and Environmental Samples was established by the Agency following a Consultants' Meeting on the same topic, which was held 5-9 September 1988 in Vienna. It was completed in 1992. At various times during its course it encompassed 15 participants from 14 countries. The scope of work and objectives of the CRP were established at the Consultants' Meeting. It was agreed that the CRP should focus on the development of rapid methods for the determination of radionuclides in food and environmental samples during the intermediate and late post-accident phases. The rapid methods developed during the course of the CRP were intended to permit a timely and accurate determination of radionuclides at concentrations at least one order of magnitude below those specified for Derived Intervention Levels (DILs) for food by the WHO/FAO and the IAEA. Research Co-ordination meetings were held in Warsaw, Poland in September 1989 and in Vienna, Austria in 1991. Reports of the meetings are available from the Agency on Request. This document comprises copies of final reports from the participants and selected contributions presented by the participants at the meetings. The contributions were selected on the basis of being able to stand alone, without further explanation. Where there was an overlap in the information presented by a participant at both meetings, the most complete contribution was selected

  6. Rapid instrumental and separation methods for monitoring radionuclides in food and environmental samples. Final report on an IAEA co-ordinated research programme

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The Co-ordinated Research Programme (CRP) on Rapid Instrumental and Separation Methods for Monitoring Radionuclides in Food and Environmental Samples was established by the Agency following a Consultants' Meeting on the same topic, which was held 5-9 September 1988 in Vienna. It was completed in 1992. At various times during its course it encompassed 15 participants from 14 countries. The scope of work and objectives of the CRP were established at the Consultants' Meeting. It was agreed that the CRP should focus on the development of rapid methods for the determination of radionuclides in food and environmental samples during the intermediate and late post-accident phases. The rapid methods developed during the course of the CRP were intended to permit a timely and accurate determination of radionuclides at concentrations at least one order of magnitude below those specified for Derived Intervention Levels (DILs) for food by the WHO/FAO and the IAEA. Research Co-ordination meetings were held in Warsaw, Poland in September 1989 and in Vienna, Austria in 1991. Reports of the meetings are available from the Agency on Request. This document comprises copies of final reports from the participants and selected contributions presented by the participants at the meetings. The contributions were selected on the basis of being able to stand alone, without further explanation. Where there was an overlap in the information presented by a participant at both meetings, the most complete contribution was selected.

  7. Research

    African Journals Online (AJOL)

    abp

    2017-10-25

    Oct 25, 2017 ... stigma and superstition are known to lead to frequent presentation .... The limited documented research on challenges to help-seeking behaviour for cancer ..... to touch your breast [16] that breast self-examination may cause.

  8. Research

    African Journals Online (AJOL)

    ebutamanya

    2015-10-02

    Oct 2, 2015 ... thought to prevent infection, but recent research has proven otherwise. In addition ... One patient had ophthalmalgia and was exposed to. Kaiy for one year and ... migraine, ear infections, tuberculosis, bone fractures, epilepsy,.

  9. Research

    African Journals Online (AJOL)

    abp

    2016-07-12

    Jul 12, 2016 ... multiple risk factors provides support for multiple-behavior interventions as ... consumption) with smoking therefore needs further research. As such this study .... restaurants, in bars, and on a statewide basis. They preferred to.

  10. Research

    African Journals Online (AJOL)

    The mini-clinical-evaluation exercise (mini-CEX) is a way of assessing the clinical ... Ethical approval for this study was obtained from the Medical Health. Research ..... mini-CEX assessment and feedback session, the greater the likelihood of.

  11. Research

    African Journals Online (AJOL)

    abp

    2016-04-14

    Apr 14, 2016 ... Qualitative data, content analysis approach was used. Results: Overall 422 .... Study design: A mixed method cross-sectional design using both quantitative and qualitative research methods as described by. Hanson et al [33] ...

  12. Research

    International Nuclear Information System (INIS)

    1999-01-01

    Subjects covered in this section are: (1) PCAST panel promotes energy research cooperation; (2) Letter issued by ANS urges funding balance in FFTF restart consideration and (3) FESAC panel releases report on priorities and balance

  13. Research

    African Journals Online (AJOL)

    Research. December 2017, Vol. 9, No. 4 AJHPE 171. During curriculum development, teachers ... Ideally, examiners need an educational method to determine ..... A major focus of this study was addressing the human resource gap when.

  14. Personality consistency analysis in cloned quarantine dog candidates

    Directory of Open Access Journals (Sweden)

    Jin Choi

    2017-01-01

    Full Text Available In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups followed by the modified Puppy Aptitude Test, which consists of ten subtests to ascertain the influence of genetic identity. In this test, puppies are exposed to stranger, restraint, prey-like object, noise, startling object, etc. Six cloned and four control puppies participated and the consistency of responses at ages 7–10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991), those of the control group were not (P = 0.0089). Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group, however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned. This study implies that personality consistency could be one of the ways to analyse traits of puppies.

  15. Research

    African Journals Online (AJOL)

    abp

    2017-10-06

    Oct 6, 2017 ... South Africa). Seropositivity for syphilis in turn uses a completely screening by a Rapid-Plasma-Réagin test (syphilis RPR test, Human. Gesellschaft für Biochemicaund Diagnostic amb H, Germany) then the positive samples were passed to the TPHA (Treponema Pallidum. Hemagglutination Assay) and the ...

  16. Research

    African Journals Online (AJOL)

    abp

    2015-01-20

    Jan 20, 2015 ... ... which permits unrestricted use, distribution, and reproduction in any ... unsafe human waste disposal systems, inadequacy and lack of safe ... hence received an interview after providing the stool sample. This makes the response rate to be 100%. .... that prohibits the transmission of intestinal helminths.

  17. Research

    African Journals Online (AJOL)

    abp

    2015-09-09

    Sep 9, 2015 ... stage systematic sampling design from the three provinces. Instrument ... It was based on a question-and-answer information brochure derived from the ... grade 12 education and 11.9% had a post-secondary school education ... to enable South African citizens to receive good quality healthcare at any time ...

  18. Research

    African Journals Online (AJOL)

    abp

    2016-04-15

    Apr 15, 2016 ... patients was recognized as an independent risk factor of the acquisition of A. baumannii infection [5]. Many authors have reported the predominance of Acinetobacter strains in broncho- pulmonary samples [7, 20, 27]. In this study, the main isolation site of these clinical isolates was also broncho-pulmonary ...

  19. Research

    African Journals Online (AJOL)

    abp

    2014-05-08

    May 8, 2014 ... through air by droplet nuclei and the micro-organisms enter the ... infected people (80-90%) will never become ill with TB unless with .... and a systematic sampling method of 1 in 3 names on the list was .... Variable (N=241).

  20. Research

    African Journals Online (AJOL)

    abp

    2015-06-01

    Jun 1, 2015 ... Pre dialysis serum samples collected from the patients were used for albumin analysis. The serum from the patients was analysed for serum albumin levels using the Mindray BS120 chemistry analyser using the bromocresol green method. Results: A total of 60 patients were recruited from the two hospitals.

  1. Research

    African Journals Online (AJOL)

    sciences regarding telehealth, as well as their views on suitable content areas for a telehealth module. A descriptive survey .... The mean square contingency coefficient, the ..... participate, and the small sample size, representing less than 50% of medical staff ... and demonstrations by companies selling telehealth systems.

  2. Research

    African Journals Online (AJOL)

    abp

    2014-05-24

    May 24, 2014 ... A sample was considered positive if it was reactive to both tests kits and negative if ... WHO Clinical and Immunological Stages of the HIV infected. Subjects .... The lack of determination of dietary intake and food security in the ...

  3. Consistency Anchor Formalization and Correctness Proofs

    OpenAIRE

    Correia, Miguel; Bessani, Alysson

    2014-01-01

    This report contains the formal proofs for the techniques for increasing the consistency of cloud storage as presented in "Bessani et al. SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference. June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...

  4. Concentrations of higher dicarboxylic acids C5–C13 in fresh snow samples collected at the High Alpine Research Station Jungfraujoch during CLACE 5 and 6

    Directory of Open Access Journals (Sweden)

    K. Sieg

    2009-03-01

    Full Text Available Samples of freshly fallen snow were collected at the high alpine research station Jungfraujoch (Switzerland) in February and March 2006 and 2007, during the Cloud and Aerosol Characterization Experiments (CLACE 5 and 6). In this study a new technique has been developed and demonstrated for the measurement of organic acids in fresh snow. The melted snow samples were subjected to solid phase extraction and resulting solutions analysed for organic acids by HPLC-MS-TOF using negative electrospray ionization. A series of linear dicarboxylic acids from C5 to C13 and phthalic acid were identified and quantified. In several samples the biogenic acid pinonic acid was also observed. In fresh snow the median concentration of the most abundant acid, adipic acid, was 0.69 μg L−1 in 2006 and 0.70 μg L−1 in 2007. Glutaric acid was the second most abundant dicarboxylic acid found with median values of 0.46 μg L−1 in 2006 and 0.61 μg L−1 in 2007, while the aromatic acid phthalic acid showed a median concentration of 0.34 μg L−1 in 2006 and 0.45 μg L−1 in 2007. The concentrations in the samples from various snowfall events varied significantly, and were found to be dependent on the back trajectory of the air mass arriving at Jungfraujoch. Air masses of marine origin showed the lowest concentrations of acids whereas the highest concentrations were measured when the air mass was strongly influenced by boundary layer air.

  5. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356

  6. Consistency based correlations for tailings consolidation

    Energy Technology Data Exchange (ETDEWEB)

    Azam, S.; Paul, A.C. [Regina Univ., Regina, SK (Canada). Environmental Systems Engineering

    2010-07-01

    The extraction of oil, uranium, metals and mineral resources from the earth generates significant amounts of tailings slurry. The tailings are contained in a disposal area with perimeter dykes constructed from the coarser fraction of the slurry. There are many unique challenges pertaining to the management of the containment facilities for several decades beyond mine closure that are a result of the slow settling rates of the fines and the high standing toxic waters. Many tailings dam failures in different parts of the world have been reported to result in significant contaminant releases causing public concern over the conventional practice of tailings disposal. Therefore, in order to reduce and minimize the environmental footprint, the fluid tailings need to undergo efficient consolidation. This paper presented an investigation into the consolidation behaviour of tailings in conjunction with soil consistency that captured physicochemical interactions. The paper discussed the large strain consolidation behaviour (volume compressibility and hydraulic conductivity) of six fine-grained soil slurries based on published data. The paper provided background information on the study and presented the research methodology. The geotechnical index properties of the selected materials were also presented. The large strain consolidation, volume compressibility correlations, and hydraulic conductivity correlations were provided. It was concluded that the normalized void ratio best described volume compressibility whereas liquidity index best explained the hydraulic conductivity. 17 refs., 3 tabs., 4 figs.

  7. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation

    Science.gov (United States)

    Lindell, Annukka K.

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals' selfie corpora. PMID:28270790

  8. A workforce survey of Australian osteopathy: analysis of a nationally-representative sample of osteopaths from the Osteopathy Research and Innovation Network (ORION) project.

    Science.gov (United States)

    Adams, Jon; Sibbritt, David; Steel, Amie; Peng, Wenbo

    2018-05-10

    Limited information is available regarding the profile and clinical practice characteristics of the osteopathy workforce in Australia. This paper reports such information by analysing data from a nationally-representative sample of Australian osteopaths. Data was obtained from a workforce survey of Australian osteopathy, investigating the characteristics of the practitioner, their practice, clinical management features and perceptions regarding research. The survey questionnaire was distributed to all registered osteopaths across Australia in 2016 as part of the Osteopathy Research and Innovation Network (ORION) project. A total of 992 Australian osteopaths participated in this study, representing a response rate of 49.1%. The average age of the participants was 38.0 years, 58.1% were female, and the majority held a Bachelor or higher degree qualification related to the osteopathy profession. Approximately 80.0% of the osteopaths were practicing in an urban area, with most osteopaths working in multi-practitioner locations, having referral relationships with a range of health care practitioners, managing patients with a number of musculoskeletal disorders, and providing multi-modal treatment options. A total of 3.9 million patients were estimated to consult with osteopaths every year, and approximately 3.0 million hours were spent delivering osteopathy services per year. Further research is required to provide rich, in-depth examination of a range of osteopathy workforce issues, which will help ensure safe, effective patient care to all receiving and providing treatments as part of the broader Australian health system.

  9. Research

    African Journals Online (AJOL)

    abp

    2017-05-18

    May 18, 2017 ... available to populations of developing countries [2-5]. In 2013, in. Western and Central Europe and ..... initiation among the infected persons in the community. Addressing stigma and educating ... Lifespan/Tufts/Brown Center for AIDS Research (P30AI042853). Tables. Table 1: Baseline characteristics of ...

  10. Research

    African Journals Online (AJOL)

    abp

    15 févr. 2016 ... présentent un Indice de Masse Corporel (IMC) normal, les autres femmes sont soit ..... In The health belief model and personal health behavior, edited by MH ... Evaluation of the Osteoporosis Health Belief Scale. Research in.

  11. Research

    African Journals Online (AJOL)

    2017-03-14

    Mar 14, 2017 ... R Ebrahim,1 MSc (Dent); H Julie,2 MPH, MCur, PhD. 1 Extended ... and research is applied to develop and sustain society.[5]. Methods .... service they want, not the service we want to give whether they want it or. Co math. G.

  12. Research

    African Journals Online (AJOL)

    abp

    2017-11-24

    Nov 24, 2017 ... Page number not for citation purposes. 1. Prevalence and determinants of common mental ..... illnesses were smoke cigarette in the last 3 months that make prevalence of tobacco use 38.2%. ..... Okasha A, Karam E.Mental health services and research in the. Arab world. Acta Psychiatrica Scandinavica.

  13. Research

    African Journals Online (AJOL)

    abp

    2014-04-21

    Apr 21, 2014 ... Prospective assessment of the risk of obstructive sleep apnea in ... Faculty of Clinical Sciences, College of Medicine, University of .... University Teaching Hospital Health Research Ethics Committee ... BANG, Berlin questionnaire and the American Society of .... The epidemiology of adult obstructive sleep.

  14. Research

    African Journals Online (AJOL)

    abp

    2016-02-01

    Feb 1, 2016 ... University Hospital, DK-5000 Odense, Denmark, 3Center for Global Health, Institute of Clinical Research, University of Southern Denmark, DK-5000. Odense .... BHP is a Danish-Guinean Demographic Surveillance Site with a study-area .... variables such as age groups, previous military duty, history of.

  15. Research

    African Journals Online (AJOL)

    abp

    2015-06-24

    Jun 24, 2015 ... related immunosuppression, previous history of TB, and pause in treatment [6]. In Brazil, researchers .... treatment, use of traditional medicines or herbs, history of TB drug side effects and treatment delay). ..... therapy for pulmonary tuberculosis in Lima Ciudad, Peru. International journal of tuberculosis and ...

  16. Research

    African Journals Online (AJOL)

    Research. May 2016, Vol. 8, No. 1 AJHPE 37. Students who enrol in occupational therapy (OT) at the. University of Kwa Zulu-Natal (UKZN), Durban, South Africa ... The latter may include becoming familiar with the disintegrating social systems in primary .... They also lacked the skills needed to adapt sessions and failed to ...

  17. Research

    African Journals Online (AJOL)

    ebutamanya

    2015-06-22

    Jun 22, 2015 ... collaboration with Makerere University, School of Public Health. We acknowledge The Family Health Research and Development Centre. (FHRDC) Uganda. Supported by Bill & Melinda Gates Institute for. Population & Reproductive Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, ...

  18. Research

    African Journals Online (AJOL)

    However, a focus on competence alone is inadequate to produce graduates who are capable of adapting to the changing needs of health systems. While knowledge and technical ... shared their responses to guided questions. There were three sessions; after each session the researcher aggregated participant responses ...

  19. Research

    African Journals Online (AJOL)

    abp

    2014-01-31

    Jan 31, 2014 ... by Hazarika in a population-based study in India. The researcher noted that patients' preference to the private health facilities was due mainly to their dissatisfaction with the services in the public health facilities [11]. Furthermore, the quality of the services in the private health facilities could also be a major ...

  20. Research

    African Journals Online (AJOL)

    2018-03-20

    Mar 20, 2018 ... student health professionals in various institutions, both in South Africa. (SA) and internationally. ... field include dentists, dental therapists and oral hygienists in training, .... The College of Health Sciences at UKZN has four schools: clinical ..... Journal of Emerging Trends in Educational Research and Policy ...

  1. Research

    African Journals Online (AJOL)

    abp

    2017-09-14

    Sep 14, 2017 ... Abstract. Introduction: Medical and dental students are a high-risk group for hepatitis B virus (HBV) infection which is an ... The Pan African Medical Journal - ISSN 1937-8688. ... Research ... in the College of Health Sciences and clinical students (years four to .... Hepatology International.2017 Jan; 11(1):.

  2. Research

    African Journals Online (AJOL)

    abp

    2015-01-19

    Jan 19, 2015 ... One research assistant was available to assist the learners and to answer questions while they completed the questionnaires during a classroom period. ..... PubMed | Google Scholar. 4. Hall PA, Holmqvist M, Sherry SB. Risky adolescent sexual behaviour: A psychological perspective for primary care.

  3. Applied research and development of neutron activation analysis - The study on human health and environment by neutron activation analysis of biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Seung Yeon; Yoo, Jong Ik; Lee, Jae Kwang; Lee, Sung Jun; Lee, Sang Sun; Jeon, Ki Hong; Na, Kyung Won; Kang, Sang Hun [Yonsei University, Seoul (Korea)

    2000-04-01

    Through the development of a precise quantitative analytical method for trace elements in various biological samples such as hair and food, trace elements from sources that can enter the human body were evaluated from a health and environmental point of view. The trace elemental distribution in the Korean total diet and in representative foodstuffs was identified first. Within the project, the elemental distributions in supplemental health foods and in oriental medicines of Korean and Chinese origin were also identified. The amounts of trace elements ingested were estimated through hair analysis of oriental medicine takers, and the amounts of trace elements inhaled were estimated through analysis of foundry air and of the blood and hair of foundry workers. These results establish a basic method for health and environmental assessment using neutron activation analysis of biological samples such as foods and hair. A nationwide system for using the NAA facility in Hanaro in many different and important areas of biological research can be initiated with these results. The output of the project can support public health, environmental, and medical research. With additional data and the development of further evaluation techniques, the results can be applied to the production of micronutrient-enriched health foods and to the enhancement of health safety and health status. 19 refs., 7 figs., 23 tabs. (Author)

  4. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

    Full Text Available Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested, which consists in treating several equations simultaneously with respect to the same number of variables.
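    To make the scalar variant described above concrete, here is a minimal Python sketch of hull-consistency narrowing for a single linear equation over interval domains; the tiny interval type, the example equation, and the starting box are illustrative assumptions rather than anything taken from the poster.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def scale(self, c):
        a, b = c * self.lo, c * self.hi
        return Interval(min(a, b), max(a, b))
    def intersect(self, other):
        # May produce an empty interval (lo > hi), signalling inconsistency.
        return Interval(max(self.lo, other.lo), min(self.hi, other.hi))

def hull_narrow_linear(coeffs, rhs, box):
    """Scalar hull consistency for one linear equation sum_i coeffs[i] * x_i = rhs.

    Each variable is isolated in turn and its domain is intersected with the
    interval implied by the current domains of the remaining variables.
    """
    for i, ci in enumerate(coeffs):
        if ci == 0:
            continue
        rest = Interval(rhs, rhs)
        for j, cj in enumerate(coeffs):
            if j != i:
                rest = rest - box[j].scale(cj)
        box[i] = box[i].intersect(rest.scale(1.0 / ci))
    return box

# x + y = 10 with x, y initially in [0, 8]: both domains tighten to [2, 8].
print(hull_narrow_linear([1.0, 1.0], 10.0, [Interval(0, 8), Interval(0, 8)]))
```

    The poster's proposal, by contrast, is to perform this kind of narrowing for several equations and the same number of variables simultaneously rather than one equation-variable pair at a time.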

  5. Replica consistency in a Data Grid

    International Nuclear Information System (INIS)

    Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt

    2004-01-01

    A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented

  6. Peroral endoscopic myotomy can improve esophageal motility in patients with achalasia from a large sample self-control research (66 patients).

    Directory of Open Access Journals (Sweden)

    Shuangzhe Yao

    Full Text Available Peroral endoscopic myotomy (POEM) as a new approach to achalasia attracts broad attention. The primary objective of this study was to evaluate esophageal motility after POEM through the first large-sample clinical study. We conducted a self-controlled study of all patients (205 in total) who underwent POEM from 2010 to 2014 at our Digestive Endoscopic Center, 66 of whom underwent high-resolution manometry (HRM) before and after POEM in our motility laboratory. Follow-up lasted 5.6 months on average. Outcome variables analyzed included upper esophageal sphincter pressure (UESP), upper esophageal sphincter residual pressure (UESRP), lower esophageal sphincter pressure (LESP), lower esophageal sphincter residual pressure (LESRP) and esophageal body peristalsis. A statistical analysis was performed to illustrate how POEM affects esophageal motility. Symptoms related to dysphagia were relieved in 95% of patients in the short term after POEM. While HRM showed a statistically significant reduction of UESRP, LESP and LESRP (P0.05 did not occur for these two groups on LESP and LESRP reduction. POEM clearly relieved the symptoms related to dysphagia by lowering the pressure of the upper esophageal sphincter (UES) and lower esophageal sphincter (LES), and other endoscopic treatment before POEM did not affect the improvement of LES pressure. These results are based on our short-term follow-up study, while the long-term efficacy remains to be demonstrated. Chinese Clinical Trial Register ChiCTR-TRC-12002204.

  7. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…
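    The kind of analysis this record describes can be pictured with a small regression on fabricated data, taking a student's 'consistency' to be the standard deviation of weekly hours spent in the course; the data, variable names, and model form below are illustrative assumptions, not the authors' specification.

```python
import numpy as np

# Illustrative (fabricated) per-student data: hours spent in the course each week.
weekly_hours = np.array([
    [3, 3, 3, 3, 3, 3],   # very consistent student
    [6, 0, 6, 0, 6, 0],   # same total effort, highly variable
    [4, 4, 3, 4, 4, 3],
    [8, 1, 1, 8, 1, 1],
])
grades = np.array([85.0, 70.0, 82.0, 68.0])

total_effort = weekly_hours.sum(axis=1)
variation    = weekly_hours.std(axis=1)     # lower value = more consistent effort

# Ordinary least squares: grade ~ intercept + total effort + time variation.
X = np.column_stack([np.ones(len(grades)), total_effort, variation])
coef, *_ = np.linalg.lstsq(X, grades, rcond=None)
print(dict(zip(["intercept", "total_effort", "std_weekly_hours"], coef.round(2))))
```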

  8. Translationally invariant self-consistent field theories

    International Nuclear Information System (INIS)

    Shakin, C.M.; Weiss, M.S.

    1977-01-01

    We present a self-consistent field theory which is translationally invariant. The equations obtained go over to the usual Hartree-Fock equations in the limit of large particle number. In addition to deriving the dynamic equations for the self-consistent amplitudes we discuss the calculation of form factors and various other observables

  9. Sticky continuous processes have consistent price systems

    DEFF Research Database (Denmark)

    Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan

    Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under...

  10. Consistent-handed individuals are more authoritarian.

    Science.gov (United States)

    Lyle, Keith B; Grillo, Michael C

    2014-01-01

    Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.

  11. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    van der Geest, Thea; Loorbach, N.R.

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to

  12. Consistent spectroscopy for a extended gauge model

    International Nuclear Information System (INIS)

    Oliveira Neto, G. de.

    1990-11-01

    A consistent spectroscopy was obtained with a Lagrangian constructed from vector fields with an extended U(1) group symmetry. By consistent spectroscopy is meant the determination of the quantum physical properties described by the model in a manner independent of the particular parametrizations adopted in their description. (L.C.J.A.)

  13. Report of the second research co-ordination meeting on the co-ordinated research programme: rapid instrumental and separation methods for monitoring radionuclides in food and environmental samples

    International Nuclear Information System (INIS)

    1992-10-01

    The purpose of this Second Research Co-ordination Meeting (12-16 August 1991) on Rapid Instrumental and Separation Methods for Monitoring Radionuclides in Food and Environmental Samples is to discuss the progress of the programmes since the First Research Co-ordination Meeting, discuss how to validate the methodologies developed (e.g. reference samples, intercomparisons), and outline a schedule for CRP completion by the end of 1992. Radioactive contamination of the environment after a nuclear accident, such as had occurred at Chernobyl, is of serious concern to government officials and members of the general public. In 1990/1991 the Agency was asked to organize the International Chernobyl Project to assess the situation in the USSR. A network of laboratories was organized to carry out the environmental assessment needed for this project. The following recommendations are based on the experience gained by many of the laboratories involved in this project. 1. Maintain a network of analytical laboratories with special skills and experience to provide assessments of radionuclide contamination in the environment in case of a radiological emergency. 2. Methodologies for assessment of contamination in the environment should take into consideration potential trajectories, radioecology, and food chain parameters. 3. Focus on areas of representative sample collection, in situ instrumental and chemical analysis, as well as advanced streamlined laboratory analyses which will facilitate the timeliness of an assessment. 4. Conduct intercomparison and testing of technologies, employing standard reference materials and procedures, and field measurements at significantly contaminated areas. 5. Conduct training of Member State laboratory personnel through fellowships, special courses, and workshops. 5 refs

  14. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  15. Personality consistency in dogs: a meta-analysis.

    Science.gov (United States)

    Fratkin, Jamie L; Sinn, David L; Patall, Erika A; Gosling, Samuel D

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.
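    For readers unfamiliar with how study-level consistency correlations are pooled into an overall estimate such as r = 0.43, the following sketch shows a standard fixed-effect aggregation via Fisher's z transform on made-up study data; the numbers are invented for illustration and this is not necessarily the exact model the authors used.

```python
import numpy as np

# Hypothetical study-level data: test-retest correlation and sample size.
r = np.array([0.35, 0.50, 0.42, 0.60, 0.30])
n = np.array([40,   25,   60,   30,   80])

# Fixed-effect meta-analysis via Fisher's z transform, weighting each study by
# n - 3 (the inverse of the sampling variance of z).
z = np.arctanh(r)
w = n - 3
z_bar = np.sum(w * z) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))

pooled_r = np.tanh(z_bar)
ci = np.tanh([z_bar - 1.96 * se, z_bar + 1.96 * se])
print(f"pooled r = {pooled_r:.2f}, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
```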

  16. Personality consistency in dogs: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Jamie L Fratkin

    Full Text Available Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.

  17. Personality Consistency in Dogs: A Meta-Analysis

    Science.gov (United States)

    Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787

  18. Investigation of the History Education Researches in Turkey in Terms of Some Variables (Master's Theses and Dissertations Sample)

    Science.gov (United States)

    Elban, Mehmet

    2017-01-01

    The purpose of this research study is to examine the master's theses and dissertations carried out about history education research in Turkey in terms of certain variables. The study is a qualitative research and it used documentary research design as a research method. The population of the research study is the master's theses and dissertations…

  19. Boat sampling

    International Nuclear Information System (INIS)

    Citanovic, M.; Bezlaj, H.

    1994-01-01

    This presentation describes essential boat sampling activities: on site boat sampling process optimization and qualification; boat sampling of base material (beltline region); boat sampling of weld material (weld No. 4); problems associated with weld crown variations, RPV shell inner radius tolerance, local corrosion pitting and water clarity. The equipment used for boat sampling is also described. 7 pictures

  20. Graph sampling

    OpenAIRE

    Zhang, L.-C.; Patone, M.

    2017-01-01

    We synthesise the existing theory of graph sampling. We propose a formal definition of sampling in finite graphs, and provide a classification of potential graph parameters. We develop a general approach of Horvitz–Thompson estimation to T-stage snowball sampling, and present various reformulations of some common network sampling methods in the literature in terms of the outlined graph sampling theory.
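
    The record above names Horvitz–Thompson estimation as the estimation backbone of the outlined graph sampling theory. Purely as an illustrative aside, the sketch below shows the generic Horvitz–Thompson estimator of a population total from a sample with known inclusion probabilities; the sampled values and probabilities are invented and not taken from the paper.

```python
# Minimal sketch of the Horvitz-Thompson estimator referenced in the abstract,
# applied to a hypothetical graph sample. Node values and inclusion
# probabilities below are invented for illustration only.

def horvitz_thompson_total(values, inclusion_probs):
    """Estimate a population total from a sample.

    values          -- observed y_i for each sampled unit (e.g. a node or motif count)
    inclusion_probs -- pi_i, the probability that unit i entered the sample
    """
    return sum(y / pi for y, pi in zip(values, inclusion_probs))

if __name__ == "__main__":
    sampled_values = [3.0, 1.0, 4.0, 2.0]   # y_i observed on sampled nodes (hypothetical)
    sampled_probs = [0.5, 0.2, 0.4, 0.25]   # pi_i implied by the sampling design (hypothetical)
    print(horvitz_thompson_total(sampled_values, sampled_probs))
```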

  1. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    Science.gov (United States)

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

    Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some but inconclusive evidence of effect modification by a genomic marker. Two-stage designs allow early stopping for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker positive and marker negative subgroups and the prevalence of marker positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios, where at least some effect is present in the marker negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size, as compared with the group sequential design, when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.
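
    For orientation only, the sketch below computes the textbook per-group sample size for the fixed parallel-group baseline that such comparisons start from (two-arm trial, normal approximation). The effect size and variance are illustrative numbers, not values from the study, and the paper's group sequential and adaptive selection calculations are not reproduced here.

```python
# Back-of-the-envelope per-group sample size for a fixed two-arm parallel design:
# n = 2 * ((z_{1-alpha/2} + z_{1-beta}) * sigma / delta)^2.
# delta and sigma below are illustrative assumptions, not taken from the paper.
import math
from scipy.stats import norm

def per_group_n(delta, sigma, alpha=0.05, power=0.8):
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return 2 * ((z_alpha + z_beta) * sigma / delta) ** 2

print(math.ceil(per_group_n(delta=0.3, sigma=1.0)))   # about 175 patients per arm
```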

  2. k{sub 0}-INAA method at the pneumatic station of the IEA-R1 nuclear research reactor. Application to geological samples

    Energy Technology Data Exchange (ETDEWEB)

    Mariano, Davi B.; Figueiredo, Ana Maria G.; Semmler, Renato, E-mail: davimariano@usp.br, E-mail: anamaria@ipen.br, E-mail: rsemmler@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    There is a significant number of analytically important elements, when geological samples are concerned, whose activation products are short-lived (seconds to minutes) or medium-lived radioisotopes (minutes to hours). As part of the process of implementation of the k{sub 0}-INAA standardization method at the Neutron Activation Laboratory (LAN-IPEN), Sao Paulo, Brazil, this study presents the results obtained for the analysis of short and medium-lived nuclides in geological samples by k{sub 0}-INAA using the program k{sub 0}-IAEA, provided by the International Atomic Energy Agency (IAEA). The elements Al, Dy, Eu, Na, K, Mn, Mg, Sr, V and Ti were determined with respect to gold ({sup 197}Au) using the pneumatic station facility of the IEA-R1 5 MW swimming pool nuclear research reactor, Sao Paulo. Characterization of the pneumatic station was carried out by using the 'bare triple-monitor' method with {sup 197}Au-{sup 96}Zr-{sup 94}Zr. The Certified Reference Material IRMM-530R Al-0.1% Au alloy, high purity zirconium, Ni and Lu comparators were irradiated. The efficiency curves of the gamma-ray spectrometer used were determined by measuring calibrated radioactive sources at the usually utilized counting geometries. The method was validated by analyzing the reference materials basalt BE-N (IWG-GIT), basalt JB-1 (GSJ), andesite AGV-1 (USGS), granite GS-N (IWG-GIT), SOIL-7 (IAEA) and sediment Buffalo River Sediment (NIST-BRS-8704), which represent different geological matrices. The concentration results obtained agreed with certified, reference and recommended values, showing relative errors less than 10% for most elements. (author)

  3. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how ... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  4. Consistent histories and operational quantum theory

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

    In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail

  5. Self-consistent areas law in QCD

    International Nuclear Information System (INIS)

    Makeenko, Yu.M.; Migdal, A.A.

    1980-01-01

    The problem of obtaining the self-consistent areas law in quantum chromodynamics (QCD) is considered from the point of view of the quark confinement. The exact equation for the loop average in multicolor QCD is reduced to a bootstrap form. Its iterations yield new manifestly gauge invariant perturbation theory in the loop space, reproducing asymptotic freedom. For large loops, the areas law appears to be a self-consistent solution

  6. Termo de consentimento e análise de material biológico armazenado Consent procedures and research on stored biological samples

    Directory of Open Access Journals (Sweden)

    Cristiano Guedes Duque

    2010-01-01

    Full Text Available OBJECTIVE: To report an experience in obtaining free and informed consent forms (TCLE) for a retrospective study carried out at the Instituto Nacional de Câncer (INCA). The study involved the review of medical records and the analysis of paraffin blocks from patients operated on for colon cancer between 2000 and 2004. In compliance with resolution 196/96 of the National Health Council and the ruling of the INCA Research Ethics Committee (CEP), informed consent was sought. METHODS: During scheduled appointments, the form could be applied to only four patients over three months. Two copies of the consent form were then sent by mail, together with a summary and a stamped envelope for return to the researchers. Telephone contact was attempted before mailing. RESULTS: Of the 155 consent forms sent, 115 were returned (74%). Among the responses received, 111 patients consented to participate in the study, there was one refusal, and three patients were reported to have died. The time between mailing and receipt of the response ranged from 2 to 89 days (median: 10 days). Telephone contact was successful with 60 of the 160 patients (37.5%). For those who had died and for those who did not return the consent form, the CEP approved a waiver. The final cost of mailing the envelopes was R$1,004.40. CONCLUSION: Seeking telephone and postal contact with patients to obtain informed consent for a retrospective clinical study is feasible. Most patients responded to the contact and consented to participate. There are, however, associated costs and risks that cannot be disregarded.

  7. Temperature, salinity and other variables collected from discrete sample and profile observations using CTD, bottle and other instruments from the OCEAN RESEARCHER I in the Philippine Sea from 1991-06-26 to 1991-07-04 (NODC Accession 0115598)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NODC Accession 0115598 includes chemical, discrete sample, physical and profile data collected from OCEAN RESEARCHER I in the Philippine Sea from 1991-06-26 to...

  8. Temperature, salinity and other variables collected from discrete sample and profile observations using CTD, bottle and other instruments from OCEAN RESEARCHER I in the Philippine Sea from 1990-10-11 to 1990-10-15 (NODC Accession 0115600)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NCEI Accession 0115600 includes chemical, discrete sample, physical and profile data collected from OCEAN RESEARCHER I in the Philippine Sea from 1990-10-11 to...

  9. Personality and Situation Predictors of Consistent Eating Patterns.

    Science.gov (United States)

    Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K

    2015-01-01

    A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  10. Personality and Situation Predictors of Consistent Eating Patterns.

    Directory of Open Access Journals (Sweden)

    Uku Vainik

    Full Text Available A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  11. Accounting for the sedative and analgesic effects of medication changes during patient participation in clinical research studies: measurement development and application to a sample of institutionalized geriatric patients.

    Science.gov (United States)

    Sloane, Philip; Ivey, Jena; Roth, Mary; Roederer, Mary; Williams, Christianna S

    2008-03-01

    To date, no system has been published that allows investigators to adjust for the overall sedative and/or analgesic effects of medications, or changes in medications, in clinical trial participants for whom medication use cannot be controlled. This is common in clinical trials of behavioral and complementary/alternative therapies, and in research involving elderly or chronically ill patients for whom ongoing medical care continues during the trial. This paper describes the development, and illustrates the use, of a method we developed to address this issue, in which we generate single continuous variables to represent the daily sedative and analgesic loads of multiple medications. Medications for 90 study participants in a clinical trial of a nonpharmacological intervention were abstracted from medication administration records across multiple treatment periods. An expert panel of three academic clinical pharmacists and a geriatrician met to develop a system by which each study medication could be assigned a sedative and analgesic effect rating. The two measures, when applied to data on 90 institutionalized persons with Alzheimer's disease, resulted in variables with moderately skewed distributions that are consistent with the clinical profile of analgesia and sedation use in long-term care populations. The average study participant received 1.89 analgesic medications per day and had a daily analgesic load of 2.96; the corresponding figures for sedation were 2.07 daily medications and an average daily load of 11.41. A system of classifying the sedative and analgesic effects of non-study medications was created that divides drugs into categories based on the strength of their effects and assigns a rating to express overall sedative and analgesic effects. These variables may be useful in comparing patients and populations, and to control for drug effects in future studies.
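
    The core of the measure described above is a per-day "load" obtained by summing expert-assigned effect ratings over the medications a participant received that day. The sketch below illustrates that kind of aggregation; the drug names and ratings are hypothetical placeholders, not the panel's actual classification.

```python
# Minimal sketch of a daily sedative (or analgesic) load: sum of expert-assigned
# effect ratings over all medications administered to a participant on a given day.
# Drug names and ratings are hypothetical, not the study's values.
from collections import defaultdict

sedative_rating = {"drugA": 3, "drugB": 1, "drugC": 0}   # hypothetical expert ratings

administrations = [                                       # (participant, day, drug)
    ("p01", "2007-05-01", "drugA"),
    ("p01", "2007-05-01", "drugB"),
    ("p02", "2007-05-01", "drugC"),
]

daily_load = defaultdict(int)
for participant, day, drug in administrations:
    daily_load[(participant, day)] += sedative_rating.get(drug, 0)

print(dict(daily_load))   # {('p01', '2007-05-01'): 4, ('p02', '2007-05-01'): 0}
```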

  12. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling
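
    As a rough illustration of the balancing idea only: a sample is "balanced" on a covariate when its sample mean (approximately) matches the known population mean. The naive rejection scheme below conveys this; real balanced sampling designs such as the cube method achieve balance far more efficiently, and the covariate values here are synthetic.

```python
# Naive illustration of balanced sampling: keep a simple random sample whose
# covariate mean nearly matches the known population mean. Synthetic data;
# practical designs (e.g. the cube method) are much more efficient.
import numpy as np

rng = np.random.default_rng(42)
covariate = rng.uniform(0, 100, size=500)      # known covariate at every candidate location
pop_mean = covariate.mean()
n, tol = 25, 0.5

for attempt in range(100_000):
    sample_idx = rng.choice(covariate.size, size=n, replace=False)
    if abs(covariate[sample_idx].mean() - pop_mean) < tol:
        break

print(attempt, covariate[sample_idx].mean(), pop_mean)
```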

  13. Towards thermodynamical consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasi-particle-based description of the quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics.

  14. Toward thermodynamic consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Toneev, V.D.; Shanenko, A.A.

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics

  15. Toward a consistent RHA-RPA

    International Nuclear Information System (INIS)

    Shepard, J.R.

    1991-01-01

    The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data

  16. Personalized recommendation based on unbiased consistence

    Science.gov (United States)

    Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao

    2015-08-01

    Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite networks provide an efficient solution by automatically pushing possible relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms focus only on unidirectional mass diffusion from objects already collected to those to be recommended, resulting in a biased causal similarity estimation and not-so-good performance. In this letter, we argue that in many cases, a user's interests are stable, and thus bidirectional mass diffusion abilities, whether originating from objects already collected or from those to be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
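
    For readers unfamiliar with the baseline the letter builds on, the sketch below implements the standard unidirectional mass-diffusion (probabilistic spreading) scoring on a toy user-item matrix; the paper's bidirectional, consistence-based variant is not reproduced here, and the matrix is invented for illustration.

```python
# Standard (unidirectional) mass-diffusion / probabilistic-spreading baseline on a
# toy bipartite user-item matrix. The paper's contribution is a bidirectional,
# consistence-based variant not shown here.
import numpy as np

A = np.array([[1, 1, 0, 0],      # rows: users, cols: items; A[u, i] = 1 if u collected i
              [0, 1, 1, 0],
              [1, 0, 1, 1]], dtype=float)

item_degree = A.sum(axis=0)       # k(o_i)
user_degree = A.sum(axis=1)       # k(u_l)

def mass_diffusion_scores(A, u):
    """Diffuse unit resource from user u's collected items to users and back to items."""
    resource = A[u] / np.where(item_degree > 0, item_degree, 1)   # each item splits its resource among its users
    to_users = A @ resource                                       # resource gathered by each user
    to_users /= np.where(user_degree > 0, user_degree, 1)         # each user splits it among their items
    scores = A.T @ to_users                                       # resource landing back on items
    scores[A[u] > 0] = 0                                          # do not re-recommend collected items
    return scores

print(mass_diffusion_scores(A, u=0))
```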

  17. Financial model calibration using consistency hints.

    Science.gov (United States)

    Abu-Mostafa, Y S

    2001-01-01

    We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to Japanese Yen swaps market and US dollar yield market.
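
    The sketch below conveys the general shape of the idea, under illustrative assumptions: an ordinary curve-fitting error is augmented with a Kullback-Leibler "hint" penalty that pushes the calibrated parameters toward a distribution consistent with prior knowledge. The toy model, data, hint distribution and parameter-to-distribution mapping are all hypothetical; the paper's actual application is the correlated multifactor Vasicek model.

```python
# Schematic "consistency hint" calibration: fit error + lambda * KL(p_implied || q_hint).
# Model, data and hint below are hypothetical placeholders.
import numpy as np
from scipy.optimize import minimize
from scipy.special import rel_entr            # elementwise p * log(p / q)

x = np.linspace(0.0, 5.0, 20)                  # toy "market" inputs
y_obs = 2.0 * np.exp(-0.7 * x) + 0.05 * np.random.default_rng(0).normal(size=x.size)

def model(params, x):
    a, b = params
    return a * np.exp(-b * x)

hint_q = np.array([0.5, 0.3, 0.2])             # hint: target distribution over 3 buckets

def implied_p(params):
    # Hypothetical mapping from parameters to a bucket distribution.
    a, b = params
    w = np.abs([a, b, a * b]) + 1e-9
    return w / w.sum()

def objective(params, lam=0.5):
    fit = np.mean((model(params, x) - y_obs) ** 2)        # ordinary curve-fitting error
    hint = rel_entr(implied_p(params), hint_q).sum()      # KL(p_implied || q_hint)
    return fit + lam * hint

result = minimize(objective, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
print(result.x)
```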

  18. Ensemble Sampling

    OpenAIRE

    Lu, Xiuyuan; Van Roy, Benjamin

    2017-01-01

    Thompson sampling has emerged as an effective heuristic for a broad range of online decision problems. In its basic form, the algorithm requires computing and sampling from a posterior distribution over models, which is tractable only for simple special cases. This paper develops ensemble sampling, which aims to approximate Thompson sampling while maintaining tractability even in the face of complex models such as neural networks. Ensemble sampling dramatically expands on the range of applica...
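
    As one simple way to convey the idea (not necessarily the paper's exact algorithm), the sketch below approximates Thompson sampling on a toy multi-armed bandit with an ensemble of perturbed models: each round a member is drawn uniformly and the arm greedy with respect to that member is played. Arm means, noise levels and ensemble size are illustrative choices.

```python
# Ensemble-sampling-style bandit sketch: keep M perturbed models, pick one at random
# each round, act greedily with it. All numbers below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
true_means = np.array([0.2, 0.5, 0.65])        # hypothetical arm reward means
K, M, T = len(true_means), 10, 2000            # arms, ensemble members, rounds
noise_sd = 1.0

sums = rng.normal(0.0, 1.0, size=(M, K))       # per-member running sums, randomly perturbed at start
counts = np.ones((M, K))

for t in range(T):
    m = rng.integers(M)                        # pick one ensemble member uniformly
    arm = int(np.argmax(sums[m] / counts[m]))  # act greedily w.r.t. that member
    reward = true_means[arm] + noise_sd * rng.normal()
    # every member records the observation, each with its own perturbation noise
    sums[:, arm] += reward + noise_sd * rng.normal(size=M)
    counts[:, arm] += 1

print("estimated best arm:", int(np.argmax((sums / counts).mean(axis=0))))
```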

  19. Proteolysis and consistency of Meshanger cheese

    NARCIS (Netherlands)

    Jong, de L.

    1978-01-01

    Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of αs1-casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of

  20. Developing consistent pronunciation models for phonemic variants

    CSIR Research Space (South Africa)

    Davel, M

    2006-09-01

    Full Text Available Pronunciation lexicons often contain pronunciation variants. This can create two problems: It can be difficult to define these variants in an internally consistent way and it can also be difficult to extract generalised grapheme-to-phoneme rule sets...

  1. Image recognition and consistency of response

    Science.gov (United States)

    Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.

    2012-02-01

    Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One to three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked of each image whether it had been included in the first set. For this study, we are evaluating only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results: There was no significant correlation between recognition of the image and consistency in response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction, with radiologists being slightly more likely to give a consistent response with respect to images they did not recognize than with respect to those they did recognize. Conclusion: Radiologists' recognition of previously-encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.
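
    The two tests named in the abstract are both available in SciPy; the sketch below shows the corresponding calls on made-up data (not the study's observations), purely to illustrate the kind of comparison described.

```python
# Kruskal-Wallis test and Fisher's exact test on hypothetical data illustrating
# the analysis described in the abstract; the numbers are not the study's data.
from scipy.stats import kruskal, fisher_exact

# Hypothetical consistency indicators (1 = consistent response), grouped by
# whether the radiologist recognized the image.
consistency_recognized = [1, 1, 0, 1, 1, 0, 1]
consistency_not_recognized = [1, 0, 1, 1, 0, 1, 1]
print(kruskal(consistency_recognized, consistency_not_recognized))

# Hypothetical 2x2 table: rows = recognized / not recognized,
# columns = consistent / inconsistent response.
table = [[12, 8],
         [15, 5]]
print(fisher_exact(table))
```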

  2. Consistent Valuation across Curves Using Pricing Kernels

    Directory of Open Access Journals (Sweden)

    Andrea Macrina

    2018-03-01

    Full Text Available The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets). As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.

  3. Guided color consistency optimization for image mosaicking

    Science.gov (United States)

    Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li

    2018-01-01

    This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among images under a unified energy framework; however, the results are prone to presenting a consistent but unnatural appearance when the color difference between images is large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First of all, to obtain the reliable intensity correspondences in overlapping regions between image pairs, we creatively propose the histogram extreme point matching algorithm which is robust to image geometrical misalignment to some extent. In the absence of the extra reference information, the guided initial solution is learned from the major tone of the original images by searching some image subset as the reference, whose color characteristics will be transferred to the others via the paths of graph analysis. Thus, the final results via global adjustment will take on a consistent color similar to the appearance of the reference image subset. Several groups of convincing experiments on both the synthetic dataset and the challenging real ones sufficiently demonstrate that the proposed approach can achieve as good or even better results compared with the state-of-the-art approaches.

  4. Consistent application of codes and standards

    International Nuclear Information System (INIS)

    Scott, M.A.

    1989-01-01

    The guidelines presented in the US Department of Energy, General Design Criteria (DOE 6430.1A), and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well defined approach to determine the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of loads combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify the unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines

  5. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, R.M.; Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.

  6. Consistent Visual Analyses of Intrasubject Data

    Science.gov (United States)

    Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli

    2010-01-01

    Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…

  7. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...

  8. On the existence of consistent price systems

    DEFF Research Database (Denmark)

    Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan

    2014-01-01

    We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...

  9. Dynamic phonon exchange requires consistent dressing

    International Nuclear Information System (INIS)

    Hahne, F.J.W.; Engelbrecht, C.A.; Heiss, W.D.

    1976-01-01

    It is shown that states with undesirable properties (such as ghosts, states with complex eigenenergies and states with unrestricted normalization) emerge from two-body calculations using dynamic effective interactions if one is not careful in introducing single-particle self-energy insertions in a consistent manner

  10. Consistent feeding positions of great tit parents

    NARCIS (Netherlands)

    Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, Ph.

    2006-01-01

    When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is

  11. Consistency of the postulates of special relativity

    International Nuclear Information System (INIS)

    Gron, O.; Nicola, M.

    1976-01-01

    In a recent article in this journal, Kingsley has tried to show that the postulates of special relativity contradict each other. It is shown that the arguments of Kingsley are invalid because of an erroneous appeal to symmetry in a nonsymmetric situation. The consistency of the postulates of special relativity and the relativistic kinematics deduced from them is restated

  12. The least weighted squares II. Consistency and asymptotic normality

    Czech Academy of Sciences Publication Activity Database

    Víšek, Jan Ámos

    2002-01-01

    Roč. 9, č. 16 (2002), s. 1-28 ISSN 1212-074X R&D Projects: GA AV ČR KSK1019101 Grant - others:GA UK(CR) 255/2000/A EK /FSV Institutional research plan: CEZ:AV0Z1075907 Keywords : robust regression * consistency * asymptotic normality Subject RIV: BA - General Mathematics

  13. Consistent dynamical and statistical description of fission and comparison

    Energy Technology Data Exchange (ETDEWEB)

    Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    A survey of research on the consistent dynamical and statistical description of fission is briefly presented. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression obtained by the Strutinsky method, as given by P. Frobrich et al., are compared and analyzed. (2 figs.).

  14. Protective Factors, Risk Indicators, and Contraceptive Consistency Among College Women.

    Science.gov (United States)

    Morrison, Leslie F; Sieving, Renee E; Pettingell, Sandra L; Hellerstedt, Wendy L; McMorris, Barbara J; Bearinger, Linda H

    2016-01-01

    To explore risk and protective factors associated with consistent contraceptive use among emerging adult female college students and whether effects of risk indicators were moderated by protective factors. Secondary analysis of National Longitudinal Study of Adolescent to Adult Health Wave III data. Data collected through in-home interviews in 2001 and 2002. National sample of 18- to 25-year-old women (N = 842) attending 4-year colleges. We examined relationships between protective factors, risk indicators, and consistent contraceptive use. Consistent contraceptive use was defined as use all of the time during intercourse in the past 12 months. Protective factors included external supports of parental closeness and relationship with caring nonparental adult and internal assets of self-esteem, confidence, independence, and life satisfaction. Risk indicators included heavy episodic drinking, marijuana use, and depression symptoms. Multivariable logistic regression models were used to evaluate relationships between protective factors and consistent contraceptive use and between risk indicators and contraceptive use. Self-esteem, confidence, independence, and life satisfaction were significantly associated with more consistent contraceptive use. In a final model including all internal assets, life satisfaction was significantly related to consistent contraceptive use. Marijuana use and depression symptoms were significantly associated with less consistent use. With one exception, protective factors did not moderate relationships between risk indicators and consistent use. Based on our findings, we suggest that risk and protective factors may have largely independent influences on consistent contraceptive use among college women. A focus on risk and protective factors may improve contraceptive use rates and thereby reduce unintended pregnancy among college students. Copyright © 2016 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses. Published
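
    The analysis described above relies on multivariable logistic regression relating protective factors and risk indicators to consistent contraceptive use. The sketch below shows that general kind of model with statsmodels on simulated placeholder data; the variable names and effects are illustrative assumptions, not the Add Health variables or the study's estimates.

```python
# Illustrative multivariable logistic regression of "consistent use" on one
# protective factor and two risk indicators, using simulated placeholder data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 842
df = pd.DataFrame({
    "life_satisfaction": rng.normal(0, 1, n),     # protective factor (standardized)
    "marijuana_use": rng.integers(0, 2, n),       # risk indicator (0/1)
    "depression_symptoms": rng.normal(0, 1, n),   # risk indicator (standardized)
})
logit = (-0.2 + 0.5 * df.life_satisfaction
         - 0.6 * df.marijuana_use - 0.3 * df.depression_symptoms)
df["consistent_use"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["life_satisfaction", "marijuana_use", "depression_symptoms"]])
model = sm.Logit(df["consistent_use"], X).fit(disp=False)
print(np.exp(model.params))        # odds ratios for each predictor
```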

  15. Application of the k{sub 0}-INAA method for analysis of biological samples at the pneumatic station of the IEA-R1 nuclear research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Puerta, Daniel C.; Figueiredo, Ana Maria G.; Semmler, Renato, E-mail: dcpuerta@hotmail.com, E-mail: anamaria@ipen.br, E-mail: rsemmler@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Jacimovic, Radojko, E-mail: radojko.jacimovic@ijs.si [Jozef Stefan Institute (JSI), Ljubljana, LJU (Slovenia). Department of Environmental Sciences

    2013-07-01

    As part of the process of implementation of the k{sub 0}-INAA standardization method at the Neutron Activation Laboratory (LAN-IPEN), Sao Paulo, Brazil, this study presents the results obtained for the analysis of short and medium-lived nuclides in biological samples by k{sub 0}-INAA using the program k{sub 0}-IAEA, provided by the International Atomic Energy Agency (IAEA). The elements Al, Ba, Br, Na, K, Mn, Mg, Sr and V were determined with respect to gold ({sup 197}Au) using the pneumatic station facility of the IEA-R1 4.5 MW swimming pool nuclear research reactor, Sao Paulo. Characterization of the pneumatic station was carried out by using the 'bare triple-monitor' method with {sup 197}Au-{sup 96}Zr-{sup 94}Zr. The Certified Reference Material IRMM-530R Al-0.1%Au alloy and high purity zirconium comparators were used. The efficiency curves of the gamma-ray spectrometer used were determined by measuring calibrated radioactive sources at the usually utilized counting geometries. The method was validated by analyzing the reference materials NIST SRM 1547 Peach Leaves, INCT-MPH-2 Mixed Polish Herbs and NIST SRM 1573a Tomato Leaves. The concentration results obtained agreed with certified, reference and recommended values, showing relative errors (bias, %) less than 30% for most elements. The Coefficients of Variation were below 20%, showing a good reproducibility of the results. The E{sub n}-number showed that all results, except Na in NIST SRM 1547 and NIST SRM 1573a and Al in INCT-MPH-2, were within 95% confidence interval. (author)

  16. Sampling method of water sources at study site Taiping, Perak and Pulau Burung, Penang for research on pollutant movement in underground water

    International Nuclear Information System (INIS)

    Mohd Rifaie Mohd Murtadza; Mohd Tadza Abdul Rahman; Kamarudin Samuding; Roslanzairi Mostapa

    2005-01-01

    This paper explains the water sampling method used to take water samples from the study sites in Taiping, Perak and Pulau Burung, Pulau Pinang. The sampling involved collecting groundwater samples from boreholes and surface water samples from canals, rivers, ponds and ex-mining ponds at several locations at the study sites. The study also describes the instruments and chemicals used. The main purpose of the sampling is to obtain the important water quality parameters such as pH, conductivity, Total Dissolved Solids (TDS), heavy metals, anions, cations, and environmental isotope delta values (d) for 18O, deuterium and tritium. A correct sampling method, carried out according to standards, is very important to ensure accurate and precise results. With this, the laboratory test results can be fully utilized to interpret the movement of pollutants. (Author)

  17. A consistent interpretation of quantum mechanics

    International Nuclear Information System (INIS)

    Omnes, Roland

    1990-01-01

    Some mostly recent theoretical and mathematical advances can be linked together to yield a new consistent interpretation of quantum mechanics. It relies upon a unique and universal interpretative rule of a logical character which is based upon Griffiths' consistent histories. Some new results in semi-classical physics allow classical physics to be derived from this rule, including its logical aspects, and to prove accordingly the existence of determinism within the quantum framework. Together with decoherence, this can be used to retrieve the existence of facts, despite the probabilistic character of the theory. Measurement theory can then be made entirely deductive. It is accordingly found that wave packet reduction is a logical property, whereas one can always choose to avoid using it. The practical consequences of this interpretation are most often in agreement with the Copenhagen formulation but they can be proved never to give rise to any logical inconsistency or paradox. (author)

  18. Self-consistency in Capital Markets

    Science.gov (United States)

    Benbrahim, Hamid

    2013-03-01

    Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result, market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.

  19. Student Effort, Consistency and Online Performance

    Directory of Open Access Journals (Sweden)

    Hilde Patron

    2011-07-01

    Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.

  20. Consistent thermodynamic properties of lipids systems

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures ... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging experimental databank of lipids systems data in order to improve the performance of predictive thermodynamic models was confirmed in this work by analyzing the calculated values of original UNIFAC model. For solid-liquid equilibrium (SLE) data, new consistency tests have been developed [2]. Some of the developed tests were based in the quality tests proposed for VLE data...

  1. Consistency relation for cosmic magnetic fields

    DEFF Research Database (Denmark)

    Jain, R. K.; Sloth, M. S.

    2012-01-01

    If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the cosmic microwave background anisotropies and large scale structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.

  2. Consistent Estimation of Partition Markov Models

    Directory of Open Access Journals (Sweden)

    Jesús E. García

    2017-04-01

    Full Text Available The Partition Markov Model characterizes the process by a partition L of the state space, where the elements in each part of L share the same transition probability to an arbitrary element in the alphabet. This model aims to answer the following questions: what is the minimal number of parameters needed to specify a Markov chain and how to estimate these parameters. In order to answer these questions, we build a consistent strategy for model selection which consists of: given a size n realization of the process, finding a model within the Partition Markov class, with a minimal number of parts to represent the process law. From the strategy, we derive a measure that establishes a metric in the state space. In addition, we show that if the law of the process is Markovian, then, eventually, when n goes to infinity, L will be retrieved. We show an application to model internet navigation patterns.
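
    To make the partition idea concrete, the sketch below estimates empirical transition distributions from a toy sequence and naively groups states whose estimated rows are close, so one set of parameters can serve each group. The sequence, alphabet and tolerance are illustrative, and this greedy clustering is not the paper's consistent, penalty-based selection strategy.

```python
# Toy illustration of grouping states with (nearly) identical transition
# distributions, the core idea behind a Partition Markov Model. Not the paper's
# actual estimator; sequence and tolerance are invented.
from collections import Counter, defaultdict

sequence = "abababbaabababbaabab"
alphabet = sorted(set(sequence))

# Empirical transition distributions P(next | current).
counts = defaultdict(Counter)
for current, nxt in zip(sequence, sequence[1:]):
    counts[current][nxt] += 1

rows = {}
for s in alphabet:
    total = sum(counts[s].values())
    rows[s] = tuple(counts[s][t] / total for t in alphabet)

# Greedily merge states whose transition rows differ by less than a tolerance.
tol = 0.15
parts = []
for s in alphabet:
    for part in parts:
        ref = rows[part[0]]
        if max(abs(a - b) for a, b in zip(rows[s], ref)) < tol:
            part.append(s)
            break
    else:
        parts.append([s])

print(rows)
print(parts)
```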

  3. Internal Branding and Employee Brand Consistent Behaviours

    DEFF Research Database (Denmark)

    Mazzei, Alessandra; Ravazzani, Silvia

    2017-01-01

    Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a non... constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the nonnormative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand consistent behaviours, the most effective communication practices are those characterised as enablement-oriented. Such a communication creates the organizational conditions adequate to sustain...

  4. Self-consistent velocity dependent effective interactions

    International Nuclear Information System (INIS)

    Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.

    1993-09-01

    The field coupling method is extended to a system with a velocity dependent mean potential. By means of this method, we can derive the effective interactions which are consistent with the mean potential. The self-consistent velocity dependent effective interactions are applied to the microscopic analysis of the structures of giant dipole resonances (GDR) of 148,154Sm, of the first excited 2+ states of Sn isotopes and of the first excited 3- states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy weighted sum rule values, and in reducing B(Eλ) values. (author)

  5. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption for cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  6. Consistency relations in effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Munshi, Dipak; Regan, Donough, E-mail: D.Munshi@sussex.ac.uk, E-mail: D.Regan@sussex.ac.uk [Astronomy Centre, School of Mathematical and Physical Sciences, University of Sussex, Brighton BN1 9QH (United Kingdom)

    2017-06-01

    The consistency relations in large scale structure relate the lower-order correlation functions with their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT) which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled velocity divergence θ̄. Assuming a ΛCDM background cosmology, we find the correction to SPT results becomes important at k ≳ 0.05 h/Mpc and that the suppression of EFT relative to SPT results, which scales as the square of the wave number k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space which may be of interest for tests of theories of modified gravity.

  7. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  8. Orthology and paralogy constraints: satisfiability and consistency.

    Science.gov (United States)

    Lafond, Manuel; El-Mabrouk, Nadia

    2014-01-01

    A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided into two parts: (1) Is C satisfiable, i.e. can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm for the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.
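
    For the special case where the pairwise relation is fully specified, satisfiability is known to reduce to the orthology graph being a cograph (containing no induced path on four genes); the brute-force check below only illustrates that characterization on a hypothetical five-gene family and is not the authors' polynomial-time algorithm.

      from itertools import combinations, permutations

      def is_cograph(genes, orthologous):
          # orthologous: set of frozenset({a, b}) pairs meaning "a and b are orthologs"
          def adj(a, b):
              return frozenset((a, b)) in orthologous
          for quad in combinations(genes, 4):
              for a, b, c, d in permutations(quad):
                  # induced P4: edges a-b, b-c, c-d and no other edge among the four
                  if (adj(a, b) and adj(b, c) and adj(c, d)
                          and not adj(a, c) and not adj(a, d) and not adj(b, d)):
                      return False
          return True

      # Hypothetical gene family with a full orthology/paralogy labelling
      genes = ["g1", "g2", "g3", "g4", "g5"]
      orthologous = {frozenset(p) for p in [("g1", "g2"), ("g3", "g4"),
                                            ("g1", "g3"), ("g2", "g4")]}
      print("satisfiable as a full relation set (cograph):", is_cograph(genes, orthologous))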

  9. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  10. Enhancing rooting consistency in Rosa damascena scions

    African Journals Online (AJOL)

    use

    2011-11-21

    Nov 21, 2011 ... 3Pharmaceutics Research Center, Kerman University of Medical Science, Kerman, Iran. Accepted 14 ... containing; sand, soil, organic materials and vermiculite. ... tap water followed by surface sterilization of scions using an.

  11. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed.

  12. Consistency in color parameters of a commonly used shade guide.

    Science.gov (United States)

    Tashkandi, Esam

    2010-01-01

    The use of shade guides to assess the color of natural teeth subjectively remains one of the most common means of dental shade assessment. Any variation in the color parameters of the different shade guides may lead to significant clinical implications, particularly since communication between the clinic and the dental laboratory is based on the shade guide designation. The purpose of this study was to investigate the consistency of the L∗a∗b∗ color parameters of a sample of a commonly used shade guide. The color parameters of a total of 100 VITAPAN Classical Vacuum shade guides (VITA Zahnfabrik, Bad Säckingen, Germany) were measured using an X-Rite ColorEye 7000A Spectrophotometer (Grand Rapids, Michigan, USA). Each shade guide consists of 16 tabs with different designations. Each shade tab was measured five times and the average values were calculated. The ΔE between the average L∗a∗b∗ value for each shade tab and the average of the 100 shade tabs of the same designation was calculated. Using Student's t-test, no significant differences were found among the measured sample. There is a high level of consistency in the color parameters of the measured VITAPAN Classical Vacuum shade guide sample.
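
    The ΔE referred to here is, in its common CIE76 form, simply the Euclidean distance between two L∗a∗b∗ triplets; the sketch below uses invented tab measurements to show the computation and is not the study's data.

      import numpy as np

      def delta_e_cie76(lab1, lab2):
          # Euclidean distance between two CIELAB coordinates
          return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

      # Hypothetical repeated measurements (L*, a*, b*) of one shade tab
      tab_measurements = np.array([[78.1, 1.2, 16.5],
                                   [78.3, 1.1, 16.7],
                                   [77.9, 1.3, 16.4],
                                   [78.2, 1.2, 16.6],
                                   [78.0, 1.2, 16.5]])
      tab_mean = tab_measurements.mean(axis=0)

      # Hypothetical grand mean over all tabs of the same designation
      designation_mean = np.array([78.5, 1.0, 16.9])
      print("Delta E vs designation mean:",
            round(delta_e_cie76(tab_mean, designation_mean), 2))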

  13. Evidence for Consistency of the Glycation Gap in Diabetes

    OpenAIRE

    Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.

    2011-01-01

    OBJECTIVE Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS We include...
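
    A minimal sketch of the G-gap computation described above, assuming a simple linear regression of HbA1c on fructosamine is used to obtain the fructosamine-predicted HbA1c; the paired values and units are invented for illustration only.

      import numpy as np

      # Hypothetical paired measurements: fructosamine (umol/L) and HbA1c (%)
      fructosamine = np.array([250., 280., 310., 340., 370., 400., 430.])
      hba1c        = np.array([5.6,  6.1,  6.8,  7.2,  7.9,  8.6,  9.0])

      # Fit HbA1c = a * fructosamine + b on the paired data
      a, b = np.polyfit(fructosamine, hba1c, 1)
      predicted_hba1c = a * fructosamine + b

      # Glycation gap: measured minus fructosamine-predicted HbA1c
      g_gap = hba1c - predicted_hba1c
      for f, g in zip(fructosamine, g_gap):
          print(f"fructosamine {f:5.0f}  G-gap {g:+.2f}")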

  14. Self-consistent gravitational self-force

    International Nuclear Information System (INIS)

    Pound, Adam

    2010-01-01

    I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.

  15. Measuring consistency of autobiographical memory recall in depression.

    LENUS (Irish Health Repository)

    Semkovska, Maria

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.

  16. Measuring consistency of autobiographical memory recall in depression.

    Science.gov (United States)

    Semkovska, Maria; Noone, Martha; Carton, Mary; McLoughlin, Declan M

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  17. High-performance speech recognition using consistency modeling

    Science.gov (United States)

    Digalakis, Vassilios; Murveit, Hy; Monaco, Peter; Neumeyer, Leo; Sankar, Ananth

    1994-12-01

    The goal of SRI's consistency modeling project is to improve the raw acoustic modeling component of SRI's DECIPHER speech recognition system and develop consistency modeling technology. Consistency modeling aims to reduce the number of improper independence assumptions used in traditional speech recognition algorithms so that the resulting speech recognition hypotheses are more self-consistent and, therefore, more accurate. At the initial stages of this effort, SRI focused on developing the appropriate base technologies for consistency modeling. We first developed the Progressive Search technology that allowed us to perform large-vocabulary continuous speech recognition (LVCSR) experiments. Since its conception and development at SRI, this technique has been adopted by most laboratories, including other ARPA contracting sites, doing research on LVCSR. Another goal of the consistency modeling project is to attack difficult modeling problems where there is a mismatch between the training and testing phases. Such mismatches may include outlier speakers, different microphones and additive noise. We were able to either develop new, or transfer and evaluate existing, technologies that adapted our baseline genonic HMM recognizer to such difficult conditions.

  18. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts are abstracted to (timed) automata and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Work Bench. The proposed techniques are illustrated with a case study that include otherwise difficult to analyze fault...

  19. A method for consistent precision radiation therapy

    International Nuclear Information System (INIS)

    Leong, J.

    1985-01-01

    Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A fluctuation from treatment to treatment, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, only applies to specific anatomical points selected for this positioning procedure and does not apply to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned, which may approach this degree of precision. (orig.)

  20. Gentzen's centenary the quest for consistency

    CERN Document Server

    Rathjen, Michael

    2015-01-01

    Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.

  1. Two consistent calculations of the Weinberg angle

    International Nuclear Information System (INIS)

    Fairlie, D.B.

    1979-01-01

    The Weinberg-Salam theory is reformulated as a pure Yang-Mills theory in a six-dimensional space, the Higgs field being interpreted as gauge potentials in the additional dimensions. Viewed in this way, the condition that the Higgs field transforms as a U(1) representation of charge one is equivalent to requiring a value of 30° for the Weinberg angle. A second consistent determination comes from the idea, borrowed from monopole theory, that the electromagnetic field is in the direction of the Higgs field. (Author)

  2. The Consistency of Performance Management System Based on Attributes of the Performance Indicator: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Jan Zavadsky

    2014-07-01

    Full Text Available Purpose: The performance management system (PMS) is a metasystem over all business processes at the strategic and operational level. Effectiveness of the various management systems depends on many factors, one of them being the consistent definition of each system's elements. The main purpose of this study is to explore whether the performance management systems of the sample companies are consistent and how companies can create such a system. Consistency in this case is based on a homogeneous definition of the attributes relating to the performance indicator as a basic element of the PMS. Methodology: At the beginning, we used an affinity diagram that helped us to clarify and to group the various attributes of performance indicators. The main research results were achieved through an empirical study carried out in a sample of Slovak companies. The criterion for selection was the existence of certified management systems according to ISO 9001. Representativeness of the sample companies was confirmed by application of Pearson's chi-squared test (χ²-test) due to the above standards. Findings: Drawing on a review of the literature, we defined four groups of attributes relating to the performance indicator: formal attributes, attributes of target value, informational attributes and attributes of evaluation. The whole set contains 21 attributes. The consistency of the PMS is based not on a maximum or minimum number of attributes, but on the same type of attributes for each performance indicator used in the PMS at both the operational and strategic level. The main findings are: companies use various financial and non-financial indicators at the strategic or operational level; companies determine various attributes of the performance indicator, but most of the performance indicators are defined differently; and we identified the common attributes for the whole sample of companies. Practical implications: The research results have got an implication for
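
    As an illustration of the representativeness check mentioned (a Pearson chi-squared goodness-of-fit test), the snippet below compares a hypothetical sample's distribution over industry categories with known population shares; all counts and proportions are invented.

      from scipy.stats import chisquare

      observed = [34, 21, 15, 10]                  # sampled companies per category
      population_share = [0.40, 0.30, 0.20, 0.10]  # known shares in the population
      total = sum(observed)
      expected = [share * total for share in population_share]

      stat, p_value = chisquare(f_obs=observed, f_exp=expected)
      print(f"chi2 = {stat:.2f}, p = {p_value:.3f}")
      if p_value > 0.05:
          print("No significant deviation: the sample looks representative.")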

  3. Consistent Alignment of Word Embedding Models

    Science.gov (United States)

    2017-03-02

    propose a solution that aligns variations of the same model (or different models) in a joint low-dimensional latent space leveraging carefully...representations of linguistic entities, most often referred to as embeddings. This includes techniques that rely on matrix factorization (Levy & Goldberg ...higher, the variation is much higher as well. As we increase the size of the neighborhood, or improve the quality of our sample by only picking the most
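
    A standard way to align two embedding models over a shared vocabulary is an orthogonal Procrustes rotation; the numpy sketch below is a generic illustration of that idea (random matrices stand in for real embeddings) and is not the specific joint latent-space method of this report.

      import numpy as np

      def procrustes_align(source, target):
          # Orthogonal matrix R minimizing ||source @ R - target||_F
          u, _, vt = np.linalg.svd(source.T @ target)
          return u @ vt

      rng = np.random.default_rng(0)
      target = rng.normal(size=(1000, 50))                   # embeddings from model A
      rotation = np.linalg.qr(rng.normal(size=(50, 50)))[0]  # hidden rotation
      source = target @ rotation.T + 0.01 * rng.normal(size=(1000, 50))  # model B

      R = procrustes_align(source, target)
      error = np.linalg.norm(source @ R - target) / np.linalg.norm(target)
      print(f"relative alignment error: {error:.4f}")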

  4. The Potential of Online Respondent Data for Choice Modeling in Transportation Research: Evidence from Stated Preference Experiments using Web-based Samples

    OpenAIRE

    Hoffer, Brice

    2015-01-01

    The aim of this thesis is to analyze the potential of online survey services for conducting stated preference experiments in the field of transportation planning. Several web products for hosting questionnaires are evaluated with regard to the important features required when conducting a stated preference survey. Based on this evaluation, the open-source platform LimeSurvey is the most appropriate for this kind of research. A stated preference questionnaire about pedestrians' route choice in a Sin...

  5. Sample Applications of the Second Generation Intact Stability Criteria – Robustness and Consistency Analysis

    DEFF Research Database (Denmark)

    Schrøter, Carsten; Lützen, Marie; Erichsen, Henrik

    2017-01-01

    , if needed, a direct numerical simulation. The present paper summarizes results testing the vulnerability levels in these new stability criteria. The calculations are carried out for 17 ships using the full matrix of operational draughts, trims and GM values. Each failure mode criterion is examined...

  6. Consistent associations between measures of psychological stress and CMV antibody levels in a large occupational sample

    NARCIS (Netherlands)

    Rector, J.L.; Dowd, J.B.; Loerbroks, A.; Burns, V.E.; Moss, P.A.; Jarczok, M.N.; Stalder, T.; Hoffman, K.; Fischer, J.E.; Bosch, J.A.

    2014-01-01

    Cytomegalovirus (CMV) is a herpes virus that has been implicated in biological aging and impaired health. Evidence, largely accrued from small-scale studies involving select populations, suggests that stress may promote non-clinical reactivation of this virus. However, absent is evidence from larger

  7. Consistent resolution of some relativistic quantum paradoxes

    International Nuclear Information System (INIS)

    Griffiths, Robert B.

    2002-01-01

    A relativistic version of the (consistent or decoherent) histories approach to quantum theory is developed on the basis of earlier work by Hartle, and used to discuss relativistic forms of the paradoxes of spherical wave packet collapse, Bohm's formulation of the Einstein-Podolsky-Rosen paradox, and Hardy's paradox. It is argued that wave function collapse is not needed for introducing probabilities into relativistic quantum mechanics, and in any case should never be thought of as a physical process. Alternative approaches to stochastic time dependence can be used to construct a physical picture of the measurement process that is less misleading than collapse models. In particular, one can employ a coarse-grained but fully quantum-mechanical description in which particles move along trajectories, with behavior under Lorentz transformations the same as in classical relativistic physics, and detectors are triggered by particles reaching them along such trajectories. States entangled between spacelike separate regions are also legitimate quantum descriptions, and can be consistently handled by the formalism presented here. The paradoxes in question arise because of using modes of reasoning which, while correct for classical physics, are inconsistent with the mathematical structure of quantum theory, and are resolved (or tamed) by using a proper quantum analysis. In particular, there is no need to invoke, nor any evidence for, mysterious long-range superluminal influences, and thus no incompatibility, at least from this source, between relativity theory and quantum mechanics

  8. Self-consistent model of confinement

    International Nuclear Information System (INIS)

    Swift, A.R.

    1988-01-01

    A model of the large-spatial-distance, zero--three-momentum, limit of QCD is developed from the hypothesis that there is an infrared singularity. Single quarks and gluons do not propagate because they have infinite energy after renormalization. The Hamiltonian formulation of the path integral is used to quantize QCD with physical, nonpropagating fields. Perturbation theory in the infrared limit is simplified by the absence of self-energy insertions and by the suppression of large classes of diagrams due to vanishing propagators. Remaining terms in the perturbation series are resummed to produce a set of nonlinear, renormalizable integral equations which fix both the confining interaction and the physical propagators. Solutions demonstrate the self-consistency of the concepts of an infrared singularity and nonpropagating fields. The Wilson loop is calculated to provide a general proof of confinement. Bethe-Salpeter equations for quark-antiquark pairs and for two gluons have finite-energy solutions in the color-singlet channel. The choice of gauge is addressed in detail. Large classes of corrections to the model are discussed and shown to support self-consistency

  9. Subgame consistent cooperation a comprehensive treatise

    CERN Document Server

    Yeung, David W K

    2016-01-01

    Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior could lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions and the calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustainable if there is no guarantee that the agreed upon optimality principle at the beginning is maintained throughout the cooperation duration. It is due to the lack of this kind of guarantees that cooperative schemes fail to last till its end or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this “classic” problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation covering the up-to-date state of the art analyses in this important topic. It sets out to provide the theory, solution tec...

  10. Consistent mutational paths predict eukaryotic thermostability

    Directory of Open Access Journals (Sweden)

    van Noort Vera

    2013-01-01

    Full Text Available Abstract Background Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
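
    A toy sketch of the kind of substitution counting behind the reported lysine-to-arginine bias: given an already aligned mesophile/thermophile sequence pair (invented here), tally which residues replace which; the sequences are fabricated solely to make the K-to-R pattern visible.

      from collections import Counter

      # Hypothetical, pre-aligned protein sequences (same length, '-' marks a gap)
      mesophile   = "MKLVKEAGKTLDKEA-KQLLK"
      thermophile = "MRLVREAGRTLDREAAKQLLR"

      substitutions = Counter()
      for m, t in zip(mesophile, thermophile):
          if m != t and m != '-' and t != '-':
              substitutions[(m, t)] += 1

      # With these toy sequences the dominant substitution is K -> R,
      # mirroring the lysine-to-arginine pattern reported in the study.
      for (m, t), n in substitutions.most_common():
          print(f"{m} -> {t}: {n}")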

  11. Consistency of extreme flood estimation approaches

    Science.gov (United States)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodological procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three different approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used for assessing potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
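
    For the purely statistical approach mentioned (ordinary extreme value statistics), a common workflow is to fit a generalized extreme value distribution to annual maximum discharges and read off a low-probability quantile; the sketch below uses synthetic maxima, so the numbers carry no hydrological meaning.

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(42)
      # Synthetic annual maximum discharges (m^3/s) standing in for observations
      annual_maxima = genextreme.rvs(c=-0.1, loc=300, scale=80, size=60,
                                     random_state=rng)

      # Fit a GEV distribution and estimate the 100-year flood (1% exceedance)
      shape, loc, scale = genextreme.fit(annual_maxima)
      q100 = genextreme.ppf(0.99, shape, loc=loc, scale=scale)
      print(f"fitted shape={shape:.2f}, loc={loc:.1f}, scale={scale:.1f}")
      print(f"estimated 100-year flood: {q100:.0f} m^3/s")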

  12. Consistent biokinetic models for the actinide elements

    International Nuclear Information System (INIS)

    Leggett, R.W.

    2001-01-01

    The biokinetic models for Th, Np, Pu, Am and Cm currently recommended by the International Commission on Radiological Protection (ICRP) were developed within a generic framework that depicts gradual burial of skeletal activity in bone volume, depicts recycling of activity released to blood and links excretion to retention and translocation of activity. For other actinide elements such as Ac, Pa, Bk, Cf and Es, the ICRP still uses simplistic retention models that assign all skeletal activity to bone surface and depict a one-directional flow of activity from blood to long-term depositories to excreta. This mixture of updated and older models in ICRP documents has led to inconsistencies in dose estimates and interpretation of bioassay for radionuclides with reasonably similar biokinetics. This paper proposes new biokinetic models for Ac, Pa, Bk, Cf and Es that are consistent with the updated models for Th, Np, Pu, Am and Cm. The proposed models are developed within the ICRP's generic model framework for bone-surface-seeking radionuclides, and an effort has been made to develop parameter values that are consistent with results of comparative biokinetic data on the different actinide elements. (author)

  13. Using Trait-State Models to Evaluate the Longitudinal Consistency of Global Self-Esteem From Adolescence to Adulthood

    OpenAIRE

    Donnellan, M. Brent; Kenny, David A.; Trzesniewski, Kali H.; Lucas, Richard E.; Conger, Rand D.

    2012-01-01

    The present research used a latent variable trait-state model to evaluate the longitudinal consistency of self-esteem during the transition from adolescence to adulthood. Analyses were based on ten administrations of the Rosenberg Self-Esteem scale (Rosenberg, 1965) spanning the ages of approximately 13 to 32 for a sample of 451 participants. Results indicated that a completely stable trait factor and an autoregressive trait factor accounted for the majority of the variance in latent self-est...

  14. Soil sampling

    International Nuclear Information System (INIS)

    Fortunati, G.U.; Banfi, C.; Pasturenzi, M.

    1994-01-01

    This study attempts to survey the problems associated with techniques and strategies of soil sampling. Keeping in mind the well defined objectives of a sampling campaign, the aim was to highlight the most important aspect of representativeness of samples as a function of the available resources. Particular emphasis was given to the techniques and particularly to a description of the many types of samplers which are in use. The procedures and techniques employed during the investigations following the Seveso accident are described. (orig.)

  15. Kenyan female sex workers' use of female-controlled nonbarrier modern contraception: do they use condoms less consistently?

    Science.gov (United States)

    Yam, Eileen A; Okal, Jerry; Musyoki, Helgar; Muraguri, Nicholas; Tun, Waimar; Sheehy, Meredith; Geibel, Scott

    2016-03-01

    To examine whether nonbarrier modern contraceptive use is associated with less consistent condom use among Kenyan female sex workers (FSWs). Researchers recruited 579 FSWs using respondent-driven sampling. We conducted multivariate logistic regression to examine the association between consistent condom use and female-controlled nonbarrier modern contraceptive use. A total of 98.8% reported using male condoms in the past month, and 64.6% reported using female-controlled nonbarrier modern contraception. In multivariate analysis, female-controlled nonbarrier modern contraceptive use was not associated with decreased condom use with clients or nonpaying partners. Consistency of condom use is not compromised when FSWs use available female-controlled nonbarrier modern contraception. FSWs should be encouraged to use condoms consistently, whether or not other methods are used simultaneously. Copyright © 2016 Elsevier Inc. All rights reserved.
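
    A minimal sketch of the multivariate logistic regression step described (consistent condom use regressed on contraceptive use plus covariates), using entirely fabricated data and hypothetical variable names rather than the study's dataset.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 579  # sample size taken from the abstract; the data are fabricated
      df = pd.DataFrame({
          "modern_contraception": rng.integers(0, 2, n),
          "age": rng.normal(28, 6, n),
          "years_in_sex_work": rng.normal(5, 3, n).clip(0),
      })
      # Fabricated binary outcome: consistent condom use with clients
      p = 1 / (1 + np.exp(-(-0.4 + 0.03 * (df["age"] - 28))))
      df["consistent_condom_use"] = (rng.random(n) < p).astype(int)

      X = sm.add_constant(df[["modern_contraception", "age", "years_in_sex_work"]])
      result = sm.Logit(df["consistent_condom_use"], X).fit(disp=False)
      print(result.params)          # log-odds coefficients
      print(np.exp(result.params))  # odds ratios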

  16. The memory failures of everyday questionnaire (MFE): internal consistency and reliability.

    Science.gov (United States)

    Montejo Carrasco, Pedro; Montenegro, Peña Mercedes; Sueiro, Manuel J

    2012-07-01

    The Memory Failures of Everyday Questionnaire (MFE) is one of the most widely-used instruments to assess memory failures in daily life. The original scale has nine response options, making it difficult to apply; we created a three-point scale (0-1-2) with response choices that make it easier to administer. We examined the two versions' equivalence in a sample of 193 participants between 19 and 64 years of age. The test-retest reliability and internal consistency of the version we propose were also computed in a sample of 113 people. Several indicators attest to the two forms' equivalence: the correlation between the items' means (r = .94; p MFE 1-9. The MFE 0-2 provides a brief, simple evaluation, so we recommend it for use in clinical practice as well as research.
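
    Internal consistency for questionnaires such as the MFE is typically reported as Cronbach's alpha; the sketch below computes it from a hypothetical item-response matrix scored 0-2, as in the proposed short form (the number of items and all responses are invented).

      import numpy as np

      def cronbach_alpha(items):
          # items: 2-D array, rows = respondents, columns = questionnaire items
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_variances = items.var(axis=0, ddof=1).sum()
          total_variance = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_variances / total_variance)

      rng = np.random.default_rng(7)
      # 113 hypothetical respondents, 30 hypothetical items scored 0, 1 or 2,
      # generated with a shared person effect so that items correlate
      person = rng.normal(size=(113, 1))
      responses = np.clip(np.round(1 + 0.6 * person
                                   + 0.4 * rng.normal(size=(113, 30))), 0, 2)

      print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")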

  17. Consistent Partial Least Squares Path Modeling via Regularization.

    Science.gov (United States)

    Jung, Sunho; Park, JaeHong

    2018-01-01

    Partial least squares (PLS) path modeling is a component-based structural equation modeling approach that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it takes the strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
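
    The ridge-type fix described can be pictured as adding a small constant to the diagonal of the (consistent) correlation matrix of the predictor latent variables before solving for path coefficients; the numpy sketch below illustrates only that step on an invented, nearly collinear correlation structure and is not the authors' full PLSc procedure.

      import numpy as np

      def ridge_path_coefficients(r_xx, r_xy, lam):
          # Solve (R_xx + lam * I) beta = r_xy for the structural path coefficients
          k = r_xx.shape[0]
          return np.linalg.solve(r_xx + lam * np.eye(k), r_xy)

      # Invented correlations among three nearly collinear predictor latent variables
      r_xx = np.array([[1.00, 0.95, 0.60],
                       [0.95, 1.00, 0.58],
                       [0.60, 0.58, 1.00]])
      r_xy = np.array([0.55, 0.52, 0.40])  # correlations with the outcome latent variable

      for lam in (0.0, 0.05, 0.20):
          beta = ridge_path_coefficients(r_xx, r_xy, lam)
          print(f"lambda = {lam:4.2f}  beta = {np.round(beta, 3)}")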

  18. A consistent thermodynamic database for cement minerals

    International Nuclear Information System (INIS)

    Blanc, P.; Claret, F.; Burnol, A.; Marty, N.; Gaboreau, S.; Tournassat, C.; Gaucher, E.C.; Giffault, E.; Bourbon, X.

    2010-01-01

    work - the formation enthalpy and the Cp(T) function are taken from the literature or estimated - finally, the Log K(T) function is calculated, based on the selected dataset, and compared to experimental data gathered at different temperatures. Each experimental point is extracted from solution compositions by using PHREEQC with a selection of aqueous complexes consistent with the Thermochimie database. The selection was tested notably by drawing activity diagrams, allowing phase relations to be assessed. An example of such a diagram, drawn in the CaO-Al2O3-SiO2-H2O system, is displayed. It can be seen that low-pH concrete alteration proceeds essentially by decreasing the C/S ratio in C-S-H phases to the point where C-S-H are no longer stable and are replaced by zeolite, then clay minerals. This evolution corresponds to a decrease in silica activity, which is consistent with the pH decrease, as silica concentration depends essentially on pH. Some rather consistent phase relations have been obtained for the SO3-Al2O3-CaO-CO2-H2O system. Addition of iron(III) enlarges the AFm-SO4 stability field towards the low-temperature domain, whereas it decreases the pH domain where ettringite is stable. On the other hand, the stability field of katoite remains largely ambiguous, notably with respect to a hydrogarnet/grossular solid solution. With respect to other databases, this work was made consistent with a larger mineral selection, so that it can be used for modelling work in the cement-clay interaction context

  19. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel; Houborg, Rasmus; McCabe, Matthew

    2017-01-01

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months

  20. Self-consistent modelling of ICRH

    International Nuclear Information System (INIS)

    Hellsten, T.; Hedin, J.; Johnson, T.; Laxaaback, M.; Tennfors, E.

    2001-01-01

    The performance of ICRH is often sensitive to the shape of the high energy part of the distribution functions of the resonating species. This requires self-consistent calculations of the distribution functions and the wave-field. In addition to the wave-particle interactions and Coulomb collisions the effects of the finite orbit width and the RF-induced spatial transport are found to be important. The inward drift dominates in general even for a symmetric toroidal wave spectrum in the centre of the plasma. An inward drift does not necessarily produce a more peaked heating profile. On the contrary, for low concentrations of hydrogen minority in deuterium plasmas it can even give rise to broader profiles. (author)

  1. Non linear self consistency of microtearing modes

    International Nuclear Information System (INIS)

    Garbet, X.; Mourgues, F.; Samain, A.

    1987-01-01

    The self-consistency of a microtearing turbulence is studied in non-linear regimes where the ergodicity of the flux lines determines the electron response. The current which sustains the magnetic perturbation via the Ampere law results from the combined action of the radial electric field in the frame where the island chains are static and of the thermal electron diamagnetism. Numerical calculations show that at usual values of β_pol in tokamaks the turbulence can create a diffusion coefficient of order ν_th ρ_i², where ρ_i is the ion Larmor radius and ν_th the electron-ion collision frequency. On the other hand, collisionless regimes involving special profiles of each mode near the resonant surface seem possible.

  2. Consistent evolution in a pedestrian flow

    Science.gov (United States)

    Guan, Junbiao; Wang, Kaihua

    2016-03-01

    In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. It is found from a large number of numerical simulations that the ratios of the corresponding evacuee clusters evolve to consistent states despite 11 typical but different initial conditions, which may largely be attributed to a self-organization effect. Moreover, an appropriate proportion of initial defectors exhibiting herding behavior, coupled with an appropriate proportion of initial defectors exhibiting rational, independent thinking, are two necessary factors for a short evacuation time.

  3. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel

    2017-01-18

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months

  4. Thermodynamically consistent model calibration in chemical kinetics

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2011-05-01

    Full Text Available Abstract Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
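
    The core of a thermodynamically consistent calibration is fitting rate constants subject to detailed-balance (Wegscheider-type) constraints, for instance that the product of forward constants around a closed reaction cycle equals the product of reverse constants; the scipy sketch below shows that idea on a toy three-reaction cycle with fabricated values, not on the EGF/ERK model or the TCMC software from the paper.

      import numpy as np
      from scipy.optimize import minimize

      # Toy cycle A <-> B <-> C <-> A with rate constants
      # (kf1, kr1, kf2, kr2, kf3, kr3); the starting estimates are
      # fabricated and thermodynamically infeasible.
      measured = np.array([2.0, 1.0, 3.0, 0.5, 0.8, 4.0])

      def objective(log_k):
          return np.sum((np.exp(log_k) - measured) ** 2)

      def cycle_constraint(log_k):
          # Wegscheider condition: kf1*kf2*kf3 == kr1*kr2*kr3, i.e. the
          # difference of the log-sums must vanish
          return log_k[0] + log_k[2] + log_k[4] - (log_k[1] + log_k[3] + log_k[5])

      result = minimize(objective, x0=np.log(measured),
                        constraints=[{"type": "eq", "fun": cycle_constraint}])
      k_feasible = np.exp(result.x)
      cycle_ratio = (k_feasible[0] * k_feasible[2] * k_feasible[4]
                     / (k_feasible[1] * k_feasible[3] * k_feasible[5]))
      print("feasible rate constants:", np.round(k_feasible, 3))
      print("cycle ratio (should be 1):", round(cycle_ratio, 4))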

  5. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample or a sample where preparation may not be any more complex than dissolution of the sample in a given solvent. The last process alone can remove insoluble materials, which is especially helpful with samples in complex matrices if other interactions do not affect extraction. Here, it is very likely a large number of components will not dissolve and are, therefore, eliminated by a simple filtration process. In most cases, the process of sample preparation is not as simple as dissolution of the component of interest. At times, enrichment is necessary, that is, the component of interest is present in a very large volume or mass of material. It needs to be concentrated in some manner so a small volume of the concentrated or enriched sample can be injected into HPLC. 88 refs

  6. Environmental sampling

    International Nuclear Information System (INIS)

    Puckett, J.M.

    1998-01-01

    Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentration, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a Laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation

  7. "I didn't have anything to decide, I wanted to help my kids" - An interview-based study of consent procedures for sampling human biological material for genetic research in rural Pakistan.

    Science.gov (United States)

    Kongsholm, Nana Cecilie Halmsted; Lassen, Jesper; Sandøe, Peter

    2018-05-03

    Individual, comprehensive, and written informed consent is broadly considered an ethical obligation in research involving the sampling of human material. In developing countries, however, local conditions, such as widespread illiteracy, low levels of education, and hierarchical social structures complicate compliance with these standards. As a result, researchers may modify the consent process to secure participation. To evaluate the ethical status of such modified consent strategies it is necessary to assess the extent to which local practices accord with the values underlying informed consent. Over a two-week period in April 2014 we conducted semi-structured interviews with researchers from a genetic research institute in rural Pakistan and families who had given blood samples for their research. Interviews with researchers focused on the institute's requirements for consent, and the researchers' strategies for and experiences with obtaining consent in the field. Interviews with donors focused on their motivation for donating samples, their experience of consent and donation, and what factors were central in their decisions to give consent. Researchers often reported modifications to consent procedures suited to the local context, standardly employing oral and elder consent, and tailoring information to the social education level of donor families. Central themes in donors' accounts of their decision to consent were the hope of getting something out of their participation and their remarkably high levels of trust in the researchers. Several donor accounts indicated a degree of confusion about participation and diagnosis, resulting in misconceived expectations of therapeutic benefits. We argue that while building and maintaining trusting relationships in research is important - not least in developing countries - strategies that serve this endeavor should be supplemented with efforts to ensure proper provision and understanding of relevant information

  8. Self-consistent T-matrix theory of superconductivity

    Czech Academy of Sciences Publication Activity Database

    Šopík, B.; Lipavský, Pavel; Männel, M.; Morawetz, K.; Matlock, P.

    2011-01-01

    Roč. 84, č. 9 (2011), 094529/1-094529/13 ISSN 1098-0121 R&D Projects: GA ČR GAP204/10/0212; GA ČR(CZ) GAP204/11/0015 Institutional research plan: CEZ:AV0Z10100521 Keywords : superconductivity * T-matrix * superconducting gap * restricted self-consistency Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 3.691, year: 2011

  9. [Quantity research on epidermal growth factor in saliva and epidermal growth factor receptor in biopsy samples of recurrent aphthous ulcer patients].

    Science.gov (United States)

    Gu, Yang; Zhang, Gang; Lin, Mei

    2008-02-01

    To examine the change in epidermal growth factor (EGF) concentration in the saliva of recurrent aphthous ulcer (RAU) patients during the ulcerous and interval periods, and epidermal growth factor receptor (EGFR) expression in ulcer biopsy samples. EGF data were acquired through enzyme-linked immunosorbent assay (ELISA) and an EGF standard curve from 27 saliva samples collected from RAU patients both in the ulcerous period and in the interval period, and from 33 samples from normal persons. EGFR-RNA data of RAU biopsies, comprising 31 biopsy samples obtained from RAU patients during the ulcerous period and 35 samples from normal persons, were measured by QF-RT-PCR. All RAU samples were obtained at the same stage: all patients had minor aphthous ulcers and their ulcers had been present for no more than the first four days. All patients and normal persons were carefully selected to exclude any other diseases and any history of medication use. The EGF concentration of saliva in the RAU group at ulcer occurrence was higher than that in the interval period and the normal control with a significant test (F = 3.24, P ulcer occurrence was higher than the normal control with a significant test (t = 3.15, P ulcer occasion of RAU patients could be related to the decrease of EGF in saliva during the interval period, and that the ulcer self-cure of RAU patients would be contributed to

  10. Field Exploration and Life Detection Sampling for Planetary Analogue Research (FELDSPAR): Variability and Correlation in Biomarker and Mineralogy Measurements from Icelandic Mars Analogues

    Science.gov (United States)

    Gentry, D.; Amador, E.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Jacobsen, M.; Kirby, J.; McCaig, H.

    2018-01-01

    In situ exploration of planetary environments allows biochemical analysis of sub-centimeter-scale samples; however, landing sites are selected a priori based on measurable meter- to kilometer-scale geological features. Optimizing life detection mission science return requires both understanding the expected biomarker distributions across sample sites at different scales and efficiently using first-stage in situ geochemical instruments to justify later-stage biological or chemical analysis. Icelandic volcanic regions have an extensive history as Mars analogue sites due to desiccation, low nutrient availability, and temperature extremes, in addition to the advantages of geological youth and isolation from anthropogenic contamination. Many Icelandic analogue sites are also rugged and remote enough to create the same type of instrumentation and sampling constraints typically faced by robotic exploration.

  11. Effect of seawater on consistency, infiltration rate and swelling characteristics of montmorillonite clay

    Directory of Open Access Journals (Sweden)

    Mohie Eldin Elmashad

    2016-08-01

    Full Text Available This paper presents the results of an experimental investigation performed to quantify the effect of mixing clayey soils with saltwater on the consistency and swelling characteristics of clays. Massive natural clay deposits and compacted clay backfills either exist or are used in certain important and sensitive applications such as dams, liners, barriers and buffers in waste disposal facilities. In many cases, the clay deposits in these applications are subjected to saltwater. However, in standard laboratory classification tests, distilled or potable water is usually used in mixing test samples. This may lead to faulty interpretation of the actual in-situ consistency and volume change behaviors. In this research, an attempt is made to quantify the changes in consistency and swelling of clay soils from various locations around the Nile valley, possessing a wide range of consistency, when mixed with natural seawater at different salt concentrations. The results showed that an increase in the salt concentration of the mixing water may result in a major decrease in the liquid limit and swelling characteristics of high-plasticity montmorillonite clays. The reduction in the swelling of the clay soils is also proportional to the rate of saltwater infiltration. In an attempt to correlate the swelling of clays to the rate of water infiltration, a new simplified laboratory apparatus is proposed, where swelling and infiltration are measured in one simple test, the “swelling infiltrometer”.

  12. Consistency test of the standard model

    International Nuclear Information System (INIS)

    Pawlowski, M.; Raczka, R.

    1997-01-01

    If the 'Higgs mass' is not the physical mass of a real particle but rather an effective ultraviolet cutoff, then a process-energy dependence of this cutoff must be admitted. Precision data from at least two experimental energy scales are necessary to test this hypothesis. The first set of precision data is provided by the Z-boson peak experiments. We argue that the second set can be given by 10-20 GeV e⁺e⁻ colliders. We pay attention to the special role of tau polarization experiments, which can be sensitive to the 'Higgs mass' for a sample of ∼10⁸ produced tau pairs. We argue that such a study may be regarded as a negative self-consistency test of the Standard Model and of most of its extensions.

  13. Exploring the Consistent behavior of Information Services

    Directory of Open Access Journals (Sweden)

    Kapidakis Sarantos

    2016-01-01

    Full Text Available Computer services are normally assumed to work well all the time. This usually happens for crucial services like electronic banking services, but not necessarily for others in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that will help predict the consistency of their behavior and the quality of the harvesting, which is harder because of the transient conditions, the many services and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant part of the OAI services have ceased working, while many others occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others also fail, always or sometimes, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes to study their behavior in more detail.
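
    A minimal sketch of the kind of probe behind such a classification: issue an OAI-PMH Identify request and bucket the outcome into coarse categories (the endpoint URL and the category names are illustrative and not the paper's taxonomy).

      import requests

      def classify_oai_service(base_url, timeout=15):
          # Probe an OAI-PMH endpoint and return a coarse behaviour category
          try:
              response = requests.get(base_url, params={"verb": "Identify"},
                                      timeout=timeout)
          except requests.exceptions.RequestException:
              return "unreachable"            # network failure, possibly transient
          if response.status_code != 200:
              return f"http-error-{response.status_code}"
          if b"<Identify>" in response.content:
              return "responding"             # may still return only part of a request
          return "malformed-response"         # answered, but not with OAI-PMH XML

      # Hypothetical endpoint; repeating the probe over time reveals how
      # consistent the service's behaviour is
      print(classify_oai_service("http://example.org/oai"))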

  14. [Consistent Declarative Memory with Depressive Symptomatology].

    Science.gov (United States)

    Botelho de Oliveira, Silvia; Flórez, Ruth Natalia Suárez; Caballero, Diego Andrés Vásquez

    2012-12-01

    Some studies have suggested that potentiated remembrance of negative events in people with depressive disorders is an important factor in the etiology, course and maintenance of depression. The objective was to evaluate emotional memory in people with and without depressive symptomatology by means of an audio-visual test. 73 university students were evaluated, male and female, between 18 and 40 years old, distributed in two groups: with depressive symptomatology (32) and without depressive symptomatology (40), using the Center for Epidemiologic Studies Depression Scale (CES-D) and a cut-off point of 20. There were no meaningful differences in free and voluntary recall between the groups with and without depressive symptomatology, even though both groups assigned a higher emotional value to the audio-visual test and associated it with emotional sadness. People with depressive symptomatology did not exhibit the mnemonic potentiation effect generally associated with the content of the emotional version of the test; therefore, the hypothesis of emotional consistency was not validated. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  15. Self consistent field theory of virus assembly

    Science.gov (United States)

    Li, Siyu; Orland, Henri; Zandi, Roya

    2018-04-01

    The ground state dominance approximation (GSDA) has been extensively used to study the assembly of viral shells. In this work we employ the self-consistent field theory (SCFT) to investigate the adsorption of RNA onto positively charged spherical viral shells and examine the conditions under which GSDA does not apply and SCFT has to be used to obtain a reliable solution. We find that there are two regimes in which GSDA does work: first, when the genomic RNA length is long enough compared to the capsid radius, and second, when the interaction between the genome and capsid is so strong that the genome is basically localized next to the wall. We find that for the case in which RNA is more or less distributed uniformly in the shell, regardless of the length of RNA, GSDA is not a good approximation. We observe that as the polymer-shell interaction becomes stronger, the energy gap between the ground state and first excited state increases and thus GSDA becomes a better approximation. We also present results for the genome persistence length obtained through the tangent-tangent correlation length and show that it is zero in the case of GSDA but equal to the inverse of the energy gap when using SCFT.

  16. Self-consistent nuclear energy systems

    International Nuclear Information System (INIS)

    Shimizu, A.; Fujiie, Y.

    1995-01-01

    A concept of self-consistent nuclear energy systems (SCNES) has been proposed as an ultimate goal of the nuclear energy system in the coming centuries. SCNES should realize a stable and unlimited energy supply without endangering the human race and the global environment. It is defined as a system that realizes at least the following four objectives simultaneously: (a) energy generation - attain high efficiency in the utilization of fission energy; (b) fuel production - secure an inexhaustible energy source: breeding of fissile material with a breeding ratio greater than one and complete burning of transuranium elements through recycling; (c) burning of radionuclides - zero release of radionuclides from the system: complete burning of transuranium elements and elimination of radioactive fission products by neutron capture reactions through recycling; (d) system safety - achieve system safety both for the public and for experts: eliminate criticality-related safety issues by using natural laws and simple logic. This paper describes the concept of SCNES and discusses the feasibility of the system. Both the ''neutron balance'' and the ''energy balance'' of the system are introduced as necessary conditions to be satisfied at least by SCNES. Evaluations made so far indicate that both the neutron balance and the energy balance can be realized by fast reactors but not by thermal reactors. Concerning system safety, two safety concepts, ''self-controllability'' and ''self-terminability'', are introduced to eliminate the criticality-related safety issues in fast reactors. (author)

  17. Toward a consistent model for glass dissolution

    International Nuclear Information System (INIS)

    Strachan, D.M.; McGrail, B.P.; Bourcier, W.L.

    1994-01-01

    Understanding the process of glass dissolution in aqueous media has advanced significantly over the last 10 years through the efforts of many scientists around the world. Mathematical models describing the glass dissolution process have also advanced from simple empirical functions to structured models based on fundamental principles of physics, chemistry, and thermodynamics. Although borosilicate glass has been selected as the waste form for disposal of high-level wastes in at least five countries, there is no international consensus on the fundamental methodology for modeling glass dissolution that could be used in assessing the long-term performance of waste glasses in a geologic repository setting. Each repository program is developing its own model and supporting experimental data. In this paper, we critically evaluate a selected set of these structured models and show that a consistent methodology for modeling glass dissolution processes is available. We also propose a strategy for a future coordinated effort to obtain the model input parameters that are needed for long-term performance assessments of glass in a geologic repository. (author) 4 figs., tabs., 75 refs

  18. Consistency Study About Critical Thinking Skill of PGSD Students (Teacher Candidate of Elementary School) on Energy Material

    Science.gov (United States)

    Wijayanti, M. D.; Raharjo, S. B.; Saputro, S.; Mulyani, S.

    2017-09-01

    This study aims to examine the consistency of the critical thinking ability of PGSD students on Energy material. The study population is PGSD students at UNS Surakarta. A sample of 101 students was obtained using a cluster random sampling technique. The consistency of students' responses, used to gauge the critical thinking ability of PGSD students, can serve as a benchmark of their understanding of the equivalence of science (IPA) problems, especially Energy material presented through various phenomena. This research uses a descriptive method. Data were obtained through questionnaires and interviews. The results show that the average level of critical thinking in this study falls into three levels: level 1 (54.85%), level 2 (19.93%), and level 3 (25.23%). These results reflect students' weak understanding of the Energy material. In addition, the indicators for identifying assumptions and analyzing arguments are also still low. Ideally, consistency of critical thinking ability as a whole supports the expansion of students' conceptual understanding. The results of the study may serve as a reference for subsequent research aimed at achieving positive changes in students' critical thinking ability, which in turn improves their conceptual understanding, especially of Energy material in various real problems.

  19. CTD and Water Sample Data from Research Vessel Robert Gordon Sproul in the NE Pacific, 24 October 2013 (NCEI Accession 0157082)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The expedition by research vessel Robert Gordon Sproul from 23 to 25 October 2013 had the objective to recover a broken mooring from the CORC project (Consortium on...

  20. CTD and Water Sample Data from Research Vessel New Horizon in the NE Pacific, 19-22 September 2008 (NCEI Accession 0156931)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The expedition by research vessel New Horizon from 19 to 22 September 2008 had the objective to deploy a number of moored platforms for the CORC project (Consortium...

  1. Processed CTD and Water Sample Data from Research Vessel Ocean Starr in the NE Pacific, Aug. 31 and Sept. 01, 2012 (NCEI Accession 0156932)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The expedition by research vessel Ocean Starr on Aug. 31 and Sept. 01, 2012 had the objective to recover and re-deploy a number of moored platforms from the CORC...

  2. Processed CTD and Water Sample Data from Research Vessel Roger Revelle, Expedition RR1214, in the NE Pacific in November 2012 (NCEI Accession 0156228)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Expedition RR1214 by research vessel Roger Revelle was primarily a transit from French Polynesia to the US mainland. However, a small scientific program was...

  3. Size-fractioned zooplankton biomass data sampled during the Institute of Marine Research Norwegian Sea survey from 1995 to 2005 (NODC Accession 0049894)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data were downloaded from the COPEPOD data base website. The dataset contains zooplankton data from the Institute of Marine Research (Bergen Norway) Norwegian...

  4. Implications and applications of systematic reviews for evidence-based dentistry and comparative effectiveness research: A sample study on antibiotics for oro-facial cellulitis treatment

    OpenAIRE

    Quyen Bach; Vandan Kasar; Francesco Chiappelli

    2015-01-01

    Introduction: Comparative effectiveness and efficacy research for analysis and practice (CEERAP) was performed to assess the effects of penicillin-based versus erythromycin-based antibiotic treatments in patients with skin and soft tissue infections (SSTIs) including cellulitis, impetigo, and erysipelas. Because SSTIs, especially orofacial cellulitis, are volatile infectious diseases of a life-threatening nature, research on the most efficacious remedies is necessary. Methods: The stringent b...

  5. Wetlands Research Program. Evaluation of Methods for Sampling Vegetation and Delineating Wetlands Transition Zones in Coastal West-Central Florida, January 1979-May 1981.

    Science.gov (United States)

    1984-04-01

    method (Catana 1963) compensates for some of the limitations of the point-centered quarter method. A quarter is established at a sampling point and...Principal Soil Areas of Florida--A Supplement to the General Soils Map. University of Florida, in cooperation with USDA, Bulletin 717. Catana, H. J. 1963

  6. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
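
    The abstract notes that sample R code accompanies the article; the sketch below is not that code but a minimal Python illustration, under stated assumptions, of the same idea: a nonparametric regression machine (random forest or k-nearest neighbours) fitted to the 0/1 labels estimates the conditional probability P(y = 1 | x) directly. The data are synthetic, not the appendicitis or Pima Indians data sets analyzed in the paper.

    ```python
    # Minimal sketch (not the authors' R code): estimating individual
    # probabilities for a binary response by regressing on the 0/1 labels,
    # the idea behind "probability machines". Synthetic data only.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 4))
    # True conditional probability used to simulate the binary outcome.
    p_true = 1.0 / (1.0 + np.exp(-(X[:, 0] - 0.5 * X[:, 1])))
    y = rng.binomial(1, p_true)

    X_tr, X_te, y_tr, y_te, p_tr, p_te = train_test_split(
        X, y, p_true, test_size=0.25, random_state=0)

    # Regression (not classification) on the 0/1 labels: each prediction is a
    # conditional mean, i.e. an estimate of P(y = 1 | x).
    rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
    knn = KNeighborsRegressor(n_neighbors=50).fit(X_tr, y_tr)

    for name, model in [("random forest", rf), ("k-NN", knn)]:
        p_hat = np.clip(model.predict(X_te), 0.0, 1.0)
        print(f"{name}: mean |p_hat - p_true| = {np.mean(np.abs(p_hat - p_te)):.3f}")
    ```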

  7. View from Europe: stability, consistency or pragmatism

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1988-01-01

    The last few years of this decade look like a period of reappraisal of radiation protection standards. The revised risk estimates from Japan will be available, and the United Nations Scientific Committee on the Effects of Atomic Radiation will be publishing new reports on biological topics. The International Commission on Radiological Protection (ICRP) has started a review of its basic recommendations, and the new specification for dose equivalent in radiation fields of the International Commission on Radiation Units and Measurements (ICRU) will be coming into use. All this is occurring at a time when some countries are still trying to catch up with committed dose equivalent and the recently recommended change in the value of the quality factor for neutrons. In Europe, the problems of adapting to new ICRP recommendations are considerable. The European Community, including 12 states and nine languages, takes ICRP recommendations as a basis and develops council directives that are binding on member states, which have then to arrange for their own regulatory changes. Any substantial adjustments could take 5 y or more to work through the system. Clearly, the regulatory preference is for stability. Equally clearly, trade unions and public interest groups favor a rapid response to scientific developments (provided that the change is downward). Organizations such as the ICRP have to balance their desire for internal consistency and intellectual purity against the practical problems of their clients in adjusting to change. This paper indicates some of the changes that might be necessary over the next few years and how, given a pragmatic approach, they might be accommodated in Europe without too much regulatory confusion

  8. The Consistency Between Clinical and Electrophysiological Diagnoses

    Directory of Open Access Journals (Sweden)

    Esra E. Okuyucu

    2009-09-01

    Full Text Available OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics making the requests, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists: 578 (60.4%) patients were referred by neurologists, 122 (12.8%) by orthopedics, 140 (14.6%) by neurosurgeons, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients' results were related to their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. There was no statistically significant difference in the agreement between referral diagnosis and electrophysiological diagnosis across the clinics making the requests (p = 0.794), but there were statistically significant differences in the support of different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy, based on ENMG test results (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG.

  9. Self-consistent meson mass spectrum

    International Nuclear Information System (INIS)

    Balazs, L.A.P.

    1982-01-01

    A dual-topological-unitarization (or dual-fragmentation) approach to the calculation of hadron masses is presented, in which the effect of planar ''sea''-quark loops is taken into account from the beginning. Using techniques based on analyticity and generalized ladder-graph dynamics, we first derive the approximate ''generic'' Regge-trajectory formula α(t) = max(S₁+S₂, S₃+S₄) − 1/2 + 2α̂′[s_a + (1/2)(t − Σm_i²)] for any given hadronic process 1+2→3+4, where S_i and m_i are the spins and masses of i = 1,2,3,4, and √s_a is the effective mass of the lowest nonvanishing contribution (a) exchanged in the crossed channel. By requiring a minimization of secondary (background, etc.) contributions to a, and demanding simultaneous consistency for entire sets of such processes, we are then able to calculate the masses of all the lowest pseudoscalar and vector qq̄ states with q = u,d,s and the Regge trajectories on which they lie. By making certain additional assumptions we are also able to do this with q = u,d,c and q = u,d,b. Our only arbitrary parameters are m_ρ, m_K*, m_ψ, and m_Υ, one of which merely serves to fix the energy scale. In contrast to many other approaches, a small m_π²/m_ρ² ratio arises quite naturally in the present scheme.

  10. Speed Consistency in the Smart Tachograph.

    Science.gov (United States)

    Borio, Daniele; Cano, Eduardo; Baldini, Gianmarco

    2018-05-16

    In the transportation sector, safety risks can be significantly reduced by monitoring the behaviour of drivers and by discouraging possible misconduct that entails fatigue and can increase the possibility of accidents. The Smart Tachograph (ST), the new revision of the Digital Tachograph (DT), has been designed with this purpose: to verify that speed limits and compulsory rest periods are respected by drivers. In order to operate properly, the ST periodically checks the consistency of data from different sensors, which could potentially be manipulated to avoid the monitoring of driver behaviour. In this respect, the ST regulation specifies a test procedure to detect motion conflicts originating from inconsistencies between Global Navigation Satellite System (GNSS) and odometry data. This paper provides an experimental evaluation of the speed verification procedure specified by the ST regulation. Several hours of data were collected using three vehicles and considering light urban and highway environments. The vehicles were equipped with an On-Board Diagnostics (OBD) data reader and a GPS/Galileo receiver. The tests prescribed by the regulation were implemented with specific focus on synchronization aspects. The experimental analysis also considered aspects such as the impact of tunnels and the presence of data gaps. The analysis shows that the metrics selected for the tests are resilient to data gaps, latencies between GNSS and odometry data, and simplistic manipulations such as data scaling. The new ST forces an attacker to falsify data from both sensors at the same time and in a coherent way, which makes frauds more difficult to implement than with the current version of the DT.
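
    The regulation's precise test procedure, thresholds and data formats are not reproduced in the abstract; the following Python sketch is only an illustration, with assumed window length, tolerance and field names, of how a GNSS-versus-odometry speed consistency check that tolerates data gaps (for example in tunnels) might be structured.

    ```python
    # Illustrative sketch only: cross-checking GNSS and odometry speeds for
    # consistency, in the spirit of the Smart Tachograph motion-conflict test.
    # Thresholds, window length and the data format are assumptions, not the
    # values prescribed by the ST regulation.
    from dataclasses import dataclass
    from typing import Optional, List

    @dataclass
    class SpeedSample:
        t: float                    # seconds since start
        gnss_kmh: Optional[float]   # None models a GNSS gap (e.g. tunnel)
        odo_kmh: Optional[float]    # None models missing odometry data

    def motion_conflict(samples: List[SpeedSample],
                        window_s: float = 60.0,
                        tol_kmh: float = 10.0,
                        min_valid: int = 10) -> bool:
        """Flag a conflict if the mean |GNSS - odometry| speed difference over
        the last `window_s` seconds exceeds `tol_kmh`. Samples with a gap in
        either source are skipped, so short outages do not trigger a conflict."""
        if not samples:
            return False
        t_end = samples[-1].t
        diffs = [abs(s.gnss_kmh - s.odo_kmh)
                 for s in samples
                 if s.t >= t_end - window_s
                 and s.gnss_kmh is not None and s.odo_kmh is not None]
        if len(diffs) < min_valid:
            return False            # not enough overlapping data to decide
        return sum(diffs) / len(diffs) > tol_kmh

    # Example: a naive scaling of the odometry data is caught, because the
    # GNSS source still reports the true speed.
    data = [SpeedSample(float(t), 90.0, 90.0 * 0.5) for t in range(0, 120)]
    print(motion_conflict(data))    # True
    ```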

  11. Fluidic sampling

    International Nuclear Information System (INIS)

    Houck, E.D.

    1992-01-01

    This paper covers the development of the fluidic sampler and its testing in a fluidic transfer system. The major findings of this paper are as follows. Fluidic jet samplers can dependably produce unbiased samples of acceptable volume. The fluidic transfer system with a fluidic sampler in-line will transfer water to a net lift of 37.2--39.9 feet at an average rate of 0.02--0.05 gpm (77--192 cc/min). The fluidic sample system circulation rate compares very favorably with the normal 0.016--0.026 gpm (60--100 cc/min) circulation rate that is commonly produced for this lift and solution with the jet-assisted airlift sample system normally used at ICPP. The volume of the sample taken with a fluidic sampler depends on the motive pressure supplied to the fluidic sampler, the sample bottle size, and the fluidic sampler jet characteristics. The fluidic sampler should be supplied with fluid at a motive pressure of 140--150 percent of the motive pressure that produces peak vacuum for the jet in the sampler. Fluidic transfer systems should be operated by emptying a full pumping chamber to nearly empty or empty during the pumping cycle; this maximizes the solution transfer rate.

  12. Consistency between recognition and behavior creates consciousness

    Directory of Open Access Journals (Sweden)

    Keita Inaba

    2004-08-01

    Full Text Available What is consciousness? Is it possible to create consciousness mechanically? Various studies have been performed in the fields of psychology and cerebral science to answer these questions. As of yet, however, no researchers have proposed a model capable of explaining the mind-body problem described by Descartes or replicating a consciousness as advanced as that of human beings. Ancient people believed that the consciousness resided in a Homunculus, a human in miniature who lived in the brain. It is no mystery that the ancients came up with such an idea; for consciousness has always been veiled in mystery, beyond the reach of our explorative powers. We can assert, however, that consciousness does not "live" in us, but "exists" in us. Insofar as the processes occurring inside the human brain are a product of the physical activity of the neurons that reside there, we believe that it should be possible to define consciousness systematically.

  13. SIMPLE ESTIMATOR AND CONSISTENT STRONGLY OF STABLE DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Cira E. Guevara Otiniano

    2016-06-01

    Full Text Available Stable distributions are extensively used to analyze the returns of financial assets, such as exchange rates and stock prices. In this paper we propose a simple and strongly consistent estimator for the scale parameter of a symmetric stable Lévy distribution. The advantage of this estimator is that its computational time is minimal, so it can be used to initialize intensive computational procedures such as maximum likelihood. Using random samples of size n, we tested the efficacy of these estimators by the Monte Carlo method. We also include applications to three data sets.
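
    The estimator's formula is not reproduced in the abstract, so the sketch below uses a simple quantile-ratio scale estimate purely as a stand-in, to illustrate how a Monte Carlo check of consistency with growing sample size can be set up; scipy's levy_stable is used only to simulate symmetric stable samples.

    ```python
    # Monte Carlo sketch of a consistency check for a scale estimator of a
    # symmetric alpha-stable distribution. The paper's own estimator is not
    # reproduced here; a quantile-ratio estimator is used as a stand-in
    # purely to illustrate the Monte Carlo procedure.
    import numpy as np
    from scipy.stats import levy_stable

    alpha, true_scale = 1.7, 2.0          # symmetric case: beta = 0
    rng = np.random.default_rng(42)

    # Interquartile range of the standard (scale = 1) distribution; the
    # empirical IQR divided by this value estimates the scale parameter.
    iqr_std = levy_stable.ppf(0.75, alpha, 0) - levy_stable.ppf(0.25, alpha, 0)

    def scale_estimate(x):
        q75, q25 = np.percentile(x, [75, 25])
        return (q75 - q25) / iqr_std

    for n in (100, 1_000, 10_000):
        estimates = [scale_estimate(levy_stable.rvs(alpha, 0, scale=true_scale,
                                                    size=n, random_state=rng))
                     for _ in range(100)]
        print(f"n={n:6d}: mean={np.mean(estimates):.3f}, "
              f"sd={np.std(estimates):.3f}  (true scale = {true_scale})")
    ```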

  14. Fast radiochemical procedure to measure neptunium, plutonium, americium and curium in environmental samples for application in environmental monitoring and in radioecology research

    International Nuclear Information System (INIS)

    Pimpel, M.; Schuettelkopf, H.

    1984-01-01

    A radiochemical method is described by which Np, Pu, Am and Cm in environmental samples can be determined. The transuranium elements are dissolved out of the ashed material with acids. Np/Pu is separated from Am/Cm by sequential extraction using TOPO/cyclohexane. The two fractions are radiochemically purified. Np-237, Pu-239+240, Pu-238 and Pu-236 as well as Am-243, Am-241, Cm-244 and Cm-242 are measured by alpha spectrometry. Pu-236, Am-243 and Np-239 are used to determine the respective yields. A fast method of Np-239 preparation is described. The chemical yields range from 60 to 90%. The detection limit attained per nuclide is 10 fCi/sample. 20 references, 1 table

  15. A New Heteroskedastic Consistent Covariance Matrix Estimator using Deviance Measure

    Directory of Open Access Journals (Sweden)

    Nuzhat Aftab

    2016-06-01

    Full Text Available In this article we propose a new heteroskedasticity-consistent covariance matrix estimator, HC6, based on a deviance measure. We studied the finite-sample behavior of the new estimator and compared it with other estimators of this kind, HC1, HC3 and HC4m, which are used in the case of leverage observations. A simulation study is conducted to study the effect of various levels of heteroskedasticity on the size and power of the quasi-t test with HC estimators. Results show that the test statistic based on our newly suggested estimator has better asymptotic approximation and less size distortion than the other estimators for small sample sizes when a high level of heteroskedasticity is present in the data.
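
    The HC6 formula itself is not given in the abstract; as a point of reference, the sketch below shows the generic heteroskedasticity-consistent "sandwich" structure that these estimators share, with White's HC0 and the leverage-adjusted HC3 weights, and the resulting quasi-t statistic for the slope. The regression model and data are illustrative assumptions, not those of the article's simulation study.

    ```python
    # Minimal sketch of heteroskedasticity-consistent (sandwich) covariance
    # estimators for OLS -- HC0 (White) and the leverage-adjusted HC3 -- on
    # which variants such as HC4m or the proposed HC6 build. Generic
    # illustration only; this is not the HC6 formula from the article.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    x = rng.uniform(0, 10, n)
    X = np.column_stack([np.ones(n), x])
    # Heteroskedastic errors: the error variance grows with x.
    y = 1.0 + 0.5 * x + rng.normal(scale=0.2 + 0.3 * x)

    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    h = np.einsum('ij,jk,ik->i', X, XtX_inv, X)      # leverage values h_ii

    def sandwich(weights):
        """(X'X)^{-1} X' diag(weights) X (X'X)^{-1}"""
        meat = (X * weights[:, None]).T @ X
        return XtX_inv @ meat @ XtX_inv

    cov_hc0 = sandwich(resid**2)                     # White's estimator
    cov_hc3 = sandwich(resid**2 / (1.0 - h)**2)      # leverage-adjusted

    for name, cov in [("HC0", cov_hc0), ("HC3", cov_hc3)]:
        se = np.sqrt(np.diag(cov))
        print(name, "slope quasi-t =", beta[1] / se[1])
    ```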

  16. A New Bias Corrected Version of Heteroscedasticity Consistent Covariance Estimator

    Directory of Open Access Journals (Sweden)

    Munir Ahmed

    2016-06-01

    Full Text Available In the presence of heteroscedasticity, different available flavours of the heteroscedasticity-consistent covariance estimator (HCCME) are used. However, the available literature shows that these estimators can be considerably biased in small samples. Cribari-Neto et al. (2000) introduce a bias-adjustment mechanism and give a modified White estimator that becomes almost bias-free even in small samples. Extending these results, Cribari-Neto and Galvão (2003) present a similar bias-adjustment mechanism that can be applied to a wide class of HCCMEs. In the present article, we follow the same mechanism proposed by Cribari-Neto and Galvão to give a bias-corrected version of the HCCME, but we use an adaptive HCCME rather than the conventional HCCME. A Monte Carlo study is used to evaluate the performance of our proposed estimators.

  17. Comparing three cohorts of MSM sampled via sex parties, bars/clubs, and Craigslist.org: implications for researchers and providers.

    Science.gov (United States)

    Grov, Christian; Rendina, H Jonathon; Parsons, Jeffrey T

    2014-08-01

    With limited exceptions, few studies have systematically reported on psychosocial and demographic characteristic differences in samples of men who have sex with men (MSM) based on where they were recruited. This study compared three sexually active cohorts of MSM recruited via Craigslist.org (recruited via modified time-space sampling), gay bars and clubs (recruited via time-space sampling), and private sex parties (identified via passive recruitment and listserves), finding mixed results with regard to differences in demographic characteristics, STI history, and psychosocial measures. Men recruited from sex parties were significantly older, reported more symptoms of sexual compulsivity, more likely to be HIV-positive, more likely to report a history of STIs, and more likely to self-identify as a barebacker, than men recruited from the other two venues. In contrast, men from Craigslist.org reported the lowest levels of attachment to the gay and bisexual community and were the least likely to self-identify as gay. Men from bars and clubs were significantly younger, and were more likely to report use of hallucinogens and crack or cocaine. Our findings highlight that the venues in which MSM are recruited have meaningful consequences in terms of the types of individuals who are reached.

  18. Experimental research of the influence of the strength of ore samples on the parameters of an electromagnetic signal during acoustic excitation in the process of uniaxial compression

    Science.gov (United States)

    Yavorovich, L. V.; Bespal`ko, A. A.; Fedotov, P. I.

    2018-01-01

    Parameters of electromagnetic responses (EMRe) generated during uniaxial compression of rock samples under excitation by deterministic acoustic pulses are presented and discussed. Such physical modeling in the laboratory makes it possible to reveal the main regularities of electromagnetic signal (EMS) generation in rock massifs. The influence of the samples' mechanical properties on the parameters of the EMRe excited by an acoustic signal in the process of uniaxial compression is considered. It has been established that sulfides and quartz in the rocks of the Tashtagol iron ore deposit (Western Siberia, Russia) contribute to the conversion of mechanical energy into the energy of the electromagnetic field, which is expressed in an increase in the EMS amplitude. A decrease in the EMS amplitude when the stress-strain state of the sample changes during uniaxial compression is observed when the amount of conductive magnetite contained in the rock increases. The obtained results are important for the physical substantiation of testing methods and monitoring of changes in the stress-strain state of rock massifs based on the parameters of electromagnetic signals and the characteristics of electromagnetic emission.

  19. A Linguistic Analysis of the Sample Numeracy Skills Test Items for Pre-Service Teachers Issued by the Australian Council for Educational Research (ACER)

    Science.gov (United States)

    O'Keeffe, Lisa; O'Halloran, Kay L.; Wignell, Peter; Tan, Sabine

    2017-01-01

    In 2015, the Australian Council for Educational Research (ACER) was tasked with developing literacy and numeracy skills testing for pre-service teachers. All undergraduate and postgraduate trainee teachers are now required to pass these literacy and numeracy tests at some stage on their journey to becoming a teacher; for commencing students from…

  20. Innovative Basis of Research of Technologic Features of Some Craftsmanship Traditions of Ganja (On the Sample of Carpets of XIX Century)

    Science.gov (United States)

    Hasanov, Elnur L.

    2016-01-01

    Carpet production in Ganja has been one of the leading handicraft activities since ancient times and still impresses with its high skill and variety of colors, but the creation technology of such representatives of cultural heritage has not been widely studied. This scientific paper deals with the research of the basic…