A Consistent System for Coding Laboratory Samples
Sih, John C.
1996-07-01
A formal laboratory coding system is presented to keep track of laboratory samples. Useful preliminary information about a sample (its origin and history) is gained without consulting a research notebook. Since this system uses and retains the same research notebook page number for each new experiment (reaction), finding and distinguishing products (samples) of the same or different reactions becomes an easy task. Using this system, multiple products generated from a single reaction can be identified and classified in a uniform fashion. Samples can be stored and filed according to the stage and degree of purification, e.g., crude reaction mixtures, recrystallized samples, and chromatographed or distilled products.
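The abstract does not give the exact code format, so the following is a minimal hypothetical sketch of such a scheme, assuming a code built from the chemist's initials, the retained notebook page number, a product number, and a one-letter purification-stage suffix. The format and all names are illustrative, not the author's published system.

```python
# Hypothetical sketch of a laboratory sample-coding scheme: the notebook
# page number is retained across an experiment, and a suffix records the
# purification stage. The exact format is an assumption for illustration.

STAGES = {"crude": "C", "recrystallized": "R", "chromatographed": "P", "distilled": "D"}

def sample_code(initials, page, product_no, stage):
    """Build a code such as 'JCS-1234-2R' for product 2, recrystallized."""
    return f"{initials}-{page}-{product_no}{STAGES[stage]}"

print(sample_code("JCS", 1234, 2, "recrystallized"))  # JCS-1234-2R
```

Because the notebook page number is embedded in every code, all products of one reaction sort together, while the stage suffix separates crude mixtures from purified material.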
Consistency under sampling of exponential random graph models.
Shalizi, Cosma Rohilla; Rinaldo, Alessandro
2013-04-01
The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consist only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGMs' expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.
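A minimal sketch of the projectivity idea, using the one standard case where it does hold: the edges-only ERGM, which is the Bernoulli (Erdős–Rényi) random graph. There the edge-probability MLE computed on an induced sub-network estimates the same whole-network parameter. The graph size, edge probability, and subsample size below are arbitrary illustrations, not from the paper.

```python
import random

# Sketch: the edges-only ERGM is the Bernoulli random graph, which is
# projective, so the MLE of the edge probability computed on an induced
# sub-network consistently estimates the whole-network parameter.
random.seed(0)
n, p = 200, 0.1
edges = {(i, j) for i in range(n) for j in range(i + 1, n) if random.random() < p}

def edge_mle(nodes, edges):
    """MLE of the edge probability on the sub-network induced by `nodes`."""
    nodes = set(nodes)
    m = sum(1 for (i, j) in edges if i in nodes and j in nodes)
    k = len(nodes)
    return m / (k * (k - 1) / 2)

sub = random.sample(range(n), 80)
print(round(edge_mle(range(n), edges), 3), round(edge_mle(sub, edges), 3))
```

The paper's point is that this convenient agreement between sub-network and whole-network estimates fails for richer ERGM statistics such as triangle counts.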
Palinkas, Lawrence A; Horwitz, Sarah M; Green, Carla A; Wisdom, Jennifer P; Duan, Naihua; Hoagwood, Kimberly
2015-09-01
Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research.
Biro, Peter A
2013-02-01
Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and this may affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
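The mechanism described, behavioral traits inflating catchability, can be sketched with a toy simulation; the population sizes and catch probabilities below are invented for illustration and are not the study's data.

```python
import random

# Toy simulation of trait-biased sampling: equal numbers of slow-,
# intermediate-, and fast-growing fish, with catch probability rising
# with growth type (numbers assumed; the field result was that fast
# growers were up to twice as likely to be sampled).
random.seed(1)
catchability = {"slow": 0.10, "intermediate": 0.15, "fast": 0.20}
population = [t for t in catchability for _ in range(1000)]

sample = [t for t in population if random.random() < catchability[t]]
counts = {t: sample.count(t) for t in catchability}
print(counts)  # fast growers are over-represented relative to their 1/3 share
```

Even though every growth type is equally common in the population, the sample systematically over-represents the most catchable type, which is exactly the hidden bias the abstract warns about.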
Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions
Energy Technology Data Exchange (ETDEWEB)
Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)
2015-01-15
Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated, inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
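One standard, mathematically exact route to such sampling, sketched here under the abstract's setting though not necessarily as the authors' exact algorithm, maps the target lognormal mean vector and covariance matrix to the parameters of the underlying multivariate normal, samples that normal, and exponentiates.

```python
import numpy as np

# Sketch of consistent lognormal sampling from the first two moments:
# map the target mean m and covariance C of the lognormal variables to
# the parameters (mu, Sigma) of the underlying normal via
#   Sigma_ij = ln(1 + C_ij / (m_i m_j)),  mu_i = ln(m_i) - Sigma_ii / 2,
# then exponentiate multivariate-normal draws. Samples are positive by
# construction and reproduce m and C exactly in expectation.
def lognormal_samples(m, C, size, rng):
    m, C = np.asarray(m, float), np.asarray(C, float)
    Sigma = np.log(1.0 + C / np.outer(m, m))   # underlying normal covariance
    mu = np.log(m) - 0.5 * np.diag(Sigma)      # underlying normal mean
    z = rng.multivariate_normal(mu, Sigma, size=size)
    return np.exp(z)

rng = np.random.default_rng(0)
m = [1.0, 2.0]                                 # illustrative target means
C = [[0.04, 0.02], [0.02, 0.09]]               # illustrative target covariance
x = lognormal_samples(m, C, 200000, rng)
print(x.mean(axis=0).round(2), np.cov(x.T).round(2))
```

The empirical mean and covariance of the draws converge to the targets, which is the "consistency" the abstract refers to.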
Range-efficient consistent sampling and locality-sensitive hashing for polygons
DEFF Research Database (Denmark)
Gudmundsson, Joachim; Pagh, Rasmus
2017-01-01
Locality-sensitive hashing (LSH) is a fundamental technique for similarity search and similarity estimation in high-dimensional spaces. The basic idea is that similar objects should produce hash collisions with probability significantly larger than objects with low similarity. We consider LSH for [...] or union of a set of preprocessed polygons. Curiously, our consistent sampling method uses transformation to a geometric problem.
Sampling Methods in Cardiovascular Nursing Research: An Overview.
Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie
2014-01-01
Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences between sampling techniques may aid nurses in the effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
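Toy illustrations of three of the probability sampling methods named above, using a hypothetical patient register; the register, strata, and sample sizes are invented, not from the article.

```python
import random

# Toy illustrations (invented register, not from the article) of three
# probability sampling methods: simple random, systematic, stratified.
random.seed(0)
register = [{"id": i, "unit": "cardiology" if i % 3 else "surgery"} for i in range(90)]

simple = random.sample(register, 10)       # simple random: every patient equally likely
systematic = register[4::9]                # systematic: every 9th patient after a start point
strata = {"cardiology": [], "surgery": []}
for p in register:                         # stratified: partition, then sample each stratum
    strata[p["unit"]].append(p)
stratified = [x for s in strata.values() for x in random.sample(s, 5)]

print(len(simple), len(systematic), len(stratified))
```

Stratified sampling guarantees representation from each unit, whereas a simple random sample of the same size could by chance miss the smaller stratum entirely.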
Dula, Chris S; Geller, E Scott
2003-01-01
Researchers agree that a consistent definition for aggressive driving is lacking. Such definitional ambiguity in the literature impedes the accumulation of accurate and precise information, and prevents researchers from communicating clearly about findings and implications for future research directions. This dramatically slows progress in understanding the causes and maintenance factors of aggressive driving. This article critiques prevailing definitions of driver aggression and generates a definition that, if used consistently, can improve the utility of future research. Pertinent driving behaviors have been variably labeled in the literature as risky, aggressive, or road rage. The authors suggest that the term "road rage" be eliminated from research because it has been used inconsistently and has little probability of being clarified and applied consistently. Instead, driving behaviors that endanger or have the potential to endanger others should be considered as lying on a behavioral spectrum of dangerous driving. Three dimensions of dangerous driving are delineated: (a) intentional acts of aggression toward others, (b) negative emotions experienced while driving, and (c) risk-taking. The adoption of a standardized definition for aggressive driving should spark researchers to use more explicit operational definitions that are consistent with theoretical foundations. The use of consistent and unambiguous operational definitions will increase the precision of measurement in research and enhance authors' ability to communicate clearly about findings and conclusions. As this occurs over time, industry will reap benefits from more carefully conducted research. Such benefits may include the development of more valid and reliable means of selecting safe professional drivers, conducting accurate risk assessments, and creating preventative and remedial dangerous driving safety programs.
The ethical use of existing samples for genome research.
Bathe, Oliver F; McGuire, Amy L
2009-10-01
Modern biobanking efforts consist of prospective collections of tissues linked to clinical data for patients who have given informed consent for the research use of their specimens and data, including their DNA. In such efforts, patient autonomy and privacy are well respected because of the prospective nature of the informed consent process. However, one of the richest sources of tissue for research continues to be the millions of archived samples collected by pathology departments during normal clinical care or for research purposes without specific consent for future research or genetic analysis. Because specific consent was not obtained a priori, issues related to individual privacy and autonomy are much more complicated. A framework for accessing these existing samples and related clinical data for research is presented. Archival tissues may be accessed only when there is a reasonable likelihood of generating beneficial and scientifically valid information. To minimize risks, databases containing information related to the tissue and to clinical data should be coded, no personally identifying phenotypic information should be included, and access should be restricted to bona fide researchers for legitimate research purposes. These precautions, if implemented appropriately, should ensure that the research use of archival tissue and data are no more than minimal risk. A waiver of the requirement for informed consent would then be justified if reconsent is shown to be impracticable. A waiver of consent should not be granted, however, if there is a significant risk to privacy, if the proposed research use is inconsistent with the original consent (where there is one), or if the potential harm from a privacy breach is considerable.
Suh, K Stephen; Remache, Yvonne K; Patel, Jalpa S; Chen, Steve H; Haystrand, Russell; Ford, Peggy; Shaikh, Anadil M; Wang, Jian; Goy, Andre H
2009-02-01
Modern cancer research for biomarker discovery requires solving several tasks directly involved with patient sample procurement. One requirement is to construct a highly efficient workflow on the clinical side to generate a consistent supply of high-quality samples for research. This undertaking needs a network of interdepartmental collaboration and participation at various levels, including physical human interactions, information technology implementations, and a bioinformatics tool that is highly effective and user-friendly for the busy clinicians and researchers associated with sample procurement. Collegial participation that is sequential but continual from one department to another demands dedicated bioinformatics software coordinating between the institutional clinic and the tissue repository facility. Participants in the process include admissions, the consenting process, phlebotomy, the surgery center, and pathology. During these multiple-step procedures, clinical data are collected for detailed analytical endpoints to supplement the logistics of defining and validating the discovery of biomarkers.
Quota sampling in internet research: practical issues.
Im, Eun-Ok; Chee, Wonshik
2011-07-01
Quota sampling has been suggested as a potentially good method for Internet-based research and has been used by several researchers working with Internet samples. However, very little is known about the issues or concerns in using a quota sampling method in Internet research. The purpose of this article was to present the practical issues encountered in using quota sampling in an Internet-based study. During the Internet study, the research team recorded all recruitment issues that arose and made written notes indicating the possible reasons for the problems. In addition, biweekly team discussions were conducted, for which written records were kept. Overall, quota sampling was effective in ensuring that an adequate number of midlife women were recruited from the targeted ethnic groups. However, during the study process, we encountered the following practical issues in using quota sampling: (1) difficulty reaching women in lower socioeconomic classes, (2) difficulty ensuring the authenticity of participants' identities, (3) participants giving inconsistent answers to the screening questions versus the Internet survey questions, (4) potential problems with a question on socioeconomic status, (5) resentment toward the research project and/or researchers because of rejection, and (6) greater time and expense than anticipated.
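Quota enforcement during online screening can be sketched as follows; the group names and the quota of 60 per group are assumed for illustration and are not the study's actual design.

```python
# Hypothetical sketch of quota enforcement during Internet recruitment:
# a screened respondent is admitted only while her group's quota
# (numbers assumed for illustration) remains unfilled.
quotas = {"white": 60, "black": 60, "hispanic": 60, "asian": 60}
enrolled = {g: 0 for g in quotas}

def screen(group):
    """Admit a respondent if her group's quota is not yet met."""
    if group in quotas and enrolled[group] < quotas[group]:
        enrolled[group] += 1
        return True
    return False

print(screen("asian"), enrolled["asian"])  # True 1
```

Issue (5) in the list above arises precisely from the `return False` branch: once a quota fills, later volunteers from that group are rejected, which can breed resentment toward the project.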
Sampling bias in climate-conflict research
Adams, Courtland; Ide, Tobias; Barnett, Jon; Detges, Adrien
2018-03-01
Critics have argued that the evidence of an association between climate change and conflict is flawed because the research relies on a dependent-variable sampling strategy [1-4]. Similarly, it has been hypothesized that convenience of access biases the sample of cases studied (the 'streetlight effect' [5]). This also gives rise to claims that the climate-conflict literature stigmatizes some places as being more 'naturally' violent [6-8]. Yet there has been no proof of such sampling patterns. Here we test whether climate-conflict research is based on such a biased sample through a systematic review of the literature. We demonstrate that research on climate change and violent conflict suffers from a streetlight effect. Further, studies which focus on a small number of cases in particular are strongly informed by cases where there has been conflict, do not sample on the independent variables (climate impact or risk), and hence tend to find some association between these two variables. These biases mean that research on climate change and conflict primarily focuses on a few accessible regions, overstates the links between both phenomena and cannot explain peaceful outcomes from climate change. This could result in maladaptive responses in those places that are stigmatized as being inherently more prone to climate-induced violence.
Research results: preserving newborn blood samples.
Lewis, Michelle Huckaby; Scheurer, Michael E; Green, Robert C; McGuire, Amy L
2012-11-07
Retention and use, without explicit parental permission, of residual dried blood samples from newborn screening has generated public controversy over concerns about violations of family privacy rights and loss of parental autonomy. The public debate about this issue has included little discussion about the destruction of a potentially valuable public resource that can be used for research that may yield improvements in public health. The research community must advocate for policies and infrastructure that promote retention of residual dried blood samples and their use in biomedical research.
Analysis of the research sample collections of Uppsala biobank.
Engelmark, Malin T; Beskow, Anna H
2014-10-01
Uppsala Biobank is the joint and only biobank organization of the two principals, Uppsala University and Uppsala University Hospital. Biobanks are required to have updated registries on sample collection composition and management in order to fulfill legal regulations. We report here the results from the first comprehensive and overall analysis of the 131 research sample collections organized in the biobank. The results show that the median number of samples in the collections was 700 and that the number of samples varied from less than 500 to over one million. Blood samples, such as whole blood, serum, and plasma, were included in the vast majority, 84.0%, of the research sample collections. Also, as much as 95.5% of the newly collected samples within healthcare included blood samples, which further supports the concept that blood samples have fundamental importance for medical research. Tissue samples were also commonly used and occurred in 39.7% of the research sample collections, often combined with other types of samples. In total, 96.9% of the 131 sample collections included samples collected for healthcare, showing the importance of healthcare as a research infrastructure. Of the collections that had accessed existing samples from healthcare, as much as 96.3% included tissue samples from the Department of Pathology, which shows the importance of pathology samples as a resource for medical research. Analysis of different research areas shows that the most common public health diseases are covered. The collections that had generated the most publications, up to over 300, contained a large number of samples collected systematically and repeatedly over many years. More knowledge about existing biobank materials, together with public registries of sample collections, will support research collaborations, improve transparency, and bring us closer to the goal of biobanks, which is to save and prolong human lives and improve health and quality of life.
Conducting Clinical Research Using Crowdsourced Convenience Samples.
Chandler, Jesse; Shapiro, Danielle
2016-01-01
Crowdsourcing has had a dramatic impact on the speed and scale at which scientific research can be conducted. Clinical scientists have particularly benefited from readily available research study participants and streamlined recruiting and payment systems afforded by Amazon Mechanical Turk (MTurk), a popular labor market for crowdsourcing workers. MTurk has been used in this capacity for more than five years. The popularity and novelty of the platform have spurred numerous methodological investigations, making it the most studied nonprobability sample available to researchers. This article summarizes what is known about MTurk sample composition and data quality, with an emphasis on findings relevant to clinical psychological research. It then addresses methodological issues with using MTurk (many of which are common to other nonprobability samples but unfamiliar to clinical science researchers) and suggests concrete steps to avoid these issues or minimize their impact.
Environmental sample banking-research and methodology
International Nuclear Information System (INIS)
Becker, D.A.
1976-01-01
The National Bureau of Standards (NBS), in cooperation with the Environmental Protection Agency and the National Science Foundation, is engaged in a research program establishing methodology for environmental sample banking. This program is aimed at evaluating the feasibility of a National Environmental Specimen Bank (NESB). The capability for retrospective chemical analyses to evaluate changes in our environment would provide useful information, much of which could not be obtained using data from previously analyzed samples. However, to assure validity for these stored samples, they must be sampled, processed and stored under rigorously evaluated, controlled and documented conditions. The program currently under way in the NBS Analytical Chemistry Division has three main components. The first is an extensive survey of the available literature concerning problems of contamination, losses and storage. The components of interest include trace elements, pesticides, other trace organics (PCBs, plasticizers, etc.), radionuclides and microbiological species. The second component is an experimental evaluation of contamination and losses during sampling and sample handling. Of particular interest here is research into container-cleaning methodology for trace elements, with respect to adsorption, desorption, leaching and partial dissolution by various sample matrices. The third component of this program is an evaluation of existing methodology for long-term sample storage.
Demystifying Theoretical Sampling in Grounded Theory Research
Directory of Open Access Journals (Sweden)
Jenna Breckenridge, BSc (Hons), PhD Candidate
2009-06-01
Theoretical sampling is a central tenet of classic grounded theory and is essential to the development and refinement of a theory that is 'grounded' in data. While many authors appear to share concurrent definitions of theoretical sampling, the ways in which the process is actually executed remain largely elusive and inconsistent. As such, employing and describing the theoretical sampling process can present a particular challenge to novice researchers embarking upon their first grounded theory study. This article has been written in response to the challenges faced by the first author whilst writing a grounded theory proposal. It is intended to clarify theoretical sampling for new grounded theory researchers, offering some insight into the practicalities of selecting and employing a theoretical sampling strategy. It demonstrates that the credibility of a theory cannot be dissociated from the process by which it has been generated and seeks to encourage and challenge researchers to approach theoretical sampling in a way that is apposite to the core principles of the classic grounded theory methodology.
Moore, C.
2011-12-01
The Index to Marine and Lacustrine Geological Samples is a community designed and maintained resource enabling researchers to locate and request sea floor and lakebed geologic samples archived by partner institutions. Conceived in the dawn of the digital age by representatives from U.S. academic and government marine core repositories and the NOAA National Geophysical Data Center (NGDC) at a 1977 meeting convened by the National Science Foundation (NSF), the Index is based on the core concepts of community oversight, common vocabularies, consistent metadata and a shared interface. The form and content of the underlying vocabularies and metadata continue to evolve according to the needs of the community, as do the supporting technologies and access methodologies. The Curators Consortium, now international in scope, meets at partner institutions biennially to share ideas and discuss best practices. NGDC serves the group by providing database access and maintenance, a list server, digitizing support and long-term archival of sample metadata, data and imagery. Over three decades, participating curators have performed the herculean task of creating and contributing metadata for over 195,000 sea floor and lakebed cores, grabs, and dredges archived in their collections. Some partners use the Index for primary web access to their collections, while others use it to increase exposure of more in-depth institutional systems. The Index is currently a geospatially enabled relational database, publicly accessible via Web Feature and Web Map Services and through text- and ArcGIS map-based web interfaces. To provide as much knowledge as possible about each sample, the Index includes curatorial contact information and links to related data, information and images: (1) at participating institutions, (2) in the NGDC archive, and (3) at sites such as the Rolling Deck to Repository (R2R) and the System for Earth Sample Registration (SESAR). Over 34,000 International GeoSample Numbers (IGSNs) linking to SESAR are [...]
Viapiana, Agnieszka; Struck-Lewicka, Wiktoria; Konieczynski, Pawel; Wesolowski, Marek; Kaliszan, Roman
2016-01-01
Chamomile has been used as an herbal medication since ancient times and is still popular because it contains various bioactive phytochemicals that could provide therapeutic effects. In this study, a simple and reliable HPLC method was developed to evaluate the quality consistency of nineteen chamomile samples through establishing a chromatographic fingerprint, quantification of phenolic compounds and determination of antioxidant activity. For fingerprint analysis, 12 peaks were selected as the common peaks to evaluate the similarities of commercial samples of chamomile obtained from different manufacturers. A similarity analysis was performed to assess the similarity/dissimilarity of chamomile samples, with values ranging from 0.868 to 0.990, indicating that samples from different manufacturers were consistent. Additionally, simultaneous quantification of five phenolic acids (gallic, caffeic, syringic, p-coumaric, ferulic) and four flavonoids (rutin, myricetin, quercetin and kaempferol) was performed to interpret the quality consistency. In quantitative analysis, the nine individual phenolic compounds showed good regression (r > 0.9975). Inter- and intra-day precisions for all analyzed compounds, expressed as relative standard deviation (CV), ranged from 0.05% to 3.12%. Since flavonoids and other polyphenols are commonly recognized as natural antioxidants, the antioxidant activity of chamomile samples was evaluated using the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging activity and ferric reducing/antioxidant power (FRAP) assays. Correlation analysis was used to assess the relationship between antioxidant activity and phenolic composition, and multivariate analyses (PCA and HCA) were applied to distinguish chamomile samples. Results indicate a high similarity among chamomile samples widely available in the market and commonly consumed as infusions or teas, and that there were no statistically significant differences among the samples.
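The similarity analysis described, comparing each sample's vector of common-peak areas with a reference fingerprint via the correlation coefficient, can be sketched as follows; the peak-area numbers are invented for illustration and are not the paper's data.

```python
# Sketch of fingerprint similarity analysis: chromatographic fingerprints
# are treated as vectors of common-peak areas and compared with a
# reference via the Pearson correlation coefficient (peak areas below
# are invented; the paper used 12 common peaks).
def correlation(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

reference = [8.1, 3.2, 5.5, 1.9, 7.4, 2.8]   # hypothetical mean fingerprint
sample = [7.9, 3.0, 5.8, 2.1, 7.1, 2.6]      # hypothetical commercial sample
print(round(correlation(reference, sample), 3))
```

Similarity values near 1, like the 0.868-0.990 range reported, mean the peak patterns of the commercial samples closely track the reference fingerprint.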
Series: Practical guidance to qualitative research. Part 3: Sampling, data collection and analysis.
Moser, Albine; Korstjens, Irene
2018-12-01
In the course of our supervisory work over the years, we have noticed that qualitative research tends to evoke a lot of questions and worries, so-called frequently asked questions (FAQs). This series of four articles intends to provide novice researchers with practical guidance for conducting high-quality qualitative research in primary care. By 'novice' we mean Master's students and junior researchers, as well as experienced quantitative researchers who are engaging in qualitative research for the first time. This series addresses their questions and provides researchers, readers, reviewers and editors with references to criteria and tools for judging the quality of qualitative research papers. The second article focused on context, research questions and designs, and referred to publications for further reading. This third article addresses FAQs about sampling, data collection and analysis. The data collection plan needs to be broadly defined and open at first, and become flexible during data collection. Sampling strategies should be chosen in such a way that they yield rich information and are consistent with the methodological approach used. Data saturation determines sample size and will be different for each study. The most commonly used data collection methods are participant observation, face-to-face in-depth interviews and focus group discussions. Analyses in ethnographic, phenomenological, grounded theory, and content analysis studies yield different narrative findings: a detailed description of a culture, the essence of the lived experience, a theory, and a descriptive summary, respectively. The fourth and final article will focus on trustworthiness and publishing qualitative research.
Consistency of the MLE under mixture models
Chen, Jiahua
2016-01-01
The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...
Sampling in Qualitative Research: Rationale, Issues, and Methods
LUBORSKY, MARK R.; RUBINSTEIN, ROBERT L.
1995-01-01
In gerontology the most recognized and elaborate discourse about sampling is generally thought to be in quantitative research associated with survey research and medical research. But sampling has long been a central concern in social and humanistic inquiry, albeit in a different guise suited to different goals. There is a need for more explicit discussion of qualitative sampling issues. This article will outline the guiding principles and rationales, features, and practices of sampli...
Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K. R.; Zeigler, R. A.; Righter, K.; Hanna, R. D.; Ketcham, R. A.
2017-01-01
NASA's vast and growing collections of astromaterials are both scientifically and culturally significant, requiring unique preservation strategies that must be updated recurrently to keep pace with contemporary technological capabilities and increasing accessibility demands. New technologies have made it possible to advance documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. Our interdisciplinary team has developed a method to create 3D Virtual Astromaterials Samples (VAS) of the existing collections of Apollo Lunar Samples and Antarctic Meteorites. Research-grade 3D VAS will virtually put these samples in the hands of researchers and educators worldwide, increasing the accessibility and visibility of these significant collections. With new sample return missions on the horizon, it is of primary importance to develop advanced curation standards for documentation and visualization methodologies.
(I Can't Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research.
van Rijnsoever, Frank J
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: "random chance," which is based on probability sampling, "minimal information," which yields at least one new code per sampling step, and "maximum information," which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario.
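The "random chance" scenario described above can be sketched as a small simulation: draw sources at random until every code in the population has been observed once. The code count and observation probability below are hypothetical, not the values used in the paper:

```python
import random

def sample_until_saturation(code_probs, max_steps=100000, rng=random):
    """Simulate the 'random chance' scenario: each sampled information
    source reveals code i with probability code_probs[i]. Returns the
    number of sources needed until every code has been observed at
    least once (theoretical saturation)."""
    seen = set()
    for step in range(1, max_steps + 1):
        for i, p in enumerate(code_probs):
            if rng.random() < p:
                seen.add(i)
        if len(seen) == len(code_probs):
            return step
    return max_steps

rng = random.Random(42)
# 20 codes, each observed in a given source with probability 0.3 (assumed).
sizes = [sample_until_saturation([0.3] * 20, rng=rng) for _ in range(200)]
print(sum(sizes) / len(sizes))  # mean sample size needed for saturation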
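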
Directory of Open Access Journals (Sweden)
Pedro Saa
2015-04-01
Full Text Available Kinetic models provide the means to understand and predict the dynamic behaviour of enzymes upon different perturbations. Despite their obvious advantages, classical parameterizations require large amounts of data to fit their parameters. Particularly, enzymes displaying complex reaction and regulatory (allosteric mechanisms require a great number of parameters and are therefore often represented by approximate formulae, thereby facilitating the fitting but ignoring many real kinetic behaviours. Here, we show that full exploration of the plausible kinetic space for any enzyme can be achieved using sampling strategies provided a thermodynamically feasible parameterization is used. To this end, we developed a General Reaction Assembly and Sampling Platform (GRASP capable of consistently parameterizing and sampling accurate kinetic models using minimal reference data. The former integrates the generalized MWC model and the elementary reaction formalism. By formulating the appropriate thermodynamic constraints, our framework enables parameterization of any oligomeric enzyme kinetics without sacrificing complexity or using simplifying assumptions. This thermodynamically safe parameterization relies on the definition of a reference state upon which feasible parameter sets can be efficiently sampled. Uniform sampling of the kinetics space enabled dissecting enzyme catalysis and revealing the impact of thermodynamics on reaction kinetics. Our analysis distinguished three reaction elasticity regions for common biochemical reactions: a steep linear region (0> ΔGr >-2 kJ/mol, a transition region (-2> ΔGr >-20 kJ/mol and a constant elasticity region (ΔGr <-20 kJ/mol. We also applied this framework to model more complex kinetic behaviours such as the monomeric cooperativity of the mammalian glucokinase and the ultrasensitive response of the phosphoenolpyruvate carboxylase of Escherichia coli. In both cases, our approach described appropriately not only
van Rijnsoever, F.J.
2015-01-01
This paper explores the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the
Research progress of MRI in preoperative evaluation of pituitary adenoma's consistency
International Nuclear Information System (INIS)
Lu Yiping; Yin Bo; Geng Daoying
2013-01-01
As the most common primary disease in pituitary fossa, the incidence of pituitary adenoma ranks 3rd in the primary tumors of the brain. To remove those resectable pituitary adenomas, there are 2 surgical approaches, named trans-sphenoidal endoscopic surgery and craniotomy. Which approach should be used depends on the size, invasive extension and the consistency of the tumors. The trans-sphenoidal endoscopic surgery is more suitable for the tumors with soft consistency which are easy to pull out, while the craniotomy is suitable for the hard ones. So, preoperative evaluation of the tumors' consistency can help to find the best surgical approach and treatments. MRI is not only an ideal method to show the structure of brain, but also can be used to evaluate consistency of tumor. This review illustrated the forming mechanism of the different consistency of pituitary adenoma and the research process in evaluating the consistency. (authors)
(I Can’t Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: “random chance,” which is based on probability sampling, “minimal information,” which yields at least one new code per sampling step, and “maximum information,” which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario. PMID:28746358
THE USE OF RANKING SAMPLING METHOD WITHIN MARKETING RESEARCH
Directory of Open Access Journals (Sweden)
CODRUŢA DURA
2011-01-01
Full Text Available Marketing and statistical literature available to practitioners provides a wide range of sampling methods that can be implemented in the context of marketing research. Ranking sampling method is based on taking apart the general population into several strata, namely into several subdivisions which are relatively homogenous regarding a certain characteristic. In fact, the sample will be composed by selecting, from each stratum, a certain number of components (which can be proportional or non-proportional to the size of the stratum until the pre-established volume of the sample is reached. Using ranking sampling within marketing research requires the determination of some relevant statistical indicators - average, dispersion, sampling error etc. To that end, the paper contains a case study which illustrates the actual approach used in order to apply the ranking sample method within a marketing research made by a company which provides Internet connection services, on a particular category of customers – small and medium enterprises.
The Sampling Issues in Quantitative Research
Delice, Ali
2010-01-01
A concern for generalization dominates quantitative research. For generalizability and repeatability, identification of sample size is essential. The present study investigates 90 qualitative master's theses submitted for the Primary and Secondary School Science and Mathematics Education Departments, Mathematic Education Discipline in 10…
Consistency argued students of fluid
Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma
2017-01-01
Problem solving for physics concepts through consistency arguments can improve thinking skills of students and it is an important thing in science. The study aims to assess the consistency of the material Fluid student argmentation. The population of this study are College students PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Samples using cluster random sampling, 145 samples obtained by the number of students. The study used a descriptive survey method. Data obtained through multiple-choice test and interview reasoned. Problem fluid modified from [9] and [1]. The results of the study gained an average consistency argmentation for the right consistency, consistency is wrong, and inconsistent respectively 4.85%; 29.93%; and 65.23%. Data from the study have an impact on the lack of understanding of the fluid material which is ideally in full consistency argued affect the expansion of understanding of the concept. The results of the study as a reference in making improvements in future studies is to obtain a positive change in the consistency of argumentations.
Sampling in Qualitative Research: Improving the Quality of ...
African Journals Online (AJOL)
Sampling consideration in qualitative research is very important, yet in practice this appears not to be given the prominence and the rigour it deserves among Higher Education researchers. Accordingly, the quality of research outcomes in Higher Education has suffered from low utilisation. This has motivated the production ...
[Sampling in qualitative research: basic principles and some controversies].
Martínez-Salgado, Carolina
2012-03-01
This paper presents the rationale for the choice of participants in qualitative research in contrast with that of probability sampling principles in epidemiological research. For a better understanding of the differences, concepts of nomothetic and ideographic generalizability, as well as those of transferability and reflexivity, are proposed, Fundamentals of the main types of sampling commonly used in qualitative research, and the meaning of the concept of saturation are mentioned. Finally, some reflections on the controversies that have arisen in recent years on various paradigmatic perspectives from which to conduct qualitative research, their possibilities of combination with epidemiological research, and some implications for the study of health issues are presented.
The Consistency of Isotopologues of Ambient Atmospheric Nitric Acid in Passively Collected Samples
Bell, M. D.; Sickman, J. O.; Bytnerowicz, A.; Padgett, P.; Allen, E. B.
2012-12-01
Anthropogenic sources of nitrogen oxides have previously been shown to have distinctive isotopic signatures of oxygen and nitrogen. Nylon filters are currently used in passive sampling arrays to measure ambient atmospheric nitric acid concentrations and estimate deposition rates. This experiment measured the ability of nylon filters to consistently collect isotopologues of atmospheric nitric acid in the same ratios as they are present in the atmosphere. Samplers were deployed in continuous stirred tank reactors (CSTR) and at field sites across a nitrogen deposition gradient in Southern California. Filters were exposed over a four week period with individual filters being subjected to 1-4 week exposure times. Extracted nitric acid were measured for δ18O and δ15N ratios and compared for consistency based on length of exposure and amount of HNO3 collected. Filters within the CSTRs collected HNO3 at a consistent rate in both high and low concentration chambers. After two weeks of exposure, the mean δ18O values were within 0.5‰ of the δ18O of the source HNO3 solution. The mean of all weekly exposures were within 0.5‰ of the δ15N of the source solution, but after three weeks, the mean δ15N of adsorbed HNO3 was within 0.2‰. As the length of the exposure increased, the variability of measured delta values decreased for both elements. The field samplers collected HNO3 consistent with previously measured values along a deposition gradient. The mean δ18O at high deposition sites was 52.2‰ compared to 35.7‰ at the low deposition sites. Mean δ15N values were similar at all sites across the deposition gradient. Due to precipitation events occurring during the exposure period, the δ15N and δ18O of nitric acid were highly variable at all field sites. At single sites, changes in δ15N and δ18O were negatively correlated, consistent with two-sourcing mixing dynamics, but the slope of the regressions differed between high and low deposition sites. Anthropogenic
Validation of consistency of Mendelian sampling variance.
Tyrisevä, A-M; Fikse, W F; Mäntysaari, E A; Jakobsen, J; Aamand, G P; Dürr, J; Lidauer, M H
2018-03-01
Experiences from international sire evaluation indicate that the multiple-trait across-country evaluation method is sensitive to changes in genetic variance over time. Top bulls from birth year classes with inflated genetic variance will benefit, hampering reliable ranking of bulls. However, none of the methods available today enable countries to validate their national evaluation models for heterogeneity of genetic variance. We describe a new validation method to fill this gap comprising the following steps: estimating within-year genetic variances using Mendelian sampling and its prediction error variance, fitting a weighted linear regression between the estimates and the years under study, identifying possible outliers, and defining a 95% empirical confidence interval for a possible trend in the estimates. We tested the specificity and sensitivity of the proposed validation method with simulated data using a real data structure. Moderate (M) and small (S) size populations were simulated under 3 scenarios: a control with homogeneous variance and 2 scenarios with yearly increases in phenotypic variance of 2 and 10%, respectively. Results showed that the new method was able to estimate genetic variance accurately enough to detect bias in genetic variance. Under the control scenario, the trend in genetic variance was practically zero in setting M. Testing cows with an average birth year class size of more than 43,000 in setting M showed that tolerance values are needed for both the trend and the outlier tests to detect only cases with a practical effect in larger data sets. Regardless of the magnitude (yearly increases in phenotypic variance of 2 or 10%) of the generated trend, it deviated statistically significantly from zero in all data replicates for both cows and bulls in setting M. In setting S with a mean of 27 bulls in a year class, the sampling error and thus the probability of a false-positive result clearly increased. Still, overall estimated genetic
Hairul Anuar Hashim; Freddy Golok; Rosmatunisah Ali
2011-01-01
Background: Psychometrically sound measurement instrument is a fundamental requirement across broad range of research areas. In negative affect research, Depression Anxiety Stress Scale (DASS) has been identified as a psychometrically sound instrument to measure depression, anxiety and stress, especially the 21-item version. However, its psychometric properties in adolescents have been less consistent. Objectives: Thus, the present study sought to examine the factorial validity and internal c...
Køster, B; Søndergaard, J; Nielsen, J B; Olsen, A; Bentzen, J
2018-06-01
An important feature of questionnaire validation is reliability. To be able to measure a given concept by questionnaire validly, the reliability needs to be high. The objectives of this study were to examine reliability of attitude and knowledge and behavioral consistency of sunburn in a developed questionnaire for monitoring and evaluating population sun-related behavior. Sun related behavior, attitude and knowledge was measured weekly by a questionnaire in the summer of 2013 among 664 Danes. Reliability was tested in a test-retest design. Consistency of behavioral information was tested similarly in a questionnaire adapted to measure behavior throughout the summer. The response rates for questionnaire 1, 2 and 3 were high and the drop out was not dependent on demographic characteristic. There was at least 73% agreement between sunburns in the measurement week and the entire summer, and a possible sunburn underestimation in questionnaires summarizing the entire summer. The participants underestimated their outdoor exposure in the evaluation covering the entire summer as compared to the measurement week. The reliability of scales measuring attitude and knowledge was high for majority of scales, while consistency in protection behavior was low. To our knowledge, this is the first study to report reliability for a completely validated questionnaire on sun-related behavior in a national random population based sample. Further, we show that attitude and knowledge questions confirmed their validity with good reliability, while consistency of protection behavior in general and in a week's measurement was low.
International Nuclear Information System (INIS)
Freitag, Joerg; Kosuge, Hitoshi; Schmelzer, Juergen P.; Kato, Satoru
2015-01-01
Highlights: • We use a new, simple static cell vapor phase manual sampling method (SCVMS) for VLE (x, y, T) measurement. • The method is applied to non-azeotropic, asymmetric and two-liquid phase forming azeotropic binaries. • The method is approved by a data consistency test, i.e., a plot of the polarity exclusion factor vs. pressure. • The consistency test reveals that with the new SCVMS method accurate VLE near ambient temperature can be measured. • Moreover, the consistency test approves that the effect of air in the SCVMS system is negligible. - Abstract: A new static cell vapor phase manual sampling (SCVMS) method is used for the simple measurement of constant temperature x, y (vapor + liquid) equilibria (VLE). The method was applied to the VLE measurements of the (methanol + water) binary at T/K = (283.2, 298.2, 308.2 and 322.9), asymmetric (acetone + 1-butanol) binary at T/K = (283.2, 295.2, 308.2 and 324.2) and two-liquid phase forming azeotropic (water + 1-butanol) binary at T/K = (283.2 and 298.2). The accuracy of the experimental data was approved by a data consistency test, that is, an empirical plot of the polarity exclusion factor, β, vs. the system pressure, P. The SCVMS data are accurate, because the VLE data converge to the same lnβ vs. lnP straight line determined from conventional distillation-still method and a headspace gas chromatography method
Directory of Open Access Journals (Sweden)
Martin A. Volker
2016-01-01
Full Text Available The Gilliam Autism Rating Scale-Second Edition (GARS-2 is a widely used screening instrument that assists in the identification and diagnosis of autism. The purpose of this study was to examine the factor structure, internal consistency, and screening sensitivity of the GARS-2 using ratings from special education teaching staff for a sample of 240 individuals with autism or other significant developmental disabilities. Exploratory factor analysis yielded a correlated three-factor solution similar to that found in 2005 by Lecavalier for the original GARS. Though the three factors appeared to be reasonably consistent with the intended constructs of the three GARS-2 subscales, the analysis indicated that more than a third of the GARS-2 items were assigned to the wrong subscale. Internal consistency estimates met or exceeded standards for screening and were generally higher than those in previous studies. Screening sensitivity was .65 and specificity was .81 for the Autism Index using a cut score of 85. Based on these findings, recommendations are made for instrument revision.
Martin A. Volker; Elissa H. Dua; Christopher Lopata; Marcus L. Thomeer; Jennifer A. Toomey; Audrey M. Smerbeck; Jonathan D. Rodgers; Joshua R. Popkin; Andrew T. Nelson; Gloria K. Lee
2016-01-01
The Gilliam Autism Rating Scale-Second Edition (GARS-2) is a widely used screening instrument that assists in the identification and diagnosis of autism. The purpose of this study was to examine the factor structure, internal consistency, and screening sensitivity of the GARS-2 using ratings from special education teaching staff for a sample of 240 individuals with autism or other significant developmental disabilities. Exploratory factor analysis yielded a correlated three-factor solution si...
Sampling in epidemiological research: issues, hazards and pitfalls
Tyrer, Stephen; Heyman, Bob
2016-01-01
Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research. PMID:27087985
Directory of Open Access Journals (Sweden)
B. Køster
2018-06-01
Full Text Available An important feature of questionnaire validation is reliability. To be able to measure a given concept by questionnaire validly, the reliability needs to be high.The objectives of this study were to examine reliability of attitude and knowledge and behavioral consistency of sunburn in a developed questionnaire for monitoring and evaluating population sun-related behavior.Sun related behavior, attitude and knowledge was measured weekly by a questionnaire in the summer of 2013 among 664 Danes. Reliability was tested in a test-retest design. Consistency of behavioral information was tested similarly in a questionnaire adapted to measure behavior throughout the summer.The response rates for questionnaire 1, 2 and 3 were high and the drop out was not dependent on demographic characteristic. There was at least 73% agreement between sunburns in the measurement week and the entire summer, and a possible sunburn underestimation in questionnaires summarizing the entire summer. The participants underestimated their outdoor exposure in the evaluation covering the entire summer as compared to the measurement week. The reliability of scales measuring attitude and knowledge was high for majority of scales, while consistency in protection behavior was low.To our knowledge, this is the first study to report reliability for a completely validated questionnaire on sun-related behavior in a national random population based sample. Further, we show that attitude and knowledge questions confirmed their validity with good reliability, while consistency of protection behavior in general and in a week's measurement was low. Keywords: Questionnaire, Validation, Reliability, Skin cancer, Prevention, Ultraviolet radiation
The Enhancement of Consistency of Interpretation Skills on the Newton’s Laws Concept
Directory of Open Access Journals (Sweden)
Yudi Kurniawan
2018-03-01
Full Text Available Conceptual understanding is the most important thing that students should have rather than they had reaches achievement. The interpretation skill is one of conceptual understanding aspects. The aim of this paper is to know the consistency of students’ interpreting skills and all at once to get the levels of increasing of students’ interpretations skill. These variables learned by Interactive Lecture Demonstrations (ILD common sense. The method of this research is pre-experimental research with one group pretest-posttest design. The sample has taken by cluster random sampling. The result had shown that 16 % of all student that are have perfect consistency of interpretation skill and there are increasing of interpretation skill on 84 % from unknown to be understand (this skill. This finding could be used by the future researcher to study in the other areas of conceptual understanding aspects
Secondary electron emission and self-consistent charge transport in semi-insulating samples
Energy Technology Data Exchange (ETDEWEB)
Fitting, H.-J. [Institute of Physics, University of Rostock, Universitaetsplatz 3, D-18051 Rostock (Germany); Touzin, M. [Unite Materiaux et Transformations, UMR CNRS 8207, Universite de Lille 1, F-59655 Villeneuve d' Ascq (France)
2011-08-15
Electron beam induced self-consistent charge transport and secondary electron emission (SEE) in insulators are described by means of an electron-hole flight-drift model (FDM) now extended by a certain intrinsic conductivity (c) and are implemented by an iterative computer simulation. Ballistic secondary electrons (SE) and holes, their attenuation to drifting charge carriers, and their recombination, trapping, and field- and temperature-dependent detrapping are included. As a main result the time dependent ''true'' secondary electron emission rate {delta}(t) released from the target material and based on ballistic electrons and the spatial distributions of currents j(x,t), charges {rho}(x,t), field F(x,t), and potential V(x,t) are obtained where V{sub 0} = V(0,t) presents the surface potential. The intrinsic electronic conductivity limits the charging process and leads to a conduction sample current to the support. In that case the steady-state total SE yield will be fixed below the unit: i.e., {sigma} {eta} + {delta} < 1.
Klusek, J.; Martin, G. E.; Losh, M.
2014-01-01
Background: Prior research suggests that 60-74% of males and 16-45% of females with fragile X syndrome (FXS) meet criteria for autism spectrum disorder (ASD) in research settings. However, relatively little is known about the rates of clinical diagnoses in FXS and whether such diagnoses are consistent with those performed in a research setting…
Directory of Open Access Journals (Sweden)
Jason Miin-Hwa Lim
2011-04-01
Full Text Available Teaching second language learners how to write research reports constitutes a crucial component in programmes on English for Specific Purposes (ESP in institutions of higher learning. One of the rhetorical segments in research reports that merit attention has to do with the descriptions and justifications of sampling procedures. This genre-based study looks into sampling delineations in the Method-related sections of research articles on the teaching of English as a second language (TESL written by expert writers and published in eight reputed international refereed journals. Using Swales’s (1990 & 2004 framework, I conducted a quantitative analysis of the rhetorical steps and a qualitative investigation into the language resources employed in delineating sampling procedures. This investigation has considerable relevance to ESP students and instructors as it has yielded pertinent findings on how samples can be appropriately described to meet the expectations of dissertation examiners, reviewers, and supervisors. The findings of this study have furnished insights into how supervisors and instructors can possibly teach novice writers ways of using specific linguistic mechanisms to lucidly describe and convincingly justify the sampling procedures in the Method sections of experimental research reports.
Sean P. Healey; Paul L. Patterson; Sassan S. Saatchi; Michael A. Lefsky; Andrew J. Lister; Elizabeth A. Freeman
2012-01-01
Lidar height data collected by the Geosciences Laser Altimeter System (GLAS) from 2002 to 2008 has the potential to form the basis of a globally consistent sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full waveform "shots," which have been shown to be strongly correlated with aboveground forest...
Two-Sample Two-Stage Least Squares (TSTSLS) estimates of earnings mobility: how consistent are they?
Directory of Open Access Journals (Sweden)
John Jerrim
2016-08-01
Academics and policymakers have shown great interest in cross-national comparisons of intergenerational earnings mobility. However, producing consistent and comparable estimates of earnings mobility is not a trivial task. In most countries researchers are unable to observe earnings information for two generations. They are thus forced to rely upon imputed data from different surveys instead. This paper builds upon previous work by considering the consistency of the intergenerational correlation (ρ) as well as the elasticity (β), how this changes when using a range of different instrumental (imputer) variables, and by highlighting an important but infrequently discussed measurement issue. Our key finding is that, while TSTSLS estimates of β and ρ are both likely to be inconsistent, the magnitude of this problem is much greater for the former than for the latter. We conclude by offering advice on estimating earnings mobility using this methodology.
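The mechanics of TSTSLS can be sketched in a few lines. The simulation below is purely illustrative (synthetic data, made-up coefficients; it is not the paper's analysis): it shows the kind of inconsistency the abstract discusses, where a direct effect of the imputer variable on child earnings inflates the estimated elasticity β.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
m = n // 2

# Instrument z (e.g., parental education), parent log-earnings x, and
# child log-earnings y. True elasticity is 0.4; z also affects y
# directly (coefficient 0.1), which violates the exclusion restriction.
z = rng.normal(size=n)
x = 1.0 + 0.8 * z + rng.normal(scale=0.6, size=n)
y = 0.5 + 0.4 * x + 0.1 * z + rng.normal(scale=0.5, size=n)

# In practice (y, z) and (x, z) come from two different surveys.
s1, s2 = slice(0, m), slice(m, n)

# First stage, auxiliary sample: regress parent earnings on the instrument.
A2 = np.column_stack([np.ones(m), z[s2]])
gamma, *_ = np.linalg.lstsq(A2, x[s2], rcond=None)

# Second stage, main sample: regress child earnings on imputed parent earnings.
x_hat = np.column_stack([np.ones(m), z[s1]]) @ gamma
A1 = np.column_stack([np.ones(m), x_hat])
coef, *_ = np.linalg.lstsq(A1, y[s1], rcond=None)
beta_tstsls = coef[1]

# Probability limit here is (0.4 * 0.8 + 0.1) / 0.8 = 0.525, not 0.4:
# the direct z -> y channel inflates the mobility estimate.
```

Under these assumed coefficients the estimate settles near 0.525 rather than the true 0.4, illustrating why the choice of imputer variables matters.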
Ponterotto, Joseph G; Ruckdeschel, Daniel E
2007-12-01
The present article addresses issues in reliability assessment that are often neglected in psychological research, such as acceptable levels of internal consistency for research purposes, factors affecting the magnitude of coefficient alpha (α), and considerations for interpreting alpha within the research context. A new reliability matrix anchored in classical test theory is introduced to help researchers judge the adequacy of internal consistency coefficients of research measures. Guidelines and cautions in applying the matrix are provided.
RANKED SET SAMPLING FOR ECOLOGICAL RESEARCH: ACCOUNTING FOR THE TOTAL COSTS OF SAMPLING
Researchers aim to design environmental studies that optimize precision and allow for generalization of results, while keeping the costs of associated field and laboratory work at a reasonable level. Ranked set sampling is one method to potentially increase precision and reduce ...
Samples in applied psychology: over a decade of research in review.
Shen, Winny; Kiger, Thomas B; Davies, Stacy E; Rasch, Rena L; Simon, Kara M; Ones, Deniz S
2011-09-01
This study examines sample characteristics of articles published in Journal of Applied Psychology (JAP) from 1995 to 2008. At the individual level, the overall median sample size over the period examined was approximately 173, which is generally adequate for detecting the average magnitude of effects of primary interest to researchers who publish in JAP. Samples using higher units of analyses (e.g., teams, departments/work units, and organizations) had lower median sample sizes (Mdn ≈ 65), yet were arguably robust given typical multilevel design choices of JAP authors despite the practical constraints of collecting data at higher units of analysis. A substantial proportion of studies used student samples (~40%); surprisingly, median sample sizes for student samples were smaller than working adult samples. Samples were more commonly occupationally homogeneous (~70%) than occupationally heterogeneous. U.S. and English-speaking participants made up the vast majority of samples, whereas Middle Eastern, African, and Latin American samples were largely unrepresented. On the basis of study results, recommendations are provided for authors, editors, and readers, which converge on 3 themes: (a) appropriateness and match between sample characteristics and research questions, (b) careful consideration of statistical power, and (c) the increased popularity of quantitative synthesis. Implications are discussed in terms of theory building, generalizability of research findings, and statistical power to detect effects. PsycINFO Database Record (c) 2011 APA, all rights reserved
Cognitive consistency and math-gender stereotypes in Singaporean children.
Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu
2014-01-01
In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.
Domain Adaptation for Pedestrian Detection Based on Prediction Consistency
Directory of Open Access Journals (Sweden)
Yu Li-ping
2014-01-01
Pedestrian detection is an active area of research in computer vision. It remains a challenging problem in many applications where multiple factors cause a mismatch between the source dataset used to train the pedestrian detector and samples in the target scene. In this paper, we propose a novel domain adaptation model for merging plentiful source domain samples with scarce target domain samples to create a scene-specific pedestrian detector that performs as well as if abundant target domain samples were present. Our approach combines a boosting-based learning algorithm with an entropy-based transferability measure, derived from prediction consistency with the source classifiers, to select those source domain samples that show positive transferability to the target domain. Experimental results show that our approach can improve the detection rate, especially when labeled data in the target scene are insufficient.
Viapiana, Agnieszka; Struck-Lewicka, Wiktoria; Konieczynski, Pawel; Wesolowski, Marek; Kaliszan, Roman
2016-01-01
Chamomile has been used as an herbal medication since ancient times and is still popular because it contains various bioactive phytochemicals that could provide therapeutic effects. In this study, a simple and reliable HPLC method was developed to evaluate the quality consistency of nineteen chamomile samples through establishing a chromatographic fingerprint, quantification of phenolic compounds and determination of antioxidant activity. For fingerprint analysis, 12 peaks were selected as th...
Vaz, Sharmila; Parsons, Richard; Passmore, Anne Elizabeth; Andreou, Pantelis; Falkmer, Torbjörn
2013-01-01
The social skills rating system (SSRS) is used to assess social skills and competence in children and adolescents. While its characteristics based on United States samples (US) are published, corresponding Australian figures are unavailable. Using a 4-week retest design, we examined the internal consistency, retest reliability and measurement error (ME) of the SSRS secondary student form (SSF) in a sample of Year 7 students (N = 187), from five randomly selected public schools in Perth, Western Australia. Internal consistency (IC) of the total scale and most subscale scores (except empathy) on the frequency rating scale was adequate to permit independent use. On the importance rating scale, most IC estimates for girls fell below the benchmark. Test-retest estimates of the total scale and subscales were insufficient to permit reliable use. ME of the total scale score (frequency rating) for boys was equivalent to the US estimate, while that for girls was lower than the US error. ME of the total scale score (importance rating) was larger than the error using the frequency rating scale. The study findings support the use of multiple informants (e.g. teacher and parent reports), not just students, as recommended in the manual. Future research needs to substantiate the clinical meaningfulness of the MEs calculated in this study by corroborating them against the respective Minimum Clinically Important Difference (MCID).
Sample size in psychological research over the past 30 years.
Marszalek, Jacob M; Barber, Carolyn; Kohlhart, Julie; Holmes, Cooper B
2011-04-01
The American Psychological Association (APA) Task Force on Statistical Inference was formed in 1996 in response to a growing body of research demonstrating methodological issues that threatened the credibility of psychological research, and made recommendations to address them. One issue was the small, even dramatically inadequate, size of samples used in studies published by leading journals. The present study assessed the progress made since the Task Force's final report in 1999. Sample sizes reported in four leading APA journals in 1955, 1977, 1995, and 2006 were compared using nonparametric statistics, while data from the last two waves were fit to a hierarchical generalized linear growth model for more in-depth analysis. Overall, results indicate that the recommendations for increasing sample sizes have not been integrated in core psychological research, although results slightly vary by field. This and other implications are discussed in the context of current methodological critique and practice.
Cha, Christine B; Tezanos, Katherine M; Peros, Olivia M; Ng, Mei Yi; Ribeiro, Jessica D; Nock, Matthew K; Franklin, Joseph C
2018-04-01
Research on suicidal thoughts and behaviors (STB) has identified many risk factors, but whether these findings generalize to diverse populations remains unclear. We review longitudinal studies on STB risk factors over the past 50 years in the United States and evaluate the methodological practices of sampling and reporting sample characteristics. We found that articles frequently reported participant age and sex, less frequently reported participant race and ethnicity, and rarely reported participant veteran status or lesbian, gay, bisexual, and transgender status. Sample reporting practices modestly and inconsistently improved over time. Finally, articles predominantly featured White, non-Hispanic, young adult samples. © 2017 The American Association of Suicidology.
Convenience samples and caregiving research: how generalizable are the findings?
Pruchno, Rachel A; Brill, Jonathan E; Shands, Yvonne; Gordon, Judith R; Genderson, Maureen Wilson; Rose, Miriam; Cartwright, Francine
2008-12-01
We contrast characteristics of respondents recruited using convenience strategies with those of respondents recruited by random digit dial (RDD) methods. We compare sample variances, means, and interrelationships among variables generated from the convenience and RDD samples. Women aged 50 to 64 who work full time and provide care to a community-dwelling older person were recruited using either RDD (N = 55) or convenience methods (N = 87). Telephone interviews were conducted using reliable, valid measures of demographics, characteristics of the care recipient, help provided to the care recipient, evaluations of caregiver-care recipient relationship, and outcomes common to caregiving research. Convenience and RDD samples had similar variances on 68.4% of the examined variables. We found significant mean differences for 63% of the variables examined. Bivariate correlations suggest that one would reach different conclusions using the convenience and RDD sample data sets. Researchers should use convenience samples cautiously, as they may have limited generalizability.
Kaphingst, K A; Janoff, J M; Harris, L N; Emmons, K M
2006-05-01
Although social and ethical issues related to the storage and use of biologic specimens for genetic research have been discussed extensively in the medical literature, few empirical data exist describing patients' views. This qualitative study explored the views of 26 female breast cancer patients who had consented to donate blood or tissue samples for breast cancer research. Participants generally did not expect personal benefits from research and had few unprompted concerns. Few participants had concerns about use of samples for studies not planned at the time of consent. Some participants did express concerns about insurance or employment discrimination, while others believed that current privacy protections might actually slow breast cancer research. Participants were generally more interested in receiving individual genetic test results from research studies than aggregate results. Most participants did not want individual results of uncertain clinical significance, although others believed that they should be able to receive such information. These data capture the range of participants' views regarding the storage and use of biologic samples. Further research with different and diverse patient populations is critical to establishing an appropriate balance between protecting the rights of human subjects in genetic research and allowing research to progress.
Research and application of sampling and analysis method of sodium aerosol
International Nuclear Information System (INIS)
Yu Xiaochen; Guo Qingzhou; Wen Ximeng
1998-01-01
A method for the sampling and analysis of sodium aerosol was investigated. Vacuum sampling technology is used in the sampling process, and the analysis methods adopted are volumetric analysis and atomic absorption. When the absolute content of sodium is in the range of 0.1 mg to 1.0 mg, the deviation between the results of volumetric analysis and atomic absorption is less than 2%. The method has been applied successfully in a sodium aerosol removal device. The analysis range, accuracy and precision meet the requirements of sodium aerosol research.
Breaking Free of Sample Size Dogma to Perform Innovative Translational Research
Bacchetti, Peter; Deeks, Steven G.; McCune, Joseph M.
2011-01-01
Innovative clinical and translational research is often delayed or prevented by reviewers’ expectations that any study performed in humans must be shown in advance to have high statistical power. This supposed requirement is not justifiable and is contradicted by the reality that increasing sample size produces diminishing marginal returns. Studies of new ideas often must start small (sometimes even with an N of 1) because of cost and feasibility concerns, and recent statistical work shows that small sample sizes for such research can produce more projected scientific value per dollar spent than larger sample sizes. Renouncing false dogma about sample size would remove a serious barrier to innovation and translation. PMID:21677197
Methodological integrative review of the work sampling technique used in nursing workload research.
Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael
2014-11-01
To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. The authors' suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
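The estimator at the core of the work sampling technique is a simple binomial proportion. A minimal sketch (synthetic observation data; the shift scenario, proportion, and names are illustrative assumptions, not drawn from the review):

```python
import math
import random

random.seed(1)

# Simulated shift: a nurse truly spends 30% of time on direct care, and
# each random-moment observation records whether that activity is under way.
TRUE_P = 0.30
observations = [random.random() < TRUE_P for _ in range(400)]

p_hat = sum(observations) / len(observations)            # estimated proportion
se = math.sqrt(p_hat * (1 - p_hat) / len(observations))  # standard error
ci_low, ci_high = p_hat - 1.96 * se, p_hat + 1.96 * se   # 95% Wald interval

def required_n(p, e):
    """Observations needed to estimate a proportion near p to within +/- e
    at 95% confidence (normal approximation)."""
    return math.ceil(1.96 ** 2 * p * (1 - p) / e ** 2)
```

For example, `required_n(0.3, 0.05)` gives 323 observations for a ±5 percentage-point margin, which is one reason standardized reporting of observation counts matters when comparing studies.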
Samples and data accessibility in research biobanks: an explorative survey
Directory of Open Access Journals (Sweden)
Marco Capocasa
2016-02-01
Biobanks, which contain human biological samples and/or data, provide a crucial contribution to the progress of biomedical research. However, the effective and efficient use of biobank resources depends on their accessibility. In fact, making bio-resources promptly accessible to everybody may increase the benefits for society. Furthermore, optimizing their use and ensuring their quality will promote scientific creativity and, in general, contribute to the progress of bio-medical research. Although this has become a rather common belief, several laboratories are still secretive and continue to withhold samples and data. In this study, we conducted a questionnaire-based survey in order to investigate sample and data accessibility in research biobanks operating all over the world. The survey involved a total of 46 biobanks. Most of them gave permission to access their samples (95.7%) and data (85.4%), but free and unconditioned accessibility seemed not to be common practice. The analysis of the guidelines regarding accessibility to the resources of the biobanks that responded to the survey highlights three issues: (i) the request for applicants to explain what they would like to do with the resources requested; (ii) the role of funding, public or private, in the establishment of fruitful collaborations between biobanks and research labs; (iii) the request for co-authorship in exchange for access to data. These results suggest that economic and academic aspects are involved in determining the extent of sharing of samples and data stored in biobanks. As a second step of this study, we investigated the reasons behind the high diversity of requirements to access biobank resources. The analysis of informative answers suggested that the different modalities of resource accessibility seem to be largely influenced by both the social context and the legislation of the countries where the biobanks operate.
Feasibility studies on large sample neutron activation analysis using a low power research reactor
International Nuclear Information System (INIS)
Gyampo, O.
2008-06-01
Instrumental neutron activation analysis (INAA) using the Ghana Research Reactor-1 (GHARR-1) can be applied directly to samples with masses in grams. Sample weights were in the range of 0.5 g to 5 g, so the representativity of the sample is improved, as is the sensitivity. Samples were irradiated in a low power research reactor. The correction for neutron self-shielding within the sample is determined from measurement of the neutron flux depression just outside the sample. Correction for gamma ray self-attenuation in the sample was performed via linear attenuation coefficients derived from transmission measurements. Quantitative and qualitative analyses of the data were done using gamma ray spectrometry (HPGe detector). The results of this study on the possibilities of large sample NAA using a miniature neutron source reactor (MNSR) show clearly that the Ghana Research Reactor-1 (GHARR-1) at the National Nuclear Research Institute (NNRI) can be used, with the pneumatic transfer systems, for analyses of samples of up to 5 g.
Utilization of the National Inpatient Sample for abdominal aortic aneurysm research.
Dua, Anahita; Ali, Fadwa; Traudt, Elizabeth; Desai, Sapan S
2017-10-01
Large administrative databases, including the Medicare database by the Centers for Medicare and Medicaid Services, the National Surgical Quality Improvement Project database sponsored by the American College of Surgeons, and the National Inpatient Sample, have been used by major public health agencies for years. More recently, medical researchers have turned to database research to power studies on diseases that are noted to be relatively scarce. This study aimed to review and discuss the utilization of the National Inpatient Sample for abdominal aortic aneurysm research, inclusive of its advantages, disadvantages, and best practices. Copyright © 2017 Elsevier Inc. All rights reserved.
Consistency of Trend Break Point Estimator with Underspecified Break Number
Directory of Open Access Journals (Sweden)
Jingjing Yang
2017-01-01
This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
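The setting in this abstract is easy to reproduce numerically. The sketch below (synthetic series; break dates, magnitudes, and the grid-search estimator are illustrative assumptions, not the paper's exact design) fits a one-break trend model to a series that actually contains two similar positive trend shifts, the case the paper flags as most problematic:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 300
t = np.arange(T, dtype=float)

# True process: two positive trend shifts of similar magnitude,
# at t = 100 and t = 200.
y = 0.05 * t + 0.1 * np.maximum(t - 100, 0) + 0.1 * np.maximum(t - 200, 0)
y = y + rng.normal(scale=1.0, size=T)

def fit_one_break(y, t, trim=10):
    """Least-squares grid search for a single trend break point."""
    best_ssr, best_k = np.inf, None
    for k in range(trim, len(t) - trim):
        # Linear trend plus one slope change at candidate date k.
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - k, 0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        ssr = float(np.sum((y - X @ beta) ** 2))
        if ssr < best_ssr:
            best_ssr, best_k = ssr, k
    return best_k

k_hat = fit_one_break(y, t)
# With the break number underspecified (one break fitted, two present),
# k_hat tends to fall between the true break dates rather than on either.
```

In this configuration the single estimated break date typically lands well inside the interval between the two true breaks, which is the inconsistency pattern the abstract describes.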
Reproducibility of preclinical animal research improves with heterogeneity of study samples
Vogt, Lucile; Sena, Emily S.; Würbel, Hanno
2018-01-01
Single-laboratory studies conducted under highly standardized conditions are the gold standard in preclinical animal research. Using simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer, we compared the accuracy of effect size estimates between single-laboratory and multi-laboratory study designs. Single-laboratory studies generally failed to predict effect size accurately, and larger sample sizes rendered effect size estimates even less accurate. By contrast, multi-laboratory designs including as few as 2 to 4 laboratories increased coverage probability by up to 42 percentage points without a need for larger sample sizes. These findings demonstrate that within-study standardization is a major cause of poor reproducibility. More representative study samples are required to improve the external validity and reproducibility of preclinical animal research and to prevent wasting animals and resources for inconclusive research. PMID:29470495
Research Paper Prevalence of enuresis in a community sample of ...
African Journals Online (AJOL)
Research suggests a higher prevalence of coexisting behavioural disorders, particularly Attention-Deficit Hyperactivity Disorder (ADHD), among children with enuresis in comparison to the general population. Studies generally have consisted of participants attending general paediatric medical clinics as opposed to ...
Sampling in interview-based qualitative research: A theoretical and practical guide
Robinson, Oliver
2014-01-01
Sampling is central to the practice of qualitative methods, but compared with data collection and analysis, its processes are discussed relatively little. A four-point approach to sampling in qualitative interview-based research is presented and critically discussed in this article, which integrates theory and process for the following: (1) Defining a sample universe, by way of specifying inclusion and exclusion criteria for potential participation; (2) Deciding upon a sample size, through th...
Vujanovic, Anka A.; Arrindell, Willem A.; Bernstein, Amit; Norton, Peter J.; Zvolensky, Michael J.
The present investigation examined the factor structure, internal consistency, and construct validity of the 16-item Anxiety Sensitivity Index (ASI; Reiss, Peterson, Gursky, & McNally, 1986) in a young adult sample (n = 420) from the Netherlands. Confirmatory factor analysis was used to comparatively
Research on stored biological samples: views of African American and White American cancer patients.
Pentz, Rebecca D; Billot, Laurent; Wendler, David
2006-04-01
Proposals on consent for research with biological samples should be informed by empirical studies of individuals' views. Studies to date queried mostly white research subjects. The aim of this study was to compare the views of two groups of patients: cancer patients at a university clinic (Winship Cancer Institute at Emory Healthcare) and cancer patients at an inner city county hospital (Grady) who were given the option of tissue banking. Overall, 315/452 (70%) patients completed the survey. The Grady cohort was 86% African American; the Winship cohort was 82% White. The vast majority (95%) of individuals in both cohorts agreed to provide a biological sample for future research. Both cohorts were willing for their samples to be used to study cancer and other diseases, including Alzheimer disease. Few participants preferred to control the disease to be studied (10%) or wished to be contacted again for consent for each future research project (11%). In our sample, almost all clinical patients, regardless of site of care, ethnicity or socioeconomic status, were willing to provide a biological sample for research purposes and allow investigators to determine the research to be done without contacting the patients again. These findings support the recommendation to offer individuals a simplified consent with a one-time binary choice whether to provide biological samples for future research. Copyright 2006 Wiley-Liss, Inc.
Curtis, S; Gesler, W; Smith, G; Washburn, S
2000-04-01
This paper focuses on the question of sampling (or selection of cases) in qualitative research. Although the literature includes some very useful discussions of qualitative sampling strategies, the question of sampling often seems to receive less attention in methodological discussion than questions of how data are collected or analysed. Decisions about sampling are likely to be important in many qualitative studies (although it may not be an issue in some research). There are varying accounts of the principles applicable to sampling or case selection. Those who espouse 'theoretical sampling', based on a 'grounded theory' approach, are in some ways opposed to those who promote forms of 'purposive sampling' suitable for research informed by an existing body of social theory. Diversity also results from the many different methods for drawing purposive samples which are applicable to qualitative research. We explore the value of a framework suggested by Miles and Huberman [Miles, M., Huberman, A., 1994. Qualitative Data Analysis. Sage, London.], to evaluate the sampling strategies employed in three examples of research by the authors. Our examples comprise three studies which respectively involve selection of: 'healing places'; rural places which incorporated national anti-malarial policies; young male interviewees, identified as either chronically ill or disabled. The examples are used to show how in these three studies the (sometimes conflicting) requirements of the different criteria were resolved, as well as the potential and constraints placed on the research by the selection decisions which were made. We also consider how far the criteria Miles and Huberman suggest seem helpful for planning 'sample' selection in qualitative research.
Xu, Henglong; Yong, Jiang; Xu, Guangjian
2015-12-30
Sampling frequency is important for obtaining sufficient information in temporal research on microfauna. To determine an optimal strategy for exploring the seasonal variation in ciliated protozoa, a dataset from the Yellow Sea, northern China, was studied. Samples were collected with 24 (biweekly), 12 (monthly), 8 (bimonthly per season) and 4 (seasonal) sampling events. Compared to the 24 samplings (100%), the 12-, 8- and 4-samplings recovered 94%, 94%, and 78% of the total species, respectively. For revealing the seasonal distribution, the 8-sampling regime may capture >75% of the seasonal variance, while the traditional 4-sampling may explain only a smaller share; the biotic data showed stronger correlations with seasonal variables (e.g., temperature, salinity) in combination with nutrients. It is suggested that 8 sampling events per year may be an optimal sampling strategy for seasonal research on ciliated protozoa in marine ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wijayanti, M. D.; Raharjo, S. B.; Saputro, S.; Mulyani, S.
2017-09-01
This study aims to examine the consistency of the critical thinking ability of PGSD students on Energy material. The study population was PGSD students at UNS Surakarta; a sample of 101 students was obtained using a cluster random sampling technique. The consistency of students' responses, used to assess their critical thinking ability, can serve as a benchmark of PGSD students' understanding, showing whether they treat equivalent science (IPA) problems consistently, especially Energy material presented through various phenomena. This research uses a descriptive method; data were obtained through questionnaires and interviews. The results show that the average level of critical thinking in this study falls into three levels: level 1 (54.85%), level 2 (19.93%), and level 3 (25.23%). These results reflect students' weak understanding of the Energy material. In addition, scores on the indicators of identifying assumptions and analysing arguments were also still low. Ideally, consistency of critical thinking ability as a whole supports the expansion of students' conceptual understanding. The results of this study may serve as a reference for subsequent research aimed at producing positive changes in students' critical thinking ability, which in turn would improve their understanding of concepts, especially Energy material in various real problems.
Solar System Samples for Research, Education, and Public Outreach
Allen, J.; Luckey, M.; McInturff, B.; Kascak, A.; Tobola, K.; Galindo, C.; Allen, C.
2011-01-01
In the next two years, during the NASA Year of the Solar System, spacecraft from NASA and our international partners will encounter a comet, orbit asteroid 4 Vesta, continue to explore Mars with rovers, and launch robotic explorers to the Moon and Mars. We have pieces of all these worlds in our laboratories, and their continued study provides incredibly valuable "ground truth" to complement space exploration missions. Extensive information about these unique materials, as well as actual lunar samples and meteorites, is available for display and education. The Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation, and distribution of samples for research, education, and public outreach.
Roberts, Tonya; Nolet, Kimberly; Bowers, Barbara
2015-06-01
Consistent assignment of nursing staff to residents is promoted by a number of national organizations as a strategy for improving nursing home quality and is included in pay-for-performance schedules in several states. However, research has shown inconsistent effects of consistent assignment on quality outcomes. To advance the state of the science of research on consistent assignment and inform current practice and policy, a literature review was conducted to critique conceptual and methodological understandings of consistent assignment. Twenty original research reports of consistent assignment in nursing homes were found through a variety of search strategies. Consistent assignment was conceptualized and operationalized in multiple ways with little overlap from study to study. There was a lack of established methods to measure consistent assignment. Methodological limitations included a lack of control and statistical analyses of group differences in experimental-level studies, small sample sizes, lack of attention to confounds in multicomponent interventions, and outcomes that were not theoretically linked. Future research should focus on developing a conceptual understanding of consistent assignment focused on definition, measurement, and links to outcomes. To inform current policies, testing of consistent assignment should include attention to the contexts within, and the levels at, which it is most effective. Published by Oxford University Press on behalf of the Gerontological Society of America 2013.
Researchers’ views on research evaluation and the Danish Bibliometric Research Indicator
DEFF Research Database (Denmark)
Brøndum, Iben
2013-01-01
… it might have affected their research practice. The study was carried out in 2013 (February-September) and consisted of a web-based questionnaire sent to 500 researchers in five Danish universities. Respondents were selected using systematic random sampling. Preliminary data analysis indicates that researchers…
Comparing two sampling methods to engage hard-to-reach communities in research priority setting.
Valerio, Melissa A; Rodriguez, Natalia; Winkler, Paula; Lopez, Jaime; Dennison, Meagen; Liang, Yuanyuan; Turner, Barbara J
2016-10-28
Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: 1) snowball sampling, a chain-referral method, or 2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention, as well as priorities from both communities' stakeholders, based on mean ratings of their ideas by importance and feasibility for implementation in their community. Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85%) consented, 52 (95%) attended the first meeting, and 36 (65%) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90%) consented, 36 (58%) attended the first meeting, and 26 (42%) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P < 0.05). Despite differing recruitment strategies, stakeholders from the two communities identified largely similar ideas for research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045), which was higher for the purposive/convenience sampling group, and of city improvements/transportation services (P = 0.004), which was higher for the snowball sampling group. In each of the two similar hard-to-reach communities, a community advisory board partnered with researchers…
Sampling Methods and the Accredited Population in Athletic Training Education Research
Carr, W. David; Volberding, Jennifer
2009-01-01
Context: We describe methods of sampling the widely studied, yet poorly defined, population of accredited athletic training education programs (ATEPs). Objective: There are two purposes to this study: first, to describe the incidence and types of sampling methods used in athletic training education research, and second, to clearly define the…
Twenty-year trends of authorship and sampling in applied biomechanics research.
Knudson, Duane
2012-02-01
This study documented the trends in authorship and sampling in applied biomechanics research published in the Journal of Applied Biomechanics and ISBS Proceedings. Original research articles of the 1989, 1994, 1999, 2004, and 2009 volumes of these serials were reviewed, excluding reviews, modeling papers, technical notes, and editorials. Compared to 1989 volumes, the mean number of authors per paper significantly increased (35 and 100%, respectively) in the 2009 volumes, along with increased rates of hyperauthorship, and a decline in rates of single authorship. Sample sizes varied widely across papers and did not appear to change since 1989.
Are samples drawn from Mechanical Turk valid for research on political ideology?
Directory of Open Access Journals (Sweden)
Scott Clifford
2015-12-01
Full Text Available Amazon’s Mechanical Turk (MTurk is an increasingly popular tool for the recruitment of research subjects. While there has been much focus on the demographic differences between MTurk samples and the national public, we know little about whether liberals and conservatives recruited from MTurk share the same psychological dispositions as their counterparts in the mass public. In the absence of such evidence, some have argued that the selection process involved in joining MTurk invalidates the subject pool for studying questions central to political science. In this paper, we evaluate this claim by comparing a large MTurk sample to two benchmark national samples – one conducted online and one conducted face-to-face. We examine the personality and value-based motivations of political ideology across the three samples. All three samples produce substantively identical results with only minor variation in effect sizes. In short, liberals and conservatives in our MTurk sample closely mirror the psychological divisions of liberals and conservatives in the mass public, though MTurk liberals hold more characteristically liberal values and attitudes than liberals from representative samples. Overall, our results suggest that MTurk is a valid recruitment tool for psychological research on political ideology.
Hayashi-Takagi, Akiko; Vawter, Marquis P; Iwamoto, Kazuya
2014-06-15
Peripheral samples, such as blood and skin, have been used for decades in psychiatric research as surrogates for central nervous system samples. Although the validity of the data obtained from peripheral samples has been questioned, and other state-of-the-art techniques, such as human brain imaging, genomics, and induced pluripotent stem cells, seem to reduce the value of peripheral cells, accumulating evidence has suggested that revisiting peripheral samples is worthwhile. Here, we re-evaluate the utility of peripheral samples and argue that establishing an understanding of the common signaling and biological processes in the brain and peripheral samples is required for the validity of such models. First, we present an overview of the available types of peripheral cells and describe their advantages and disadvantages. We then briefly summarize the main achievements of omics studies, including epigenome, transcriptome, proteome, and metabolome analyses, as well as the main findings of functional cellular assays, the results of which imply that alterations in neurotransmission, metabolism, the cell cycle, and the immune system may be partially responsible for the pathophysiology of major psychiatric disorders such as schizophrenia. Finally, we discuss the future utility of peripheral samples for the development of biomarkers and tailor-made therapies, such as multimodal assays that are used as a battery of disease and trait pathways and that might be potent and complementary tools for use in psychiatric research. © 2013 Society of Biological Psychiatry. Published by Society of Biological Psychiatry. All rights reserved.
Comparing two sampling methods to engage hard-to-reach communities in research priority setting
Directory of Open Access Journals (Sweden)
Melissa A. Valerio
2016-10-01
Full Text Available Abstract Background Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. Methods In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: 1) snowball sampling, a chain-referral method, or 2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention, as well as priorities from both communities' stakeholders, based on mean ratings of their ideas by importance and feasibility for implementation in their community. Results Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85%) consented, 52 (95%) attended the first meeting, and 36 (65%) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90%) consented, 36 (58%) attended the first meeting, and 26 (42%) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P < 0.05). Despite differing recruitment strategies, stakeholders from the two communities identified largely similar ideas for research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045), which was higher for the purposive/convenience sampling group, and of city improvements/transportation services (P = 0.004), which was higher for the snowball sampling group…
Directory of Open Access Journals (Sweden)
Timothy C. Guetterman
2015-05-01
Full Text Available Although recommendations exist for determining qualitative sample sizes, the literature appears to contain few instances of research on the topic. Practical guidance is needed for determining sample sizes to conduct rigorous qualitative research, to develop proposals, and to budget resources. The purpose of this article is to describe qualitative sample size and sampling practices within published studies in education and the health sciences by research design: case study, ethnography, grounded theory methodology, narrative inquiry, and phenomenology. I analyzed the 51 most highly cited studies using predetermined content categories and noteworthy sampling characteristics that emerged. In brief, the findings revealed a mean sample size of 87. Less than half of the studies identified a sampling strategy. I include a description of findings by approach and recommendations for sampling to assist methodologists, reviewers, program officers, graduate students, and other qualitative researchers in understanding qualitative sampling practices in recent studies. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1502256
Guetterman, Timothy C.
2015-01-01
Although recommendations exist for determining qualitative sample sizes, the literature appears to contain few instances of research on the topic. Practical guidance is needed for determining sample sizes to conduct rigorous qualitative research, to develop proposals, and to budget resources. The purpose of this article is to describe qualitative sample size and sampling practices within published studies in education and the health sciences by research design: case study, ethnography, grounded theory methodology, narrative inquiry, and phenomenology…
Improving the quality of biomarker discovery research: the right samples and enough of them.
Pepe, Margaret S; Li, Christopher I; Feng, Ziding
2015-06-01
Biomarker discovery research has yielded few biomarkers that validate for clinical use. A contributing factor may be poor study designs. The goal in discovery research is to identify a subset of potentially useful markers from a large set of candidates assayed on case and control samples. We recommend the PRoBE design for selecting samples. We propose sample size calculations that require specifying: (i) a definition for biomarker performance; (ii) the proportion of useful markers the study should identify (Discovery Power); and (iii) the tolerable number of useless markers amongst those identified (False Leads Expected, FLE). We apply the methodology to a study of 9,000 candidate biomarkers for risk of colon cancer recurrence where a useful biomarker has positive predictive value ≥ 30%. We find that 40 patients with recurrence and 160 without recurrence suffice to filter out 98% of useless markers (2% FLE) while identifying 95% of useful biomarkers (95% Discovery Power). Alternative methods for sample size calculation required more assumptions. Biomarker discovery research should utilize quality biospecimen repositories and include sample sizes that enable markers meeting prespecified performance characteristics for well-defined clinical applications to be identified. The scientific rigor of discovery research should be improved. ©2015 American Association for Cancer Research.
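The FLE/Discovery Power trade-off above lends itself to a back-of-envelope check: the expected number of false leads is the count of useless markers times the per-marker false-pass rate, and the expected useful yield is Discovery Power times the count of truly useful markers. The prevalence and pass rates below are illustrative assumptions, not the paper's exact calculation.

```python
# Back-of-envelope screen planning, loosely following the FLE / Discovery
# Power framing described above. All numbers are illustrative assumptions.
n_candidates = 9000
frac_useful = 0.01                      # assumed prevalence of useful markers
n_useful = int(n_candidates * frac_useful)
n_useless = n_candidates - n_useful

pass_rate_useless = 0.02                # screen lets through 2% of useless markers
pass_rate_useful = 0.95                 # 95% Discovery Power

false_leads = n_useless * pass_rate_useless     # expected useless markers identified
true_leads = n_useful * pass_rate_useful        # expected useful markers identified
print(f"Expected false leads: {false_leads:.0f}")
print(f"Expected useful markers identified: {true_leads:.0f}")
```

A study design is then judged by whether a tolerable `false_leads` can be achieved at acceptable `pass_rate_useful`, given the case/control sample sizes.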
Williamson, Graham R
2003-11-01
This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
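A randomization (permutation) test of the kind proposed here derives its P-value from reshuffling group labels rather than from an assumption of random sampling, so it remains interpretable for convenience samples. A minimal sketch, with hypothetical data:

```python
import random

def randomization_test(a, b, n_perm=10000, seed=42):
    """Two-sample randomization test on the difference in means.

    Unlike a conventional P-value, this does not assume the groups were
    randomly sampled from a population -- only that the group labels are
    exchangeable under the null hypothesis.
    """
    rng = random.Random(seed)
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        perm_a, perm_b = pooled[:len(a)], pooled[len(a):]
        diff = sum(perm_a) / len(perm_a) - sum(perm_b) / len(perm_b)
        if abs(diff) >= abs(observed):
            count += 1
    return count / n_perm

# Illustrative scores from a hypothetical convenience sample:
treatment = [5.1, 4.8, 6.0, 5.7, 5.3]
control = [4.2, 4.5, 4.0, 4.9, 4.3]
print(f"randomization P-value: {randomization_test(treatment, control):.4f}")
```

The P-value is the share of label reshuffles producing a mean difference at least as extreme as the observed one.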
Fries, M. D.; Allen, C. C.; Calaway, M. J.; Evans, C. A.; Stansbery, E. K.
2015-01-01
Curation of NASA's astromaterials sample collections is a demanding and evolving activity that supports valuable science from NASA missions for generations, long after the samples are returned to Earth. For example, NASA continues to loan hundreds of Apollo program samples to investigators every year and those samples are often analyzed using instruments that did not exist at the time of the Apollo missions themselves. The samples are curated in a manner that minimizes overall contamination, enabling clean, new high-sensitivity measurements and new science results over 40 years after their return to Earth. As our exploration of the Solar System progresses, upcoming and future NASA sample return missions will return new samples with stringent contamination control, sample environmental control, and Planetary Protection requirements. Therefore, an essential element of a healthy astromaterials curation program is a research and development (R&D) effort that characterizes and employs new technologies to maintain current collections and enable new missions - an Advanced Curation effort. JSC's Astromaterials Acquisition & Curation Office is continually performing Advanced Curation research, identifying and defining knowledge gaps about research, development, and validation/verification topics that are critical to support current and future NASA astromaterials sample collections. The following are highlighted knowledge gaps and research opportunities.
Neutron activation analysis of bulk samples from Chinese ancient porcelain to provenance research
International Nuclear Information System (INIS)
Jian Zhu; Wentao Hao; Jianming Zhen; Tongxiu Zhen; Glascock, M.D.
2013-01-01
Neutron activation analysis (NAA) is an important technique for determining the provenance of ancient ceramics. The most common technique for preparing ancient samples for NAA is to grind them into a powder and then encapsulate them before neutron irradiation. Unfortunately, ceramic materials are typically very hard, making it a challenge to grind them into a powder. In this study we utilize bulk porcelain samples cut from ancient shards. The bulk samples are irradiated by neutrons alongside samples that have been conventionally ground into a powder. The NAA results for both the bulk samples and the powders are compared and shown to provide equivalent information regarding their chemical composition. Multivariate statistical methods have also been applied to the analytical data to check consistency. The findings suggest that NAA results are less dependent on the state of the porcelain sample, and thus bulk samples cut from shards may be used to effectively determine their provenance. (author)
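A minimal sketch of the kind of multivariate consistency check mentioned above: standardize each element's concentration across samples, then compare the distance between a bulk sample and its powdered counterpart with the distance to a different shard. Element names and concentrations are illustrative, not the study's measurements.

```python
import math

# Hypothetical NAA element concentrations (ppm); values are illustrative only.
samples = {
    "bulk_shard_1":   {"Fe": 9800,  "Rb": 145, "Th": 21.0, "La": 38.0},
    "powder_shard_1": {"Fe": 9750,  "Rb": 148, "Th": 20.6, "La": 37.5},
    "bulk_shard_2":   {"Fe": 15200, "Rb": 92,  "Th": 14.1, "La": 25.2},
}
elements = ["Fe", "Rb", "Th", "La"]

def zscores(data):
    """Standardize each element across samples so distances are not
    dominated by high-concentration elements like Fe."""
    out = {}
    for el in elements:
        vals = [s[el] for s in data.values()]
        mean = sum(vals) / len(vals)
        sd = math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals))
        for name, s in data.items():
            out.setdefault(name, {})[el] = (s[el] - mean) / sd
    return out

def distance(z, a, b):
    """Euclidean distance in standardized composition space."""
    return math.sqrt(sum((z[a][el] - z[b][el]) ** 2 for el in elements))

z = zscores(samples)
d_same = distance(z, "bulk_shard_1", "powder_shard_1")
d_diff = distance(z, "bulk_shard_1", "bulk_shard_2")
print(f"bulk vs powder (same shard): {d_same:.2f}")
print(f"bulk vs bulk (different shards): {d_diff:.2f}")
```

A small `d_same` relative to `d_diff` is the kind of pattern that supports treating bulk and powdered preparations as chemically equivalent.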
African Journals Online (AJOL)
This gap in the training of nurse educators may result in low in- and output in the research … Which factors influence the manner in which PG nursing students perceive the … sample consisted of females (83.9%; n=47), with males representing only 16.1% (n=9). … Winsett and Cashion[20] assert that a research method.
Consistency of color representation in smart phones.
Dain, Stephen J; Kwan, Benjamin; Wong, Leslie
2016-03-01
One of the barriers to the construction of consistent computer-based color vision tests has been the variety of monitors and computers. Consistency of color on a variety of screens has necessitated calibration of each setup individually. Color vision examination with a carefully controlled display has, as a consequence, been a laboratory rather than a clinical activity. Inevitably, smart phones have become a vehicle for color vision tests. They have the advantage that the processor and screen are associated, and there are fewer models of smart phones than permutations of computers and monitors. Colorimetric consistency of display within a model may be a given. It may extend across models from the same manufacturer but is unlikely to extend between manufacturers, especially where technologies vary. In this study, we measured the same set of colors in a JPEG file displayed on 11 samples of each of four models of smart phone (iPhone 4s, iPhone 5, Samsung Galaxy S3, and Samsung Galaxy S4) using a Photo Research PR-730. The iPhones have white-LED-backlit LCDs and the Samsungs have OLED displays. The color gamut varies between models; comparison with sRGB space shows 61%, 85%, 117%, and 110%, respectively. The iPhones differ markedly from the Samsungs and from one another. This indicates that model-specific color lookup tables will be needed. Within each model, the primaries were quite consistent (despite the age of phone varying within each sample). The worst case in each model was the blue primary; the 95th percentile limits in the v' coordinate were ±0.008 for the iPhone 4s and ±0.004 for the other three models. The u'v' variation in white points was ±0.004 for the iPhone 4s and ±0.002 for the others, although the spread of white points between models was u'v' ±0.007. The differences are essentially the same for primaries at low luminance. The variations in colors intermediate between the primaries (e.g., red-purple, orange) mirror the variations in the primaries. The variation in…
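The u'v' coordinates cited above are CIE 1976 UCS chromaticities, computed from tristimulus XYZ as u' = 4X/(X + 15Y + 3Z) and v' = 9Y/(X + 15Y + 3Z). A short sketch, using the D65 white point as a worked example; the tolerance helper mirrors the ±u'v' spreads reported above:

```python
def uv_prime(X, Y, Z):
    """CIE 1976 u'v' chromaticity from tristimulus XYZ."""
    denom = X + 15 * Y + 3 * Z
    return 4 * X / denom, 9 * Y / denom

# D65 white point tristimulus values (Y normalized to 1).
u, v = uv_prime(0.95047, 1.0, 1.08883)
print(f"u' = {u:.4f}, v' = {v:.4f}")   # D65 lies near u' = 0.1978, v' = 0.4683

def within_tolerance(u1, v1, u2, v2, tol=0.004):
    """Check whether two white points agree within a u'v' radius,
    e.g. the ~0.004 within-model spread reported for most phones."""
    return ((u1 - u2) ** 2 + (v1 - v2) ** 2) ** 0.5 <= tol
```

Such a check could flag individual handsets whose white point drifts outside the within-model tolerance.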
Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D
This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.
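As a sketch of the consistency measure described above (the standard deviation of clinic performance within a health system), with hypothetical clinic scores; this omits the study's propensity-score matching of proxy systems:

```python
from statistics import mean, pstdev

# Hypothetical diabetes-measure performance (share of patients meeting
# intermediate outcome targets) for clinics grouped by health system.
clinics = {
    "system_A": [0.62, 0.60, 0.64, 0.61],   # tight spread: consistent care
    "system_B": [0.48, 0.71, 0.55, 0.66],   # wide spread: inconsistent care
}

for system, scores in clinics.items():
    print(f"{system}: mean={mean(scores):.3f}, "
          f"within-system SD={pstdev(scores):.3f}")
```

A lower within-system SD at a comparable mean is the "consistent and high-performance" combination the abstract highlights.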
Chemical and Metallurgy Research (CMR) Sample Tracking System Design Document
International Nuclear Information System (INIS)
Bargelski, C. J.; Berrett, D. E.
1998-01-01
The purpose of this document is to describe the system architecture of the Chemical and Metallurgy Research (CMR) Sample Tracking System at Los Alamos National Laboratory. Over the course of the document, observations are made concerning the objectives, constraints and limitations, technical approaches, and the technical deliverables.
Radiological air monitoring and sample analysis research and development progress report
International Nuclear Information System (INIS)
1992-01-01
Sponsored by a Department of Energy (DOE) research and development grant, State of Idaho INEL Oversight Program (OP) personnel designed an independent air monitoring system that provides detection of the presence of priority airborne contaminants potentially migrating beyond INEL boundaries. Initial locations for off-site ambient air monitoring stations were chosen based on DOE and NOAA reports, Mesodif modeling, a review of the relevant literature, and communication with private contractors and experts in pertinent fields. Idaho State University (ISU) has initiated an Environmental Monitoring Program (EMP). The EMP provides an independent monitoring function as well as a training ground for students. Students learn research techniques dedicated to environmental studies and learn analytical skills and rules of compliance related to monitoring. The ISU-EMP assisted the OP in specific aspects of identifying optimum permanent monitoring station locations and in selecting appropriate sample collection equipment for each station. The authorization to establish, prepare, and install sampling devices on selected sites was obtained by OP personnel in conjunction with ISU-EMP personnel. All samples described in this program are collected by OP or ISU-EMP personnel and returned to ISU for analysis. This report summarizes the results of the samples collected and analyzed for radioactivity during 1992.
Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Hanna, R. D.; Ketcham, R. A.
2016-01-01
New technologies make possible the advancement of documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. With increasing demands for accessibility to updated comprehensive data, and with new sample return missions on the horizon, it is of primary importance to develop new standards for contemporary documentation and visualization methodologies. Our interdisciplinary team has expertise in the fields of heritage conservation practices, professional photography, photogrammetry, imaging science, application engineering, data curation, geoscience, and astromaterials curation. Our objective is to create virtual 3D reconstructions of Apollo Lunar and Antarctic Meteorite samples that are a fusion of two state-of-the-art data sets: the interior view of the sample by collecting Micro-XCT data and the exterior view of the sample by collecting high-resolution precision photography data. These new data provide researchers an information-rich visualization of both compositional and textural information prior to any physical sub-sampling. Since January 2013 we have developed a process that resulted in the successful creation of the first image-based 3D reconstruction of an Apollo Lunar Sample correlated to a 3D reconstruction of the same sample's Micro-XCT data, illustrating that this technique is both operationally possible and functionally beneficial. In May of 2016 we began a 3-year research period during which we aim to produce Virtual Astromaterials Samples for 60 high-priority Apollo Lunar and Antarctic Meteorite samples and serve them on NASA's Astromaterials Acquisition and Curation website. Our research demonstrates that research-grade Virtual Astromaterials Samples are beneficial in preserving for posterity a precise 3D reconstruction of the sample prior to sub-sampling, which greatly improves documentation practices, provides unique and novel visualization of the sample's interior and…
DEFF Research Database (Denmark)
Køster, Brian; Søndergaard, Jens; Nielsen, Jesper Bo
2018-01-01
An important feature of questionnaire validation is reliability. To be able to measure a given concept by questionnaire validly, the reliability needs to be high. The objectives of this study were to examine the reliability of attitude and knowledge and the behavioral consistency of sunburn in a developed questionnaire for monitoring and evaluating population sun-related behavior. Sun-related behavior, attitude and knowledge were measured weekly by a questionnaire in the summer of 2013 among 664 Danes. Reliability was tested in a test-retest design. Consistency of behavioral information was tested similarly … in protection behavior was low. To our knowledge, this is the first study to report reliability for a completely validated questionnaire on sun-related behavior in a national random population-based sample. Further, we show that attitude and knowledge questions confirmed their validity with good reliability.
Consistency of self-reported alcohol consumption on randomized and sequential alcohol purchase tasks
Directory of Open Access Journals (Sweden)
Michael eAmlung
2012-07-01
Full Text Available Behavioral economic demand for addictive substances is commonly assessed via purchase tasks that measure estimated drug consumption at a range of prices. Purchase tasks typically use escalating prices in sequential order, which may influence performance by providing explicit price reference points. This study investigated the consistency of value preferences on two alcohol purchase tasks (APTs) that used either a randomized or a sequential price order (price range: free to $30 per drink) in a sample of ninety-one young adult monthly drinkers. Randomization of prices significantly reduced relative response consistency (p < .01), although absolute consistency was high for both versions (>95%). Self-reported alcohol consumption across prices and indices of demand were highly similar across versions, although a few notable exceptions were found. These results suggest generally high consistency and overlapping performance between randomized and sequential price assessment. Implications for the behavioral economics literature and priorities for future research are discussed.
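A simplified sketch of one way to score response consistency on a purchase task administered in randomized price order: sort the responses by price and count how often estimated consumption fails to decrease. The criterion and data are illustrative assumptions, not the authors' scoring algorithm.

```python
# Sketch: scoring response consistency on an alcohol purchase task.
# A response set is treated as "consistent" here if estimated consumption
# never increases as price rises (a simplified, assumed criterion).

def consistency(price_consumption_pairs):
    """Fraction of adjacent price steps (after sorting by price) where
    consumption does not increase."""
    ordered = sorted(price_consumption_pairs)        # sort by price
    steps = list(zip(ordered, ordered[1:]))
    ok = sum(1 for (_, c1), (_, c2) in steps if c2 <= c1)
    return ok / len(steps)

# Randomized-order administration: (price in dollars, drinks purchased),
# listed in the shuffled order the participant hypothetically saw.
responses = [(1, 6), (30, 0), (0, 8), (8, 2), (0.5, 7), (15, 1), (4, 4), (2, 5)]
print(f"consistency: {consistency(responses):.0%}")
```

The same function applied to sequential-order data allows a direct comparison of the two administration formats.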
Valle Mansilla, José Ignacio
2011-01-01
Biomedical researchers now often ask subjects to donate samples to be deposited in biobanks. This is not only of interest to researchers; patients and society as a whole can benefit from the improvements in diagnosis, treatment, and prevention that the advent of genomic medicine portends. However, there is a growing debate regarding the social and ethical implications of creating biobanks and using stored human tissue samples for genomic research. Our aim was to identify factors related to scientists' and patients' preferences regarding the sort of information to convey to subjects about the results of the study and the risks related to genomic research. The method used was a survey addressed to 204 scientists and 279 donors from the U.S. and Spain. In this sample, the researchers had already published genomic epidemiology studies, and the research subjects had actually volunteered to donate a human sample for genomic research. Regarding the results, patients supported their right to know individual results from future genomic research more frequently than scientists did. These differences were statistically significant after adjusting for the opportunity to receive genetic research results from the research in which they had previously participated and for their perception of the risks of genetic information compared to other clinical data. A slight majority of researchers supported informing participants about individual genomic results only if the reliability and clinical validity of the information had been established. Men were more likely than women to believe that patients should be informed of research results even if these conditions were not met. Among patients, almost half would always prefer to be informed about individual results from future genomic research. The three main factors associated with greater support for unrestricted access to individual results were: being from the US, having previously been offered individual information, and considering…
The memory failures of everyday questionnaire (MFE): internal consistency and reliability.
Montejo Carrasco, Pedro; Montenegro, Peña Mercedes; Sueiro, Manuel J
2012-07-01
The Memory Failures of Everyday Questionnaire (MFE) is one of the most widely-used instruments to assess memory failures in daily life. The original scale has nine response options, making it difficult to apply; we created a three-point scale (0-1-2) with response choices that make it easier to administer. We examined the two versions' equivalence in a sample of 193 participants between 19 and 64 years of age. The test-retest reliability and internal consistency of the version we propose were also computed in a sample of 113 people. Several indicators attest to the two forms' equivalence: the correlation between the items' means (r = .94; p MFE 1-9. The MFE 0-2 provides a brief, simple evaluation, so we recommend it for use in clinical practice as well as research.
Evaluating Temporal Consistency in Marine Biodiversity Hotspots.
Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S
2015-01-01
With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight-year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight-year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
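The two biodiversity metrics used to delineate hotspots are simple to compute; a minimal sketch with hypothetical species counts for one grid cell (not the survey's data):

```python
import math

def richness(counts):
    """Species richness: number of species with at least one individual."""
    return sum(1 for c in counts if c > 0)

def shannon_diversity(counts):
    """Shannon's H' = -sum(p_i * ln p_i) over the proportions of each species."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical benthic-fish counts for one grid cell in one survey year.
cell = [12, 7, 3, 0, 1]
print(richness(cell))                     # 4 species present
print(round(shannon_diversity(cell), 3))  # H' is below ln(4), the even-abundance maximum
```

A cell would then be flagged as a hotspot in a given year if its metric meets or exceeds the mean threshold for that year (or across all years, in the pooled variant the authors compare).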
Connecting Research to Teaching: Using Data to Motivate the Use of Empirical Sampling Distributions
Lee, Hollylynne S.; Starling, Tina T.; Gonzalez, Marggie D.
2014-01-01
Research shows that students often struggle with understanding empirical sampling distributions. Using hands-on and technology-based models and simulations of problems generated by real data helps students begin to make connections between repeated sampling, sample size, distribution, variation, and center. A task to assist teachers in implementing…
Hartwell, Erica E; Serovich, Julianne M; Reed, Sandra J; Boisvert, Danielle; Falbo, Teresa
2017-07-01
The purpose of this study is to review samples from research on gay, lesbian, and bisexual (GLB) issues and to evaluate the suitability of this body of research to support affirmative and evidence-based practice with GLB clients. The authors systematically reviewed the sampling methodology and sample composition of GLB-related research. All original, quantitative articles focusing on GLB issues published in couple and family therapy (CFT)-related journals since 1975 were coded (n = 153). Results suggest that within the GLB literature base there is some evidence of heterocentrism as well as neglect of issues of class, race, and gender. Suggestions are provided for improving the diversity and representativeness of samples, and thus the clinical implications, of GLB-related research in the CFT literature. © 2017 American Association for Marriage and Family Therapy.
Structural Consistency, Consistency, and Sequential Rationality.
Kreps, David M; Ramey, Garey
1987-01-01
Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...
Directory of Open Access Journals (Sweden)
Meizhen Liao
Full Text Available BACKGROUND: Routine surveillance using convenience sampling found low prevalence of HIV and syphilis among female sex workers in China. Two consecutive surveys using respondent-driven sampling were conducted in 2008 and 2009 to examine the prevalence of HIV and syphilis among female sex workers in Jinan, China. METHODS: A face-to-face interview was conducted to collect demographic, behavioral and service utilization information using a structured questionnaire. Blood samples were drawn for serological tests of HIV-1 antibody and syphilis antibody. The Respondent Driven Sampling Analysis Tool was used to generate population-level estimates. RESULTS: In 2008 and 2009, 363 and 432 subjects were recruited and surveyed, respectively. Prevalence of syphilis was 2.8% in 2008 and 2.2% in 2009, while no HIV case was found in either year. Results are comparable to those from the routine sentinel surveillance system in the city. Only 60.8% of subjects in 2008 and 48.3% in 2009 reported consistent condom use with clients during the past month. Over 50% of subjects had not been covered by any HIV-related services in the past year, with only 15.6% of subjects in 2008 and 13.1% in 2009 ever tested for HIV. CONCLUSIONS: Despite the low prevalence of syphilis and HIV, risk behaviors are common. Targeted interventions to promote safe sex and utilization of existing intervention services are still needed to keep the epidemic from growing.
Using the Perceptron Algorithm to Find Consistent Hypotheses
Anthony, M.; Shawe-Taylor, J.
1993-01-01
The perceptron learning algorithm yields quite naturally an algorithm for finding a linearly separable boolean function consistent with a sample of such a function. Using the idea of a specifying sample, we give a simple proof that this algorithm is not efficient, in general.
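The algorithm the abstract refers to can be sketched in a few lines; this is a generic perceptron run on a hypothetical linearly separable boolean sample (the AND function), not the authors' construction:

```python
def perceptron(samples, max_epochs=1000):
    """Return weights w and bias b consistent with every (x, y) in samples,
    where y is +1 or -1. Classic perceptron update: on a mistake, add y*x."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for x, y in samples:
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]  # nudge hyperplane toward x
                b += y
                mistakes += 1
        if mistakes == 0:  # hypothesis is now consistent with the whole sample
            return w, b
    raise ValueError("no consistent hypothesis found within epoch budget")

# Boolean AND is linearly separable: positive only on input (1, 1).
sample = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, b = perceptron(sample)
```

The abstract's point is that although this loop always terminates on separable data, the number of updates can grow very large for some specifying samples, so the procedure is not efficient in general.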
Mayer, B; Muche, R
2013-01-01
Animal studies are highly relevant for basic medical research, although their use is publicly controversial. An optimal sample size for these projects should therefore be aimed for from a biometrical point of view. Statistical sample size calculation is usually the appropriate methodology for planning medical research projects. However, the required information is often not valid or only becomes available during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.
Personality consistency in dogs: a meta-analysis.
Fratkin, Jamie L; Sinn, David L; Patall, Erika A; Gosling, Samuel D
2013-01-01
Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.
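Pooling correlation-based consistency estimates like the r = 0.43 above is typically done on Fisher's z scale; a minimal fixed-effect sketch with hypothetical study values (not the 31 studies analyzed here, and not necessarily the authors' exact aggregation procedure):

```python
import math

def fisher_mean_r(rs, ns):
    """Fixed-effect weighted mean correlation: transform each r to
    Fisher's z = atanh(r), weight by n - 3, back-transform with tanh."""
    zs = [math.atanh(r) for r in rs]
    weights = [n - 3 for n in ns]
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return math.tanh(z_bar)

# Hypothetical test-retest correlations and sample sizes from three studies.
r_pooled = fisher_mean_r([0.35, 0.50, 0.45], [40, 60, 25])
```

Working on the z scale keeps the sampling distribution approximately normal, which is why meta-analyses of correlations conventionally average z rather than r directly.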
Sample Identification at Scale - Implementing IGSN in a Research Agency
Klump, J. F.; Golodoniuc, P.; Wyborn, L. A.; Devaraju, A.; Fraser, R.
2015-12-01
Earth sciences are largely observational and rely on natural samples, the types of which vary significantly between disciplines. Sharing and referencing of samples in the scientific literature and across the Web requires globally unique identifiers, which are essential for disambiguation. This practice is very common in other fields, e.g. the ISBN in publishing and the DOI in scientific literature. In the Earth sciences, however, referencing is still often done in an ad-hoc manner without unique identifiers. The International Geo Sample Number (IGSN) system provides a persistent, globally unique label for identifying environmental samples. As an IGSN allocating agency, CSIRO implements the IGSN registration service at the organisational scale with contributions from multiple research groups. The Capricorn Distal Footprints project is one of the first adopters of the technology in Australia. For this project, IGSN provides a mechanism for identification of new and legacy samples, as well as derived sub-samples. It will ensure transparency and reproducibility in geochemical sampling campaigns that involve a diversity of sampling methods: diverse geochemical and isotopic results can be linked back to the parent sample, particularly where multiple children of that sample have also been analysed. The IGSN integration for this project is still in its early stages and requires further consultation on the governance mechanisms needed to allow efficient collaboration within CSIRO and with partners on the project, including naming conventions, service interfaces, etc. In this work, we present the results of the initial implementation of IGSN in the context of the Capricorn Distal Footprints project. The study has so far demonstrated the effectiveness of the proposed approach, while maintaining the flexibility to adapt to various media types, which is critical in a multi-disciplinary project.
Directory of Open Access Journals (Sweden)
Patrice L. Capers
2015-03-01
Full Text Available BACKGROUND: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high throughput methods (e.g., search heuristics, crowdsourcing has improved feasibility of large meta-research questions, but possibly at the cost of accuracy. OBJECTIVE: To evaluate the use of double sampling combined with multiple imputation (DS+MI to address meta-research questions, using as an example adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT guidelines for titles and abstracts. METHODS: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT; human; abstract available; and English language (n=322,107. For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower rigor, higher throughput (RLOTHI method using search heuristics, while the subsample was evaluated using a higher rigor, lower throughput (RHITLO human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. RESULTS: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title=1.00, abstract=0.92. Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS+MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by Year: subsample RHITLO 1.050-1.174 vs. DS+MI 1.082-1.151. As evidence of improved accuracy, DS+MI coefficient estimates were closer to RHITLO than the large sample RLOTHI. CONCLUSIONS: Our results support our hypothesis that DS+MI would result in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of
Directory of Open Access Journals (Sweden)
CODRUŢA DURA
2010-01-01
Full Text Available The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy of estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of probabilistic methods which can be used within marketing research and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide the information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.
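The random-number procedure the paper implements in Microsoft Excel can be sketched in a few lines of Python; the 500-unit frame here is a hypothetical illustration:

```python
import random

def simple_random_sample(frame, n, seed=None):
    """Simple random sampling without replacement: every unit in the
    frame has an equal chance of selection."""
    rng = random.Random(seed)  # seeding makes the draw reproducible
    return rng.sample(frame, n)

# Hypothetical sampling frame: 500 numbered population units.
frame = list(range(1, 501))
chosen = simple_random_sample(frame, 50, seed=42)
```

Sampling without replacement guarantees 50 distinct units, each drawn with equal probability, which is exactly the requirement the abstract states for simple random sampling.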
Life Science Research Sample Transfer Technology for On Orbit Analysis, Phase II
National Aeronautics and Space Administration — With retirement of the space shuttle program, microgravity researchers can no longer count on bringing experiment samples back to earth for post-flight analysis....
Scholey, A B; Owen, L; Gates, J; Rodgers, J; Buchanan, T; Ling, J; Heffernan, T; Swan, P; Stough, C; Parrott, A C
2011-01-01
Our group has conducted several Internet investigations into the biobehavioural effects of self-reported recreational use of MDMA (3,4-methylenedioxymethamphetamine or Ecstasy) and other psychoactive drugs. Here we report a new study examining the relationship between self-reported Ecstasy use and traces of MDMA found in hair samples. In a laboratory setting, 49 undergraduate volunteers performed an Internet-based assessment which included mood scales and the University of East London Drug Use Questionnaire, which asks about past and current drug use. They also provided a hair sample for determination of exposure to MDMA over the previous month. Self-report of Ecstasy use and presence of MDMA in hair samples were consistent; hair-detected exposure was associated with lower happiness and higher self-reported stress. Self-reported Ecstasy use, but not presence in hair, was also associated with decreased tension. Different psychoactive drugs can influence long-term mood and cognition in complex and dynamically interactive ways. Here we have shown a good correspondence between self-report and objective assessment of exposure to MDMA. These data suggest that the Internet has potentially high utility as a useful medium to complement traditional laboratory studies into the sequelae of recreational drug use. Copyright © 2010 S. Karger AG, Basel.
Sample size estimation and sampling techniques for selecting a representative sample
Directory of Open Access Journals (Sweden)
Aamir Omair
2014-01-01
Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of error) of the study. The greater the precision required, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because the results of the study can then be generalized to the target population.
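For a categorical outcome, the factors listed above combine into the standard formula for estimating a proportion; a sketch using Cochran's formula with an optional finite-population correction (the z values are the usual two-sided critical values):

```python
import math

Z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}  # two-sided z critical values

def sample_size_proportion(p, margin, confidence=0.95, population=None):
    """n0 = z^2 * p * (1 - p) / margin^2, optionally shrunk by the
    finite-population correction n = n0 / (1 + (n0 - 1) / N)."""
    z = Z[confidence]
    n0 = z ** 2 * p * (1 - p) / margin ** 2
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

# Expected proportion 50%, +/- 5% precision, 95% confidence.
n = sample_size_proportion(0.5, 0.05)  # 385
n_small = sample_size_proportion(0.5, 0.05, population=1000)
```

Using p = 0.5 is the conservative default when the expected proportion is unknown, since p(1 - p) is largest there; smaller target populations reduce the required n via the correction.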
Consistent Estimation of Continuous-Time Signals from Nonlinear Transformations of Noisy Samples,
1980-03-10
t, then h_n is given by (5) (with W = n) and represents the Szász operator. Theorem 3.0, while guaranteeing mean-square consistency of the estimate S_W(t), provides no bounds on the rate of convergence. We shall derive such bounds for linear systems h_W corresponding to the class of generalized Szász operators [6] (see below) and to the Bernstein operator. While the Szász operator (5) can be generated as in Proposition 3.0, the class of generalized
Consistency of the least weighted squares under heteroscedasticity
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2011-01-01
Roč. 2011, č. 47 (2011), s. 179-206 ISSN 0023-5954 Grant - others:GA UK(CZ) GA402/09/055 Institutional research plan: CEZ:AV0Z10750506 Keywords : Regression * Consistency * The least weighted squares * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-consistency of the least weighted squares under heteroscedasticity.pdf
Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation
Lindell, Annukka K.
2017-01-01
Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals’ selfie corpora. PMID:28270790
Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne
2018-01-01
Introduction: Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods: The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association for Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results: The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39%, respectively. Twenty-three calls were dialed to produce each eligible contact: nonresponse was substantial due to the automated calling system and the dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions: The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in
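The four rates reported above follow AAPOR-style definitions; a simplified sketch with hypothetical call dispositions (the official AAPOR formulas distinguish several variants, e.g. RR1 through RR6, and adjust for cases of unknown eligibility, which this illustration omits):

```python
def outcome_rates(complete, partial, refusal, noncontact, other):
    """Simplified AAPOR-style survey outcome rates over known-eligible cases."""
    eligible = complete + partial + refusal + noncontact + other
    response = complete / eligible                       # completes over all eligible
    cooperation = complete / (complete + partial + refusal)  # completes over contacts reached
    refusal_rate = refusal / eligible
    contact = (complete + partial + refusal + other) / eligible
    return response, cooperation, refusal_rate, contact

# Hypothetical dispositions, not the Ghana survey's actual counts.
rates = outcome_rates(complete=500, partial=100, refusal=50, noncontact=330, other=20)
```

Separating the denominators this way is what lets a survey have a low response rate (many never-contacted numbers) alongside a high cooperation rate (most people reached agree to participate), the pattern reported in this study.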
Directory of Open Access Journals (Sweden)
R. Eric Heidel
2016-01-01
Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
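For the common two-group comparison of means, the five components above reduce to a closed-form approximation; a sketch using the normal approximation (an exact t-based calculation gives a slightly larger n):

```python
import math
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sample comparison of means,
    normal approximation: n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2,
    where d is the standardized effect size (Cohen's d)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# Medium effect (d = 0.5), two-sided alpha = .05, power = .80.
n = n_per_group(0.5)  # 63 per group
```

The formula makes the isomorphism the article describes concrete: halving the effect size quadruples the required n, while demanding more power or a stricter alpha inflates the z terms and hence n.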
Shemesh, Eyal; Mitchell, Jeffrey; Neighbors, Katie; Feist, Susan; Hawkins, Andre; Brown, Amanda; Wanrong, Yin; Anand, Ravinder; Stuber, Margaret L; Annunziato, Rachel A
2017-12-01
Medication adherence is an important determinant of transplant outcomes. Attempts to investigate adherence are frequently undermined by selection bias: It is very hard to recruit and retain non-adherent patients in research efforts. This manuscript presents recruitment strategies and results from the MALT (Medication Adherence in children who had a Liver Transplant) multisite prospective cohort study. MALT sites recruited 400 pediatric liver transplant patients who agreed to be followed for 2 years. The primary purpose was to determine whether a marker of adherence, the Medication Level Variability Index (MLVI), predicts rejection outcomes. The present manuscript describes methods used in MALT to ensure that a representative sample was recruited, and presents detailed recruitment results. MALT sites were able to recruit a nationally representative sample, as determined by a comparison between the MALT cohort and a national sample of transplant recipients. Strategies that helped ensure that the sample was representative included monitoring of the outcome measure in comparison with a national sample, drastically limiting patient burden, and specific recruitment methods. We discuss the importance of a representative sample in adherence research and recommend that future efforts to study adherence pay special attention to sample characteristics. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Directory of Open Access Journals (Sweden)
Jan Zavadsky
2014-07-01
Full Text Available Purpose: The performance management system (PMS) is a metasystem over all business processes at the strategic and operational levels. The effectiveness of the various management systems depends on many factors, one of which is the consistent definition of each system's elements. The main purpose of this study is to explore whether the performance management systems of the sample companies are consistent and how companies can create such a system. Consistency in this case is based on the homogeneous definition of attributes relating to the performance indicator as a basic element of the PMS. Methodology: At the beginning, we used an affinity diagram that helped us to clarify and group the various attributes of performance indicators. The main research results were achieved through an empirical study carried out in a sample of Slovak companies. The criterion for selection was the existence of management systems certified according to ISO 9001. Representativeness of the sample companies was confirmed by application of Pearson's chi-squared test (χ² test) with respect to the above standards. Findings: Drawing on a review of the literature, we defined four groups of attributes relating to the performance indicator: formal attributes, attributes of target value, informational attributes and attributes of evaluation. The whole set contains 21 attributes. The consistency of a PMS is based not on a maximum or minimum number of attributes, but on the same types of attributes being used for each performance indicator in the PMS at both the operational and strategic levels. The main findings are: companies use various financial and non-financial indicators at the strategic or operational level; companies determine various attributes of performance indicators, but most of the performance indicators are determined differently; we identified the common attributes for the whole sample of companies. Practical implications: The research results have implications for
Ethics and law in research with human biological samples: a new approach.
Petrini, Carlo
2014-01-01
During the last century a large number of documents (regulations, ethical codes, treatises, declarations, conventions) were published on the subject of ethics and clinical trials, many of them focusing on the protection of research participants. More recently various proposals have been put forward to relax some of the constraints imposed on research by these documents and regulations. It is important to distinguish between risks deriving from direct interventions on human subjects and other types of risk. In Italy the Data Protection Authority has acted in the question of research using previously collected health data and biological samples to simplify the procedures regarding informed consent. The new approach may be of help to other researchers working outside Italy.
Types of non-probabilistic sampling used in marketing research. „Snowball” sampling
Manuela Rozalia Gabor
2007-01-01
A significant way of investigating a firm’s market is statistical sampling. Sampling typology distinguishes probabilistic and non-probabilistic models of gathering information, and this paper provides thorough information on network sampling, known as “snowball” sampling. This type of sampling makes it possible to survey how decision power manifests within an organisation and the interpersonal relation network governing a certain collectivity, such as a consumer panel. The snowball s...
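The referral mechanics of snowball sampling can be sketched as a toy simulation. The function name, the wave/referral limits, and the example network below are all illustrative assumptions, not from the paper:

```python
import random

def snowball_sample(network, seeds, referrals=2, waves=3, rng=None):
    """Snowball sampling over a contact network (adjacency dict).

    In each wave, every newly recruited respondent names up to
    `referrals` contacts who are not yet in the sample.
    """
    rng = rng or random.Random()
    sampled = set(seeds)
    frontier = list(seeds)
    for _ in range(waves):
        next_frontier = []
        for person in frontier:
            contacts = [c for c in network[person] if c not in sampled]
            for c in rng.sample(contacts, min(referrals, len(contacts))):
                sampled.add(c)
                next_frontier.append(c)
        frontier = next_frontier
    return sampled

# A small invented contact network; "A" is the initial seed respondent.
network = {
    "A": ["B", "C", "D"], "B": ["A", "E"], "C": ["A", "F"],
    "D": ["A"], "E": ["B", "F"], "F": ["C", "E"],
}
print(sorted(snowball_sample(network, ["A"], rng=random.Random(1))))
```

Because recruits are reached only through existing members' ties, the final sample reflects the network's structure rather than a sampling frame, which is exactly why the method is classed as non-probabilistic.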
Yam, Eileen A; Okal, Jerry; Musyoki, Helgar; Muraguri, Nicholas; Tun, Waimar; Sheehy, Meredith; Geibel, Scott
2016-03-01
To examine whether nonbarrier modern contraceptive use is associated with less consistent condom use among Kenyan female sex workers (FSWs). Researchers recruited 579 FSWs using respondent-driven sampling. We conducted multivariate logistic regression to examine the association between consistent condom use and female-controlled nonbarrier modern contraceptive use. A total of 98.8% reported using male condoms in the past month, and 64.6% reported using female-controlled nonbarrier modern contraception. In multivariate analysis, female-controlled nonbarrier modern contraceptive use was not associated with decreased condom use with clients or nonpaying partners. Consistency of condom use is not compromised when FSWs use available female-controlled nonbarrier modern contraception. FSWs should be encouraged to use condoms consistently, whether or not other methods are used simultaneously. Copyright © 2016 Elsevier Inc. All rights reserved.
Ngwakongnwi, Emmanuel; King-Shier, Kathryn M; Hemmelgarn, Brenda R; Musto, Richard; Quan, Hude
2014-01-01
Francophones who live outside the primarily French-speaking province of Quebec, Canada, risk being excluded from research by lack of a sampling frame. We examined the adequacy of random sampling, advertising, and respondent-driven sampling for recruitment of francophones for survey research. We recruited francophones residing in the city of Calgary, Alberta, through advertising and respondent-driven sampling. These 2 samples were then compared with a random subsample of Calgary francophones derived from the 2006 Canadian Community Health Survey (CCHS). We assessed the effectiveness of advertising and respondent-driven sampling in relation to the CCHS sample by comparing demographic characteristics and selected items from the CCHS (specifically self-reported general health status, perceived weight, and having a family doctor). We recruited 120 francophones through advertising and 145 through respondent-driven sampling; the random sample from the CCHS consisted of 259 records. The samples derived from advertising and respondent-driven sampling differed from the CCHS in terms of age (mean ages 41.0, 37.6, and 42.5 years, respectively), sex (proportion of males 26.1%, 40.6%, and 56.6%, respectively), education (college or higher 86.7%, 77.9%, and 59.1%, respectively), place of birth (immigrants accounting for 45.8%, 55.2%, and 3.7%, respectively), and not having a regular medical doctor (16.7%, 34.5%, and 16.6%, respectively). Differences were not tested statistically because of limitations on the analysis of CCHS data imposed by Statistics Canada. The samples generated exclusively through advertising and respondent-driven sampling were not representative of the gold standard sample from the CCHS. Use of such biased samples for research studies could generate misleading results.
Research on self-absorption corrections for laboratory γ spectral analysis of soil samples
International Nuclear Information System (INIS)
Tian Zining; Jia Mingyan; Li Huibin; Cheng Ziwei; Ju Lingjun; Shen Maoquan; Yang Xiaoyan; Yan Ling; Fen Tiancheng
2010-01-01
Based on the calibration results of point sources, the dimensions of the HPGe crystal were characterized. Linear attenuation coefficients and detection efficiencies of all kinds of samples were calculated, and the function F(μ) of a φ75 mm x 25 mm sample was established. A standard surface source was used to simulate sources at different heights in the soil sample, and the function ε(h), which reflects the relationship between detection efficiency and the height of the surface source, was determined. The detection efficiency of a calibration source can then be obtained by integration. The F(μ) functions established for soil samples are consistent with the results of the MCNP calculation code. Several φ75 mm x 25 mm soil samples were measured with the HPGe spectrometer, and the function F(μ) was used to correct for self-absorption. F(μ) functions of soil samples of various dimensions can be calculated with the established MCNP code, and the self-absorption correction can then be applied; to verify the calculated results, φ75 mm x 75 mm soil samples were measured. Several φ75 mm x 25 mm soil samples from an atmospheric nuclear testing field were also measured with the HPGe spectrometer, and the function F(μ) was used to correct for self-absorption. The function F(μ) was established, and a technical method for correcting soil samples from unknown areas is also given. The self-absorption correction method based on surface sources greatly improves the measurement accuracy of gamma spectra, and it will be widely applied in environmental radioactivity investigations. (authors)
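The integration step described above, combining the surface-source efficiency ε(h) with attenuation exp(-μh) over the sample height, can be sketched numerically. This is a simplified model with an invented ε(h); the paper derives its actual F(μ) and ε(h) from calibration sources and MCNP:

```python
from math import exp

def volume_efficiency(eps, mu, height, n=1000):
    """Average detection efficiency of a homogeneous volume source.

    Trapezoidal integration of eps(h) * exp(-mu * h) over the sample
    height, divided by the height: eps(h) is the surface-source
    efficiency at depth h, mu the linear attenuation coefficient.
    """
    dh = height / n
    total = 0.0
    for i in range(n + 1):
        h = i * dh
        w = 0.5 if i in (0, n) else 1.0  # trapezoid end-point weights
        total += w * eps(h) * exp(-mu * h)
    return total * dh / height

def self_absorption_factor(eps, mu, height):
    """F(mu): ratio of attenuated to unattenuated efficiency."""
    return volume_efficiency(eps, mu, height) / volume_efficiency(eps, 0.0, height)

# Assumed geometric fall-off of efficiency with source depth (illustrative).
eps = lambda h: 0.05 / (1.0 + 0.1 * h)
print(self_absorption_factor(eps, mu=0.2, height=2.5))
```

The correction factor lies between 0 and 1 and shrinks as μ grows, which is the qualitative behaviour the measured F(μ) must reproduce.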
Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G
2015-10-01
Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
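The core move, clustering participants by their binary code profiles, can be sketched with a plain k-means. This is a minimal illustrative implementation with a deterministic initialization, not the authors' procedure, and the toy profiles are invented:

```python
def kmeans_binary(profiles, k=2, iters=10):
    """Plain k-means on 0/1 code profiles (for binary vectors the
    squared Euclidean distance equals the Hamming distance).

    Assumes at least k distinct profiles; initialization is the first
    k distinct profiles, kept deterministic for illustration.
    """
    centroids = []
    for p in profiles:
        if len(centroids) < k and list(map(float, p)) not in centroids:
            centroids.append(list(map(float, p)))
    assign = [0] * len(profiles)
    for _ in range(iters):
        # assignment step: nearest centroid per profile
        for i, p in enumerate(profiles):
            assign[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # update step: coordinate-wise mean of each cluster's members
        for c in range(k):
            members = [p for i, p in enumerate(profiles) if assign[i] == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign

# Two obvious groups of interview code profiles (invented):
profiles = [(1, 1, 1, 0, 0)] * 5 + [(0, 0, 0, 1, 1)] * 5
print(kmeans_binary(profiles))  # first five in one cluster, last five in the other
```

With clean groups like these, all three methods the simulations compared (hierarchical, k-means, latent class) recover the same partition; the study's point is that this remains workable even at n around 50.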
Fuller, Daniel; Gauvin, Lise; Fournier, Michel; Kestens, Yan; Daniel, Mark; Morency, Patrick; Drouin, Louis
2012-04-01
Active living is a broad conceptualization of physical activity that incorporates domains of exercise; recreational, household, and occupational activities; and active transportation. Policy makers develop and implement a variety of transportation policies that can influence choices about how to travel from one location to another. In making such decisions, policy makers act in part in response to public opinion or support for proposed policies. Measures of the public's support for policies aimed at promoting active transportation can inform researchers and policy makers. This study examined the internal consistency, and concurrent and discriminant validity of a newly developed measure of the public's support for policies for active living in transportation (PAL-T). A series of 17 items representing potential policies for promoting active transportation was generated. Two samples of participants (n = 2,001 and n = 2,502) from Montreal, Canada, were recruited via random digit dialling. Analyses were conducted on the combined data set (n = 4,503). Participants were aged 18 through 94 years (58% female). The concurrent and discriminant validity of the PAL-T was assessed by examining relationships with physical activity and smoking. To explore the usability of the PAL-T, predicted scale scores were compared to the summed values of responses. Results showed that the internal consistency of the PAL-T was 0.70. Multilevel regression demonstrated no relationship between the PAL-T and smoking status (p > 0.05) but significant relationships with utilitarian walking (p < 0.05). Measures of public opinion can inform policy makers and support advocacy efforts aimed at making built environments more suitable for active transportation while allowing researchers to examine the antecedents and consequences of public support for policies.
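Internal-consistency figures such as the reported 0.70 are conventionally Cronbach's alpha, which can be computed directly from item scores. This is a generic sketch: the function name and the toy data are illustrative, since the study does not publish item-level scores:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    `items` is one list of scores per scale item, respondents aligned
    by index: alpha = k/(k-1) * (1 - sum of item variances / variance
    of the respondents' total scores).
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Two perfectly correlated items give alpha = 1 (invented data):
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))
```

Alpha near 0.70, as for the PAL-T, is usually read as acceptable for a newly developed scale, though the threshold is a convention rather than a test.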
De Groot, Mark C.H.; Schlienger, Raymond; Reynolds, Robert; Gardarsdottir, Helga; Juhaeri, Juhaeri; Hesse, Ulrik; Gasse, Christiane; Rottenkolber, Marietta; Schuerch, Markus; Kurz, Xavier; Klungel, Olaf H.
2013-01-01
Background: Pharmacoepidemiological (PE) research should provide consistent, reliable and reproducible results to contribute to the benefit-risk assessment of medicines. IMI-PROTECT aims to identify sources of methodological variations in PE studies using a common protocol and analysis plan across
Griffiths, Robert B.
2001-11-01
Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. A comprehensive account, written by one of the main figures in the field; paperback edition of a successful work on the philosophy of quantum mechanics.
Underwater Sediment Sampling Research
2017-01-01
impacted sediments was found to be directly related to the concentration of crude oil detected in the sediment pore waters. Applying this mathematical... The USCG R&D Center sought to develop a bench top system to determine the amount of total... scattered. The approach here is to sample the interstitial water between the grains of sand and attempt to determine the amount of oil in and on
Consistency analysis of network traffic repositories
Lastdrager, Elmer; Lastdrager, E.E.H.; Pras, Aiko
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for
Energy Technology Data Exchange (ETDEWEB)
Wright, Phillip M.; Ruth, Kathryn A.; Langton, David R.; Bullett, Michael J.
1990-03-30
The Earth Science Laboratory of the University of Utah Research Institute has been involved in research in geothermal exploration and development for the past eleven years. Our work has resulted in the publication of nearly 500 reports, which are listed in this document. Over the years, we have collected drill chip and core samples from more than 180 drill holes in geothermal areas, and most of these samples are available to others for research, exploration and similar purposes. We hope that scientists and engineers involved in industrial geothermal development will find our technology transfer and service efforts helpful.
Self-consistent asset pricing models
Malevergne, Y.; Sornette, D.
2007-08-01
self-consistency condition derives a risk-factor decomposition in the multi-factor case which is identical to the principal component analysis (PCA), thus providing a direct link between model-driven and data-driven constructions of risk factors. This correspondence shows that PCA will therefore suffer from the same limitations as the CAPM and its multi-factor generalization, namely lack of out-of-sample explanatory power and predictability. In the multi-period context, the self-consistency conditions force the betas to be time-dependent with specific constraints.
Shedding consistency of strongyle-type eggs in Dutch boarding horses
Dopfer, D.D.V.; Kerssens, C.M.; Meijer, Y.G.M.; Boersema, J.H.; Eysker, M.
2004-01-01
Faeces of 484 horses were sampled twice with an interval of 6 weeks while anthelmintic therapy was halted. Faecal egg counts revealed that 267 (55.2%) horses had consistently low numbers of eggs per gram faeces (EPG ≤ 100), 155 (32.0%) horses had consistently high EPGs (EPG >
McCreesh, Nicky; Tarsh, Matilda Nadagire; Seeley, Janet; Katongole, Joseph; White, Richard G
2013-01-01
Respondent-driven sampling (RDS) is a widely-used variant of snowball sampling. Respondents are selected not from a sampling frame, but from a social network of existing members of the sample. Incentives are provided for participation and for the recruitment of others. Ethical and methodological criticisms have been raised about RDS. Our purpose was to evaluate whether these criticisms were justified. In this study RDS was used to recruit male household heads in rural Uganda. We investigated community members' understanding and experience of the method, and explored how these may have affected the quality of the RDS survey data. Our findings suggest that because participants recruit participants, the use of RDS in medical research may result in increased difficulties in gaining informed consent, and data collected using RDS may be particularly susceptible to bias due to differences in the understanding of key concepts between researchers and members of the community.
Gel dosimetry - a laser based 3D scanner for gel samples - research in India
Energy Technology Data Exchange (ETDEWEB)
Widmer, Johannes [Institut fuer Angewandte Photophysik, TU Dresden (Germany); Photonics Division, VIT University, Vellore, Tamil Nadu (India); Dhiviyaraj Kalaiselven, Senthil Kumar [Photonics Division, VIT University, Vellore, Tamil Nadu (India); Department of Therapeutic Radiology, University of Minnesota, Minneapolis (United States); James, Jebaseelan Samuel [Photonics Division, VIT University, Vellore, Tamil Nadu (India)
2013-07-01
A laser based 3D scanner is developed to take tomography images of partly transparent samples. The scanner is optimized to characterize gel samples from spatially resolved dosimetry measurements. The resulting device had to be designed so that it could be constructed in India. This gave me valuable insight into the scientific and technological environment of the country and made me find my way through a quite different culture of research and commerce, within and beyond the scientific context of the university. The project was implemented during a nine-month stay at the Vellore Institute of Technology University in Vellore, Tamil Nadu, India, in co-operation with the Christian Medical College, Vellore, in 2006/07. It was conducted within the framework of existing research activities of the host university.
Le Mens, Gaël; Denrell, Jerker
2011-04-01
Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them. Here, we show that this "naivety" assumption is not necessary. Systematically biased judgments can emerge even when decision makers process available information perfectly and are also aware of how the information sample has been generated. Specifically, we develop a rational analysis of Denrell's (2005) experience sampling model, and we prove that when information search is interested rather than disinterested, even rational information sampling and processing can give rise to systematic patterns of errors in judgments. Our results illustrate that a tendency to favor alternatives for which outcome information is more accessible can be consistent with rational behavior. The model offers a rational explanation for behaviors that had previously been attributed to cognitive and motivational biases, such as the in-group bias or the tendency to prefer popular alternatives. 2011 APA, all rights reserved
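The "hot stove" logic of interested sampling, in which a poor early experience cuts off further sampling, can be illustrated with a toy simulation. The parameters and structure below are my assumptions in the spirit of Denrell's experience-sampling model, not the authors' formal analysis:

```python
import random

def mean_final_impression(n_agents=2000, p=0.5, extra_draws=9, seed=42):
    """Toy version of interested ('hot stove') sampling.

    Each agent rates an option with true success rate p. After a bad
    first experience the agent avoids the option and keeps its estimate;
    after a good one it samples `extra_draws` more times. Every agent
    averages its own observations correctly, yet the population's mean
    impression ends up below p.
    """
    rng = random.Random(seed)
    impressions = []
    for _ in range(n_agents):
        draws = [rng.random() < p]
        if draws[0]:  # good first experience -> keep sampling
            draws += [rng.random() < p for _ in range(extra_draws)]
        impressions.append(sum(draws) / len(draws))
    return sum(impressions) / n_agents

print(mean_final_impression())  # systematically below the true rate 0.5
```

No agent here processes information incorrectly; the bias comes entirely from the fact that negative estimates are never revised because they stop further sampling, which is the paper's central point.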
High-performance speech recognition using consistency modeling
Digalakis, Vassilios; Murveit, Hy; Monaco, Peter; Neumeyer, Leo; Sankar, Ananth
1994-12-01
The goal of SRI's consistency modeling project is to improve the raw acoustic modeling component of SRI's DECIPHER speech recognition system and develop consistency modeling technology. Consistency modeling aims to reduce the number of improper independence assumptions used in traditional speech recognition algorithms so that the resulting speech recognition hypotheses are more self-consistent and, therefore, more accurate. At the initial stages of this effort, SRI focused on developing the appropriate base technologies for consistency modeling. We first developed the Progressive Search technology that allowed us to perform large-vocabulary continuous speech recognition (LVCSR) experiments. Since its conception and development at SRI, this technique has been adopted by most laboratories, including other ARPA contracting sites, doing research on LVCSR. Another goal of the consistency modeling project is to attack difficult modeling problems, when there is a mismatch between the training and testing phases. Such mismatches may include outlier speakers, different microphones and additive noise. We were able to either develop new, or transfer and evaluate existing, technologies that adapted our baseline genonic HMM recognizer to such difficult conditions.
International Nuclear Information System (INIS)
Rodriguez Gual, Maritza; Mas Milian, Felix; Deppman, Airton; Pinto Coelho, Paulo Rogerio
2010-01-01
To meet the demand for an experimental biological-sample positioning system for irradiations on a radial channel of an operating nuclear research reactor, a device was constructed and commissioned that places and removes biological samples from the irradiation channels without interrupting the operation of the reactor. Economic evaluations were made by comparison with another type of device with the same functions. This work formed part of an international project between Cuba and Brazil that undertook the study of the damage induced by various types of ionizing radiation in DNA molecules. The proposed solution was experimentally tested, demonstrating the practical validity of the device. As a result of this work, the experimental device for biological-sample irradiations has been installed and operating in radial beam hole No. 3 (BH3) of the IEA-R1 Brazilian research reactor for more than five years, in accordance with the requirements set for the device. The designed device considerably broadens the types of studies that can be conducted at this reactor, and its practical application in research taking place at that facility, in fields such as radiobiology and dosimetry, is immediate
Donnellan, M. Brent; Kenny, David A.; Trzesniewski, Kali H.; Lucas, Richard E.; Conger, Rand D.
2012-01-01
The present research used a latent variable trait-state model to evaluate the longitudinal consistency of self-esteem during the transition from adolescence to adulthood. Analyses were based on ten administrations of the Rosenberg Self-Esteem scale (Rosenberg, 1965) spanning the ages of approximately 13 to 32 for a sample of 451 participants. Results indicated that a completely stable trait factor and an autoregressive trait factor accounted for the majority of the variance in latent self-esteem assessments, whereas state factors accounted for about 16% of the variance in repeated assessments of latent self-esteem. The stability of individual differences in self-esteem increased with age consistent with the cumulative continuity principle of personality development. PMID:23180899
Bootstrap-Based Inference for Cube Root Consistent Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi
This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed.
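For contrast, the standard nonparametric bootstrap the note refers to looks like this for a root-n consistent statistic such as the sample mean, where it does work. This is a generic sketch with invented data; the note's point is precisely that this naive scheme fails for cube-root-rate estimators like the maximum score estimator:

```python
import random
from statistics import mean, stdev

def bootstrap_se(data, stat=mean, reps=2000, seed=0):
    """Standard nonparametric bootstrap standard error of a statistic:
    resample n observations with replacement, recompute the statistic,
    and take the spread of the replicates."""
    rng = random.Random(seed)
    n = len(data)
    replicates = [stat([rng.choice(data) for _ in range(n)]) for _ in range(reps)]
    return stdev(replicates)

data = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9, 2.2, 3.6]
print(bootstrap_se(data))  # close to the analytic stdev(data)/sqrt(n)
```

The paper's remedy keeps this resampling loop but replaces the criterion function defining `stat` with a reshaped version, restoring consistency for the problematic estimator class.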
Consistency Analysis of Nearest Subspace Classifier
Wang, Yi
2015-01-01
The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...
Exploring Technostress: Results of a Large Sample Factor Analysis
Directory of Open Access Journals (Sweden)
Steponas Jonušauskas
2016-06-01
Full Text Available With reference to the results of a large-sample factor analysis, the article aims to propose a frame for examining technostress in a population. A survey and principal component analysis of a sample of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the answer dispersion. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents’ answers, revealing technostress causes and consequences as well as technostress prevalence in the population in a statistically validated pattern. Key elements of technostress identified in the factor analysis can serve for the construction of technostress measurement scales in further research.
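The "per cent of dispersion explained" figure comes from the eigenvalues of the covariance structure underlying a principal component analysis; for two variables the eigenvalues have a closed form. This is a minimal illustrative sketch with invented data, not the authors' 68-item analysis:

```python
from math import sqrt
from statistics import variance, mean

def first_pc_share(x, y):
    """Share of total variance explained by the first principal
    component of two variables, via the closed-form eigenvalues of
    the 2x2 sample covariance matrix [[a, b], [b, c]]."""
    a, c = variance(x), variance(y)
    mx, my = mean(x), mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (len(x) - 1)
    lam1 = (a + c + sqrt((a - c) ** 2 + 4 * b * b)) / 2  # larger eigenvalue
    return lam1 / (a + c)  # total variance = trace = sum of eigenvalues

x = [1, 2, 3, 4, 5]
y = [2, 4, 6, 8, 10]          # perfectly collinear with x
print(first_pc_share(x, y))   # 1.0: one component carries all the variance
```

With 68 real questionnaire items the same trace/eigenvalue logic applies, just in 68 dimensions, and the first 13 components there account for the reported 59.13 per cent.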
Culture, cross-role consistency, and adjustment: testing trait and cultural psychology perspectives.
Church, A Timothy; Anderson-Harumi, Cheryl A; del Prado, Alicia M; Curtis, Guy J; Tanaka-Matsumi, Junko; Valdez Medina, José L; Mastor, Khairul A; White, Fiona A; Miramontes, Lilia A; Katigbak, Marcia S
2008-09-01
Trait and cultural psychology perspectives on cross-role consistency and its relation to adjustment were examined in 2 individualistic cultures, the United States (N=231) and Australia (N=195), and 4 collectivistic cultures, Mexico (N=199), the Philippines (N=195), Malaysia (N=217), and Japan (N=180). Cross-role consistency in trait ratings was evident in all cultures, supporting trait perspectives. Cultural comparisons of mean consistency provided support for cultural psychology perspectives as applied to East Asian cultures (i.e., Japan) but not collectivistic cultures more generally. Some but not all of the hypothesized predictors of consistency were supported across cultures. Cross-role consistency predicted aspects of adjustment in all cultures, but prediction was most reliable in the U.S. sample and weakest in the Japanese sample. Alternative constructs proposed by cultural psychologists (personality coherence, social appraisal, and relationship harmony) predicted adjustment in all cultures but were not, as hypothesized, better predictors of adjustment in collectivistic cultures than in individualistic cultures.
Güleda Doğan
2017-01-01
This editorial is on statistical sampling, which is one of the two most important reasons for editorial rejection from our journal Turkish Librarianship. The stages of quantitative research, the stage at which sampling occurs, the importance of sampling for a study, deciding on sample size, and sampling methods are summarised briefly.
Consistency in color parameters of a commonly used shade guide.
Tashkandi, Esam
2010-01-01
The use of shade guides to assess the color of natural teeth subjectively remains one of the most common means of dental shade assessment. Any variation in the color parameters of different shade guides may have significant clinical implications, particularly since communication between the clinic and the dental laboratory is based on the shade guide designation. The purpose of this study was to investigate the consistency of the L∗a∗b∗ color parameters of a sample of a commonly used shade guide. The color parameters of a total of 100 VITAPAN Classical Vacuum shade guides (VITA Zahnfabrik, Bad Säckingen, Germany) were measured using an X-Rite ColorEye 7000A Spectrophotometer (Grand Rapids, Michigan, USA). Each shade guide consists of 16 tabs with different designations. Each shade tab was measured five times and the average values were calculated. The ΔE between the average L∗a∗b∗ value for each shade tab and the average of the 100 shade tabs of the same designation was calculated. Using Student's t-test analysis, no significant differences were found among the measured sample. There is a high level of consistency in the color parameters of the VITAPAN Classical Vacuum shade guide sample tested.
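The ΔE colour difference used in the study above is, in its simplest (CIE76) form, the Euclidean distance between two L∗a∗b∗ triples. A minimal sketch, with invented tab readings standing in for the study's actual measurements:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) triples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Hypothetical readings: one shade tab vs. the mean of all tabs with the
# same designation (the study's real values are not reproduced here).
tab = (78.2, 1.4, 16.9)
designation_mean = (77.8, 1.2, 17.3)
print(round(delta_e_cie76(tab, designation_mean), 2))
```

Small ΔE values between a tab and its designation mean are what the abstract's "high level of consistency" refers to; ΔE near or below 1 is commonly treated as imperceptible.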
Personality consistency analysis in cloned quarantine dog candidates
Directory of Open Access Journals (Sweden)
Jin Choi
2017-01-01
In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups using the modified Puppy Aptitude Test, which consists of ten subtests, to ascertain the influence of genetic identity. In this test, puppies are exposed to a stranger, restraint, a prey-like object, noise, a startling object, etc. Six cloned and four control puppies participated, and the consistency of responses at ages 7–10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991), those of the control group were not (P = 0.0089). Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group; however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and that some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned dogs. This study implies that personality consistency could be one way to analyse the traits of puppies.
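The consistency comparison above rests on testing whether paired scores change between the two test ages. As an illustrative sketch only (the study's raw scores and exact statistical test are not given in the abstract), a sign-flip permutation test on hypothetical paired puppy scores:

```python
import itertools

def sign_flip_p(before, after):
    """Two-sided sign-flip permutation test on paired differences.
    A large p-value is consistent with no change between sessions."""
    diffs = [b - a for b, a in zip(before, after)]
    observed = abs(sum(diffs))
    count = total = 0
    for signs in itertools.product((1, -1), repeat=len(diffs)):
        total += 1
        if abs(sum(s * d for s, d in zip(signs, diffs))) >= observed:
            count += 1
    return count / total

# Hypothetical scores for six puppies at the two ages (invented numbers).
week_7_10 = [5, 4, 6, 5, 3, 4]
week_16   = [5, 5, 6, 4, 3, 4]
print(sign_flip_p(week_7_10, week_16))  # prints 1.0: fully consistent pairs
```

A high p-value, as in the cloned group's P = 0.7991, means the score changes between sessions are indistinguishable from chance, i.e. the behaviour is consistent.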
Personality and Situation Predictors of Consistent Eating Patterns.
Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K
2015-01-01
A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.
Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S
2016-11-01
There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
Consistency of Network Traffic Repositories: An Overview
Lastdrager, E.; Lastdrager, E.E.H.; Pras, Aiko
2009-01-01
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for
A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples
Directory of Open Access Journals (Sweden)
Rígel Licier
2016-10-01
The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronchoalveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.
Directory of Open Access Journals (Sweden)
Chris Patterson
Human biological samples (biosamples) are increasingly important in diagnosing, treating and measuring the prevalence of illnesses. For the gay and bisexual population, biosample research is particularly important for measuring the prevalence of human immunodeficiency virus (HIV). By determining people's understandings of, and attitudes towards, the donation and use of biosamples, researchers can design studies to maximise acceptability and participation. In this study we examine gay and bisexual men's attitudes towards donating biosamples for HIV research. Semi-structured telephone interviews were conducted with 46 gay and bisexual men aged between 18 and 63, recruited in commercial gay scene venues in two Scottish cities. Interview transcripts were analysed thematically using the framework approach. Most men interviewed seemed to have given little prior consideration to the issues. Participants were largely supportive of donating tissue for medical research purposes, and often favourable towards samples being stored, reused and shared. Support was often conditional, with common concerns related to: informed consent; the protection of anonymity and confidentiality; the right to withdraw from research; and ownership of samples. Many participants were in favour of the storage and reuse of samples, but expressed concerns related to data security and potential misuse of samples, particularly by commercial organisations. The sensitivity of tissue collection varied between tissue types and collection contexts. Blood, urine, semen and bowel tissue were commonly identified as sensitive, while donating saliva was seen as unlikely to cause discomfort. To our knowledge, this is the first in-depth study of gay and bisexual men's attitudes towards donating biosamples for HIV research. While most men in this study were supportive of donating tissue for research, some clear areas of concern were identified. We suggest that these minority concerns should be accounted
Classification analysis of emotional appeals on sample Czech television commercials
Káčerková, Radka
2013-01-01
Abstract: The work deals with the possibilities of using emotional appeals in advertising. The main goal was the classification and definition of emotions and emotional appeals with regard to marketing. The work focused on emotional appeals in Czech TV adverts and examined how emotional appeals are used in these adverts. The research question concerned which emotional appeals appear most often in adverts. The research sample, consisting of 150 TV adverts, was divided i...
Sarpkaya, Ruhi
2010-01-01
The aim of this research is to determine the factors affecting individual education demands at the entrance to university. The research uses a survey model. The universe of the study consists of 1630 freshmen at the faculties and vocational schools of Adnan Menderes University, Aydin. 574 students from 7 schools were included in the sample. The…
Directory of Open Access Journals (Sweden)
M. M. Aligadjiev
2015-01-01
Aim. The paper discusses the improvement of methods of hydrobiological studies by modifying tools for collecting plankton and benthic samples. Methods. In order to improve the standard methods of hydrobiological research, we have developed tools for sampling zooplankton and the benthic environment of the Caspian Sea. Results. Long-term practice of collecting hydrobiological samples in the Caspian Sea shows that the sampling tools used to collect hydrobiological material require modernization. With the introduction of the invasive Azov and Black Sea comb jelly Mnemiopsis leidyi A. Agassiz to the Caspian Sea, there is a need to collect plankton samples without disturbing their integrity. Tools for collecting benthic fauna do not always give a complete picture of the state of benthic ecosystems because of the lack of visual site selection for sampling. Moreover, while sampling by dredge there is a probable loss of samples, especially in areas with difficult terrain. Conclusion. We propose to modify a small model of the Upstein net (applied in shallow water) to collect zooplankton samples, with an upper inverted cone that will significantly improve the catchability of the net in the Caspian Sea. The bottom sampler can be improved by installing a video camera for visual inspection of the bottom topography, and by using sensors to determine the tilt of the dredge and the position of the valves of the bucket.
[A comparison of convenience sampling and purposive sampling].
Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien
2014-06-01
Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling". Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, the opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation, not by statistical power analysis.
Yiadom, Maame Yaa A B; Scheulen, James; McWade, Conor M; Augustine, James J
2016-07-01
The objective was to obtain a commitment to adopt a common set of definitions for emergency department (ED) demographic, clinical process, and performance metrics among the ED Benchmarking Alliance (EDBA), ED Operations Study Group (EDOSG), and Academy of Academic Administrators of Emergency Medicine (AAAEM) by 2017. A retrospective cross-sectional analysis of available data from three ED operations benchmarking organizations supported a negotiation to use a set of common metrics with identical definitions. During a 1.5-day meeting, structured according to social change theories of information exchange, self-interest, and interdependence, common definitions were identified and negotiated using the EDBA's published definitions as a starting point for discussion. Methods from process analysis theory were used in the 8 weeks following the meeting to achieve official consensus on definitions. These two lists were submitted to the organizations' leadership for implementation approval. A total of 374 unique measures were identified, of which 57 (15%) were shared by at least two organizations. Fourteen (4%) were common to all three organizations. In addition to agreement on definitions for the 14 measures used by all three organizations, agreement was reached on universal definitions for 17 of the 57 measures shared by at least two organizations. The negotiation outcome was a list of 31 measures with universal definitions to be adopted by each organization by 2017. The use of negotiation, social change, and process analysis theories achieved the adoption of universal definitions among the EDBA, EDOSG, and AAAEM. This will impact performance benchmarking for nearly half of US EDs. It initiates a formal commitment to utilize standardized metrics, and it transitions consistency in reporting ED operations metrics from consensus to implementation. This work advances our ability to more accurately characterize variation in ED care delivery models, resource utilization, and performance. In
Draper, D. S.
2016-01-01
NASA Johnson Space Center's (JSC's) Astromaterials Research and Exploration Science (ARES) Division, part of the Exploration Integration and Science Directorate, houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists, most at no cost and on an informal basis. ARES has thus provided substantial leverage to many past and ongoing science projects at the national and international level. Here we propose to formalize that support via an ARES/JSC Planetary Sample Analysis and Mission Science Laboratory (PSAMS Lab). We maintain three major research capabilities: astromaterial sample analysis, planetary process simulation, and robotic-mission analog research. ARES scientists also support planning for eventual human exploration missions, including astronaut geological training. We outline our facility's capabilities and its potential service to the community at large which, taken together with longstanding ARES experience and expertise in curation and in applied mission science, enable multi-disciplinary planetary research that is possible at no other institution. Comprehensive campaigns incorporating sample data, experimental constraints, and mission science data can be conducted under one roof.
Tsujimura-Ito, Takako; Inoue, Yusuke; Muto, Kaori; Yoshida, Ken-Ichi
2017-04-01
Background: Leftover samples obtained during autopsies are extremely important basic materials for forensic research. However, there are no established practices for the research-related use of such samples. Objective: This study discusses good practice for the secondary use of samples collected during medicolegal autopsies. Methods: A questionnaire was posted to all 76 departments of forensic medicine performing medicolegal autopsies in Japan, and 48 responses were received (response rate: 63.2%). As a secondary analysis, we surveyed information provided on department websites. Results: Ethical reviews conducted when samples were to be used for research varied greatly among departments, with 21 (43.8%) departments reporting 'fundamentally, all cases are subject to review', eight (16.7%) reporting 'only some are subject to review' and 17 (39.6%) reporting 'none are subject to review'. Information made available on websites indicated that 11 departments had a statement of some type to bereaved families about the potential research use of human samples obtained during autopsies. Nine of these included a notice stating that bereaved families may revoke their consent for use. Several departments used an opt-out system. Conclusion: There is no common practice in the field of legal medicine on the ethical use of leftover samples from medicolegal autopsies for medical research. The trust of not only bereaved families but also society in general is required for the scientific validity and social benefits of medical studies using such samples, through the use of opt-out consenting and offline and online dissemination and public-relations activities.
Self-consistent Bayesian analysis of space-time symmetry studies
International Nuclear Information System (INIS)
Davis, E.D.
1996-01-01
We introduce a Bayesian method for the analysis of epithermal neutron transmission data on space-time symmetries in which unique assignment of the prior is achieved by maximisation of the cross entropy and the imposition of a self-consistency criterion. Unlike the maximum likelihood method used in previous analyses of parity-violation data, our method is freed of an ad hoc cutoff parameter. Monte Carlo studies indicate that our self-consistent Bayesian analysis is superior to the maximum likelihood method when applied to the small data samples typical of symmetry studies. (orig.)
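As a generic illustration of why a Bayesian estimate can outperform the maximum likelihood estimate on the small samples typical of symmetry studies (this sketch uses a simple conjugate normal prior centred on zero, not the paper's maximum-cross-entropy construction, and all numbers are invented):

```python
import random, statistics

random.seed(1)
true_mean, noise_sd = 0.0, 1.0    # hypothetical symmetry-violating parameter = 0
prior_mean, prior_sd = 0.0, 0.5   # assumed prior, not the paper's self-consistent one

def posterior_mean(sample, noise_sd, prior_mean, prior_sd):
    """Conjugate normal posterior mean: a precision-weighted average of
    the prior mean and the sample mean."""
    prec_prior = 1.0 / prior_sd ** 2
    prec_data = len(sample) / noise_sd ** 2
    return ((prec_prior * prior_mean + prec_data * statistics.fmean(sample))
            / (prec_prior + prec_data))

# With tiny samples the MLE (sample mean) fluctuates wildly, while the
# Bayesian estimate is pulled toward the prior and accumulates less error.
mle_err = bayes_err = 0.0
for _ in range(2000):
    sample = [random.gauss(true_mean, noise_sd) for _ in range(4)]
    mle_err += statistics.fmean(sample) ** 2
    bayes_err += posterior_mean(sample, noise_sd, prior_mean, prior_sd) ** 2
print(mle_err > bayes_err)   # shrinkage reduces mean squared error here
```

This is only the standard shrinkage argument; the paper's contribution is choosing the prior objectively via cross-entropy maximisation and a self-consistency criterion rather than fixing it by hand as done here.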
MultiSIMNRA: A computational tool for self-consistent ion beam analysis using SIMNRA
International Nuclear Information System (INIS)
Silva, T.F.; Rodrigues, C.L.; Mayer, M.; Moro, M.V.; Trindade, G.F.; Aguirre, F.R.; Added, N.; Rizzutto, M.A.; Tabacniks, M.H.
2016-01-01
Highlights: • MultiSIMNRA enables the self-consistent analysis of multiple ion beam techniques. • Self-consistent analysis enables unequivocal and reliable modeling of the sample. • Four different computational algorithms available for model optimizations. • Definition of constraints enables to include prior knowledge into the analysis. - Abstract: SIMNRA is widely adopted by the scientific community of ion beam analysis for the simulation and interpretation of nuclear scattering techniques for material characterization. Taking advantage of its recognized reliability and quality of simulation, we developed a computer program that uses multiple parallel sessions of SIMNRA to perform self-consistent analysis of data obtained by different ion beam techniques, or in different experimental conditions, for a given sample. In this paper, we present a result using MultiSIMNRA for a self-consistent multi-elemental analysis of a thin film produced by magnetron sputtering. The results demonstrate the potential of self-consistent analysis and its feasibility using MultiSIMNRA.
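The essence of the self-consistent analysis described above is that a single sample model must simultaneously fit data from several techniques, so one combined objective is minimised. A toy sketch, with invented linear "simulators" standing in for SIMNRA sessions and one free sample parameter (a layer thickness t):

```python
# Toy self-consistent fit: one thickness t must explain two "measurements"
# at once, so a single combined chi-square is minimised. The linear yield
# models and data points below are invented stand-ins for SIMNRA runs.

def simulate_technique_a(t):   # hypothetical yield model for technique A
    return 2.0 * t + 1.0

def simulate_technique_b(t):   # hypothetical yield model for technique B
    return 0.5 * t + 3.0

measured_a, sigma_a = 21.0, 0.5    # invented measured yields and errors
measured_b, sigma_b = 8.1, 0.3

def combined_chi2(t):
    return (((simulate_technique_a(t) - measured_a) / sigma_a) ** 2
            + ((simulate_technique_b(t) - measured_b) / sigma_b) ** 2)

# Coarse grid search over thickness; MultiSIMNRA itself offers four real
# optimisation algorithms for this step.
best_t = min((t / 100.0 for t in range(0, 2001)), key=combined_chi2)
print(round(best_t, 2))
```

Because both techniques constrain the same parameter, the combined fit resolves ambiguities that either dataset alone would leave open, which is the point of the self-consistent approach.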
Directory of Open Access Journals (Sweden)
Mohie Eldin Elmashad
2016-08-01
This paper presents the results of an experimental investigation performed to quantify the effect of mixing clayey soils with saltwater on consistency and swelling characteristics of clays. Massive natural clay deposits and compacted clay backfills either exist or are used in certain important and sensitive applications such as dams, liners, barriers and buffers in waste disposal facilities. In many cases, the clay deposits in these applications are subjected to saltwater. However, in standard laboratory classification tests, distilled or potable water are usually used in mixing test samples. This may lead to faulty interpretation of the actual in-situ consistency and volume change behaviors. In this research, an attempt is made to quantify the changes in consistency and swelling of clay soils from various locations around the Nile valley and possessing a wide range of consistency, when mixed with natural seawater with different salt concentrations. The results showed that the increase of the salt concentration of the mixing water may result in major decrease in the liquid limit and swelling characteristics of high plasticity montmorillonite clays. The reduction in the swelling of the clay soils is also proportional to the rate of saltwater infiltration. In an attempt to correlate the swelling of clays to the rate of water infiltration, a new simplified laboratory apparatus is proposed where swelling and infiltration are measured in one simple test “the swelling infiltrometer”.
Water-borne pollutant sampling using porous suction samplers
International Nuclear Information System (INIS)
Baig, M.A.
1997-01-01
The common standard method of sampling water-borne pollutants in the vadose zone is core sampling, followed by extraction of the pore fluid. This method does not allow repeated sampling at the same location. An alternative approach for sampling fluids (water-borne pollutants) from both saturated and unsaturated regions of the vadose zone uses porous suction samplers. There are three types of porous suction samplers: vacuum-operated samplers, pressure-vacuum lysimeters, and high-pressure-vacuum samplers. The suction samplers operate in the range of 0-70 centibars and usually consist of ceramic or polytetrafluoroethylene (PTFE). The operating range of PTFE cups is higher than that of ceramic cups. These samplers are well suited for in situ and repeated sampling from the same location. This paper discusses the physical properties and operating conditions of such samplers to be utilized in environmental sampling. (author)
Methodology Series Module 5: Sampling Strategies.
Setia, Maninder Singh
2016-01-01
Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice, from a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling (such as simple random sampling or stratified random sampling) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use purposive sampling for the study.
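The probability sampling methods named in the module can be sketched concretely. A minimal example on a hypothetical population with an invented 'urban'/'rural' stratum label (convenience sampling is deliberately absent, since it has no random mechanism to code):

```python
import random

random.seed(42)
# Hypothetical population of 300 people, two strata (labels are invented).
population = [{"id": i, "stratum": "urban" if i % 3 else "rural"}
              for i in range(300)]

# Simple random sample: every member has an equal chance of selection.
simple = random.sample(population, k=30)

# Stratified random sample: sample each stratum separately, in proportion
# to its share of the population.
def stratified_sample(pop, key, k):
    strata = {}
    for person in pop:
        strata.setdefault(person[key], []).append(person)
    out = []
    for members in strata.values():
        share = round(k * len(members) / len(pop))
        out.extend(random.sample(members, share))
    return out

stratified = stratified_sample(population, "stratum", 30)
print(len(simple), len(stratified))
```

Both draws are probability samples: selection probabilities are known in advance, which is exactly what a convenience sample lacks and why the module warns against mislabelling one as 'random'.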
Krause, Marita; Irwin, Judith; Wiegert, Theresa; Miskolczi, Arpad; Damas-Segovia, Ancor; Beck, Rainer; Li, Jiang-Tao; Heald, George; Müller, Peter; Stein, Yelena; Rand, Richard J.; Heesen, Volker; Walterbos, Rene A. M.; Dettmar, Ralf-Jürgen; Vargas, Carlos J.; English, Jayanne; Murphy, Eric J.
2018-03-01
Aim. The vertical halo scale height is a crucial parameter to understand the transport of cosmic-ray electrons (CRE) and their energy loss mechanisms in spiral galaxies. Until now, the radio scale height could only be determined for a few edge-on galaxies because of missing sensitivity at high resolution. Methods: We developed a sophisticated method for the scale height determination of edge-on galaxies. With this we determined the scale heights and radial scale lengths for a sample of 13 galaxies from the CHANG-ES radio continuum survey in two frequency bands. Results: The sample average values for the radio scale heights of the halo are 1.1 ± 0.3 kpc in C-band and 1.4 ± 0.7 kpc in L-band. From the frequency dependence analysis of the halo scale heights we found that the wind velocities (estimated using the adiabatic loss time) are above the escape velocity. We found that the halo scale heights increase linearly with the radio diameters. In order to exclude the diameter dependence, we defined a normalized scale height h˜ which is quite similar for all sample galaxies at both frequency bands and does not depend on the star formation rate or the magnetic field strength. However, h˜ shows a tight anticorrelation with the mass surface density. Conclusions: The sample galaxies with smaller scale lengths are more spherical in the radio emission, while those with larger scale lengths are flatter. The radio scale height depends mainly on the radio diameter of the galaxy. The sample galaxies are consistent with an escape-dominated radio halo with convective cosmic ray propagation, indicating that galactic winds are a widespread phenomenon in spiral galaxies. While a higher star formation rate or star formation surface density does not lead to a higher wind velocity, we found for the first time observational evidence of a gravitational deceleration of CRE outflow, e.g. a lowering of the wind velocity from the galactic disk.
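Scale height determination of the kind described above reduces, in the simplest one-component case, to fitting an exponential profile I(z) = I0·exp(-|z|/h) to the vertical emission. A sketch on noise-free synthetic data (the CHANG-ES maps and the authors' sophisticated method are not reproduced; a plain log-linear least-squares fit stands in):

```python
import math

# Synthetic edge-on intensity profile; true scale height chosen as 1.2 kpc
# purely for illustration (within the paper's 1.1-1.4 kpc sample averages).
h_true, I0 = 1.2, 10.0
z = [0.25 * k for k in range(1, 13)]               # heights above the disk, kpc
I = [I0 * math.exp(-zk / h_true) for zk in z]      # model intensities

# For a single exponential, log I is linear in z with slope -1/h, so an
# ordinary least-squares line on (z, log I) recovers the scale height.
n = len(z)
y = [math.log(v) for v in I]
zbar, ybar = sum(z) / n, sum(y) / n
slope = (sum((zk - zbar) * (yk - ybar) for zk, yk in zip(z, y))
         / sum((zk - zbar) ** 2 for zk in z))
h_fit = -1.0 / slope
print(round(h_fit, 3))
```

Real radio halos often need two components (thin disk plus halo) and beam convolution, which is why the authors developed a more sophisticated method; this sketch only shows the core relation between the profile slope and h.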
Hayroyan, H. S.; Hayroyan, S. H.; Karapetyan, K. A.
2018-04-01
In this paper, three types of clayey soils with different consistency, humidity and slip-resistance indexes are examined under different cyclic shear stresses. Side-surface deformation charts are constructed from experimental data obtained by testing cylindrical soil samples. It is shown that the fluctuation amplitude depends on time, and that the consistency index depends on the humidity condition at the soil's inner contacts and on the connectivity coefficients. Each experiment is interpreted accordingly. The main result of this research is that corrections are needed in the currently used schemes for estimating the stability of slide-prone slopes, a crucial problem requiring an urgent solution.
Methodology Series Module 5: Sampling Strategies
Setia, Maninder Singh
2016-01-01
Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘sampling method’. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice or on the population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of the results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438
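The probability-sampling methods named above can be made concrete with a toy sketch; the patient list and strata below are invented for illustration:

```python
import random

def simple_random_sample(population, n, seed=0):
    """Probability sampling: every unit has an equal chance of selection."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def stratified_sample(population, strata_of, per_stratum, seed=0):
    """Stratified random sampling: draw the same number of units at random
    from each stratum (equal allocation, for simplicity)."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(strata_of(unit), []).append(unit)
    picked = []
    for key in sorted(strata):
        picked.extend(rng.sample(strata[key], per_stratum))
    return picked

# Invented clinical roster: 50 female, 50 male patients
patients = [{"id": i, "sex": "F" if i % 2 else "M"} for i in range(100)]
srs = simple_random_sample(patients, 10)
strat = stratified_sample(patients, lambda p: p["sex"], 5)
```

Stratification guarantees balanced representation (here 5 per sex), whereas a simple random sample of 10 may come out unbalanced by chance.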
Sampling knowledge: the hermeneutics of snowball sampling in qualitative research
Noy, Chaim
2008-01-01
During the past two decades we have witnessed a rather impressive growth of theoretical innovations and conceptual revisions of epistemological and methodological approaches within constructivist-qualitative quarters of the social sciences. Methodological discussions have commonly addressed a variety of methods for collecting and analyzing empirical material, yet the critical grounds upon which these were reformulated have rarely been extended to embrace sampling concepts and procedures. The ...
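Procedurally, snowball sampling is wave-by-wave recruitment over a contact network. The toy network below is invented for illustration and sidesteps the epistemological issues the article discusses:

```python
def snowball_sample(neighbors, seeds, waves):
    """Breadth-first recruitment: each wave adds the contacts of the
    participants recruited in the previous wave (toy model)."""
    recruited = set(seeds)
    frontier = set(seeds)
    for _ in range(waves):
        nxt = set()
        for person in frontier:
            for contact in neighbors.get(person, []):
                if contact not in recruited:
                    recruited.add(contact)
                    nxt.add(contact)
        frontier = nxt
    return recruited

# Invented contact network: who names whom as a contact
net = {"a": ["b", "c"], "b": ["d"], "c": ["e"], "d": [], "e": ["f"]}
wave1 = snowball_sample(net, ["a"], 1)
wave2 = snowball_sample(net, ["a"], 2)
```

The sample's composition is visibly shaped by the seeds and by who knows whom, which is exactly why the article treats snowball samples as socially produced rather than neutral.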
Research on pathogens at Great Lakes beaches: sampling, influential factors, and potential sources
2013-01-01
The overall mission of this work is to provide science-based information and methods that will allow beach managers to more accurately make beach closure and advisory decisions, understand the sources and physical processes affecting beach contaminants, and understand how science-based information can be used to mitigate and restore beaches and protect the public. The U.S. Geological Survey (USGS), in collaboration with many Federal, State, and local agencies and universities, has conducted research on beach health issues in the Great Lakes Region for more than a decade. The work consists of four science elements that align with the USGS Beach Health Initiative Mission: real-time assessments of water quality; coastal processes; pathogens and source tracking; and data analysis, interpretation, and communication. The ongoing or completed research for the pathogens and source tracking topic is described in this fact sheet.
[Practical aspects regarding sample size in clinical research].
Vega Ramos, B; Peraza Yanes, O; Herrera Correa, G; Saldívar Toraya, S
1996-01-01
Knowledge of the right sample size lets us judge whether the results published in medical papers come from a suitable design and whether their conclusions are proper according to the statistical analysis. To estimate the sample size we must consider the type I error, the type II error, the variance, the size of the effect, and the significance and power of the test. To decide which mathematical formula to use, we must define what kind of study we have: a prevalence study, a study of mean values, or a comparative one. In this paper we explain some basic topics of statistics and describe four simple examples of sample size estimation.
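For the comparative case mentioned above, the standard normal-approximation formula n = 2(z₁₋α/₂ + z₁₋β)²σ²/δ² per group ties together the error rates, variance and effect size. A minimal sketch of that textbook formula (not necessarily one of the paper's four examples):

```python
import math
from statistics import NormalDist

def n_per_group_two_means(sigma, delta, alpha=0.05, power=0.80):
    """Sample size per group for a two-sided two-sample comparison of means,
    normal approximation: n = 2 * (z_{1-a/2} + z_{1-b})^2 * sigma^2 / delta^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # type I error (two-sided)
    z_beta = z.inv_cdf(power)            # 1 - type II error
    n = 2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
    return math.ceil(n)

# Detect a difference of 5 units when the SD is 10, at 5% alpha / 80% power
n = n_per_group_two_means(sigma=10.0, delta=5.0)
```

Halving the detectable difference quadruples the required n, which is why the effect size dominates sample-size planning.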
The Effect of Asymmetrical Sample Training on Retention Functions for Hedonic Samples in Rats
Simmons, Sabrina; Santi, Angelo
2012-01-01
Rats were trained in a symbolic delayed matching-to-sample task to discriminate sample stimuli that consisted of the presence of food or the absence of food. Asymmetrical sample training was provided in which one group was initially trained with only the food sample and the other group was initially trained with only the no-food sample. In…
Time-consistent and market-consistent evaluations
Pelsser, A.; Stadje, M.A.
2014-01-01
We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from
Miner, Michael H; Bockting, Walter O; Romine, Rebecca Swinburne; Raman, Sivakumaran
2012-05-01
Health research on transgender people has been hampered by the challenges inherent in studying a hard-to-reach, relatively small, and geographically dispersed population. The Internet has the potential to facilitate access to transgender samples large enough to permit examination of the diversity and syndemic health disparities found among this population. In this article, we describe the experiences of a team of investigators using the Internet to study HIV risk behaviors of transgender people in the United States. We developed an online instrument, recruited participants exclusively via websites frequented by members of the target population, and collected data using online quantitative survey and qualitative synchronous and asynchronous interview methods. Our experiences indicate that the Internet environment presents the investigator with some unique challenges and that commonly expressed criticisms about Internet research (e.g., lack of generalizable samples, invalid study participants, and multiple participation by the same subject) can be overcome with careful method design, usability testing, and pilot testing. The importance of both usability and pilot testing are described with respect to participant engagement and retention and the quality of data obtained online.
Fisher, W. P., Jr.; Petry, P.
2016-11-01
Many published research studies document item calibration invariance across samples using Rasch's probabilistic models for measurement. A new approach to outcomes evaluation for very small samples was employed for two workshop series focused on stress reduction and joyful living conducted for health system employees and caregivers since 2012. Rasch-calibrated self-report instruments measuring depression, anxiety and stress, and the joyful living effects of mindfulness behaviors were identified in peer-reviewed journal articles. Items from one instrument were modified for use with a US population, other items were simplified, and some new items were written. Participants provided ratings of their depression, anxiety and stress, and the effects of their mindfulness behaviors before and after each workshop series. The numbers of participants providing both pre- and post-workshop data were low (16 and 14). Analysis of these small data sets produces results showing that, with some exceptions, the item hierarchies defining the constructs retained the same invariant profiles they had exhibited in the published research (correlations (not disattenuated) range from 0.85 to 0.96). In addition, comparisons of the pre- and post-workshop measures for the three constructs showed substantively and statistically significant changes. Implications for program evaluation comparisons, quality improvement efforts, and the organization of communications concerning outcomes in clinical fields are explored.
Tindana, Paulina; Molyneux, Catherine S; Bull, Susan; Parker, Michael
2014-10-18
For many decades, access to human biological samples, such as cells, tissues, organs, blood, and sub-cellular materials such as DNA, for use in biomedical research, has been central in understanding the nature and transmission of diseases across the globe. However, the limitations of current ethical and regulatory frameworks in sub-Saharan Africa to govern the collection, export, storage and reuse of these samples have resulted in inconsistencies in practice and a number of ethical concerns for sample donors, researchers and research ethics committees. This paper examines stakeholders' perspectives of and responses to the ethical issues arising from these research practices. We employed a qualitative strategy of inquiry for this research including in-depth interviews and focus group discussions with key research stakeholders in Kenya (Nairobi and Kilifi), and Ghana (Accra and Navrongo). The stakeholders interviewed emphasised the compelling scientific importance of sample export, storage and reuse, and acknowledged the existence of some structures governing these research practices, but they also highlighted the pressing need for a number of practical ethical concerns to be addressed in order to ensure high standards of practice and to maintain public confidence in international research collaborations. These concerns relate to obtaining culturally appropriate consent for sample export and reuse, understanding cultural sensitivities around the use of blood samples, facilitating a degree of local control of samples and sustainable scientific capacity building. Drawing on these findings and existing literature, we argue that the ethical issues arising in practice need to be understood in the context of the interactions between host research institutions and local communities and between collaborating institutions. We propose a set of 'key points-to-consider' for research institutions, ethics committees and funding agencies to address these issues.
DNA Qualification Workflow for Next Generation Sequencing of Histopathological Samples
Simbolo, Michele; Gottardi, Marisa; Corbo, Vincenzo; Fassan, Matteo; Mafficini, Andrea; Malpeli, Giorgio; Lawlor, Rita T.; Scarpa, Aldo
2013-01-01
Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard workflow for
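One practical consequence of the NanoDrop/Qubit divergence reported above is that their ratio can act as a crude integrity flag, with the 40 ng dsDNA input dosed on the Qubit value. The sketch below illustrates that idea only; the ratio cutoff and sample figures are assumptions, not values from the study:

```python
def qc_summary(nanodrop_ng_ul, qubit_ng_ul, ratio_cutoff=2.0):
    """Crude triage inspired by the workflow above. The cutoff is an
    assumption, not a value from the paper: a NanoDrop/Qubit ratio well
    above 1 suggests the UV reading is inflated by degraded or
    single-stranded material, as seen with FFPE samples."""
    ratio = nanodrop_ng_ul / qubit_ng_ul
    flag = "suspect-degraded" if ratio > ratio_cutoff else "ok"
    ul_for_40ng = 40.0 / qubit_ng_ul   # dose the 40 ng input on dsDNA (Qubit)
    return ratio, flag, round(ul_for_40ng, 2)

# Invented readings (ng/uL): FFPE extract vs fresh-frozen extract
ffpe = qc_summary(50.0, 10.0)
fresh = qc_summary(21.0, 20.0)
```

For the fresh-frozen case the two readings nearly coincide, matching the paper's observation that total DNA and dsDNA virtually coincide in high-molecular-weight preparations.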
Device for sampling liquid radioactive materials
International Nuclear Information System (INIS)
Vlasak, L.
1987-01-01
Remote sampling of radioactive materials during radioactive waste treatment is claimed in Czechoslovak Patent Document 238599. It eliminates the existing difficulty of fully remote-controlled sampling, namely having to control both the sliding and the rotary movements of the sampling device. The new device consists of a vertical pipe with an opening provided with a cover. A bend housing flow distributors is provided above the opening level. A sampling tray is pivoted in the cover. During sampling the tray is tilted into the vertical pipe space, and it tilts back when filled. The sample flows into a vessel below the tray. A rotary movement alone is thus sufficient for controlling the tray. (Z.M.)
Measuring consistency in translation memories: a mixed-methods case study
Moorkens, Joss
2012-01-01
Introduced in the early 1990s, translation memory (TM) tools have since become widely used as an aid to human translation based on commonly‐held assumptions that they save time, reduce cost, and maximise consistency. The purpose of this research is twofold: it aims to develop a method for measuring consistency in TMs; and it aims to use this method to interrogate selected TMs from the localisation industry in order to find out whether the use of TM tools does, in fact, promote consistency in ...
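One simple operationalisation of TM consistency (an illustration, not necessarily the measurement method this research develops) is to flag source segments stored with more than one distinct target translation:

```python
def inconsistent_segments(tm_pairs):
    """Group translation-memory units by source segment and report sources
    that carry more than one distinct target translation."""
    targets_by_source = {}
    for source, target in tm_pairs:
        targets_by_source.setdefault(source, set()).add(target)
    return {s: sorted(t) for s, t in targets_by_source.items() if len(t) > 1}

# Invented TM units (source, target)
tm = [
    ("Save file", "Datei speichern"),
    ("Save file", "Datei sichern"),    # inconsistent with the first unit
    ("Open file", "Datei öffnen"),
]
issues = inconsistent_segments(tm)
```

A count of such flagged segments, relative to TM size, gives a first quantitative handle on whether a TM actually promotes the consistency it is assumed to provide.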
Macroscopic self-consistent model for external-reflection near-field microscopy
International Nuclear Information System (INIS)
Berntsen, S.; Bozhevolnaya, E.; Bozhevolnyi, S.
1993-01-01
The self-consistent macroscopic approach based on the Maxwell equations in two-dimensional geometry is developed to describe tip-surface interaction in external-reflection near-field microscopy. The problem is reduced to a single one-dimensional integral equation in terms of the Fourier components of the field at the plane of the sample surface. This equation is extended to take into account a pointlike scatterer placed on the sample surface. The power of light propagating toward the detector as the fiber mode is expressed by using the self-consistent field at the tip surface. Numerical results for trapezium-shaped tips are presented. The authors show that the sharper tip and the more confined fiber mode result in better resolution of the near-field microscope. Moreover, it is found that the tip-surface distance should not be too small so that better resolution is ensured. 14 refs., 10 figs
Self-consistent T-matrix theory of superconductivity
Czech Academy of Sciences Publication Activity Database
Šopík, B.; Lipavský, Pavel; Männel, M.; Morawetz, K.; Matlock, P.
2011-01-01
Roč. 84, č. 9 (2011), 094529/1-094529/13 ISSN 1098-0121 R&D Projects: GA ČR GAP204/10/0212; GA ČR(CZ) GAP204/11/0015 Institutional research plan: CEZ:AV0Z10100521 Keywords : superconductivity * T-matrix * superconducting gap * restricted self-consistency Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 3.691, year: 2011
Measuring consistency of autobiographical memory recall in depression.
LENUS (Irish Health Repository)
Semkovska, Maria; Noone, Martha; Carton, Mary; McLoughlin, Declan M
2012-05-15
Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.
International Nuclear Information System (INIS)
Wu, Jianrong; Xiao, Deli; Peng, Jun; Wang, Cuixia; Zhang, Chan; He, Jia; Zhao, Hongyan; He, Hua
2015-01-01
We describe a single-step solvothermal method for the preparation of nanocomposites consisting of graphene oxide and Fe₃O₄ nanoparticles (GO/Fe₃O₄). This material is shown to be useful as a magnetic sorbent for the extraction of flavonoids from green tea, red wine, and urine samples. The nanocomposite takes advantage of the high surface area of GO and the magnetic phase-separation feature of the magnetic sorbent. The nanocomposite is recyclable and was applied to the extraction of flavonoids prior to their determination by HPLC. The effects of the amount of surfactant, the pH value of the sample solution, the extraction time, and the desorption conditions on the extraction efficiency were studied, and the regeneration conditions were optimized. The limits of detection for luteolin, quercetin and kaempferol range from 0.2 to 0.5 ng∙mL⁻¹ in urine, from 3.0 to 6.0 ng∙mL⁻¹ in green tea, and from 1.0 to 2.5 ng∙mL⁻¹ in red wine. The recoveries are between 82.0 and 101.4 %, with relative standard deviations of <9.3 %. (author)
Changes in pectins and product consistency during the concentration of tomato juice to paste.
Anthon, Gordon E; Diaz, Jerome V; Barrett, Diane M
2008-08-27
Concentrating tomato juice to paste during the tomato season allows for preservation and long-term storage, but subsequent dilution for formulation of value-added products is known to result in a loss of consistency. To understand the reasons for this, samples of unconcentrated juice, processing intermediates, and concentrated paste were collected from an industrial processing plant during normal commercial production. All samples were diluted with water to 5 degrees Brix and then analyzed for consistency and pectin content. Whole juice consistency, measured with a Bostwick consistometer, decreased through the course of juice concentration, with the largest change occurring early in the process, as the juice was concentrated from 5 to 10 degrees Brix. This decrease in consistency occurred during the production of paste from both hot- and cold-break juices. The change in Bostwick value was correlated with a decrease in the precipitate weight ratio. The loss of consistency during commercial processing was not the direct result of water removal because a sample of this same 5 degrees Brix juice could be concentrated 2-fold in a vacuum oven and then diluted back to 5 degrees Brix with no change in consistency or precipitate ratio. Total pectin content did not change as the juice was concentrated to paste, but the proportion of the total pectin that was water soluble increased. The greatest increases in pectin solubility occurred during the hot break and late in the process where the evaporator temperature was the highest.
Freeden, Willi; Schreiner, Michael
2018-01-01
This book presents, in a consistent and unified overview, results and developments in the field of today's spherical sampling, particularly as arising in the mathematical geosciences. Although the book often refers to original contributions, the authors have made them accessible to (graduate) students and scientists not only from mathematics but also from the geosciences and geoengineering. Building a library of topics in spherical sampling theory, it shows how advances in this theory lead to new discoveries in mathematical, geodetic and geophysical branches, as well as in other scientific fields such as neuro-medicine. A must-read for everybody working in the area of spherical sampling.
2010-10-01
Title 45 (Public Welfare), § 1356.84 Sampling. (a) The State agency may collect and report the information required in section 1356.83(e) of this part on a sample of the baseline population consistent with the sampling requirements...
Experience-Sampling Research Methods and Their Potential for Education Research
Zirkel, Sabrina; Garcia, Julie A.; Murphy, Mary C.
2015-01-01
Experience-sampling methods (ESM) enable us to learn about individuals' lives in context by measuring participants' feelings, thoughts, actions, context, and/or activities as they go about their daily lives. By capturing experience, affect, and action "in the moment" and with repeated measures, ESM approaches allow researchers…
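A core ingredient of ESM studies is the signal schedule: randomly timed prompts within each day's waking window. A minimal sketch, with the window and prompt counts chosen arbitrarily for illustration:

```python
import random

def esm_schedule(days, prompts_per_day, start_hour=9, end_hour=21, seed=42):
    """Signal-contingent sampling: prompts_per_day random prompt times per
    day, placed uniformly between start_hour and end_hour (toy scheduler)."""
    rng = random.Random(seed)
    schedule = []
    for day in range(days):
        # Sample distinct minute offsets within the daily window, in order
        minutes = sorted(rng.sample(range((end_hour - start_hour) * 60),
                                    prompts_per_day))
        schedule.append([(day, start_hour + m // 60, m % 60) for m in minutes])
    return schedule

# One week of 5 daily prompts between 09:00 and 21:00
week = esm_schedule(days=7, prompts_per_day=5)
```

Randomizing within the window is what lets ESM capture experience "in the moment" without participants anticipating the prompts; real protocols usually also enforce a minimum gap between prompts.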
Cortés, M Alicia; Irrazábal, Emanuel; García-Jerez, Andrea; Bohórquez-Magro, Lourdes; Luengo, Alicia; Ortiz-Arduán, Alberto; Calleros, Laura; Rodríguez-Puyol, Manuel
2014-01-01
ISO 9001:2008 certification of a biobank aims to improve the management of the processes performed, with two objectives: customer satisfaction and continuous improvement. This paper presents the impact of ISO 9001:2008 certification on the sample transfer process in a Spanish biobank specialising in kidney patient samples. The biobank experienced a large increase in the number of samples between 2009 (12,582 vials) and 2010 (37,042 vials). The biobank of the Spanish Renal Research Network (REDinREN), located at the University of Alcalá, has implemented the ISO 9001:2008 standard for the effective management of human material given to research centres. Using surveys, we analysed two periods in the "sample transfer" process. During the first period, between 1-10-12 and 26-11-12 (8 weeks), minimal changes were made to correct isolated errors. In the second period, between 7-01-13 and 18-02-13 (6 weeks), we carried out general corrective actions. The identification of problems and the implementation of corrective actions for certification allowed a 70% reduction in the process execution time, a significant (200%) increase in the number of samples processed, and a 25% improvement in the process. The increase in the number of samples processed was directly related to the process improvement. ISO 9001:2008 certification, obtained in July 2013, allowed an improvement of the REDinREN biobank processes to be achieved, which increased quality and customer satisfaction.
Take It or Leave It: Students' Attitudes about Research Methods
Wisecup, Allison K.
2017-01-01
This study employs a cross-sectional design to explore sociology majors' attitudes toward research methods. Survey data from a convenience sample of students enrolled in 16 departments are used to compare the attitudes of students who have and have not completed a research methods course. Despite consistent anecdotal claims that students harbor…
Present status of NMCC and sample preparation method for bio-samples
International Nuclear Information System (INIS)
Futatsugawa, S.; Hatakeyama, S.; Saitou, S.; Sera, K.
1993-01-01
At NMCC (Nishina Memorial Cyclotron Center) we carry out research on PET (positron emission computed tomography) in nuclear medicine and on PIXE (particle-induced X-ray emission) analysis, using a compactly designed small cyclotron. The NMCC facilities have been open to researchers from other institutions since April 1993. The present status of NMCC is described. Bio-samples (medical samples, plants, animals and environmental samples) have mainly been analyzed by PIXE at NMCC. Small amounts of bio-samples for PIXE are decomposed quickly and easily in a sealed PTFE (polytetrafluoroethylene) vessel with a microwave oven. This sample preparation method for bio-samples is also described. (author)
Evidence for Consistency of the Glycation Gap in Diabetes
Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.
2011-01-01
OBJECTIVE Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS We include...
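The G-gap computation described above amounts to predicting HbA1c from fructosamine by regression and taking the residuals. A minimal sketch with invented paired measurements (the study's actual regression model and units may differ):

```python
def glycation_gaps(fructosamine, hba1c):
    """G-gap per patient: measured HbA1c minus the HbA1c predicted from
    fructosamine by ordinary least squares (illustrative sketch)."""
    n = len(fructosamine)
    fbar = sum(fructosamine) / n
    hbar = sum(hba1c) / n
    slope = sum((f - fbar) * (h - hbar) for f, h in zip(fructosamine, hba1c)) / \
            sum((f - fbar) ** 2 for f in fructosamine)
    intercept = hbar - slope * fbar
    return [h - (intercept + slope * f) for f, h in zip(fructosamine, hba1c)]

# Invented paired measurements: fructosamine (umol/L), HbA1c (%)
fru = [250.0, 300.0, 350.0, 400.0]
a1c = [6.0, 7.0, 7.6, 9.0]
gaps = glycation_gaps(fru, a1c)
```

By construction the residuals sum to zero across the cohort, so a patient's G-gap measures whether their HbA1c runs above or below what their fructosamine predicts; consistency over time is then a question of whether that sign and magnitude persist across repeated pairs.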
Eckberg, Deborah A.
2015-01-01
This study explores race as a potential predictor of research methods anxiety among a sample of undergraduates. While differences in academic achievement based on race and ethnicity have been well documented, few studies have examined racial differences in anxiety with regard to specific subject matter in undergraduate curricula. This exploratory…
Sludge characterization: the role of physical consistency
Energy Technology Data Exchange (ETDEWEB)
Spinosa, Ludovico; Wichmann, Knut
2003-07-01
Physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated in order to fulfil regulatory requirements. Further, many analytical methods for sludge prescribe different procedures depending on whether a sample is liquid or not, and solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed in sludges, so the development of analytical procedures to define the boundary between liquid and paste-like behaviour (flowability) and between solid and paste-like behaviour (solidity) is of growing interest. Several devices can be used for evaluating flowability and solidity, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity measurements. (author)
Hart, Shelley R; Musci, Rashelle J; Ialongo, Nicholas; Ballard, Elizabeth D; Wilcox, Holly C
2013-10-01
Within the context of the recent release of the 2012 National Suicide Prevention Strategy, and as the third leading cause of death for individuals 10- to 24-years-old, suicide prevention is a national priority. A consistently reported and robust risk factor for suicide is a prior suicide attempt; however, few studies have investigated the consistency of self-reported lifetime suicide attempts. The goal of this study is to describe the prevalence and characteristics of inconsistent reporting of suicide attempt in a longitudinal cohort of participants annually assessed in 12 waves of data collected from middle school (age 12) to early adulthood (age 22). Among this cohort (n = 678), we compared those who consistently, inconsistently, and never reported a suicide attempt according to demographic and clinical variables. Almost 90% (88.5%) of our sample inconsistently reported a lifetime suicide attempt. Consistent and inconsistent reporters of lifetime suicide attempt did not differ on demographic or clinical variables with the exception of higher rates of lifetime suicidal ideation among consistent reporters (P adolescents. Inconsistent and consistent reporters of suicide attempt differ on few demographic or clinical variables; further prospective research should investigate the reasons for inconsistent reporting as well as the validity and stability of reporting in predicting future suicidal behavior. © 2013 Wiley Periodicals, Inc.
Methodology Series Module 5: Sampling Strategies
Setia, Maninder Singh
2016-01-01
Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the "sampling method". There are essentially two types of sampling methods: (1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and (2) non-probability sampling, based on the researcher's choice and on the population that is accessible and available. Some of the non-probabilit...
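As an illustrative sketch (not part of the module itself), the first category can be demonstrated with a seeded simple random sample, the most basic probability sampling method; the frame of 100 household IDs is hypothetical:

```python
import random

def simple_random_sample(frame, n, seed=None):
    """Probability sampling: draw n units from a sampling frame so that
    every unit has the same known chance of selection."""
    rng = random.Random(seed)
    return rng.sample(frame, n)

# Hypothetical sampling frame of 100 household IDs; draw a sample of 10.
frame = list(range(1, 101))
chosen = simple_random_sample(frame, 10, seed=42)
```

Fixing the seed makes the draw reproducible, which is useful when a sampling plan must be documented and audited.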
Measuring consistency of autobiographical memory recall in depression.
Semkovska, Maria; Noone, Martha; Carton, Mary; McLoughlin, Declan M
2012-05-15
Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Thompson, Steven K
2012-01-01
Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat
Meta-Analysis of Inquiry-Based Instruction Research
Hasanah, N.; Prasetyo, A. P. B.; Rudyatmi, E.
2017-04-01
Inquiry-based instruction in biology has been the focus of educational research conducted by Unnes biology department students in collaboration with their university supervisors. This study aimed to critically describe the methodological aspects and inquiry teaching methods, and to analyse the results claims, of four selected student research reports grounded in inquiry, drawn from the Unnes biology department 2014 database. Four of the 16 experimental quantitative studies were selected as research objects by a purposive sampling technique. Data collected through a documentation study were qualitatively analysed regarding the methods used, the quality of the inquiry syntax, and the finding claims. Findings showed that the student research still lacked relevant aspects of research methodology, namely appropriate sampling procedures, validity testing of all research instruments, and use of the parametric t-test without prior tests of data normality. Their consistent inquiry syntax supported the four mini-thesis claims that inquiry-based teaching significantly influenced the dependent variables. In other words, the findings indicated that positive claims of the research results were not fully supported by good research methods and well-defined implementation of inquiry procedures.
Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.
Hund, Lauren; Bedrick, Edward J; Pagano, Marcello
2015-01-01
Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.
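For the simpler non-clustered case the abstract mentions, where the standard binomial model suffices, a decision rule's error probabilities can be sketched as follows; the thresholds and the (n = 19, d = 13) design below are illustrative, not taken from the paper:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def lqas_errors(n, d, p_low, p_high):
    """Standard (non-clustered) binomial LQAS rule: classify the lot as
    acceptable if at least d of the n sampled units pass.
    Returns (alpha, beta): alpha = P(accept | true coverage p_low),
    beta = P(reject | true coverage p_high)."""
    alpha = 1.0 - binom_cdf(d - 1, n, p_low)   # accepting a bad lot
    beta = binom_cdf(d - 1, n, p_high)         # rejecting a good lot
    return alpha, beta

# Illustrative coverage thresholds: lots at 50% coverage should be
# rejected, lots at 80% coverage accepted.
a, b = lqas_errors(n=19, d=13, p_low=0.50, p_high=0.80)
```

The cluster designs compared in the paper modify this calculation to account for within-cluster correlation, which is why the clustering parameterization can change the required sample size.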
Evaluating Temporal Consistency in Marine Biodiversity Hotspots
Piacenza, Susan E.; Thurman, Lindsey L.; Barner, Allison K.; Benkwitt, Cassandra E.; Boersma, Kate S.; Cerny-Chipman, Elizabeth B.; Ingeman, Kurt E.; Kindinger, Tye L.; Lindsley, Amy J.; Nelson, Jake; Reimer, Jessica N.; Rowe, Jennifer C.; Shen, Chenchen; Thompson, Kevin A.; Heppell, Selina S.
2015-01-01
With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monito...
Meyerson, Paul; Tryon, Warren W
2003-11-01
This study evaluated the psychometric equivalency of Web-based research. The Sexual Boredom Scale was presented via the World-Wide Web along with five additional scales used to validate it. A subset of 533 participants that matched a previously published sample (Watt & Ewing, 1996) on age, gender, and race was identified. An 8 x 8 correlation matrix from the matched Internet sample was compared via structural equation modeling with a similar 8 x 8 correlation matrix from the previously published study. The Internet and previously published samples were psychometrically equivalent. Coefficient alpha values calculated on the matched Internet sample yielded reliability coefficients almost identical to those for the previously published sample. Factors such as computer administration and uncontrollable administration settings did not appear to affect the results. Demographic data indicated an overrepresentation of males by about 6% and Caucasians by about 13% relative to the U.S. Census (2000). A total of 2,230 participants were obtained in about 8 months without remuneration. These results suggest that data collection on the Web is (1) reliable, (2) valid, (3) reasonably representative, (4) cost effective, and (5) efficient.
International Nuclear Information System (INIS)
Sera, Koichiro
2003-01-01
Nishina Memorial Cyclotron Center (NMCC) has been opened for nationwide-common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At the present time, nearly 40 subjects of PIXE in various research fields are pursued here, and more than 50,000 samples have been analyzed up to the present. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been continuously carried out. Especially, a "standard-free method" for quantitative analysis made it possible to perform analysis of infinitesimal samples, powdered samples and untreated bio samples, which could not be well analyzed quantitatively in the past. The "standard-free method" and a "powdered internal standard method" made the process for target preparation quite easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, preventing any ambiguity coming from complicated target preparation processes. (author)
Acceptance sampling using judgmental and randomly selected samples
Energy Technology Data Exchange (ETDEWEB)
Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl
2010-09-01
We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that the required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
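The simplest special case the authors say is subsumed by their model, a single homogeneous group with a uniform prior, has a closed form worth sketching: after n acceptable items out of n, the posterior for the acceptable fraction p is Beta(n + 1, 1). The numbers below are illustrative only:

```python
def prob_high_acceptance(n, p0):
    """Posterior probability that the acceptable fraction p exceeds p0
    after observing n acceptable items out of n, under a uniform Beta(1, 1)
    prior: the posterior is Beta(n + 1, 1), whose CDF is x ** (n + 1),
    so P(p > p0 | data) = 1 - p0 ** (n + 1)."""
    return 1.0 - p0 ** (n + 1)

# "Rule of 59": 59 clean randomly drawn samples give roughly 95%
# posterior probability that more than 95% of the lot is acceptable.
conf95 = prob_high_acceptance(59, 0.95)
```

The paper's two-group model generalizes this by letting judgmentally sampled high-risk items and randomly sampled low-risk items carry different risk parameters.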
Research recruitment: A marketing framework to improve sample representativeness in health research.
Howcutt, Sarah J; Barnett, Anna L; Barbosa-Boucas, Sofia; Smith, Lesley A
2018-04-01
This discussion paper proposes a five-part theoretical framework to inform recruitment strategies. The framework is based on a marketing model of consumer decision-making. Respondents in surveys are typically healthier than non-respondents, which has an impact on the availability of information about those most in need. Previous research has identified response patterns, provided theories about why people participate in research and evaluated different recruitment strategies. Social marketing has been applied successfully to recruitment and promotes focus on the needs of the participant, but little attention has been paid to the periods before and after participant-researcher contact (during advertising and following completion of studies). We propose a new model which conceptualises participation as a decision involving motivation, perception of information, attitude formation, integration of intention and action, and finally evaluation and sharing of experience. Discussion paper. This discussion paper presents a critical review. No literature was excluded on the basis of date, and the included citations span the years 1981-2017. The proposed framework suggests that researchers could engage a broader demographic if they shape research design and advertising to perform functions that participants are seeking to achieve. The framework provides a novel and useful conceptualisation of recruitment which could help to inform public engagement in research design, researcher training and research policy. This framework challenges researchers to investigate the goals of the potential participants when designing a study's advertising and procedures. © 2017 John Wiley & Sons Ltd.
Consistency of Teacher-Reported Problems for Students in 21 Countries
Rescorla, Leslie A.; Achenbach, Thomas M.; Ginzburg, Sofia; Ivanova, Masha; Dumenci, Levent; Almqvist, Fredrik; Bathiche, Marie; Bilenberg, Niels; Bird, Hector; Domuta, Anca; Erol, Nese; Fombonne, Eric; Fonseca, Antonio; Frigerio, Alessandra; Kanbayashi, Yasuko; Lambert, Michael C.; Liu, Xianchen; Leung, Patrick; Minaei, Asghar; Roussos, Alexandra; Simsek, Zeynep; Weintraub, Sheila; Weisz, John; Wolanczyk, Tomasz; Zubrick, Stephen R.; Zukauskiene, Rita; Verhulst, Frank
2007-01-01
This study compared teachers' ratings of behavioral and emotional problems on the Teacher's Report Form for general population samples in 21 countries (N = 30,957). Correlations between internal consistency coefficients in different countries averaged 0.90. Effects of country on scale scores ranged from 3% to 13%. Gender effects ranged from less…
Sample container for neutron activation analysis
International Nuclear Information System (INIS)
Lersmacher, B.; Verheijke, M.L.; Jaspers, H.J.
1983-01-01
The sample container avoids contaminating the sample substance by diffusion of foreign matter from the wall of the sample container into the sample. It cannot be activated, so that the results of measurements are not falsified by a radioactive container wall. It consists of solid carbon. (orig./HP)
The consistency assessment of topological relations in cartographic generalization
Zheng, Chunyan; Guo, Qingsheng; Du, Xiaochu
2006-10-01
Generalization assessment has been studied less than the generalization process itself, yet maintaining topological relation consistency is essential to generalization quality. This paper proposes a methodology to assess the quality of a generalized map from the consistency of its topological relations. Taking roads (including railways) and residential areas as examples, issues of topological consistency at different map scales are analyzed from the viewpoint of spatial cognition. Statistics on inconsistent topological relations can be obtained by comparing two matrices: one holding the topological relations in the generalized map, the other a theoretical matrix of the topological relations that should be maintained after generalization. Based on fuzzy set theory and the classification of map object types, a consistency evaluation model for topological relations is established. Finally, the paper demonstrates the feasibility of the method with an example evaluating the local topological relations between simple roads and a residential area.
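The matrix-comparison step described above can be sketched as follows; the relation labels are illustrative stand-ins for whatever topological predicates (disjoint, touches, crosses, ...) the evaluation actually uses:

```python
def count_inconsistencies(generalized, expected):
    """Compare the matrix of topological relations observed in the
    generalized map against the matrix of relations that should have been
    preserved; return the number of object pairs whose relation changed."""
    return sum(
        1
        for row_g, row_e in zip(generalized, expected)
        for g, e in zip(row_g, row_e)
        if g != e
    )

# Toy example: two roads x two residential areas, relations as strings.
expected    = [["disjoint", "touches"], ["crosses", "disjoint"]]
generalized = [["disjoint", "disjoint"], ["crosses", "disjoint"]]
changed = count_inconsistencies(generalized, expected)  # one pair changed
```

In the paper's full model, such raw counts would then be weighted by object-type classification and fuzzy membership rather than treated as a flat tally.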
International Nuclear Information System (INIS)
Heldal, H.E.
2010-01-01
The Institute of Marine Research (IMR) is an important contributor to the Norwegian marine monitoring programme RAME (Radioactivity in the Marine Environment). RAME is funded by the Ministry of the Environment and coordinated by the Norwegian Radiation Protection Authority (NRPA). Sample collection is performed from IMR's research vessels in the open sea areas of the North, Norwegian and Barents Seas and in Norwegian fjords. The samples consist of biota (fish and other marine organisms), sediments and seawater. Biota samples are frozen onboard the ship and transported to IMR, where the samples are subsequently ground up, freeze-dried, homogenized and aliquoted into polyethylene counting boxes of appropriate size prior to analysis. Attempts are made to collect filets from 25 fish for each sample of large fish such as cod, haddock, saithe, redfish and Greenland halibut. For smaller fish (e.g. blue whiting, polar cod, capelin and Atlantic herring) and other organisms such as amphipods, krill and deep-sea shrimps, a sample of 2-3 kg of each species is taken. These samples are ground up whole. Sediment samples are collected using a Smoegen boxcorer, from which both surface samples and cores are taken. The samples are frozen onboard the ship. While half-frozen, the cores are cut into slices of 1 or 2 cm thickness on board the ship, then frozen again and transported to IMR, where they are treated as described above for the biota samples. Large volumes (typically 50-200 L) of seawater are needed in order to get enough material for analysis. Pre-treatment of the samples in the field is therefore an advantage. Surface samples (5 m) of seawater are collected from a shipboard pump, while a CTD-rosette multi-bottle sampler with twelve 10-L samplers is used to collect seawater from depths below 5 meters. For the analysis of Cs-137, Cu2[Fe(CN)6]-impregnated cotton filters are used for pre-concentration: one pre-filter without impregnation, two Cu2[Fe(CN)6]
Bungay, Vicky; Oliffe, John; Atchison, Chris
2016-06-01
Men, transgender people, and those working in off-street locales have historically been underrepresented in sex work health research. Failure to include all sections of sex worker populations precludes comprehensive understandings about a range of population health issues, including potential variations in the manifestation of such issues within and between population subgroups, which in turn can impede the development of effective services and interventions. In this article, we describe our attempts to define, determine, and recruit a purposeful sample for a qualitative study examining the interrelationships between sex workers' health and the working conditions in the Vancouver off-street sex industry. Detailed is our application of ethnographic mapping approaches to generate information about population diversity and work settings within distinct geographical boundaries. Bearing in mind the challenges and the overwhelming discrimination sex workers experience, we scope recommendations for safe and effective purposeful sampling inclusive of sex workers' heterogeneity. © The Author(s) 2015.
Affective, Cognitive, and Behavioral Consistency of Chinese-Malay Interracial Attitudes
Rabushka, Alvin
1970-01-01
This study, part of an overall study of political and social integration in Malaya carried out in 1966-67, has resulted in findings that vary from consistency theory research in developed countries. (DB)
Miner, Michael H.; Bockting, Walter O.; Romine, Rebecca Swinburne; Raman, Sivakumaran
2011-01-01
Health research on transgender people has been hampered by the challenges inherent in studying a hard-to-reach, relatively small, and geographically dispersed population. The Internet has the potential to facilitate access to transgender samples large enough to permit examination of the diversity and syndemic health disparities found among this population. In this article, we describe the experiences of a team of investigators using the Internet to study HIV risk behaviors of transgender peop...
Consistent Partial Least Squares Path Modeling via Regularization.
Jung, Sunho; Park, JaeHong
2018-01-01
Partial least squares (PLS) path modeling is a component-based structural equation modeling that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc has yet no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
Dietary intakes of pesticides based on community duplicate diet samples.
Melnyk, Lisa Jo; Xue, Jianping; Brown, G Gordon; McCombs, Michelle; Nishioka, Marcia; Michael, Larry C
2014-01-15
The calculation of dietary intake of selected pesticides was accomplished using food samples collected from individual representatives of a defined demographic community using a community duplicate diet approach. A community of nine participants was identified in Apopka, FL from which intake assessments of organophosphate (OP) and pyrethroid pesticides were made. From these nine participants, sixty-seven individual samples were collected and subsequently analyzed by gas chromatography/mass spectrometry. Measured concentrations were used to estimate dietary intakes for individuals and for the community. Individual intakes of total OP and pyrethroid pesticides ranged from 6.7 to 996 ng and 1.2 to 16,000 ng, respectively. The community intake was 256 ng for OPs and 3430 ng for pyrethroid pesticides. The most commonly detected pesticide was permethrin, but the highest overall intake was of bifenthrin followed by esfenvalerate. These data indicate that the community in Apopka, FL, as represented by the nine individuals, was potentially exposed to both OP and pyrethroid pesticides at levels consistent with a dietary model and other field studies in which standard duplicate diet samples were collected. Higher levels of pyrethroid pesticides were measured than OPs, which is consistent with decreased usage of OPs. The diversity of pyrethroid pesticides detected in food samples was greater than expected. Continually changing pesticide usage patterns need to be considered when determining analytes of interest for large scale epidemiology studies. The Community Duplicate Diet Methodology is a tool for researchers to meet emerging exposure measurement needs that will lead to more accurate assessments of intake which may enhance decisions for chemical regulation. Successfully determining the intake of pesticides through the dietary route will allow for accurate assessments of pesticide exposures to a community of individuals, thereby significantly enhancing the research benefit
The Cross-Cultural Consistency of Marital Communication Associated with Marital Distress.
Halford, W. Kim; And Others
1990-01-01
Compared problem-solving behaviors of four samples of couples, sorted by marital happiness/distress and culture (German and Australian). Results showed cultural differences in frequency and functional significance of negative verbal communication, along with cross-culturally consistent marital behaviors associated with marital distress. (Author/TE)
ANL small-sample calorimeter system design and operation
International Nuclear Information System (INIS)
Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.
1978-07-01
The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg
The perils of straying from protocol: sampling bias and interviewer effects.
Directory of Open Access Journals (Sweden)
Carrie J Ngongo
Fidelity to research protocol is critical. In a contingent valuation study in an informal urban settlement in Nairobi, Kenya, participants responded differently to the three trained interviewers. Interviewer effects were present during the survey pilot, then magnified at the start of the main survey after a seemingly slight adaptation of the survey sampling protocol allowed interviewers to speak with the "closest neighbor" in the event that no one was home at a selected household. This slight degree of interviewer choice introduced sampling bias. Multinomial logistic regression and post-estimation tests revealed that the three interviewers' samples differed significantly from one another according to six demographic characteristics. The two female interviewers were 2.8 and 7.7 times less likely to talk with respondents of low socio-economic status than the male interviewer. Systematic error renders it impossible to determine which of the survey responses might be "correct." This experience demonstrates why researchers must take care to strictly follow sampling protocols, consistently train interviewers, and monitor responses by interviewer to ensure similarity between interviewers' groups and produce unbiased estimates of the parameters of interest.
Honest Importance Sampling with Multiple Markov Chains.
Tan, Aixin; Doss, Hani; Hobert, James P
2015-01-01
Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π1, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π1 is replaced by a Harris ergodic Markov chain with invariant density π1, then the resulting estimator remains strongly consistent. There is a price to be paid however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this paper, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general set up, where we assume that Markov chain samples from several probability densities, π1, …, πk, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effects models under different priors. The second involves Bayesian variable
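A minimal iid sketch of the self-normalized importance sampling estimator the abstract builds on (the MCMC variant replaces the iid proposal draws with a Harris-ergodic chain targeting π1; the Gaussian toy densities below are illustrative, not from the paper):

```python
import math
import random

def snis_mean(h, log_target, log_proposal, draw, n, seed=0):
    """Self-normalized importance sampling estimate of E_pi[h(X)] from iid
    draws of a proposal pi_1. Log-densities may omit normalizing constants,
    since self-normalization cancels them."""
    rng = random.Random(seed)
    xs = [draw(rng) for _ in range(n)]
    logw = [log_target(x) - log_proposal(x) for x in xs]
    m = max(logw)                                # stabilize the exponentials
    w = [math.exp(lw - m) for lw in logw]
    return sum(wi * h(xi) for wi, xi in zip(w, xs)) / sum(w)

# Toy check: estimate E[X] = 2 under pi = N(2, 1) using pi_1 = N(0, 2^2).
log_pi  = lambda x: -0.5 * (x - 2.0) ** 2        # up to a constant
log_pi1 = lambda x: -0.5 * (x / 2.0) ** 2        # up to a constant
est = snis_mean(lambda x: x, log_pi, log_pi1,
                lambda r: r.gauss(0.0, 2.0), 100000)
```

Estimating the standard error of `est` is exactly where the difficulty described in the abstract arises once the draws come from a Markov chain rather than iid sampling.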
Is LambdaCDM consistent with the Tully-Fisher relation?
Reyes, Reinabelle; Gunn, J. E.; Mandelbaum, R.
2013-07-01
We consider the question of the origin of the Tully-Fisher relation in LambdaCDM cosmology. Reproducing the observed tight relation between stellar masses and rotation velocities of disk galaxies presents a challenge for semi-analytical models and hydrodynamic simulations of galaxy formation. Here, our goal is to construct a suite of galaxy mass models that is fully consistent with observations, and that also reproduces the observed Tully-Fisher relation. We take advantage of a well-defined sample of disk galaxies in SDSS with measured rotation velocities (from long-slit spectroscopy of H-alpha), stellar bulge and disk profiles (from fits to SDSS images), and average dark matter halo masses (from stacked weak lensing of a larger, similarly-selected sample). The primary remaining freedom in the mass models comes from the final dark matter halo profile (after contraction from baryon infall and, possibly, feedback) and the stellar IMF. We find that the observed velocities are reproduced by models with Kroupa IMF and NFW (i.e., unmodified) dark matter haloes for galaxies with stellar masses 10^9-10^10 M_sun. For higher stellar masses, models with contracted NFW haloes are favored. A scenario in which the amount of halo contraction varies with stellar mass is able to reproduce the observed Tully-Fisher relation over the full stellar mass range of our sample from 10^9 to 10^11 M_sun. We present this as a proof-of-concept for consistency between LambdaCDM and the Tully-Fisher relation.
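The halo contribution to the rotation curve in such mass models follows from v_c^2 = G M(<r) / r with the NFW enclosed mass; a minimal sketch (the halo parameters below are illustrative, not the paper's fitted values):

```python
import math

G = 4.30091e-6  # Newton's constant in kpc (km/s)^2 / M_sun

def v_circ_nfw(r, r_s, rho_s):
    """Circular velocity (km/s) of an unmodified NFW halo at radius r (kpc):
    M(<r) = 4*pi*rho_s*r_s^3 * [ln(1 + x) - x/(1 + x)], with x = r/r_s."""
    x = r / r_s
    m_enc = 4.0 * math.pi * rho_s * r_s ** 3 * (math.log(1.0 + x) - x / (1.0 + x))
    return math.sqrt(G * m_enc / r)

# Illustrative halo: scale radius 20 kpc, scale density 1e7 M_sun/kpc^3.
# NFW rotation curves peak near r = 2.163 r_s and decline slowly beyond.
v_peak = v_circ_nfw(2.163 * 20.0, 20.0, 1.0e7)
```

A full model of the kind described would add the bulge and disk contributions in quadrature and, for the contracted-halo case, modify the profile before computing v_c.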
Energy Technology Data Exchange (ETDEWEB)
Greenberg, Judith H.
2002-05-22
The First Community Consultation on the Responsible Collection and Use of Samples for Genetic Research was held in Bethesda, Maryland, on September 25-26, 2000. The consultation was convened by the National Institute of General Medical Sciences (NIGMS) of the National Institutes of Health (NIH). Approximately 120 individuals participated in the consultation, half from a broad range of communities and populations, and half from government. The participants shared their views and concerns about population- and community-based genetic research, expanding the focus of the meeting from the collection and use of blood or other tissue samples for genetic research to broader issues and concerns about the conduct of genetic research in general with populations and communities.
Estimators of internal consistency in health research: the use of the alpha coefficient
Cascaes da Silva, Fraciele; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil.; Gonçalves, Elizandra; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil.; Valdivia Arancibia, Beatriz Angélica; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil; Graziele Bento, Salma; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil.; da Silva Castro, Thiago Luis; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil; Soleman Hernandez, Salma Stephany; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil; da Silva, Rudney; Centro de Ciencias de la Salud y del Deporte, Universidad del Estado de Santa Catarina, Santa Catarina, Brasil
2015-01-01
Academic production in the health area has increased, with growing demand for high-quality publications of great impact. One way to ensure quality is through methods that increase the consistency of data analysis, such as reliability, which, depending on the type of data, can be evaluated by different coefficients, especially the alpha coefficient. On this basis, the present review systematically gathers scientific articles produced in the last five years, which in a methodologi...
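For reference, the alpha (Cronbach) coefficient discussed above has a standard closed form, alpha = k/(k-1) * (1 - sum of item variances / variance of total score). A minimal sketch with toy Likert-type data (illustrative values, not data from the review):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)         # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Toy data: 5 respondents answering 3 Likert-type items on the same scale
scores = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [4, 4, 5]]
alpha = cronbach_alpha(scores)   # about 0.91 for this internally consistent toy set
```

High alpha here reflects that the three toy items move together across respondents, which is exactly what the coefficient measures.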
Hallett, B. W.; Dere, A. L. D.; Lehnert, K.; Carter, M.
2016-12-01
Vast numbers of physical samples are routinely collected by geoscientists to probe key scientific questions related to global climate change, biogeochemical cycles, magmatic processes, mantle dynamics, etc. Despite their value as irreplaceable records of nature the majority of these samples remain undiscoverable by the broader scientific community because they lack a digital presence or are not well-documented enough to facilitate their discovery and reuse for future scientific and educational use. The NSF EarthCube iSamples Research Coordination Network seeks to develop a unified approach across all Earth Science disciplines for the registration, description, identification, and citation of physical specimens in order to take advantage of the new opportunities that cyberinfrastructure offers. Even as consensus around best practices begins to emerge, such as the use of the International Geo Sample Number (IGSN), more work is needed to communicate these practices to investigators to encourage widespread adoption. Recognizing the importance of students and early career scientists in particular to transforming data and sample management practices, the iSamples Education and Training Working Group is developing training modules for sample collection, documentation, and management workflows. These training materials are made available to educators/research supervisors online at http://earthcube.org/group/isamples and can be modularized for supervisors to create a customized research workflow. This study details the design and development of several sample management tutorials, created by early career scientists and documented in collaboration with undergraduate research students in field and lab settings. Modules under development focus on rock outcrops, rock cores, soil cores, and coral samples, with an emphasis on sample management throughout the collection, analysis and archiving process. We invite others to share their sample management/registration workflows and to
Gasparini, Patrizia; Di Cosmo, Lucio; Cenni, Enrico; Pompei, Enrico; Ferretti, Marco
2013-07-01
In the frame of a process aiming at harmonizing National Forest Inventory (NFI) and ICP Forests Level I Forest Condition Monitoring (FCM) in Italy, we investigated (a) the long-term consistency between FCM sample points (a subsample of the first NFI, 1985, NFI_1) and recent forest area estimates (after the second NFI, 2005, NFI_2) and (b) the effect of tree selection method (tree-based or plot-based) on sample composition and defoliation statistics. The two investigations were carried out on 261 and 252 FCM sites, respectively. Results show that some individual forest categories (larch and stone pine, Norway spruce, other coniferous, beech, temperate oaks and cork oak forests) are over-represented and others (hornbeam and hophornbeam, other deciduous broadleaved and holm oak forests) are under-represented in the FCM sample. This is probably due to a change in forest cover, which has increased by 1,559,200 ha from 1985 to 2005. In the case of a shift from a tree-based to a plot-based selection method, 3,130 (46.7%) of the original 6,703 sample trees will be abandoned, and 1,473 new trees will be selected. The balance between exclusion of former sample trees and inclusion of new ones will be particularly unfavourable for conifers (with only 16.4% of excluded trees replaced by new ones) and less so for deciduous broadleaves (with 63.5% of excluded trees replaced). The total number of tree species surveyed will not be impacted, while the number of trees per species will, and the resulting (plot-based) sample composition will have a much larger frequency of deciduous broadleaved trees. The newly selected trees have, in general, smaller diameters at breast height (DBH) and defoliation scores. Given the larger rate of turnover, the deciduous broadleaved part of the sample will be more impacted. Our results suggest that both a revision of FCM network to account for forest area change and a plot-based approach to permit statistical inference and avoid bias in the tree sample
Immunoreactive LH in long-term frozen human urine samples.
Singh, Gurmeet Kaur Surindar; Jimenez, Mark; Newman, Ron; Handelsman, David J
2014-04-01
Urine provides a convenient non-invasive alternative to blood sampling for measurement of certain hormones. Urinary luteinizing hormone (LH) measurements have been used for endocrinology research and anti-doping testing. However, the commercially available LH immunoassays are developed and validated for human blood samples, not urine, so LH assays intended for use with urine samples need thorough validation. Therefore, the present study evaluated the measurement of urinary LH immunoreactivity using previously validated immunofluorometric (IF) and immunochemiluminometric (ICL) LH assays after prolonged frozen storage. LH was measured in serial urine samples following administration of a single injection of one of two doses of recombinant human chorionic gonadotropin (rhCG), with assays run at the end of study (2008) and again after four years of frozen (-20 °C) storage, where samples were stored without added preservatives. The ICL assay showed quantitatively reproducible LH measurements after prolonged -20 °C storage. However, the IF immunoassay gave consistently lower LH levels relative to ICL (2008), with a further proportionate reduction after four years of sample storage (2012). Nevertheless, both assays displayed similar patterns of the time-course of urine LH measurement both before and after four years of frozen storage. In conclusion, we found that both immunoassays are suitable for urinary LH measurements, with the ICL assay being more robust for quantitative urinary LH measurement such as for anti-doping purposes, whereas the IF assay could be applicable for research studies where urine LH levels are compared within-study but not in absolute terms. Copyright © 2013 John Wiley & Sons, Ltd.
Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy
2006-01-01
We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
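The concept the study assesses, the sampling distribution of the sample mean and its CLT behaviour, is straightforward to simulate. A short illustration (a classroom-style demonstration, not the authors' assessment tool):

```python
import numpy as np

# Simulate the sampling distribution of the sample mean for a skewed
# (exponential) population; by the CLT it is already near-normal at n = 30.
rng = np.random.default_rng(42)
n, reps = 30, 20_000
means = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)

center = means.mean()          # close to the population mean, 1.0
spread = means.std(ddof=1)     # close to sigma / sqrt(n) = 1 / sqrt(30)
```

Plotting a histogram of `means` against the parent exponential makes the normalization visually obvious, which is the usual classroom exercise.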
Development and installation of an automatic sample changer for neutron activation analysis
International Nuclear Information System (INIS)
Domienikan, Claudio; Lapolli, Andre L.; Schoueri, Roberto M.; Moreira, Edson G.; Vasconcellos, Marina B.A.
2013-01-01
A Programmable and Automatic Sample Changer was built and installed at the Neutron Activation Analysis Laboratory of the Nuclear and Energy Research Institute - IPEN-CNEN/SP, Brazil. This Automatic Sample Changer allows the fully automated measurement of up to 25 samples in one run. Basically, it consists of an electronic circuit and a C++ program that controls the positioning of a sample holder in two axes of motion (X and Y). Each sample is transported and positioned, one by one, inside the shielding coupled to a high-purity germanium (HPGe) radiation detector. A Canberra DSA-1000 Multichannel Analyzer coupled to the Genie 2000 software performs the data acquisition for analysis of the samples. When counting is finished, the results are saved to the hard disk of a PC. The sample is brought back by the sample holder to its initial position, and the next sample is carried to the shielding. The Sample Changer was designed and constructed at IPEN-CNEN/SP using national components and expertise. (author)
40 CFR 761.312 - Compositing of samples.
2010-07-01
... to composite surface wipe test samples and to use the composite measurement to represent the PCB concentration of the entire surface. Composite samples consist of more than one sample gauze extracted and... arithmetic mean of the composited samples. (a) Compositing samples from surfaces to be used or reused. For...
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of consistency of the series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
Protective Factors, Risk Indicators, and Contraceptive Consistency Among College Women.
Morrison, Leslie F; Sieving, Renee E; Pettingell, Sandra L; Hellerstedt, Wendy L; McMorris, Barbara J; Bearinger, Linda H
2016-01-01
To explore risk and protective factors associated with consistent contraceptive use among emerging adult female college students and whether effects of risk indicators were moderated by protective factors. Secondary analysis of National Longitudinal Study of Adolescent to Adult Health Wave III data. Data collected through in-home interviews in 2001 and 2002. National sample of 18- to 25-year-old women (N = 842) attending 4-year colleges. We examined relationships between protective factors, risk indicators, and consistent contraceptive use. Consistent contraceptive use was defined as use all of the time during intercourse in the past 12 months. Protective factors included external supports of parental closeness and relationship with caring nonparental adult and internal assets of self-esteem, confidence, independence, and life satisfaction. Risk indicators included heavy episodic drinking, marijuana use, and depression symptoms. Multivariable logistic regression models were used to evaluate relationships between protective factors and consistent contraceptive use and between risk indicators and contraceptive use. Self-esteem, confidence, independence, and life satisfaction were significantly associated with more consistent contraceptive use. In a final model including all internal assets, life satisfaction was significantly related to consistent contraceptive use. Marijuana use and depression symptoms were significantly associated with less consistent use. With one exception, protective factors did not moderate relationships between risk indicators and consistent use. Based on our findings, we suggest that risk and protective factors may have largely independent influences on consistent contraceptive use among college women. A focus on risk and protective factors may improve contraceptive use rates and thereby reduce unintended pregnancy among college students. Copyright © 2016 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses. Published
Adolph, Karen E.; Robinson, Scott R.
2011-01-01
Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…
Topologically Consistent Models for Efficient Big Geo-Spatio Data Distribution
Jahn, M. W.; Bradley, P. E.; Doori, M. Al; Breunig, M.
2017-10-01
Geo-spatio-temporal topology models are likely to become a key concept to check the consistency of 3D (spatial space) and 4D (spatial + temporal space) models for emerging GIS applications such as subsurface reservoir modelling or the simulation of energy and water supply of mega or smart cities. Furthermore, the data management for complex models consisting of big geo-spatial data is a challenge for GIS and geo-database research. General challenges, concepts, and techniques of big geo-spatial data management are presented. In this paper we introduce a sound mathematical approach for a topologically consistent geo-spatio-temporal model based on the concept of the incidence graph. We redesign DB4GeO, our service-based geo-spatio-temporal database architecture, on the way to the parallel management of massive geo-spatial data. Approaches for a new geo-spatio-temporal and object model of DB4GeO meeting the requirements of big geo-spatial data are discussed in detail. Finally, a conclusion and outlook on our future research are given on the way to support the processing of geo-analytics and -simulations in a parallel and distributed system environment.
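The incidence-graph idea can be illustrated with a toy consistency check (hypothetical types, not the DB4GeO API): each d-dimensional cell must be bounded only by existing cells of dimension d-1.

```python
from collections import defaultdict

class IncidenceGraph:
    """Toy incidence graph: cells of dimension 0..n linked to their boundary."""
    def __init__(self):
        self.cells = {}                 # cell id -> dimension
        self.lower = defaultdict(set)   # cell id -> ids of incident (d-1)-cells

    def add_cell(self, cid, dim, boundary=()):
        self.cells[cid] = dim
        self.lower[cid] = set(boundary)

    def is_consistent(self):
        """Every boundary cell must exist and have dimension exactly one lower."""
        return all(
            b in self.cells and self.cells[b] == dim - 1
            for cid, dim in self.cells.items()
            for b in self.lower[cid]
        )

g = IncidenceGraph()
g.add_cell("v1", 0); g.add_cell("v2", 0)
g.add_cell("e1", 1, boundary=["v1", "v2"])   # edge bounded by two vertices
g.add_cell("f1", 2, boundary=["e1"])         # face bounded by the edge
ok = g.is_consistent()   # True for this well-formed complex
```

A real geo-spatio-temporal model adds geometry and time stamps per cell, but the same dimension-step invariant is what makes topological consistency cheap to verify.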
The least weighted squares II. Consistency and asymptotic normality
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2002-01-01
Roč. 9, č. 16 (2002), s. 1-28 ISSN 1212-074X R&D Projects: GA AV ČR KSK1019101 Grant - others:GA UK(CR) 255/2000/A EK /FSV Institutional research plan: CEZ:AV0Z1075907 Keywords : robust regression * consistency * asymptotic normality Subject RIV: BA - General Mathematics
Energy Technology Data Exchange (ETDEWEB)
Cho, Yong Woo; Han, Man Jung; Cho, Seong Won; Cho, Hong Jun; Oh, Hyeon Kyun; Lee, Jeong Min; Chang, Jae Sook [KORTIC, Taejon (Korea, Republic of)
2002-12-15
Twelve kinds of environmental samples, such as soil, seawater and underground water, were collected around Nuclear Power Plants (NPPs). Tritium chemical analysis was performed on samples of rain water, pine needles, air, seawater, underground water, chinese cabbage, rice grains and milk collected around NPPs, and on surface seawater and rain water sampled over the country. Strontium in soil sampled at 60 districts in Korea was analyzed. Tritium was analyzed in 21 samples of surface seawater around the Korean peninsula that were supplied by KFRDI (National Fisheries Research and Development Institute). Sampling and chemical analysis of environmental samples around the Kori, Woolsung, Youngkwang and Wooljin NPPs and the Taeduk science town for tritium and strontium analysis were managed according to plan. All samples were transferred to KINS after the analyses were completed.
Amplification Biases and Consistent Recovery of Loci in a Double-Digest RAD-seq Protocol
DaCosta, Jeffrey M.; Sorenson, Michael D.
2014-01-01
A growing variety of “genotype-by-sequencing” (GBS) methods use restriction enzymes and high throughput DNA sequencing to generate data for a subset of genomic loci, allowing the simultaneous discovery and genotyping of thousands of polymorphisms in a set of multiplexed samples. We evaluated a “double-digest” restriction-site associated DNA sequencing (ddRAD-seq) protocol by 1) comparing results for a zebra finch (Taeniopygia guttata) sample with in silico predictions from the zebra finch reference genome; 2) assessing data quality for a population sample of indigobirds (Vidua spp.); and 3) testing for consistent recovery of loci across multiple samples and sequencing runs. Comparison with in silico predictions revealed that 1) over 90% of predicted, single-copy loci in our targeted size range (178–328 bp) were recovered; 2) short restriction fragments (38–178 bp) were carried through the size selection step and sequenced at appreciable depth, generating unexpected but nonetheless useful data; 3) amplification bias favored shorter, GC-rich fragments, contributing to among locus variation in sequencing depth that was strongly correlated across samples; 4) our use of restriction enzymes with a GC-rich recognition sequence resulted in an up to four-fold overrepresentation of GC-rich portions of the genome; and 5) star activity (i.e., non-specific cutting) resulted in thousands of “extra” loci sequenced at low depth. Results for three species of indigobirds show that a common set of thousands of loci can be consistently recovered across both individual samples and sequencing runs. In a run with 46 samples, we genotyped 5,996 loci in all individuals and 9,833 loci in 42 or more individuals, resulting in <1% missing data for the larger data set. We compare our approach to similar methods and discuss the range of factors (fragment library preparation, natural genetic variation, bioinformatics) influencing the recovery of a consistent set of loci among
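The in-silico digest used for comparison can be sketched in a few lines (a simplified toy that cuts at recognition-site starts and ignores which enzyme produced each fragment end; the SbfI/MspI pair is a common ddRAD choice used here as an assumption, not necessarily the study's enzymes):

```python
import re

SITES = {"SbfI": "CCTGCAGG", "MspI": "CCGG"}   # assumed enzyme pair

def double_digest(seq, lo=178, hi=328):
    """Toy in-silico double digest: cut at every recognition-site start and
    keep fragments inside the size-selection window [lo, hi]."""
    cuts = sorted(
        {0, len(seq)}
        | {m.start() for site in SITES.values() for m in re.finditer(site, seq)}
    )
    return [(a, b) for a, b in zip(cuts, cuts[1:]) if lo <= b - a <= hi]

# Toy chromosome: one MspI site at position 100, one SbfI site at position 304
toy = "A" * 100 + "CCGG" + "A" * 200 + "CCTGCAGG" + "A" * 50
kept = double_digest(toy)   # [(100, 304)]: only the 204 bp fragment survives
```

A realistic predictor also tracks which enzyme cut each end and the enzymes' exact cleavage offsets within the recognition site, which is what lets it distinguish the targeted SbfI-MspI fragments from same-enzyme fragments.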
Electron beam charging of insulators: A self-consistent flight-drift model
International Nuclear Information System (INIS)
Touzin, M.; Goeuriot, D.; Guerret-Piecourt, C.; Juve, D.; Treheux, D.; Fitting, H.-J.
2006-01-01
Electron beam irradiation and the self-consistent charge transport in bulk insulating samples are described by means of a new flight-drift model and an iterative computer simulation. Ballistic secondary electron and hole transport is followed by electron and hole drift, their possible recombination and/or trapping in shallow and deep traps. The trap capture cross sections are of the Poole-Frenkel type, temperature and field dependent. As a main result, the spatial distributions of the currents j(x,t), charges ρ(x,t), the field F(x,t), and the potential slope V(x,t) are obtained in a self-consistent procedure, as well as the time-dependent secondary electron emission rate σ(t) and the surface potential V_0(t). For bulk insulating samples the time-dependent distributions approach the final stationary state with j(x,t) = const = 0 and σ = 1. Especially for low electron beam energies E_0, the surface potential is limited by the potential V_G of a vacuum grid in front of the target surface. For high beam energies E_0 = 10, 20, and 30 keV, high negative surface potentials V_0 = -4, -14, and -24 kV are obtained, respectively. Besides open nonconductive samples, positive ion-covered samples and targets with a conducting and grounded layer (metal or carbon) on the surface have also been considered, as used in environmental scanning electron microscopy and common SEM in order to prevent charging. Indeed, the potential distributions V(x) are considerably small in magnitude and do not affect the incident electron beam, either by retarding field effects in front of the surface or within the bulk insulating sample. Thus the spatial scattering and excitation distributions are almost not affected
SIMPLE AND STRONGLY CONSISTENT ESTIMATOR OF STABLE DISTRIBUTIONS
Directory of Open Access Journals (Sweden)
Cira E. Guevara Otiniano
2016-06-01
Full Text Available Stable distributions are extensively used to analyze earnings of financial assets, such as exchange rates and stock prices. In this paper we propose a simple and strongly consistent estimator for the scale parameter of a symmetric stable Lévy distribution. The advantage of this estimator is that its computational time is minimal, so it can be used to initialize intensive computational procedures such as maximum likelihood. With random samples of size n, we tested the efficacy of the estimator by the Monte Carlo method. We also include applications to three data sets.
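One concrete instance of a cheap, strongly consistent scale estimator (an illustration for the alpha = 1 Cauchy special case, not necessarily the paper's estimator): for Cauchy(0, gamma) the quartiles sit at -gamma and +gamma, so half the interquartile range recovers the scale.

```python
import numpy as np

# IQR / 2 as a strongly consistent scale estimator for the symmetric
# Cauchy (alpha = 1 stable) case; sample quantiles converge almost surely,
# making this a cheap starting value for a maximum-likelihood routine.
rng = np.random.default_rng(1)
gamma_true = 2.0
x = gamma_true * rng.standard_cauchy(50_000)
q25, q75 = np.quantile(x, [0.25, 0.75])
gamma_hat = (q75 - q25) / 2.0   # close to gamma_true
```

Quantile-based estimators like this avoid the moment conditions that fail for stable laws (the Cauchy has no mean), which is why they are natural initializers here.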
Determination of 7Be in soil samples by gamma spectrometry for erosion research
International Nuclear Information System (INIS)
Esquivel, Alexander D.; Kastner, Geraldo F.; Amaral, Angela M.; Monteiro, Roberto Pellacani G.; Moreira, Rubens M.
2015-01-01
Cosmogenic 7Be is a natural radiotracer produced in the stratosphere and troposphere that reaches the Earth's surface via wet and dry fallout; hence its measurement is very significant for research on soil erosion. 7Be radioanalysis based on the gamma spectrometry technique has been routine methodology for decades and, although it is the reference procedure, it is not free of analytical interference. 7Be is a β-γ emitting radionuclide (Eγ = 477.59 keV, T½ = 53.12 d) and, depending on the chemical profile of the soil, its determination is susceptible to interference from 228Ac (Eγ = 478.40 keV, T½ = 6.15 h). The aim of this work was to establish an analytical protocol for the determination of 7Be in soil samples from the Juatuba (MG) region in different sampling periods of the dry and rainy seasons for erosion studies, and to establish methodologies for evaluating and correcting the level of 228Ac interference in 7Be activity measurements by gamma spectrometry. (author)
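One common correction strategy for this kind of overlap is to estimate the 228Ac contribution under the 477.6 keV peak from an interference-free 228Ac line (such as 911.2 keV) and subtract it. A hedged sketch; the emission probabilities and detector efficiencies below are illustrative placeholders, not calibration data from this work:

```python
# Subtract the 228Ac contribution at ~478 keV, inferred from its 911 keV peak
# via the ratio of gamma emission probabilities and detection efficiencies.
def corrected_be7_counts(area_478, area_911,
                         p_ac_478=0.0440, p_ac_911=0.258,   # assumed intensities
                         eff_478=0.030, eff_911=0.018):     # assumed efficiencies
    # Expected 228Ac counts hiding under the 477.6 keV 7Be peak
    ac_in_478 = area_911 * (p_ac_478 / p_ac_911) * (eff_478 / eff_911)
    return area_478 - ac_in_478

net = corrected_be7_counts(area_478=1500.0, area_911=300.0)   # about 1415 counts
```

In practice the efficiency curve and the 228Ac emission probabilities come from the laboratory's calibration and a nuclear data library, and the subtraction propagates counting uncertainties from both peaks.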
The prevalence of dementia in a Portuguese community sample: a 10/66 Dementia Research Group study.
Gonçalves-Pereira, Manuel; Cardoso, Ana; Verdelho, Ana; Alves da Silva, Joaquim; Caldas de Almeida, Manuel; Fernandes, Alexandra; Raminhos, Cátia; Ferri, Cleusa P; Prina, A Matthew; Prince, Martin; Xavier, Miguel
2017-11-07
Dementia imposes a high burden of disease worldwide. Recent epidemiological studies in European community samples are scarce. In Portugal, community prevalence data is very limited. The 10/66 Dementia Research Group (DRG) population-based research programmes are focused in low and middle income countries, where the assessments proved to be culture and education fair. We applied the 10/66 DRG prevalence survey methodology in Portugal, where levels of illiteracy in older populations are still high. A cross-sectional comprehensive one-phase survey was conducted of all residents aged 65 and over of two geographically defined catchment areas in Southern Portugal (one urban and one rural site). Nursing home residents were not included in the present study. Standardized 10/66 DRG assessments include a cognitive module, an informant interview and the Geriatric Mental State-AGECAT, providing data on dementia diagnosis and subtypes, mental disorders including depression, physical health, anthropometry, demographics, disability/functioning, health service utilization, care arrangements and caregiver strain. We interviewed 1405 old age participants (mean age 74.9, SD = 6.7 years; 55.5% women) after 313 (18.2%) refusals to participate. The prevalence rate for dementia in community-dwellers was 9.23% (95% CI 7.80-10.90) using the 10/66 DRG algorithm and 3.65% (95% CI 2.97-4.97) using DSM-IV criteria. Pure Alzheimer's disease was the most prevalent dementia subtype (41.9%). The prevalence of dementia was strongly age-dependent for both criteria, but there was no association with sex. Dementia prevalence was higher than previously reported in Portugal. The discrepancy between prevalence according to the 10/66 DRG algorithm and the DSM-IV criteria is consistent with that observed in less developed countries; this suggests potential underestimation using the latter approach, although relative validity of these two approaches remains to be confirmed in the European context. We
The Snowball Sampling Technique in Field Research
Directory of Open Access Journals (Sweden)
Nina Nurdiani
2014-12-01
Field research can be associated with both qualitative and quantitative research methods, depending on the problems faced and the goals to be achieved. The success of data collection in field research depends on choosing the appropriate sampling technique to obtain accurate and reliable data. Studies that deal with specific issues require non-probability sampling techniques, one of which is the snowball sampling technique. This technique is useful for finding, identifying, selecting and sampling within a network or chain of relationships. Its phased implementation procedure is carried out through interviews and questionnaires. The snowball sampling technique has strengths and weaknesses in its application. Field research in the housing sector serves as the case study to explain this sampling technique.
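The phased referral procedure can be sketched as a breadth-first traversal of the contact network (a hypothetical minimal model, not the study's field protocol):

```python
import random
from collections import deque

def snowball_sample(network, seeds, waves=2, referrals=2, rng=None):
    """Breadth-first snowball sample: each respondent refers up to
    `referrals` new contacts, for a fixed number of waves."""
    rng = rng or random.Random(0)
    sampled = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        person, wave = frontier.popleft()
        if wave >= waves:
            continue                      # final wave does not refer further
        contacts = [c for c in network.get(person, []) if c not in sampled]
        for c in rng.sample(contacts, min(referrals, len(contacts))):
            sampled.add(c)
            frontier.append((c, wave + 1))
    return sampled

# Toy contact network: seed "A" refers residents, who refer their own contacts
net = {"A": ["B", "C", "D"], "B": ["E"], "C": ["F"], "D": [], "E": [], "F": []}
picked = snowball_sample(net, seeds=["A"], waves=2, referrals=2)
```

The model makes the technique's main weakness visible: who ends up in the sample depends entirely on the seeds and the network structure, which is why snowball samples do not support probability-based inference.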
Microfluidic Sample Preparation for Diagnostic Cytopathology
Mach, Albert J.; Adeyiga, Oladunni B.; Di Carlo, Dino
2014-01-01
The cellular components of body fluids are routinely analyzed to identify disease and treatment approaches. While significant focus has been placed on developing cell analysis technologies, tools to automate the preparation of cellular specimens have been more limited, especially for body fluids beyond blood. Preparation steps include separating, concentrating, and exposing cells to reagents. Sample preparation continues to be routinely performed off-chip by technicians, preventing cell-based point-of-care diagnostics, increasing the cost of tests, and reducing the consistency of the final analysis following multiple manually-performed steps. Here, we review the assortment of biofluids for which suspended cells are analyzed, along with their characteristics and diagnostic value. We present an overview of the conventional sample preparation processes for cytological diagnosis. We finally discuss the challenges and opportunities in developing microfluidic devices for the purpose of automating or miniaturizing these processes, with particular emphases on preparing large or small volume samples, working with samples of high cellularity, automating multi-step processes, and obtaining high purity subpopulations of cells. We hope to convey the importance of and help identify new research directions addressing the vast biological and clinical applications in preparing and analyzing the array of available biological fluids. Successfully addressing the challenges described in this review can lead to inexpensive systems to improve diagnostic accuracy while simultaneously reducing overall systemic healthcare costs. PMID:23380972
The objective of this research was to examine diet- and body size-related attitudes and behaviors associated with supplement use in a representative sample of fourth-grade students in Texas. The research design consisted of cross-sectional data from the School Physical Activity and Nutrition study, ...
DEFF Research Database (Denmark)
Abma, Femke I.; Bültmann, Ute; Amick, Benjamin C.
2017-01-01
Objective: The Work Role Functioning Questionnaire v2.0 (WRFQ) is an outcome measure linking a persons’ health to the ability to meet work demands in the twenty-first century. We aimed to examine the construct validity of the WRFQ in a heterogeneous set of working samples in the Netherlands...
Abma, F.I.; Bultmann, U.; Amick III, B.C.; Arends, I.; Dorland, P.A.; Flach, P.A.; Klink, J.J.L van der; Ven H.A., van de; Bjørner, J.B.
2017-01-01
Objective The Work Role Functioning Questionnaire v2.0 (WRFQ) is an outcome measure linking a persons’ health to the ability to meet work demands in the twenty-first century. We aimed to examine the construct validity of the WRFQ in a heterogeneous set of working samples in the Netherlands with
Water sampling using a drone at Yugama crater lake, Kusatsu-Shirane volcano, Japan
Terada, Akihiko; Morita, Yuichi; Hashimoto, Takeshi; Mori, Toshiya; Ohba, Takeshi; Yaguchi, Muga; Kanda, Wataru
2018-04-01
Remote sampling of water from Yugama crater lake at Kusatsu-Shirane volcano, Japan, was performed using a drone. Despite the high altitude of over 2000 m above sea level, our simple method was successful in retrieving a 250 mL sample of lake water. The procedure presented here is easy to follow for any researcher who operates a drone, without additional special apparatus. We compare the lake water sampled by drone with that sampled by hand at a site where regular samplings have previously been carried out. Chemical concentrations and stable isotope ratios are largely consistent between the two techniques. As the drone can fly automatically with the aid of navigation by the Global Navigation Satellite System (GNSS), it is possible to repeatedly sample lake water from the same location, even when entry to Yugama crater lake is restricted due to the risk of eruption.
Consistent dynamical and statistical description of fission and comparison
Energy Technology Data Exchange (ETDEWEB)
Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)
1996-06-01
A survey of research on the consistent dynamical and statistical description of fission is briefly presented. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression according to the Strutinsky method given by P. Frobrich et al. are compared and analyzed. (2 figs.).
Turbidity Threshold sampling in watershed research
Rand Eads; Jack Lewis
2003-01-01
When monitoring suspended sediment for watershed research, reliable and accurate results may be a higher priority than in other settings. Timing and frequency of data collection are the most important factors influencing the accuracy of suspended sediment load estimates, and, in most watersheds, suspended sediment transport is dominated by a few, large...
Consistency maintenance for constraint in role-based access control model
Institute of Scientific and Technical Information of China (English)
韩伟力; 陈刚; 尹建伟; 董金祥
2002-01-01
Constraint is an important aspect of role-based access control and is sometimes argued to be the principal motivation for role-based access control (RBAC). So far, however, few authors have discussed consistency maintenance for constraints in the RBAC model. Based on research on constraints among roles and the types of inconsistency among constraints, this paper introduces corresponding formal rules, rule-based reasoning and corresponding methods to detect, avoid and resolve these inconsistencies. Finally, the paper briefly introduces the application of consistency maintenance in ZD-PDM, an enterprise-oriented product data management (PDM) system.
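One simple example of the kind of constraint rule involved, static separation of duty between mutually exclusive roles, can be checked as follows (an illustrative sketch with hypothetical role names, not the ZD-PDM implementation):

```python
# Static separation-of-duty check: mutually exclusive roles may not be
# assigned to the same user; report every violating (user, role, role) triple.
def find_sod_violations(user_roles, exclusive_pairs):
    violations = []
    for user, roles in user_roles.items():
        for a, b in exclusive_pairs:
            if a in roles and b in roles:
                violations.append((user, a, b))
    return violations

assignments = {"alice": {"cashier", "auditor"}, "bob": {"cashier"}}
conflicts = find_sod_violations(assignments, [("cashier", "auditor")])
# conflicts == [("alice", "cashier", "auditor")]
```

Detection rules like this are the easy half; the paper's point is the harder half, namely resolving inconsistencies among the constraints themselves (e.g. an exclusion rule contradicting a role-hierarchy rule).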
DEFF Research Database (Denmark)
Rijkhoff, Jan; Bakker, Dik
1998-01-01
This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that, on the whole, our sampling technique compares favourably with other methods, especially in the case of exploratory research.
Annual research plan, 1983-84. [Organic compounds derived from fossil substances
Energy Technology Data Exchange (ETDEWEB)
None
1984-05-01
The National Institute for Petroleum and Energy Research (NIPER) resulted from efforts by the Department of Energy (DOE) to ensure the continuity of the unique energy research capabilities that had been developed at the Bartlesville Energy Technology Center (BETC) over the past 65 years. This was accomplished by a Cooperative Agreement between DOE and IIT Research Institute (IITRI). The agreement to operate NIPER for the five fiscal years 1984-88 became effective October 1, 1983. The NIPER Annual Research Plan for 1983-84 consists of eight projects in the Base Program and 13 projects in the Optional Program. A sampling of potential Work for Others projects is also presented. The Base Program consists of five EOR and three Fundamental Petroleum Chemistry projects. The Optional Program has three EOR projects, one Unconventional Gas Recovery project, five APT projects, and four Advanced Utilization Research projects.
Iowa State University GIS Support and Research Facility — Point locations of geologic samples/files in the IGS repository. Types of samples include well cuttings, outcrop samples, cores, drillers logs, measured sections,...
Gentles, Stephen J; Charles, Cathy; Nicholas, David B; Ploeg, Jenny; McKibbon, K Ann
2016-10-11
Overviews of methods are potentially useful means to increase clarity and enhance collective understanding of specific methods topics that may be characterized by ambiguity, inconsistency, or a lack of comprehensiveness. This type of review represents a distinct literature synthesis method, although to date, its methodology remains relatively undeveloped despite several aspects that demand unique review procedures. The purpose of this paper is to initiate discussion about what a rigorous systematic approach to reviews of methods, referred to here as systematic methods overviews, might look like by providing tentative suggestions for approaching specific challenges likely to be encountered. The guidance offered here was derived from experience conducting a systematic methods overview on the topic of sampling in qualitative research. The guidance is organized into several principles that highlight specific objectives for this type of review given the common challenges that must be overcome to achieve them. Optional strategies for achieving each principle are also proposed, along with discussion of how they were successfully implemented in the overview on sampling. We describe seven paired principles and strategies that address the following aspects: delimiting the initial set of publications to consider, searching beyond standard bibliographic databases, searching without the availability of relevant metadata, selecting publications on purposeful conceptual grounds, defining concepts and other information to abstract iteratively, accounting for inconsistent terminology used to describe specific methods topics, and generating rigorous verifiable analytic interpretations. Since a broad aim in systematic methods overviews is to describe and interpret the relevant literature in qualitative terms, we suggest that iterative decision making at various stages of the review process, and a rigorous qualitative approach to analysis are necessary features of this review type
Garnier-Laplace, J; Vandenhove, H; Beresford, N; Muikku, M; Real, A
2018-03-01
The ALLIANCE 6 Strategic Research Agenda (SRA) initiated by the STAR 7 Network of Excellence and integrated in the research strategy implemented by the COMET consortium, defines a long-term vision of the needs for, and implementation of, research in radioecology. This reference document, reflecting views from many stakeholders groups and researchers, serves as an input to those responsible for defining EU research call topics through the ALLIANCE SRA statement delivered each year to the EJP-CONCERT 8 (2015-2020). This statement highlights a focused number of priorities for funding. Research in radioecology and related sciences is justified by various drivers, such as policy changes, scientific advances and knowledge gaps, radiological risk perception by the public, and a growing awareness of interconnections between human and ecosystem health. The SRA is being complemented by topical roadmaps that have been initiated by the COMET 9 EC-funded project, with the help and endorsement of the ALLIANCE. The strategy underlying roadmap development is driven by the need for improved mechanistic understanding across radioecology. By meeting this need, we can provide fit-for-purpose human and environmental impact/risk assessments in support of the protection of man and the environment in interaction with society and for the three exposure situations defined by the ICRP (i.e., planned, existing and emergency). Within the framework of the EJP-CONCERT the development of a joint roadmap is under discussion among all the European research platforms and will highlight the major research needs for the whole radiation protection field and how these are likely to be addressed by 2030.
Large sample neutron activation analysis of a reference inhomogeneous sample
International Nuclear Information System (INIS)
Vasilopoulou, T.; Athens National Technical University, Athens; Tzika, F.; Stamatelatos, I.E.; Koster-Ammerlaan, M.J.J.
2011-01-01
A benchmark experiment was performed for Neutron Activation Analysis (NAA) of a large inhomogeneous sample. The reference sample was developed in-house and consisted of a SiO2 matrix and an Al-Zn alloy 'inhomogeneity' body. Monte Carlo simulations were employed to derive appropriate correction factors for neutron self-shielding during irradiation as well as self-attenuation of gamma rays and sample geometry during counting. The large sample neutron activation analysis (LSNAA) results were compared against reference values and the trueness of the technique was evaluated. An agreement within ±10% was observed between LSNAA and reference elemental mass values, for all matrix and inhomogeneity elements except samarium, provided that the inhomogeneity body was fully simulated. However, in cases where the inhomogeneity was treated as not known, the results showed a reasonable agreement for most matrix elements, while large discrepancies were observed for the inhomogeneity elements. This study provided a quantification of the uncertainties associated with inhomogeneity in large sample analysis and contributed to the identification of the needs for future development of LSNAA facilities for analysis of inhomogeneous samples. (author)
Sampling vs. taking some - 59349
International Nuclear Information System (INIS)
Francois-Bongarcon, D.
2012-01-01
Collecting a sample is a delicate task that is not naively equivalent to simply 'taking some of the material'. The question examined is: 'What is it exactly?' The problem of sampling in general, and for nuclear decontamination in particular, is properly defined. A theory is presented (Gy's Theory of Sampling, a.k.a. TOS) that brings all the answers and allows us to put them to work. The author draws from his lifelong experience in research, teaching and practical applications in this domain to emphasize the critical odds (i.e. risks) of not taking sampling explicitly into account when assessing grades and concentrations. The evolution of the acceptance of this theory in the nuclear industry is finally illustrated, and a hopeful glimpse into the future concludes the presentation. Equally interesting, however, besides what has already been achieved at the CEA along these years, is the realization of what could not be done with TOS, and therefore had to be treated in some other ways, e.g. using mapping tools (geostatistical). It is one of the great side-advantages of using a consistent theory that it warns you, before it is too late, that what you are trying to do will not work: TOS, indeed, much like its geostatistics sister, besides preventing many a disaster, can provide pragmatic lessons in scientific humility that are best not left ignored. In conclusion, there are great tools out there, such as TOS, that are well worth investing in, and that our community should be much more attuned to. (author)
Church, A. Timothy; Katigbak, Marcia S.; Reyes, Jose Alberto S.; Salanga, Maria Guadalupe C.; Miramontes, Lilia A.; Adams, Nerissa B.
2008-01-01
Trait and cultural psychology perspectives on the cross-situational consistency of behavior, and the predictive validity of traits, were tested in a daily process study in the United States (N = 68), an individualistic culture, and the Philippines (N = 80), a collectivistic culture. Participants completed the Revised NEO Personality Inventory (Costa & McCrae, 1992) and a measure of self-monitoring, then reported their daily behaviors and associated situational contexts for approximately 30 days. Consistent with trait perspectives, the Big Five traits predicted daily behaviors in both cultures, and relative (interindividual) consistency was observed across many, although not all, situational contexts. The frequency of various Big Five behaviors varied across relevant situational contexts in both cultures and, consistent with cultural psychology perspectives, there was a tendency for Filipinos to exhibit greater situational variability than Americans. Self-monitoring showed some ability to account for individual differences in situational variability in the American sample, but not the Filipino sample. PMID:22146866
Lacruz, Me; Emeny, Rt; Bickel, H; Linkohr, B; Ladwig, Kh
2013-09-01
Test the feasibility of the modified telephone interview for cognitive status (TICS-m) as a screening tool to detect cognitive impairment in a population-based sample of older subjects. Data were collected from 3,578 participants, age 65-94 years, of the KORA-Age study. We used analysis of covariance to test for significant sex, age and educational differences in raw TICS-m scores. Internal consistency was analysed by assessing Cronbach's alpha. Correction for education years was undertaken, and participants were divided into three subgroups following validated cut-offs. Finally, a logistic regression was performed to determine the impact of sex on cognition subgroups. Internal consistency of the TICS-m was 0.78. Study participants needed approximately 5.4 min to complete the interview. Lower raw TICS-m scores were associated with male sex, older age and lower education (all statistically significant). After correction for education years, 2,851 (79%) had a non-impaired cognitive status (score >31). Male sex was independently associated with having a score equal to or below 27 and 31 (OR = 1.9, 95% CI 1.4-2.5 and OR = 1.5, 95% CI 1.2-1.7, respectively). The TICS-m is a feasible questionnaire for community-dwelling older adults with normal cognitive function or moderate cognitive impairment. Lower cognitive performance was associated with being a man, being older, and having fewer years of formal education. Copyright © 2012 John Wiley & Sons, Ltd.
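The internal-consistency figure reported above (Cronbach's alpha = 0.78) is computed from raw item scores with the standard alpha formula. A minimal sketch, using made-up scores rather than the KORA-Age data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Two perfectly correlated items give alpha = 1.0 (toy data).
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))
```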
Measuring process and knowledge consistency
DEFF Research Database (Denmark)
Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders
2007-01-01
When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire...
Bitcoin Meets Strong Consistency
Decker, Christian; Seidel, Jochen; Wattenhofer, Roger
2014-01-01
The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...
Directory of Open Access Journals (Sweden)
Tara Macrae
Accurate quantification of gene expression by qRT-PCR relies on normalization against a consistently expressed control gene. However, control genes in common use often vary greatly between samples, especially in cancer. The advent of Next Generation Sequencing technology offers the possibility to better select control genes with the least cell-to-cell variability in steady-state transcript levels. Here we analyze the transcriptomes of 55 leukemia samples to identify the most consistent genes. This list is enriched for components of the proteasome (e.g. PSMA1) and spliceosome (e.g. SF3B2), and also includes the translation initiation factor EIF4H and many heterogeneous nuclear ribonucleoprotein genes (e.g. HNRNPL). We have validated the consistency of our new control genes in 1933 cancer and normal tissues using publicly available RNA-seq data, and their usefulness in qRT-PCR analysis is clearly demonstrated.
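Selecting low-variability control genes from expression profiles can be sketched by ranking genes on their coefficient of variation across samples. This is an illustrative simplification (the study's actual selection criteria may differ), with made-up expression values and gene labels:

```python
import numpy as np

def rank_by_cv(expr, genes):
    """Rank genes by coefficient of variation (std/mean) across samples;
    the lowest CV marks the most consistent candidate control gene.
    expr: (n_samples, n_genes) expression matrix."""
    expr = np.asarray(expr, dtype=float)
    cv = expr.std(axis=0, ddof=1) / expr.mean(axis=0)
    order = np.argsort(cv)
    return [(genes[i], float(cv[i])) for i in order]

# Toy data: 'stable' is identical in every sample, 'noisy' varies widely.
print(rank_by_cv([[10, 1], [10, 5], [10, 9]], ["stable", "noisy"]))
```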
Ireland and medical research with minors: some medico-legal aspects.
Sheikh, Asim A
2008-07-01
The practice of medical research with minors in Ireland consists of practices pertaining to therapeutic and non-therapeutic medical research. Clinical trials (a category of therapeutic research) are governed by legislation. However, any other therapeutic research (non-clinical trials research) and non-therapeutic research, e.g. observational medical research such as a longitudinal study of children, or non-therapeutic research such as blood sample collection for analysis of the cause of disease, are unregulated by legislation. This article will outline and describe some of the medico-legal issues involved in both types of research and will comment on matters such as what national law exists, how the directive on good clinical practice has been implemented, and what guidelines, if any, exist.
In search of a representative sample of residential building work.
Lobb, Brenda; Woods, Gregory R
2012-09-01
Most research investigating injuries in construction work is limited by reliance on work samples unrepresentative of the multiple, variable-cycle tasks involved, resulting in incomplete characterisation of ergonomic exposures. In this case study, a participatory approach was used including hierarchical task analysis and site observations of a typical team of house builders in New Zealand, over several working days, to obtain a representative work sample. The builders' work consisted of 14 goal-defined jobs using varying subsets of 15 task types, each taking from less than 1 s to more than 1 h and performed in a variety of postures. Task type and duration varied within and between participants and days, although all participants spent at least 25% of the time moving from place to place, mostly carrying materials, and more than half the time either reaching up or bending down to work. This research has provided a description of residential building work based on a work sample more nearly representative than those previously published and has demonstrated a simple, low-cost but robust field observation method that can provide a valid basis for further study of hazard exposures. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Attitude of A Sample of Iranian Researchers toward The Future of Stem Cell Research.
Lotfipanah, Mahdi; Azadeh, Fereydoon; Totonchi, Mehdi; Omani-Samani, Reza
2018-10-01
Stem cells, which have unlimited proliferation potential as well as differentiation potency, are considered to be a promising future treatment method for incurable diseases. The aim of the present study is to evaluate the future trend of stem cell research from researchers' viewpoints. This was a cross-sectional descriptive study of researchers involved in stem cell research at Royan Institute. We designed a questionnaire using a qualitative study based on expert opinion and a literature review. Content validity was assessed using three rounds of the Delphi method with experts. Face validity was undertaken by a Persian literature expert and a graphics designer. The questionnaire was distributed among 150 researchers involved in stem cell studies in Royan Institute biology laboratories. We collected 138 completed questionnaires. The mean age of participants was 31.13 ± 5.8 years; most (60.9%) were females. Participants (76.1%) considered the budget to be the most important issue in stem cell research, 79.7% needed financial support from the government, and 77.5% felt that charities could contribute substantially to stem cell research. A total of 90.6% of participants stated that stem cells should lead to commercial usage, which could support future research (86.2%). The aim of stem cell research was stipulated as increasing the health status of society according to 92.8% of the participants. At present, among cell types, importance was attached to cord blood and adult stem cells. Researchers emphasized the importance of mesenchymal stem cells (MSCs) rather than hematopoietic stem cells (HSCs, 57.73%). The prime priority was given to cancer, so that stem cell research could be directed to sphere stem cell research, whereas the least preference was given to skin research. Regenerative medicine is considered the future of stem cell research, with emphasis on the application of these cells, especially in cancer treatment. Copyright © by Royan Institute. All rights reserved.
Reliability and validity of the Modified Erikson Psychosocial Stage Inventory in diverse samples.
Leidy, N K; Darling-Fisher, C S
1995-04-01
The Modified Erikson Psychosocial Stage Inventory (MEPSI) is a relatively simple survey measure designed to assess the strength of psychosocial attributes that arise from progression through Erikson's eight stages of development. The purpose of this study was to employ secondary analysis to evaluate the internal-consistency reliability and construct validity of the MEPSI across four diverse samples: healthy young adults, hemophilic men, healthy older adults, and older adults with chronic obstructive pulmonary disease. Special attention was given to the performance of the measure across gender, with exploratory analyses examining possible age cohort and health status effects. Internal-consistency estimates for the aggregate measure were high, whereas subscale reliability levels varied across age groups. Construct validity was supported across samples. Gender, cohort, and health effects offered interesting psychometric and theoretical insights and direction for further research. Findings indicated that the MEPSI might be a useful instrument for operationalizing and testing Eriksonian developmental theory in adults.
A New Heteroskedastic Consistent Covariance Matrix Estimator using Deviance Measure
Directory of Open Access Journals (Sweden)
Nuzhat Aftab
2016-06-01
In this article we propose a new heteroskedasticity-consistent covariance matrix estimator, HC6, based on a deviance measure. We have studied the finite-sample behavior of the new estimator and compared it with other estimators of this kind, HC1, HC3 and HC4m, which are used in the case of leverage observations. A simulation study is conducted to examine the effect of various levels of heteroskedasticity on the size and power of the quasi-t test with HC estimators. Results show that the test statistic based on our new suggested estimator has better asymptotic approximation and less size distortion than the other estimators for small sample sizes when a high level of heteroskedasticity is present in the data.
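For context, the established HC estimators the article compares against follow a common sandwich form; a minimal sketch of HC1 and HC3 is below. The proposed HC6 itself is not reproduced, since its deviance-based weighting is specific to the article. Data and column layout (intercept in the first column) are assumptions of this sketch.

```python
import numpy as np

def ols_hc_se(X, y, kind="HC3"):
    """OLS coefficients with White-type heteroskedasticity-consistent
    standard errors. X must include an intercept column.
    HC1 scales squared residuals by n/(n-p); HC3 divides by (1-h_i)^2,
    which down-weights high-leverage points."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    e = y - X @ beta
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)  # leverage values
    if kind == "HC1":
        w = e**2 * n / (n - p)
    elif kind == "HC3":
        w = e**2 / (1 - h) ** 2
    else:
        raise ValueError(kind)
    cov = XtX_inv @ (X.T * w) @ X @ XtX_inv      # sandwich estimator
    return beta, np.sqrt(np.diag(cov))

beta, se = ols_hc_se([[1, 0], [1, 1], [1, 2], [1, 3]], [0, 1, 1, 3])
print(beta, se)
```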
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
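The paper's central point, that any regression-consistent learner applied to a 0/1 response estimates the conditional probability, can be illustrated with the simplest such learner, k-nearest neighbours. This toy sketch with made-up data stands in for the R implementations the authors reference:

```python
import numpy as np

def knn_probability(X_train, y_train, x, k):
    """Estimate P(Y=1 | x) by averaging the 0/1 responses of the k
    nearest training points: nonparametric regression on a binary
    response yields the conditional probability."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    d = np.linalg.norm(X_train - np.asarray(x, dtype=float), axis=1)
    idx = np.argsort(d)[:k]
    return float(np.mean(y_train[idx]))

# Toy data: class 0 near x=0, class 1 near x=10.
X = [[0], [1], [9], [10]]
y = [0, 0, 1, 1]
print(knn_probability(X, y, [10], 2))  # 1.0
```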
Study of phosphorus determination in biological samples
International Nuclear Information System (INIS)
Oliveira, Rosangela Magda de.
1994-01-01
In this paper, phosphorus determination by neutron activation analysis in milk and bone samples was studied, employing both instrumental and radiochemical separation methods. The analysis with radiochemical separation consisted of the simultaneous irradiation of the samples and standards during 30 minutes, dissolution of the samples, addition of a hold-back carrier, precipitation of phosphorus with ammonium phosphomolybdate (A.M.P.), and counting of phosphorus-32 using a Geiger-Mueller detector. The instrumental analysis consisted of the simultaneous irradiation of the samples and standards during 30 minutes, transfer of the samples into a counting planchet and measurement of the beta radiation emitted by phosphorus-32 after a suitable decay period. After the phosphorus analysis methods were established, they were applied to both commercial milk and animal bone samples, and the data obtained by the instrumental and radiochemical separation methods for each sample were compared. In this work, analysis methods for phosphorus were obtained that can be applied independently of the sample quantity available, the phosphorus content of the samples, or interferences that may be present in them. (author). 51 refs., 7 figs., 4 tabs.
The Internet of Samples in the Earth Sciences (iSamples)
Carter, M. R.; Lehnert, K. A.
2015-12-01
Across most Earth Science disciplines, research depends on the availability of samples collected above, at, and beneath Earth's surface, on the moon and in space, or generated in experiments. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). The Internet of Samples in the Earth Sciences (iSamples) is an initiative funded as a Research Coordination Network (RCN) within the EarthCube program to address this need. iSamples aims to advance the use of innovative cyberinfrastructure to connect physical samples and sample collections across the Earth Sciences with digital data infrastructures to revolutionize their utility for science. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture of a shared cyberinfrastructure for collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical
Remaining useful life prediction based on variation coefficient consistency test of a Wiener process
Directory of Open Access Journals (Sweden)
Juan LI
2018-01-01
High-cost equipment is often reused after maintenance, and whether the information from before the maintenance can be used for Remaining Useful Life (RUL) prediction after the maintenance is directly determined by the consistency of the degradation pattern before and after the maintenance. Aiming at this problem, an RUL prediction method based on the consistency test of a Wiener process is proposed. Firstly, the parameters of the Wiener process estimated by Maximum Likelihood Estimation (MLE) are proved to be biased, and a modified unbiased estimation method is proposed and verified by derivation and simulations. Then, the h statistic is constructed according to the reciprocal of the variation coefficient of the Wiener process, and its sampling distribution is derived. Meanwhile, a universal method for the consistency test is proposed based on the sampling distribution theorem, which is verified by simulation data and classical crack degradation data. Finally, based on the consistency test of the degradation model, a weighted fusion RUL prediction method is presented for the fuel pump of an airplane, and the validity of the presented method is verified by accurate computation results on real data, which provides theoretical and practical guidance for engineers to predict the RUL of equipment after maintenance.
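The first step described above, MLE of Wiener-process parameters from one degradation path plus a bias correction for the variance estimate, can be sketched as follows. The article's exact unbiased estimator may differ; dividing by n-1 instead of n is shown here as the standard correction for the degree of freedom lost to the drift estimate.

```python
import numpy as np

def wiener_mle(times, x):
    """Estimate drift mu and diffusion sigma^2 of X(t) = mu*t + sigma*B(t)
    from one path observed at the given times. Returns the plain (biased)
    MLE of sigma^2 and the n-1 bias-corrected version."""
    t = np.diff(np.asarray(times, dtype=float))   # interval lengths
    dx = np.diff(np.asarray(x, dtype=float))      # increments
    mu = dx.sum() / t.sum()                       # MLE of the drift
    z = (dx - mu * t) / np.sqrt(t)                # standardized residuals
    n = len(z)
    sigma2_mle = np.sum(z**2) / n                 # biased MLE
    sigma2_unb = np.sum(z**2) / (n - 1)           # bias-corrected
    return mu, sigma2_mle, sigma2_unb

print(wiener_mle([0, 1, 2, 3, 4], [0, 1, 3, 4, 6]))
```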
Sampling and sample preparation methods for the analysis of trace elements in biological material
International Nuclear Information System (INIS)
Sansoni, B.; Iyengar, V.
1978-05-01
The authors attempt to give as systematic a treatment as possible of the sample taking and sample preparation of biological material (particularly in human medicine) for trace analysis (e.g. neutron activation analysis, atomic absorption spectrometry). Contamination and loss problems are discussed, as well as the manifold problems arising from the different consistency of solid and liquid biological materials and the stabilization of the sample material. The processes of dry and wet ashing are dealt with in particular, and new methods are also described. (RB)
Surfactant modified clays’ consistency limits and contact angles
Directory of Open Access Journals (Sweden)
S Akbulut
2012-07-01
This study was aimed at preparing a surfactant modified clay (SMC) and researching the effect of surfactants on clays' contact angles and consistency limits; clay was thus modified by surfactants to modify its engineering properties. Seven surfactants (trimethylglycine, hydroxyethylcellulose, octyl phenol ethoxylate, linear alkylbenzene sulfonic acid, sodium lauryl ether sulfate, cetyl trimethylammonium chloride and quaternised ethoxylated fatty amine) were used in this study. The experimental results indicated that SMC consistency limits (liquid and plastic limits) changed significantly compared to those of natural clay. Plasticity index and liquid limit (PI-LL) values representing soil class approached the A-line when the zwitterionic, nonionic, and anionic surfactant percentage increased. However, cationic SMC was transformed from CH (high plasticity clay) to MH (high plasticity silt) class soils, according to the unified soil classification system (USCS). Clay modified with cationic and anionic surfactants gave higher and lower contact angles than natural clay, respectively.
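The CH-to-MH reclassification reported for the cationic SMC follows from Casagrande's plasticity chart, where the A-line is PI = 0.73(LL - 20) and LL >= 50 marks high plasticity. A simplified classifier, ignoring the CL-ML border zone and organic soils, with illustrative index values:

```python
def uscs_fine_grained(ll, pi):
    """Classify a fine-grained soil from liquid limit LL and plasticity
    index PI via the Casagrande chart: above the A-line -> clay (C),
    below -> silt (M); LL >= 50 -> high plasticity (H), else low (L).
    Simplified: CL-ML border-zone and organic cases are not handled."""
    a_line = 0.73 * (ll - 20)
    high = ll >= 50
    if pi > a_line:                      # above the A-line: clay
        return "CH" if high else "CL"
    return "MH" if high else "ML"        # on/below the A-line: silt

# A point above the A-line at LL=70 is CH; dropping PI below it gives MH.
print(uscs_fine_grained(70, 45), uscs_fine_grained(70, 20))
```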
Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)
Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.
2017-10-01
When estimating discharges through rating curves, temporal data consistency is a critical issue. In this research, consistency in stage-discharge data is investigated using a methodology called Bidirectional Reach (BReach), which departs from a definition of consistency commonly used in operational hydrology. A period is considered to be consistent if no consecutive and systematic deviations from a current situation occur that exceed observational uncertainty. Therefore, the capability of a rating curve model to describe a subset of the (chronologically sorted) data is assessed in each observation by indicating the outermost data points for which the rating curve model behaves satisfactorily. These points are called the maximum left or right reach, depending on the direction of the investigation. This temporal reach should not be confused with a spatial reach (indicating a part of a river). Changes in these reaches throughout the data series indicate possible changes in data consistency and, if not resolved, could introduce additional errors and biases. In this research, various measurement stations in the UK, New Zealand and Belgium are selected based on their significant historical ratings information and their specific characteristics related to data consistency. For each country, regional information is maximally used to estimate observational uncertainty. Based on this uncertainty, a BReach analysis is performed and, subsequently, results are validated against available knowledge about the history and behavior of the site. For all investigated cases, the methodology provides results that appear to be consistent with this knowledge of historical changes and thus facilitates a reliable assessment of (in)consistent periods in stage-discharge measurements. This assessment is not only useful for the analysis and determination of discharge time series, but also to enhance applications based on these data (e.g., by informing hydrological and hydraulic model
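The reach computation can be caricatured in a few lines: for each observation, extend a window rightward for as long as a single power-law rating curve fits every point within a tolerance. This toy version uses a fixed absolute tolerance in log space instead of the site-specific observational uncertainty the method actually requires, and computes only the right reach.

```python
import numpy as np

def right_reaches(stage, discharge, tol):
    """For each observation i, return the outermost index j such that one
    power law Q = a * h^b, fitted by least squares in log space on points
    i..j, keeps every absolute log residual within tol (toy BReach)."""
    logh = np.log(np.asarray(stage, dtype=float))
    logq = np.log(np.asarray(discharge, dtype=float))
    n = len(logh)
    right = np.zeros(n, dtype=int)
    for i in range(n):
        j_best = i
        for j in range(i + 1, n):
            A = np.vstack([np.ones(j - i + 1), logh[i:j + 1]]).T
            coef, *_ = np.linalg.lstsq(A, logq[i:j + 1], rcond=None)
            resid = logq[i:j + 1] - A @ coef
            if np.max(np.abs(resid)) <= tol:
                j_best = j
            else:
                break
        right[i] = j_best
    return right

# A rating shift (a doubles) midway limits the reach of early observations.
stage = [1, 2, 3, 4, 5, 6]
q_shift = [2 * h ** 1.5 for h in stage[:3]] + [4 * h ** 1.5 for h in stage[3:]]
print(right_reaches(stage, q_shift, 0.05))
```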
Opportunities and Challenges of Linking Scientific Core Samples to the Geoscience Data Ecosystem
Noren, A. J.
2016-12-01
Core samples generated in scientific drilling and coring are critical for the advancement of the Earth Sciences. The scientific themes enabled by analysis of these samples are diverse, and include plate tectonics, ocean circulation, Earth-life system interactions (paleoclimate, paleobiology, paleoanthropology), Critical Zone processes, geothermal systems, deep biosphere, and many others, and substantial resources are invested in their collection and analysis. Linking core samples to researchers, datasets, publications, and funding agencies through registration of globally unique identifiers such as International Geo Sample Numbers (IGSNs) offers great potential for advancing several frontiers. These include maximizing sample discoverability, access, reuse, and return on investment; a means for credit to researchers; and documentation of project outputs to funding agencies. Thousands of kilometers of core samples and billions of derivative subsamples have been generated through thousands of investigators' projects, yet the vast majority of these samples are curated at only a small number of facilities. These numbers, combined with the substantial similarity in sample types, make core samples a compelling target for IGSN implementation. However, differences between core sample communities and other geoscience disciplines continue to create barriers to implementation. Core samples involve parent-child relationships spanning 8 or more generations, an exponential increase in sample numbers between levels in the hierarchy, concepts related to depth/position in the sample, requirements for associating data derived from core scanning and lithologic description with data derived from subsample analysis, and publications based on tens of thousands of co-registered scan data points and thousands of analyses of subsamples. These characteristics require specialized resources for accurate and consistent assignment of IGSNs, and a community of practice to establish norms
HBV infection in relation to consistent condom use: a population-based study in Peru.
Bernabe-Ortiz, Antonio; Carcamo, Cesar P; Scott, John D; Hughes, James P; Garcia, Patricia J; Holmes, King K
2011-01-01
Data on hepatitis B virus (HBV) prevalence are limited in developing countries. There is also limited information on the efficacy of consistent condom use for reducing HBV transmission at the population level. The study goal was to evaluate the prevalence of and factors associated with HBV infection in Peru, and the relationship between anti-HBc positivity and consistent condom use. Data from two different surveys performed in 28 mid-sized Peruvian cities were analyzed. Participants aged 18-29 years were selected using multistage cluster sampling. Information was collected through a validated two-part questionnaire. The first part (face-to-face) concerned demographic data, while the second part (self-administered using handheld computers) concerned sexual behavior. Hepatitis B core antibody (anti-HBc) was tested in 7,000 blood samples. Prevalences and associations were adjusted for sample strata, primary sampling units and population weights. Anti-HBc prevalence was 5.0% (95%CI 4.1%-5.9%), with the highest prevalence in jungle cities: 16.3% (95%CI 13.8%-19.1%). In the multivariable analysis, anti-HBc positivity was directly associated with geographic region (highlands OR = 2.05; 95%CI 1.28-3.27, and jungle OR = 4.86; 95%CI 3.05-7.74; compared to the coastal region) and inversely associated with age at sexual debut (OR = 0.90; 95%CI 0.85-0.97). Consistent condom use, evaluated in about 40% of participants, was associated with reduced prevalence (OR = 0.34; 95%CI 0.15-0.79) after adjusting for gender, geographic region, education level, lifetime number of sex partners, age at sexual debut and year of survey. Residence in highland or jungle cities was associated with higher anti-HBc prevalence, whereas increasing age at sexual debut was associated with lower prevalence. Consistent condom use was associated with decreased risk of anti-HBc positivity. Findings from this study emphasize the need for primary prevention programs (vaccination), especially in the jungle region.
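For readers unfamiliar with the reported effect measure: an odds ratio below 1, such as the OR = 0.34 above, indicates reduced odds of infection among the exposed group. A minimal sketch with entirely hypothetical counts (the paper's ORs come from multivariable, survey-weighted regression, not a crude 2x2 table):

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Crude (unadjusted) odds ratio from a 2x2 table."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical counts: anti-HBc positivity by consistent condom use.
or_crude = odds_ratio(12, 388, 80, 920)  # consistent users vs. non-users
print(round(or_crude, 2))  # -> 0.36
```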
Directory of Open Access Journals (Sweden)
Ozlem Ates
2018-04-01
This study aims to explain the extent to which prospective physics teachers’ views and practices are consistent with the constructivist framework. A case study design was employed as the research approach. The study was conducted with 11 prospective physics teachers attending a state university in Turkey. Data were collected through semi-structured interviews, observation notes and lesson plans. The interview guide consisted of questions which allowed the interviewer to probe participants’ views of constructivism based on the 5E learning model. Questions such as “how do you plan your teaching?” (introducing new topics, continuing the lecture, types of questions to ask, evaluating students’ understanding, etc.) were included in the interview. Following the analysis of the interview data, participants’ profiles were classified into three categories (traditional, transition and constructivist) under the dimensions “beginning of a lesson,” “learning process,” “learning environment” and “assessment.” Observations were carried out using an observation checklist consisting of 24 items based on the 5E learning model. Another checklist developed by the researchers was used to evaluate participants’ teaching qualifications. Interview results showed that seven participants had transitional, three had constructivist and one had traditional views. However, none of the participants were observed to exhibit constructivist teaching styles. Moreover, observation and interview results were consistent for only six participants, indicating that almost half of the participants had difficulty putting their views into practice.
Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering
Sicat, Ronell Barrera
2014-12-31
This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.
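The key property above, that a transfer function can be applied to a pdf rather than to a single filtered value, amounts to taking an expectation over the intensity range. A toy sketch with a discrete histogram pdf (the paper itself uses sparse mixtures of 4D Gaussians; all numbers here are made up):

```python
import numpy as np

# Hypothetical pdf of intensities inside one coarse voxel's neighborhood,
# and a transfer function (opacity only) sampled at the same three bins.
pdf = np.array([0.2, 0.5, 0.3])          # p(intensity bin)
intensities = np.array([0.0, 0.5, 1.0])  # bin centers
tf_opacity = np.array([0.0, 0.4, 1.0])   # TF evaluated at bin centers

# pdf-based rendering: expected opacity E[TF(I)] = sum_i p(i) * TF(i).
expected_opacity = float(pdf @ tf_opacity)

# Naive down-sampling instead averages the intensity first and then applies
# the TF, which is what produces inconsistencies across resolution levels.
mean_intensity = float(pdf @ intensities)
```

Here E[TF(I)] = 0.5, while applying the TF at the mean intensity 0.55 would generally give a different value; keeping the pdf is what makes coarse levels consistent with the fine level.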
Imaging and cognitive genetics: the Norwegian Cognitive NeuroGenetics sample.
Espeseth, Thomas; Christoforou, Andrea; Lundervold, Astri J; Steen, Vidar M; Le Hellard, Stephanie; Reinvang, Ivar
2012-06-01
Data collection for the Norwegian Cognitive NeuroGenetics sample (NCNG) was initiated in 2003 with a research grant (to Ivar Reinvang) to study cognitive aging, brain function, and genetic risk factors. The original focus was on the effects of aging (from middle age and up) and candidate genes (e.g., APOE, CHRNA4) in cross-sectional and longitudinal designs, with the cognitive and MRI-based data primarily being used for this purpose. However, as the main topic of the project broadened from cognitive aging to imaging and cognitive genetics more generally, the sample size, age range of the participants, and scope of available phenotypes and genotypes, have developed beyond the initial project. In 2009, a genome-wide association (GWA) study was undertaken, and the NCNG proper was established to study the genetics of cognitive and brain function more comprehensively. The NCNG is now controlled by the NCNG Study Group, which consists of the present authors. Prominent features of the NCNG are the adult life-span coverage of healthy participants with high-dimensional imaging, and cognitive data from a genetically homogenous sample. Another unique property is the large-scale (sample size 300-700) use of experimental cognitive tasks focusing on attention and working memory. The NCNG data is now used in numerous ongoing GWA-based studies and has contributed to several international consortia on imaging and cognitive genetics. The objective of the following presentation is to give other researchers the information necessary to evaluate possible contributions from the NCNG to various multi-sample data analyses.
Consistently violating the non-Gaussian consistency relation
International Nuclear Information System (INIS)
Mooij, Sander; Palma, Gonzalo A.
2015-01-01
Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations.
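For reference, the single-field consistency relation in question is usually written (in common conventions for the curvature perturbation ζ; quoted from the standard literature, not from this abstract) as:

```latex
\lim_{k_1 \to 0} B_\zeta(k_1, k_2, k_3) = (1 - n_s)\, P_\zeta(k_1)\, P_\zeta(k_2),
\qquad\text{equivalently}\qquad
f_{NL}^{\mathrm{sq}} = \frac{5}{12}\,(1 - n_s).
```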
18- and 24-month-olds' discrimination of gender-consistent and inconsistent activities.
Hill, Sara E; Flom, Ross
2007-02-01
18- and 24-month-olds' ability to discriminate gender-stereotyped activities was assessed. Using a preferential looking paradigm, toddlers viewed male and female actors performing masculine- and feminine-stereotyped activities. Consistent with our predictions and previous research, 24-month-olds, but not 18-month-olds, looked longer at the gender-inconsistent activities than at the gender-consistent activities. Results are discussed in terms of toddlers' emerging gender stereotypes and their perception of everyday events.
Estimating True Short-Term Consistency in Vocational Interests: A Longitudinal SEM Approach
Gaudron, Jean-Philippe; Vautier, Stephane
2007-01-01
This study aimed at estimating the correlation between true scores (true consistency) of vocational interest over a short time span in a sample of 1089 adults. Participants were administered 54 items assessing vocational, family, and leisure interests twice over a 1-month period. Responses were analyzed with a multitrait (MT) model, which supposes…
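The notion of a correlation between true scores can be illustrated with the classical correction for attenuation; the study itself estimates true consistency within a longitudinal SEM, so the formula below is only the textbook analogue, and all numbers are hypothetical:

```python
from math import sqrt

def disattenuated_r(r_obs, rel_1, rel_2):
    """Classical test theory estimate of the true-score correlation:
    observed correlation divided by the geometric mean of the reliabilities."""
    return r_obs / sqrt(rel_1 * rel_2)

# Hypothetical: observed 1-month stability 0.72, reliabilities 0.80 and 0.85.
r_true = disattenuated_r(0.72, 0.80, 0.85)
print(round(r_true, 3))  # -> 0.873
```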
The picture test of separation and individuation - preliminary research
Directory of Open Access Journals (Sweden)
Gregor Žvelc
2000-06-01
The authors introduce a new instrument, which they developed for measuring the separation-individuation process and attachment in adolescence and adulthood. The Picture Test of Separation and Individuation (PTSI) is a semi-projective test. It consists of various pictures representing relationships with significant others. The PTSI is divided into three subtests: Relationship with Mother, Relationship with Father and Attachment. In preliminary research on a sample of college and university students, the authors studied the basic properties of the test. The results indicate that the PTSI is consistent with its theoretical background, has good sensitivity and is economical. The Picture Test of Separation and Individuation enables quick but complex insight into an individual's relationships with significant others as well as into his/her stage of the separation-individuation process. Given the satisfactory results of the pilot study, the authors suggest further research to validate the test.
Buczkowski, Brian J.; Kelsey, Sarah A.
2007-01-01
The Woods Hole Science Center of the U.S. Geological Survey (USGS) has been an active member of the Woods Hole research community, Woods Hole, Massachusetts, for over 40 years. In that time there have been many projects that involved the collection of sediment samples conducted by USGS scientists and technicians for the research and study of seabed environments and processes. These samples were collected at sea or near shore and then brought back to the Woods Hole Science Center (WHSC) for analysis. While at the center, samples are stored in ambient-temperature, refrigerated, and freezing conditions ranging from +2 °C to -18 °C, depending on the best mode of preparation for the study being conducted or the duration of storage planned for the samples. Recently, storage methods and available storage space have become a major concern at the WHSC. The core and sediment archive program described herein has been initiated to set standards for the management, methods, and duration of sample storage, to maintain organizational consistency, and to define storage protocol. This handbook serves as a reference and guide to all parties interested in using and accessing the WHSC's sample archive, and also defines all the steps necessary to construct and maintain an organized collection of geological samples. It answers many questions as to the way in which the archive functions.
Snowball Sampling Technique in Field Research (Teknik Sampling Snowball dalam Penelitian Lapangan)
Nina Nurdiani
2014-01-01
Field research can be associated with both qualitative and quantitative research methods, depending on the problems faced and the goals to be achieved. The success of data collection in field research depends on choosing an appropriate sampling technique so as to obtain accurate and reliable data. Studies dealing with problems related to specific issues require a non-probability sampling technique, one of which is the snowball sampling technique. This technique is useful for f...
The prevalence of dementia in a Portuguese community sample: a 10/66 Dementia Research Group study
Directory of Open Access Journals (Sweden)
Manuel Gonçalves-Pereira
2017-11-01
Background: Dementia imposes a high burden of disease worldwide. Recent epidemiological studies in European community samples are scarce, and in Portugal community prevalence data are very limited. The 10/66 Dementia Research Group (DRG) population-based research programmes are focused on low- and middle-income countries, where the assessments proved to be culture- and education-fair. We applied the 10/66 DRG prevalence survey methodology in Portugal, where levels of illiteracy in older populations are still high. Methods: A comprehensive one-phase cross-sectional survey was conducted of all residents aged 65 and over in two geographically defined catchment areas in Southern Portugal (one urban and one rural site). Nursing home residents were not included in the present study. Standardized 10/66 DRG assessments include a cognitive module, an informant interview and the Geriatric Mental State-AGECAT, providing data on dementia diagnosis and subtypes, mental disorders including depression, physical health, anthropometry, demographics, disability/functioning, health service utilization, care arrangements and caregiver strain. Results: We interviewed 1405 older participants (mean age 74.9, SD = 6.7 years; 55.5% women) after 313 (18.2%) refusals to participate. The prevalence rate for dementia in community-dwellers was 9.23% (95% CI 7.80-10.90) using the 10/66 DRG algorithm and 3.65% (95% CI 2.97-4.97) using DSM-IV criteria. Pure Alzheimer's disease was the most prevalent dementia subtype (41.9%). The prevalence of dementia was strongly age-dependent under both criteria, but there was no association with sex. Conclusions: Dementia prevalence was higher than previously reported in Portugal. The discrepancy between prevalence according to the 10/66 DRG algorithm and the DSM-IV criteria is consistent with that observed in less developed countries; this suggests potential underestimation using the latter approach, although the relative validity of these two
Consistency of orthodox gravity
Energy Technology Data Exchange (ETDEWEB)
Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)
1997-01-01
A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of the argument to more complex systems.
Hackett, Paul M. W.
2016-01-01
When behavior is interpreted in a reliable manner (i.e., robustly across different situations and times) its explained meaning may be seen to possess hermeneutic consistency. In this essay I present an evaluation of the hermeneutic consistency that I propose may be present when the research tool known as the mapping sentence is used to create generic structural ontologies. I also claim that theoretical and empirical validity is a likely result of employing the mapping sentence in research design and interpretation. These claims are non-contentious within the realm of quantitative psychological and behavioral research. However, I extend the scope of both facet theory based research and claims for its structural utility, reliability and validity to philosophical and qualitative investigations. I assert that the hermeneutic consistency of a structural ontology is a product of a structural representation's ontological components and the mereological relationships between these ontological sub-units: the mapping sentence seminally allows for the depiction of such structure. PMID:27065932
X-Ray Micro-Computed Tomography of Apollo Samples as a Curation Technique Enabling Better Research
Ziegler, R. A.; Almeida, N. V.; Sykes, D.; Smith, C. L.
2014-01-01
X-ray micro-computed tomography (micro-CT) is a technique that has long been used to research meteorites, and recently it has become a more common tool for the curation of meteorites and Apollo samples. Micro-CT is ideally suited to the characterization of astromaterials in the curation process as it can provide textural and compositional information at a small spatial resolution rapidly, nondestructively, and without compromising the cleanliness of the samples (e.g., samples can be scanned sealed in Teflon bags). These data can then inform scientists and curators when making and processing future sample requests for meteorites and Apollo samples. Here we present some preliminary results on micro-CT scans of four Apollo regolith breccias. Methods: Portions of four Apollo samples were used in this study: 14321, 15205, 15405, and 60639. All samples were 8-10 cm in their longest dimension and approximately equant. These samples were micro-CT scanned on the Nikon HMXST 225 System at the Natural History Museum in London. Scans were made at 205-220 kV, 135-160 microamps beam current, with an effective voxel size of 21-44 microns. Results: Initial examination of the data identifies a variety of mineral clasts (including sub-voxel FeNi metal grains) and lithic clasts within the regolith breccias. Textural information within some of the lithic clasts was also discernable. Of particular interest was a large basalt clast (approx. 1.3 cc) found within sample 60639, which appears to have a sub-ophitic texture. Additionally, internal void space, e.g., fractures and voids, is readily identifiable. Discussion: It is clear from the preliminary data that micro-CT analyses are able to identify important "new" clasts within the Apollo breccias, and better characterize previously described clasts or igneous samples. For example, the 60639 basalt clast was previously believed to be quite small based on its approx. 0.5 sq cm exposure on the surface of the main mass.
A New Bias Corrected Version of Heteroscedasticity Consistent Covariance Estimator
Directory of Open Access Journals (Sweden)
Munir Ahmed
2016-06-01
In the presence of heteroscedasticity, different available flavours of the heteroscedasticity consistent covariance estimator (HCCME) are used. However, the available literature shows that these estimators can be considerably biased in small samples. Cribari-Neto et al. (2000) introduce a bias adjustment mechanism and give a modified White estimator that becomes almost bias-free even in small samples. Extending these results, Cribari-Neto and Galvão (2003) present a similar bias adjustment mechanism that can be applied to a wide class of HCCMEs. In the present article, we follow the mechanism proposed by Cribari-Neto and Galvão to give a bias-corrected version of the HCCME, but we use an adaptive HCCME rather than the conventional one. A Monte Carlo study is used to evaluate the performance of our proposed estimators.
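As context for the estimators being bias-corrected, here is a minimal sketch of the conventional sandwich-form HCCME (White's HC0, plus the simple HC1 degrees-of-freedom correction). This is the textbook baseline only, not the adaptive or bias-adjusted estimators proposed in the article, and the simulated data are purely illustrative.

```python
import numpy as np

def hccme(X, y, kind="HC0"):
    """White-type heteroscedasticity-consistent covariance of OLS coefficients:
    (X'X)^-1 X' diag(e_i^2) X (X'X)^-1, with an optional n/(n-k) correction."""
    n, k = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta                      # OLS residuals
    bread = np.linalg.inv(X.T @ X)
    meat = X.T @ (e[:, None] ** 2 * X)    # X' diag(e^2) X
    cov = bread @ meat @ bread
    if kind == "HC1":
        cov *= n / (n - k)                # small-sample df adjustment
    return beta, cov

# Simulated heteroscedastic data: error variance grows with |x|.
rng = np.random.default_rng(7)
X = np.column_stack([np.ones(60), rng.normal(size=60)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=60) * (1 + np.abs(X[:, 1]))
beta, cov_hc0 = hccme(X, y, "HC0")
_, cov_hc1 = hccme(X, y, "HC1")
```

HC1 differs from HC0 only by the scalar factor n/(n-k), which is precisely the kind of small-sample adjustment the literature cited above refines further.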
Adaptive Angular Sampling for SPECT Imaging
Li, Nan; Meng, Ling-Jian
2011-01-01
This paper presents an analytical approach for performing adaptive angular sampling in single photon emission computed tomography (SPECT) imaging. It allows for a rapid determination of the optimum sampling strategy that minimizes image variance in regions-of-interest (ROIs). The proposed method consists of three key components: (a) a set of closed-form equations for evaluating image variance and resolution attainable with a given sampling strategy, (b) a gradient-based algorithm...
Design unbiased estimation in line intersect sampling using segmented transects
David L.R. Affleck; Timothy G. Gregoire; Harry T. Valentine; Harry T. Valentine
2005-01-01
In many applications of line intersect sampling, transects consist of multiple, connected segments in a prescribed configuration. The relationship between the transect configuration and the selection probability of a population element is illustrated, and a consistent sampling protocol, applicable to populations composed of arbitrarily shaped elements, is proposed. It...
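The role of the selection probability can be shown with a Horvitz-Thompson-type expansion, which underlies design-unbiased estimators of this kind. The values and probabilities below are hypothetical; in the paper, the inclusion probabilities follow from the segmented-transect configuration itself.

```python
def ht_total(values, inclusion_probs):
    """Horvitz-Thompson-type estimate of a population total: each
    intersected element's attribute is expanded by 1 / P(selection)."""
    return sum(v / p for v, p in zip(values, inclusion_probs))

# Hypothetical: three pieces of downed wood intersected by a transect,
# with volumes (m^3) and configuration-dependent inclusion probabilities.
total = ht_total([0.8, 1.2, 0.5], [0.04, 0.05, 0.02])
print(round(total, 6))  # -> 69.0
```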
International Nuclear Information System (INIS)
Maenttaeri, I.; Mattila, J.; Zwingmann, H.; Todd, A.J.
2007-08-01
Illite K-Ar age determinations were done on five fault breccia samples from the ONKALO underground research facility, Olkiluoto, Eurajoki, SW Finland. The XRD, SEM, and TEM studies and K-Ar analyses were done at the John de Laeter Centre of Mass Spectrometry at Curtin University, Perth, Western Australia. The <2 micron grain size fractions contain illite, chlorite, dickite, and quartz. All fractions had minor contamination phases, comprising mainly quartz, but traces of K-feldspar contamination could be identified in all samples. The authigenic illite shows variable K concentrations; the illite contents of the ONK-PL68 and ONK-PL87 samples are the smallest. The K-Ar ages for the <2 micron fractions vary from ∼0.55 Ga to 1.38 Ga. The sample ONK-PL68 yields a K-Ar age of 912 ± 18 Ma, corresponding to a Neoproterozoic (Tonian) age. This age can be roughly linked in time with late events related to the Sveconorwegian orogeny. Sample ONK-PL87 has a K-Ar age of 550 ± 11 Ma, corresponding to a Neoproterozoic - Lower Cambrian age. The samples ONK-PL522 and ONK-PL901, sampled from the storage hall fault, show identical K-Ar ages of 1385 ± 27 Ma and 1373 ± 27 Ma, respectively. These correspond to a Mesoproterozoic (Ectasian) age related to Subjotnian or Postjotnian events. ONK-PL960 yields a K-Ar age of 1225 ± 24 Ma, corresponding to a Mesoproterozoic (Ectasian) age; this agrees well with the ages from Postjotnian diabase dykes in W Finland. The 2-3% detrital K-feldspar contamination in the clay fractions increases the ages. Especially for the youngest sample, ONK-PL87, the effect may be geologically meaningful, as after correction the age clearly indicates Caledonian events. Moreover, the age for the low-K sample ONK-PL901 shifts to indicate a Postjotnian diabase age. (orig.)
Radioactive air sampling methods
Maiello, Mark L
2010-01-01
Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and health physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...
Directory of Open Access Journals (Sweden)
Pasi Nieminen
2012-05-01
Previous physics education research has raised the question of “hidden variables” behind students’ success in learning certain concepts. In the context of the force concept, it has been suggested that students’ reasoning ability is one such variable. Strong positive correlations between students’ preinstruction scores for reasoning ability (measured by Lawson’s Classroom Test of Scientific Reasoning) and their learning of forces [measured by the Force Concept Inventory (FCI)] have been reported in high school and university introductory courses. However, there is no published research concerning the relation between students’ ability to interpret multiple representations consistently (i.e., representational consistency) and their learning of forces. To investigate this, we collected 131 high school students’ pre- and post-test data from the Representational Variant of the Force Concept Inventory (for representational consistency) and the FCI. The students’ Lawson pretest data were also collected. We found that the preinstruction level of students’ representational consistency correlated strongly with students’ learning gain in forces. The correlation (0.51) was almost equal to the correlation between the Lawson prescore and the learning gain in forces (0.52). Our results support earlier findings which suggest that scientific reasoning ability is a hidden variable behind the learning of forces. In addition, we suggest that students’ representational consistency may also be such a factor, and that this should be recognized in physics teaching.
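Learning gains of the kind correlated above are typically Hake-style normalized gains. A small sketch with made-up scores (the 30-item FCI maximum is assumed; the four students and their consistency scores are hypothetical):

```python
import numpy as np

def normalized_gain(pre, post, max_score):
    """Hake-style normalized gain: fraction of the possible improvement
    actually achieved, g = (post - pre) / (max - pre)."""
    return (post - pre) / (max_score - pre)

# Hypothetical pre/post scores for four students on a 30-item inventory,
# plus made-up preinstruction representational-consistency scores.
pre = np.array([10, 12, 8, 15])
post = np.array([20, 18, 14, 27])
gain = normalized_gain(pre, post, 30)

consistency = np.array([0.4, 0.5, 0.3, 0.9])
r = float(np.corrcoef(consistency, gain)[0, 1])  # Pearson correlation
```

A study-level correlation like the 0.51 reported above is this r computed over the full sample rather than four invented students.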
Hanushek, Eric A.; Woessmann, Ludger
2010-01-01
Critics of international student comparisons argue that results may be influenced by differences in the extent to which countries adequately sample their entire student populations. In this research note, we show that larger exclusion and non-response rates are related to better country average scores on international tests, as are larger…
Energy Technology Data Exchange (ETDEWEB)
Vavpetič, P., E-mail: primoz.vavpetic@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Vogel-Mikuš, K. [Biotechnical Faculty, Department of Biology, University of Ljubljana, Jamnikarjeva 101, SI-1000 Ljubljana (Slovenia); Jeromel, L. [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Ogrinc Potočnik, N. [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); FOM-Institute AMOLF, Science Park 104, 1098 XG Amsterdam (Netherlands); Pongrac, P. [Biotechnical Faculty, Department of Biology, University of Ljubljana, Jamnikarjeva 101, SI-1000 Ljubljana (Slovenia); Department of Plant Physiology, University of Bayreuth, Universitätstr. 30, 95447 Bayreuth (Germany); Drobne, D.; Pipan Tkalec, Ž.; Novak, S.; Kos, M.; Koren, Š.; Regvar, M. [Biotechnical Faculty, Department of Biology, University of Ljubljana, Jamnikarjeva 101, SI-1000 Ljubljana (Slovenia); Pelicon, P. [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia)
2015-04-01
The analysis of biological samples in the frozen-hydrated state with the micro-PIXE technique at the Jožef Stefan Institute (JSI) nuclear microprobe has matured to a point that enables us to measure and examine frozen tissue samples routinely as a standard research method. A cryotome-cut slice of a frozen-hydrated biological sample is mounted between two thin foils and positioned on the sample holder. The temperature of the cold stage in the measuring chamber is kept below 130 K throughout the insertion of the samples and the proton beam exposure. The matrix of frozen-hydrated tissue consists mostly of ice. Sample deterioration during proton beam exposure is monitored during the experiment, as both Elastic Backscattering Spectrometry (EBS) and Scanning Transmission Ion Microscopy (STIM) in on-off axis geometry are recorded together with the events in two PIXE detectors and backscattered ions from the chopper in a single list-mode file. The aim of this experiment was to determine the differences and similarities between two kinds of biological sample preparation techniques for micro-PIXE analysis, namely freeze-drying and frozen-hydrated sample preparation, in order to evaluate the improvements, if any, in the elemental localisation achieved by the latter technique. In the presented work, a standard micro-PIXE configuration for tissue mapping at JSI was used with five detection systems operating in parallel, with a proton beam cross section of 1.0 × 1.0 μm² and a beam current of 100 pA. The comparison of the resulting elemental distributions measured on biological tissue prepared in the frozen-hydrated and in the freeze-dried state revealed differences in the elemental distribution of particular elements at the cellular level, due to morphology alteration in particular tissue compartments induced either by water removal in the lyophilisation process or by unsatisfactory preparation of samples for cutting and mounting during the shock-freezing phase of sample preparation.
Directory of Open Access Journals (Sweden)
Virginia Lopez-Alonso
2018-04-01
Full Text Available Non-invasive brain stimulation (NIBS) has been widely explored as a way to safely modulate brain activity and alter human performance for nearly three decades. Research using NIBS has grown exponentially within the last decade with promising results across a variety of clinical and healthy populations. However, recent work has shown high inter-individual variability and a lack of reproducibility of previous results. Here, we conducted a small preliminary study to explore the effects of three of the most commonly used excitatory NIBS paradigms over the primary motor cortex (M1) on motor learning (Sequential Visuomotor Isometric Pinch Force Tracking Task) and secondarily relate changes in motor learning to changes in cortical excitability (MEP amplitude and SICI). We compared anodal transcranial direct current stimulation (tDCS), paired associative stimulation (PAS25), and intermittent theta burst stimulation (iTBS), along with a sham tDCS control condition. Stimulation was applied prior to motor learning. Participants (n = 28) were randomized into one of the four groups and were trained on a skilled motor task. Motor learning was measured immediately after training (online), 1 day after training (consolidation), and 1 week after training (retention). We did not find consistent differential effects on motor learning or cortical excitability across groups. Within the boundaries of our small sample sizes, we then assessed effect sizes across the NIBS groups that could help power future studies. These results, which require replication with larger samples, are consistent with previous reports of small and variable effect sizes of these interventions on motor learning.
Lopez-Alonso, Virginia; Liew, Sook-Lei; Fernández del Olmo, Miguel; Cheeran, Binith; Sandrini, Marco; Abe, Mitsunari; Cohen, Leonardo G.
2018-01-01
Non-invasive brain stimulation (NIBS) has been widely explored as a way to safely modulate brain activity and alter human performance for nearly three decades. Research using NIBS has grown exponentially within the last decade with promising results across a variety of clinical and healthy populations. However, recent work has shown high inter-individual variability and a lack of reproducibility of previous results. Here, we conducted a small preliminary study to explore the effects of three of the most commonly used excitatory NIBS paradigms over the primary motor cortex (M1) on motor learning (Sequential Visuomotor Isometric Pinch Force Tracking Task) and secondarily relate changes in motor learning to changes in cortical excitability (MEP amplitude and SICI). We compared anodal transcranial direct current stimulation (tDCS), paired associative stimulation (PAS25), and intermittent theta burst stimulation (iTBS), along with a sham tDCS control condition. Stimulation was applied prior to motor learning. Participants (n = 28) were randomized into one of the four groups and were trained on a skilled motor task. Motor learning was measured immediately after training (online), 1 day after training (consolidation), and 1 week after training (retention). We did not find consistent differential effects on motor learning or cortical excitability across groups. Within the boundaries of our small sample sizes, we then assessed effect sizes across the NIBS groups that could help power future studies. These results, which require replication with larger samples, are consistent with previous reports of small and variable effect sizes of these interventions on motor learning. PMID:29740271
Concurrent analysis: towards generalisable qualitative research.
Snowden, Austyn; Martin, Colin R
2011-10-01
This study develops an original method of qualitative analysis coherent with its interpretivist principles. The objective is to increase the likelihood of achieving generalisability and so improve the chance of the findings being translated into practice. Good qualitative research depends on coherent analysis of different types of data. The limitations of existing methodologies are first discussed to justify the need for a novel approach. To illustrate this approach, primary evidence is presented using the new methodology. The primary evidence consists of a constructivist grounded theory of how mental health nurses with prescribing authority integrate prescribing into practice. This theory is built concurrently from interviews, reflective accounts and case study data from the literature. Concurrent analysis. Ten research articles and 13 semi-structured interviews were sampled purposively and then theoretically, and analysed concurrently using constructivist grounded theory. A theory of the process of becoming competent in mental health nurse prescribing was generated through this process. This theory was validated by 32 practising mental health nurse prescribers as an accurate representation of their experience. The methodology generated a coherent and generalisable theory. It is therefore claimed that concurrent analysis engenders consistent and iterative treatment of different sources of qualitative data in a manageable manner. This process supports facilitation of the highest standard of qualitative research. Concurrent analysis removes the artificial delineation of relevant literature from other forms of constructed data. This gives researchers clear direction to treat qualitative data consistently, raising the chances of generalisability of the findings. Raising the generalisability of qualitative research will increase its chances of informing clinical practice. © 2010 Blackwell Publishing Ltd.
A Study on the Consistency of Discretization Equation in Unsteady Heat Transfer Calculations
Directory of Open Access Journals (Sweden)
Wenhua Zhang
2013-01-01
Full Text Available Previous studies on the consistency of discretization equations mainly focused on the finite difference method, but several consistency problems remain far from solved in actual numerical computation. For instance, a consistency problem arises in the numerical case where the boundary variables are solved explicitly while the variables away from the boundary are solved implicitly. Likewise, when the coefficient of the discretization equation in a nonlinear case is a function of the variables, calculating the coefficient explicitly while treating the variables implicitly may also give rise to a consistency problem. The present paper therefore investigates the consistency problems involved in the explicit treatment of the second and third boundary conditions and of a thermal conductivity that is a function of temperature. The numerical results indicate that the consistency problem deserves more attention and should not be neglected in practical computation.
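The coefficient-lagging issue this abstract describes can be illustrated on a zero-dimensional analogue (a sketch under assumed simplifications, not the paper's actual scheme): an implicit step of dT/dt = -k(T)·T in which the temperature-dependent coefficient k is either evaluated explicitly at the old temperature or iterated to self-consistency.

```python
def implicit_step_lagged(T, dt, k):
    # Coefficient k evaluated explicitly at the old temperature,
    # unknown treated implicitly: (T_new - T)/dt = -k(T_old) * T_new
    return T / (1.0 + dt * k(T))

def implicit_step_iterated(T, dt, k, iters=100):
    # Coefficient updated with the implicit unknown until self-consistent:
    # (T_new - T)/dt = -k(T_new) * T_new
    T_new = T
    for _ in range(iters):
        T_new = T / (1.0 + dt * k(T_new))
    return T_new
```

For a nonlinear k (e.g. k(T) = T) and a large step the two treatments disagree noticeably (0.5 versus about 0.618 for T = 1, dt = 1), while both converge to the same answer as dt shrinks; this is the kind of consistency question the paper examines for boundary conditions and temperature-dependent conductivity.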
"A Simplified 'Benchmark' Stock-flow Consistent (SFC) Post-Keynesian Growth Model"
Claudio H. Dos Santos; Gennaro Zezza
2007-01-01
Despite being arguably one of the most active areas of research in heterodox macroeconomics, the study of the dynamic properties of stock-flow consistent (SFC) growth models of financially sophisticated economies is still in its early stages. This paper attempts to offer a contribution to this line of research by presenting a simplified Post-Keynesian SFC growth model with well-defined dynamic properties, and using it to shed light on the merits and limitations of the current heterodox SFC li...
W. Cohen; H. Andersen; S. Healey; G. Moisen; T. Schroeder; C. Woodall; G. Domke; Z. Yang; S. Stehman; R. Kennedy; C. Woodcock; Z. Zhu; J. Vogelmann; D. Steinwand; C. Huang
2014-01-01
The authors are developing a REDD+ MRV system that tests different biomass estimation frameworks and components. Design-based inference from a costly field plot network was compared to sampling with LiDAR strips and a smaller set of plots in combination with Landsat for disturbance monitoring. Biomass estimation uncertainties associated with these different data sets...
Sample Size Determination for One- and Two-Sample Trimmed Mean Tests
Luh, Wei-Ming; Olejnik, Stephen; Guo, Jiin-Huarng
2008-01-01
Formulas to determine the necessary sample sizes for parametric tests of group comparisons are available from several sources and appropriate when population distributions are normal. However, in the context of nonnormal population distributions, researchers recommend Yuen's trimmed mean test, but formulas to determine sample sizes have not been…
Market-consistent actuarial valuation
Wüthrich, Mario V
2016-01-01
This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.
Neelotpol, Sharmind; Hay, Alastair W M; Jolly, A Jim; Woolridge, Mike W
2016-08-31
To recruit South Asian pregnant women living in the UK into a clinicoepidemiological study for the collection of lifestyle survey data and antenatal blood, and to retain the women for the later collection of cord blood and meconium samples from their babies for biochemical analysis. A longitudinal study recruiting pregnant women of South Asian and Caucasian origin living in the UK. Recruitment of the participants and collection of clinical samples and survey data took place at the 2 sites within a single UK Northern Hospital Trust. Pregnant women of South Asian origin (study group, n=98) and of Caucasian origin (comparison group, n=38) living in Leeds, UK. Among the participants approached, 81% agreed to take part in the study when a 'direct approach' method was followed. The retention rate of the participants was a remarkable 93.4%. The main challenges in recruiting the ethnic minority participants were their cultural and religious conservatism, the language barrier, lack of interest and a feeling of extra 'stress' in taking part in research. The chief investigator developed an innovative participant retention method, associated with the women's cultural and religious practices. The method proved useful in retaining the participants for about 5 months and in enabling successful collection of clinical samples from the same mother-baby pairs. The collection of clinical samples and lifestyle data exceeded the calculated sample size required to give the study sufficient power. The numbers of samples obtained were: maternal blood (n=171), cord blood (n=38), meconium (n=176), lifestyle questionnaire data (n=136) and postnatal records (n=136). Recruitment and retention of participants, according to the calculated sample size, ensured sufficient power and success for a clinicoepidemiological study. Results suggest that development of trust and confidence between the participant and the researcher is the key to the success of a clinical and epidemiological study involving
Energy Technology Data Exchange (ETDEWEB)
Rusin, Tiago; Rebello, Wilson F.; Vellozo, Sergio O.; Gomes, Renato G., E-mail: tiagorusin@ime.eb.b, E-mail: rebello@ime.eb.b, E-mail: vellozo@cbpf.b, E-mail: renatoguedes@ime.eb.b [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil). Dept. de Engenharia Nuclear; Vital, Helio C., E-mail: vital@ctex.eb.b [Centro Tecnologico do Exercito (CTEx), Rio de Janeiro, RJ (Brazil); Silva, Ademir X., E-mail: ademir@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear
2011-07-01
A cavity-type cesium-137 research irradiation facility at CTEx has been modeled using the Monte Carlo code MCNPX. The irradiator has been used daily in experiments to optimize the use of ionizing radiation for the conservation of many kinds of food and to improve material properties. In order to correlate the effects of the treatment, average doses have been calculated for each irradiated sample, accounting for the measured dose rate distribution in the irradiation chambers. However, that approach is only approximate, being subject to significant systematic errors due to the heterogeneous internal structure of most samples, which can lead to large anisotropy in attenuation and Compton scattering properties across the media. Thus, this work is aimed at further investigating such uncertainties by calculating the dose rate distribution inside the treated items, such that a more accurate and representative estimate of the total absorbed dose can be determined for later use in the effects-versus-dose correlation curves. Samples of different simplified geometries and densities (spheres, cylinders and parallelepipeds) have been modeled to evaluate internal dose rate distributions within the volume of the samples and the overall effect on the average dose. (author)
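As an illustration of the approach (a deliberately simplified sketch, not the MCNPX model: monoenergetic parallel beam, homogeneous sphere, pure exponential attenuation with no build-up or scatter), the mean relative dose over a sample volume can be estimated by Monte Carlo integration:

```python
import math
import random

def mean_relative_dose_sphere(radius_cm, mu_cm, n=20000, seed=1):
    """Monte Carlo estimate of the mean relative dose in a homogeneous sphere
    exposed to a parallel photon beam along +x, assuming dose ~ exp(-mu*depth)
    (build-up and scatter ignored; a deliberate simplification)."""
    rng = random.Random(seed)
    total, count = 0.0, 0
    while count < n:
        # rejection-sample a point uniformly inside the sphere
        x = rng.uniform(-radius_cm, radius_cm)
        y = rng.uniform(-radius_cm, radius_cm)
        z = rng.uniform(-radius_cm, radius_cm)
        if x * x + y * y + z * z > radius_cm ** 2:
            continue
        # depth from the upstream surface along the beam direction:
        # the beam enters this chord at x = -sqrt(R^2 - y^2 - z^2)
        chord = math.sqrt(radius_cm ** 2 - y * y - z * z)
        total += math.exp(-mu_cm * (x + chord))
        count += 1
    return total / n
```

With mu_cm = 0 the estimate returns 1.0 (no attenuation) and it decreases monotonically as the attenuation coefficient or sample size grows, which is the qualitative effect an average-dose correction has to capture.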
International Nuclear Information System (INIS)
Rusin, Tiago; Rebello, Wilson F.; Vellozo, Sergio O.; Gomes, Renato G.; Silva, Ademir X.
2011-01-01
A cavity-type cesium-137 research irradiation facility at CTEx has been modeled using the Monte Carlo code MCNPX. The irradiator has been used daily in experiments to optimize the use of ionizing radiation for the conservation of many kinds of food and to improve material properties. In order to correlate the effects of the treatment, average doses have been calculated for each irradiated sample, accounting for the measured dose rate distribution in the irradiation chambers. However, that approach is only approximate, being subject to significant systematic errors due to the heterogeneous internal structure of most samples, which can lead to large anisotropy in attenuation and Compton scattering properties across the media. Thus, this work is aimed at further investigating such uncertainties by calculating the dose rate distribution inside the treated items, such that a more accurate and representative estimate of the total absorbed dose can be determined for later use in the effects-versus-dose correlation curves. Samples of different simplified geometries and densities (spheres, cylinders and parallelepipeds) have been modeled to evaluate internal dose rate distributions within the volume of the samples and the overall effect on the average dose. (author)
Irradiation chamber and sample changer for biological samples
International Nuclear Information System (INIS)
Kraft, G.; Daues, H.W.; Fischer, B.; Kopf, U.; Liebold, H.P.; Quis, D.; Stelzer, H.; Kiefer, J.; Schoepfer, F.; Schneider, E.
1980-01-01
This paper describes an irradiation system with which living cells of different origin are irradiated with heavy ion beams (18 ≤ Z ≤ 92) at energies up to 10 MeV/amu. The system consists of a beam monitor connected to the vacuum system of the accelerator and the irradiation chamber, which contains the biological samples under atmospheric pressure. The requirements and aims of the setup are discussed. The first results with Saccharomyces cerevisiae and Chinese hamster tissue cells are presented. (orig.)
Quasiparticles and thermodynamical consistency
International Nuclear Information System (INIS)
Shanenko, A.A.; Biro, T.S.; Toneev, V.D.
2003-01-01
A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are derived in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)
DiGirolamo, Ann; Geller, Alan C.; Tendulkar, Shalini A.; Patil, Pratima; Hacker, Karen
2012-01-01
Abstract Purpose: To determine the community‐based participatory research (CBPR) training interests and needs of researchers interested in CBPR to inform efforts to build infrastructure for conducting community‐engaged research. Method: A 20‐item survey was completed by 127 academic health researchers at Harvard Medical School, Harvard School of Public Health, and Harvard affiliated hospitals. Results: Slightly more than half of the participants reported current or prior experience with CBPR (58 %). Across all levels of academic involvement, approximately half of the participants with CBPR experience reported lacking skills in research methods and dissemination, with even fewer reporting skills in training of community partners. Regardless of prior CBPR experience, about half of the respondents reported having training needs in funding, partnership development, evaluation, and dissemination of CBPR projects. Among those with CBPR experience, more than one‐third of the participants wanted a mentor in CBPR; however only 19 % were willing to act as a mentor. Conclusions: Despite having experience with CBPR, many respondents did not have the comprehensive package of CBPR skills, reporting a need for training in a variety of CBPR skill sets. Further, the apparent mismatch between the need for mentors and availability in this sample suggests an important area for development. Clin Trans Sci 2012; Volume #: 1–5 PMID:22686211
International Nuclear Information System (INIS)
Baik, Min Hoon; Lee, Seung Yeop; Cho, Won Jin
2006-11-01
In this report, a comprehensive review is presented of the research results and status of the various effects of microbes in radioactive waste disposal, including the definition and classification of microbes, and of research related to waste containers, engineered barriers, natural barriers, natural analogue studies, and radionuclide migration and retardation. Cultivation, isolation and classification of aerobic microbes found in groundwater sampled from the KAERI Underground Research Tunnel (KURT), located at the KAERI site, have been carried out, and over 20 microbes were found to be present in the groundwater. Microbial identification by 16S rDNA genetic analysis of the 10 major aerobic microbes was performed and the identified microbes were characterized
Research of pneumatic control transmission system for small irradiation samples
International Nuclear Information System (INIS)
Bai Zhongxiong; Zhang Haibing; Rong Ru; Zhang Tao
2008-01-01
In order to reduce the absorbed dose to the operator, pneumatic control has been adopted to realize the rapid transmission of small irradiation samples. The on/off state of the pneumatic circuit and the transmission directions are controlled by the electrical control part. The main program initializes the system, detects the position of the manual/automatic change-over switch, and calls the corresponding subprogram to achieve automatic or manual operation. The automatic subprogram performs automatic sample transmission; the manual subprogram handles deflation and the back-and-forth movement of the irradiation samples. This paper introduces the implementation of the system in detail, in terms of both hardware and software design. (authors)
International Nuclear Information System (INIS)
Jin Meisun; Wang Benli; Liu Wencang
1988-04-01
A large rapid dry-ashing apparatus and a rapid ashing method for biological samples are described. The apparatus consists of a specially made ashing furnace, a gas supply system and a temperature-programming control cabinet. Ashing experiments with the apparatus showed the following advantages: (1) high ashing speed and savings in electric energy; (2) the apparatus can ash a large number of samples at a time; (3) the ashed sample is pure white (or spotless), loose and easily soluble, with little residual char; (4) fresh samples can also be ashed directly. The apparatus is suitable for ashing large numbers of environmental samples containing trace elements at low levels of radioactivity, as well as medical, food and agricultural research samples
International Nuclear Information System (INIS)
Rahman, M.; Molla, N.I.; Sharif, A.K.M.; Basunia, S.; Islam, S.; Miah, R.U.; Hossain, S.M.; Chowdhury, M.I.; Bhuiyan, A.D.; Stegnar, P.
1993-01-01
Uranium and thorium were determined in geological materials such as radioactive rock samples collected from the Harargaj Anticline in Moulavi Bazar. The purely instrumental neutron activation analysis (INAA) technique was used for qualitative and quantitative analysis of the rock samples for U and Th. The samples were properly prepared together with their standards and simultaneously irradiated in a neutron flux of the order of 10{sup 12} n·cm{sup -2}·s{sup -1} using the TRIGA MARK II research reactor facility at the AERE, Savar, Dhaka. After activation the samples were subjected to γ-ray spectrometry using a high-purity germanium detection system. As a result of the analysis, U and Th could be determined. The data are consistent with the values reported by the ground radiometric survey group for some of the samples. (author) 7 refs.; 1 fig.; 2 tabs
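The induced activity underlying such INAA measurements follows the standard activation law A = φ·σ·N·(1 − e^(−λ·t_irr)); a small sketch with illustrative values only (not the paper's data):

```python
import math

def induced_activity(phi, sigma_cm2, n_atoms, half_life_s, t_irr_s):
    """Activity (decays/s) induced by irradiating n_atoms target atoms in a
    neutron flux phi (n/cm^2/s) with capture cross-section sigma_cm2 (cm^2),
    using the standard activation law A = phi*sigma*N*(1 - exp(-lambda*t))."""
    lam = math.log(2.0) / half_life_s
    return phi * sigma_cm2 * n_atoms * (1.0 - math.exp(-lam * t_irr_s))
```

Irradiating for one half-life yields half the saturation activity φ·σ·N; beyond a few half-lives the activity saturates, which is why irradiation times are chosen relative to the half-life of the product nuclide.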
Power Spectrum Estimation of Randomly Sampled Signals
DEFF Research Database (Denmark)
Velte, C. M.; Buchhave, P.; K. George, W.
algorithms: sample-and-hold and the direct spectral estimator without residence-time weighting. The computer-generated signal is a Poisson process with a sample rate proportional to the velocity magnitude and consists of well-defined frequency content, which makes bias easy to spot. The idea...
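A minimal pure-Python sketch of the sample-and-hold estimator mentioned above (our own illustrative implementation, not the authors' code): irregular samples are held onto a uniform grid and a periodogram probe is then evaluated at a chosen frequency.

```python
import math

def sample_and_hold_resample(t, x, fs):
    """Zero-order hold of irregularly sampled data (t, x) onto a uniform grid
    of rate fs; each grid instant takes the most recent irregular sample."""
    out, j = [], 0
    u = t[0]
    while u <= t[-1]:
        while j + 1 < len(t) and t[j + 1] <= u:
            j += 1
        out.append(x[j])
        u += 1.0 / fs
    return out

def power_at(freq, fs, samples):
    """Periodogram power of the uniform (mean-removed) samples at one probe
    frequency, via a direct discrete Fourier sum."""
    n = len(samples)
    mean = sum(samples) / n
    re = sum((s - mean) * math.cos(2.0 * math.pi * freq * k / fs)
             for k, s in enumerate(samples))
    im = sum((s - mean) * math.sin(2.0 * math.pi * freq * k / fs)
             for k, s in enumerate(samples))
    return (re * re + im * im) / (fs * n)
```

Probing a Poisson-sampled 50 Hz sinusoid at 50 Hz versus an off-signal frequency shows the spectral peak clearly; the hold interpolation itself is known to act as a low-pass filter, one source of the bias such studies examine.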
Statistical sampling methods for soils monitoring
Ann M. Abbott
2010-01-01
Development of the best sampling design to answer a research question should be an interactive venture between the land manager or researcher and statisticians, and is the result of answering various questions. A series of questions that can be asked to guide the researcher in making decisions that will arrive at an effective sampling plan are described, and a case...
Research on test of product based on spatial sampling criteria and variable step sampling mechanism
Li, Ruihong; Han, Yueping
2014-09-01
This paper presents an effective approach for online testing of the assembly structures inside products, using a multiple-views technique and an X-ray digital radiography system based on spatial sampling criteria and a variable-step sampling mechanism. Although several objects inside one product may need to be tested, for each object there is a maximal rotary step within which the least structural size to be tested remains detectable. In the offline learning process, the object is rotated by this step and imaged repeatedly until a complete cycle is finished, yielding an image sequence that includes the full structural information for recognition. The maximal rotary step is restricted by the least structural size and the inherent resolution of the imaging system. During the online inspection process, the program first finds the optimum solutions for all the different target parts in the standard sequence, i.e., finds their exact angles in one cycle. Because most of the other targets in the product are larger than the least structure, the method adopts a variable step-size sampling mechanism that rotates the product through specific angles with different steps, according to the different objects inside the product, and then performs matching. Experimental results show that the variable step-size method can greatly save time compared with the traditional fixed-step inspection method while the recognition accuracy is guaranteed.
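One hedged reading of the rotary-step bound described above can be sketched as follows (our own illustration, not the paper's criterion: the arc swept at the outermost radius between successive views must not exceed the least structural size, and that size must itself be resolvable by the detector):

```python
import math

def max_rotary_step_deg(least_feature, object_radius, pixel_size):
    """Largest rotary step (degrees) such that the arc swept at the object's
    outer radius between successive views does not exceed the least structural
    size to be tested. All lengths in the same unit."""
    if least_feature < 2.0 * pixel_size:
        # assumed detectability criterion: a feature should span >= 2 pixels
        raise ValueError("least feature is below the imaging resolution")
    return math.degrees(least_feature / object_radius)
```

Under this reading, a 1 mm feature on a part of radius 180/π ≈ 57.3 mm allows a step of about 1 degree, i.e. roughly 360 views per cycle; larger features tolerate proportionally coarser steps, which is what a variable-step mechanism exploits.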
Major ions in spitsbergen snow samples
International Nuclear Information System (INIS)
Semb, A.; Braekkan, R.; Joranger, E.
1984-01-01
Chemical analysis of Spitsbergen snow cores sampled in spring 1983 reveals a spatial pattern consistent with orographic deposition of major anthropogenic pollutants carried by air movements from southeast towards northwest. The highest concentrations of pollutant species were found at an altitude of 700 metres above sea level and are higher than in any other recorded snow samples from the Arctic
Concepts in sample size determination
Directory of Open Access Journals (Sweden)
Umadevi K Rao
2012-01-01
Full Text Available Investigators involved in clinical, epidemiological or translational research have the drive to publish their results so that they can extrapolate their findings to the population. This begins with the preliminary steps of deciding the topic to be studied, the subjects and the type of study design. In this context, the researcher must determine how many subjects would be required for the proposed study. Thus, the number of individuals to be included in the study, i.e., the sample size, is an important consideration in the design of many clinical studies. The sample size determination should be based on the difference in the outcome between the two groups studied, as in an analytical study, as well as on the accepted p value for statistical significance and the required statistical power to test the hypothesis. The accepted risk of type I error, or alpha value, which by convention is set at the 0.05 level in biomedical research, defines the cutoff point at which the p value obtained in the study is judged as significant or not. The power in clinical research is the likelihood of finding a statistically significant result when it exists and is typically set to >80%. This is necessary since even the most rigorously executed studies may fail to answer the research question if the sample size is too small. Alternatively, a study with too large a sample size will be difficult to undertake and will result in a waste of time and resources. Thus, the goal of sample size planning is to estimate an appropriate number of subjects for a given study design. This article describes the concepts involved in estimating the sample size.
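The calculation described above can be sketched for the common case of comparing two means (a hedged illustration using the standard normal-approximation formula; the function name and defaults are ours, not the article's):

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sided, two-sample
    comparison of means: n = 2*((z_{1-alpha/2} + z_{power}) * sigma / delta)^2,
    where delta is the smallest difference worth detecting."""
    z_alpha = NormalDist().inv_cdf(1.0 - alpha / 2.0)  # type I error cutoff
    z_beta = NormalDist().inv_cdf(power)               # power requirement
    return math.ceil(2.0 * ((z_alpha + z_beta) * sigma / delta) ** 2)
```

For example, detecting a difference of half a standard deviation (delta/sigma = 0.5) at alpha = 0.05 with 80% power requires about 63 subjects per group; halving the detectable difference roughly quadruples the required sample, which is why underpowered and overpowered designs are both costly.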
Zhou, Jie; Dovidio, John; Wang, Erping
2013-01-01
The moderating role of affective-cognitive consistency in the effects of affectively-based and cognitively-based attitudes on consummatory and instrumental behaviors was explored using two experimental studies in the intergroup context. Study 1 revealed that affectively-based attitudes were better predictors than cognitively-based attitudes regardless of affective-cognitive consistency for consummatory behaviors (e.g., undergraduates’ supportive behaviors toward government officials). Study 2, which investigated task groups’ supportive behaviors toward an immediate supervisory group, found that for these instrumental behaviors cognitively-based attitudes were better predictors than affectively-based attitudes only when affective-cognitive consistency was high. The present research also examined the mechanism by which affective-cognitive consistency moderates the relative roles of affectively-based and cognitively-based attitudes in attitude-behavior consistency. Results indicated that attitude-behavior consistency is eroded primarily because of the weaker relationship of affective or cognitive components to behaviors than to general attitudes. The reciprocal implications of research on attitudes and work on intergroup relations are considered. PMID:24244751
Mixed Methods Sampling: A Typology with Examples
Teddlie, Charles; Yu, Fen
2007-01-01
This article presents a discussion of mixed methods (MM) sampling techniques. MM sampling involves combining well-established qualitative and quantitative techniques in creative ways to answer research questions posed by MM research designs. Several issues germane to MM sampling are presented including the differences between probability and…
Directory of Open Access Journals (Sweden)
Gang Ma
2015-01-01
Full Text Available Lightweight aggregate concrete containing glazed hollow bead (GHB) as lightweight aggregate is studied for the influence of nanosilica (NS) content, prewetting time for GHB, water-cement ratio and curing humidity on the interface structure between GHB and cement paste. This research analyzed the influence of these factors on the interface zone structure by measuring the microhardness (HV) and the hydration degree (HD) of the cement paste near the interface zone (1 mm) between GHB and cement paste at different ages. Due to sampling limitations, the interface zone in this test is within 1 mm of the surface of the lightweight aggregate. The HD of the cement paste was determined through a chemically combined water (CCW) test. The results were expected to reflect the influence of the various factors on the interface zone structure. Results showed that rational control of the four factors studied could fully mobilize the water absorption and desorption properties of GHB to improve the characteristics of the interfacial transition zone.
Recent Results from the SAMPLE Experiment
International Nuclear Information System (INIS)
Ito, Takeyasu M.
2004-01-01
The previous two SAMPLE experiments yielded a measurement of the axial e-N form factor G{sub A}{sup e} substantially different from the theoretical estimate. In order to confirm this observation, a third SAMPLE experiment was carried out at a lower beam energy of 125 MeV (Q{sup 2} = 0.038 (GeV/c){sup 2}) on a deuterium target. The data analysis is now at the final stage and the results are consistent with the theoretical prediction of the axial form factor G{sub A}{sup e}. Also, reevaluation of the background dilution factor and the electromagnetic radiative correction for the 200 MeV deuterium data leads to updated results, which are also consistent with the theoretical prediction
Helena Prosen
2014-01-01
Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful; therefore, one of the possible greening directions is miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust, etc.) published in the last decade. Several...
Vapor and gas sampling of single-shell tank 241-B-102 using the in situ vapor sampling system
International Nuclear Information System (INIS)
Lockrem, L.L.
1997-01-01
The Vapor Issue Resolution Program tasked the Vapor Team (the team) to collect representative headspace samples from Hanford Site single-shell tank (SST) 241-B-102. This document presents sampling data resulting from the April 18, 1996 sampling of SST 241-B-102. Analytical results will be presented in a separate report issued by Pacific Northwest National Laboratory (PNNL), which supplied and analyzed the sampling media. The team, consisting of Sampling and Mobile Laboratories (SML) and Special Analytical Studies (SAS) personnel, used the vapor sampling system (VSS) to collect representative samples of the air, gases, and vapors from the headspace of SST 241-B-102 with sorbent traps and SUMMA canisters
Yahalom, Ran; Yarom, Noam; Shani, Tali; Amariglio, Ninet; Kaplan, Ilana; Trakhtenbrot, Luba; Hirshberg, Abraham
2016-04-01
Oral lichen planus (OLP) carries an increased risk for malignant transformation with aneuploid cells (ACs) being found in brush samples of a quarter of patients with OLP. Patients with OLP were followed and repeated brush samples were simultaneously analyzed for morphology and fluorescent in situ hybridization (FISH) using centromeric probes for chromosomes 2 and 8. Three patients with a high proportion of ACs developed oral cancer. Fifteen patients had ≥1% ACs (13 in affected sites and 2 in nonaffected sites), whereas only 2 of the 15 patients with <1% ACs in the first sample had ≥1% ACs in the second sample. A strong positive correlation between the results of the initial and repeated samples was found. High proportion of ACs in brush samples from patients with OLP may imply an impending malignant transformation. As FISH analysis is consistent over time, it can be used to identify a subgroup of patients who would require close follow-up. © 2015 Wiley Periodicals, Inc. Head Neck 38: E741-E746, 2016. © 2015 Wiley Periodicals, Inc.
Internal Consistency and Convergent Validity of the Klontz Money Behavior Inventory (KMBI)
Directory of Open Access Journals (Sweden)
Colby D. Taylor
2015-12-01
Full Text Available The Klontz Money Behavior Inventory (KMBI) is a standalone, multi-scale measure that can screen for the presence of eight distinct money disorders. Given the well-established relationship between mental health and financial behaviors, results from the KMBI can be used to inform both mental health care professionals and financial planners. The present study examined the internal consistency and convergent validity of the KMBI through comparison with similar measures among a sample of college students (n = 232). Results indicate that the KMBI demonstrates acceptable internal consistency reliability and some convergence for most subscales when compared to other analogous measures. These findings highlight a need for literature and assessments to identify and describe disordered money behaviors.
40 CFR Appendix I to Part 261 - Representative Sampling Methods
2010-07-01
40 CFR Part 261, Appendix I (2010-07-01): The methods and equipment used for sampling waste materials will vary with the form and consistency of the waste materials to be sampled. Samples collected using the sampling...
Radiochemical analysis of phosphorus in milk samples
International Nuclear Information System (INIS)
Oliveira, R.M. de; Cunha, I.I.L.
1991-01-01
The determination of phosphorus in milk samples by thermal neutron activation analysis employing radiochemical separation is described. The procedure consists of the simultaneous irradiation of samples and standards, dissolution of the milk samples in a perchloric acid and nitric acid mixture, addition of zinc hold-back carrier, precipitation of phosphorus as ammonium phosphomolybdate (A.M.P.), and sample counting in a Geiger-Mueller detector. Sources of error in the analysis were studied, and the established method was applied to phosphorus analyses in commercial milk samples. (author)
Where Will All Your Samples Go?
Lehnert, K.
2017-12-01
Even in the digital age, physical samples remain an essential component of Earth and space science research. Geoscientists collect samples, sometimes locally, often in remote locations during expensive field expeditions, or at sample repositories and museums. They take these samples to their labs to describe and analyze them. When the analyses are completed and the results are published, the samples get stored away in sheds, basements, or desk drawers, where they remain unknown and inaccessible to the broad science community. In some cases, they will get re-analyzed or shared with other researchers, who know of their existence through personal connections. The sad end comes when the researcher retires: There are many stories of samples and entire collections being discarded to free up space for new samples or other purposes, even though these samples may be unique and irreplaceable. Institutions do not feel obligated and do not have the resources to store samples in perpetuity. Only samples collected in large sampling campaigns such as the Ocean Discovery Program or cores taken on ships find a home in repositories that curate and preserve them for reuse in future science endeavors. In the era of open, transparent, and reproducible science, preservation and persistent access to samples must be considered a mandate. Policies need to be developed that guide investigators, institutions, and funding agencies to plan and implement solutions for reliably and persistently curating and providing access to samples. Registration of samples in online catalogs and use of persistent identifiers such as the International Geo Sample Number are first steps to ensure discovery and access of samples. But digital discovery and access loses its value if the physical objects are not preserved and accessible. It is unreasonable to expect that every sample ever collected can be archived. Selection of those samples that are worth preserving requires guidelines and policies. We also need to
Validity and internal consistency of a whiplash-specific disability measure.
Pinfold, Melanie; Niere, Ken R; O'Leary, Elizabeth F; Hoving, Jan Lucas; Green, Sally; Buchbinder, Rachelle
2004-02-01
Cross-sectional study of patients with whiplash-associated disorders investigating the internal consistency, factor structure, response rates, and presence of floor and ceiling effects of the Whiplash Disability Questionnaire (WDQ). The aim of this study was to confirm the appropriateness of the proposed WDQ items. Whiplash injuries are a common cause of pain and disability after motor vehicle accidents. Neck disability questionnaires are often used in whiplash studies to assess neck pain but lack content validity for patients with whiplash-associated disorders. The newly developed WDQ measures functional limitations associated with whiplash injury and was designed after interviews with 83 patients with whiplash in a previous study. Researchers sought expert opinion on items of the WDQ, and items were then tested on a clinical whiplash population. Data were inspected to determine floor and ceiling effects, response rates, factor structure, and internal consistency. Packages of questionnaires were distributed to 55 clinicians, whose patients with whiplash completed and returned 101 questionnaires to researchers. No substantial floor or ceiling effects were identified on inspection of data. The overall floor effect was 12%, and the overall ceiling effect was 4%. Principal component analysis identified one broad factor that accounted for 65% of the variance in responses. Internal consistency was high; Cronbach's alpha = 0.96. Results of the study supported the retention of the 13 proposed items in a whiplash-specific disability questionnaire. Dependent on the results of further psychometric testing, the WDQ is likely to be an appropriate outcome measure for patients with whiplash.
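The reliability statistic reported here, Cronbach's alpha, is simple to compute from an item-score matrix: it compares the sum of the individual item variances to the variance of the total score. A minimal Python sketch (the `cronbach_alpha` helper and the five-respondent data are invented for illustration, not taken from the WDQ study):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Invented data: 5 respondents answering 3 strongly correlated items.
scores = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [1, 2, 1],
    [3, 3, 4],
])
print(round(cronbach_alpha(scores), 2))  # prints 0.96
```

Values near 1 indicate that the items move together; the 0.96 reported for the 13-item WDQ is in the range usually read as high internal consistency.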
Sample preparation techniques of biological material for isotope analysis
International Nuclear Information System (INIS)
Axmann, H.; Sebastianelli, A.; Arrillaga, J.L.
1990-01-01
Sample preparation is an essential step in all isotope-aided experiments, but often it is not given enough attention. The methods of sample preparation are very important for obtaining reliable and precise analytical data and for further interpretation of results. The size of a sample required for chemical analysis is usually very small (10 mg-1500 mg). On the other hand, the amount of harvested plant material from plots in a field experiment is often bulky (several kilograms), and the entire sample is too large for processing. In addition, while approaching maturity many crops show not only differences in physical consistency but also non-uniformity in 15N content among plant parts, requiring fractionation or separation of the plant into parts (vegetative and reproductive), e.g. shoots and spikes in the case of small grain cereals, shoots and pods in the case of grain legumes, and tops and roots or beets (including crown) in the case of sugar beet. In any case the ultimate goal of these procedures is to obtain a representative subsample of material harvested from greenhouse or field experiments for chemical analysis. Before harvesting an isotope-aided experiment, the method of sampling has to be selected. It should be based on the type of information required in relation to the objectives of the research and the availability of resources (staff, sample preparation equipment, analytical facilities, chemicals and supplies, etc.). 10 refs, 3 figs, 3 tabs
Ross, M W; Tikkanen, R; Månsson, S A
2000-09-01
The Internet is becoming a new erotic oasis for obtaining sex online or in person. We reviewed the literature on cybersex and compared differences in data from samples of homosexually active men obtained on identical questionnaires: a conventional written questionnaire, distributed through the mailing and contact lists of a large national gay organization in Sweden, and an online questionnaire distributed through the same organization's website and chat room. A total of 716 written questionnaires and 678 Internet questionnaires were obtained. The Internet sample was younger, more likely to live in small towns or cities, live with parents or a girlfriend, and have lower formal education. They were less likely to have previous sexual experience solely with other men (one in three of the Internet sample vs. 1 in 14 of the written sample defined themselves as bisexual) and more likely to visit erotic oases such as bathhouses, video clubs and erotic movie houses. They also visited Internet chat rooms more frequently (86% of the Internet sample vs. 50% of the written sample). One third of the Internet sample wanted the opportunity to talk with an expert about HIV, compared with a quarter of the written sample. Sexual practices between the two samples were generally similar, although the Internet sample reported significantly less body contact, kissing, hugging, and mutual masturbation, and more condom use for anal intercourse with steady partners. Over four times as many respondents in the Internet sample reported sex with women in the past year as in the written sample. These data indicate that Internet data collection is feasible and that this mode of data collection, despite the nonrandom and self-selected nature of both types of samples, is likely to be oriented significantly more toward young, geographically isolated, and behaviorally and self-identified bisexual respondents than conventionally distributed written questionnaires.
Campo-Arias, Adalberto; Oviedo, Heidi Celina; Díaz, Carmen Elena; Cogollo, Zuleima
2006-12-01
This study evaluated the internal consistency of a Spanish version of the short form of the Francis Scale of Attitude Toward Christianity based on responses of 405 Colombian adolescent students ages 13 to 17 years. This translated short-form version of the scale had an internal consistency of .80. This estimate indicates suitable internal consistency reliability for research use in this population.
Rebuilding Status Consistency in a Post-Communist Society. The Czech Republic, 1991-97
Czech Academy of Sciences Publication Activity Database
Matějů, Petr; Kreidl, Martin
2001-01-01
Vol. 14, No. 1 (2001), pp. 17-34. ISSN 1351-1610. Institutional research plan: CEZ:AV0Z7028912. Keywords: status consistency; social transformation; Czech Republic. Subject RIV: AO - Sociology, Demography
Ates, Ozlem; Unal Coban, Gul; Kaya Sengoren, Serap
2018-01-01
This study aims to explain the extent to which prospective physics teachers' views and practices are consistent with the constructivist framework. A case study design was employed as the research approach. The study was conducted with 11 prospective physics teachers attending a state university in Turkey. Data was collected through semi-structured…
Consistent-handed individuals are more authoritarian.
Lyle, Keith B; Grillo, Michael C
2014-01-01
Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.
Directory of Open Access Journals (Sweden)
Charlotte Benoot
2016-02-01
Full Text Available Abstract Background An increasing number of qualitative evidence synthesis papers are found in health care literature. Many of these syntheses use a strictly exhaustive search strategy to collect articles, mirroring the standard template developed by major review organizations such as the Cochrane and Campbell Collaboration. The hegemonic idea behind it is that non-comprehensive samples in systematic reviews may introduce selection bias. However, exhaustive sampling in a qualitative evidence synthesis has been questioned, and a more purposeful way of sampling papers has been proposed as an alternative, although there is a lack of transparency on how these purposeful sampling strategies might be applied to a qualitative evidence synthesis. We discuss in our paper why and how we used purposeful sampling in a qualitative evidence synthesis about ‘sexual adjustment to a cancer trajectory’, by giving a worked example. Methods We chose a mixed purposeful sampling, combining the three strategies that we considered most consistent with our research purpose: intensity sampling, maximum variation sampling and confirming/disconfirming case sampling. Results The concept of purposeful sampling on the meta-level could not readily be borrowed from the logic applied in basic research projects. It also demands a considerable amount of flexibility and is labour-intensive, which goes against the argument of many authors that purposeful sampling provides a pragmatic solution or a short cut for researchers compared with exhaustive sampling. Opportunities of purposeful sampling were the possible inclusion of new perspectives into the line-of-argument and the enhancement of the theoretical diversity of the papers being included, which could make the results more conceptually aligned with the synthesis purpose. Conclusions This paper helps researchers to make decisions related to purposeful sampling in a more systematic and transparent way.
Benoot, Charlotte; Hannes, Karin; Bilsen, Johan
2016-02-18
An increasing number of qualitative evidence synthesis papers are found in health care literature. Many of these syntheses use a strictly exhaustive search strategy to collect articles, mirroring the standard template developed by major review organizations such as the Cochrane and Campbell Collaboration. The hegemonic idea behind it is that non-comprehensive samples in systematic reviews may introduce selection bias. However, exhaustive sampling in a qualitative evidence synthesis has been questioned, and a more purposeful way of sampling papers has been proposed as an alternative, although there is a lack of transparency on how these purposeful sampling strategies might be applied to a qualitative evidence synthesis. We discuss in our paper why and how we used purposeful sampling in a qualitative evidence synthesis about 'sexual adjustment to a cancer trajectory', by giving a worked example. We chose a mixed purposeful sampling, combining the three strategies that we considered most consistent with our research purpose: intensity sampling, maximum variation sampling and confirming/disconfirming case sampling. The concept of purposeful sampling on the meta-level could not readily be borrowed from the logic applied in basic research projects. It also demands a considerable amount of flexibility and is labour-intensive, which goes against the argument of many authors that purposeful sampling provides a pragmatic solution or a short cut for researchers compared with exhaustive sampling. Opportunities of purposeful sampling were the possible inclusion of new perspectives into the line-of-argument and the enhancement of the theoretical diversity of the papers being included, which could make the results more conceptually aligned with the synthesis purpose. This paper helps researchers to make decisions related to purposeful sampling in a more systematic and transparent way. Future research could confirm or disconfirm the hypothesis of conceptual
International Nuclear Information System (INIS)
Chung, Yong Sam; Moon, Jong Hwa; Chung, Young Ju; Jeong, Eui Sik; Lee, Sang Mi; Kang, Sang Hun; Cho, Seung Yeon; Kwon, Young Sik; Chung, Sang Wuk; Lee, Kyu Sung; Chun, Ki Hong; Kim, Nak Bae; Lee, Kil Yong; Yoon, Yoon Yeol; Chun, Sang Ki.
1997-09-01
This research report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples are analyzed quantitatively, and the accuracy and precision of the method are measured. Using airborne particulate matter and a biomonitor chosen as environmental indicators, trace elemental concentrations of samples collected monthly at urban and rural sites are determined, and then statistical calculations and factor analysis are carried out to investigate emission sources. Facilities for NAA are installed in the new HANARO reactor, and a functional test is performed for routine operation. In addition, a unified software code for NAA is developed to improve the accuracy, precision and capabilities of the analytical processes. (author). 103 refs., 61 tabs., 19 figs
Sampling Lesbian, Gay, and Bisexual Populations
Meyer, Ilan H.; Wilson, Patrick A.
2009-01-01
Sampling has been the single most influential component of conducting research with lesbian, gay, and bisexual (LGB) populations. Poor sampling designs can result in biased results that will mislead other researchers, policymakers, and practitioners. Investigators wishing to study LGB populations must therefore devote significant energy and…
Homosexual, gay, and lesbian: defining the words and sampling the populations.
Donovan, J M
1992-01-01
The lack of both specificity and consensus about definitions for homosexual, homosexuality, gay, and lesbian are first shown to confound comparative research and cumulative understanding because criteria for inclusion within the subject populations are often not consistent. The Description section examines sociolinguistic variables which determine patterns of preferred choice of terminology, and considers how these might impact gay and lesbian studies. Attitudes and style are found to influence word choice. These results are used in the second section to devise recommended definitional limits which would satisfy both communication needs and methodological purposes, especially those of sampling.
Experience sampling methodology in mental health research: new insights and technical developments.
Myin-Germeys, Inez; Kasanova, Zuzana; Vaessen, Thomas; Vachon, Hugo; Kirtley, Olivia; Viechtbauer, Wolfgang; Reininghaus, Ulrich
2018-06-01
In the mental health field, there is a growing awareness that the study of psychiatric symptoms in the context of everyday life, using experience sampling methodology (ESM), may provide a powerful and necessary addition to more conventional research approaches. ESM, a structured self-report diary technique, allows the investigation of experiences within, and in interaction with, the real-world context. This paper provides an overview of how zooming in on the micro-level of experience and behaviour using ESM adds new insights and additional perspectives to standard approaches. More specifically, it discusses how ESM: a) contributes to a deeper understanding of psychopathological phenomena, b) makes it possible to capture variability over time, c) aids in identifying internal and situational determinants of variability in symptomatology, and d) enables a thorough investigation of the interaction between the person and his/her environment and of real-life social interactions. Besides improving the assessment of psychopathology and its underlying mechanisms, ESM contributes to advancing and changing clinical practice by allowing a more fine-grained evaluation of treatment effects, as well as by providing the opportunity to extend treatment beyond the clinical setting into real life with the development of ecological momentary interventions. Furthermore, this paper provides an overview of the technical details of setting up an ESM study in terms of design, questionnaire development and statistical approaches. Overall, although a number of considerations and challenges remain, ESM offers one of the best opportunities for personalized medicine in psychiatry, from both a research and a clinical perspective. © 2018 World Psychiatric Association.
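A common way to implement the time-sampling design that such diary studies rely on is a stratified-random beep schedule: split the waking day into equal blocks and draw one prompt time per block, so prompts stay unpredictable while still covering the whole day. A minimal Python sketch (the function name, times, and block scheme are illustrative assumptions, not taken from the paper):

```python
import random
from datetime import datetime, timedelta

def esm_schedule(day_start, day_end, n_beeps, seed=None):
    """Draw one random prompt time inside each of n_beeps equal blocks."""
    rng = random.Random(seed)
    block = (day_end - day_start).total_seconds() / n_beeps
    return [day_start + timedelta(seconds=i * block + rng.uniform(0, block))
            for i in range(n_beeps)]

# Ten prompts between 08:00 and 22:00 on one study day.
start = datetime(2024, 5, 1, 8, 0)
end = datetime(2024, 5, 1, 22, 0)
for t in esm_schedule(start, end, n_beeps=10, seed=1):
    print(t.strftime("%H:%M"))
```

Because each draw is confined to its own block, consecutive prompts can never bunch up at one end of the day, unlike fully random scheduling.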
Kobayashi, Eriko; Satoh, Nobunori
2009-11-01
To assess the attitudes of the Japanese general public towards pharmacogenomics research and a DNA bank for identifying genomic markers associated with ADRs, and their willingness to donate DNA samples, we conducted a national survey of 1,103 Japanese adults drawn from the general public, not a patient population. The response rate was 36.8%. The majority of the respondents showed a positive attitude towards pharmacogenomics research (81.0%) and a DNA bank (70.4%). Considering fictitious clinical situations such as taking medications and experiencing ADRs, willingness to donate DNA samples when experiencing ADRs (61.7%) was higher than when taking medications (45.3%). Older generations were significantly associated with a decreased willingness to donate (OR = 0.45, CI 0.28-0.72 for respondents in their 50s; OR = 0.49, CI 0.31-0.77 for those in their 60s). Positive attitudes towards pharmacogenomics research, a DNA bank, and blood/bone marrow/organ donation were significantly associated with an increased willingness. However, the respondents had concerns regarding a DNA bank: the confidentiality of their personal information, the manner in which research results would be utilized, and simply the use of their own DNA for research. To attain public understanding and overcome these concerns, a public awareness process should be put into place to emphasize the beneficial aspects of identifying genomic markers associated with ADRs and to address the concerns raised in our study. Further study is needed to assess the willingness of actual patients taking medications in real situations, since the respondents in our study were from the general public, not a patient population, and their willingness was assessed on the condition of assuming that they were patients taking medications.
Rape Myth Consistency and Gender Differences in Perceiving Rape Victims: A Meta-Analysis.
Hockett, Jericho M; Smith, Sara J; Klausing, Cathleen D; Saucier, Donald A
2016-02-01
An overview discusses feminist analyses of oppression, attitudes toward rape victims, and previously studied predictors of individuals' attitudes toward rape victims. To better understand such attitudes, this meta-analysis examines how the rape myth consistency of various victim, perpetrator, and crime characteristics moderates gender differences in individuals' perceptions of rape victims (i.e., attributions of victim responsibility and blame and rape-minimizing attitudes). Consistent with feminist theoretical predictions, results indicated that, overall, men perceived rape victims more negatively than women did. However, this gender difference was moderated by the rape myth consistency within the rape vignettes. Implications for research are discussed. © The Author(s) 2015.
International Nuclear Information System (INIS)
Rafelski, J.
1979-01-01
After an introductory overview of the bag model, the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the variational approach to the classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI) [de]
Fourcade, Yoan; Engler, Jan O; Rödder, Dennis; Secondi, Jean
2014-01-01
MAXENT is now a common species distribution modeling (SDM) tool used by conservation practitioners for predicting the distribution of a species from a set of records and environmental predictors. However, datasets of species occurrence used to train the model are often biased in the geographical space because of unequal sampling effort across the study area. This bias may be a source of strong inaccuracy in the resulting model and could lead to incorrect predictions. Although a number of sampling bias correction methods have been proposed, there is no consensual guideline to account for it. We compared here the performance of five methods of bias correction on three datasets of species occurrence: one "virtual" derived from a land cover map, and two actual datasets for a turtle (Chrysemys picta) and a salamander (Plethodon cylindraceus). We subjected these datasets to four types of sampling biases corresponding to potential types of empirical biases. We applied five correction methods to the biased samples and compared the outputs of distribution models to unbiased datasets to assess the overall correction performance of each method. The results revealed that the ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species. However, the simple systematic sampling of records consistently ranked among the best performing across the range of conditions tested, whereas other methods performed more poorly in most cases. The strong effect of initial conditions on correction performance highlights the need for further research to develop a step-by-step guideline to account for sampling bias. However, this method seems to be the most efficient in correcting sampling bias and should be advised in most cases.
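The correction that performed best here, systematic sampling of records, amounts to overlaying a regular grid on the study area and keeping at most one occurrence per cell, which flattens clusters created by uneven survey effort. A minimal Python sketch under assumed coordinates and grid size (the helper name and the data are invented):

```python
import random

def systematic_sample(records, cell_size, seed=42):
    """Keep at most one (lon, lat) record per cell of a regular grid."""
    by_cell = {}
    for lon, lat in records:
        cell = (int(lon // cell_size), int(lat // cell_size))
        by_cell.setdefault(cell, []).append((lon, lat))
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [rng.choice(points) for points in by_cell.values()]

# 50 heavily clustered records near the origin plus two isolated ones.
records = [(0.01 * i, 0.01 * i) for i in range(50)] + [(5.2, 3.1), (8.9, 7.4)]
thinned = systematic_sample(records, cell_size=1.0)
print(len(records), "->", len(thinned))  # prints: 52 -> 3
```

The thinned set is what would be passed to the distribution model in place of the raw records; the cell size controls how aggressively clusters are collapsed.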
Carson, John M., III; Bayard, David S.
2006-01-01
G-SAMPLE is an in-flight dynamical method for use by sample collection missions to identify the presence and quantity of collected sample material. The G-SAMPLE method implements a maximum-likelihood estimator to identify the collected sample mass, based on onboard force sensor measurements, thruster firings, and a dynamics model of the spacecraft. With G-SAMPLE, sample mass identification becomes a computation rather than an extra hardware requirement; the added cost of cameras or other sensors for sample mass detection is avoided. Realistic simulation examples are provided for a spacecraft configuration with a sample collection device mounted on the end of an extended boom. In one representative example, a 1000 gram sample mass is estimated to within 110 grams (95% confidence) under realistic assumptions of thruster profile error, spacecraft parameter uncertainty, and sensor noise. For convenience to future mission design, an overall sample-mass estimation error budget is developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
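The idea behind the estimator can be shown with a deliberately simplified one-dimensional sketch: if measured acceleration is thruster force divided by total mass plus Gaussian noise, the maximum-likelihood total mass is a least-squares fit to the force/acceleration data, and the sample mass is that fit minus the known dry mass. All numbers and names below are invented for illustration; this is not the flight algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
m_base = 250.0       # assumed known spacecraft mass without sample, kg
m_sample_true = 1.0  # "unknown" collected sample mass, kg

# Simulated thruster firings: a_i = F_i / (m_base + m_sample) + noise.
forces = rng.uniform(5.0, 20.0, size=200)                  # thrust, N
accels = forces / (m_base + m_sample_true) + rng.normal(0.0, 1e-4, size=200)

# Least-squares (maximum-likelihood under Gaussian noise) fit of
# F_i = m * a_i for the total mass m, then subtract the dry mass.
m_total_hat = np.dot(forces, accels) / np.dot(accels, accels)
m_sample_hat = m_total_hat - m_base
print(f"estimated sample mass: {m_sample_hat:.2f} kg")
```

More firings or a lower-noise force sensor tighten the estimate, which is the trade-off the paper's error budget quantifies.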
DEFF Research Database (Denmark)
Pais, Alexandre; Valero, Paola
2012-01-01
We discuss contemporary theories in mathematics education in order to do research on research. Our strategy consists of analysing discursively and ideologically recent key publications addressing the role of theory in mathematics education research. We examine how the field fabricates its object of research by deploying Foucault's notion of bio-politics - mainly to address the object "learning" - and Žižek's ideology critique - to address the object "mathematics". These theories, which have already been used in the field to research teaching and learning, have a great potential to contribute to a reflexivity of research on its discourses and effects. Furthermore, they enable us to present a clear distinction between what has been called the sociopolitical turn in mathematics education research and what we call a positioning of mathematics education (research) practices in the Political.
International Nuclear Information System (INIS)
Lindblom, S.R.
1992-08-01
The Rocky Mountain 1 (RM1) underground coal gasification (UCG) test was conducted from November 16, 1987 through February 26, 1988 (United Engineers and Constructors 1989) at a site approximately one mile south of Hanna, Wyoming. The test consisted of dual module operation to evaluate the controlled retracting injection point (CRIP) technology, the elongated linked well (ELW) technology, and the interaction of closely spaced modules operating simultaneously. The test caused two cavities to be formed in the Hanna No. 1 coal seam and associated overburden. The Hanna No. 1 coal seam is approximately 30 ft thick and lies at depths between 350 ft and 365 ft below the surface in the test area. The coal seam is overlain by sandstones, siltstones and claystones deposited by various fluvial environments. The groundwater monitoring was designed to satisfy the requirements of the Wyoming Department of Environmental Quality (WDEQ) in addition to providing research data toward the development of UCG technology that minimizes environmental impacts. The June 1992 semiannual groundwater sampling took place from June 10 through June 13, 1992. This event occurred nearly 34 months after the second groundwater restoration at the RM1 site and was the fifteenth sampling event since UCG operations ceased. Samples were collected for analyses of a limited suite of parameters as listed in Table 1. With a few exceptions, the groundwater is near baseline conditions. Data from the field measurements and analysis of samples are presented. Benzene concentrations in the groundwater were below analytical detection limits.
Sample collection and documentation
International Nuclear Information System (INIS)
Cullings, Harry M.; Fujita, Shoichiro; Watanabe, Tadaaki; Yamashita, Tomoaki; Tanaka, Kenichi; Endo, Satoru; Shizuma, Kiyoshi; Hoshi, Masaharu; Hasai, Hiromi
2005-01-01
Beginning within a few weeks after the bombings and periodically during the intervening decades, investigators in Hiroshima and Nagasaki have collected samples of materials that were in the cities at the time of the bombings. Although some early efforts were not driven by specific measurement objectives, many others were. Even some of the very earliest samples collected in 1945 were based on carefully conceived research plans and detailed specifications for samples appropriate to particular retrospective measurements, i.e., of particular residual quantities remaining from exposure to the neutrons and gamma rays from the bombs. This chapter focuses mainly on the work of groups at two institutions that have actively collaborated since the 1980s in major collection efforts and have shared samples among themselves and with other investigators: the Radiation Effects Research Foundation (RERF) and its predecessor the Atomic Bomb Casualty Commission (ABCC), and Hiroshima University. In addition, a number of others are listed, who also contributed to the literature by their collection of samples. (J.P.N.)
How Do Principals Conceptualize Success: Are Their Actions Consistent with Their Definitions?
Patience, Brian J.
2012-01-01
My research study explored how principals allocated their time, their perceptions of success, and whether their actions were consistent with their definition of success. Findings revealed participants spent time performing three primary behaviors including communicating with school stakeholders, completing managerial practices, and serving as…
[Qualitative research approaches in practical use in child and adolescent psychiatry].
Fegert, J; Gerwert, U
1993-10-01
Experimental study designs and quantitative analysis dominate the methodology of child psychiatric research. Sometimes the "box of tools" consisting of standardized software packages for statistical analysis seems to lead to a regrettable uniformity in research strategies. Elaborated sociological research concepts in the tradition of Max Weber and the "Chicago school" could close the scientific gap between quantitative studies on large samples and simple case reports. They are excellent instruments for generating hypotheses on relatively rare clinical problems or in new fields of child psychiatric research. Based on a review of the literature, potential applications of qualitative methodology in child psychiatry are discussed.
DEFF Research Database (Denmark)
Staunstrup, Jørgen
1998-01-01
This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactic check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.
Researching Research: Mathematics Education in the Political
Pais, Alexandre; Valero, Paola
2012-01-01
We discuss contemporary theories in mathematics education in order to do research on research. Our strategy consists of analysing discursively and ideologically recent key publications addressing the role of theory in mathematics education research. We examine how the field fabricates its object of research by deploying Foucault's notion of…
Consistency of canonical formulation of Horava gravity
International Nuclear Information System (INIS)
Soo, Chopin
2011-01-01
Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.
Consistency of canonical formulation of Horava gravity
Energy Technology Data Exchange (ETDEWEB)
Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)
2011-09-22
Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.
McKay, Michael T; Andretta, James R
2017-09-01
Mental well-being is an important indicator of the current, but also the future, health of adolescents. The 14-item Warwick-Edinburgh Mental Well-being Scale (WEMWBS) has been well validated in adults worldwide, but less work has been undertaken to examine the psychometric validity and internal consistency of WEMWBS scores in adolescents. In particular, little research has examined scores on the short 7-item version of the WEMWBS. The present study used two large samples of school children in Scotland and Northern Ireland and found that for both forms of the WEMWBS, scores were psychometrically valid, internally consistent, factor saturated, and measurement invariant by country. Using the WEMWBS full form, males reported significantly higher scores than females, and Northern Irish adolescents reported significantly higher scores than their Scottish counterparts. Last, the lowest overall levels of well-being were observed among Scottish females. Copyright © 2017. Published by Elsevier B.V.
PFP Wastewater Sampling Facility
International Nuclear Information System (INIS)
Hirzel, D.R.
1995-01-01
This test report documents the results obtained while conducting operational testing of the sampling equipment in the 225-WC building, the PFP Wastewater Sampling Facility. The Wastewater Sampling Facility houses equipment to sample and monitor the PFP's liquid effluents before discharging the stream to the 200 Area Treated Effluent Disposal Facility (TEDF). The majority of the streams are not radioactive and consist of discharges from the PFP heating, ventilation, and air conditioning (HVAC) systems. The streams that might be contaminated are processed through the Low Level Waste Treatment Facility (LLWTF) before discharging to TEDF. The sampling equipment consists of two flow-proportional composite samplers, an ultrasonic flowmeter, pH and conductivity monitors, a chart recorder, and associated relays and current isolators that interconnect the equipment to allow proper operation. Data signals from the monitors are received in the 234-5Z Shift Office, which contains a chart recorder and an alarm annunciator panel. The data signals are also duplicated and sent to the TEDF control room through the Local Control Unit (LCU). Performing the OTP has verified the operability of the PFP wastewater sampling system. This Operability Test Report documents the acceptance of the sampling system for use.
Replica consistency in a Data Grid
International Nuclear Information System (INIS)
Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt
2004-01-01
A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented
Pérez V, Cristhian; Vaccarezza G, Giulietta; Aguilar A, César; Coloma N, Katherine; Salgado F, Horacio; Baquedano R, Marjorie; Chavarría R, Carla; Bastías V, Nancy
2016-06-01
Teaching practice is one of the most complex topics of the training process in medicine and other health care careers. The Teaching Practices Questionnaire (TPQ) evaluates teaching skills. To assess the factor structure and internal consistency of the Spanish version of the TPQ among health care teachers. The TPQ was answered by 315 university teachers from 13 of the 15 administrative Chilean regions, who were selected through non-probabilistic volunteer sampling. The internal consistency of the TPQ factors was calculated and the correlation between them was analyzed. Six factors were identified: Student-centered teaching, Teaching planning, Assessment process, Dialogue relationship, Teacher-centered teaching and Use of technological resources. They had Cronbach's alphas ranging from 0.60 to 0.85. The factorial structure of the TPQ differentiates the most important functions of teaching. It also shows theoretical consistency and practical relevance for performing a diagnosis and continuous evaluation of teaching practices. Additionally, it has adequate internal consistency. Thus, the TPQ is valid and reliable for evaluating pedagogical practices in health care careers.
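Several of the scale-validation abstracts above (the TPQ, WEMWBS and PCQ-24 studies) report internal consistency as Cronbach's alpha. As a minimal illustrative sketch, not tied to any of these instruments, alpha can be computed from raw item scores as follows; the item data below are invented for the example.

```python
# Cronbach's alpha: the internal-consistency estimate reported in the
# scale-validation studies above (e.g. factor alphas between 0.60 and 0.85).
# Illustrative sketch only; the item scores are made up.

def cronbach_alpha(items):
    """items: list of per-item score lists, one score per respondent."""
    k = len(items)                       # number of items in the scale
    n = len(items[0])                    # number of respondents

    def variance(xs):                    # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var / variance(totals))

scores = [
    [3, 4, 2, 5, 4, 3],   # item 1, six respondents
    [2, 4, 3, 5, 4, 2],   # item 2
    [3, 5, 2, 4, 5, 3],   # item 3
]
alpha = cronbach_alpha(scores)
print(round(alpha, 3))   # → 0.889
```

Because the three invented items rise and fall together across respondents, the alpha comes out high; uncorrelated items would push it toward zero.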
Sample holder for studying temperature dependent particle guiding
International Nuclear Information System (INIS)
Bereczky, R.J.; Toekesi, K.; Kowarik, G.; Aumayr, F.
2011-01-01
Complete text of publication follows. The so-called guiding effect is a complex process involving the interplay of a large number of charged particles with a solid. Although many research groups have joined this field and carried out various experiments with insulator capillaries, many details of the interactions are still unknown. We investigated the temperature dependence of the guiding, since it opens new possibilities both for a fundamental understanding of the guiding phenomenon and for applications. For the temperature-dependent guiding experiments a completely new heatable sample holder was designed. We developed and built such a heatable sample holder to make accurate and reproducible studies of the temperature dependence of the ion guiding effect possible. The target holder (for an exploded view see Fig. 1) consists of two main parts, the front and the back plates. The two plates of the sample holder, which function as an oven, are made of copper. These parts surround the capillary in order to guarantee a uniform temperature along the whole tube. The temperature of the copper parts is monitored by a K-type thermocouple. Stainless steel coaxial heaters surrounding the oven are used for heating. The heating power, up to a few watts, is regulated by a PID controller. Cooling of the capillary is achieved by a copper feed-through connected to a liquid nitrogen bath outside the UHV chamber. This solution allows us to change the temperature of the sample from -30 deg C up to 90 deg C. Our experiments with this newly developed temperature-regulated capillary holder show that the glass temperature (i.e. conductivity) can be used to control the guiding properties of the glass capillary and adjust the conditions from guiding at room temperature to simple geometrical transmission at elevated temperatures. This holds promise for investigating the effect of conductivity on particle transport (build-up and removal of charge patches) through capillaries in more detail.
Storey, Jennifer E; Hart, Stephen D; Cooke, David J; Michie, Christine
2016-04-01
The Hare Psychopathy Checklist-Revised (PCL-R; Hare, 2003) is a commonly used psychological test for assessing traits of psychopathic personality disorder. Despite the abundance of research using the PCL-R, the vast majority of research has used samples of convenience rather than systematic methods to minimize sampling bias and maximize the generalizability of findings. This potentially complicates the interpretation of test scores and research findings, including the "norms" for offenders from the United States and Canada included in the PCL-R manual. In the current study, we evaluated the psychometric properties of PCL-R scores for all male offenders admitted to a regional reception center of the Correctional Service of Canada during a 1-year period (n = 375). Because offenders were admitted for assessment prior to institutional classification, they comprise a sample that was heterogeneous with respect to correctional risks and needs yet representative of all offenders in that region of the service. We examined the distribution of PCL-R scores, classical test theory indices of its structural reliability, the factor structure of test items, and the external correlates of test scores. The findings were highly consistent with those typically reported in previous studies. We interpret these results as indicating it is unlikely any sampling limitations of past research using the PCL-R resulted in findings that were, overall, strongly biased or unrepresentative. (c) 2016 APA, all rights reserved.
Edge Effects in Line Intersect Sampling With
David L. R. Affleck; Timothy G. Gregoire; Harry T. Valentine
2005-01-01
Transects consisting of multiple, connected segments with a prescribed configuration are commonly used in ecological applications of line intersect sampling. The transect configuration has implications for the probability with which population elements are selected and for how the selection probabilities can be modified by the boundary of the tract being sampled. As...
Coordinating user interfaces for consistency
Nielsen, Jakob
2001-01-01
In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there has been exponential growth in the opportunities for following or disregarding the principles of interface consistency: more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analyses.
Öztürk, Ayse; Dogan, Gülay Özdemir
2017-01-01
The purpose of this study was to investigate Effective Children's Rights Education (ECRE) from the perspectives of classroom teachers who are experts in children's rights education (TECR). The data were collected through focus group interview method in this research designed as a case study. The sample of the study consists of six qualified…
Proteomic Biomarker Discovery in 1000 Human Plasma Samples with Mass Spectrometry.
Cominetti, Ornella; Núñez Galindo, Antonio; Corthésy, John; Oller Moreno, Sergio; Irincheeva, Irina; Valsesia, Armand; Astrup, Arne; Saris, Wim H M; Hager, Jörg; Kussmann, Martin; Dayon, Loïc
2016-02-05
The overall impact of proteomics on clinical research and its translation has lagged behind expectations. One recognized caveat is the limited size (subject numbers) of (pre)clinical studies performed at the discovery stage, the findings of which fail to be replicated in larger verification/validation trials. Compromised study designs and insufficient statistical power are consequences of the still limited capacity of mass spectrometry (MS)-based workflows to handle large numbers of samples in a realistic time frame while delivering comprehensive proteome coverage. We developed a highly automated proteomic biomarker discovery workflow. Herein, we have applied this approach to analyze 1000 plasma samples from the multicentered human dietary intervention study "DiOGenes". Study design, sample randomization, tracking, and logistics were the foundations of our large-scale study. We checked the quality of the MS data and provided descriptive statistics. The data set was interrogated for the proteins with the most stable expression levels in that set of plasma samples. We evaluated standard clinical variables that typically impact forthcoming results and assessed body mass index-associated and gender-specific proteins at two time points. We demonstrate that analyzing a large number of human plasma samples for biomarker discovery with MS using isobaric tagging is feasible, providing robust and consistent biological results.
Dong, S.
2018-05-01
We present a reduction-consistent and thermodynamically consistent formulation and an associated numerical algorithm for simulating the dynamics of an isothermal mixture consisting of N (N ⩾ 2) immiscible incompressible fluids with different physical properties (densities, viscosities, and pair-wise surface tensions). By reduction consistency we refer to the property that if only a set of M (1 ⩽ M ⩽ N - 1) fluids are present in the system then the N-phase governing equations and boundary conditions will exactly reduce to those for the corresponding M-phase system. By thermodynamic consistency we refer to the property that the formulation honors the thermodynamic principles. Our N-phase formulation is developed based on a more general method that allows for the systematic construction of reduction-consistent formulations, and the method suggests the existence of many possible forms of reduction-consistent and thermodynamically consistent N-phase formulations. Extensive numerical experiments have been presented for flow problems involving multiple fluid components and large density ratios and large viscosity ratios, and the simulation results are compared with the physical theories or the available physical solutions. The comparisons demonstrate that our method produces physically accurate results for this class of problems.
Characteristics of quantitative nursing research from 1990 to 2010.
Yarcheski, Adela; Mahon, Noreen E
2013-12-01
To assess author credentials of quantitative research in nursing, the composition of the research teams, and the disciplinary focus of the theories tested. Nursing Research, Western Journal of Nursing Research, and Journal of Advanced Nursing were selected for this descriptive study; 1990, 1995, 2000, 2005, and 2010 were included. The final sample consisted of 484 quantitative research articles. From 1990 to 2010, there was an increase in first authors holding doctoral degrees, research from other countries, and funding. Solo authorship decreased; multi-authorship and multidisciplinary teams increased. Theories tested were mostly from psychology; the testing of nursing theory was modest. Multidisciplinary research far outdistanced interdisciplinary research. Quantitative nursing research can be characterized as multidisciplinary (distinct theories from different disciplines) rather than discipline-specific to nursing. Interdisciplinary (theories synthesized from different disciplines) research has been conducted minimally. This study provides information about the growth of the scientific knowledge base of nursing, which has implications for practice. © 2013 Sigma Theta Tau International.
Consistency between GRUAN sondes, LBLRTM and IASI
Directory of Open Access Journals (Sweden)
X. Calbet
2017-06-01
Full Text Available Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Instrument (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in their respective fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.
Joseph, Rebecca M; Soames, Jamie; Wright, Mark; Sultana, Kirin; van Staa, Tjeerd P; Dixon, William G
2018-02-01
To describe a novel observational study that supplemented primary care electronic health record (EHR) data with sample collection and patient diaries. The study was set in primary care in England. A list of 3974 potentially eligible patients was compiled using data from the Clinical Practice Research Datalink. Interested general practices opted into the study then confirmed patient suitability and sent out postal invitations. Participants completed a drug-use diary and provided saliva samples to the research team to combine with EHR data. Of 252 practices contacted to participate, 66 (26%) mailed invitations to patients. Of the 3974 potentially eligible patients, 859 (22%) were at participating practices, and 526 (13%) were sent invitations. Of those invited, 117 (22%) consented to participate of whom 86 (74%) completed the study. We have confirmed the feasibility of supplementing EHR with data collected directly from patients. Although the present study successfully collected essential data from patients, it also underlined the requirement for improved engagement with both patients and general practitioners to support similar studies. © 2017 The Authors. Pharmacoepidemiology & Drug Safety published by John Wiley & Sons Ltd.
Directory of Open Access Journals (Sweden)
Gina Görgens-Ekermans
2013-10-01
Research purpose: The objectives of this study were to investigate the internal validity (construct and discriminant validity, reliability) and external validity (relationships with theoretically relevant variables, namely stress, burnout and work engagement) of the PCQ-24. Motivation for the study: Multiple studies have underscored the value of PsyCap within the workplace. In order to harness the full potential of the construct in the South African environment, sound measurement thereof, evidenced by a psychometrically sound instrument, is needed. Research design, approach and method: A cross-sectional survey design was used. The sample consisted of employees at managerial and non-managerial levels, from a medium-sized construction company in the Western Cape, South Africa. In addition to PsyCap, perceived stress, work-related burnout and work engagement were measured. Main findings: The results provided preliminary evidence of construct and discriminant validity, reliability and significant relations with external theoretically relevant variables. Practical/managerial implications: Researchers may confidently use the PCQ-24 to measure the construct of PsyCap and investigate relations with workplace outcomes in the South African environment, informing human relations practices. Contribution/value-add: Preliminary evidence of the psychometric properties of the PCQ-24, which measures the construct of PsyCap (consisting of hope, self-efficacy, resilience and optimism), on a South African sample was provided in this study.
International Nuclear Information System (INIS)
1989-09-01
Concern about the release of radionuclides to the environment, especially to the foodchain, has been heightened by recent nuclear incidents. The assessment of any release of radioactivity demands rapid, reliable and practical techniques. In the intermediate and late post-accident period, where the interest is in food control rather than evacuation and sheltering, rapid methods would be useful for screening purposes as well as for providing timely information and easing sample workloads. In the first research co-ordination meeting on the co-ordinated research program ''Rapid.... samples'', the specifications for the time required for sample preparation, separation, and analysis and the accuracy desired were outlined. Considerable attention was given to the need to develop rapid methods for sample preparation and dissolution. Emphasis was placed on achieving the development of rapid methods with the minimum sacrifice in reliability, practicality and economy.
Kanai, Yae; Nishihara, Hiroshi; Miyagi, Yohei; Tsuruyama, Tatsuhiro; Taguchi, Kenichi; Katoh, Hiroto; Takeuchi, Tomoyo; Gotoh, Masahiro; Kuramoto, Junko; Arai, Eri; Ojima, Hidenori; Shibuya, Ayako; Yoshida, Teruhiko; Akahane, Toshiaki; Kasajima, Rika; Morita, Kei-Ichi; Inazawa, Johji; Sasaki, Takeshi; Fukayama, Masashi; Oda, Yoshinao
2018-02-01
Genome research using appropriately collected pathological tissue samples is expected to yield breakthroughs in the development of biomarkers and identification of therapeutic targets for diseases such as cancers. In this connection, the Japanese Society of Pathology (JSP) has developed "The JSP Guidelines on the Handling of Pathological Tissue Samples for Genomic Research" based on an abundance of data from empirical analyses of tissue samples collected and stored under various conditions. Tissue samples should be collected from appropriate sites within surgically resected specimens, without disturbing the features on which pathological diagnosis is based, while avoiding bleeding or necrotic foci. They should be collected as soon as possible after resection: at the latest within about 3 h of storage at 4°C. Preferably, snap-frozen samples should be stored in liquid nitrogen (about -180°C) until use. When intending to use genomic DNA extracted from formalin-fixed paraffin-embedded tissue, 10% neutral buffered formalin should be used. Insufficient fixation and overfixation must both be avoided. We hope that pathologists, clinicians, clinical laboratory technicians and biobank operators will come to master the handling of pathological tissue samples based on the standard operating procedures in these Guidelines to yield results that will assist in the realization of genomic medicine. © 2018 The Authors. Pathology International published by Japanese Society of Pathology and John Wiley & Sons Australia, Ltd.
Consistency Anchor Formalization and Correctness Proofs
Miguel, Correia; Bessani, Alysson
2014-01-01
This report contains the formal proofs for the techniques for increasing the consistency of cloud storage as presented in "Bessani et al. SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference. June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...
Who Are We Studying? Sample Diversity in Teaching of Psychology Research
Richmond, Aaron S.; Broussard, Kristin A.; Sterns, Jillian L.; Sanders, Kristina K.; Shardy, Justin C.
2015-01-01
The purpose of the current study was to examine the sample diversity of empirical articles published in four premier teaching of psychology journals from 2008 to 2013. We investigated which demographic information was commonly reported, whether samples were ethnically representative, and whether gender was representative compared to National…
Fear, Anger, and Risk Preference Reversals: An Experimental Study on a Chinese Sample.
She, Shengxiang; Eimontaite, Iveta; Zhang, Dangli; Sun, Yan
2017-01-01
Fear and anger are basic emotions of the same valence which differ in terms of their certainty and control dimensions according to the Appraisal Tendency Framework, a theory addressing the relationship between specific emotions, and judgments and choices. Past research based on the Appraisal Theory revealed contradictory results for risky choice decision-making. However, these conclusions were drawn from Western samples (e.g., North American). Considering potential cultural differences, the present study aims to investigate whether the Appraisal Tendency hypothesis yields the same results in a Chinese sample. Our first study explores how dispositional fear and anger influence risk preferences through a classic virtual "Asian Disease Problem" task and the second study investigates how induced fear and anger influence risk preferences through an incentive-compatible task. Consistent with previous research, our results reveal that induced fear and anger have differential effects on risky decisions: angry participants prefer the risk-seeking option, whereas fearful participants prefer a risk-averse option. However, we find no associations between dispositional fear (or anger) and risky decisions.
Fear, Anger, and Risk Preference Reversals: An Experimental Study on a Chinese Sample
Directory of Open Access Journals (Sweden)
Shengxiang She
2017-08-01
Full Text Available Fear and anger are basic emotions of the same valence which differ in terms of their certainty and control dimensions according to the Appraisal Tendency Framework, a theory addressing the relationship between specific emotions, and judgments and choices. Past research based on the Appraisal Theory revealed contradictory results for risky choice decision-making. However, these conclusions were drawn from Western samples (e.g., North American). Considering potential cultural differences, the present study aims to investigate whether the Appraisal Tendency hypothesis yields the same results in a Chinese sample. Our first study explores how dispositional fear and anger influence risk preferences through a classic virtual “Asian Disease Problem” task and the second study investigates how induced fear and anger influence risk preferences through an incentive-compatible task. Consistent with previous research, our results reveal that induced fear and anger have differential effects on risky decisions: angry participants prefer the risk-seeking option, whereas fearful participants prefer a risk-averse option. However, we find no associations between dispositional fear (or anger) and risky decisions.
A new approach to hull consistency
Directory of Open Access Journals (Sweden)
Kolev Lubomir
2016-06-01
Full Text Available Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady-states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new more general approach to implementing hull consistency is suggested which consists in treating simultaneously several equations with respect to the same number of variables.
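The scalar hull-consistency step described above, narrowing one variable's interval using one equation at a time, can be illustrated on a toy constraint. The equation, the intervals, and the helper functions below are assumptions chosen for the example, not taken from the poster.

```python
# Toy illustration of a scalar hull-consistency contraction (not the poster's
# method): given the constraint x + y**2 == 4 with x in [0, 10] and y in [1, 2],
# rewrite the equation as x = 4 - y**2, evaluate the right side in interval
# arithmetic, and intersect the result with x's current interval. This narrows
# x without discarding any solution inside the box.

def interval_sub(a, b):
    """[a] - [b] in interval arithmetic."""
    return (a[0] - b[1], a[1] - b[0])

def interval_sqr(a):
    """[a]**2 for an interval with non-negative bounds."""
    return (a[0] ** 2, a[1] ** 2)

def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: no solution in the box")
    return (lo, hi)

x = (0.0, 10.0)
y = (1.0, 2.0)
# hull-consistency contraction of x with respect to x + y**2 = 4:
x_new = intersect(x, interval_sub((4.0, 4.0), interval_sqr(y)))
print(x_new)   # → (0.0, 3.0)
```

Since y**2 evaluates to [1, 4], the rewritten right side is [0, 3], so x shrinks from [0, 10] to [0, 3]; the generalization mentioned in the abstract would treat several equations and variables simultaneously instead of one at a time.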
Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)
Stadje, M.A.; Pelsser, A.
2014-01-01
Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from
[Development of a microenvironment test chamber for airborne microbe research].
Zhan, Ningbo; Chen, Feng; Du, Yaohua; Cheng, Zhi; Li, Chenyu; Wu, Jinlong; Wu, Taihu
2017-10-01
One of the most important environmental cleanliness indicators is the airborne microbe count. However, the particularity of clean operating environments and controlled experimental environments often limits airborne microbe research. This paper describes the design and implementation of a microenvironment test chamber for airborne microbe research under normal test conditions. Numerical simulation with Fluent showed that airborne microbes were evenly dispersed in the upper part of the test chamber and had a bottom-up concentration growth distribution. According to the simulation results, a verification experiment was carried out by selecting 5 sampling points at different spatial positions in the test chamber. Experimental results showed that the average particle concentration at all sampling points reached 10⁷ counts/m³ after 5 minutes of dispersal of Staphylococcus aureus, and all sampling points showed a consistent concentration distribution. The concentration of airborne microbes in the upper chamber was slightly higher than that in the middle chamber, which in turn was slightly higher than that in the bottom chamber. This is consistent with the results of the numerical simulation and shows that the system can be well used for airborne microbe research.
Latin hypercube sampling with inequality constraints
International Nuclear Information System (INIS)
Iooss, B.; Petelet, M.; Asserin, O.; Loredo, A.
2010-01-01
In some studies requiring predictive and CPU-time-consuming numerical models, the sampling design of the model input variables has to be chosen with caution. For this purpose, Latin hypercube sampling has a long history and has shown its robustness. In this paper we propose and discuss a new algorithm to build a Latin hypercube sample (LHS) taking into account inequality constraints between the sampled variables. This technique, called constrained Latin hypercube sampling (cLHS), consists in performing permutations on an initial LHS to honor the desired monotonic constraints. The relevance of this approach is shown on a real example concerning numerical welding simulation, where the inequality constraints are caused by the physical decrease of some material properties as a function of temperature. (authors)
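As background for the cLHS abstract above, a plain (unconstrained) Latin hypercube sample can be sketched in a few lines; the constrained variant described in the paper then permutes such an initial LHS to honor monotonic constraints, which this minimal sketch does not implement.

```python
import random

def latin_hypercube(n, d, seed=0):
    """Plain Latin hypercube sample of n points in [0, 1)^d:
    each dimension gets exactly one point per equal-width stratum."""
    rng = random.Random(seed)
    # one random permutation of the n strata per dimension
    perms = [rng.sample(range(n), n) for _ in range(d)]
    # place each point uniformly at random inside its assigned stratum
    return [[(perms[j][i] + rng.random()) / n for j in range(d)]
            for i in range(n)]

pts = latin_hypercube(5, 2)
for j in range(2):
    strata = sorted(int(p[j] * 5) for p in pts)
    print(strata)   # → [0, 1, 2, 3, 4] for each dimension
```

The print confirms the defining LHS property: projecting the 5 points onto either axis hits each of the 5 strata exactly once, which is what the paper's permutation-based repair must preserve while enforcing the inequality constraints.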
Archaeological and taxonomic significance of ancient wood samples ...
African Journals Online (AJOL)
Ancient wood samples from an archaeological excavation, Test Pit II, in Ahanve, near Badagry were analysed to ascertain their identity. Anatomical study of the wood samples revealed oval-circular xylem pores, diffuse apotracheal axial parenchyma, procumbent and homogeneous ray and non-septate fibres, all consistent ...
International Nuclear Information System (INIS)
1992-10-01
The purpose of this Second Research Co-ordination Meeting (12-16 August 1991) on Rapid Instrumental and Separation Methods for Monitoring Radionuclides in Food and Environmental Samples is to discuss the progress of the programmes since the First Research Co-ordination Meeting, discuss how to validate the methodologies developed (e.g. reference samples, intercomparisons), and outline a schedule for CRP completion by the end of 1992. Radioactive contamination of the environment after a nuclear accident, such as occurred at Chernobyl, is of serious concern to government officials and members of the general public. In 1990/1991 the Agency was asked to organize the International Chernobyl Project to assess the situation in the USSR. A network of laboratories was organized to carry out the environmental assessment needed for this project. The following recommendations are based on the experience gained by many of the laboratories involved in this project. 1. Maintain a network of analytical laboratories with special skills and experience to provide assessments of radionuclide contamination in the environment in case of a radiological emergency. 2. Methodologies for assessment of contamination in the environment should take into consideration potential trajectories, radioecology, and food chain parameters. 3. Focus on areas of representative sample collection, in situ instrumental and chemical analysis, as well as advanced streamlined laboratory analyses which will shorten the timeline of an assessment. 4. Conduct intercomparison and testing of technologies, employing standard reference materials and procedures, and field measurements at significantly contaminated areas. 5. Conduct training of Member State laboratory personnel through fellowships, special courses, and workshops. 5 refs
Shahaeian, Ameneh; Henry, Julie D.; Razmjoee, Maryam; Teymoori, Ali; Wang, Cen
2015-01-01
Previous research has consistently indicated that theory of mind (ToM) is associated with executive control in the preschool years. However, interpretation of this literature is limited by the fact that most studies have focused exclusively on urbanized Western cultural samples. Consequently, it is not clear whether the association between ToM and…
Sex identification of polar bears from blood and tissue samples
Amstrup, Steven C.; Garner, G.W.; Cronin, M.A.; Patton, J.C.
1993-01-01
Polar bears (Ursus maritimus) can be adversely affected by hunting and other human perturbations because of low population densities and low reproduction rates. The sustainable take of adult females may be as low as 1.5% of the population. Females and accompanying young are most vulnerable to hunting, and hunters have not consistently reported the sex composition of the harvest; a method to confirm the sex of polar bears harvested in Alaska is therefore needed. Evidence of the sex of harvested animals is often not available, but blood or other tissue samples often are. We extracted DNA from tissue and blood samples, and amplified segments of zinc finger (ZFX and ZFY) genes from both X and Y chromosomes with the polymerase chain reaction. Digestion of amplified portions of the X chromosome with the restriction enzyme HaeIII resulted in subdivision of the original amplified segment into four smaller fragments. Digestion with HaeIII did not subdivide the original segment amplified from the Y chromosome. The differing fragment sizes produced patterns in gel electrophoresis that distinguished samples from male and female bears 100% of the time. This technique is applicable to the investigation of many wildlife management and research questions.
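The inference step behind the gel patterns can be written as a small rule. The band labels below are hypothetical placeholders, not the study's nomenclature: only the X-chromosome (ZFX) amplicon is cut by HaeIII, so an intact band signals a Y chromosome:

```python
def sex_from_gel(bands):
    """bands: set of band types seen on the gel after HaeIII digestion of the
    co-amplified ZFX/ZFY products. 'X-cut' stands for the four ZFX
    sub-fragments (collapsed to one label); 'Y-intact' is the undigested
    ZFY segment, which only males carry."""
    if "Y-intact" in bands:
        return "male"           # a Y chromosome is present (XY)
    if "X-cut" in bands:
        return "female"         # only digested X-chromosome product seen (XX)
    return "indeterminate"      # amplification may have failed
```

A male sample shows both the cut X fragments and the intact Y band; a female shows only the cut fragments.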
Preston, Charles F; Bhandari, Mohit; Fulkerson, Eric; Ginat, Danial; Egol, Kenneth A; Koval, Kenneth J
2006-02-01
To determine the consistency of conclusions/statements made in podium presentations at the annual meeting of the Orthopaedic Trauma Association (OTA) with those in subsequent full-text publications. Also, to evaluate the nature and consistency of study design, methods, sample sizes and results, and to assign a corresponding level of evidence. Abstracts of the scientific programs of the OTA from 1994 to 1997 (N = 254) were queried by using the PubMed database to identify those studies resulting in a peer-reviewed, full-text publication. Of the 169 articles retrieved, 137 studies were the basis of our study after the exclusion criteria were applied: non-English language, basic science studies, anatomic dissection studies, and articles published in non-peer-reviewed journals. Information was abstracted onto a data form: first from the abstract published in the final meeting program, and then from the published journal article. Information was recorded regarding study issues, including the study design, primary objective, sample size, and statistical methods. We provided descriptive statistics about the frequency of consistent results between abstracts and full-text publications. The results were recorded as percentages and a 95% confidence interval was applied to each value. Study results were recorded for the abstract and full-text publication comparing results and the overall conclusion. A level of scientific-based evidence was assigned to each full-text publication. The final conclusion of the study remained the same 93.4% of the time. The method of study was an observational case series 52% of the time and a statement regarding the rate of patient follow-up was reported 42% of the time. Of the studies published, 18.2% consisted of a sample size smaller than the previously presented abstract. When the published papers had their level of evidence graded, 11% were level I, 16% level II, 17% level III, and 56% level IV. Authors' conclusions were consistent with those in full
Fourier rebinning and consistency equations for time-of-flight PET planograms.
Li, Yusheng; Defrise, Michel; Matej, Samuel; Metzler, Scott D
2016-01-01
Due to their unique geometry, dual-panel PET scanners have many advantages in dedicated breast imaging and on-board imaging applications, since the compact scanners can be combined with other imaging and treatment modalities. The major challenges of dual-panel PET imaging are the limited-angle problem and data truncation, which can cause artifacts due to incomplete data sampling. The time-of-flight (TOF) information can be a promising solution to reduce these artifacts. The TOF planogram is the native data format for dual-panel TOF PET scanners, and the non-TOF planogram is the 3D extension of the linogram. The TOF planogram is five-dimensional while the imaged object is three-dimensional, so there are two degrees of redundancy. In this paper, we derive consistency equations and Fourier-based rebinning algorithms to provide a complete understanding of the rich structure of the fully 3D TOF planograms. We first derive two consistency equations and John's equation for 3D TOF planograms. By taking the Fourier transforms, we obtain two Fourier consistency equations and the Fourier-John equation, which are the duals of the consistency equations and John's equation, respectively. We then solve the Fourier consistency equations and Fourier-John equation using the method of characteristics. The two degrees of entangled redundancy of the 3D TOF data can be explicitly elicited and exploited by the solutions along the characteristic curves. As special cases of the general solutions, we obtain Fourier rebinning and consistency equations (FORCEs), and thus a complete scheme to convert among different types of PET planograms: 3D TOF, 3D non-TOF, 2D TOF and 2D non-TOF planograms. The FORCEs can be used as Fourier-based rebinning algorithms for TOF-PET data reduction, inverse rebinnings for designing fast projectors, or consistency conditions for estimating missing data. As a byproduct, we show the two consistency equations are necessary and sufficient for 3D TOF planograms
Network Sampling with Memory: A proposal for more efficient sampling from social networks
Mouw, Ted; Verdery, Ashton M.
2013-01-01
Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
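The design effect that drives the comparison above is straightforward to compute from replicate estimates. A minimal sketch for a proportion (the replicate values and sample size below are made up for illustration):

```python
from statistics import mean, pvariance

def design_effect(replicate_means, n):
    """DE = Var(estimator under the sampling design) / Var(estimator under SRS).
    replicate_means: estimates of a proportion from repeated samples of size n.
    DE close to 1 means the design is nearly as efficient as simple random
    sampling; DE >> 1 means many more cases are needed for the same precision."""
    p = mean(replicate_means)
    v_design = pvariance(replicate_means)
    v_srs = p * (1 - p) / n          # SRS variance of a sample proportion
    return v_design / v_srs

# hypothetical replicate estimates of a 50% trait from a network sampling scheme
de = design_effect([0.4, 0.5, 0.6], n=100)
```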
Crestani, Anelise Henrich; Moraes, Anaelena Bragança de; Souza, Ana Paula Ramos de
2017-08-10
To analyze the results of the validation of building enunciative signs of language acquisition for children aged 3 to 12 months. The signs were built based on mechanisms of language acquisition in an enunciative perspective and on clinical experience with language disorders. The signs were submitted to judgment of clarity and relevance by a sample of six experts, doctors in linguistics with knowledge of psycholinguistics and the language clinic. In the validation of reliability, two judges/evaluators applied the instruments to videos of 20% of the total sample of mother-infant dyads, using the inter-evaluator method. The internal consistency method was applied to the total sample, which consisted of 94 mother-infant dyads for the contents of Phase 1 (3-6 months) and 61 mother-infant dyads for the contents of Phase 2 (7-12 months). The data were collected through the analysis of mother-infant interaction based on filming of dyads and application of the parameters to be validated according to the child's age. Data were organized in a spreadsheet and then converted to computer applications for statistical analysis. The judgments of clarity/relevance indicated no modifications to be made in the instruments. The reliability test showed almost perfect agreement between judges (0.8 ≤ kappa ≤ 1.0); only item 2 of Phase 1 showed substantial agreement (0.6 ≤ kappa ≤ 0.79). The internal consistency for Phase 1 had alpha = 0.84, and Phase 2, alpha = 0.74. This demonstrates the reliability of the instruments. The results suggest adequacy as to content validity of the instruments created for both age groups, demonstrating the relevance of the content of enunciative signs of language acquisition.
Directory of Open Access Journals (Sweden)
Gustavo J. Fonseca D'El Rey
2007-01-01
BACKGROUND: Social phobia is a severe anxiety disorder that brings disability and distress. OBJECTIVES: To investigate the internal consistency of the Portuguese version of the Mini-Social Phobia Inventory (Mini-SPIN). METHODS: We conducted a study of the internal consistency of the Mini-SPIN in a sample of 206 college students in the city of São Paulo, SP. RESULTS: The internal consistency of the instrument, analyzed by Cronbach's alpha coefficient, was 0.81. CONCLUSIONS: These findings suggest that the Portuguese version of the Mini-SPIN has good internal consistency, similar to that of the original English version.
Bidirectional associations between mothers' and fathers' parenting consistency and child bmi
Jansen, Pauline; Giallo, Rebecca; Westrupp, Elizabeth; Wake, Melissa; Nicholson, Jan
2013-01-01
BACKGROUND: Research suggests that general parenting dimensions and styles are associated with children's BMI, but directionality in this relationship remains unknown. Moreover, there has been little attention to the influences of both mothers' and fathers' parenting. We aimed to examine reciprocal relationships between maternal and paternal parenting consistency and child BMI. METHODS: Participants were 4002 children and their parents in the population-based Longitudinal Study of...
Assessing bilingual Chinese-English young children in Malaysia using language sample measures.
Ooi, Carmen C-W; Wong, Anita M-Y
2012-12-01
One reason why specific language impairment (SLI) is grossly under-identified in Malaysia is the absence of locally developed norm-referenced language assessment tools for its multilingual and multicultural population. Spontaneous language samples provide quantitative information for language assessment, and useful descriptive information on child language development in complex language and cultural environments. This research consisted of two studies and investigated the use of measures obtained from English conversational samples among bilingual Chinese-English Malaysian preschoolers. The research found that the language sample measures were sensitive to developmental changes in this population and could identify SLI. The first study examined the relationship between age and mean length of utterance (MLU(w)), lexical diversity (D), and the index of productive syntax (IPSyn) among 52 typically-developing (TD) children aged between 3;4-6;9. Analyses showed a significant linear relationship between age and D (r = .450), the IPSyn (r = .441), and MLU(w) (r = .318). The second study compared the same measures obtained from 10 children with SLI, aged between 3;8-5;11, and their age-matched controls. The children with SLI had significantly shorter MLU(w) and lower IPSyn scores than the TD children. These findings suggest that utterance length and syntax production can be potential clinical markers of SLI in Chinese-English Malaysian children.
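Two of these measures are easy to compute from a transcript. A sketch with a toy sample: MLU(w) as defined, and a simple type-token ratio standing in for the model-based lexical-diversity measure D (which is actually estimated by curve fitting, not a raw ratio):

```python
def mlu_w(utterances):
    """Mean length of utterance in words: total words / number of utterances."""
    return sum(len(u.split()) for u in utterances) / len(utterances)

def ttr(utterances):
    """Type-token ratio: distinct words / total words (a crude diversity proxy)."""
    words = [w.lower() for u in utterances for w in u.split()]
    return len(set(words)) / len(words)

# toy conversational sample (invented)
sample = ["the dog runs", "dog runs fast", "I see the dog"]
```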
Low Cost Mars Sample Return Utilizing Dragon Lander Project
Stoker, Carol R.
2014-01-01
We studied a Mars sample return (MSR) mission that lands a SpaceX Dragon Capsule on Mars carrying sample collection hardware (an arm, drill, or small rover) and a spacecraft stack consisting of a Mars Ascent Vehicle (MAV) and Earth Return Vehicle (ERV) that collectively carry the sample container from Mars back to Earth orbit.
Radvany, Martin G; Quinones-Hinojosa, Alfredo; Gallia, Gary L; Wand, Gary S; Salvatori, Roberto
2016-09-01
Because magnetic resonance imaging (MRI) fails to detect many adrenocorticotropic hormone (ACTH)-secreting pituitary adenomas, inferior petrosal sinus sampling (IPSS) is considered the gold standard to differentiate Cushing disease (CD) from ectopic ACTH secretion syndrome (EAS). Some authors have suggested internal jugular vein sampling (IJVS) as an alternative to IPSS. We simultaneously compared IJVS to IPSS in 30 consecutive patients referred for ACTH-dependent Cushing syndrome and equivocal MRI exams. Five sites were simultaneously sampled in each patient (right and left IPS, right and left IJV, and femoral vein) before and after the administration of corticotrophin-releasing hormone or desmopressin. The test was considered consistent with CD when the IPS to peripheral ratio was >2 at baseline or >3 after stimulus, and when the IJV to peripheral ratio was >1.7 at baseline or >2 after stimulus. In 27 of 30 patients, IPSS results were consistent with a central source of ACTH. Two of the other 3 patients had EAS (one lung carcinoid and one occult), and 1 patient had pathology-proven CD. The sensitivity of IPSS was 96.4%. Only 64.2% of these patients had results meeting criteria for a central source of ACTH by IJVS criteria. Twenty patients with centralizing IPSS results underwent pituitary surgery. Of these, the central origin of excessive ACTH was confirmed with certainty in 16 patients. Among these 16 patients, the IPSS sensitivity was 93.8%, whereas 5 patients had false-negative IJVS (68.7% sensitivity). These results do not support the routine use of IJVS in establishing whether the pituitary is the source of excessive ACTH. ACTH = adrenocorticotropic hormone CD = Cushing disease CRH = corticotrophin-releasing hormone CS = Cushing syndrome DDAVP = desmopressin EAS = ectopic ACTH secretion IJVS = internal jugular vein sampling IPSS = inferior petrosal sinus sampling JVS = jugular venous sampling MRI = magnetic resonance imaging.
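The diagnostic cut-offs quoted in the abstract can be expressed as a small rule. A sketch (function and argument names are hypothetical; the thresholds are the ones the study states):

```python
def consistent_with_cd(ratio_baseline, ratio_stimulated, site="IPSS"):
    """Apply the study's cut-offs: a test is consistent with Cushing disease
    when the sinus/vein-to-peripheral ACTH ratio exceeds the site-specific
    threshold either at baseline or after CRH/desmopressin stimulation."""
    cutoffs = {
        "IPSS": (2.0, 3.0),   # baseline cut-off, post-stimulus cut-off
        "IJVS": (1.7, 2.0),
    }
    base_cut, stim_cut = cutoffs[site]
    return ratio_baseline > base_cut or ratio_stimulated > stim_cut
```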
The Principle of Energetic Consistency
Cohn, Stephen E.
2009-01-01
A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
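The decomposition behind the principle is the standard second-moment identity; a sketch in generic notation (not the paper's), with x the state written in natural energy variables, x̄ its conditional mean given the observations y, and P the conditional covariance:

```latex
% Expected total energy splits into the energy of the conditional mean
% plus the total variance (trace of the conditional covariance):
\mathbb{E}\!\left[\tfrac{1}{2}\,x^{\mathsf T}x \,\middle|\, y\right]
  \;=\; \tfrac{1}{2}\,\bar{x}^{\mathsf T}\bar{x}
  \;+\; \tfrac{1}{2}\,\operatorname{tr}(P),
\qquad \bar{x} \;=\; \mathbb{E}[x \mid y].
```

If the dynamics conserve the left-hand side between observations, the sum on the right is constant as well, which is the statement of energetic consistency.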
How consistent are beliefs about the causes and solutions to illness? An experimental study.
Ogden, J; Jubb, A
2008-01-01
Objectives: Research illustrates that people hold beliefs about the causes and solutions to illness. This study aimed to assess the consistency in these beliefs in terms of their variation according to type of problem and whether they are consistent with each other. Further, the study aimed to assess whether they are open to change and whether changing beliefs about cause resulted in a subsequent shift in beliefs about solutions. Design: Experimental factorial 3 (problem) × 2 (manipulated cau...
Directory of Open Access Journals (Sweden)
A. V. Kurinnoy
2012-12-01
Using the rotary viscometer «Reotest 2», the consistency properties of an instillation gel-liniment for antimicrobial therapy of pyoinflammatory diseases of the maxillofacial area were studied. It was found that the consistency properties of the gel-liniment lie within the limits of the rheological optimum of ointment consistency, and the value of «mechanical stability» (1.33) characterizes the system as exceptionally thixotropic, providing recoverability of the system after loading and allowing the stability of the consistency properties of the gel-liniment to be forecast over prolonged storage.
Le Mens, Gael; Denrell, Jerker
2011-01-01
Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them.…
Drummond, A; Rodrigo, A G
2000-12-01
Reconstruction of evolutionary relationships from noncontemporaneous molecular samples provides a new challenge for phylogenetic reconstruction methods. With recent biotechnological advances there has been an increase in molecular sequencing throughput, and the potential to obtain serial samples of sequences from populations, including rapidly evolving pathogens, is fast being realized. A new method called the serial-sample unweighted pair grouping method with arithmetic means (sUPGMA) is presented that reconstructs a genealogy or phylogeny of sequences sampled serially in time using a matrix of pairwise distances. The resulting tree depicts the terminal lineages of each sample ending at a different level consistent with the sample's temporal order. Since sUPGMA is a variant of UPGMA, it will perform best when sequences have evolved at a constant rate (i.e., according to a molecular clock). On simulated data, this new method performs better than standard cluster analysis under a variety of longitudinal sampling strategies. Serial-sample UPGMA is particularly useful for analysis of longitudinal samples of viruses and bacteria, as well as ancient DNA samples, with the minimal requirement that samples of sequences be ordered in time.
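Plain UPGMA, the base algorithm that sUPGMA modifies for serially sampled sequences, is compact enough to sketch (the serial-sample correction itself is not reproduced; the distance matrix below is made up):

```python
def upgma(names, dist):
    """Agglomerative clustering with average linkage on a pairwise distance
    matrix. dist maps frozenset({a, b}) -> distance; under a molecular clock,
    half the merge distance is the node height. Returns the tree as nested
    tuples plus the height of each internal node."""
    sizes = {n: 1 for n in names}
    clusters = list(names)
    heights = {}
    while len(clusters) > 1:
        # merge the closest pair of current clusters
        a, b = min(((x, y) for i, x in enumerate(clusters)
                    for y in clusters[i + 1:]),
                   key=lambda p: dist[frozenset(p)])
        node = (a, b)
        heights[node] = dist[frozenset((a, b))] / 2
        # size-weighted average distance from the new cluster to the rest
        for c in clusters:
            if c not in (a, b):
                dist[frozenset((node, c))] = (
                    sizes[a] * dist[frozenset((a, c))]
                    + sizes[b] * dist[frozenset((b, c))]
                ) / (sizes[a] + sizes[b])
        sizes[node] = sizes[a] + sizes[b]
        clusters = [c for c in clusters if c not in (a, b)] + [node]
    return clusters[0], heights

tree, heights = upgma(
    ["A", "B", "C"],
    {frozenset(("A", "B")): 2.0,
     frozenset(("A", "C")): 6.0,
     frozenset(("B", "C")): 6.0})
```

sUPGMA's extra step adjusts the distances so that tips from different sampling times terminate at levels consistent with their temporal order instead of all at height zero.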
Drone Transport of Chemistry and Hematology Samples Over Long Distances.
Amukele, Timothy K; Hernandez, James; Snozek, Christine L H; Wyatt, Ryan G; Douglas, Matthew; Amini, Richard; Street, Jeff
2017-11-02
We addressed the stability of biological samples in prolonged drone flights by obtaining paired chemistry and hematology samples from 21 adult volunteers in a single phlebotomy event-84 samples total. Half of the samples were held stationary, while the other samples were flown for 3 hours (258 km) in a custom active cooling box mounted on the drone. After the flight, 19 chemistry and hematology tests were performed. Seventeen analytes had small or no bias, but glucose and potassium in flown samples showed an 8% and 6.2% bias, respectively. The flown samples (mean, 24.8°C) were a mean of 2.5°C cooler than the stationary samples (mean, 27.3°C) during transportation to the flight field as well as during the flight. The changes in glucose and potassium are consistent with the magnitude and duration of the temperature difference between the flown and stationary samples. Long drone flights of biological samples are feasible but require stringent environmental controls to ensure consistent results.
Assessing distances and consistency of kinematics in Gaia/TGAS
Schönrich, Ralph; Aumer, Michael
2017-12-01
We apply the statistical methods by Schönrich, Binney & Asplund to assess the quality of distances and kinematics in the Radial Velocity Experiment (RAVE)-Tycho-Gaia Astrometric Solution (TGAS) and Large Sky Area Multiobject Fiber Spectroscopic Telescope (LAMOST)-TGAS samples of Solar neighbourhood stars. These methods yield a nominal distance accuracy of 1-2 per cent. Other than common tests on parallax accuracy, they directly test distance estimations including the effects of distance priors. We show how to construct these priors including the survey selection functions (SSFs) directly from the data. We demonstrate that neglecting the SSFs causes severe distance biases. Due to the decline of the SSFs in distance, the simple 1/parallax estimate only mildly underestimates distances. We test the accuracy of measured line-of-sight velocities (vlos) by binning the samples in the nominal vlos uncertainties. We find: (i) the LAMOST vlos have a ∼ -5 km s⁻¹ offset; (ii) the average LAMOST measurement error for vlos is ∼7 km s⁻¹, significantly smaller than, and nearly uncorrelated with, the nominal LAMOST estimates. The RAVE sample shows either a moderate distance underestimate, or an unaccounted source of vlos dispersion (e∥) from measurement errors and binary stars. For a subsample of suspected binary stars in RAVE, our methods indicate significant distance underestimates. Separating a sample in metallicity or kinematics to select thick-disc/halo stars discriminates between distance bias and e∥. For LAMOST, this separation yields consistency with pure vlos measurement errors. We find an anomaly near longitude l ∼ (300 ± 60)° and distance s ∼ (0.32 ± 0.03) kpc on both sides of the galactic plane, which could be explained by either a localized distance error or a breathing mode.
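Why a raw 1/parallax estimate needs a prior at all can be seen with a short Monte Carlo (the numbers are illustrative, not the survey's): with symmetric parallax noise, 1/parallax is convex, so naive inversion is biased toward larger distances by Jensen's inequality; the abstract's point is that the declining selection function in their samples happens to pull the estimate the other way, leaving only a mild net underestimate.

```python
import random

rng = random.Random(42)
true_parallax = 1.0      # mas, i.e. a true distance of 1 kpc
sigma = 0.2              # 20% parallax error (illustrative)

estimates = []
for _ in range(100_000):
    p = rng.gauss(true_parallax, sigma)
    if p > 0:            # negative parallaxes cannot be inverted at all
        estimates.append(1.0 / p)

mean_distance = sum(estimates) / len(estimates)
# mean_distance exceeds the true 1 kpc: naive inversion overestimates distance
```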
Multirobot FastSLAM Algorithm Based on Landmark Consistency Correction
Directory of Open Access Journals (Sweden)
Shi-Ming Chen
2014-01-01
Considering the influence of uncertain map information on the multirobot SLAM problem, a multirobot FastSLAM algorithm based on landmark consistency correction is proposed. Firstly, an electromagnetism-like mechanism is introduced into the resampling procedure of single-robot FastSLAM: each sampling particle is treated as a charged electron, and the attraction-repulsion mechanism of the electromagnetic field is used to simulate interaction forces between the particles and improve their distribution. Secondly, when multiple robots observe the same landmarks, every robot is regarded as one node and a Kalman-Consensus Filter is proposed to update landmark information, which further improves the accuracy of localization and mapping. Finally, the simulation results show that the algorithm is suitable and effective.
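The step being modified is the standard particle-filter resampling bottleneck. For context, a sketch of plain systematic resampling, the baseline that the electromagnetism-like redistribution augments (the force computation itself is not reproduced here):

```python
import random

def systematic_resample(particles, weights, rng):
    """Draw len(particles) copies with probability proportional to weight,
    using one random offset and evenly spaced pointers (low variance)."""
    n = len(particles)
    total = sum(weights)
    step = total / n
    u = rng.random() * step          # single random offset in [0, step)
    out, cum, i = [], weights[0], 0
    for _ in range(n):
        while u > cum:               # advance to the particle covering pointer u
            i += 1
            cum += weights[i]
        out.append(particles[i])
        u += step
    return out

rng = random.Random(7)
resampled = systematic_resample(["p0", "p1", "p2", "p3"],
                                [0.1, 0.1, 0.1, 0.7], rng)
```

High-weight particles are duplicated and low-weight ones dropped; the paper's electromagnetism-like step then spreads the surviving particles apart instead of leaving identical copies.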
Yavuz Konokman, Gamze; Yanpar Yelken, Tugba
2016-01-01
The purpose of the study was to determine the effect of preparing digital stories through an inquiry based learning approach on prospective teachers' resistive behaviors toward technology based instruction and conducting research. The research model was convergent parallel design. The sample consisted of 50 prospective teachers who had completed…
Will Women Diagnosed with Breast Cancer Provide Biological Samples for Research Purposes?
Directory of Open Access Journals (Sweden)
Shelley A Harris
Little is known about response rates for biological sample donation and attitudes towards control recruitment, especially in younger women. The goals of this pilot study were to determine, in women recently diagnosed with breast cancer, the proportion of cases willing to provide biological samples and, for purposes of control recruitment, contact information for friends or colleagues. A population-based sample of breast cancer cases (n = 417, 25-74 years) was recruited from the Ontario Cancer Registry in 2010, and self-administered questionnaires were completed to determine willingness to provide samples (spot or 24-hr urine, saliva, blood) and contact information for friends/colleagues for control recruitment. Using χ² analyses of contingency tables we evaluated whether these proportions varied by age group (<45 and 45+) and other factors such as ethnicity, education, income, body mass index (BMI), smoking status and alcohol consumption. Cases were willing to provide blood samples by visiting a clinic (62%) or by having a nurse visit the home (61%). Moreover, they would provide saliva (73%) and morning or 24-hr urine samples (66% and 52%, respectively). Younger cases (≤45) were 3 times more likely than older cases to agree to collect morning urine (OR = 3; 95% CI: 1.15-8.35). Only 26% of cases indicated they would provide contact information for friends or work colleagues to act as controls. Educated cases were more likely to agree to provide samples, and cases who consumed alcohol were more willing to provide contact information. Ethnicity, income, BMI and smoking had little effect on response rates. Reasonable response rates for biological sample collection should be expected in future case-control studies in younger women, but other methods of control selection must be devised.
Simancas-Pallares, Miguel Angel; Fortich Mesa, Natalia; González Martínez, Farith Damián
To determine the internal consistency and content validity of the Maslach Burnout Inventory-Student Survey (MBI-SS) in dental students from Cartagena, Colombia. Scale validation study in 886 dental students from Cartagena, Colombia. Factor structure was determined through exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). Internal consistency was measured using Cronbach's alpha coefficient. Analyses were performed using Stata v.13.2 for Windows (StataCorp, USA) and Mplus v.7.31 for Windows (Muthén & Muthén, USA). Internal consistency was α=.806. The factor structure showed three factors that accounted for 56.6% of the variance. CFA revealed: χ²=926.036; df=85; RMSEA=.106 (90% CI, .100-.112); CFI=.947; TLI=.934. The MBI-SS showed adequate internal consistency and a factor structure consistent with the originally proposed structure but with poor fit, which does not reflect adequate content validity in this sample.
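Cronbach's alpha, the internal-consistency coefficient used here, is simple to compute from an item-by-respondent score matrix. A sketch with made-up data (two perfectly parallel items give α = 1 by construction):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: list of k item-score lists, each over the same respondents.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(pvariance(i) for i in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# two perfectly correlated toy items across three respondents
alpha = cronbach_alpha([[1, 2, 3], [1, 2, 3]])
```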
Frictional behaviour of sandstone: A sample-size dependent triaxial investigation
Roshan, Hamid; Masoumi, Hossein; Regenauer-Lieb, Klaus
2017-01-01
Frictional behaviour of rocks, from the initial stage of loading to final shear displacement along the formed shear plane, has been widely investigated in the past. However, the effect of sample size on such frictional behaviour has not attracted much attention. This is mainly related to limitations in rock testing facilities as well as the complex mechanisms involved in the sample-size dependent frictional behaviour of rocks. In this study, a suite of advanced triaxial experiments was performed on Gosford sandstone samples of different sizes and at different confining pressures. The post-peak response of the rock along the formed shear plane has been captured for the analysis, with particular interest in sample-size dependency. Several important phenomena have been observed from the results of this study: a) the rate of transition from brittleness to ductility in rock is sample-size dependent, where the relatively smaller samples showed faster transition toward ductility at any confining pressure; b) the sample size influences the angle of the formed shear band; and c) the friction coefficient of the formed shear plane is sample-size dependent, where the relatively smaller sample exhibits a lower friction coefficient than larger samples. We interpret our results in terms of a thermodynamic approach in which the frictional properties for finite deformation are viewed as encompassing a multitude of ephemeral slipping surfaces prior to the formation of the through-going fracture. The final fracture itself is seen as a result of the self-organisation of a sufficiently large ensemble of micro-slip surfaces and is therefore consistent with thermodynamic theory. This assumption vindicates the use of classical rock mechanics experiments to constrain failure of pressure-sensitive rocks, and the future imaging of these micro-slips opens an exciting path for research in rock failure mechanisms.
Sample geometry as critical factor for stability research
Klerk, W.P.C. de; Boers, M.N.
2003-01-01
Stability research on gun propellants has been widely performed by microcalorimetry since the 1980s; TNO Prins Maurits Laboratory has had broad experience with the technique since the early 1970s. Many past studies investigated the influence of oxygen, humidity, etc. Less attention was
Brennan, Sue E; McKenzie, Joanne E; Turner, Tari; Redman, Sally; Makkar, Steve; Williamson, Anna; Haynes, Abby; Green, Sally E
2017-01-17
Capacity building strategies are widely used to increase the use of research in policy development. However, a lack of well-validated measures for policy contexts has hampered efforts to identify priorities for capacity building and to evaluate the impact of strategies. We aimed to address this gap by developing SEER (Seeking, Engaging with and Evaluating Research), a self-report measure of individual policymakers' capacity to engage with and use research. We used the SPIRIT Action Framework to identify pertinent domains and guide development of items for measuring each domain. Scales covered (1) individual capacity to use research (confidence in using research, value placed on research, individual perceptions of the value their organisation places on research, supporting tools and systems), (2) actions taken to engage with research and researchers, and (3) use of research to inform policy (extent and type of research use). A sample of policymakers engaged in health policy development provided data to examine scale reliability (internal consistency, test-retest) and validity (relation to measures of similar concepts, relation to a measure of intention to use research, internal structure of the individual capacity scales). Response rates were 55% (150/272 people, 12 agencies) for the validity and internal consistency analyses, and 54% (57/105 people, 9 agencies) for test-retest reliability. The individual capacity scales demonstrated adequate internal consistency reliability (alpha coefficients > 0.7 for all four scales) and test-retest reliability (intra-class correlation coefficients > 0.7 for three scales and 0.59 for the fourth scale). Scores on the individual capacity scales converged as predicted with measures of similar concepts (moderate correlations of > 0.4), and confirmatory factor analysis provided evidence that the scales measured related but distinct concepts. Items in each of these four scales related as predicted to concepts in the measurement model derived
Daily self-sampling for high-risk human papillomavirus (HR-HPV) testing.
Sanner, Karin; Wikström, Ingrid; Gustavsson, Inger; Wilander, Erik; Lindberg, Julia Hedlund; Gyllensten, Ulf; Olovsson, Matts
2015-12-01
Self-sampling for HPV as part of primary screening is a well-tolerated method for women not attending organized Pap smear screening and could increase the coverage of cervical cancer screening. The aim was to investigate whether the prevalence of HR-HPV varies from day to day in infected women and whether a single sample is reliable for detecting an ongoing infection. This is a prospective cohort study of 12 premenopausal and 13 postmenopausal women performing daily self-sampling for HR-HPV testing; all had been HR-HPV-positive 1-3 months earlier. Postmenopausal women were sampled for 28 days and premenopausal women during the bleeding-free days of one menstrual cycle. A possible difference in viral load between the estrogen-dominated proliferative phase and the progesterone-dominated secretory phase was analyzed. Consistent results throughout the sampling period were observed for 19 women, with either a daily presence of HPV (14 women) or no HPV at all during the sampling period (5 women). Of 607 samples from 25 women, 596 were consistently positive or negative for HPV during the sampling period and 11 (2%) were inconsistent. There was no difference in HPV copy number between the estrogen-dominated proliferative and progesterone-dominated secretory menstrual cycle phases. The major finding was a high degree of consistency in HR-HPV positivity and negativity in vaginal fluid during a sustained period of daily self-sampling; it does not appear to matter whether the sample is collected in the proliferative or the secretory phase. Copyright © 2015 Elsevier B.V. All rights reserved.
Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A
2014-02-01
Respondent-driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania (the "original sample") to assess whether the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether, 99.2% of the 360 simulation-sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples with statistically indistinguishable prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
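The consistency check described in this abstract, whether each simulated point estimate falls within the confidence interval of the original prevalence, can be sketched with a simple binomial interval. A minimal illustration using the abstract's HIV prevalence of 9.8%; the sample size n = 300 is an assumption inferred from "249 people (83% of the sample)", and the normal-approximation (Wald) interval is one standard choice, not necessarily the interval the authors used:

```python
import math

def wald_ci(p, n, z=1.96):
    """Normal-approximation 95% confidence interval for a prevalence p
    observed in a sample of n people."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# HIV prevalence 9.8% in the original ISS sample; n = 300 is an assumption
lo, hi = wald_ci(0.098, 300)

def consistent(estimate, lo=lo, hi=hi):
    """A simulated point estimate is 'consistent' if it falls inside the CI."""
    return lo <= estimate <= hi
```

With these assumed inputs the interval is roughly (6.4%, 13.2%), so a simulated estimate of 10% would count as consistent while 15% would not.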
McCABE, SEAN ESTEBAN; HUGHES, TONDA L.; BOSTWICK, WENDY; BOYD, CAROL J.
2011-01-01
Objective The present research examines the associations between three distinct dimensions of sexual orientation and substance use in a random sample of undergraduate students. Method A Web-based survey was administered to students attending a large, midwestern research university in the spring of 2003. The sample consisted of 9,161 undergraduate students: 56% female, 68% white, 13% Asian, 6% black, 4% Hispanic and 9% other racial categories. Using multivariate logistic regression analyses, several measures of alcohol and other drug use were compared across three dimensions of sexual orientation: sexual identity, sexual attraction and sexual behavior. Results All three dimensions of sexual orientation were associated with substance use, including heavy episodic drinking, cigarette smoking and illicit drug use. Consistent with results of several other recent studies, “nonheterosexual” identity, attraction or behavior was associated with a more pronounced and consistent risk of substance use in women than in men. Conclusions Study findings suggest substantial variability in substance use across the three dimensions of sexual orientation and reinforce the importance of stratifying by gender and using multiple measures to assess sexual orientation. Study results have implications for future research and for interventions aimed at reducing substance use among college students. PMID:16331847
Directory of Open Access Journals (Sweden)
José da Rocha Carvalheiro
1979-09-01
A continuous system for surveying health conditions through household interviews, operating in Ribeirão Preto (SP, Brazil) since 1974, is described. Its advantages for investigating specific problems that arose during this period, as well as its use in teaching, are discussed. The use of an adequate population-based survey is frequently impossible in epidemiological studies. Special studies are made among particular groups of individuals to investigate simultaneously the presence of both the factor and the disease. In these studies it is obviously important to use adequate sampling techniques. A system of continuous household sampling is described, designed to perform, simultaneously, epidemiological research and health-system monitoring, and to serve as a basis for courses on sampling techniques and epidemiological methods. In the municipality of Ribeirão Preto, S. Paulo, Brazil, a household sampling system has been in operation since 1974, using a master sample of 8500 households. Every two weeks, 380 households are visited and information is gathered about diseases, accidents, and the use of health services. Special epidemiological research is introduced when necessary. Future developments include the use of standardized questionnaires and physical and laboratory examinations of the people interviewed.
Consistency of variables in PCS and JASTRO great area database
International Nuclear Information System (INIS)
Nishino, Tomohiro; Teshima, Teruki; Abe, Mitsuyuki
1998-01-01
To examine whether the Patterns of Care Study (PCS) reflects the data for the major areas in Japan, the consistency of variables in the PCS and in the major-area database of the Japanese Society for Therapeutic Radiology and Oncology (JASTRO) was examined. Patients with esophageal or uterine cervical cancer were sampled from the PCS and JASTRO databases. From the JASTRO database, 147 patients with esophageal cancer and 95 patients with uterine cervical cancer were selected according to the PCS eligibility criteria; from the PCS, 455 esophageal and 432 uterine cervical cancer patients were surveyed. Six items for esophageal cancer and five items for uterine cervical cancer were selected for a comparative analysis of the PCS and JASTRO databases. Esophageal cancer: age (p=.0777), combination of radiation and surgery (p=.2136), and energy of the external beam (p=.6400) were consistent between PCS and JASTRO; however, the external-beam dose for the non-surgery group was inconsistent (p=.0467). Uterine cervical cancer: age (p=.6301) and clinical stage (p=.8555) were consistent between the two datasets; however, the energy of the external beam (p<.0001), dose rate of brachytherapy (p<.0001), and brachytherapy utilization by clinical stage (p<.0001) were inconsistent. It appears possible that the JASTRO major-area database could not account for all patient backgrounds and factors, and that both surveys may have an imbalance in the stratification of institutions, including differences in equipment and staffing patterns. (author)
Turbidity threshold sampling for suspended sediment load estimation
Jack Lewis; Rand Eads
2001-01-01
The paper discusses an automated procedure for measuring turbidity and sampling suspended sediment. The basic equipment consists of a programmable data logger, an in situ turbidimeter, a pumping sampler, and a stage-measuring device. The data-logger program uses turbidity to govern sample collection during each transport event. Mounting configurations and...
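Turbidity-governed sample collection can be sketched as a threshold-crossing detector: the logger triggers the pumping sampler whenever the turbidity trace crosses one of a set of rising or falling thresholds. A minimal sketch under stated assumptions; the trace values and threshold set below are illustrative, not the published protocol:

```python
def threshold_triggered_samples(turbidity_series, thresholds):
    """Return the time-step indices at which a pumped sample would be
    triggered: whenever the turbidity trace crosses any threshold,
    rising or falling. Thresholds are illustrative, not protocol values."""
    triggers = []
    prev = turbidity_series[0]
    for i, t in enumerate(turbidity_series[1:], start=1):
        for th in thresholds:
            if (prev < th <= t) or (prev > th >= t):  # rising or falling crossing
                triggers.append(i)
                break                                  # at most one sample per step
        prev = t
    return triggers

# Synthetic storm-event trace (NTU) and an illustrative threshold ladder
trace = [5, 12, 18, 80, 140, 90, 40, 15, 6]
samples = threshold_triggered_samples(trace, thresholds=[10, 25, 50, 100])
```

Spacing the thresholds more densely at high turbidity concentrates pumped samples where sediment flux changes fastest, which is the motivation for threshold (rather than fixed-interval) sampling.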
Evaluating the Consistency of the FNA Test in Pathologically Proven Nodules of Thyroidectomy
Directory of Open Access Journals (Sweden)
Alireza Khazaei
2018-02-01
Fine Needle Aspiration (FNA) is a selective diagnostic technique for the evaluation of non-toxic thyroid nodules. Thyroid FNA results are non-diagnostic, suspicious, or indeterminate in 20-30% of cases. This study therefore seeks to determine the consistency of the FNA test against pathologically proven nodules from thyroidectomy. This descriptive cross-sectional study was carried out on 73 candidates for thyroidectomy admitted to Imam Ali Hospital, using a census sampling method. The FNA samples and pathology samples were evaluated, and the consistency of the FNA test with the pathologically proven nodules was compared. SPSS software was used for data analysis. The mean age of the patients was 40.1 ± 12.9 years; 23.3% of the participants were male and 76.7% were female. The malignancy rate was 65.8% (48 cases) on pathology and 53.4% (39 cases) on FNA. Of the 48 pathologically positive cases, FNA diagnosed 35 (72.9%) as positive and 13 (27.1%) as negative. Of the 25 pathologically negative cases, FNA diagnosed 21 (84%) as negative and 4 (16%) as positive. The sensitivity, specificity, and positive and negative predictive values of FNA in diagnosing malignancy were 72.92%, 84%, 89.74%, and 61.76%, respectively. The results show that FNA does not have high sensitivity in the diagnosis of malignancy but has good specificity, so the use of other diagnostic methods before thyroid-nodule surgery seems necessary.
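The reported diagnostic figures follow directly from the 2x2 table in the abstract (35 true positives, 13 false negatives, 21 true negatives, 4 false positives). A minimal sketch of the computation:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Return sensitivity, specificity, PPV and NPV as percentages,
    from the counts of a 2x2 diagnostic table."""
    sensitivity = 100 * tp / (tp + fn)  # true-positive rate
    specificity = 100 * tn / (tn + fp)  # true-negative rate
    ppv = 100 * tp / (tp + fp)          # positive predictive value
    npv = 100 * tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv

# Counts reported in the abstract: FNA vs. final pathology in 73 thyroidectomies
sens, spec, ppv, npv = diagnostic_metrics(tp=35, fn=13, tn=21, fp=4)
# sens ≈ 72.92, spec = 84.0, ppv ≈ 89.74, npv ≈ 61.76, matching the abstract
```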
Lemke, Sonne; Brennan, Penny L; SooHoo, Sonya; Schutte, Kathleen K
2017-08-01
In 2011, the Veterans Health Administration (VHA) began implementing consistent staff assignment in its nursing homes (called Community Living Centers or CLCs). Consistent assignment, a cornerstone of culture change, minimizes the number of staff who provide a resident's care. The present research assessed the level and stability of consistent assignment in units within VHA CLCs and identified unit characteristics related to implementation of this staff assignment model. Schedulers in 185 of 335 organizational units that make up VHA CLCs completed a Staffing Practices Survey. For the month prior to the survey, 53% of CLC units had full implementation of consistent assignment. Tracked back over time, 37% of CLC units had stable high consistent assignment, 29% had stable low consistent assignment, and 34% were variable. Units with stable high consistent assignment were most likely to use care teams with stable membership and to obtain staff input for care assignments. Schedulers in these units reported more positive experiences with consistent staff assignment and better unit functioning in terms of staff absences, complaints about workload fairness, and resolution of scheduling problems. Units with stable low and variable consistent assignment were similar in most of these respects; however, units with variable consistent assignment made greater use of stable care teams and were less likely to change assignments at a staff member's request. Overall, consistent assignment implementation was not related to unit size, nursing hours per resident day, or specialty focus. Findings can help guide consistent staff assignment implementation in VHA and community nursing homes. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Bowman, Kaye; McKenna, Suzy
2016-01-01
This occasional paper provides an overview of the development of Australia's national training system and is a key knowledge document of a wider research project "Consistency with flexibility in the Australian national training system." This research project investigates the various approaches undertaken by each of the jurisdictions to…
Consistent classical supergravity theories
International Nuclear Information System (INIS)
Muller, M.
1989-01-01
This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included
Statistics and sampling in transuranic studies
International Nuclear Information System (INIS)
Eberhardt, L.L.; Gilbert, R.O.
1980-01-01
The existing data on transuranics in the environment exhibit a remarkably high variability from sample to sample (coefficients of variation of 100% or greater). This chapter stresses the necessity of adequate sample size and suggests various ways to increase sampling efficiency. Objectives in sampling are regarded as being of great importance in making decisions as to sampling methodology. Four different classes of sampling methods are described: (1) descriptive sampling, (2) sampling for spatial pattern, (3) analytical sampling, and (4) sampling for modeling. A number of research needs are identified in the various sampling categories along with several problems that appear to be common to two or more such areas
Wood, Louise; Smith, Michael; Miller, Christopher B; O'Carroll, Ronan E
2018-06-19
Vaccinations are important preventative health behaviors. The recently developed Vaccination Attitudes Examination (VAX) Scale aims to measure the reasons behind refusal/hesitancy regarding vaccinations. The aim of this replication study is to conduct an independent test of the newly developed VAX Scale in the UK. We tested (a) internal consistency (Cronbach's α); (b) convergent validity by assessing its relationships with beliefs about medication, medical mistrust, and perceived sensitivity to medicines; and (c) construct validity by testing how well the VAX Scale discriminated between vaccinators and nonvaccinators. A sample of 243 UK adults completed the VAX Scale, the Beliefs About Medicines Questionnaire, the Perceived Sensitivity to Medicines Scale, and the Medical Mistrust Index, in addition to demographics of age, gender, education levels, and social deprivation. Participants were asked (a) whether they received an influenza vaccination in the past year and (b) if they had a young child, whether they had vaccinated the young child against influenza in the past year. The VAX (a) demonstrated high internal consistency (α = .92); (b) was positively correlated with medical mistrust and beliefs about medicines, and less strongly correlated with perceived sensitivity to medicines; and (c) successfully differentiated parental influenza vaccinators from nonvaccinators. The VAX demonstrated good internal consistency, convergent validity, and construct validity in an independent UK sample. It appears to be a useful measure to help us understand the health beliefs that promote or deter vaccination behavior.
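Cronbach's α, the internal-consistency statistic reported for the VAX Scale (α = .92), can be computed directly from a respondents-by-items score matrix. A minimal sketch; the 5 × 3 score matrix below is made-up illustrative data, not the study's:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents x items score matrix
    (list of rows, one row of item scores per respondent)."""
    k = len(scores[0])                  # number of items

    def var(xs):                        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var([row[i] for row in scores]) for i in range(k))
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative data: 5 respondents x 3 highly correlated items
alpha = cronbach_alpha([[3, 3, 4], [4, 4, 5], [2, 2, 3], [5, 5, 5], [1, 1, 2]])
```

Because the three items move together across respondents, α comes out close to 1; items answered independently of one another would drive it toward 0.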
Influence of Water Content on the Flow Consistency of Dredged Marine Soils
Directory of Open Access Journals (Sweden)
Rosman M. Z.
2016-01-01
At present, dredged marine soils (DMS) are generally considered geo-waste in Malaysia; they are known for high water content and low shear strength. Lightly solidified soils such as soil-cement slurry and flowable fill are known as controlled low-strength materials (CLSM). On site, the consistency of CLSM is tested with an open-ended cylindrical pipe: the vertical and lateral displacement from the test indicates the quality and workability of the CLSM. In this study, manufactured kaolin powder was mixed with different percentages of water; cement was also added to compare natural soil with solidified soil samples. Two flowability test methods were used, namely the conventional lift method and an innovative drop method. The lateral displacement (soil spread diameter) values were recorded and averaged. Tests showed that the soil spread diameter increased almost linearly with the amount of water, and the binder-added samples showed no significant difference from the non-binder samples. The mixing water content and the percentage of fines also influenced the soil spread diameter.
A feasibility study of radon measurement for soil sampling
International Nuclear Information System (INIS)
Zeng Bing, Ge Liangquan; Liu Hefan; Li Yeqiang; Zhang Jinzhao; Song Xiao'an
2010-01-01
The mechanism of radon release from soil is explained. A designed experiment confirms the feasibility of using radon measurement for soil sampling. The radon content and its sub-field were determined indoors and outdoors by activated-charcoal adsorption, active suction, and diameter-mark etching methods, together with 226Ra measurements. The results indicate that radon measurement for soil sampling is feasible, and that indoor data are more stable than outdoor data. Temperature, humidity, rainfall, and wind intensity strongly influence the data. When taking a soil sample, rain should be avoided as far as possible, as should fault zones, belts of folded strata, regions of complex geological structure, and so on. (authors)
Analysis of NASA Common Research Model Dynamic Data
Balakrishna, S.; Acheson, Michael J.
2011-01-01
Recent NASA Common Research Model (CRM) tests at the Langley National Transonic Facility (NTF) and Ames 11-foot Transonic Wind Tunnel (11-foot TWT) have generated an experimental database for CFD code validation. The database consists of force and moment, surface pressures and wideband wing-root dynamic strain/wing Kulite data from continuous sweep pitch polars. The dynamic data sets, acquired at 12,800 Hz sampling rate, are analyzed in this study to evaluate CRM wing buffet onset and potential CRM wing flow separation.
Hajcak, Greg; Meyer, Alexandria; Kotov, Roman
2017-08-01
In the clinical neuroscience literature, between-subjects differences in neural activity are presumed to reflect reliable measures, even though the psychometric properties of neural measures are almost never reported. The current article focuses on the critical importance of assessing and reporting internal consistency reliability: the homogeneity of the "items" that comprise a neural "score." We demonstrate how variability in the internal consistency of neural measures limits between-subjects (i.e., individual differences) effects. To this end, we use error-related brain activity (the error-related negativity, or ERN) in both healthy and generalized anxiety disorder (GAD) participants to demonstrate options for psychometric analyses of neural measures; we examine between-groups differences in internal consistency, between-groups effect sizes, and between-groups discriminability (i.e., ROC analyses), all as a function of the number of items (i.e., number of trials). Overall, internal consistency should be used to inform experimental design and the choice of neural measures in individual differences research. The internal consistency of neural measures is necessary for interpreting results and guiding progress in clinical neuroscience, and should be routinely reported in all individual differences studies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
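A standard way to project how reliability grows with the number of trials is the Spearman-Brown prophecy formula. It is offered here as a generic illustration of the trials-versus-reliability relationship this article examines, not necessarily the authors' exact method:

```python
def spearman_brown(r_single, k):
    """Projected reliability of a score aggregated over k trials,
    given the reliability r_single of a single trial
    (Spearman-Brown prophecy formula)."""
    return k * r_single / (1 + (k - 1) * r_single)

# e.g. a single-trial reliability of 0.2 rises above 0.7 with 10 trials
r10 = spearman_brown(0.2, 10)
```

The formula makes the article's design point concrete: a neural measure with weak single-trial reliability can still support individual-differences research if enough trials are averaged, and the required trial count can be estimated in advance.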
Forensic Tools to Track and Connect Physical Samples to Related Data
Molineux, A.; Thompson, A. C.; Baumgardner, R. W.
2016-12-01
Identifiers, such as local sample numbers, are critical to successfully connecting physical samples and related data; however, identifiers must be globally unique. The International Geo Sample Number (IGSN), generated when registering a sample in the System for Earth Sample Registration (SESAR), provides a globally unique alphanumeric code associated with basic metadata, related samples, and the sample's current physical storage location. When registered samples are published, users can link the figured samples to the basic metadata held at SESAR. The use cases we discuss include plant specimens from a Permian core, Holocene corals and derived powders, and thin sections with SEM stubs; much of this material is now published. The plant taxonomic study from the core is a digital PDF, and samples can be linked directly from the captions to the SESAR record. The study of stable isotopes from the corals is not yet digitally available, but individual samples are accessible. Full data and media records for both studies are located in our database, where higher-quality images, field notes, and section diagrams may exist. Georeferences permit mapping in current and deep-time plate configurations. Several aspects emerged during this study. First, ensure that adequate and consistent details are registered with SESAR. Second, educate and encourage researchers to obtain IGSNs. Third, publish the archive numbers, assigned prior to publication, alongside the IGSN; this provides access to further data through an Integrated Publishing Toolkit (IPT), aggregators, or online repository databases, placing the initial sample in a much richer context for future studies. Fourth, encourage software developers to customize community software to extract data from a database and use it to register samples in bulk, which would improve workflow and provide a path for registering large legacy collections.
Fourier rebinning and consistency equations for time-of-flight PET planograms
International Nuclear Information System (INIS)
Li, Yusheng; Matej, Samuel; Metzler, Scott D; Defrise, Michel
2016-01-01
Due to their unique geometry, dual-panel PET scanners have many advantages in dedicated breast imaging and on-board imaging applications, since the compact scanners can be combined with other imaging and treatment modalities. The major challenges of dual-panel PET imaging are the limited-angle problem and data truncation, which can cause artifacts due to incomplete data sampling. Time-of-flight (TOF) information is a promising means of reducing these artifacts. The TOF planogram is the native data format for dual-panel TOF PET scanners, and the non-TOF planogram is the 3D extension of the linogram. The TOF planogram is five-dimensional while the objects are three-dimensional, so there are two degrees of redundancy. In this paper, we derive consistency equations and Fourier-based rebinning algorithms to provide a complete understanding of the rich structure of fully 3D TOF planograms. We first derive two consistency equations and John's equation for 3D TOF planograms. By taking Fourier transforms, we obtain two Fourier consistency equations (FCEs) and the Fourier-John equation (FJE), the duals of the consistency equations and John's equation, respectively. We then solve the FCEs and FJE using the method of characteristics. The two degrees of entangled redundancy of the 3D TOF data can be explicitly elicited and exploited by the solutions along the characteristic curves. As special cases of the general solutions, we obtain Fourier rebinning and consistency equations (FORCEs), and thus a complete scheme for converting among the different types of PET planograms: 3D TOF, 3D non-TOF, 2D TOF and 2D non-TOF. The FORCEs can be used as Fourier-based rebinning algorithms for TOF-PET data reduction, as inverse rebinnings for designing fast projectors, or as consistency conditions for estimating missing data. As a byproduct, we show that the two consistency equations are necessary and sufficient for 3D TOF planograms. Finally, we give
[Qualitative research methodology in health care].
Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara
2017-03-01
Health care research requires different methodological approaches such as qualitative and quantitative analyzes to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs, non-random sampling by purpose, circular process of knowledge construction, and methodological rigor throughout the research process, from quality design to the consistency of results. The objective of this work is to contribute to the methodological knowledge about qualitative research in health services, based on the implementation of the study, The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals. The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.
International Nuclear Information System (INIS)
Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias
2002-01-01
We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman-alpha forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a range of 0.04-4.2 eV for the heaviest neutrino mass and the sharpest constraints to date on gravity waves, which (together with a preference for a slight red tilt) favor 'small-field' inflation models
Design of a Turbulence Generator of Medium Consistency Pulp Pumps
Directory of Open Access Journals (Sweden)
Hong Li
2012-01-01
The turbulence generator is a key component of medium-consistency (MC) centrifugal pulp pumps; its functions are to fluidize the medium-consistency pulp and to separate gas from the liquid. The structural dimensions of the generator affect its hydraulic performance, the radius and the blade laying angle being two important ones. Starting from research on the internal flow and shearing characteristics of MC pulp, a simple mathematical model of the flow section of the shearing chamber is built, and a formula and procedure for calculating the radius of the turbulence generator are established. The blade laying angle is referenced from the turbine agitator, which has a shape similar to the turbulence generator, and CFD simulation is applied to study the flow fields at different blade laying angles. The recommended blade laying angle of the turbulence generator is between 60° and 75°.
Standard practices for sampling uranium-Ore concentrate
American Society for Testing and Materials. Philadelphia
2010-01-01
1.1 These practices are intended to provide the nuclear industry with procedures for obtaining representative bulk samples from uranium-ore concentrates (UOC) (see Specification C967). 1.2 These practices also provide for obtaining a series of representative secondary samples from the original bulk sample for the determination of moisture and other test purposes, and for the preparation of pulverized analytical samples (see Test Methods C1022). 1.3 These practices consist of a number of alternative procedures for sampling and sample preparation which have been shown to be satisfactory through long experience in the nuclear industry. These procedures are described in the following order: Primary Sampling, one-stage falling stream (Section 4), two-stage falling stream (Section 5), auger (Section 6); Secondary Sampling, straight-path (reciprocating) (Section 7), rotating (Vezin) (Sections 8, 9); Sample Preparation (Section 10), concurrent-drying (Sections 11-13), natural moisture (Sections 14-16), calcination (Sections 17, 18); Sample Packaging (Section 19). Wax s...
A sampling algorithm for segregation analysis
Directory of Open Access Journals (Sweden)
Henshall John
2001-11-01
Abstract: Methods for detecting quantitative trait loci (QTL) without markers have generally used iterative peeling algorithms for determining genotype probabilities. These algorithms have considerable shortcomings in complex pedigrees. A Markov chain Monte Carlo (MCMC) method which samples the pedigree of the whole population jointly is described. Simultaneous sampling of the pedigree was achieved by sampling descent graphs using the Metropolis-Hastings algorithm. A descent graph describes the inheritance state of each allele and provides pedigrees guaranteed to be consistent with Mendelian sampling. Sampling descent graphs overcomes most, if not all, of the limitations incurred by iterative peeling algorithms. The algorithm was able to find the QTL in most of the simulated populations; however, when the QTL was not modeled or not found, its effect was ascribed to the polygenic component. No QTL were detected when none were simulated.
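The Metropolis-Hastings acceptance rule behind descent-graph sampling can be illustrated on a toy discrete target. This is a generic sketch of the algorithm only: the genetics-specific descent-graph state and proposal are replaced by a simple random walk on a ring of ten states, and the target weights are made up:

```python
import random

def metropolis_hastings(target, propose, x0, n_samples, seed=0):
    """Generic Metropolis-Hastings with a symmetric proposal:
    accept a proposed move x -> x' with probability
    min(1, target(x') / target(x)); otherwise repeat x."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        x_new = propose(x, rng)
        if rng.random() < min(1.0, target(x_new) / target(x)):
            x = x_new                  # accept the move
        samples.append(x)              # rejected moves repeat the current state
    return samples

# Toy target on {0, ..., 9} proportional to the (made-up) weights below
weights = [1, 2, 3, 4, 5, 5, 4, 3, 2, 1]
target = lambda i: weights[i]
propose = lambda i, rng: (i + rng.choice([-1, 1])) % 10  # symmetric random walk
chain = metropolis_hastings(target, propose, x0=0, n_samples=20000)
```

Because the proposal is symmetric, the chain's stationary distribution is proportional to the weights, so the long-run sample mean settles near the target mean of 4.5. In the paper's setting the state is an entire descent graph and the target is the joint pedigree likelihood, but the acceptance rule is the same.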
Recent developments in sample preparation and data pre-treatment in metabonomics research.
Li, Ning; Song, Yi peng; Tang, Huiru; Wang, Yulan
2016-01-01
Metabonomics is a powerful approach for biomarker discovery and an effective tool for pinpointing endpoint metabolic effects of external stimuli, such as pathogens and disease development. Due to its wide applications, metabonomics is required to deal with various biological samples of different properties. Hence, sample preparation and the corresponding data pre-treatment become important factors in ensuring the validity of an investigation. In this review, we summarize some recent developments in metabonomics sample preparation and data pre-treatment procedures. Copyright © 2015 Elsevier Inc. All rights reserved.
Preparation of honey sample for tritium monitoring
International Nuclear Information System (INIS)
Chen Bingru; Wang Chenlian; Wang Weihua
1989-01-01
The method of preparation of honey samples for tritium monitoring was described. The equipment consists of an air and honey supply system, a quartz combustor with a CM-type monolithic combustion catalyst, and a condensation system. In this equipment, the honey sample was converted into cooling water by distilling, cracking and carbonizing procedures for tritium counting. The recovery ratio is 99.0 ± 4.5 percent for tritiated water and 96.0 ± 2.0 percent for tritiated organic compounds. It is a feasible preparation method for total tritium monitoring in honey samples.
Christman, Stephen
2014-01-01
Prior research indicates that consistent-handedness is associated with decreased access to right hemisphere processing and consequent decreased cognitive flexibility. Handedness differences on three dimensions of personality related to cognitive flexibility were investigated. Experiment 1 found that consistent-handedness was associated with decreased sensation seeking. Experiment 2 found that consistent-handedness was associated with increased Right Wing Authoritarianism. Experiment 3 found that consistent-handedness was associated with increased sensitivity to disgust. Prior research has shown associations between decreased sensation seeking, increased authoritarianism, and increased disgust sensitivity, and consistent-handedness appears to underlie all of these associations. Personality researchers are encouraged to include handedness as a factor in analyses, as failure to do so can lead to systematic mis-estimation of sex differences due to the over-representation of females among consistent-handers.
Bidirectional associations between mothers' and fathers' parenting consistency and child BMI.
Jansen, Pauline W; Giallo, Rebecca; Westrupp, Elizabeth M; Wake, Melissa; Nicholson, Jan M
2013-12-01
Research suggests that general parenting dimensions and styles are associated with children's BMI, but directionality in this relationship remains unknown. Moreover, there has been little attention to the influences of both mothers' and fathers' parenting. We aimed to examine reciprocal relationships between maternal and paternal parenting consistency and child BMI. Participants were 4002 children and their parents in the population-based Longitudinal Study of Australian Children. Mothers and fathers self-reported parenting consistency, and children's BMI was measured at 4 biennial waves starting at age 4 to 5 years in 2004. Bidirectionality between parenting and child BMI was examined by using regression analyses in cross-lagged models. The best-fitting models indicated a modest influence from parenting to child BMI, whereas no support was found for bidirectional influences. For mothers, higher levels of parenting consistency predicted lower BMI in children from Waves 1 to 2 and 3 to 4; for example, for every SD increase in mothers' parenting consistency at Wave 1, child BMI z score fell by 0.025 in Wave 2 (95% confidence interval: -0.05 to -0.003). For fathers, higher levels of parenting consistency were associated with lower child BMI from Waves 1 to 2 and 2 to 3. Parenting inconsistency of mothers and fathers prospectively predicted small increases in offspring BMI over 2-year periods across middle childhood. However, child BMI did not appear to influence parenting behavior. These findings support recent calls for expanding childhood overweight interventions to address the broad parenting context while involving both mothers and fathers.
International Nuclear Information System (INIS)
Hazeltine, R.D.
1988-12-01
The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E × B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig.
Guo, Jiin-Huarng; Luh, Wei-Ming
2009-05-01
When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
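The trade-off the abstract describes, choosing per-group sample sizes and an allocation ratio for a target power, can be illustrated with the textbook two-sample normal-approximation formula. This is a simplified stand-in, not Yuen's trimmed-means procedure or the authors' cost-minimization formulas:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80, ratio=1.0):
    """Per-group sample sizes for a two-sample z-test on means
    (normal approximation), allowing an unequal allocation n2 = ratio * n1.

    effect_size: standardized mean difference (Cohen's d).
    """
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # two-sided critical value
    z_b = z.inv_cdf(power)           # power quantile
    n1 = (1 + 1 / ratio) * ((z_a + z_b) / effect_size) ** 2
    return math.ceil(n1), math.ceil(ratio * n1)

print(n_per_group(0.5))  # medium effect, equal allocation -> (63, 63)
```

Varying `ratio` shows the allocation effect the paper exploits: an unequal split shrinks one group at the cost of enlarging the other, which matters when per-observation costs differ between groups.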
Monolith Chromatography as Sample Preparation Step in Virome Studies of Water Samples.
Gutiérrez-Aguirre, Ion; Kutnjak, Denis; Rački, Nejc; Rupar, Matevž; Ravnikar, Maja
2018-01-01
Viruses exist in aquatic media and many of them use this medium as a transmission route. Next-generation sequencing (NGS) technologies have opened new doors in virus research, also revealing a hidden diversity of viral species in aquatic environments. Not surprisingly, many of the newly discovered viruses are found in environmental fresh and marine waters. One of the problems in virome research can be the low amount of viral nucleic acids present in the sample in contrast to the background ones (host, eukaryotic, prokaryotic, environmental). Therefore, virus enrichment prior to NGS is necessary in many cases. In water samples, an added problem resides in the low concentration of viruses typically present in aquatic media. Different concentration strategies have been used to overcome such limitations. CIM monoliths are a new generation of chromatographic supports that, due to their particular structural characteristics, are very efficient in the concentration and purification of viruses. In this chapter, we describe the use of CIM monolithic chromatography as a sample preparation step in NGS studies targeting viruses in fresh or marine water. The step-by-step protocol includes a case study where CIM concentration was used to study the virome of a wastewater sample using NGS.
Gibberd, Alison J; Simpson, Judy M; Eades, Sandra J
2017-10-01
Algorithms are often used to improve identification of Aboriginal Australians in linked data sets with inconsistent and incomplete recording of Aboriginal status. We compared how consistently some common algorithms identified family members, developed a new algorithm incorporating relatives' information, and assessed the effects of these algorithms on health estimates. The sample comprised people born 1980-2011 who were recorded as Aboriginal at least once (or were a relative of such a person) in four Western Australian data sets, together with their relatives (N = 156,407). A very inclusive approach, ever-Aboriginal (EA/EA+, where + denotes children's records incorporated), and two more specific approaches, multistage median (MSM/MSM+) and last record (LR/LR+), were chosen, along with the new algorithm (MSM+Family). Ever-Aboriginal (EA) categorized relatives the least consistently; 25% of parent-child triads had incongruent Aboriginal statuses with EA+, compared with only 9% with MSM+. With EA+, 14% of full siblings had different statuses, compared with 8% for MSM+. EA produced the lowest estimates of the proportion of Aboriginal people with poor health outcomes. Using relatives' records reduced the number of uncategorized people and categorized more people with few records (e.g., no hospital admissions) as Aboriginal. When many data sets are linked, more specific algorithms select more representative Aboriginal samples and identify the Aboriginality of relatives more consistently. Copyright © 2017 Elsevier Inc. All rights reserved.
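The two simplest algorithm families compared above, ever-Aboriginal (EA) and last record (LR), can be sketched over a person's linked records. The record format and status codes below are illustrative assumptions, not the authors' implementation:

```python
def ever_aboriginal(records):
    """EA: flag a person as Aboriginal if any linked record says so.

    records: list of (date, status) pairs, with status "A" (Aboriginal),
    "N" (non-Aboriginal), or None (status not stated on that record).
    """
    return any(status == "A" for _, status in records)

def last_record(records):
    """LR: use the most recent record that states a status at all.

    Returns True/False, or None if no record states a status.
    """
    stated = [(date, status) for date, status in records if status is not None]
    if not stated:
        return None
    return max(stated)[1] == "A"   # ISO dates sort lexicographically

history = [("2004-03", "N"), ("2009-11", "A"), ("2012-06", None)]
print(ever_aboriginal(history), last_record(history))  # → True True
```

EA's inclusiveness is visible here: a single "A" anywhere flags the person, which is what drives its inconsistent categorization of relatives in the paper.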
Donna L. Washington, MD, MPH; Su Sun, MPH; Mark Canning, BA
2010-01-01
Most veteran research is conducted in Department of Veterans Affairs (VA) healthcare settings, although most veterans obtain healthcare outside the VA. Our objective was to determine the adequacy and relative contributions of Veterans Health Administration (VHA), Veterans Benefits Administration (VBA), and Department of Defense (DOD) administrative databases for representing the U.S. veteran population, using as an example the creation of a sampling frame for the National Survey of Women Vete...
International Nuclear Information System (INIS)
Peterson, M.E.; Scheele, R.D.; Tingey, J.M.
1989-09-01
In FY 1989, Westinghouse Hanford Company (WHC) successfully obtained four core samples (totaling seven segments) of neutralized current acid waste (NCAW) from double-shell tanks (DSTs) 101-AZ and 102-AZ. A segment was a 19-in.-long and 1-in.-diameter cylindrical sample of waste. A core sample consisted of enough 19-in.-long segments to obtain the waste of interest. Three core samples were obtained from DST 101-AZ and one core sample from DST 102-AZ. Two DST 101-AZ core samples consisted of two segments per core, and the third core sample consisted of only one segment. The third core consisted of the solids from the bottom of the tank and was used to determine the relative abrasiveness of this NCAW. The DST 102-AZ core sample consisted of two segments. The core samples were transported to the Pacific Northwest Laboratory (PNL), where the waste was extruded from its sampler and extensively characterized. A characterization plan was followed that simulated the processing of the NCAW samples through retrieval, pretreatment and vitrification process steps. Physical, rheological, chemical and radiochemical properties were measured throughout the process steps. The characterization of the first core sample from DST 101-AZ was completed, and the results are provided in this report. The results for the other core characterizations will be reported in future reports. 3 refs., 13 figs., 10 tabs
Choice, internal consistency, and rationality
Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu
2010-01-01
The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...
Intelligent sampling for the measurement of structured surfaces
International Nuclear Information System (INIS)
Wang, J; Jiang, X; Blunt, L A; Scott, P J; Leach, R K
2012-01-01
Uniform sampling in metrology has known drawbacks such as coherent spectral aliasing and a lack of efficiency in terms of measuring time and data storage. The requirement for intelligent sampling strategies has been outlined over recent years, particularly where the measurement of structured surfaces is concerned. Most of the present research on intelligent sampling has focused on dimensional metrology using coordinate-measuring machines with little reported on the area of surface metrology. In the research reported here, potential intelligent sampling strategies for surface topography measurement of structured surfaces are investigated by using numerical simulation and experimental verification. The methods include the jittered uniform method, low-discrepancy pattern sampling and several adaptive methods which originate from computer graphics, coordinate metrology and previous research by the authors. By combining the use of advanced reconstruction methods and feature-based characterization techniques, the measurement performance of the sampling methods is studied using case studies. The advantages, stability and feasibility of these techniques for practical measurements are discussed. (paper)
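Of the strategies listed above, jittered uniform sampling is the easiest to sketch: one point placed uniformly at random inside each cell of a regular grid, which breaks the coherent aliasing of a strictly uniform grid while keeping coverage even. This is a generic illustration, not the authors' measurement code:

```python
import random

def jittered_grid(n_x, n_y, seed=0):
    """Jittered uniform sampling of the unit square: one random point
    per cell of an n_x-by-n_y grid."""
    rng = random.Random(seed)
    points = []
    for i in range(n_x):
        for j in range(n_y):
            # Cell (i, j) spans [i/n_x, (i+1)/n_x) x [j/n_y, (j+1)/n_y).
            points.append(((i + rng.random()) / n_x,
                           (j + rng.random()) / n_y))
    return points

pts = jittered_grid(8, 8)  # 64 sample positions for a structured surface
```

For a real surface measurement the unit square would be scaled to the instrument's field of view, and the adaptive methods in the paper would instead concentrate points near features of interest.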
International Nuclear Information System (INIS)
Nishina, Kojiro; Nishihara, Hideaki; Mishima, Kaichiro
1994-12-01
This meeting was held on March 4, 1993. Since the first power generation with the JPDR and the initial criticality of the KUR, 30 years, and since the initial criticality of the KUCA, 20 years have elapsed. The researchers in universities have contributed greatly to the research and education of atomic energy, but the perspective of leading the world hereafter in this field is very uncertain. This study meeting was held to seek the way to make the proper contribution. In the meeting, lectures were given on Japanese policy on nuclear fuel cycle, the present state of the upstream research and the downstream research in Japan, the experimental plan in NUCEF, the present state of the researches on TRU decay heat data and TRU nucleus data, the present state of the experimental researches at KUCA and at FCA, the present state of the research on the heat removal from high conversion LWRs and the KUR, the present state of the research on radioactive waste treatment, and the present state of TRU chemical research. The record of the holding of this study meeting is added. (K.I.)
Integrity of the Human Faecal Microbiota following Long-Term Sample Storage.
Directory of Open Access Journals (Sweden)
Elahe Kia
Full Text Available In studies of the human microbiome, faecal samples are frequently used as a non-invasive proxy for the study of the intestinal microbiota. To obtain reliable insights, the need for bacterial DNA of high quality and integrity following appropriate faecal sample collection and preservation steps is paramount. In a study of dietary mineral balance in the context of type 2 diabetes (T2D), faecal samples were collected from healthy and T2D individuals throughout a 13-day residential trial. These samples were freeze-dried, then stored mostly at -20°C from the trial date in 2000/2001 until the current research in 2014. Given the relative antiquity of these samples (~14 years), we sought to evaluate DNA quality and comparability to freshly collected human faecal samples. Following the extraction of bacterial DNA, gel electrophoresis indicated that our DNA extracts were more sheared than extracts made from freshly collected faecal samples, but still of sufficiently high molecular weight to support amplicon-based studies. Likewise, spectrophotometric assessment of extracts revealed that they were of high quality and quantity. A subset of bacterial 16S rRNA gene amplicons were sequenced using Illumina MiSeq and compared against publicly available sequence data representing a similar cohort analysed by the American Gut Project (AGP). Notably, our bacterial community profiles were highly consistent with those from the AGP data. Our results suggest that when faecal specimens are stored appropriately, the microbial profiles are preserved and robust to extended storage periods.
Original Research Original Research
African Journals Online (AJOL)
RAGHAVENDRA
gen. The aim of the study was to determine effect of phorus, and sulphur fertilizers on growth, yield, yield of the garlic crop on the two soil types in the study area tments consisted of three .... Soil samples were analyzed at Debre Z. Agricultural .... valued at an average open market price of 10 birr kg-1, cost of Triple Super ...
Problematic Internet use in a sample of Colombian university students
Directory of Open Access Journals (Sweden)
Diana Ximena Puerta-Cortés
2013-06-01
Full Text Available The Internet is a tool that facilitates the development of academic and social activities, business and entertainment. However, particular behaviors may arise in relation to its overuse. This research aims to identify sociodemographic characteristics and the type of Internet use in a sample of Colombian university students and to relate them to possible problematic use. The sample consisted of 595 students from the University of Ibagué, 16-34 years of age, who completed all three sections of the questionnaire: (1) socio-demographic data, (2) Internet usage information and (3) an adapted version of the Internet Addiction Test (IAT; Young, 1998a). The results showed two groups, one with controlled use of the Internet (88%) and one with problematic use (12%); only one case showed addictive use. Problematic Internet use was related to the number of hours spent on social networks, chat, sites with adult content and movies. The use of these Internet applications generated interference in daily activities.
Consistent histories and operational quantum theory
International Nuclear Information System (INIS)
Rudolph, O.
1996-01-01
In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail
International Nuclear Information System (INIS)
Johnson, S. G.; Keiser, D. D.; Frank, S. M.; DiSanto, T.; Noy, M.
1999-01-01
Argonne National Laboratory is developing an electrometallurgical treatment for spent fuel from the experimental breeder reactor II. A product of this treatment process is a metal waste form that incorporates the stainless steel cladding hulls, zirconium from the fuel and the fission products that are noble to the process, i.e., Tc, Ru, Nb, Pd, Rh, Ag. The nominal composition of this waste form is stainless steel / 15 wt% zirconium / 1-4 wt% noble metal fission products / 1-2 wt% U. Leaching results are presented from several tests and sample types: (1) 2 week monolithic immersion tests on actual metal waste forms produced from irradiated cladding hulls, (2) long term (>2 years) pulsed flow tests on samples containing technetium and uranium and (3) crushed sample immersion tests on cold simulated metal waste form samples. The test results will be compared and their relevance for waste form product consistency testing discussed
International Nuclear Information System (INIS)
Saraiva, Fabio Petersen; Costa, Patricia Grativol; Inomata, Daniela Lumi; Melo, Carlos Sergio Nascimento; Helal Junior, John; Nakashima, Yoshitaka
2007-01-01
Objectives: To investigate optical coherence tomography consistency on foveal thickness, foveal volume, and macular volume measurements in patients with and without diffuse diabetic macular edema. Introduction: Optical coherence tomography represents an objective technique that provides cross-sectional tomographs of retinal structure in vivo. However, it is expected that poor fixation ability, as seen in diabetic macular edema, could alter its results. Several authors have discussed the reproducibility of optical coherence tomography, but only a few have addressed the topic with respect to diabetic maculopathy. Methods: The study recruited diabetic patients without clinically evident retinopathy (control group) and with diffuse macular edema (case group). Only one eye of each patient was evaluated. Five consecutive fast macular scans were taken using Ocular Coherence Tomography 3; the 6 mm macular map was chosen. The consistency in measurements of foveal thickness, foveal volume, and total macular volume for both groups was evaluated using Pearson's coefficient of variation. The t-test for independent samples was used in order to compare measurements of both groups. Results: Each group consisted of 20 patients. All measurements had a coefficient of variation less than 10%. The most consistent parameter for both groups was the total macular volume. Discussion: Consistency in measurement is a mainstay of any test. A test is unreliable if its measurements cannot be correctly repeated. We found a good index of consistency, even considering patients with an unstable gaze. Conclusions: Optical coherence tomography is a consistent method for diabetic subjects with diffuse macular edema. (author)
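The repeatability index used above, Pearson's coefficient of variation across the five repeated scans, is a one-line computation. The thickness readings below are invented for illustration; the study's own data are not reproduced here:

```python
from statistics import mean, stdev

def coefficient_of_variation(measurements):
    """Pearson's coefficient of variation, in percent: the ratio of the
    sample standard deviation to the mean of repeated measurements."""
    return 100.0 * stdev(measurements) / mean(measurements)

# Hypothetical foveal-thickness readings (micrometres) from five
# consecutive scans of one eye.
scans = [512, 498, 505, 520, 509]
print(round(coefficient_of_variation(scans), 2))  # well under the 10% ceiling
```

A CV under 10%, as reported for every parameter in the study, means the scan-to-scan spread is small relative to the measured value even when fixation is unstable.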
International Nuclear Information System (INIS)
Shepard, J.R.
1991-01-01
The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data
DEFF Research Database (Denmark)
Thomsen, Christa; Nielsen, Anne Ellerup
2006-01-01
This chapter first outlines theory and literature on CSR and Stakeholder Relations focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...
Kobayashi, Eriko; Sakurada, Tomoya; Ueda, Shiro; Satoh, Nobunori
2011-05-01
To assess the attitude of Japanese patients towards pharmacogenomics research and a DNA bank for identifying genomic markers associated with adverse drug reactions (ADRs), and their willingness to donate DNA samples, we conducted a survey of 550 male and female patients. The majority of the respondents showed a positive attitude towards pharmacogenomics research (87.6%) and a DNA bank (75.1%). The willingness to donate DNA samples when experiencing severe ADRs (55.8%) was higher than when taking medications (40.4%). Positive attitudes towards a DNA bank and organ donation were significantly associated with an increased willingness to donate. Though the level of positive attitude in the patient population was higher than that in the general public in our former study (81.0 and 70.4%, respectively), patients' willingness to donate (40.4% when taking medications and 55.8% when experiencing severe ADRs) was lower than that of the general public in our former study (45.3 and 61.7%). The results suggested that the level of true willingness in the patient population was lower than that of the general public considering the fictitious situation presented to the public (supposing that they were patients receiving medication). It is important to assess the willingness of patients, who are the true potential donors, rather than the general public.
Contributions to sampling statistics
Conti, Pier; Ranalli, Maria
2014-01-01
This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...
Consistency analysis of parenting styles in Thailand during children's first year.
Phuphaibul, Rutja; Wittayasooporn, Jariya; Choprapawon, Chanpen
2012-09-01
This descriptive study identifies and examines the consistency of parenting styles during the first year of children's lives. The data were collected by interviewing 4088 parents or primary caretakers of 6-month-old infants during the third wave of data collection of The Prospective Cohort of Thai Children project. The instrument used was the Infant Parenting Styles Questionnaire, developed by the researchers, which reflected parental responses to infant care in five different situations. After the answers were categorized into controlling, reasoning, overprotection, and neglectful parenting styles, the weighted kappa was used for the consistency analysis. The findings revealed that during the first 6 months of life, the overprotection style was the most common, followed by the reasoning style. The controlling and neglectful styles were very seldom used. The consistency of the parenting styles in the same caregivers, assessed with the kappa values, showed that agreement between each of the styles was very low (-0.0419 to 0.0688). This suggests that parenting styles during the first year of life seem to occur in random patterns. © 2012 Wiley Publishing Asia Pty Ltd.
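The weighted kappa used above can be sketched for two ordinal ratings of the same caregivers; linear weights and the example category codes are illustrative assumptions, not the study's exact scoring:

```python
def weighted_kappa(ratings_a, ratings_b, categories):
    """Linearly weighted kappa for two ordinal ratings of the same subjects.

    Computed as 1 - (observed weighted disagreement) / (expected weighted
    disagreement), with weight |i - j| between category ranks i and j.
    """
    k = len(categories)
    index = {c: i for i, c in enumerate(categories)}
    n = len(ratings_a)
    obs = sum(abs(index[a] - index[b])
              for a, b in zip(ratings_a, ratings_b)) / n
    # Marginal category proportions for each rating.
    pa = [sum(a == c for a in ratings_a) / n for c in categories]
    pb = [sum(b == c for b in ratings_b) / n for c in categories]
    exp = sum(pa[i] * pb[j] * abs(i - j)
              for i in range(k) for j in range(k))
    return 1.0 - obs / exp

# Hypothetical styles assigned to five caregivers in two situations.
styles = ["control", "reason", "overprotect", "neglect"]
wave1 = ["overprotect", "reason", "overprotect", "control", "reason"]
wave2 = ["reason", "reason", "overprotect", "neglect", "control"]
print(round(weighted_kappa(wave1, wave2, styles), 3))
```

Kappa near zero, as in this toy example and in the study's -0.0419 to 0.0688 range, means agreement is no better than chance, which is what the authors read as random patterning of styles.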
Directory of Open Access Journals (Sweden)
Mohammad Rahim Kamaluddin
2017-08-01
Full Text Available The Religious Orientation Scale-Revised (ROS-R) has been used increasingly as an important measure in psychology-of-religion research and is widely administered in cross-cultural settings. Unfortunately, no valid and reliable ROS-R is available in the Malay language to assess religious orientations among Malaysians. With that in mind, the present study aims to validate and document the psychometric properties of the Malay-translated ROS-R (henceforth, M-ROS-R) among a sample of Malaysian adults. This study commenced with forward-backward translations, followed by content and face validation. Subsequently, a cross-sectional study was conducted among Malaysian adults (n = 226) using a convenience sampling method for the purpose of construct and factorial validation, which was performed via Exploratory Factor Analysis using Principal Component Analysis with Varimax rotation. Finally, reliability testing was performed to determine the internal consistency of the items using Cronbach's alpha coefficient (α). The factor loading consisted of three factors with a total variance of 64.76%. The final version of M-ROS-R consisted of 14 items: Factor 1 (Intrinsic Orientation) comprised 8 items, Factor 2 (Extrinsic-Socially Oriented) comprised 3 items, and Factor 3 (Extrinsic-Personally Oriented) comprised 3 items. The internal consistency values of the factors ranged between 0.68 and 0.86, indicating that the scale is reliable. The intercorrelations between the factors were also significant. M-ROS-R was concluded to be a valid and reliable scale to measure and assess religious orientations among Malaysians.
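The internal-consistency statistic reported above, Cronbach's alpha, is the standard formula over item variances and total-score variance. The Likert responses below are invented demo data, not the M-ROS-R sample:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha: (k / (k-1)) * (1 - sum of item variances /
    variance of total scores), for k items.

    item_scores: one list of respondent scores per item.
    """
    k = len(item_scores)
    item_var = sum(variance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Three hypothetical Likert items answered by five respondents.
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [5, 5, 2, 4, 1]]
print(round(cronbach_alpha(items), 2))
```

Values in the study's 0.68-0.86 range are conventionally read as acceptable to good reliability; alpha rises when items covary strongly relative to their individual spread.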
Directory of Open Access Journals (Sweden)
Velayati, Ali Akbar; Farnia, Parissa; Mozafari, Mohadese; Mirsaeidi, Mehdi
2015-01-01
Nontuberculous mycobacteria (NTM) are opportunistic pathogens that are widely distributed in the environment. There is a lack of data on species distribution of these organisms from Iran. This study consists of a review of NTM articles published in Iran between the years 1992 and 2014. In this review, 20 articles and 14 case reports were identified. Among the 20 articles, 13 (65%) studies focused on NTM isolates from clinical specimens, 6 (30%) studies examined NTM isolates from environmental samples, and one (5%) article included both clinical and environmental isolates. M. fortuitum (229/997; 23%) was recorded as the most prevalent and rapid growing mycobacteria (RGM) species in both clinical (28%) and environmental (19%) isolated samples (P < 0.05). Among slow growing mycobacteria (SGM), M. simiae (103/494; 21%) demonstrated a higher frequency in clinical samples whereas in environmental samples it was M. flavescens (44/503; 9%). These data represent information from 14 provinces out of 31 provinces of Iran. No information is available in current published data on clinical or environmental NTM from the remaining 17 provinces in Iran. These results emphasize the potential importance of NTM as well as the underestimation of NTM frequency in Iran. NTM is an important clinical problem associated with significant morbidity and mortality in Iran. Continued research is needed from both clinical and environmental sources to help clinicians and researchers better understand and address NTM treatment and prevention.
Directory of Open Access Journals (Sweden)
Mu-hsuan Huang
2001-12-01
This article analyzes the consistency of search-term selection and search content among college and graduate students at National Taiwan University using the PsycLIT CD-ROM database. Thirty-one students conducted pre-assigned searches, performing 59 searches that generated 609 search terms. The study finds that the consistency in search-term selection is 22.14% at the first level and 35% at the second level. These results are similar to those of other studies. As for consistency in search concepts, both the overlap of retrieved articles and the overlap of articles judged relevant are lower than in other studies. [Article content in Chinese]
Correlates for Consistency of Contraceptive Use Among Sexually Active Female Adolescents
Directory of Open Access Journals (Sweden)
Ruey-Hsia Wang
2004-04-01
This study explored the correlates of consistency of contraceptive use among sexually active female adolescents in Kaohsiung County, Taiwan. Overall, 164 female adolescents who had engaged in sexual behavior within the last 6 months and were not pregnant at the time of the study were selected from two vocational high schools in Kaohsiung County, Taiwan. An anonymous questionnaire was used to measure demographic data, contraceptive attitudes, contraceptive knowledge, contraceptive self-efficacy, perception of peers' use of contraceptives, sexual history, and contraceptive use. The results showed that 45.7% of subjects had sex once or more per week, and that 39.6% of subjects always used contraceptives while 15.2% never used contraceptives. Condoms were the most popular contraceptive method (51.2%) and withdrawal was the second most popular (23.8%). Stepwise logistic regression showed that more positive contraceptive attitudes (odds ratio, OR, 1.148) and previous contraceptive education in school (OR, 3.394) increased the probability of consistently using contraceptives, correctly classifying 67.2% of the sample.
Consistency in the World Wide Web
DEFF Research Database (Denmark)
Thomsen, Jakob Grauenkjær
Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable...... and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how...... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.
Towards transparent and consistent exchange of knowledge for improved microbiological food safety
DEFF Research Database (Denmark)
Plaza-Rodrigues, Carolina; Ungaretti Haberbeck, Leticia; Desvignes, Virginie
2017-01-01
Predictive microbial modelling and quantitative microbiological risk assessment, two important and complementary areas within the food safety community, are generating a variety of scientific knowledge (experimental data and mathematical models) and resources (databases and software tools) for the exploitation of this knowledge. However, the application and reusability of this knowledge is still hampered, as access to this knowledge and the exchange of information between databases and software tools are currently difficult and time consuming. To facilitate transparent and consistent knowledge access...... software tools and consistent rules for knowledge annotation. The knowledge repository would be a user-friendly tool to benefit different users within the microbiological food safety community, especially users like risk assessors and managers, model developers and research scientists working......
Optimal consistency in microRNA expression analysis using reference-gene-based normalization.
Wang, Xi; Gardiner, Erin J; Cairns, Murray J
2015-05-01
Normalization of high-throughput molecular expression profiles secures differential expression analysis between samples of different phenotypes or biological conditions, and facilitates comparison between experimental batches. While the same general principles apply to microRNA (miRNA) normalization, there is mounting evidence that global shifts in their expression patterns occur in specific circumstances, which pose a challenge for normalizing miRNA expression data. As an alternative to global normalization, which has the propensity to flatten large trends, normalization against constitutively expressed reference genes presents an advantage through their relative independence. Here we investigated the performance of reference-gene-based (RGB) normalization for differential miRNA expression analysis of microarray expression data, and compared the results with other normalization methods, including: quantile, variance stabilization, robust spline, simple scaling, rank invariant, and Loess regression. The comparative analyses were executed using miRNA expression in tissue samples derived from subjects with schizophrenia and non-psychiatric controls. We proposed a consistency criterion for evaluating methods by examining the overlapping of differentially expressed miRNAs detected using different partitions of the whole data. Based on this criterion, we found that RGB normalization generally outperformed global normalization methods. Thus we recommend the application of RGB normalization for miRNA expression data sets, and believe that this will yield a more consistent and useful readout of differentially expressed miRNAs, particularly in biological conditions characterized by large shifts in miRNA expression.
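Reference-gene-based normalization and the overlap-based consistency criterion described above can both be sketched in a few lines. The reference-gene indices and the Jaccard-style overlap below are illustrative assumptions, not the authors' exact procedure:

```python
def rgb_normalize(log_expr, ref_idx):
    """Reference-gene-based normalization: subtract each sample's mean
    log-expression over the designated reference genes from all of that
    sample's values.

    log_expr: one list of log2 expression values per sample.
    ref_idx : indices of the reference (constitutively expressed) genes.
    """
    normalized = []
    for sample in log_expr:
        ref_mean = sum(sample[i] for i in ref_idx) / len(ref_idx)
        normalized.append([v - ref_mean for v in sample])
    return normalized


def consistency(de_a, de_b):
    """Overlap criterion: fraction of differentially expressed miRNAs
    shared between two analyses run on different data partitions."""
    a, b = set(de_a), set(de_b)
    return len(a & b) / len(a | b)
```

Under this criterion, a normalization method scores higher when the DE lists from different partitions of the same data set agree more strongly.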
Gusev, Alexander; Vasyukova, Inna; Zakharova, Olga; Altabaeva, Yuliya; Saushkin, Nikolai; Samsonova, Jeanne; Kondakov, Sergey; Osipov, Alexander; Snegin, Eduard
2017-11-01
The aim of the proposed research is to study the applicability of fiberglass porous membrane materials in a new strip format for dried blood storage in food industry monitoring. A comparative analysis of cellulosic and fiberglass porous membrane materials was carried out for obtaining dried samples of serum or blood and for the possibility of further species-specific analysis. Blood samples of Sus scrofa were used to study the comparative effectiveness of cellulose and fiberglass porous membrane carriers for long-term biomaterial storage allowing for further DNA detection by the real-time polymerase chain reaction (PCR) method. Scanning electron microscopy of the various membranes, native and with blood samples, indicates a fundamental difference in the form of the dried samples. Membranes based on cellulosic materials sorb the components of the biological fluid on the surface of the fibers of their structure, partially penetrating the cellulose fibers, while in the case of glass fiber membranes the components of the biological fluid dry out as films in the pores of the membrane between the structural filaments. This fundamental difference in retention mechanisms affects the rate of dissolution of the components of dry samples and contributes to an increase in the efficiency of sample desorption before subsequent analysis. Pig DNA was detected in every analyzed sample by real-time PCR, and good preservation of the biomaterial on the glass fiber membranes was clearly demonstrated. Good biomaterial preservation was observed on the test cards stored for 4 days as well as for 1 hour.
Consistent guiding center drift theories
International Nuclear Information System (INIS)
Wimmel, H.K.
1982-04-01
Various guiding-center drift theories are presented that are optimized with respect to consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)
Environmental surveillance master sampling schedule
International Nuclear Information System (INIS)
Bisping, L.E.
1994-02-01
This document contains the planned 1994 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP), Drinking Water Project, and Ground-Water Surveillance Project. Samples are routinely collected for the SESP and analyzed to determine the quality of air, surface water, soil, sediment, wildlife, vegetation, foodstuffs, and farm products at Hanford Site and surrounding communities. The responsibility for monitoring the onsite drinking water falls outside the scope of the SESP. The Hanford Environmental Health Foundation is responsible for monitoring the nonradiological parameters as defined in the National Drinking Water Standards while PNL conducts the radiological monitoring of the onsite drinking water. PNL conducts the drinking water monitoring project concurrent with the SESP to promote efficiency and consistency, utilize the expertise developed over the years, and reduce costs associated with management, procedure development, data management, quality control and reporting. The ground-water sampling schedule identifies ground-water sampling events used by PNL for environmental surveillance of the Hanford Site.
Waste sampling and characterization facility (WSCF)
International Nuclear Information System (INIS)
1994-10-01
The Waste Sampling and Characterization Facility (WSCF) complex consists of the main structure (WSCF) and four support structures located in the 600 Area of the Hanford Site, east of the 200 West Area and south of the Hanford Meteorology Station. WSCF is to be used for low-level sample analysis, less than 2 mRem. The laboratory features state-of-the-art analytical and low-level radiological counting equipment for gaseous, soil, and liquid sample analysis. In particular, this facility is to be used to perform Resource Conservation and Recovery Act (RCRA) of 1976 and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) of 1980 sample analysis in accordance with U.S. Environmental Protection Agency protocols, room air and stack monitoring sample analysis, waste water treatment process support, and contractor laboratory quality assurance checks. The samples to be analyzed contain very low concentrations of radioisotopes. The main reason that WSCF is considered a nuclear facility is the storage of samples at the facility. This Maintenance Implementation Plan has been developed for maintenance functions associated with the WSCF.
Temporal consistent depth map upscaling for 3DTV
Schwarz, Sebastian; Sjöström, Mårten; Olsson, Roger
2014-03-01
The ongoing success of three-dimensional (3D) cinema fuels increasing efforts to spread the commercial success of 3D to new markets. The possibilities of a convincing 3D experience at home, such as three-dimensional television (3DTV), has generated a great deal of interest within the research and standardization community. A central issue for 3DTV is the creation and representation of 3D content. Acquiring scene depth information is a fundamental task in computer vision, yet complex and error-prone. Dedicated range sensors, such as the Time-of-Flight camera (ToF), can simplify the scene depth capture process and overcome shortcomings of traditional solutions, such as active or passive stereo analysis. Admittedly, currently available ToF sensors deliver only a limited spatial resolution. However, sophisticated depth upscaling approaches use texture information to match depth and video resolution. At Electronic Imaging 2012 we proposed an upscaling routine based on error energy minimization, weighted with edge information from an accompanying video source. In this article we develop our algorithm further. By adding temporal consistency constraints to the upscaling process, we reduce disturbing depth jumps and flickering artifacts in the final 3DTV content. Temporal consistency in depth maps enhances the 3D experience, leading to a wider acceptance of 3D media content. More content in better quality can boost the commercial success of 3DTV.
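The effect of a temporal consistency constraint can be illustrated, in heavily simplified form, by an exponential filter that damps frame-to-frame depth jumps. This is a generic sketch, not the authors' weighted error-energy minimization; the blending factor `lam` is an illustrative assumption:

```python
def temporally_smooth(depth_frames, lam=0.5):
    """Exponential temporal filter over per-pixel depth values:

        d'_t = (1 - lam) * d_t + lam * d'_(t-1)

    Larger lam gives stronger damping of depth jumps and flicker
    between consecutive frames, at the cost of lag on true motion.
    depth_frames: one flat list of depth values per frame.
    """
    out = [list(depth_frames[0])]        # first frame passes through
    for frame in depth_frames[1:]:
        prev = out[-1]
        out.append([(1 - lam) * d + lam * p for d, p in zip(frame, prev)])
    return out
```

A real system would apply such a constraint jointly with the spatial, edge-weighted upscaling term rather than as a separate post-filter.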
Ren, Jiangtao; Beckner, Matthew A; Lynch, Kyle B; Chen, Huang; Zhu, Zaifang; Yang, Yu; Chen, Apeng; Qiao, Zhenzhen; Liu, Shaorong; Lu, Joann J
2018-05-15
A comprehensive two-dimensional liquid chromatography (LCxLC) system consisting of twelve columns in the second dimension was developed for comprehensive analysis of intact proteins in complex biological samples. The system consisted of an ion-exchange column in the first dimension and twelve reverse-phase columns in the second dimension; all thirteen columns were monolithic and prepared inside 250 µm i.d. capillaries. These columns were assembled together through the use of three valves and an innovative configuration. The effluent from the first dimension was continuously fractionated and sequentially transferred into the twelve second-dimension columns, while the second-dimension separations were carried out in a series of batches (six columns per batch). This LCxLC system was tested first using standard proteins, followed by real-world samples from E. coli. Baseline separation was observed for eleven standard proteins, and hundreds of peaks were observed for the real-world sample analysis. Two-dimensional liquid chromatography, often considered an effective tool for mapping proteins, is seen as laborious and time-consuming when configured offline. Our online LCxLC system with an increased number of second-dimension columns promises to provide a solution to overcome these hindrances. Copyright © 2018 Elsevier B.V. All rights reserved.
Consistency Analysis of Ultrasound Echoes within a Dual Symmetric Path Inspection Framework
Directory of Open Access Journals (Sweden)
VASILE, C.
2015-05-01
Non-destructive ultrasound inspection of metallic structures is a perpetual high-interest area of research because of its well-known benefits in industrial applications, especially from an economic point of view, where detection and localisation of defects in their most initial stages can help maintain high production capabilities for any enterprise. This paper is aimed at providing further validation regarding a new technique for detecting and localising defects in metals, the Matched Filter-based Dual Symmetric Path Inspection (MF-DSPI). This validation consists in demonstrating the consistency of the useful ultrasound echoes within the framework of the MF-DSPI. A description of the MF-DSPI method and the authors' related work are presented in this paper, along with an experimental setup used to obtain the data with which the useful-echo consistency was studied. The four proposed methods are: signal envelope analysis, L2-norm criterion, correlation coefficient criterion and sliding bounding rectangle analysis. The aim of this paper is to verify the useful-echo consistency (with the help of these four approaches), as the MF-DSPI method strongly relies on this feature. The results and their implications are discussed in the latter portion of this study.
An Improved Seabed Surface Sand Sampling Device
Luo, X.
2017-12-01
In marine geology research it is necessary to obtain a sufficient quantity of seabed surface samples, while also ensuring that the samples are in their original state. Currently, there are a number of seabed surface sampling devices available, but we find it very difficult to obtain sand samples using these devices, particularly when dealing with fine sand. Machine-controlled seabed surface sampling devices are also available, but they are generally unable to dive into deeper regions of water. To obtain larger quantities of seabed surface sand samples in their original states, many researchers have tried to improve upon sampling devices, but these efforts have generally produced ambiguous results, in our opinion. To resolve this issue, we have designed an improved and highly effective seabed surface sand sampling device that incorporates the strengths of a variety of sampling devices. It is capable of diving into deep water to obtain fine sand samples and is also suited for use in streams, rivers, lakes and seas with varying levels of depth (up to 100 m). This device can be used for geological mapping, underwater prospecting, geological engineering, and ecological and environmental studies in both marine and terrestrial waters.
[Consistency study of PowerPlex 21 kit and Goldeneye 20A kit and forensic application].
Ren, He; Liu, Ying; Zhang, Qing-Xia; Jiao, Zhang-Ping
2014-06-01
To assess the consistency of genotyping results between the PowerPlex 21 kit and the Goldeneye 20A kit, STR loci were amplified in DNA samples from 205 unrelated individuals of the Beijing Han population, and the typing consistency of the 19 overlapping STR loci was examined. The genetic polymorphism of the D1S1656 locus was also obtained. Typing of all 19 overlapping loci was consistent between the kits. The ratio of peak heights at heterozygous loci in the two kits showed no statistical difference (P > 0.05). The observed heterozygosity of D1S1656 was 0.878, the discrimination power was 0.949, the probability of paternity exclusion was 0.751 for trios and 0.506 for duos, and the polymorphism information content was 0.810. The PowerPlex 21 kit and Goldeneye 20A kit show good consistency, the primer design is reasonable, and the polymorphism of D1S1656 is good. The two kits can be used for human genetic analysis, paternity testing, and individual identification in forensic practice.
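The forensic parameters reported for D1S1656 follow standard definitions. A minimal sketch computing observed heterozygosity and power of discrimination from genotype counts; the counts below are hypothetical, not the Beijing Han data:

```python
def forensic_params(genotype_counts):
    """Observed heterozygosity and power of discrimination for one locus.

    genotype_counts: dict mapping a genotype, given as an (allele, allele)
    tuple, to the number of individuals carrying it.
    Heterozygosity = fraction of individuals with two different alleles.
    Power of discrimination = 1 - sum of squared genotype frequencies.
    """
    n = sum(genotype_counts.values())
    het = sum(c for g, c in genotype_counts.items() if g[0] != g[1]) / n
    pd = 1.0 - sum((c / n) ** 2 for c in genotype_counts.values())
    return het, pd
```

With 205 individuals, the reported values (heterozygosity 0.878, discrimination power 0.949) would come from exactly this kind of tally over the observed D1S1656 genotypes.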
Set Up of an Automatic Water Quality Sampling System in Irrigation Agriculture
Directory of Open Access Journals (Sweden)
Emanuel Heinz
2013-12-01
We have developed a high-resolution automatic sampling system for continuous in situ measurements of stable water isotopic composition and nitrogen solutes along with hydrological information. The system facilitates concurrent monitoring of a large number of water and nutrient fluxes (ground, surface, irrigation and rain water) in irrigated agriculture. For this purpose we couple an automatic sampling system with a Wavelength-Scanned Cavity Ring-Down Spectrometry system (WS-CRDS) for stable water isotope analysis (δ2H and δ18O), a reagentless hyperspectral UV photometer (ProPS) for monitoring nitrate content, and various water level sensors for hydrometric information. The automatic sampling system consists of different sampling stations equipped with pumps, a switch cabinet for valve and pump control, and a computer operating the system. The complete system is operated via internet-based control software, allowing supervision from nearly anywhere. The system is currently set up at the International Rice Research Institute (Los Baños, The Philippines) in a diversified rice growing system to continuously monitor water and nutrient fluxes. Here we present the system's technical set-up and provide initial proof of concept with results for the isotopic composition of different water sources and nitrate values from the 2012 dry season.
Stability of DREEM in a Sample of Medical Students: A Prospective Study
Directory of Open Access Journals (Sweden)
Muhamad Saiful Bahri Yusoff
2012-01-01
Background. Over the last 15 years, the DREEM has been applied in various educational settings to appraise the educational climate. So far, no article has reported its stability in Malaysian medical students. Objective. To determine the stability of the DREEM in measuring educational climate at different times and on different occasions, using a sample of medical students. Methodology. A prospective cohort study was done on 196 first-year medical students. The DREEM was administered to the medical students at four different intervals. Cronbach's alpha and intraclass correlation analysis were applied to measure internal consistency and the level of agreement across the intervals. The analysis was done using SPSS 18. Result. A total of 186 (94.9%) medical students responded completely to the DREEM inventory. The overall Cronbach's alpha value of the DREEM at the four measurements ranged between 0.91 and 0.94. The average Cronbach's alpha values of the five subscales ranged between 0.45 and 0.83. The ICC coefficient value for the DREEM total score was 0.67, and those of its subscales ranged between 0.51 and 0.62. Conclusion. This study supported satisfactory levels of stability and internal consistency of the DREEM in measuring educational climate over multiple observations in a sample of Malaysian medical students. Continued research is required to optimise its psychometric credentials across educational settings.
Factorial composition of the Aggression Questionnaire: a multi-sample study in Greek adults.
Vitoratou, Silia; Ntzoufras, Ioannis; Smyrnis, Nikolaos; Stefanis, Nicholas C
2009-06-30
The primary aim of the current article was the evaluation of the factorial composition of the Aggression Questionnaire (AQ(29)) in the Greek population. The translated questionnaire was administered to the following three heterogeneous adult samples: a general population sample from Athens, a sample of young male conscripts and a sample of individuals facing problems related to substance use. Factor analysis highlighted a structure similar to the one proposed by Buss and Perry [Buss, A.F., Perry, M., 1992. The Aggression Questionnaire. Journal of Personality and Social Psychology 63, 452-459]. However, the refined 12-item version of Bryant and Smith [Bryant, F.B., Smith, B.D., 2001. Refining the architecture of aggression: a measurement model for the Buss-Perry Aggression Questionnaire. Journal of Research in Personality 35, 138-167] provided a better fit to our data. Therefore, the refined model was implemented in further analysis. Multiple group confirmatory factor analysis was applied in order to assess the variability of the 12-item AQ across gender and samples. The percentage of factor loading invariance between males and females and across the three samples defined above was high (higher than 75%). The reliability (internal consistency) of the scale was satisfactory in all cases. Content validity of the 12-item AQ was confirmed by comparison with the Symptom Check-List 90 Revised.
Hyperspectral band selection based on consistency-measure of neighborhood rough set theory
International Nuclear Information System (INIS)
Liu, Yao; Xie, Hong; Wang, Liguo; Tan, Kezhu; Chen, Yuehua; Xu, Zhen
2016-01-01
Band selection is a well-known approach for reducing dimensionality in hyperspectral imaging. In this paper, a band selection method based on consistency-measure of neighborhood rough set theory (CMNRS) was proposed to select informative bands from hyperspectral images. A decision-making information system was established by the reflection spectrum of soybeans’ hyperspectral data between 400 nm and 1000 nm wavelengths. The neighborhood consistency-measure, which reflects not only the size of the decision positive region, but also the sample distribution in the boundary region, was used as the evaluation function of band significance. The optimal band subset was selected by a forward greedy search algorithm. A post-pruning strategy was employed to overcome the over-fitting problem and find the minimum subset. To assess the effectiveness of the proposed band selection technique, two classification models (extreme learning machine (ELM) and random forests (RF)) were built. The experimental results showed that the proposed algorithm can effectively select key bands and obtain satisfactory classification accuracy. (paper)
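The forward greedy search used above for band selection can be sketched generically: starting from an empty subset, repeatedly add the band that most improves the evaluation function, and stop when no band helps. The `score` callback below stands in for the neighborhood consistency-measure, whose details (and the post-pruning step) are in the paper:

```python
def greedy_forward_select(bands, score):
    """Forward greedy search over a set of candidate bands.

    bands: iterable of band identifiers.
    score: callable mapping a band subset (list) to its evaluation value;
           higher is better (e.g. a consistency-based significance measure).
    """
    selected = []
    best = score(selected)
    remaining = list(bands)
    while remaining:
        # Evaluate adding each remaining band to the current subset
        gains = [(score(selected + [b]), b) for b in remaining]
        top_score, top_band = max(gains)
        if top_score <= best:            # no band improves the criterion
            break
        selected.append(top_band)
        remaining.remove(top_band)
        best = top_score
    return selected
```

The paper's post-pruning strategy would then revisit `selected` and drop bands whose removal does not hurt the criterion, yielding a minimal subset.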
Estimation of sample size and testing power (Part 4).
Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo
2012-01-01
Sample size estimation is necessary for any experimental or survey research. An appropriate estimation of sample size based on known information and statistical knowledge is of great significance. This article introduces methods of sample size estimation for difference tests with data from a one-factor, two-level design, including sample size estimation formulas and their realization via the formulas themselves and via the POWER procedure of SAS software, for both quantitative and qualitative data. In addition, this article presents worked examples, which will guide researchers in implementing the repetition principle during the research design phase.
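For the one-factor, two-level design with quantitative data, the classical normal-approximation formula gives the per-group sample size n = 2(z₁₋α/₂ + z₁₋β)²σ²/δ², where σ is the common standard deviation and δ the difference to detect. A minimal sketch (illustrative parameter values, not taken from the article):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(sigma, delta, alpha=0.05, power=0.80):
    """Per-group n for a two-sided, two-sample comparison of means
    (normal approximation): n = 2 * (z_{1-a/2} + z_{1-b})^2 * s^2 / d^2."""
    z = NormalDist().inv_cdf               # standard normal quantile
    n = 2 * (z(1 - alpha / 2) + z(power)) ** 2 * sigma ** 2 / delta ** 2
    return ceil(n)                         # round up to whole subjects
```

For example, detecting a mean difference of 5 with a common standard deviation of 10 at α = 0.05 and 80% power requires 63 subjects per group; SAS PROC POWER applies a t-based refinement of the same calculation.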
Preserving Geological Samples and Metadata from Polar Regions
Grunow, A.; Sjunneskog, C. M.
2011-12-01
The Office of Polar Programs at the National Science Foundation (NSF-OPP) has long recognized the value of preserving earth science collections due to the inherent logistical challenges and financial costs of collecting geological samples from Polar Regions. NSF-OPP established two national facilities to make Antarctic geological samples and drill cores openly and freely available for research. The Antarctic Marine Geology Research Facility (AMGRF) at Florida State University was established in 1963 and archives Antarctic marine sediment cores, dredge samples and smear slides along with ship logs. The United States Polar Rock Repository (USPRR) at Ohio State University was established in 2003 and archives polar rock samples, marine dredges, unconsolidated materials and terrestrial cores, along with associated materials such as field notes, maps, raw analytical data, paleomagnetic cores, thin sections, microfossil mounts, microslides and residues. The existence of the AMGRF and USPRR helps to minimize redundant sample collecting, lessens the environmental impact of doing polar field work, facilitates field logistics planning, and complies with the data-sharing requirement of the Antarctic Treaty. The USPRR acquires collections through donations from institutions and scientists and then makes these samples available as no-cost loans for research, education and museum exhibits. The AMGRF acquires sediment cores from US-based and international collaborative drilling projects in Antarctica. Destructive research techniques are allowed on the loaned samples, and loan requests are accepted from any accredited scientific institution in the world. Currently, the USPRR has more than 22,000 cataloged rock samples available to scientists from around the world. All cataloged samples are relabeled with a USPRR number, weighed, photographed and measured for magnetic susceptibility. Many aspects of the sample metadata are included in the database, e.g. geographical location, sample
Directory of Open Access Journals (Sweden)
Christian Ramón Hernández Sánchez
2015-01-01
This study investigated the feasibility of obtaining DNA genetic profiles from biological samples of male human blood subjected to temperature and humidity factors. The methodology consisted of sample preparation, DNA extraction, PCR amplification of the genetic marker Amelogenin, and finally DNA sequencing; we determined the rate of effective amplification yielding a complete profile, a partial profile, or no profile for the sample in question. Furthermore, human blood samples with more than eight days of exposure showed lower amplification. Given a scene and the types of samples found there, the information gathered in this research is very useful in helping to establish which samples are viable for laboratory analysis.