WorldWideScience

Sample records for validly selected prior

  1. Validity in assessment of prior learning

    DEFF Research Database (Denmark)

    Wahlgren, Bjarne; Aarkrog, Vibe

    2015-01-01

    … the article discusses the need for specific criteria for assessment. The reliability and validity of the assessment procedures depend on whether the competences are well defined, and whether the teachers are adequately trained for the assessment procedures. Keywords: assessment, prior learning, adult education, vocational training, lifelong learning, validity...

  2. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    The national policies for the education/training of adults are in the 21st century highly influenced by proposals formulated and promoted by the European Union (EU) as well as other transnational players, and this shift in policy making has consequences. One is that ideas which in the past...... would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to grant credits to students who came with experience from working life....

  3. Can natural selection encode Bayesian priors?

    Science.gov (United States)

    Ramírez, Juan Camilo; Marshall, James A R

    2017-08-07

    The evolutionary success of many organisms depends on their ability to make decisions based on estimates of the state of their environment (e.g., predation risk) from uncertain information. These decision problems have optimal solutions and individuals in nature are expected to evolve the behavioural mechanisms to make decisions as if using the optimal solutions. Bayesian inference is the optimal method to produce estimates from uncertain data, thus natural selection is expected to favour individuals with the behavioural mechanisms to make decisions as if they were computing Bayesian estimates in typically experienced environments, although this does not necessarily imply that favoured decision-makers do perform Bayesian computations exactly. Each individual should evolve to behave as if updating a prior estimate of the unknown environment variable to a posterior estimate as it collects evidence. The prior estimate represents the decision-maker's default belief regarding the environment variable, i.e., the individual's default 'worldview' of the environment. This default belief has been hypothesised to be shaped by natural selection and represent the environment experienced by the individual's ancestors. We present an evolutionary model to explore how accurately Bayesian prior estimates can be encoded genetically and shaped by natural selection when decision-makers learn from uncertain information. The model simulates the evolution of a population of individuals that are required to estimate the probability of an event. Every individual has a prior estimate of this probability and collects noisy cues from the environment in order to update its prior belief to a Bayesian posterior estimate with the evidence gained. The prior is inherited and passed on to offspring. Fitness increases with the accuracy of the posterior estimates produced. Simulations show that prior estimates become accurate over evolutionary time. In addition to these 'Bayesian' individuals, we also…
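The inheritance-and-update loop described in this abstract can be sketched as a toy simulation. Everything below is an illustrative assumption (a Beta prior, truncation selection, Gaussian mutation, and all parameter values), not the authors' actual model:

```python
import random

def posterior_mean(a, b, successes, trials):
    # Conjugate Beta-Bernoulli update: Beta(a, b) prior on an event
    # probability, revised after observing noisy binary cues.
    return (a + successes) / (a + b + trials)

def evolve(true_p=0.8, pop_size=200, gens=150, cues=5, seed=1):
    rng = random.Random(seed)
    # Each individual's heritable prior is an (a, b) parameter pair.
    pop = [(rng.uniform(0.5, 5.0), rng.uniform(0.5, 5.0)) for _ in range(pop_size)]
    for _ in range(gens):
        def error(ab):
            a, b = ab
            s = sum(rng.random() < true_p for _ in range(cues))  # noisy cues
            return abs(posterior_mean(a, b, s, cues) - true_p)
        # Fitness rewards accurate posterior estimates (truncation selection).
        survivors = sorted(pop, key=error)[: pop_size // 2]
        # Offspring inherit the parental prior with a small mutation.
        pop = [(max(0.1, a + rng.gauss(0, 0.15)), max(0.1, b + rng.gauss(0, 0.15)))
               for a, b in survivors for _ in range(2)]
    return pop

evolved = evolve()
mean_prior = sum(a / (a + b) for a, b in evolved) / len(evolved)
```

Under these assumptions the population's average prior mean drifts from roughly 0.5 toward the true probability, mirroring the finding that prior estimates become accurate over evolutionary time.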

  4. Valid MR imaging predictors of prior knee arthroscopy

    International Nuclear Information System (INIS)

    Discepola, Federico; Le, Huy B.Q.; Park, John S.; Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L.

    2012-01-01

    To determine whether fibrosis of the medial patellar retinaculum (MPR), lateral patellar retinaculum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPAA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. κ statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p ≥ 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine whether a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)
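As a consistency check (not part of the study itself), the reported PPV, NPV, and accuracy follow from the sensitivity and specificity via Bayes' rule at the second cohort's 50/50 prevalence:

```python
def predictive_values(sens, spec, prevalence):
    """Convert sensitivity/specificity into PPV, NPV, and accuracy
    at a given prevalence (Bayes' rule on the 2x2 table)."""
    tp = sens * prevalence              # true positive fraction
    fp = (1 - spec) * (1 - prevalence)  # false positive fraction
    fn = (1 - sens) * prevalence        # false negative fraction
    tn = spec * (1 - prevalence)        # true negative fraction
    return tp / (tp + fp), tn / (tn + fn), tp + tn

# Second cohort: 50 patients with prior arthroscopy, 50 without -> prevalence 0.5
ppv, npv, accuracy = predictive_values(0.82, 0.72, 0.5)
# ppv ~ 0.75, npv ~ 0.80, accuracy 0.77, matching the reported values
# (small differences reflect rounding in the abstract).
```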

  5. Valid MR imaging predictors of prior knee arthroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Discepola, Federico; Le, Huy B.Q. [McGill University Health Center, Jewish General Hospital, Division of Musculoskeletal Radiology, Montreal, Quebec (Canada); Park, John S. [Annapolis Radiology Associates, Division of Musculoskeletal Radiology, Annapolis, MD (United States); Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L. [University of California San Diego (UCSD), Division of Musculoskeletal Radiology, San Diego, CA (United States)

    2012-01-15

    To determine whether fibrosis of the medial patellar retinaculum (MPR), lateral patellar retinaculum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPAA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. κ statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p ≥ 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine whether a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)

  6. Bayesian genomic selection: the effect of haplotype lengths and priors

    DEFF Research Database (Denmark)

    Villumsen, Trine Michelle; Janss, Luc

    2009-01-01

    Breeding values for animals with marker data are estimated using a genomic selection approach where data is analyzed using Bayesian multi-marker association models. Fourteen model scenarios with varying haplotype lengths, hyperparameters, and prior distributions were compared to find the scenario ...

  7. Robust Object Tracking Using Valid Fragments Selection.

    Science.gov (United States)

    Zheng, Jin; Li, Bo; Tian, Peng; Luo, Gang

    Local features are widely used in visual tracking to improve robustness in cases of partial occlusion, deformation and rotation. This paper proposes a local fragment-based object tracking algorithm. Unlike many existing fragment-based algorithms that allocate weights to each fragment, this method first defines discrimination and uniqueness for each local fragment, and builds an automatic pre-selection of useful fragments for tracking. Then, a Harris-SIFT filter is used to choose the currently valid fragments, excluding occluded or highly deformed fragments. Based on those valid fragments, a fragment-based color histogram provides a structured and effective description of the object. Finally, the object is tracked using a valid fragment template combining the displacement constraint and similarity of each valid fragment. The object template is updated by fusing feature similarity and valid fragments, which is scale-adaptive and robust to partial occlusion. The experimental results show that the proposed algorithm is accurate and robust in challenging scenarios.

  8. Developing Conceptual Understanding of Natural Selection: The Role of Interest, Efficacy, and Basic Prior Knowledge

    Science.gov (United States)

    Linnenbrink-Garcia, Lisa; Pugh, Kevin J.; Koskey, Kristin L. K.; Stewart, Victoria C.

    2012-01-01

    Changes in high school students' (n = 94) conceptions of natural selection were examined as a function of motivational beliefs (individual interest, academic self-efficacy), basic prior knowledge, and gender across three assessments (pre, post, follow-up). Results from variable-centered analyses suggested that these variables had relatively little…

  9. P2-7: Encoding of Graded Changes in Validity of Spatial Priors in Human Visual Cortex

    Directory of Open Access Journals (Sweden)

    Yuko Hara

    2012-10-01

    If the spatial validity of prior information is varied systematically, does human behavioral performance improve in a graded fashion, and if so, does visual cortex represent the probability directly? Cortical activity was measured with fMRI while subjects performed a contrast-discrimination task in which the spatial validity of a prior cue for target location was systematically varied. Subjects viewed four sinusoidal gratings (randomized contrasts of 12.5, 25, and 50%) shown in discrete visual quadrants and presented twice. The contrast in one location (the target) was incremented in one of the two presentations. Subjects reported with a button press which presentation contained the greater contrast. The target grating was signaled in advance by a cue which varied in spatial validity: at trial onset, small lines pointed to four, two, or one of the possible target locations, thus indicating the target with 25, 50, or 100% probability. Behavioral performance was 2.1 and 3.3 times better in the 100% probability condition than in the 50% and 25% conditions, respectively (p < .001, ANOVA). Unlike behavioral performance, cortical activity in early visual areas showed the same increase in response amplitude for cued versus uncued stimuli for both 100% and 50% probability (V1-V4 and V3A, all p < .18, Student's t-test; the 25% condition had no uncued stimuli). How could behavioral performance improve in a graded fashion if cortical activity showed the same effect for different probabilities? A model of efficient selection in which V1 responses were pooled according to their magnitude rather than as a simple average explained the observations (AIC difference = −15).

  10. Validation of Ulchin Units 1, 2 CONTEMPT Model Prior to the Production of EQ Envelope Curve

    International Nuclear Information System (INIS)

    Hwang, Su Hyun; Kim, Min Ki; Hong, Soon Joon; Lee, Byung Chul; Suh, Jeong Kwan; Lee, Jae Yong; Song, Dong Soo

    2010-01-01

    The Ulchin Units 1 and 2 will be refurbished with RSG (Replacement of Steam Generator) and PU (Power Uprate). The current EQ (Environmental Qualification) envelope curve should be modified according to the RSG and PU. The containment P/T (Pressure/Temperature) analysis in the Ulchin Units 1, 2 FSAR was done using the EDF computer program PAREO6. PAREO6 uses the same assumptions as the US NRC CONTEMPT program, and the results given by both programs are in good agreement. It is used to determine pressure and temperature variations in a PWR containment subsequent to a reactor coolant or secondary system pipe break. However, PAREO6 cannot be used for the production of the EQ envelope curve, so the CONTEMPT code should be used instead. It is essential to validate the CONTEMPT OSG (Original Steam Generator) model prior to the production of the EQ envelope curve considering the RSG and PU. This study has been performed to validate the CONTEMPT model of Ulchin Units 1 and 2 by comparing the CONTEMPT results with the PAREO6 results in the Ulchin Units 1, 2 FSAR.
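Code-to-code validation of this kind typically reduces to comparing the two computed transients point by point. A minimal sketch with hypothetical numbers (the actual acceptance criteria for CONTEMPT vs. PAREO6 are not given in the abstract):

```python
def max_relative_deviation(reference, candidate):
    """Largest pointwise relative deviation between two transients
    (e.g., containment pressure curves) sampled at the same times."""
    return max(abs(c - r) / abs(r) for r, c in zip(reference, candidate))

# Hypothetical peak-region pressure samples (bar) from each code:
pareo6_p   = [1.00, 2.50, 3.20, 2.80]
contempt_p = [1.02, 2.45, 3.25, 2.75]
deviation = max_relative_deviation(pareo6_p, contempt_p)  # worst-case mismatch
```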

  11. Validation of Ulchin Units 1, 2 CONTEMPT Model Prior to the Production of EQ Envelope Curve

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Su Hyun; Kim, Min Ki; Hong, Soon Joon; Lee, Byung Chul [FNC Technology Co., SNU, Seoul (Korea, Republic of); Suh, Jeong Kwan; Lee, Jae Yong; Song, Dong Soo [KEPCO Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    The Ulchin Units 1 and 2 will be refurbished with RSG (Replacement of Steam Generator) and PU (Power Uprate). The current EQ (Environmental Qualification) envelope curve should be modified according to the RSG and PU. The containment P/T (Pressure/Temperature) analysis in the Ulchin Units 1, 2 FSAR was done using the EDF computer program PAREO6. PAREO6 uses the same assumptions as the US NRC CONTEMPT program, and the results given by both programs are in good agreement. It is used to determine pressure and temperature variations in a PWR containment subsequent to a reactor coolant or secondary system pipe break. However, PAREO6 cannot be used for the production of the EQ envelope curve, so the CONTEMPT code should be used instead. It is essential to validate the CONTEMPT OSG (Original Steam Generator) model prior to the production of the EQ envelope curve considering the RSG and PU. This study has been performed to validate the CONTEMPT model of Ulchin Units 1 and 2 by comparing the CONTEMPT results with the PAREO6 results in the Ulchin Units 1, 2 FSAR.

  12. Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics

    Science.gov (United States)

    Abe, Sumiyoshi

    2014-11-01

    The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.

  13. Gamma prior distribution selection for Bayesian analysis of failure rate and reliability

    International Nuclear Information System (INIS)

    Waler, R.A.; Johnson, M.M.; Waterman, M.S.; Martz, H.F. Jr.

    1977-01-01

    It is assumed that the phenomenon under study is such that the time-to-failure may be modeled by an exponential distribution with failure-rate parameter, lambda. For Bayesian analyses of the assumed model, the family of gamma distributions provides conjugate prior models for lambda. Thus, an experimenter needs to select a particular gamma model to conduct a Bayesian reliability analysis. The purpose of this paper is to present a methodology which can be used to translate engineering information, experience, and judgment into a choice of a gamma prior distribution. The proposed methodology assumes that the practicing engineer can provide percentile data relating to either the failure rate or the reliability of the phenomenon being investigated. For example, the methodology will select the gamma prior distribution which conveys an engineer's belief that the failure rate, lambda, simultaneously satisfies the probability statements, P(lambda less than 1.0 x 10^-3) = 0.50 and P(lambda less than 1.0 x 10^-5) = 0.05. That is, two percentiles provided by an engineer are used to determine a gamma prior model which agrees with the specified percentiles. For those engineers who prefer to specify reliability percentiles rather than the failure-rate percentiles illustrated above, one can use the induced negative-log gamma prior distribution which satisfies the probability statements, P(R(t_0) less than 0.99) = 0.50 and P(R(t_0) less than 0.99999) = 0.95 for some operating time t_0. Also, the paper includes graphs for selected percentiles which assist an engineer in applying the methodology
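Conjugacy is what makes the gamma family convenient here: a Gamma(alpha, beta) prior on an exponential failure rate has a closed-form posterior. A minimal sketch with hypothetical prior and data values (the paper's percentile-matching procedure itself is not reproduced):

```python
def gamma_posterior(alpha, beta, n_failures, total_time):
    """Conjugate update for a Gamma(alpha, beta) prior on the failure
    rate of exponential time-to-failure data (rate parameterization,
    so the prior mean is alpha / beta)."""
    return alpha + n_failures, beta + total_time

# Hypothetical prior encoding a belief of about 1e-3 failures/hour:
a0, b0 = 2.0, 2000.0
# Observe 3 failures over 5000 cumulative operating hours:
a1, b1 = gamma_posterior(a0, b0, n_failures=3, total_time=5000.0)
post_mean = a1 / b1  # posterior mean failure rate
```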

  14. Gamma prior distribution selection for Bayesian analysis of failure rate and reliability

    International Nuclear Information System (INIS)

    Waller, R.A.; Johnson, M.M.; Waterman, M.S.; Martz, H.F. Jr.

    1976-07-01

    It is assumed that the phenomenon under study is such that the time-to-failure may be modeled by an exponential distribution with failure rate lambda. For Bayesian analyses of the assumed model, the family of gamma distributions provides conjugate prior models for lambda. Thus, an experimenter needs to select a particular gamma model to conduct a Bayesian reliability analysis. The purpose of this report is to present a methodology that can be used to translate engineering information, experience, and judgment into a choice of a gamma prior distribution. The proposed methodology assumes that the practicing engineer can provide percentile data relating to either the failure rate or the reliability of the phenomenon being investigated. For example, the methodology will select the gamma prior distribution which conveys an engineer's belief that the failure rate lambda simultaneously satisfies the probability statements, P(lambda less than 1.0 x 10^-3) equals 0.50 and P(lambda less than 1.0 x 10^-5) equals 0.05. That is, two percentiles provided by an engineer are used to determine a gamma prior model which agrees with the specified percentiles. For those engineers who prefer to specify reliability percentiles rather than the failure rate percentiles illustrated above, it is possible to use the induced negative-log gamma prior distribution which satisfies the probability statements, P(R(t_0) less than 0.99) equals 0.50 and P(R(t_0) less than 0.99999) equals 0.95, for some operating time t_0. The report also includes graphs for selected percentiles which assist an engineer in applying the procedure. 28 figures, 16 tables

  15. Modeling and validating Bayesian accrual models on clinical data and simulations using adaptive priors.

    Science.gov (United States)

    Jiang, Yu; Simon, Steve; Mayo, Matthew S; Gajewski, Byron J

    2015-02-20

    Slow recruitment in clinical trials leads to increased costs and resource utilization, which includes both clinic staff and patient volunteers. Careful planning and monitoring of the accrual process can prevent the unnecessary loss of these resources. We propose two hierarchical extensions to the existing Bayesian constant accrual model: the accelerated prior and the hedging prior. The proposed priors are able to adaptively combine the researcher's previous experience with current accrual data to produce an estimate of the trial completion time. The performance of these models, including prediction precision, coverage probability, and correct decision-making ability, is evaluated using actual studies from our cancer center and simulations. The results showed that a constant accrual model with strongly informative priors is very accurate when accrual is on target or slightly off, producing a smaller mean squared error, a high percentage of coverage, and a high number of correct decisions as to whether or not to continue the trial, but it is strongly biased when off target. Flat or weakly informative priors provide protection against an off-target prior but are less efficient when the accrual is on target. The accelerated prior performs similarly to a strong prior. The hedging prior performs much like the weak priors when the accrual is extremely off target but closer to the strong priors when the accrual is on target or only slightly off target. We suggest improvements in these models and propose new models for future research. Copyright © 2014 John Wiley & Sons, Ltd.
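A constant-accrual prediction can be sketched as a conjugate Gamma-Poisson blend of the planned rate with the observed enrollment. The function and all numbers below are illustrative; they are not the authors' accelerated or hedging priors:

```python
def predicted_completion(n_target, n_observed, t_elapsed, prior_rate, prior_weight):
    """Point prediction of trial completion time under constant accrual
    with a Gamma prior on the rate; prior_weight is the number of
    'pseudo-months' of data the prior is worth (near 0 = weak prior)."""
    post_rate = (prior_weight * prior_rate + n_observed) / (prior_weight + t_elapsed)
    return t_elapsed + (n_target - n_observed) / post_rate

# Planned 4 subjects/month, but only 20 of 120 enrolled after 10 months:
strong = predicted_completion(120, 20, 10.0, prior_rate=4.0, prior_weight=10.0)
weak   = predicted_completion(120, 20, 10.0, prior_rate=4.0, prior_weight=0.5)
# The strong prior pulls the estimate toward the optimistic plan, while the
# weak prior tracks the slow observed accrual and predicts a later finish.
```

This reproduces the trade-off the abstract describes: strong priors are efficient when accrual matches expectations but biased when it does not, and weak priors behave the other way around.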

  16. Validation of elk resource selection models with spatially independent data

    Science.gov (United States)

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  17. Selective Mutism Questionnaire: Measurement Structure and Validity

    Science.gov (United States)

    Letamendi, Andrea M.; Chavira, Denise A.; Hitchcock, Carla A.; Roesch, Scott C.; Shipon-Blum, Elisa; Stein, Murray B.

    2008-01-01

    The psychometric properties of the Selective Mutism Questionnaire (SMQ) are evaluated using a clinical sample of children with selective mutism (SM). The study shows that SMQ is useful in determining the severity of a child's nonspeaking behaviors, the scope of these behaviors and necessary follow up assessment.

  18. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performance on simulated and benchmark data sets.

  19. TEM validation of immunohistochemical staining prior to assessment of tumour angiogenesis by computerised image analysis

    International Nuclear Information System (INIS)

    Killingsworth, M.C.

    2002-01-01

    Counts of microvessel density (MVD) within solid tumours have been shown to be an independent predictor of outcome, with higher counts generally associated with a worse prognosis. These assessments are commonly performed on immunoperoxidase-stained (IPX) sections, with antibodies to CD34, CD31, and Factor VIII-related antigen routinely used as vascular markers. Tumour vascular density is thought to reflect the demand the growing neoplasm places on its feeding blood supply. Vascular density also appears to be associated with the spread of invasive cells to distant sites. The present study of tumour angiogenesis in prostate cancer specimens aims to assess new vessel growth in addition to MVD counts, the hypothesis being that an assessment which takes into account vascular migration and proliferation as well as the number of patent vessels present may have improved predictive power over assessments based on MVD counts alone. We are employing anti-CD34-stained IPX sections which are digitally photographed and assessed by a computerised image analysis system. Our aim is to develop parameters whereby tumour angiogenesis may be assessed at the light microscopic level and then correlated with existing histological methods of tumour assessment such as Gleason grading. In order to use IPX-stained sections for angiogenic assessment, validation and understanding of the anti-CD34 immunostaining pattern were necessary. This involved the following steps: i) morphological assessment of angiogenic changes present in tumour blood vessels. Morphological changes in endothelial cells and pericytes indicative of angiogenic activation are generally below the level of resolution available with light microscopy. TEM examination revealed endothelial cell budding, pericyte retraction, basement membrane duplication, and endothelial sprout formation in capillaries and venules surrounding tumour glands. This information assisted with the development of parameters by which IPX sections

  20. Validity of selected cardiovascular field-based test among Malaysian ...

    African Journals Online (AJOL)

    In response to the emerging obesity problem among Malaysians, this research was formulated to validate published tests among healthy female adults. Selected tests, namely the 20-meter multi-stage shuttle run, 2.4-km run test, 1-mile walk test, and Harvard step test, were correlated with a laboratory test (Bruce protocol) to find the criterion validity ...

  1. Optimizing radiology peer review: a mathematical model for selecting future cases based on prior errors.

    Science.gov (United States)

    Sheu, Yun Robert; Feder, Elie; Balsim, Igor; Levin, Victor F; Bleicher, Andrew G; Branstetter, Barton F

    2010-06-01

    Peer review is an essential process for physicians because it facilitates improved quality of patient care and continuing physician learning and improvement. However, peer review often is not well received by radiologists, who note that it is time intensive, is subjective, and lacks a demonstrable impact on patient care. Current advances in peer review include the RADPEER system, with its standardization of discrepancies and incorporation of the peer-review process into the PACS itself. The purpose of this study was to build on RADPEER and similar systems by using a mathematical model to optimally select the types of cases to be reviewed, for each radiologist undergoing review, on the basis of the past frequency of interpretive error, the likelihood of morbidity from an error, the financial cost of an error, and the time required for the reviewing radiologist to interpret the study. The investigators compiled 612,890 preliminary radiology reports authored by residents and attending radiologists at a large tertiary care medical center from 1999 to 2004. Discrepancies between preliminary and final interpretations were classified by severity and validated by repeat review of major discrepancies. A mathematical model was then used to calculate, for each author of a preliminary report, the combined morbidity and financial costs of expected errors across 3 modalities (MRI, CT, and conventional radiography) and 4 departmental divisions (neuroradiology, abdominal imaging, musculoskeletal imaging, and thoracic imaging). A customized report was generated for each on-call radiologist that determined the category (modality and body part) with the highest total cost function. A universal total cost based on probability data from all radiologists was also compiled. The use of mathematical models to guide case selection could optimize the efficiency and effectiveness of physician time spent on peer review and produce more concrete and meaningful feedback to radiologists.
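A selection rule of the kind described can be sketched by ranking categories on expected cost of errors per hour of reviewer time, combining the four factors the abstract lists (error frequency, morbidity, financial cost, and interpretation time). The weighting form and all numbers are hypothetical, not the authors' actual cost function:

```python
def rank_for_review(categories):
    """Order modality/body-part categories so that peer-review time is
    spent where the expected cost of errors per reviewer-hour is highest."""
    def expected_cost_rate(c):
        return (c["error_rate"] * (c["morbidity_cost"] + c["financial_cost"])
                / c["review_hours"])
    return sorted(categories, key=expected_cost_rate, reverse=True)

categories = [
    {"name": "neuro MRI", "error_rate": 0.02,
     "morbidity_cost": 50.0, "financial_cost": 10.0, "review_hours": 0.5},
    {"name": "chest radiography", "error_rate": 0.05,
     "morbidity_cost": 5.0, "financial_cost": 2.0, "review_hours": 0.1},
]
top_category = rank_for_review(categories)[0]["name"]
```

With these illustrative numbers, the cheaper-to-review, higher-error category wins even though each individual error costs less.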

  2. Uterine and tubal anatomical abnormalities in infertile women: diagnosis with routine hysterosalpingography prior to selective laparoscopy

    Directory of Open Access Journals (Sweden)

    Mwaffaq Heis

    2011-12-01

    Objective: To assess the findings and usefulness of hysterosalpingography as a routine investigation in the fertility workup prior to selective laparoscopy. Design: Descriptive retrospective study. Setting: A university hospital in the north of Jordan. Subjects: All hysterosalpingographies performed in the period between 1 January and 31 December 2008. Outcome measures: Detection of uterine and fallopian tube abnormalities and their correlation with laparoscopic findings. Results: During the study period, 281 infertile women underwent hysterosalpingography with no post-procedural complications. The mean (SD) age was 31.5 (5.91) years. The mean (SD) duration of infertility was 4 (3.44) years. Infertility was reported as primary by 119 (42.3%) and as secondary by 162 (57.6%). Altogether 281 patients and 562 tubes were examined. Of those, 402 tubes were patent and 160 occluded. There was only one woman in whom peritubal adhesions were diagnosed. Because of hysterosalpingographically diagnosed tubal occlusion, 46 women (16.4%) were referred for laparoscopy. Eight (17.3%) of them were treated with unilateral salpingectomy and 28 (60.8%) with bilateral salpingectomy. Salpingolysis was performed for 7 (15.2%) women, and 3 (6.7%) women had untreatable adhesions. The concordance was 71.7%. The sensitivity of HSG was 80%, the specificity 50%, the negative predictive value 61%, and the positive predictive value 71%. Of the total of 281 women, 30 (10.7%) conceived within 1-11 months after the hysterosalpingography. Conclusions: The very high abnormal predictive value of hysterosalpingography in the diagnosis of tubal occlusion suggests that this procedure could be performed as a screening examination.

  3. On selecting a prior for the precision parameter of Dirichlet process mixture models

    Science.gov (United States)

    Dorazio, R.M.

    2009-01-01

    In hierarchical mixture models the Dirichlet process is used to specify latent patterns of heterogeneity, particularly when the distribution of latent parameters is thought to be clustered (multimodal). The parameters of a Dirichlet process include a precision parameter α and a base probability measure G0. In problems where α is unknown and must be estimated, inferences about the level of clustering can be sensitive to the choice of prior assumed for α. In this paper an approach is developed for computing a prior for the precision parameter α that can be used in the presence or absence of prior information about the level of clustering. This approach is illustrated in an analysis of counts of stream fishes. The results of this fully Bayesian analysis are compared with an empirical Bayes analysis of the same data and with a Bayesian analysis based on an alternative commonly used prior.
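Why the prior on the precision parameter matters is visible from the Chinese restaurant representation of the Dirichlet process, in which the expected number of clusters among n observations is the sum over i from 0 to n-1 of alpha / (alpha + i). A small sketch:

```python
def expected_clusters(alpha, n):
    """Expected number of distinct clusters among n observations drawn
    from a Dirichlet process with precision parameter alpha
    (Chinese restaurant process view)."""
    return sum(alpha / (alpha + i) for i in range(n))

# Inference about clustering is sensitive to alpha: for n = 100,
# a small precision implies few clusters, a large one implies many.
few  = expected_clusters(0.5, 100)
many = expected_clusters(5.0, 100)
```

Because the implied number of clusters grows roughly logarithmically in n but steeply in alpha, a prior that pins alpha too tightly effectively dictates the level of clustering, which is the sensitivity the paper addresses.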

  4. The Selective Mutism Questionnaire: Measurement Structure and Validity

    Science.gov (United States)

    Letamendi, Andrea M.; Chavira, Denise A.; Hitchcock, Carla A.; Roesch, Scott C.; Shipon-Blum, Elisa; Stein, Murray B.

    2010-01-01

    Objective To evaluate the factor structure, reliability, and validity of the 17-item Selective Mutism Questionnaire. Method Diagnostic interviews were administered via telephone to 102 parents of children identified with selective mutism (SM) and 43 parents of children without SM from varying U.S. geographic regions. Children were between the ages of 3 and 11 inclusive and comprised 58% girls and 42% boys. SM diagnoses were determined using the Anxiety Disorders Interview Schedule for Children - Parent Version (ADIS-C/P); SM severity was assessed using the 17-item Selective Mutism Questionnaire (SMQ); and behavioral and affective symptoms were assessed using the Child Behavior Checklist (CBCL). An exploratory factor analysis (EFA) was conducted to investigate the dimensionality of the SMQ and a modified parallel analysis procedure was used to confirm EFA results. Internal consistency, construct validity, and incremental validity were also examined. Results The EFA yielded a 13-item solution consisting of three factors: a) Social Situations Outside of School, b) School Situations, and c) Home and Family Situations. Internal consistency of SMQ factors and total scale ranged from moderate to high. Convergent and incremental validity were also well supported. Conclusions Measure structure findings are consistent with the 3-factor solution found in a previous psychometric evaluation of the SMQ. Results also suggest that the SMQ provides useful and unique information in the prediction of SM phenomenon beyond other child anxiety measures. PMID:18698268
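
    The internal-consistency figures reported for the SMQ factors are of the Cronbach's-alpha kind; a minimal sketch of that computation (the item scores below are hypothetical, not SMQ data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(col) for col in items) / var(totals))

# Three hypothetical items rated by four respondents; near-parallel items
# push alpha toward 1.
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [2, 3, 4, 5]]))
```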

  5. Social validation of vocabulary selection: ensuring stakeholder relevance.

    Science.gov (United States)

    Bornman, Juan; Bryen, Diane Nelson

    2013-06-01

    The vocabulary needs of individuals who are unable to spell their messages continue to be of concern in the field of augmentative and alternative communication (AAC). Social validation of vocabulary selection has been suggested as one way to improve the effectiveness and relevance of service delivery in AAC. Despite increased emphasis on stakeholder accountability, social validation is not frequently used in AAC research. This paper describes an investigation of the social validity of a vocabulary set identified in earlier research. A previous study used stakeholder focus groups to identify vocabulary that could be used by South African adults who use AAC to disclose their experiences as victims of crime or abuse. Another study used this vocabulary to create communication boards for use by adults with complex communication needs. In this current project, 12 South African adults with complex communication needs who use AAC systems used a 5-point Likert scale to score the importance of each of the previously identified 57 vocabulary items. This two-step process of first using stakeholder focus groups to identify vocabulary, and then having literate persons who use AAC provide information on social validity of the vocabulary on behalf of their peers who are illiterate, appears to hold promise as a culturally relevant vocabulary selection approach for sensitive topics such as crime and abuse.

  6. Selective influence of prior allocentric knowledge on the kinesthetic learning of a path.

    Science.gov (United States)

    Lafon, Matthieu; Vidal, Manuel; Berthoz, Alain

    2009-04-01

    Spatial cognition studies have described two main cognitive strategies involved in the memorization of traveled paths in human navigation. One of these strategies uses the action-based memory (egocentric) of the traveled route or paths, which involves kinesthetic memory, optic flow, and episodic memory, whereas the other strategy privileges a survey memory of cartographic type (allocentric). Most studies have dealt with these two strategies separately, but none has tried to show the interaction between them in spite of the fact that we commonly use a map to imagine our journey and then proceed using egocentric navigation. An interesting question is therefore: how does prior allocentric knowledge of the environment affect the egocentric, purely kinesthetic navigation processes involved in human navigation? We designed an experiment in which blindfolded subjects had first to walk and memorize a path with kinesthetic cues only. They had previously been shown a map of the path, which was either correct or distorted (consistent shrinking or growing). The latter transformations were studied in order to observe what influence a distorted prior knowledge could have on spatial mechanisms. After having completed the first learning travel along the path, they had to perform several spatial tasks during the testing phase: (1) pointing towards the origin and (2) to specific points encountered along the path, (3) a free locomotor reproduction, and (4) a drawing of the memorized path. The results showed that prior cartographic knowledge influences the paths drawn and the spatial inference capacity, whereas neither locomotor reproduction nor spatial updating was disturbed. Our results strongly support the notion that (1) there are two independent neural bases underlying these mechanisms: a map-like representation allowing allocentric spatial inferences, and a kinesthetic memory of self-motion in space; and (2) a common use of, or a switching between, these two strategies is

  7. Behavioral Risk Factors: Selected Metropolitan Area Risk Trends (SMART) MMSA Prevalence Data (2010 and Prior)

    Data.gov (United States)

    U.S. Department of Health & Human Services — 2002-2010. BRFSS SMART MMSA Prevalence land line only data. The Selected Metropolitan Area Risk Trends (SMART) project uses the Behavioral Risk Factor Surveillance...

  8. Behavioral Risk Factors: Selected Metropolitan Area Risk Trends (SMART) County Prevalence Data (2010 and prior)

    Data.gov (United States)

    U.S. Department of Health & Human Services — 2002-2010. BRFSS SMART County Prevalence land line only data. The Selected Metropolitan Area Risk Trends (SMART) project uses the Behavioral Risk Factor Surveillance...

  9. Intracranial aneurysm segmentation in 3D CT angiography: Method and quantitative validation with and without prior noise filtering

    International Nuclear Information System (INIS)

    Firouzian, Azadeh; Manniesing, Rashindra; Flach, Zwenneke H.; Risselada, Roelof; Kooten, Fop van; Sturkenboom, Miriam C.J.M.; Lugt, Aad van der; Niessen, Wiro J.

    2011-01-01

    Intracranial aneurysm volume and shape are important factors for predicting rupture risk, for pre-surgical planning and for follow-up studies. To obtain these parameters, manual segmentation can be employed; however, this is a tedious procedure, which is prone to inter- and intra-observer variability. Therefore there is a need for an automated method, which is accurate, reproducible and reliable. This study aims to develop and validate an automated method for segmenting intracranial aneurysms in Computed Tomography Angiography (CTA) data. Also, it is investigated whether prior smoothing improves segmentation robustness and accuracy. The proposed segmentation method is implemented in the level set framework, more specifically Geodesic Active Surfaces, in which a surface is evolved to capture the aneurysmal wall via an energy minimization approach. The energy term is composed of three different image features, namely; intensity, gradient magnitude and intensity variance. The method requires minimal user interaction, i.e. a single seed point inside the aneurysm needs to be placed, based on which image intensity statistics of the aneurysm are derived and used in defining the energy term. The method has been evaluated on 15 aneurysms in 11 CTA data sets by comparing the results to manual segmentations performed by two expert radiologists. Evaluation measures were Similarity Index, Average Surface Distance and Volume Difference. The results show that the automated aneurysm segmentation method is reproducible, and performs in the range of inter-observer variability in terms of accuracy. Smoothing by nonlinear diffusion with appropriate parameter settings prior to segmentation, slightly improves segmentation accuracy.
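
    The evaluation measures used here are straightforward to compute from binary segmentation masks; a small sketch using voxel-coordinate sets (illustrative toy masks, not the study's data):

```python
def similarity_index(a, b):
    """Dice / Similarity Index between voxel sets: 2|A∩B| / (|A| + |B|)."""
    return 2.0 * len(a & b) / (len(a) + len(b))

def volume_difference(a, b, voxel_volume=1.0):
    """Absolute volume difference in voxel-volume units."""
    return abs(len(a) - len(b)) * voxel_volume

# Toy masks: the automated result covers 4 voxels, the manual one 6.
auto = {(x, y, 0) for x in range(2) for y in range(2)}
manual = {(x, y, 0) for x in range(2) for y in range(3)}
print(similarity_index(auto, manual))  # 2*4/(4+6) = 0.8
```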

  10. Integrative analysis to select cancer candidate biomarkers to targeted validation

    Science.gov (United States)

    Heberle, Henry; Domingues, Romênia R.; Granato, Daniela C.; Yokoo, Sami; Canevarolo, Rafael R.; Winck, Flavia V.; Ribeiro, Ana Carolina P.; Brandão, Thaís Bianca; Filgueiras, Paulo R.; Cruz, Karen S. P.; Barbuto, José Alexandre; Poppi, Ronei J.; Minghim, Rosane; Telles, Guilherme P.; Fonseca, Felipe Paiva; Fox, Jay W.; Santos-Silva, Alan R.; Coletta, Ricardo D.; Sherman, Nicholas E.; Paes Leme, Adriana F.

    2015-01-01

    Targeted proteomics has flourished as the method of choice for prospecting for and validating potential candidate biomarkers in many diseases. However, challenges still remain due to the lack of standardized routines that can prioritize a limited number of proteins to be further validated in human samples. To help researchers identify candidate biomarkers that best characterize their samples under study, a well-designed integrative analysis pipeline, comprising MS-based discovery, feature selection methods, clustering techniques, bioinformatic analyses and targeted approaches was performed using discovery-based proteomic data from the secretomes of three classes of human cell lines (carcinoma, melanoma and non-cancerous). Three feature selection algorithms, namely, Beta-binomial, Nearest Shrunken Centroids (NSC), and Support Vector Machine-Recursive Features Elimination (SVM-RFE), indicated a panel of 137 candidate biomarkers for carcinoma and 271 for melanoma, which were differentially abundant between the tumor classes. We further tested the strength of the pipeline in selecting candidate biomarkers by immunoblotting, human tissue microarrays, label-free targeted MS and functional experiments. In conclusion, the proposed integrative analysis was able to pre-qualify and prioritize candidate biomarkers from discovery-based proteomics to targeted MS. PMID:26540631
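
    Of the three feature selectors named, SVM-RFE is the most algorithmic: repeatedly fit, rank features by the magnitude of their weights, and discard the weakest. A minimal sketch of the elimination loop (the covariance-based scorer below is a simple stand-in; real SVM-RFE ranks by the linear SVM weight vector):

```python
def corr_weights(X, y):
    """Stand-in feature scorer: per-feature covariance with the labels.
    (In real SVM-RFE the ranking comes from the linear SVM weights.)"""
    n = len(y)
    ybar = sum(y) / n
    weights = []
    for j in range(len(X[0])):
        xbar = sum(row[j] for row in X) / n
        weights.append(sum((row[j] - xbar) * (yi - ybar)
                           for row, yi in zip(X, y)) / n)
    return weights

def rfe(X, y, n_keep, weight_fn):
    """Recursive feature elimination: refit, rank by |weight|, and drop
    the weakest feature until n_keep original column indices remain."""
    remaining = list(range(len(X[0])))
    while len(remaining) > n_keep:
        w = weight_fn([[row[j] for j in remaining] for row in X], y)
        weakest = min(range(len(remaining)), key=lambda k: abs(w[k]))
        remaining.pop(weakest)
    return remaining
```

`rfe(X, y, 1, corr_weights)` returns the surviving original column indices, so the selected panel can be traced back to the input features.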

  11. Swainson's Thrushes do not show strong wind selectivity prior to crossing the Gulf of Mexico.

    Science.gov (United States)

    Bolus, Rachel T; Diehl, Robert H; Moore, Frank R; Deppe, Jill L; Ward, Michael P; Smolinsky, Jaclyn; Zenzal, Theodore J

    2017-10-27

    During long-distance fall migrations, nocturnally migrating Swainson's Thrushes often stop on the northern Gulf of Mexico coast before flying across the Gulf. To minimize energetic costs, trans-Gulf migrants should stop over when they encounter crosswinds or headwinds, and depart with supportive tailwinds. However, time constrained migrants should be less selective, balancing costs of headwinds with benefits of continuing their migrations. To test the hypotheses that birds select supportive winds and that selectivity is mediated by seasonal time constraints, we examined whether local winds affected Swainson's Thrushes' arrival and departure at Ft. Morgan, Alabama, USA at annual, seasonal, and nightly time scales. Additionally, migrants could benefit from forecasting future wind conditions, crossing on nights when winds are consistently supportive across the Gulf, thereby avoiding the potentially lethal consequences of depleting their energetic reserves over water. To test whether birds forecast, we developed a movement model, calculated to what extent departure winds were predictive of future Gulf winds, and tested whether birds responded to predictability. Swainson's Thrushes were only slightly selective and did not appear to forecast. By following the simple rule of avoiding only the strongest headwinds at departure, Swainson's Thrushes could survive the 1500 km flight between Alabama and Veracruz, Mexico.

  12. Selection, calibration, and validation of models of tumor growth.

    Science.gov (United States)

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth. First, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous, macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes, and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Then representative classes are identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory
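
    The model-ranking step at the heart of OPAL-style selection can be sketched as a normalization over (log) model evidences; this is an illustrative fragment under the assumption that the evidences p(D|M_i) have already been obtained from calibration:

```python
import math

def plausibilities(log_evidences, log_priors=None):
    """Posterior model plausibilities rho_i ∝ p(D|M_i) * pi(M_i),
    normalized stably in log space."""
    if log_priors is None:
        log_priors = [0.0] * len(log_evidences)
    logs = [le + lp for le, lp in zip(log_evidences, log_priors)]
    m = max(logs)                       # subtract max to avoid underflow
    w = [math.exp(l - m) for l in logs]
    total = sum(w)
    return [x / total for x in w]

# Two candidate models: the first has the larger evidence and hence
# the larger plausibility.
rho = plausibilities([-10.0, -12.0])
```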

  13. Investigation of thermal treatment on selective separation of post consumer plastics prior to froth flotation

    International Nuclear Information System (INIS)

    Guney, Ali; Poyraz, M. Ibrahim; Kangal, Olgac; Burat, Firat

    2013-01-01

    Highlights: • Both PET and PVC have nearly the same densities. • The best pH value will be 4 for optimizing pH values. • Malic acid gave the best results for selective separation of PET and PVC. - Abstract: Plastics have become the widely used materials because of their advantages, such as cheapness, endurance, lightness, and hygiene. However, they cause waste and soil pollution and they do not easily decompose. Many promising technologies are being investigated for separating mixed thermoplastics, but they are still uneconomical and unreliable. Depending on their surface characteristics, these plastics can be separated from each other by flotation method which is useful mineral processing technique with its low cost and simplicity. The main objective of this study is to investigate the flotation characteristics of PET and PVC and determine the effect of plasticizer reagents on efficient plastic separation. For that purpose, various parameters such as pH, plasticizer concentration, plasticizer type, conditioning temperature and thermal conditioning were investigated. As a result, PET particles were floated with 95.1% purity and 65.3% efficiency while PVC particles were obtained with 98.1% purity and 65.3% efficiency

  14. Investigation of thermal treatment on selective separation of post consumer plastics prior to froth flotation

    Energy Technology Data Exchange (ETDEWEB)

    Guney, Ali; Poyraz, M. Ibrahim; Kangal, Olgac, E-mail: kangal@itu.edu.tr; Burat, Firat

    2013-09-15

    Highlights: • Both PET and PVC have nearly the same densities. • The best pH value will be 4 for optimizing pH values. • Malic acid gave the best results for selective separation of PET and PVC. - Abstract: Plastics have become the widely used materials because of their advantages, such as cheapness, endurance, lightness, and hygiene. However, they cause waste and soil pollution and they do not easily decompose. Many promising technologies are being investigated for separating mixed thermoplastics, but they are still uneconomical and unreliable. Depending on their surface characteristics, these plastics can be separated from each other by flotation method which is useful mineral processing technique with its low cost and simplicity. The main objective of this study is to investigate the flotation characteristics of PET and PVC and determine the effect of plasticizer reagents on efficient plastic separation. For that purpose, various parameters such as pH, plasticizer concentration, plasticizer type, conditioning temperature and thermal conditioning were investigated. As a result, PET particles were floated with 95.1% purity and 65.3% efficiency while PVC particles were obtained with 98.1% purity and 65.3% efficiency.

  15. Integrated account of method, site selection and programme prior to the site investigation phase

    International Nuclear Information System (INIS)

    2000-12-01

    applications and have these applications reviewed by the appropriate authorities. An analysis of conceivable alternatives for managing and disposing of spent nuclear fuel has confirmed that deep geological disposal according to the KBS-3 method has the best prospects of meeting all requirements. The alternative of putting off a decision until some future time (the zero alternative) does not appear tenable. The assessment of long-term safety shows that the prospects of building a safe deep repository in the Swedish bedrock are good. Independent Swedish and international review of the safety assessment confirm that the body of data in this respect is adequate for the siting process to proceed to the site investigation phase. A fuller summary is given below of the account given in this report of method as well as site selection and programme for the site investigation phase. The point of departure for the account is the review comments made by the regulatory authorities and the Government's decision regarding RD and D-Programme 98. In its decision, the Government stipulated conditions for SKB's continued research and development programme. The analysis of alternative system designs was to be supplemented, mainly with regard to the zero alternative and very deep boreholes. Furthermore, the Government decided that SKB shall submit an integrated evaluation of completed feasibility studies and other background material for selection of sites for site investigations and present a clear programme for site investigations

  16. The chemotherapeutic agent paclitaxel selectively impairs reversal learning while sparing prior learning, new learning and episodic memory.

    Science.gov (United States)

    Panoz-Brown, Danielle; Carey, Lawrence M; Smith, Alexandra E; Gentry, Meredith; Sluka, Christina M; Corbin, Hannah E; Wu, Jie-En; Hohmann, Andrea G; Crystal, Jonathon D

    2017-10-01

    Chemotherapy is widely used to treat patients with systemic cancer. The efficacy of cancer therapies is frequently undermined by adverse side effects that have a negative impact on the quality of life of cancer survivors. Cancer patients who receive chemotherapy often experience chemotherapy-induced cognitive impairment across a variety of domains including memory, learning, and attention. In the current study, the impact of paclitaxel, a taxane derived chemotherapeutic agent, on episodic memory, prior learning, new learning, and reversal learning were evaluated in rats. Neurogenesis was quantified post-treatment in the dentate gyrus of the same rats using immunostaining for 5-Bromo-2'-deoxyuridine (BrdU) and Ki67. Paclitaxel treatment selectively impaired reversal learning while sparing episodic memory, prior learning, and new learning. Furthermore, paclitaxel-treated rats showed decreases in markers of hippocampal cell proliferation, as measured by markers of cell proliferation assessed using immunostaining for Ki67 and BrdU. This work highlights the importance of using multiple measures of learning and memory to identify the pattern of impaired and spared aspects of chemotherapy-induced cognitive impairment. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Prior selection for Gumbel distribution parameters using multiple-try metropolis algorithm for monthly maxima PM10 data

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma

    2014-09-01

    The Multiple-try Metropolis (MTM) algorithm is a new alternative in the field of Bayesian extremes for summarizing the posterior distribution. MTM provides an efficient estimation scheme for modelling extreme data, in terms of both convergence and short burn-in periods. The main objective is to explore the sensitivity of the parameter estimates to changes in the prior and to compare the results with a classical likelihood-based analysis. The focus is on modelling extreme data based on the block maxima approach using the Gumbel distribution. The comparative study between MTM and MLE is illustrated with numerical examples. Several goodness-of-fit tests are computed for selecting the best model. The application is to the monthly maxima PM10 data for Johor state.
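
    A single one-dimensional Multiple-try Metropolis step can be sketched as follows (an illustrative version with a symmetric Gaussian proposal, not the authors' implementation; the standard Gumbel density stands in for the posterior):

```python
import math
import random

def gumbel_logpdf(x, mu=0.0, beta=1.0):
    """Log density of the Gumbel distribution with location mu, scale beta."""
    z = (x - mu) / beta
    return -math.log(beta) - z - math.exp(-z)

def mtm_step(x, logpi, k=5, scale=1.0, rng=random):
    """One Multiple-try Metropolis step (symmetric proposal): draw k
    candidates, pick one with probability proportional to its target
    weight, then balance against k-1 reference draws from the pick."""
    cands = [x + rng.gauss(0.0, scale) for _ in range(k)]
    wc = [math.exp(logpi(c)) for c in cands]
    total = sum(wc)
    if total == 0.0:
        return x
    # Select a candidate y with probability proportional to its weight.
    u, acc, y = rng.random() * total, 0.0, cands[-1]
    for c, w in zip(cands, wc):
        acc += w
        if u <= acc:
            y = c
            break
    # Reference set: k-1 draws from y, plus the current state x.
    refs = [y + rng.gauss(0.0, scale) for _ in range(k - 1)] + [x]
    wr = sum(math.exp(logpi(r)) for r in refs)
    if wr > 0.0 and rng.random() < min(1.0, total / wr):
        return y
    return x
```

Iterating `mtm_step` from an arbitrary start yields a chain whose draws approximate the target after burn-in; the multiple tries per step typically improve mixing at the cost of extra density evaluations.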

  18. Influence of DC plasma modification on the selected properties and the geometrical surface structure of polylactide prior to autocatalytic metallization

    Energy Technology Data Exchange (ETDEWEB)

    Moraczewski, Krzysztof, E-mail: kmm@ukw.edu.pl [Kazimierz Wielki University, Chodkiewicza 30, 85-064 Bydgoszcz (Poland); Rytlewski, Piotr [Kazimierz Wielki University, Chodkiewicza 30, 85-064 Bydgoszcz (Poland); Malinowski, Rafał [Institute for Engineering of Polymer Materials and Dyes, Marii Skłodowskiej-Curie 55, 87-100 Toruń (Poland); Tracz, Adam [Centre for Molecular and Macromolecular Studies of the Polish Academy of Sciences, Sienkiewicza 112, 90-363 Łódź (Poland); Żenkiewicz, Marian [Institute for Engineering of Polymer Materials and Dyes, Marii Skłodowskiej-Curie 55, 87-100 Toruń (Poland)

    2015-03-01

    The paper presents the results of studies to determine the applicability of plasma modification in the process of polylactide (PLA) surface preparation prior to autocatalytic metallization. The polylactide plasma modification was carried out in an oxygen or nitrogen atmosphere. The samples were tested with the following methods: scanning electron microscopy (SEM), atomic force microscopy (AFM), goniometry and X-ray photoelectron spectroscopy (XPS). Scanning electron microscopy and atomic force microscopy images are presented. The results of surface free energy calculations, performed on the basis of the contact angle measurements, are reported. The results of the qualitative (degree of oxidation or nitridation) and quantitative analysis of the chemical composition of the polylactide surface layer are also described. The results of the studies show that DC plasma modification performed under the proposed conditions is suitable as a method of surface preparation for polylactide metallization. - Highlights: • We modified polylactide surface layer with plasma generated in oxygen or nitrogen. • We tested selected properties and surface structure of modified samples. • DC plasma modification can be used to prepare the PLA surface for metallization. • For better results metallization should be preceded by sonication process.

  19. Prior Cocaine Self-Administration Increases Response-Outcome Encoding That Is Divorced from Actions Selected in Dorsal Lateral Striatum.

    Science.gov (United States)

    Burton, Amanda C; Bissonette, Gregory B; Zhao, Adam C; Patel, Pooja K; Roesch, Matthew R

    2017-08-09

    Dorsal lateral striatum (DLS) is a highly associative structure that encodes relationships among environmental stimuli, behavioral responses, and predicted outcomes. DLS is known to be disrupted after chronic drug abuse; however, it remains unclear what neural signals in DLS are altered. Current theory suggests that drug use enhances stimulus-response processing at the expense of response-outcome encoding, but this has mostly been tested in simple behavioral tasks. Here, we investigated what neural correlates in DLS are affected by previous cocaine exposure as rats performed a complex reward-guided decision-making task in which predicted reward value was independently manipulated by changing the delay to or size of reward associated with a response direction across a series of trial blocks. After cocaine self-administration, rats exhibited stronger biases toward higher-value reward and firing in DLS more strongly represented action-outcome contingencies independent from actions subsequently taken rather than outcomes predicted by selected actions (chosen-outcome contingencies) and associations between stimuli and actions (stimulus-response contingencies). These results suggest that cocaine self-administration strengthens action-outcome encoding in rats (as opposed to chosen-outcome or stimulus-response encoding), which abnormally biases behavior toward valued reward when there is a choice between two options during reward-guided decision-making. SIGNIFICANCE STATEMENT Current theories suggest that the impaired decision-making observed in individuals who chronically abuse drugs reflects a decrease in goal-directed behaviors and an increase in habitual behaviors governed by neural representations of response-outcome (R-O) and stimulus-response associations, respectively. We examined the impact that prior cocaine self-administration had on firing in dorsal lateral striatum (DLS), a brain area known to be involved in habit formation and affected by drugs of abuse

  20. Development and Validation of the Conceptual Assessment of Natural Selection (CANS)

    Science.gov (United States)

    Kalinowski, Steven T.; Leonard, Mary J.; Taper, Mark L.

    2016-01-01

    We developed and validated the Conceptual Assessment of Natural Selection (CANS), a multiple-choice test designed to assess how well college students understand the central principles of natural selection. The expert panel that reviewed the CANS concluded its questions were relevant to natural selection and generally did a good job sampling the…

  1. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
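
    The repeated V-fold scheme described here can be sketched generically (the `fit` and `score` callables below are hypothetical placeholders for the tuned learner and its evaluation metric):

```python
import random

def v_fold_splits(n, v, rng):
    """One random V-fold partition of n sample indices."""
    idx = list(range(n))
    rng.shuffle(idx)
    folds = [idx[i::v] for i in range(v)]
    for i in range(v):
        test = folds[i]
        train = [j for k, f in enumerate(folds) if k != i for j in f]
        yield train, test

def repeated_cv_score(fit, score, X, y, v=5, repeats=3, seed=0):
    """Average score over `repeats` independent V-fold partitions;
    repeating reduces the variance contributed by any single split."""
    rng = random.Random(seed)
    scores = []
    for _ in range(repeats):
        for train, test in v_fold_splits(len(y), v, rng):
            model = fit([X[i] for i in train], [y[i] for i in train])
            scores.append(score(model,
                                [X[i] for i in test],
                                [y[i] for i in test]))
    return sum(scores) / len(scores)
```

Nesting, as in the paper's model-assessment algorithm, is obtained by running a second inner loop of `v_fold_splits` over each outer training set to tune parameters before scoring on the held-out outer fold.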

  2. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Directory of Open Access Journals (Sweden)

    Heinz-Martin Süß

    2018-05-01

    Full Text Available The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly

  3. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Science.gov (United States)

    Süß, Heinz-Martin; Kretzschmar, André

    2018-01-01

    The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly utilizes the

  4. Discharge data from 50 selected rivers for GCM validation

    International Nuclear Information System (INIS)

    Duemenil, L.; Isele, K.; Liebscher, H.J.; Schroeder, U.; Schumacher, M.; Wilke, K.

    1993-01-01

    This Technical Report refers to a joint project between GRDC Koblenz and MPI Hamburg. The Global Runoff Data Centre operates under the auspices of WMO at the Federal Institute of Hydrology (Bundesanstalt fuer Gewaesserkunde) in Koblenz. River discharge data of the 50 largest rivers provide an independent data source for the validation of the hydrological cycle in general circulation models. This type of data is particularly valuable, because in some cases the available time series are exceptionally long. The data are presented as time series of annual average discharge (averaged over the period for which data is available, see below for caveats) and as annual cycles of monthly mean discharge averaged over the length of the time series available. (orig./KW)
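    The two derived products described (long-term annual average and mean annual cycle of monthly discharge) reduce to simple averaging. A minimal numpy sketch on an invented station record (not GRDC data):

```python
import numpy as np

# Hypothetical monthly mean discharge record (m^3/s): 10 years x 12 months
rng = np.random.default_rng(1)
months = np.arange(12)
seasonal = 5000 + 2000 * np.sin(2 * np.pi * months / 12)   # idealized regime
record = seasonal + rng.normal(0, 300, size=(10, 12))      # year-to-year noise

annual_mean = record.mean()          # long-term average discharge
annual_cycle = record.mean(axis=0)   # mean annual cycle: 12 monthly means
```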

  5. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Science.gov (United States)

    2010-07-01

    ... EMPLOYMENT OPPORTUNITY, DEPARTMENT OF LABOR 3-UNIFORM GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... techniques contemplated by these guidelines usually should be followed if technically feasible. Where the...

  6. Taxonomic evaluation of selected Ganoderma species and database sequence validation

    Directory of Open Access Journals (Sweden)

    Suldbold Jargalmaa

    2017-07-01

    Full Text Available Species in the genus Ganoderma include several ecologically important and pathogenic fungal species whose medicinal and economic value is substantial. Due to the highly similar morphological features within the Ganoderma, identification of species has relied heavily on DNA sequencing using BLAST searches, which are only reliable if the GenBank submissions are accurately labeled. In this study, we examined 113 specimens collected from 1969 to 2016 from various regions in Korea using morphological features and multigene analysis (internal transcribed spacer, translation elongation factor 1-α, and the second largest subunit of RNA polymerase II). These specimens were identified as four Ganoderma species: G. sichuanense, G. cf. adspersum, G. cf. applanatum, and G. cf. gibbosum. With the exception of G. sichuanense, these species were difficult to distinguish based solely on morphological features. However, phylogenetic analysis at three different loci yielded concordant phylogenetic information, and supported the four species distinctions with high bootstrap support. A survey of over 600 Ganoderma sequences available on GenBank revealed that 65% of sequences were either misidentified or ambiguously labeled. Here, we suggest corrected annotations for GenBank sequences based on our phylogenetic validation and provide updated global distribution patterns for these Ganoderma species.

  7. Taxonomic evaluation of selected Ganoderma species and database sequence validation

    Science.gov (United States)

    Jargalmaa, Suldbold; Eimes, John A.; Park, Myung Soo; Park, Jae Young; Oh, Seung-Yoon

    2017-01-01

    Species in the genus Ganoderma include several ecologically important and pathogenic fungal species whose medicinal and economic value is substantial. Due to the highly similar morphological features within the Ganoderma, identification of species has relied heavily on DNA sequencing using BLAST searches, which are only reliable if the GenBank submissions are accurately labeled. In this study, we examined 113 specimens collected from 1969 to 2016 from various regions in Korea using morphological features and multigene analysis (internal transcribed spacer, translation elongation factor 1-α, and the second largest subunit of RNA polymerase II). These specimens were identified as four Ganoderma species: G. sichuanense, G. cf. adspersum, G. cf. applanatum, and G. cf. gibbosum. With the exception of G. sichuanense, these species were difficult to distinguish based solely on morphological features. However, phylogenetic analysis at three different loci yielded concordant phylogenetic information, and supported the four species distinctions with high bootstrap support. A survey of over 600 Ganoderma sequences available on GenBank revealed that 65% of sequences were either misidentified or ambiguously labeled. Here, we suggest corrected annotations for GenBank sequences based on our phylogenetic validation and provide updated global distribution patterns for these Ganoderma species. PMID:28761785

  8. Validation and selection of ODE based systems biology models: how to arrive at more reliable decisions.

    Science.gov (United States)

    Hasdemir, Dicle; Hoefsloot, Huub C J; Smilde, Age K

    2015-07-08

    Most ordinary differential equation (ODE) based modeling studies in systems biology involve a hold-out validation step for model validation. In this framework a pre-determined part of the data is used as validation data and, therefore, it is not used for estimating the parameters of the model. The model is assumed to be validated if the model predictions on the validation dataset show good agreement with the data. Model selection between alternative model structures can also be performed in the same setting, based on the predictive power of the model structures on the validation dataset. However, drawbacks associated with this approach are usually underestimated. We have carried out simulations by using a recently published High Osmolarity Glycerol (HOG) pathway from S. cerevisiae to demonstrate these drawbacks. We have shown that it is very important how the data is partitioned and which part of the data is used for validation purposes. The hold-out validation strategy leads to biased conclusions, since it can lead to different validation and selection decisions when different partitioning schemes are used. Furthermore, finding sensible partitioning schemes that would lead to reliable decisions is heavily dependent on the biology and unknown model parameters, which turns the problem into a paradox. This creates the need for alternative validation approaches that offer flexible partitioning of the data. For this purpose, we have introduced a stratified random cross-validation (SRCV) approach that successfully overcomes these limitations. SRCV leads to more stable decisions for both validation and selection which are not biased by underlying biological phenomena. Furthermore, it is less dependent on the specific noise realization in the data. Therefore, it proves to be a promising alternative to the standard hold-out validation strategy.
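    The partitioning sensitivity the authors describe can be reproduced in miniature. The sketch below (synthetic data, not the HOG pathway model) compares a single hold-out decision between a simple and an overparameterized model with an average over many random splits, in the spirit — though not the detail — of their SRCV:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 40)
y = 2.0 * t + rng.normal(0, 2.0, t.size)   # true dynamics are linear

def val_rmse(degree, seed):
    # Hold out one third of the points, fit on the rest, score on the held-out set
    perm = np.random.default_rng(seed).permutation(t.size)
    val, train = perm[:13], perm[13:]
    coef = np.polyfit(t[train], y[train], degree)
    pred = np.polyval(coef, t[val])
    return np.sqrt(np.mean((pred - y[val]) ** 2))

# Single hold-out: the winning model can flip with the partition chosen.
single_winners = [1 if val_rmse(1, s) <= val_rmse(5, s) else 5 for s in range(20)]

# Repeated random splits: average validation error first, then decide.
avg = {d: np.mean([val_rmse(d, s) for s in range(200)]) for d in (1, 5)}
best_degree = min(avg, key=avg.get)
```

    Averaged over many splits, the simple (true) model wins; individual hold-out splits need not agree.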

  9. An Update on the Diversity - Validity Dilemma in Personnel Selection: A Review

    Directory of Open Access Journals (Sweden)

    Celina Druart

    2012-12-01

    Full Text Available As globalization increases and labor markets become substantially more diverse, increasing diversity during personnel selection has become a dominant theme in human resource management. However, while trying to pursue this goal, researchers and practitioners find themselves confronted with the diversity-validity dilemma, as some of the most valid selection instruments display considerable ethnic subgroup differences in test performance. The goal of the current paper is twofold. First, we update and review the literature on the diversity-validity dilemma and discuss several strategies that aim to increase diversity without jeopardizing criterion-related validity. Second, we provide researchers and practitioners with evidence-based guidelines for dealing with the dilemma. Additionally, we identify several new avenues for future research.

  10. Validity of a structured method of selecting abstracts for a plastic surgical scientific meeting

    NARCIS (Netherlands)

    van der Steen, LPE; Hage, JJ; Kon, M; Monstrey, SJ

    In 1999, the European Association of Plastic Surgeons accepted a structured method to assess and select the abstracts that are submitted for its yearly scientific meeting. The two criteria used to evaluate whether such a selection method is accurate were reliability and validity. The authors

  11. Procedure for the Selection and Validation of a Calibration Model I-Description and Application.

    Science.gov (United States)

    Desharnais, Brigitte; Camirand-Lemyre, Félix; Mireault, Pascal; Skinner, Cameron D

    2017-05-01

    Calibration model selection is required for all quantitative methods in toxicology and more broadly in bioanalysis. This typically involves selecting the equation order (quadratic or linear) and the weighting factor that correctly model the data. A mis-selection of the calibration model degrades quality control (QC) accuracy, with errors of up to 154%. Unfortunately, simple tools to perform this selection and tests to validate the resulting model are lacking. We present a stepwise, analyst-independent scheme for selection and validation of calibration models. The success rate of this scheme is on average 40% higher than a traditional "fit and check the QC accuracy" method of selecting the calibration model. Moreover, the process was completely automated through a script (available in Supplemental Data 3) running in RStudio (free, open-source software). The need for weighting was assessed through an F-test using the variances of the upper limit of quantification and lower limit of quantification replicate measurements. When weighting was required, the choice between 1/x and 1/x² was determined by calculating which option generated the smallest spread of weighted normalized variances. Finally, model order was selected through a partial F-test. The chosen calibration model was validated through Cramer-von Mises or Kolmogorov-Smirnov normality testing of the standardized residuals. Performance of the different tests was assessed using 50 simulated data sets per possible calibration model (e.g., linear-no weight, quadratic-no weight, linear-1/x, etc.). This first of two papers describes the tests, procedures and outcomes of the developed procedure using real LC-MS-MS results for the quantification of cocaine and naltrexone. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
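    The statistical steps named above — a variance F-test for the need for weighting, weighted fits, and a partial F-test for model order — can be sketched generically. This is an illustration on simulated heteroscedastic calibration data (hypothetical analyte, 5 replicates per level), not the authors' published R script; the choice between 1/x and 1/x² via weighted normalized variances is noted but omitted:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
levels = np.array([1.0, 2.0, 5.0, 10.0, 50.0, 100.0])
x = np.repeat(levels, 5)
# Noise roughly proportional to concentration (heteroscedastic)
y = 0.05 * x * (1 + rng.normal(0, 0.05, x.size)) + rng.normal(0, 0.001, x.size)

# Step 1: F-test on replicate variances at the LLOQ vs the ULOQ
s2_lloq = y[x == levels[0]].var(ddof=1)
s2_uloq = y[x == levels[-1]].var(ddof=1)
needs_weighting = s2_uloq / s2_lloq > stats.f.ppf(0.99, 4, 4)  # 4 df per level

# Step 2: weighted least squares; a 1/x weighting factor is applied here
def weighted_rss(order, w):
    X = np.vander(x, order + 1)
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return float(w @ (y - X @ beta) ** 2)

rss_lin = weighted_rss(1, 1 / x)
rss_quad = weighted_rss(2, 1 / x)

# Step 3: partial F-test for the quadratic term (1 extra parameter)
n = x.size
f_partial = (rss_lin - rss_quad) / (rss_quad / (n - 3))
keep_quadratic = bool(f_partial > stats.f.ppf(0.95, 1, n - 3))
```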

  12. Sensitive and selective determination of polycyclic aromatic hydrocarbons in mainstream cigarette smoke using a graphene-coated solid-phase microextraction fiber prior to GC/MS.

    Science.gov (United States)

    Wang, Xiaoyu; Wang, Yuan; Qin, Yaqiong; Ding, Li; Chen, Yi; Xie, Fuwei

    2015-08-01

    A simple method has been developed for the simultaneous determination of 16 polycyclic aromatic hydrocarbons (PAHs) in mainstream cigarette smoke. The procedure is based on employing a homemade graphene-coated solid-phase microextraction (SPME) fiber for extraction prior to GC/MS. In comparison to a commercial 100-μm poly(dimethyl siloxane) (PDMS) fiber, the graphene-coated SPME fiber exhibits advantageous cleanup and preconcentration efficiencies. By collecting the particulate phase from 5 cigarettes, the LODs and LOQs of the 16 target PAHs were 0.02-0.07 and 0.07-0.22 ng/cigarette, respectively, and all of the linear correlation coefficients were larger than 0.995. The validation results also indicate that the method has good repeatability (RSD between 4.2% and 9.5%) and accuracy (spiked recoveries between 80% and 110%). The developed method was applied to analyze two Kentucky reference cigarettes (1R5F and 3R4F) and six Chinese brands of cigarettes. In addition, the PAH concentrations in the particulate phase of the smoke from the 1R5F Kentucky cigarettes were in good agreement with recently reported results. Due to easy operation and good validation results, this SPME-GC/MS method may be an excellent alternative for trace analysis of PAHs in cigarette smoke. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Statistical approach for selection of regression model during validation of bioanalytical method

    Directory of Open Access Journals (Sweden)

    Natalija Nakov

    2014-06-01

    Full Text Available The selection of an adequate regression model is the basis for obtaining accurate and reproducible results during bioanalytical method validation. Given the wide concentration range frequently present in bioanalytical assays, heteroscedasticity of the data may be expected. Several weighted linear and quadratic regression models were evaluated during the selection of the adequate curve fit using nonparametric statistical tests: the one-sample rank test and the Wilcoxon signed rank test for two independent groups of samples. The results obtained with the one-sample rank test could not give statistical justification for the selection of linear vs. quadratic regression models because only slight differences in the error (presented through the relative residuals, RR) were obtained. The significance of the differences in the RR was estimated using the Wilcoxon signed rank test, where linear and quadratic regression models were treated as two independent groups. The application of this simple non-parametric statistical test provides statistical confirmation of the choice of an adequate regression model.
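    The core comparison — relative residuals of a weighted linear fit versus a weighted quadratic fit, tested with the Wilcoxon signed rank test — can be sketched as follows on simulated bioanalytical-style data (not the paper's). Note that the signed-rank test is a paired comparison; here the per-sample residuals of the two fits form the pairs. Also, `np.polyfit` multiplies residuals by `w` before squaring, so `w = 1/x` corresponds to a 1/x² weighting factor:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x = np.repeat([1.0, 5.0, 10.0, 50.0, 100.0, 500.0], 3)
# Slight curvature plus concentration-dependent noise
y = 0.2 * x + 1e-4 * x**2 + rng.normal(0, 0.01 * (1 + x / 100), x.size)

def relative_residuals(degree):
    coef = np.polyfit(x, y, degree, w=1 / x)  # weighted fit (see note above)
    return np.abs((y - np.polyval(coef, x)) / y) * 100  # in percent

rr_lin = relative_residuals(1)
rr_quad = relative_residuals(2)

# Nonparametric test of whether the two models' relative residuals differ
stat, p_value = stats.wilcoxon(rr_lin, rr_quad)
```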

  14. Fuzzy decision-making: a new method in model selection via various validity criteria

    International Nuclear Information System (INIS)

    Shakouri Ganjavi, H.; Nikravesh, K.

    2001-01-01

    Modeling is considered the first step in scientific investigations. Several alternative models may be candidates for expressing a phenomenon. Scientists use various criteria to select one model from among the competing models. Based on the solution of a Fuzzy Decision-Making problem, this paper proposes a new method of model selection. The method enables the scientist to apply all desired validity criteria systematically by defining a proper Possibility Distribution Function for each criterion. Finally, minimization of a utility function composed of the Possibility Distribution Functions determines the best selection. The method is illustrated through a modeling example for the Average Daily Time Duration of Electrical Energy Consumption in Iran
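    The paper combines per-criterion possibility distribution functions into a utility to be minimized. The minimal sketch below uses the classic Bellman–Zadeh max–min aggregation instead, as a simplification, with invented model scores (not the paper's data or exact formulation):

```python
import numpy as np

# Hypothetical scores of three candidate models on three validity criteria
# (e.g., fit error, complexity, out-of-sample error); lower raw score = better.
raw = np.array([
    [0.12, 3.0, 0.15],   # model A
    [0.10, 5.0, 0.14],   # model B
    [0.20, 2.0, 0.30],   # model C
])

# One simple possibility (membership) function per criterion: map each raw
# score to [0, 1], where 1 = best observed and 0 = worst observed.
lo, hi = raw.min(axis=0), raw.max(axis=0)
possibility = 1 - (raw - lo) / (hi - lo)

# Fuzzy decision: intersect criteria with min, then pick the model whose
# weakest criterion is strongest.
overall = possibility.min(axis=1)
best_model = int(np.argmax(overall))  # index 0 = model A
```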

  15. Development and Validation of the Motivations for Selection of Medical Study (MSMS Questionnaire in India.

    Directory of Open Access Journals (Sweden)

    Sonu Goel

    Full Text Available Understanding medical students' motivation to select medical studies is particularly salient to inform practice and policymaking in countries, such as India, where shortages of medical personnel pose crucial and chronic challenges to healthcare systems. This study aims to develop and validate a questionnaire to assess the motivation of medical students to select medical studies. A Motivation for Selection of Medical Study (MSMS) questionnaire was developed using an extensive literature review followed by the Delphi technique. The scale consisted of 12 items, 5 measuring intrinsic dimensions of motivation and 7 measuring extrinsic dimensions. Exploratory factor analysis (EFA), confirmatory factor analysis (CFA), validity, reliability and data quality checks were conducted on a sample of 636 medical students from six medical colleges of three North Indian states. The MSMS questionnaire consisted of 3 factors (subscales) and 8 items. The three principal factors that emerged after EFA were the scientific factor (e.g., research opportunities and the ability to use new cutting-edge technologies), the societal factor (e.g., job security) and the humanitarian factor (e.g., desire to help others). The CFA conducted showed goodness-of-fit indices supporting the 3-factor model. The three extracted factors cut across the traditional dichotomy between intrinsic and extrinsic motivation and uncover a novel three-faceted motivation construct based on scientific factors, societal expectations and humanitarian needs. This validated instrument can be used to evaluate the motivational factors of medical students choosing medical study in India and similar settings and constitutes a powerful tool for policymakers to design measures able to increase selection of medical curricula.

  16. Construct validity of the Free and Cued Selective Reminding Test in older adults with memory complaints.

    Science.gov (United States)

    Clerici, Francesca; Ghiretti, Roberta; Di Pucchio, Alessandra; Pomati, Simone; Cucumo, Valentina; Marcone, Alessandra; Vanacore, Nicola; Mariani, Claudio; Cappa, Stefano Francesco

    2017-06-01

    The Free and Cued Selective Reminding Test (FCSRT) is the memory test recommended by the International Working Group on Alzheimer's disease (AD) for the detection of amnestic syndrome of the medial temporal type in prodromal AD. Assessing the construct validity and internal consistency of the Italian version of the FCSRT is thus crucial. The FCSRT was administered to 338 community-dwelling participants with memory complaints (57% females, age 74.5 ± 7.7 years), including 34 with AD, 203 with Mild Cognitive Impairment, and 101 with Subjective Memory Impairment. Internal Consistency was estimated using Cronbach's alpha coefficient. To assess convergent validity, five FCSRT scores (Immediate Free Recall, Immediate Total Recall, Delayed Free Recall, Delayed Total Recall, and Index of Sensitivity of Cueing) were correlated with three well-validated memory tests: Story Recall, Rey Auditory Verbal Learning test, and Rey Complex Figure (RCF) recall (partial correlation analysis). To assess divergent validity, a principal component analysis (an exploratory factor analysis) was performed including, in addition to the above-mentioned memory tasks, the following tests: Word Fluencies, RCF copy, Clock Drawing Test, Trail Making Test, Frontal Assessment Battery, Raven Coloured Progressive Matrices, and Stroop Colour-Word Test. Cronbach's alpha coefficients for immediate recalls (IFR and ITR) and delayed recalls (DFR and DTR) were, respectively, .84 and .81. All FCSRT scores were highly correlated with those of the three well-validated memory tests. The factor analysis showed that the FCSRT does not load on the factors saturated by non-memory tests. These findings indicate that the FCSRT has a good internal consistency and has an excellent construct validity as an episodic memory measure. © 2015 The British Psychological Society.
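    Cronbach's alpha, used above for internal consistency, has a closed form based on item variances and the variance of the total score. A minimal sketch on synthetic item scores (not FCSRT data; the single-trait structure is assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n_subjects, n_items = 100, 4

# Hypothetical item scores driven by one shared latent trait plus item noise
trait = rng.normal(size=(n_subjects, 1))
items = trait + 0.5 * rng.normal(size=(n_subjects, n_items))

def cronbach_alpha(scores):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

alpha = cronbach_alpha(items)  # high when items share variance
```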

  17. Sensor Selection and Data Validation for Reliable Integrated System Health Management

    Science.gov (United States)

    Garg, Sanjay; Melcher, Kevin J.

    2008-01-01

    For new access to space systems with challenging mission requirements, effective implementation of integrated system health management (ISHM) must be available early in the program to support the design of systems that are safe, reliable, and highly autonomous. Early ISHM availability is also needed to promote design for affordable operations; increased knowledge of functional health provided by ISHM supports construction of more efficient operations infrastructure. Lack of early ISHM inclusion in the system design process could result in retrofitting health management systems to augment and expand operational and safety requirements, thereby increasing program cost and risk due to increased instrumentation and computational complexity. Having the right sensors generating the required data to perform condition assessment, such as fault detection and isolation, with a high degree of confidence is critical to reliable operation of ISHM. Also, the data being generated by the sensors needs to be qualified to ensure that the assessments made by the ISHM are not based on faulty data. NASA Glenn Research Center has been developing technologies for sensor selection and data validation as part of the FDDR (Fault Detection, Diagnosis, and Response) element of the Upper Stage project of the Ares 1 launch vehicle development. This presentation will provide an overview of the GRC approach to sensor selection and data quality validation and will present recent results from applications that are representative of the complexity of propulsion systems for access to space vehicles. A brief overview of the sensor selection and data quality validation approaches is provided below. The NASA GRC developed Systematic Sensor Selection Strategy (S4) is a model-based procedure for systematically and quantitatively selecting an optimal sensor suite to provide overall health assessment of a host system. 
S4 can be logically partitioned into three major subdivisions: the knowledge base, the down-select

  18. Numerical validation of selected computer programs in nonlinear analysis of steel frame exposed to fire

    Science.gov (United States)

    Maślak, Mariusz; Pazdanowski, Michał; Woźniczka, Piotr

    2018-01-01

    Validation of fire resistance for the same steel frame bearing structure is performed here using three different numerical models: a bar model prepared in the SAFIR environment, and two 3D models, one developed within the framework of Autodesk Simulation Mechanical (ASM) and an alternative one developed in the environment of the Abaqus code. The results of the computer simulations performed are compared with the experimental results obtained previously, in a laboratory fire test, on a structure having the same characteristics and subjected to the same heating regimen. Comparison of the experimental and numerically determined displacement evolution paths for selected nodes of the considered frame during the simulated fire exposure constitutes the basic criterion applied to evaluate the validity of the numerical results obtained. The experimental and numerically determined estimates of the critical temperature specific to the considered frame and related to the limit state of bearing capacity in fire have been verified as well.

  19. Evaluation of the Relative Validity of the Short Diet Questionnaire for Assessing Usual Consumption Frequencies of Selected Nutrients and Foods

    Directory of Open Access Journals (Sweden)

    Bryna Shatenstein

    2015-08-01

    Full Text Available A 36-item Short Diet Questionnaire (SDQ) was developed to assess usual consumption frequencies of foods providing fats, fibre, calcium, vitamin D, in addition to fruits and vegetables. It was pretested among 30 community-dwelling participants from the Québec Longitudinal Study on Nutrition and Successful Aging, “NuAge” (n = 1793, 52.4% women), recruited in three age groups (70 ± 2 years; 75 ± 2 years; 80 ± 2 years). Following revision, the SDQ was administered to 527 NuAge participants (55% female), distributed among the three age groups, both sexes and languages (French, English), prior to the second of three non-consecutive 24 h diet recalls (24HR) and validated relative to the mean of the three 24HR. Full data were available for 396 participants. Most SDQ nutrients and fruit and vegetable servings were lower than 24HR estimates (p < 0.05), except calcium, vitamin D, and saturated and trans fats. Spearman correlations between the SDQ and 24HR were modest and significant (p < 0.01), ranging from 0.19 (cholesterol) to 0.45 (fruits and vegetables). Cross-classification into quartiles showed 33% of items were jointly classified into identical quartiles of the distribution, 73% into identical and contiguous quartiles, and only 7% were frankly misclassified. The SDQ is a reasonably accurate, rapid approach for ranking usual frequencies of selected nutrients and foods. Further testing is needed in a broader age range.
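    The two validation statistics reported — Spearman rank correlation and quartile cross-classification — can be sketched on invented paired intake data (the distributions and noise levels below are assumptions, not the study's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 396

# Hypothetical paired intakes: questionnaire estimate vs reference recalls,
# both noisy versions of the same underlying usual intake
truth = rng.gamma(5.0, 2.0, n)
sdq = truth * rng.lognormal(0, 0.4, n)     # noisier frequency-based estimate
recall = truth * rng.lognormal(0, 0.2, n)  # mean of repeated 24-h recalls

rho, p = stats.spearmanr(sdq, recall)

# Cross-classification into quartiles of each distribution (codes 0..3)
q_sdq = np.searchsorted(np.quantile(sdq, [0.25, 0.5, 0.75]), sdq)
q_rec = np.searchsorted(np.quantile(recall, [0.25, 0.5, 0.75]), recall)
same = np.mean(q_sdq == q_rec)                  # identical quartile
adjacent = np.mean(np.abs(q_sdq - q_rec) <= 1)  # identical or contiguous
gross = np.mean(np.abs(q_sdq - q_rec) == 3)     # frank misclassification
```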

  20. Predictive Validity of an Empirical Approach for Selecting Promising Message Topics: A Randomized-Controlled Study

    Science.gov (United States)

    Lee, Stella Juhyun; Brennan, Emily; Gibson, Laura Anne; Tan, Andy S. L.; Kybert-Momjian, Ani; Liu, Jiaying; Hornik, Robert

    2016-01-01

    Several message topic selection approaches propose that messages based on beliefs pretested and found to be more strongly associated with intentions will be more effective in changing population intentions and behaviors when used in a campaign. This study aimed to validate the underlying causal assumption of these approaches which rely on cross-sectional belief–intention associations. We experimentally tested whether messages addressing promising themes as identified by the above criterion were more persuasive than messages addressing less promising themes. Contrary to expectations, all messages increased intentions. Interestingly, mediation analyses showed that while messages deemed promising affected intentions through changes in targeted promising beliefs, messages deemed less promising also achieved persuasion by influencing nontargeted promising beliefs. Implications for message topic selection are discussed. PMID:27867218

  1. Long-Term Prognostic Validity of Talent Selections: Comparing National and Regional Coaches, Laypersons and Novices

    Science.gov (United States)

    Schorer, Jörg; Rienhoff, Rebecca; Fischer, Lennart; Baker, Joseph

    2017-01-01

    In most sports, the development of elite athletes is a long-term process of talent identification and support. Typically, talent selection systems administer a multi-faceted strategy including national coach observations and varying physical and psychological tests when deciding who is chosen for talent development. The aim of this exploratory study was to evaluate the prognostic validity of talent selections by varying groups 10 years after they had been conducted. This study used a unique, multi-phased approach. Phase 1 involved players (n = 68) in 2001 completing a battery of general and sport-specific tests of handball ‘talent’ and performance. In Phase 2, national and regional coaches (n = 7) in 2001 who attended training camps identified the most talented players. In Phase 3, current novice and advanced handball players (n = 12 in each group) selected the most talented from short videos of matches played during the talent camp. Analyses compared predictions among all groups with a best model-fit derived from the motor tests. Results revealed little difference between regional and national coaches in the prediction of future performance and little difference in forecasting performance between novices and players. The best model-fit regression by the motor-tests outperformed all predictions. While several limitations are discussed, this study is a useful starting point for future investigations considering athlete selection decisions in talent identification in sport. PMID:28744238

  2. Long-Term Prognostic Validity of Talent Selections: Comparing National and Regional Coaches, Laypersons and Novices

    Directory of Open Access Journals (Sweden)

    Jörg Schorer

    2017-07-01

    Full Text Available In most sports, the development of elite athletes is a long-term process of talent identification and support. Typically, talent selection systems administer a multi-faceted strategy including national coach observations and varying physical and psychological tests when deciding who is chosen for talent development. The aim of this exploratory study was to evaluate the prognostic validity of talent selections by varying groups 10 years after they had been conducted. This study used a unique, multi-phased approach. Phase 1 involved players (n = 68) in 2001 completing a battery of general and sport-specific tests of handball ‘talent’ and performance. In Phase 2, national and regional coaches (n = 7) in 2001 who attended training camps identified the most talented players. In Phase 3, current novice and advanced handball players (n = 12 in each group) selected the most talented from short videos of matches played during the talent camp. Analyses compared predictions among all groups with a best model-fit derived from the motor tests. Results revealed little difference between regional and national coaches in the prediction of future performance and little difference in forecasting performance between novices and players. The best model-fit regression by the motor-tests outperformed all predictions. While several limitations are discussed, this study is a useful starting point for future investigations considering athlete selection decisions in talent identification in sport.

  3. Long-Term Prognostic Validity of Talent Selections: Comparing National and Regional Coaches, Laypersons and Novices.

    Science.gov (United States)

    Schorer, Jörg; Rienhoff, Rebecca; Fischer, Lennart; Baker, Joseph

    2017-01-01

    In most sports, the development of elite athletes is a long-term process of talent identification and support. Typically, talent selection systems administer a multi-faceted strategy including national coach observations and varying physical and psychological tests when deciding who is chosen for talent development. The aim of this exploratory study was to evaluate the prognostic validity of talent selections by varying groups 10 years after they had been conducted. This study used a unique, multi-phased approach. Phase 1 involved players (n = 68) in 2001 completing a battery of general and sport-specific tests of handball 'talent' and performance. In Phase 2, national and regional coaches (n = 7) in 2001 who attended training camps identified the most talented players. In Phase 3, current novice and advanced handball players (n = 12 in each group) selected the most talented from short videos of matches played during the talent camp. Analyses compared predictions among all groups with a best model-fit derived from the motor tests. Results revealed little difference between regional and national coaches in the prediction of future performance and little difference in forecasting performance between novices and players. The best model-fit regression by the motor-tests outperformed all predictions. While several limitations are discussed, this study is a useful starting point for future investigations considering athlete selection decisions in talent identification in sport.

  4. Voltammetric determination of copper in selected pharmaceutical preparations--validation of the method.

    Science.gov (United States)

    Lutka, Anna; Maruszewska, Małgorzata

    2011-01-01

    The conditions for voltammetric determination of copper in pharmaceutical preparations were established and validated. The three selected preparations, Zincuprim (A), Wapń, cynk, miedź z wit. C (B), and Vigor complete (V), contained different salts and different quantities of copper (II) and an increasing number of accompanying ingredients. To transfer copper into solution, the samples of powdered tablets of the first and second preparations were subjected to extraction, and those of the third to mineralization. The concentration of copper in solution was determined by differential pulse voltammetry (DP) using comparison with a standard. In the validation process, the selectivity, accuracy, precision and linearity of the DP determination of copper in the three preparations were estimated. Copper was determined within the concentration range of 1-9 ppm (1-9 microg/mL): the mean recoveries approached 102% (A), 100% (B), 102% (V); the relative standard deviations of determinations (RSD) were 0.79-1.59% (A), 0.62-0.85% (B) and 1.68-2.28% (V), respectively. The mean recoveries and the RSDs of determination satisfied the requirements for analyte concentrations at the level of 1-10 ppm. Statistical verification confirmed that the tested voltammetric method is suitable for determination of copper in pharmaceutical preparations.
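    Quantitation by comparison with a standard, and the recovery and RSD figures reported, reduce to simple arithmetic. A sketch with invented peak currents and concentrations (not the paper's measurements):

```python
import numpy as np

# Hypothetical DP voltammetry readings: peak currents for a copper standard
# and for five sample replicates, quantified by direct comparison.
c_std = 5.0                                    # standard concentration, ppm
i_std = 12.50                                  # standard peak current, uA
i_sample = np.array([10.1, 10.3, 9.9, 10.2, 10.0])

c_sample = c_std * i_sample / i_std            # ppm in each replicate
rsd = c_sample.std(ddof=1) / c_sample.mean() * 100   # precision, %

# Spiked recovery: 2.0 ppm of copper added, then re-measured once
spiked_added = 2.0
i_spiked = 15.2
recovery = (c_std * i_spiked / i_std - c_sample.mean()) / spiked_added * 100
# recovery evaluates to 102% with these invented numbers
```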

  5. Prior Elicitation, Assessment and Inference with a Dirichlet Prior

    Directory of Open Access Journals (Sweden)

    Michael Evans

    2017-10-01

    Full Text Available Methods are developed for eliciting a Dirichlet prior based upon stating bounds on the individual probabilities that hold with high prior probability. This approach to selecting a prior is applied to a contingency table problem where it is demonstrated how to assess the prior with respect to the bias it induces as well as how to check for prior-data conflict. It is shown that the assessment of a hypothesis via relative belief can easily take into account what it means for the falsity of the hypothesis to correspond to a difference of practical importance and provide evidence in favor of a hypothesis.
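    The elicitation idea in the abstract — choose a Dirichlet prior such that stated bounds on the individual probabilities hold with high prior probability — can be sketched with a Monte Carlo search over the total concentration. The bounds, the midpoint centering and the 90% target below are invented for illustration and are not the authors' procedure:

```python
import numpy as np

rng = np.random.default_rng(7)

# Elicited bounds on three cell probabilities (hypothetical numbers)
lower = np.array([0.2, 0.1, 0.3])
upper = np.array([0.5, 0.4, 0.6])

# Center the Dirichlet at the (renormalized) interval midpoints
mid = (lower + upper) / 2
mid = mid / mid.sum()

def coverage(total_concentration, n_draws=20000):
    # Fraction of prior draws with every component inside its bounds
    draws = rng.dirichlet(mid * total_concentration, size=n_draws)
    inside = np.all((draws >= lower) & (draws <= upper), axis=1)
    return inside.mean()

# Increase the concentration until the bounds capture ~90% prior probability
total = 1.0
while coverage(total) < 0.90:
    total *= 1.5   # larger concentration -> prior more tightly around mid
```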

  6. Antibody Selection for Cancer Target Validation of FSH-Receptor in Immunohistochemical Settings

    Directory of Open Access Journals (Sweden)

    Nina Moeker

    2017-10-01

    Full Text Available Background: The follicle-stimulating hormone receptor (FSHR) has been reported to be an attractive target for antibody therapy in human cancer. However, divergent immunohistochemical (IHC) findings have been reported for FSHR expression in tumor tissues, which could be due to the specificity of the antibodies used. Methods: Three frequently used antibodies (sc-7798, sc-13935, and FSHR323) were validated for their suitability in an immunohistochemical study of FSHR expression in different tissues. As quality control, two potential therapeutic anti-hFSHR Ylanthia® antibodies (Y010913, Y010916) were used. The specificity criteria for selection of antibodies were binding to native hFSHR from different sources and no binding to non-related proteins. The ability of the antibodies to stain paraffin-embedded Flp-In Chinese hamster ovary (CHO)/FSHR cells was tested after application of different epitope retrieval methods. Results: Of the five tested anti-hFSHR antibodies, only Y010913, Y010916, and FSHR323 showed specific binding to native, cell-presented hFSHR. Since Ylanthia® antibodies were selected to specifically recognize native FSHR, as required for a potential therapeutic antibody candidate, FSHR323 was the only antibody to detect the receptor in IHC settings on transfected cells, and at the markedly lower, physiological concentrations found, e.g., in Sertoli cells of human testes. The pattern of FSHR323 staining observed for ovarian, prostatic, and renal adenocarcinomas indicated that FSHR was expressed mainly in the peripheral tumor blood vessels. Conclusion: Of all published IHC antibodies tested, only antibody FSHR323 proved suitable for target validation of hFSHR in an IHC setting for cancer. Our studies could not confirm the previously reported FSHR overexpression in ovarian and prostate cancer cells. Instead, specific overexpression in peripheral tumor blood vessels could be confirmed after thorough validation of the antibodies used.

  7. Use of the recognition heuristic depends on the domain's recognition validity, not on the recognition validity of selected sets of objects.

    Science.gov (United States)

    Pohl, Rüdiger F; Michalkiewicz, Martha; Erdfelder, Edgar; Hilbig, Benjamin E

    2017-07-01

    According to the recognition-heuristic theory, decision makers solve paired comparisons in which one object is recognized and the other is not by recognition alone, inferring that recognized objects have higher criterion values than unrecognized ones. However, the success, and thus the usefulness, of this heuristic depends on the validity of recognition as a cue, and adaptive decision making, in turn, requires that decision makers be sensitive to it. To this end, decision makers could base their evaluation of the recognition validity either on the selected set of objects (the set's recognition validity) or on the underlying domain from which the objects were drawn (the domain's recognition validity). In two experiments, we manipulated the recognition validity both within the selected set of objects and between the domains from which the sets were drawn. The results clearly show that use of the recognition heuristic depends on the domain's recognition validity, not on the set's recognition validity. In other words, participants treat all sets as roughly representative of the underlying domain and adjust their decision strategy adaptively (only) with respect to the more general environment, rather than the specific items they are faced with.
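The distinction between a set's and a domain's recognition validity can be made concrete: recognition validity is the proportion of recognized/unrecognized pairs in which the recognized object has the higher criterion value. The toy domain below is invented for illustration:

```python
from itertools import combinations

def recognition_validity(criterion, recognized):
    """Share of mixed pairs where the recognized object scores higher."""
    hits = total = 0
    for i, j in combinations(range(len(criterion)), 2):
        if recognized[i] == recognized[j]:
            continue  # only pairs with exactly one recognized object count
        total += 1
        rec, unrec = (i, j) if recognized[i] else (j, i)
        hits += criterion[rec] > criterion[unrec]
    return hits / total

# Hypothetical city populations; the larger cities happen to be recognized
population = [900, 700, 500, 300, 100]
recognized = [True, True, True, False, False]
print(recognition_validity(population, recognized))  # → 1.0
```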

  8. Validity of selected physical activity questions in white Seventh-day Adventists and non-Adventists.

    Science.gov (United States)

    Singh, P N; Tonstad, S; Abbey, D E; Fraser, G E

    1996-08-01

    The validity and reliability of selected physical activity questions were assessed in both Seventh-day Adventist (N = 131) and non-Adventist (N = 101) study groups. Vigorous activity questions similar to those used by others and new questions that measured moderate and light activities were included. Validation was external, comparing questionnaire data with treadmill exercise time, resting heart rate, and body mass index (kg/m2), and internal, comparing data with other similar questions. Both Adventist and non-Adventist males showed significant age-adjusted correlations between treadmill time and a "Run-Walk-Jog Index" (R = 0.28 and R = 0.48, respectively). These correlations increased substantially when the analysis was restricted to exercise speeds exceeding 3 mph (R = 0.39 and R = 0.71, respectively). Frequency of sweating and a vigorous physical activity index also correlated significantly with treadmill time in males. Correlations were generally weaker in females. Moderate- and light-intensity questions were not correlated with physical fitness. Internal correlations (R = 0.50-0.78) between the above three vigorous activity questions were significant in all groups, and correlations (R = 0.14-0.60) for light and moderate activity questions were documented. Test-retest reliability coefficients were high for the vigorous activity questions (R = 0.48-0.85) and for one set of moderate activity questions (R = 0.43-0.75). No important differences in validity and reliability were found between Adventists and non-Adventists, but the validity of the vigorous activity measures was generally weaker in females.
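An age-adjusted correlation of the kind reported above can be computed as a partial correlation: residualize both variables on age, then correlate the residuals. The data below are synthetic, purely to show the mechanics:

```python
import numpy as np

rng = np.random.default_rng(1)

def partial_corr(x, y, covariate):
    """Correlation of x and y after removing a linear effect of covariate."""
    def residuals(v):
        design = np.column_stack([np.ones_like(covariate), covariate])
        coef, *_ = np.linalg.lstsq(design, v, rcond=None)
        return v - design @ coef
    rx, ry = residuals(x), residuals(y)
    return np.corrcoef(rx, ry)[0, 1]

# Synthetic example: both variables decline with age, plus a shared signal
age = rng.uniform(30, 70, 200)
signal = rng.normal(size=200)
activity = -0.05 * age + 0.5 * signal + rng.normal(scale=0.5, size=200)
fitness = -0.04 * age + 0.5 * signal + rng.normal(scale=0.5, size=200)
print(f"raw r = {np.corrcoef(activity, fitness)[0, 1]:.2f}, "
      f"age-adjusted r = {partial_corr(activity, fitness, age):.2f}")
```

The age-adjusted coefficient is smaller than the raw one because age inflates the apparent association between the two measures.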

  9. BRBN-T validation: adaptation of the Selective Reminding Test and Word List Generation

    Directory of Open Access Journals (Sweden)

    Mariana Rigueiro Neves

    2015-10-01

    Full Text Available Objective: This study aims to present the adaptation of the Selective Reminding Test (SRT) and Word List Generation (WLG) to the Portuguese population, within the validation of the Brief Repeatable Battery of Neuropsychological Tests (BRBN-T) for multiple sclerosis (MS) patients. Method: 66 healthy participants (54.5% female) recruited from the community volunteered to participate in this study. Results: A combination of procedures from Classical Test Theory (CTT) and Item Response Theory (IRT) was applied to item analysis and selection. For each SRT list, 12 words were selected, and 3 letters were chosen for WLG, to constitute the final versions of these tests for the Portuguese population. Conclusion: The combination of CTT and IRT maximized the decision-making process in the adaptation of the SRT and WLG to a different culture and language (Portuguese). The relevance of this study lies in the production of reliable standardized neuropsychological tests, so that they can be used to facilitate more rigorous monitoring of the evolution of MS, as well as of any therapeutic effects and cognitive rehabilitation.

  10. Mixed hemimicelles solid-phase extraction based on sodium dodecyl sulfate-coated nano-magnets for selective adsorption and enrichment of illegal cationic dyes in food matrices prior to high-performance liquid chromatography-diode array detection.

    Science.gov (United States)

    Qi, Ping; Liang, Zhi-An; Wang, Yu; Xiao, Jian; Liu, Jia; Zhou, Qing-Qiong; Zheng, Chun-Hao; Luo, Li-Ni; Lin, Zi-Hao; Zhu, Fang; Zhang, Xue-Wu

    2016-03-11

    In this study, mixed hemimicelles solid-phase extraction (MHSPE) based on sodium dodecyl sulfate (SDS)-coated Fe3O4 nano-magnets was investigated as a novel method for the extraction and separation of four banned cationic dyes, Auramine O, Rhodamine B, Basic orange 21, and Basic orange 22, in condiments prior to HPLC detection. The main factors affecting the extraction of the analytes, such as pH, surfactant and adsorbent concentrations, and zeta potential, were studied and optimized. Under optimized conditions, the proposed method was successfully applied to the analysis of banned cationic dyes in food samples such as chili sauce, soybean paste, and tomato sauce. Validation data showed good recoveries in the range of 70.1-104.5%, with relative standard deviations less than 15%. The method limits of detection and quantification were in the ranges of 0.2-0.9 and 0.7-3 μg kg(-1), respectively. The selective adsorption and enrichment of the cationic dyes were achieved by the synergistic effects of hydrophobic interactions and electrostatic attraction between the mixed hemimicelles and the cationic dyes, which also resulted in the removal of natural pigment interferences from the sample extracts. When applied to real samples, Rhodamine B was detected in several positive samples (chili powders) within the range of 0.042 to 0.177 mg kg(-1). These results indicate that magnetic MHSPE is an efficient and selective sample preparation technique for the extraction of banned cationic dyes from a complex matrix.

  11. Use of the "Intervention Selection Profile-Social Skills" to Identify Social Skill Acquisition Deficits: A Preliminary Validation Study

    Science.gov (United States)

    Kilgus, Stephen P.; von der Embse, Nathaniel P.; Scott, Katherine; Paxton, Sara

    2015-01-01

    The purpose of this investigation was to develop and initially validate the "Intervention Selection Profile-Social Skills" (ISP-SS), a novel brief social skills assessment method intended for use at Tier 2. Participants included 54 elementary school teachers and their 243 randomly selected students. Teachers rated students on two rating…

  12. Predictive validity of the personal qualities assessment for selection of medical students in Scotland.

    Science.gov (United States)

    Dowell, Jon; Lumsden, Mary Ann; Powis, David; Munro, Don; Bore, Miles; Makubate, Boikanyo; Kumwenda, Ben

    2011-01-01

    The Personal Qualities Assessment (PQA) was developed to enhance medical student selection by measuring a range of non-cognitive attributes in applicants to medical school. Applicants to the five Scottish medical schools were invited to pilot the test in 2001 and 2002. The aim was to evaluate the predictive validity of the PQA for selecting medical students. A longitudinal cohort study was conducted in which PQA scores were compared with senior-year medical school performance. Consent to access performance markers was obtained from 626 students (61.6% of 1017 entrants in 2002-2003). Linkable Foundation Year (4th-year) rankings were available for 411 (66%) of the consenting students and objective structured clinical examination (OSCE) rankings for 335 (54%). Both samples were representative of the original cohort. No significant correlations were detected between the separate elements of the PQA and student performance. However, using the algorithm advocated by Powis et al., students defined as 'non-extreme' on the libertarian-communitarian moral orientation scales were ranked higher in OSCEs (by an average of 7.5%, or 25 places out of 335; p = 0.049). This study was limited by high attrition and by basic outcome markers that are insensitive to relevant non-cognitive characteristics. However, it is the largest currently available study of the predictive validity of the PQA. There was one significant finding: students identified by the PQA as 'not extreme' on the two personal characteristics scales performed better in an OSCE measure of professionalism. Future studies are required, since psychometric testing for both cognitive and non-cognitive attributes is increasingly used in admissions processes, and such studies should include more and better measures of professionalism against which to correlate non-cognitive traits.

  13. Reconsidering vocational interests for personnel selection: the validity of an interest-based selection test in relation to job knowledge, job performance, and continuance intentions.

    Science.gov (United States)

    Van Iddekinge, Chad H; Putka, Dan J; Campbell, John P

    2011-01-01

    Although vocational interests have a long history in vocational psychology, they have received extremely limited attention within the recent personnel selection literature. We reconsider some widely held beliefs concerning the (low) validity of interests for predicting criteria important to selection researchers, and we review theory and empirical evidence that challenge such beliefs. We then describe the development and validation of an interest-based selection measure. Results of a large validation study (N = 418) reveal that interests predicted a diverse set of criteria—including measures of job knowledge, job performance, and continuance intentions—with corrected, cross-validated Rs that ranged from .25 to .46 across the criteria (mean R = .31). Interests also provided incremental validity beyond measures of general cognitive aptitude and facets of the Big Five personality dimensions in relation to each criterion. Furthermore, with a couple of exceptions, the interest scales were associated with small to medium subgroup differences, which in most cases favored women and racial minorities. Taken as a whole, these results appear to call into question the prevailing thought that vocational interests have limited usefulness for selection.
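Incremental validity of the kind reported (interests predicting beyond cognitive aptitude and personality) is typically assessed hierarchically: fit a baseline regression, add the new predictor block, and compare the multiple Rs. A sketch on simulated data (only the sample size matches the study; everything else is invented):

```python
import numpy as np

rng = np.random.default_rng(2)

def multiple_r(X, y):
    """Multiple correlation of y with an OLS fit on X (with intercept)."""
    design = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return np.corrcoef(design @ coef, y)[0, 1]

n = 418  # same N as the study; the data themselves are simulated
cognitive = rng.normal(size=n)
personality = rng.normal(size=n)
interests = rng.normal(size=n)
performance = (0.4 * cognitive + 0.2 * personality
               + 0.3 * interests + rng.normal(size=n))

baseline = np.column_stack([cognitive, personality])
full = np.column_stack([cognitive, personality, interests])
r0, r1 = multiple_r(baseline, performance), multiple_r(full, performance)
print(f"baseline R = {r0:.2f}, with interests R = {r1:.2f}")
```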

  14. Criteria for validation and selection of cognitive tests for investigating the effects of foods and nutrients.

    Science.gov (United States)

    de Jager, Celeste A; Dye, Louise; de Bruin, Eveline A; Butler, Laurie; Fletcher, John; Lamport, Daniel J; Latulippe, Marie E; Spencer, Jeremy P E; Wesnes, Keith

    2014-03-01

    This review is an output of the International Life Sciences Institute (ILSI) Europe Marker Initiative, which aims to identify evidence-based criteria for selecting adequate measures of nutrient effects on health through comprehensive literature review. Experts in cognitive and nutrition sciences examined the applicability of these proposed criteria to the field of cognition with respect to the various cognitive domains usually assessed to reflect brain or neurological function. This review covers cognitive domains important in the assessment of neuronal integrity and function, commonly used tests and their state of validation, and the application of the measures to studies of nutrition and nutritional intervention trials. The aim is to identify domain-specific cognitive tests that are sensitive to nutrient interventions and from which guidance can be provided to aid the application of selection criteria for choosing the most suitable tests for proposed nutritional intervention studies using cognitive outcomes. The material in this review serves as a background and guidance document for nutritionists, neuropsychologists, psychiatrists, and neurologists interested in assessing mental health in terms of cognitive test performance and for scientists intending to test the effects of food or food components on cognitive function.

  15. On the selection of shape and orientation of a greenhouse. Thermal modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Sethi, V.P. [Department of Mechanical Engineering, Punjab Agricultural University, Ludhiana 141 004, Punjab (India)

    2009-01-15

    In this study, the five most commonly used single-span greenhouse shapes, viz. even-span, uneven-span, vinery, modified arch, and quonset type, were selected for comparison. The length, width, and height (at the center) are kept the same for all the selected shapes. A mathematical model for computing the transmitted total solar radiation (beam, diffuse, and ground-reflected) at each hour, for each month, and at any latitude for the selected greenhouse geometries (through each wall, inclined surface, and roof) is developed for both east-west and north-south orientations. The computed transmitted solar radiation is then introduced into a transient thermal model developed to compute the hourly inside air temperature for each shape and orientation. Experimental validation of both models is carried out against the measured total solar radiation and inside air temperature for an east-west oriented, even-span greenhouse (for a typical day in summer) at Ludhiana (31 N, 77 E), Punjab, India. During the experimentation, a capsicum crop was grown inside the greenhouse. The predicted and measured values are in close agreement. Results show that the uneven-span shape receives the maximum, and the quonset shape the minimum, solar radiation during each month of the year at all latitudes. The east-west orientation is best suited for year-round greenhouse applications at all latitudes, as this orientation receives more total radiation in winter and less in summer, except near the equator. Results also show that the inside air temperature rise depends upon the shape of the greenhouse, and the variation from the uneven-span to the quonset shape is 4.6 C (maximum) and 3.5 C (daily average) at 31 N latitude. (author)
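Models of this kind rest on standard solar geometry. As a minimal sketch (not the author's full model), the beam radiation intercepted by a south-facing surface tilted at angle β can be compared across tilts using the textbook incidence-angle relation cos θ = cos(φ−β)·cos δ·cos ω + sin(φ−β)·sin δ:

```python
import math

def cos_incidence(lat_deg, tilt_deg, day_of_year, hour):
    """Cosine of the beam incidence angle on a south-facing tilted surface."""
    delta = math.radians(23.45) * math.sin(
        math.radians(360 * (284 + day_of_year) / 365))   # solar declination
    omega = math.radians(15 * (hour - 12))               # hour angle
    phi_b = math.radians(lat_deg - tilt_deg)
    return (math.cos(phi_b) * math.cos(delta) * math.cos(omega)
            + math.sin(phi_b) * math.sin(delta))

# At 31 N in midwinter (day 355), solar noon: a south-tilted surface
# intercepts more beam radiation than a horizontal one.
horiz = cos_incidence(31, 0, 355, 12)
tilted = cos_incidence(31, 30, 355, 12)
print(f"cos(theta) horizontal = {horiz:.2f}, tilted 30 deg = {tilted:.2f}")
```

This winter advantage of steeper, sun-facing surfaces is one reason shape and orientation change the radiation each greenhouse wall and roof receives.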

  16. Validated HPTLC methods for determination of some selected antihypertensive mixtures in their combined dosage forms

    Directory of Open Access Journals (Sweden)

    Rasha A. Shaalan

    2014-12-01

    Full Text Available Simple and selective HPTLC methods were developed for the simultaneous determination of the antihypertensive drugs carvedilol and hydrochlorothiazide in their binary mixture (Mixture I) and amlodipine besylate, valsartan, and hydrochlorothiazide in their combined ternary formulation (Mixture II). Effective chromatographic separation was achieved on Fluka TLC aluminum plates (20 × 20 cm, 0.2 mm thickness) through linear ascending development. For Mixture I, the mobile phase was composed of chloroform–methanol (8:2, v/v). Detection was performed at 254 nm for both carvedilol and hydrochlorothiazide. For Mixture II, the mobile phase was chloroform–methanol–ammonia (8:2:0.1, by volume). Detection was performed at 254 nm for valsartan and hydrochlorothiazide, and at 365 nm for amlodipine. Quantification was based on spectrodensitometric analysis. The analytical performance of the proposed HPTLC procedures was statistically validated with respect to linearity, range, precision, accuracy, specificity, robustness, and detection and quantification limits. The linearity ranges were 0.05–1.0 and 0.1–2.0 μg/spot for carvedilol and hydrochlorothiazide, respectively, in Mixture I, and 0.1–2.0, 0.1–2.0, and 0.2–4.0 μg/spot for amlodipine, hydrochlorothiazide, and valsartan, respectively, in Mixture II, with correlation coefficients >0.9992. The validated HPTLC methods were applied to the analysis of the cited antihypertensive drugs in their combined pharmaceutical tablets. The proposed methods confirmed peak identity and purity.

  17. Sixteen-row multislice computed tomography in the assessment of pulmonary veins prior to ablative treatment: validation vs conventional pulmonary venography and study of reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Maksimovic, R.; Cademartiri, F.; Pattynama, P.M.T. [Erasmus Medical Center, Rotterdam (Netherlands). Dept. of Radiology; Scholten, M; Jordaens, L.J. [Erasmus Medical Center, Rotterdam (Netherlands). Dept. of Cardiology

    2004-03-01

    The aim of this study was to validate multislice computed tomography (MSCT) venography measurements of pulmonary vein (PV) diameters against conventional pulmonary venography (CPV), and to assess the reproducibility of the MSCT data. The study included 21 consecutive patients with atrial fibrillation who were scheduled for cryothermal ablation of the PVs. One day before ablation, all patients underwent CPV and contrast-enhanced, non-gated MSCT venography. The MSCT was repeated 3 months after ablation. The CPV images of the treated PVs (n=40) were analyzed and compared with the results of the MSCT measurements. Reproducibility of the MSCT venography-based data was assessed by interobserver (n=84 PVs) and interexamination (n=44 PVs) variability. Pre-treatment PV diameters on MSCT and CPV showed good correlation (r=0.87, p<0.01; 18.9 ± 2.3 mm and 18.5 ± 2.4 mm, respectively). Interobserver agreement and interexamination reproducibility were good (r=0.91 and r=0.82, respectively, p<0.01), with narrow limits of agreement (Bland and Altman method). MSCT venography allows accurate and reproducible assessment of the PVs. It can be used both in non-invasive planning of ablative therapy and in the follow-up of patients.
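The Bland and Altman method used above reports the mean of the pairwise differences (bias) and the 95% limits of agreement, bias ± 1.96 SD. A sketch with invented paired measurements (not the study's data):

```python
import statistics

def bland_altman_limits(a, b):
    """Mean difference and 95% limits of agreement for paired readings."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical PV diameters (mm) read by two observers
obs1 = [18.2, 19.5, 17.8, 20.1, 18.9, 19.2]
obs2 = [18.0, 19.9, 17.5, 20.4, 18.7, 19.1]
bias, (lo, hi) = bland_altman_limits(obs1, obs2)
print(f"bias = {bias:.2f} mm, limits of agreement = ({lo:.2f}, {hi:.2f}) mm")
```

Narrow limits of agreement around a near-zero bias are what justify the "good reproducibility" conclusion.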

  18. The reactor kinetics code TANK: a validation against selected SPERT-1B experiments

    International Nuclear Information System (INIS)

    Ellis, R.J.

    1990-01-01

    The two-dimensional space-time analysis code TANK is being developed for the simulation of transient behaviour in the MAPLE class of research reactors. MAPLE research reactor cores are compact, light-water-cooled and -moderated, with a high degree of forced subcooling. The SPERT-1B(24/32) reactor core had many similarities to MAPLE-X10, and the results of the SPERT transient experiments are well documented. As a validation of TANK, a series of simulations of selected SPERT reactor transients was undertaken. Special features were added to the TANK code to model reactors with plate-type fuel and to allow for the simulation of rapid void production. The results of a series of super-prompt-critical reactivity step-insertion transient simulations are presented. The selected SPERT transients were all initiated from low power, at ambient temperatures, and with negligible coolant flow. The results of the TANK simulations are in good agreement with the trends in the experimental SPERT data.

  19. Bacterial selection for biological control of plant disease: criterion determination and validation

    Directory of Open Access Journals (Sweden)

    Monalize Salete Mota

    Full Text Available This study aimed to evaluate the biocontrol potential of bacteria isolated from different plant species and soils. The production of compounds related to phytopathogen biocontrol and/or promotion of plant growth was evaluated in the bacterial isolates by measuring the production of antimicrobial compounds (ammonia and antibiosis) and hydrolytic enzymes (amylases, lipases, proteases, and chitinases), as well as phosphate solubilization. Of the 1219 bacterial isolates, 92% produced one or more of the eight compounds evaluated, but only 1% of the isolates produced all of them. Proteolytic activity was most frequently observed among the bacterial isolates. Among the compounds which often determine the success of biocontrol, 43% of the isolates produced compounds that inhibit mycelial growth of Monilinia fructicola, but only 11% hydrolyzed chitin. Bacteria from different plant species (rhizosphere or phylloplane) exhibited differences in the ability to produce the compounds evaluated. Most bacterial isolates with biocontrol potential were isolated from rhizospheric soil. The most efficient bacteria (producing at least five compounds related to phytopathogen biocontrol and/or plant growth), 86 in total, were further evaluated for their biocontrol potential by observing their ability to kill juveniles of Mesocriconema xenoplax. We clearly observed that bacteria that produced more compounds related to phytopathogen biocontrol and/or plant growth had higher efficacy for nematode biocontrol, which validated the selection strategy used.

  20. Development and Validation of a Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Ethene Sites

    Science.gov (United States)

    2015-12-01

    Final report. The objective of project ER-201129 was to develop and validate a framework used to make bioremediation decisions based on site-specific physical and biogeochemical

  1. A Bayesian framework for adaptive selection, calibration, and validation of coarse-grained models of atomistic systems

    Energy Technology Data Exchange (ETDEWEB)

    Farrell, Kathryn, E-mail: kfarrell@ices.utexas.edu; Oden, J. Tinsley, E-mail: oden@ices.utexas.edu; Faghihi, Danial, E-mail: danial@ices.utexas.edu

    2015-08-15

    A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.
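The core computation, posterior model plausibility from model evidence, can be illustrated on a toy problem. The two candidate "models" below are two priors for a Bernoulli success probability, with the evidence integral done by quadrature; this setup is illustrative only, not the paper's coarse-graining application:

```python
def evidence(successes, trials, prior_pdf, grid_size=10001):
    """Marginal likelihood p(data | model) by trapezoidal quadrature."""
    total = 0.0
    h = 1.0 / (grid_size - 1)
    for i in range(grid_size):
        theta = i * h
        like = theta**successes * (1 - theta)**(trials - successes)
        weight = 0.5 if i in (0, grid_size - 1) else 1.0
        total += weight * like * prior_pdf(theta) * h
    return total

# Model 1: uniform prior; Model 2: prior concentrated near theta = 0.5
uniform = lambda t: 1.0
peaked = lambda t: 6.0 * t * (1 - t)   # Beta(2, 2) density

data = (7, 10)  # 7 successes in 10 trials
ev = [evidence(*data, p) for p in (uniform, peaked)]
post = [e / sum(ev) for e in ev]  # equal prior model plausibilities assumed
print(f"posterior plausibilities: M1 = {post[0]:.2f}, M2 = {post[1]:.2f}")
```

The model with the higher evidence earns the higher posterior plausibility, which is the quantity the framework uses to rank candidate coarse-grained models.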

  2. A Bayesian framework for adaptive selection, calibration, and validation of coarse-grained models of atomistic systems

    Science.gov (United States)

    Farrell, Kathryn; Oden, J. Tinsley; Faghihi, Danial

    2015-08-01

    A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.

  3. The predictive validity of selection for entry into postgraduate training in general practice: evidence from three longitudinal studies.

    Science.gov (United States)

    Patterson, Fiona; Lievens, Filip; Kerrin, Máire; Munro, Neil; Irish, Bill

    2013-11-01

    The selection methodology for UK general practice is designed to accommodate several thousand applicants per year and targets six core attributes identified in a multi-method job-analysis study. The aim was to evaluate the predictive validity of selection methods for entry into postgraduate training, comprising a clinical problem-solving test, a situational judgement test, and a selection centre. A three-part longitudinal predictive validity study of selection into UK general practice training was conducted. In sample 1, participants were junior doctors applying for training in general practice (n = 6824). In sample 2, participants were GP registrars 1 year into training (n = 196). In sample 3, participants were GP registrars sitting the licensing examination after 3 years, at the end of training (n = 2292). The outcome measures included assessor ratings of performance in a selection centre comprising job simulation exercises (sample 1), supervisor ratings of trainee job performance 1 year into training (sample 2), and licensing examination results, including an applied knowledge examination and a 12-station clinical skills objective structured clinical examination (OSCE; sample 3). Performance ratings at selection predicted subsequent supervisor ratings of job performance 1 year later. Selection results also significantly predicted performance on both the clinical skills OSCE and the applied knowledge examination for licensing at the end of training. In combination, these longitudinal findings provide good evidence of the predictive validity of the selection methods and are the first reported for entry into postgraduate training. The results show that the best predictor of work performance and training outcomes is a combination of a clinical problem-solving test, a situational judgement test, and a selection centre. Implications for selection methods for all postgraduate specialties are considered.

  4. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    Science.gov (United States)

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion-selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute of Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of the 12 compounds analyzed by LC/MS and 0.3-30 microg for the 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). Total fluorine results may be used
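Detection and quantitation limits like those underpinning the reported quantitation ranges are commonly estimated from a low-level calibration curve as 3.3·σ/slope and 10·σ/slope, where σ is the residual standard deviation of the fit. This convention, and the calibration data below, are illustrative; the study's own procedure may differ:

```python
import statistics

def lod_loq(concs, responses):
    """LOD/LOQ from calibration slope and residual standard deviation."""
    n = len(concs)
    mx, my = statistics.mean(concs), statistics.mean(responses)
    sxx = sum((x - mx) ** 2 for x in concs)
    slope = sum((x - mx) * (y - my) for x, y in zip(concs, responses)) / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x) for x, y in zip(concs, responses)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical low-level calibration (microg spiked vs. instrument response)
concs = [0.06, 0.3, 0.6, 1.5, 3.0, 6.0]
responses = [13, 61, 118, 302, 597, 1203]
lod, loq = lod_loq(concs, responses)
print(f"LOD ≈ {lod:.3f} microg, LOQ ≈ {loq:.3f} microg")
```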

  5. Construct validity of selected measures of physical activity beliefs and motives in fifth and sixth grade boys and girls.

    Science.gov (United States)

    Dishman, Rod K; Saunders, Ruth P; McIver, Kerry L; Dowda, Marsha; Pate, Russell R

    2013-06-01

    Scales used to measure selected social-cognitive beliefs and motives for physical activity were tested among boys and girls. Covariance modeling was applied to responses obtained from large multi-ethnic samples of students in the fifth and sixth grades. Theoretically and statistically sound models were developed, supporting the factorial validity of the scales in all groups. Multi-group longitudinal invariance was confirmed between boys and girls, overweight and normal weight students, and non-Hispanic black and white children. The construct validity of the scales was supported by hypothesized convergent and discriminant relationships within a measurement model that included correlations with physical activity (MET·min/day) measured by an accelerometer. Scores from the scales provide valid assessments of selected beliefs and motives that are putative mediators of change in physical activity among boys and girls, as they begin the understudied transition from the fifth grade into middle school, when physical activity naturally declines.

  6. Glucose Injections into the Dorsal Hippocampus or Dorsolateral Striatum of Rats Prior to T-Maze Training: Modulation of Learning Rates and Strategy Selection

    Science.gov (United States)

    Canal, Clinton E.; Stutz, Sonja J.; Gold, Paul E.

    2005-01-01

    The present experiments examined the effects of injecting glucose into the dorsal hippocampus or dorsolateral striatum on learning rates and on strategy selection in rats trained on a T-maze that can be solved by using either a hippocampus-sensitive place or striatum-sensitive response strategy. Percentage strategy selection on a probe trial…

  7. Concurrent Validation of Experimental Army Enlisted Personnel Selection and Classification Measures

    National Research Council Canada - National Science Library

    Knapp, Deirdre J; Tremble, Trueman R

    2007-01-01

    .... This report documents the method and results of the criterion-related validation. The predictor set includes measures of cognitive ability, temperament, psychomotor skills, values, expectations...

  8. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Science.gov (United States)

    2010-07-01

    .... 1607.6 Section 1607.6 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should...

  9. AOAC Official Method℠ Matrix Extension Validation Study of Assurance GDS™ for the Detection of Salmonella in Selected Spices.

    Science.gov (United States)

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDS™ for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces for the detection of Salmonella in selected foods and on environmental surfaces (Official Method of Analysis℠ 2009.03, Performance Tested Method℠ No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuff, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity and exclusivity studies, stability, and ruggedness studies were also conducted. Assurance GDS showed 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To add to the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method for selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella.

  10. Used-habitat calibration plots: A new procedure for validating species distribution, resource selection, and step-selection models

    Science.gov (United States)

    Fieberg, John R.; Forester, James D.; Street, Garrett M.; Johnson, Douglas H.; ArchMiller, Althea A.; Matthiopoulos, Jason

    2018-01-01

    “Species distribution modeling” was recently ranked as one of the top five “research fronts” in ecology and the environmental sciences by ISI's Essential Science Indicators (Renner and Warton 2013), reflecting the importance of predicting how species distributions will respond to anthropogenic change. Unfortunately, species distribution models (SDMs) often perform poorly when applied to novel environments. Compounding this problem is the shortage of methods for evaluating SDMs (hence, we may be getting our predictions wrong and not even know it). Traditional methods for validating SDMs quantify a model's ability to classify locations as used or unused. Instead, we propose to focus on how well SDMs can predict the characteristics of used locations. This subtle shift in viewpoint leads to a more natural and informative evaluation and validation of models across the entire spectrum of SDMs. Through a series of examples, we show how simple graphical methods can help with three fundamental challenges of habitat modeling: identifying missing covariates, non-linearity, and multicollinearity. Identifying habitat characteristics that are not well-predicted by the model can provide insights into variables affecting the distribution of species, suggest appropriate model modifications, and ultimately improve the reliability and generality of conservation and management recommendations.

  11. Feature selection for anomaly–based network intrusion detection using cluster validity indices

    CSIR Research Space (South Africa)

    Naidoo, Tyrone

    2015-09-01

    Full Text Available data, which is rarely available in operational networks. It uses normalized cluster validity indices as an objective function that is optimized over the search space of candidate feature subsets via a genetic algorithm. Feature sets produced...
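
    The approach described above (a genetic algorithm searching over candidate feature subsets, scored by a normalized cluster validity index) can be sketched in miniature. The toy data, the simplified validity index, and all parameters below are illustrative assumptions, not the paper's setup:

```python
import random

# Toy data: two clusters separated only in features 0 and 1;
# features 2 and 3 are pure noise (a stand-in for network traffic features).
random.seed(1)

def make_point(cx, cy):
    return [cx + random.gauss(0, 0.3), cy + random.gauss(0, 0.3),
            random.gauss(0, 3.0), random.gauss(0, 3.0)]

data = [make_point(0, 0) for _ in range(20)] + [make_point(5, 5) for _ in range(20)]
labels = [0] * 20 + [1] * 20

def validity(mask):
    """Toy cluster validity index over the selected features: distance
    between cluster centroids divided by mean within-cluster spread
    (higher is better). A real system would use a normalized index such
    as silhouette or Davies-Bouldin computed after clustering."""
    feats = [i for i, keep in enumerate(mask) if keep]
    if not feats:
        return 0.0
    def centroid(lab):
        pts = [d for d, l in zip(data, labels) if l == lab]
        return [sum(p[i] for p in pts) / len(pts) for i in feats]
    c0, c1 = centroid(0), centroid(1)
    between = sum((a - b) ** 2 for a, b in zip(c0, c1)) ** 0.5
    within = sum(
        sum((d[i] - (c0 if l == 0 else c1)[j]) ** 2
            for j, i in enumerate(feats)) ** 0.5
        for d, l in zip(data, labels)
    ) / len(data)
    return between / (within + 1e-9)

def select_features(n_feats=4, pop_size=12, generations=15, p_mut=0.2):
    """Minimal genetic algorithm over feature bitmasks."""
    pop = [[random.randint(0, 1) for _ in range(n_feats)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=validity, reverse=True)
        elite = pop[:pop_size // 2]          # keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n_feats)
            child = a[:cut] + b[cut:]        # one-point crossover
            if random.random() < p_mut:      # bit-flip mutation
                child[random.randrange(n_feats)] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=validity)

best = select_features()
print(best)
```

With this toy data, subsets that keep the two informative features and drop the noise dimensions score far higher on the index, which is what the search converges toward.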

  12. Do Current Recommendations for Upper Instrumented Vertebra Predict Shoulder Imbalance? An Attempted Validation of Level Selection for Adolescent Idiopathic Scoliosis.

    Science.gov (United States)

    Bjerke, Benjamin T; Cheung, Zoe B; Shifflett, Grant D; Iyer, Sravisht; Derman, Peter B; Cunningham, Matthew E

    2015-10-01

    Shoulder balance for adolescent idiopathic scoliosis (AIS) patients is associated with patient satisfaction and self-image. However, few validated systems exist for selecting the upper instrumented vertebra (UIV) to optimize post-surgical shoulder balance. The purpose of this study was to examine existing UIV selection criteria and correlate them with post-surgical shoulder balance in AIS patients. Patients who underwent spinal fusion at age 10-18 years for AIS over a 6-year period were reviewed. All patients with a minimum of 1-year radiographic follow-up were included. Imbalance was determined to be radiographic shoulder height |RSH| ≥ 15 mm at latest follow-up. Three UIV selection methods were considered: Lenke, Ilharreborde, and Trobisch. A recommended UIV was determined using each method from pre-surgical radiographs. The recommended UIV for each method was compared to the actual UIV instrumented for all three methods; concordance between these levels was defined as "Correct" UIV selection, and discordance was defined as "Incorrect" selection. One hundred seventy-one patients were included with 2.3 ± 1.1 year follow-up. For all methods, "Correct" UIV selection resulted in more shoulder imbalance than "Incorrect" UIV selection. Overall shoulder imbalance incidence improved from 31.0% (53/171) to 15.2% (26/171). New shoulder imbalance incidence for patients with previously level shoulders was 8.8%. We could not identify a set of UIV selection criteria that accurately predicted post-surgical shoulder balance. Further validated measures are needed in this area. The complexity of proximal thoracic curve correction is underscored in a case example, where shoulder imbalance occurred despite "Correct" UIV selection by all methods.

  13. Rapid and selective derivatization method for the nitrogen-sensitive detection of carboxylic acids in biological fluids prior to gas chromatographic analysis

    NARCIS (Netherlands)

    Lingeman, H.; Haan, H.B.P.; Hulshoff, A.

    1984-01-01

    A rapid and selective derivatization procedure is described for the pre-column labelling of carboxylic acids with a nitrogen-containing label. The carboxylic acid function is activated with 2-bromo-1-methylpyridinium iodide and the activated carboxylic acid function reacts with a primary or a

  14. Use of selective cyclooxygenase-2 inhibitors and nonselective nonsteroidal antiinflammatory drugs in high doses increases mortality and risk of reinfarction in patients with prior myocardial infarction

    DEFF Research Database (Denmark)

    Sørensen, Rikke; Abildstrøm, Steen Zabell; Torp-Pedersen, C.

    2008-01-01

    The selective cyclooxygenase-2 (COX-2) inhibitors and other nonselective nonsteroidal antiinflammatory drugs (NSAIDs) have been associated with increased cardiovascular risk, but the risk in patients with established cardiovascular disease is unknown. In the present study, we analyzed the risk of...

  15. A Selected Reaction Monitoring Mass Spectrometry Protocol for Validation of Proteomic Biomarker Candidates in Studies of Psychiatric Disorders.

    Science.gov (United States)

    Reis-de-Oliveira, Guilherme; Garcia, Sheila; Guest, Paul C; Cassoli, Juliana S; Martins-de-Souza, Daniel

    2017-01-01

    Most biomarker candidates arising from proteomic studies of psychiatric disorders have not progressed for use in clinical studies due to insufficient validation steps. Here we describe a selected reaction monitoring mass spectrometry (SRM-MS) approach that could be used as a follow-up validation tool for proteins identified in blood serum or plasma. This protocol specifically covers the stages of peptide selection and optimization. The increasing application of SRM-MS should enable fast, sensitive, and robust methods with the potential for use in clinical studies involving sampling of serum or plasma. Understanding the molecular mechanisms and identifying potential biomarkers for risk assessment, diagnosis, prognosis, and prediction of drug response go toward the implementation of translational medicine strategies for improved treatment of patients with psychiatric disorders and other debilitating diseases.

  16. Development, Validation and Summative Evaluation of Card Pairing Games for Selected Math 8 Topics

    Directory of Open Access Journals (Sweden)

    Ronald O. Ocampo

    2015-12-01

    Full Text Available In a traditional classroom, where students are taught predominantly by the lecture-discussion method, the atmosphere can become mathophobic. Students exposed to such an atmosphere often develop math anxiety and eventually come to dislike the subject and the teacher. To address this, varied interactive strategies that create an atmosphere of discourse have been developed and promoted. The use of instructional games is one strategy that promotes active learning inside the classroom; instructional games support constructivist learning and social learning. This study aimed at developing, validating and evaluating card pairing games for specific topics in Math 8. The Research and Development (R&D) model was used. The card pairing games were validated by subject experts and experts in developing games. In evaluating the card pairing games, a quasi-experimental pretest-posttest design was used. Six card pairing games were developed for specific topics in Math 8; the games were rated highly valid in the validation; students exposed to the card pairing games became more homogeneous in performance; and students exposed to the card games showed enhanced academic performance. It is recommended that the effectiveness of the card pairing games be tested with other groups of students, that math teachers be encouraged to use the developed games for classroom instruction, and that card pairing games be developed for other topics in math.

  18. Reliability and Validity of Selected PROMIS Measures in People with Rheumatoid Arthritis.

    Directory of Open Access Journals (Sweden)

    Susan J Bartlett

    Full Text Available To evaluate the reliability and validity of 11 PROMIS measures to assess symptoms and impacts identified as important by people with rheumatoid arthritis (RA). Consecutive patients (N = 177) in an observational study completed PROMIS computer adapted tests (CATs) and a short form (SF) assessing pain, fatigue, physical function, mood, sleep, and participation. We assessed test-retest reliability and internal consistency using correlation and Cronbach's alpha. We assessed convergent validity by examining Pearson correlations between PROMIS measures and existing measures of similar domains, and known-groups validity by comparing scores across disease activity levels using ANOVA. Participants were mostly female (82%) and white (83%) with mean (SD) age of 56 (13) years; 24% had ≤ high school, 29% had RA ≤ 5 years with 13% ≤ 2 years, and 22% were disabled. PROMIS Physical Function, Pain Interference and Fatigue instruments correlated moderately to strongly (rho's ≥ 0.68) with corresponding PROs. Test-retest reliability ranged from .725 to .883, and Cronbach's alpha from .906 to .991. A dose-response relationship with disease activity was evident in Physical Function, with similar trends in other scales except Anger. These data provide preliminary evidence of reliability and construct validity of PROMIS CATs to assess RA symptoms and impacts, and feasibility of use in clinical care. PROMIS instruments captured the experiences of RA patients across the broad continuum of RA symptoms and function, especially at low disease activity levels. Future research is needed to evaluate performance in relevant subgroups, assess responsiveness and identify clinically meaningful changes.
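
    For reference, the Cronbach's alpha figures reported above follow the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with made-up item scores (not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items: `items` is a list of k columns,
    each a list of n respondent scores.
    alpha = k/(k-1) * (1 - sum(item variances) / var(total scores))."""
    k = len(items)
    n = len(items[0])
    def var(xs):                      # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three items answered by five respondents (hypothetical Likert scores)
items = [[3, 4, 4, 2, 5],
         [3, 5, 4, 2, 4],
         [2, 4, 5, 1, 5]]
print(round(cronbach_alpha(items), 3))  # -> 0.922
```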

  19. Concurrent Validation of Experimental Army Enlisted Personnel Selection and Classification Measures

    Science.gov (United States)

    2007-08-01

    Megan Shay, Mary Warthen, Gordon Waugh, and Shelly West. 3 The sample sizes in Table 2.1 represent the number of participants who completed Soldier...scores. This finding casts serious doubt on a fundamental assumption underlying the construction and validation of interest-based P-E fit measures... cast on dominant arm, using other arm to respond). The final sample size was 755. Psychometric Properties Table 12.1 reports the means, standard

  20. Selecting the "Best" Factor Structure and Moving Measurement Validation Forward: An Illustration.

    Science.gov (United States)

    Schmitt, Thomas A; Sass, Daniel A; Chappelle, Wayne; Thompson, William

    2018-04-09

    Despite the broad literature base on factor analysis best practices, research seeking to evaluate a measure's psychometric properties frequently fails to consider or follow these recommendations. This leads to incorrect factor structures, numerous and often overly complex competing factor models and, perhaps most harmful, biased model results. Our goal is to demonstrate a practical and actionable process for factor analysis through (a) an overview of six statistical and psychometric issues and approaches to be aware of, investigate, and report when engaging in factor structure validation, along with a flowchart for recommended procedures to understand latent factor structures; (b) demonstrating these issues to provide a summary of the updated Posttraumatic Stress Disorder Checklist (PCL-5) factor models and a rationale for validation; and (c) conducting a comprehensive statistical and psychometric validation of the PCL-5 factor structure to demonstrate all the issues we described earlier. Considering previous research, the PCL-5 was evaluated using a sample of 1,403 U.S. Air Force remotely piloted aircraft operators with high levels of battlefield exposure. Previously proposed PCL-5 factor structures were not supported by the data, but instead a bifactor model is arguably more statistically appropriate.

  1. Integrated account of method, site selection and programme prior to the site investigation phase [Planning for a Swedish repository for spent nuclear fuels]

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-12-01

    applications and have these applications reviewed by the appropriate authorities. An analysis of conceivable alternatives for managing and disposing of spent nuclear fuel has confirmed that deep geological disposal according to the KBS-3 method has the best prospects of meeting all requirements. The alternative of putting off a decision until some future time (the zero alternative) does not appear tenable. The assessment of long-term safety shows that the prospects of building a safe deep repository in the Swedish bedrock are good. Independent Swedish and international review of the safety assessment confirms that the body of data in this respect is adequate for the siting process to proceed to the site investigation phase. A fuller summary is given below of the account given in this report of method as well as site selection and programme for the site investigation phase. The point of departure for the account is the review comments made by the regulatory authorities and the Government's decision regarding RD&D-Programme 98. In its decision, the Government stipulated conditions for SKB's continued research and development programme. The analysis of alternative system designs was to be supplemented, mainly with regard to the zero alternative and very deep boreholes. Furthermore, the Government decided that SKB shall submit an integrated evaluation of completed feasibility studies and other background material for selection of sites for site investigations and present a clear programme for site investigations.

  2. Optimization and validation of highly selective microfluidic integrated silicon nanowire chemical sensor

    Science.gov (United States)

    Ehfaed, Nuri. A. K. H.; Bathmanathan, Shillan A. L.; Dhahi, Th S.; Adam, Tijjani; Hashim, Uda; Noriman, N. Z.

    2017-09-01

    The study proposes the characterization and optimization of a silicon nanosensor for specific detection of heavy metals. The sensor was fabricated in-house by conventional photolithography coupled with size reduction via a dry etching process in an oxidation furnace. Prior to heavy metal detection, the capability to measure aqueous samples was determined utilizing serial DI water dilutions. The sensor surface was modified with the organofunctional alkoxysilane (3-aminopropyl)triethoxysilane (APTES) to create molecular binding chemistry. This allowed interaction between the heavy metals being measured and the sensor component, resulting in an increase in the measured current. Due to its excellent detection capabilities, the sensor was able to identify heavy metal species from different groups. The device was further integrated with sub-50 µm microfluidics for chemical delivery.

  3. Risk prediction models for selection of lung cancer screening candidates: A retrospective validation study

    NARCIS (Netherlands)

    K. ten Haaf (Kevin); J. Jeon (Jihyoun); M.C. Tammemagi (Martin); S.S. Han (Summer); C.Y. Kong (Chung Yin); S.K. Plevritis (Sylvia); E. Feuer (Eric); H.J. de Koning (Harry); E.W. Steyerberg (Ewout W.); R. Meza (Rafael)

    2017-01-01

    textabstractBackground: Selection of candidates for lung cancer screening based on individual risk has been proposed as an alternative to criteria based on age and cumulative smoking exposure (pack-years). Nine previously established risk models were assessed for their ability to identify those most

  4. The predictive validity of personality tests in air traffic controller selection

    NARCIS (Netherlands)

    Roe, R.A.; Oprins, E.A.P.B.; Geven, E.

    2012-01-01

    A brief historical review of test methods used for selecting air traffic controllers (ATCOs) shows that in contrast to e.g. ability tests and job samples, personality tests have been used rather infrequently. The lesser popularity of personality tests may be explained from the belief that

  5. Regional differences in the validity of self-reported use of health care in Belgium: selection versus reporting bias

    Directory of Open Access Journals (Sweden)

    J. Van der Heyden

    2016-08-01

    Full Text Available Abstract Background The Health Care Module of the European Health Interview Survey (EHIS) is aimed to obtain comparable information on the use of inpatient and ambulatory care in all EU member states. In this study we assessed the validity of self-reported information on the use of health care, collected through this instrument, in the Belgian Health Interview Survey (BHIS), and explored the impact of selection and reporting bias on the validity of regional differences in health care use observed in the BHIS. Methods To assess reporting bias, self-reported BHIS 2008 data were linked with register-based data from the Belgian compulsory health insurance (BCHI). The latter were compared with similar estimates from a random sample of the BCHI to investigate the selection bias. Outcome indicators included the prevalence of a contact with a GP, specialist, dentist and a physiotherapist, as well as inpatient and day patient hospitalisation. The validity of the estimates and the regional differences were explored through measures of agreement and logistic regression analyses. Results Validity of self-reported health care use varies by type of health service and is more affected by reporting than by selection bias. Compared to health insurance estimates, self-reported results underestimate the percentage of people with a specialist contact in the past year (50.5 % versus 65.0 %) and a day patient hospitalisation (7.8 % versus 13.9 %). Inversely, survey results overestimated the percentage of people having visited a dentist in the past year: 58.3 % versus 48.6 %. The best concordance was obtained for an inpatient hospitalisation (kappa 0.75). Survey data overestimate the higher prevalence of a contact with a specialist [OR 1.51 (95 % CI 1.33–1.72) for self-report and 1.08 (95 % CI 1.05–1.15) for register] and underestimate the lower prevalence of a contact with a GP [ORs 0.59 (95 % CI 0.51–0.70) and 0.41 (95 % CI 0.39–0.42), respectively] in
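
    The kappa statistic quoted for inpatient hospitalisation is Cohen's kappa, the chance-corrected agreement between the two data sources. A minimal sketch with hypothetical paired observations (not BHIS data):

```python
def cohens_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two paired
    categorical ratings of the same n subjects."""
    n = len(a)
    cats = sorted(set(a) | set(b))
    p_obs = sum(x == y for x, y in zip(a, b)) / n                 # observed agreement
    p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical: inpatient stay in the past year (1 = yes), 10 respondents
self_report = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]
register    = [1, 0, 0, 1, 0, 0, 0, 1, 1, 0]
print(round(cohens_kappa(self_report, register), 2))  # -> 0.58
```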

  6. A competency based selection procedure for Dutch postgraduate GP training: a pilot study on validity and reliability.

    Science.gov (United States)

    Vermeulen, Margit I; Tromp, Fred; Zuithoff, Nicolaas P A; Pieters, Ron H M; Damoiseaux, Roger A M J; Kuyvenhoven, Marijke M

    2014-12-01

    Background: Historically, semi-structured interviews (SSI) have been the core of the Dutch selection for postgraduate general practice (GP) training. This paper describes a pilot study on a newly designed competency-based selection procedure that assesses whether candidates have the competencies that are required to complete GP training. The objective was to explore reliability and validity aspects of the instruments developed. The new selection procedure, comprising the National GP Knowledge Test (LHK), a situational judgement test (SJT), a patterned behaviour descriptive interview (PBDI) and a simulated encounter (SIM), was piloted alongside the current procedure. Forty-seven candidates volunteered in both procedures. Admission decision was based on the results of the current procedure. Study participants hardly differed from the other candidates. The mean scores of the candidates on the LHK and SJT were 21.9% (SD 8.7) and 83.8% (SD 3.1), respectively. The mean self-reported competency scores (PBDI) were higher than the observed competencies (SIM): 3.7 (SD 0.5) and 2.9 (SD 0.6), respectively. Content-related competencies showed low correlations with one another when measured with different instruments, whereas more diverse competencies measured by a single instrument showed strong to moderate correlations. Moreover, a moderate correlation between LHK and SJT was found. The internal consistencies (intraclass correlation, ICC) of LHK and SJT were poor, while the ICCs of PBDI and SIM showed acceptable levels of reliability. Findings on content validity and reliability of these new instruments are promising for realizing a competency-based procedure. Further development of the instruments and research on predictive validity should be pursued.

  7. A design of a valid signal selecting and position decoding ASIC for PET using silicon photomultipliers

    International Nuclear Information System (INIS)

    Cho, M.; Lim, K.-T.; Kim, J.; Lee, C.; Cho, G.; Kim, H.; Yeom, J.-Y.; Choi, H.

    2017-01-01

    In most cases, a PET system has numerous electrical components and channel circuits and thus tends to be bulky. Also, most existing systems receive analog signals from detectors, which makes them vulnerable to signal distortions. For these reasons, channel reduction techniques are important. In this work, an ASIC for PET modules is proposed. An ASIC chip for 16 PET detector channels, VSSPDC, has been designed and simulated. The main function of the chip is 16-to-1 channel reduction, i.e., finding the position of only the valid signals, signal timing, and magnitudes in all 16 channels at every recorded event. The ASIC comprises four 4-channel modules and a 2nd 4-to-1 router. A single channel module comprises a transimpedance amplifier for the silicon photomultipliers, dual comparators with high and low level references, and a logic circuitry. While the high level reference was used to test the validity of the signal, the low level reference was used for the timing. The 1-channel module of the ASIC produced an energy pulse by the time-over-threshold method and it also produced a time pulse with a fixed delay time. Since the ASIC chip outputs only a few digital pulses and does not require an external clock, it has an advantage in noise properties. The Cadence simulation showed the good performance of the chip as designed.
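
    The dual-comparator scheme described above can be illustrated with a toy digitization: the low-level reference gives the time pulse (first threshold crossing) while the high-level reference gives an energy pulse whose time-over-threshold width encodes the signal magnitude. Everything below (pulse shape, thresholds, sampling step) is a hypothetical illustration, not the VSSPDC design:

```python
def time_over_threshold(samples, dt, high, low):
    """Toy model of the dual-comparator channel: the low-level reference
    yields the timing (first crossing) and the high-level reference
    yields an energy pulse whose width (time over threshold) encodes
    the signal magnitude."""
    t_low = next((i * dt for i, s in enumerate(samples) if s > low), None)
    width = sum(dt for s in samples if s > high)
    return t_low, width

# Hypothetical pulse sampled every 1 ns
pulse = [0, 1, 3, 6, 9, 7, 5, 3, 1, 0]
t, tot = time_over_threshold(pulse, 1.0, high=4.0, low=2.0)
print(t, tot)  # -> 2.0 4.0
```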

  8. The Validity and Incremental Validity of Knowledge Tests, Low-Fidelity Simulations, and High-Fidelity Simulations for Predicting Job Performance in Advanced-Level High-Stakes Selection

    Science.gov (United States)

    Lievens, Filip; Patterson, Fiona

    2011-01-01

    In high-stakes selection among candidates with considerable domain-specific knowledge and experience, investigations of whether high-fidelity simulations (assessment centers; ACs) have incremental validity over low-fidelity simulations (situational judgment tests; SJTs) are lacking. Therefore, this article integrates research on the validity of…

  9. Spin-selective recombination reactions of radical pairs: Experimental test of validity of reaction operators

    Energy Technology Data Exchange (ETDEWEB)

    Maeda, Kiminori [Department of Chemistry, University of Oxford, Centre for Advanced Electron Spin Resonance, Inorganic Chemistry Laboratory, Oxford (United Kingdom); Liddell, Paul; Gust, Devens [Department of Chemistry and Biochemistry, Arizona State University, Tempe, Arizona, 85287-1604 (United States); Hore, P. J. [Department of Chemistry, University of Oxford, Physical and Theoretical Chemistry Laboratory, Oxford (United Kingdom)

    2013-12-21

    Spin-selective reactions of radical pairs are conventionally modelled using an approach that dates back to the 1970s [R. Haberkorn, Mol. Phys. 32, 1491 (1976)]. An alternative approach based on the theory of quantum measurements has recently been suggested [J. A. Jones and P. J. Hore, Chem. Phys. Lett. 488, 90 (2010)]. We present here the first experimental attempt to discriminate between the two models. Pulsed electron paramagnetic resonance spectroscopy has been used to investigate intramolecular electron transfer in the radical pair form of a carotenoid-porphyrin-fullerene molecular triad. The rate of spin-spin relaxation of the fullerene radical in the triad was found to be inconsistent with the quantum measurement description of the spin-selective kinetics, and in accord with the conventional model when combined with spin-dephasing caused by rotational modulation of the anisotropic g-tensor of the fullerene radical.

  10. Literature Review: Validity and Potential Usefulness of Psychomotor Ability Tests for Personnel Selection and Classification

    Science.gov (United States)

    1988-04-01

    processing capabilities. Craik and Lockhart (1972), for example, investigated limitations in the ability to store information in short-term storage...F. I. M., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671...preferable to an information processing taxonomy for purposes of the current selection and classification research. First, on a theoretical level

  11. Development and Validation of Measures for Selecting Soldiers for the Officer Candidate School

    Science.gov (United States)

    2011-08-01

    SJT, there has been a debate about what SJTs actually measure and why they work (cf. Moss & Hunt, 1926; Thorndike, 1936), a debate that continues...meta-analytic review and integration. Psychological Bulletin, 129, 914-945. Thorndike, R. L. (1936). Factor analysis of social and abstract...intelligence. The Journal of Educational Psychology, XXVII, 231–233. Thorndike, R. L. (1949). Personnel selection: Test and measurement techniques. New York

  12. Computer-aided test selection and result validation-opportunities and pitfalls

    DEFF Research Database (Denmark)

    McNair, P; Brender, J; Talmon, J

    1998-01-01

    Dynamic test scheduling is concerned with pre-analytical preprocessing of the individual samples within a clinical laboratory production by means of decision algorithms. The purpose of such scheduling is to provide maximal information with minimal data production (to avoid data pollution and/or to increase cost-efficiency). Our experience shows that there is a practical limit to the extent of exploitation of the principle of dynamic test scheduling, unless it is automated in one way or the other. This paper analyses some issues of concern related to the profession of clinical biochemistry, when implementing such dynamic test scheduling within a Laboratory Information System (and/or an advanced analytical workstation). The challenge is related to 1) generation of appropriately validated decision models, and 2) mastering consequences of analytical imprecision and bias.

  13. Site selection and directional models of deserts used for ERBE validation targets

    Science.gov (United States)

    Staylor, W. F.

    1986-01-01

    Broadband shortwave and longwave radiance measurements obtained from the Nimbus 7 Earth Radiation Budget scanner were used to develop reflectance and emittance models for the Sahara, Gibson, and Saudi Deserts. These deserts will serve as in-flight validation targets for the Earth Radiation Budget Experiment being flown on the Earth Radiation Budget Satellite and two National Oceanic and Atmospheric Administration polar satellites. The directional reflectance model derived for the deserts was a function of the sum and product of the cosines of the solar and viewing zenith angles, and thus reciprocity existed between these zenith angles. The emittance model was related by a power law of the cosine of the viewing zenith angle.
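
    The directional models described above depend on geometry only through symmetric combinations of the zenith-angle cosines, which is what produces the reciprocity noted in the abstract. A schematic form consistent with that description (illustrative only; a, b, c, ε₀ and k stand in for the paper's fitted constants):

```latex
% \mu_s = \cos\theta_s (solar), \quad \mu_v = \cos\theta_v (viewing)
% Reflectance: reciprocity follows because \rho depends only on the
% symmetric combinations \mu_s + \mu_v and \mu_s\mu_v.
\rho(\theta_s,\theta_v) = a + \frac{b}{\mu_s + \mu_v} + c\,\mu_s\mu_v
% Emittance: power law in the viewing-zenith cosine
\varepsilon(\theta_v) = \varepsilon_0\,\mu_v^{k}
```

Swapping θs and θv leaves ρ unchanged, which is the reciprocity property between the zenith angles.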

  14. Constrained noninformative priors

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-10-01

    The Jeffreys noninformative prior distribution for a single unknown parameter is the distribution corresponding to a uniform distribution in the transformed model where the unknown parameter is approximately a location parameter. To obtain a prior distribution with a specified mean but with diffusion reflecting great uncertainty, a natural generalization of the noninformative prior is the distribution corresponding to the constrained maximum entropy distribution in the transformed model. Examples are given
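
    Schematically, the construction can be written as follows (a sketch of the general recipe, not Atwood's exact notation): let φ be the transformed parameter in which θ is approximately a location parameter, so the Jeffreys prior is flat in φ. Maximizing entropy relative to that flat density, subject to a mean constraint on θ, tilts it exponentially:

```latex
% Flat (Jeffreys) prior in the transformed parameter: f(\phi) \propto 1
% Constrained maximum entropy with E[\theta] = \mu_0:
f(\phi) \propto e^{\lambda\,\theta(\phi)},
\qquad \lambda \text{ solving } \int \theta(\phi)\, f(\phi)\, d\phi = \mu_0
% Transforming back to \theta (Jacobian \propto \sqrt{I(\theta)}):
f(\theta) \propto \sqrt{I(\theta)}\; e^{\lambda\,\theta}
```

where I(θ) is the Fisher information; λ = 0 recovers the unconstrained Jeffreys prior.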

  15. Validation and characterization of a novel method for selective vagal deafferentation of the gut.

    Science.gov (United States)

    Diepenbroek, Charlene; Quinn, Danielle; Stephens, Ricky; Zollinger, Benjamin; Anderson, Seth; Pan, Annabelle; de Lartigue, Guillaume

    2017-10-01

    There is a lack of tools that selectively target vagal afferent neurons (VAN) innervating the gut. We use saporin (SAP), a potent neurotoxin, conjugated to the gastrointestinal (GI) hormone cholecystokinin (CCK-SAP) injected into the nodose ganglia (NG) of male Wistar rats to specifically ablate GI-VAN. We report that CCK-SAP ablates a subpopulation of VAN in culture. In vivo, CCK-SAP injection into the NG reduces VAN innervating the mucosal and muscular layers of the stomach and small intestine but not the colon, while leaving vagal efferent neurons intact. CCK-SAP abolishes feeding-induced c-Fos in the NTS, as well as satiation by CCK or glucagon-like peptide-1 (GLP-1). CCK-SAP in the NG of mice also abolishes CCK-induced satiation. Therefore, we provide multiple lines of evidence that injection of CCK-SAP in NG is a novel selective vagal deafferentation technique of the upper GI tract that works in multiple vertebrate models. This method provides improved tissue specificity and superior separation of afferent and efferent signaling compared with vagotomy, capsaicin, and subdiaphragmatic deafferentation. NEW & NOTEWORTHY We develop a new method that allows targeted lesioning of vagal afferent neurons that innervate the upper GI tract while sparing vagal efferent neurons. This reliable approach provides superior tissue specificity and selectivity for vagal afferent over efferent targeting compared with traditional approaches. It can be used to address questions about the role of gut to brain signaling in physiological and pathophysiological conditions. Copyright © 2017 the American Physiological Society.

  16. Modeling and Experimental Validation of the Electron Beam Selective Melting Process

    Directory of Open Access Journals (Sweden)

    Wentao Yan

    2017-10-01

    Full Text Available Electron beam selective melting (EBSM) is a promising additive manufacturing (AM) technology. The EBSM process consists of three major procedures: ① spreading a powder layer, ② preheating to slightly sinter the powder, and ③ selectively melting the powder bed. The highly transient multi-physics phenomena involved in these procedures pose a significant challenge for in situ experimental observation and measurement. To advance the understanding of the physical mechanisms in each procedure, we leverage high-fidelity modeling and post-process experiments. The models resemble the actual fabrication procedures, including ① a powder-spreading model using the discrete element method (DEM), ② a phase field (PF) model of powder sintering (solid-state sintering), and ③ a powder-melting (liquid-state sintering) model using the finite volume method (FVM). Comprehensive insights into all the major procedures are provided, which have rarely been reported. Preliminary simulation results (including powder particle packing within the powder bed, sintering neck formation between particles, and single-track defects) agree qualitatively with experiments, demonstrating the ability to understand the mechanisms and to guide the design and optimization of the experimental setup and manufacturing process.

  17. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study.

    Science.gov (United States)

    Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.
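The PRx referenced here is conventionally computed as a moving Pearson correlation between slow waves of arterial blood pressure and intracranial pressure. A generic sketch on synthetic data (window and step sizes are illustrative, not the authors' implementation):

```python
import numpy as np

def prx(abp, icp, window=30, step=10):
    """Pressure reactivity index: mean of Pearson correlations between
    mean arterial pressure and ICP over sliding windows (samples are
    typically 10-second averages, so window=30 covers about 5 minutes)."""
    corrs = []
    for start in range(0, len(abp) - window + 1, step):
        a = abp[start:start + window]
        i = icp[start:start + window]
        corrs.append(np.corrcoef(a, i)[0, 1])
    return float(np.mean(corrs))

rng = np.random.default_rng(0)
abp = rng.normal(90, 5, 600)
# Impaired autoregulation: ICP passively follows ABP, so PRx approaches +1.
icp_passive = 10 + 0.5 * abp + rng.normal(0, 0.5, 600)
# Intact autoregulation: ICP independent of slow ABP waves, so PRx stays near 0.
icp_intact = 10 + rng.normal(0, 2, 600)
print(round(prx(abp, icp_passive), 2), round(prx(abp, icp_intact), 2))
```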

  18. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study

    Directory of Open Access Journals (Sweden)

    Martin A. Proescholdt

    2017-01-01

    Full Text Available Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.

  19. Heat transfer in plate heat exchanger channels: Experimental validation of selected correlation equations

    Directory of Open Access Journals (Sweden)

    Cieśliński Janusz T.

    2016-09-01

    Full Text Available This study is focused on experimental investigation of a selected type of brazed plate heat exchanger (PHEx). The Wilson plot approach was applied in order to estimate heat transfer coefficients for the PHEx passages. The main aim of the paper was to experimentally check the ability of several correlations published in the literature to predict heat transfer coefficients, by comparing experimentally obtained data with the corresponding predictions. The results obtained revealed that the Hausen and Dittus-Boelter correlations underestimated the heat transfer coefficient for the tested PHEx by an order of magnitude. The Aspen Plate code overestimated the heat transfer coefficient by about 50%, while the Muley-Manglik correlation overestimated it by 1% to 25%, depending on the value of the Reynolds number and the hot or cold liquid side.
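The Wilson plot method mentioned above separates the unknown film coefficient from the resistances held constant by exploiting a linear relationship: 1/U plotted against v^-0.8 yields the constant resistances as the intercept. A sketch with hypothetical numbers, assuming the common h ∝ v^0.8 turbulent-flow form:

```python
import numpy as np

# Synthetic data: assumed cold-side coefficient h_cold = C * v**0.8, plus a
# fixed hot-side + wall resistance R_fixed (both values are invented).
C, R_fixed = 2000.0, 2.0e-4
v = np.linspace(0.5, 2.5, 10)             # cold-side velocity, m/s
U = 1.0 / (R_fixed + 1.0 / (C * v**0.8))  # "measured" overall coefficient

# Wilson plot: 1/U is linear in v**-0.8; the slope gives 1/C and the
# intercept collects the resistances that were held constant.
x = v ** -0.8
slope, intercept = np.polyfit(x, 1.0 / U, 1)
print(round(1.0 / slope), round(intercept, 6))  # recovers C and R_fixed
```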

  20. The validation of the Minnesota Job Satisfaction Questionnaire in selected organisations in South Africa

    Directory of Open Access Journals (Sweden)

    Johanna H. Buitendach

    2009-04-01

    Full Text Available The objectives of this study were to assess the construct equivalence of the Minnesota Job Satisfaction Questionnaire (MSQ), and to investigate the manifestation of job satisfaction at selected organisations in South Africa. A cross-sectional survey design with a random sample (N = 474) was used. The MSQ and a biographical questionnaire were administered. The results confirmed a two-factor model of job satisfaction, consisting of extrinsic job satisfaction and intrinsic job satisfaction. Exploratory factor analysis with target rotations confirmed the construct equivalence of the scales for the black and white groups. The results obtained from comparing job satisfaction levels of various demographic groups showed that practically significant differences existed between the job satisfaction of different age and race groups.

  1. Assessment of somatotype in young volleyball players: Validity as criteria to select young sports talents

    Directory of Open Access Journals (Sweden)

    Moisés de Hoyo Lora

    2008-07-01

    Full Text Available http://dx.doi.org/10.5007/1980-0037.2008v10n3p255 The anthropometric characteristics of athletes can determine their sporting performance. For this reason, we’ve defined the somatotype of young volleyball players in order to be able to control their sports training and to ensure their appropriate athletic development. In the present investigation 154 male and female volleyball players (aged from 12 to 14 years) were analyzed. Data were collected according to the ISAK protocol. The results show an endomesomorphic profile for male and female volleyball players agreeing with the predominant profile at these ages. However, after comparing these data with results obtained in other studies, we observed a certain homogeneity in the male somatotype, invalidating the current trend of using this parameter as criteria to select young sports talent. However, somatotype could be a factor to take into account with female athletes, since their profile is much more heterogeneous.

  2. Optimizing Training Population Data and Validation of Genomic Selection for Economic Traits in Soft Winter Wheat

    Directory of Open Access Journals (Sweden)

    Amber Hoffstetter

    2016-09-01

    Full Text Available Genomic selection (GS) is a breeding tool that estimates breeding values (GEBVs) of individuals based solely on marker data, using a model built from phenotypic and marker data of a training population (TP). The effectiveness of GS increases as the correlation between GEBVs and phenotypes (accuracy) increases. Using phenotypic and genotypic data from a TP of 470 soft winter wheat lines, we assessed the accuracy of GS for grain yield, Fusarium Head Blight (FHB) resistance, softness equivalence (SE), and flour yield (FY). Four TP data sampling schemes were tested: (1) use all TP data, (2) use subsets of TP lines with low genotype-by-environment interaction, (3) use subsets of markers significantly associated with quantitative trait loci (QTL), and (4) a combination of 2 and 3. We also correlated the phenotypes of relatives of the TP to their GEBVs calculated from TP data. The GS accuracy within the TP using all TP data ranged from 0.35 (FHB) to 0.62 (FY). On average, the accuracy of GS from using subsets of data increased by 54% relative to using all TP data. Using subsets of markers selected for significant association with the target trait had the greatest impact on GS accuracy. Between-environment prediction accuracy was also increased by using data subsets. The accuracy of GS when predicting the phenotypes of TP relatives ranged from 0.00 to 0.85. These results suggest that GS could be useful for these traits and that GS accuracy can be greatly improved by using subsets of TP data.
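GS accuracy as defined here is the correlation between GEBVs and observed phenotypes. A minimal ridge-regression (rrBLUP-style) sketch on simulated marker data (the population sizes, marker count, and shrinkage parameter are illustrative; the study's actual model is not specified in this record):

```python
import numpy as np

rng = np.random.default_rng(1)
n_lines, n_markers = 470, 300          # sizes loosely mirroring the TP
X = rng.choice([-1.0, 0.0, 1.0], size=(n_lines, n_markers))   # marker genotypes
true_effects = rng.normal(0, 0.1, n_markers)
y = X @ true_effects + rng.normal(0, 1.0, n_lines)            # phenotypes

train, test = np.arange(350), np.arange(350, n_lines)

def gebv(X_tr, y_tr, X_te, lam=50.0):
    """Estimate marker effects by ridge regression, then compute GEBVs."""
    p = X_tr.shape[1]
    beta = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(p),
                           X_tr.T @ (y_tr - y_tr.mean()))
    return X_te @ beta

pred = gebv(X[train], y[train], X[test])
accuracy = np.corrcoef(pred, y[test])[0, 1]   # GS accuracy = cor(GEBV, phenotype)
print(round(accuracy, 2))
```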

  3. Evaluation of recruitment and selection for specialty training in public health: interim results of a prospective cohort study to measure the predictive validity of the selection process.

    Science.gov (United States)

    Pashayan, Nora; Gray, Selena; Duff, Celia; Parkes, Julie; Williams, David; Patterson, Fiona; Koczwara, Anna; Fisher, Grant; Mason, Brendan W

    2016-06-01

    The recruitment process for public health specialty training includes an assessment centre (AC) with three components, the Rust Advanced Numerical Reasoning Appraisal (RANRA), the Watson-Glaser Critical Thinking Appraisal (WGCT) and a Situational Judgement Test (SJT), which determines invitation to a selection centre (SC). The scores are combined into a total recruitment (TR) score that determines the offers of appointment. A prospective cohort study used anonymous record linkage to investigate the association between applicants' scores in the recruitment process and registrars' progress through training, measured by results of Membership of the Faculty of Public Health (MFPH) examinations and outcomes of the Annual Review of Competence Progression (ARCP). Higher scores in RANRA, WGCT, AC, SC and TR were all significantly associated with higher adjusted odds of passing the Part A MFPH exam at the first attempt. Higher scores in AC, SC and TR were significantly associated with passing the Part B exam at the first attempt. Higher scores in SJT, AC and SC were significantly associated with satisfactory ARCP outcomes. The current UK national recruitment and selection process for public health specialty training has good predictive validity. The individual components of the process test different skills and abilities and together provide additive value. © The Author 2015. Published by Oxford University Press on behalf of the Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. Risk prediction models for selection of lung cancer screening candidates: A retrospective validation study.

    Directory of Open Access Journals (Sweden)

    Kevin Ten Haaf

    2017-04-01

    Full Text Available Selection of candidates for lung cancer screening based on individual risk has been proposed as an alternative to criteria based on age and cumulative smoking exposure (pack-years). Nine previously established risk models were assessed for their ability to identify those most likely to develop or die from lung cancer. All models considered age and various aspects of smoking exposure (smoking status, smoking duration, cigarettes per day, pack-years smoked, time since smoking cessation) as risk predictors. In addition, some models considered factors such as gender, race, ethnicity, education, body mass index, chronic obstructive pulmonary disease, emphysema, personal history of cancer, personal history of pneumonia, and family history of lung cancer. Retrospective analyses were performed on 53,452 National Lung Screening Trial (NLST) participants (1,925 lung cancer cases and 884 lung cancer deaths) and 80,672 Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO) ever-smoking participants (1,463 lung cancer cases and 915 lung cancer deaths). Six-year lung cancer incidence and mortality risk predictions were assessed for (1) calibration (graphically, by comparing the agreement between the predicted and the observed risks), (2) discrimination (area under the receiver operating characteristic curve [AUC]) between individuals with and without lung cancer (death), and (3) clinical usefulness (net benefit in decision curve analysis) by identifying risk thresholds at which applying risk-based eligibility would improve lung cancer screening efficacy. To further assess performance, risk model sensitivities and specificities in the PLCO were compared to those based on the NLST eligibility criteria. Calibration was satisfactory, but discrimination ranged widely (AUCs from 0.61 to 0.81). The models outperformed the NLST eligibility criteria over a substantial range of risk thresholds in decision curve analysis, with a higher sensitivity for all models and a
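Two of the three assessment criteria, discrimination (AUC) and clinical usefulness (net benefit), can be sketched on synthetic data as follows (the risk distributions and the threshold are made up for illustration):

```python
import numpy as np

def auc(risk, outcome):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    pos, neg = risk[outcome == 1], risk[outcome == 0]
    return ((pos[:, None] > neg[None, :]).mean()
            + 0.5 * (pos[:, None] == neg[None, :]).mean())

def net_benefit(risk, outcome, threshold):
    """Net benefit at a risk threshold, as used in decision curve analysis:
    TP/n - FP/n * threshold/(1 - threshold)."""
    n = len(outcome)
    treat = risk >= threshold
    tp = np.sum(treat & (outcome == 1))
    fp = np.sum(treat & (outcome == 0))
    return tp / n - fp / n * threshold / (1 - threshold)

rng = np.random.default_rng(2)
outcome = rng.binomial(1, 0.02, 20000)   # roughly 2% six-year event rate
risk = np.clip(0.02 + 0.03 * outcome + rng.normal(0, 0.01, 20000), 0.001, 0.999)
print(round(auc(risk, outcome), 2), round(net_benefit(risk, outcome, 0.015), 4))
```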

  5. Selection and validation of reference genes for miRNA expression studies during porcine pregnancy.

    Directory of Open Access Journals (Sweden)

    Jocelyn M Wessels

    Full Text Available MicroRNAs comprise a family of small non-coding RNAs that modulate several developmental and physiological processes, including pregnancy. Their ubiquitous presence is confirmed in mammals, worms, flies and plants. Although rapid advances have been made in microRNA research, information on stable reference genes for validation of microRNA expression is still lacking. Real-time PCR is a widely used tool to quantify gene transcripts. An appropriate reference gene must be chosen to minimize experimental error in this system. A small difference in miRNA levels between experimental samples can be biologically meaningful, as these entities can affect multiple targets in a pathway. This study examined the suitability of six commercially available reference genes (RNU1A, RNU5A, RNU6B, SNORD25, SCARNA17, and SNORA73A) in maternal-fetal tissues from healthy and spontaneously arresting/dying conceptuses of sows, analyzed separately at gestation day 20. Comparisons were also made with non-pregnant endometrial tissues from sows. Spontaneous fetal loss is a prime concern to the commercial pork industry. Our laboratory has previously identified deficits in vasculature development at the maternal-fetal interface as one of the major participating causes of fetal loss. Using this well-established model, we have extended our studies to identify suitable microRNA reference genes. A methodical approach to assessing suitability was adopted using standard curve and melting curve analysis, PCR product sequencing, real-time PCR expression in a panel of gestational tissues, and geNorm and NormFinder analysis. Our quantitative real-time PCR analysis confirmed expression of all 6 reference genes in maternal and fetal tissues. All genes were uniformly expressed in tissues from healthy and spontaneously arresting conceptus attachment sites. Comparisons between tissue types (maternal/fetal/non-pregnant) revealed significant differences for RNU5A, RNU6B, SCARNA17, and SNORA73A

  6. Dual-Material Electron Beam Selective Melting: Hardware Development and Validation Studies

    Directory of Open Access Journals (Sweden)

    Chao Guo

    2015-03-01

    Full Text Available Electron beam selective melting (EBSM is an additive manufacturing technique that directly fabricates three-dimensional parts in a layerwise fashion by using an electron beam to scan and melt metal powder. In recent years, EBSM has been successfully used in the additive manufacturing of a variety of materials. Previous research focused on the EBSM process of a single material. In this study, a novel EBSM process capable of building a gradient structure with dual metal materials was developed, and a powder-supplying method based on vibration was put forward. Two different powders can be supplied individually and then mixed. Two materials were used in this study: Ti6Al4V powder and Ti47Al2Cr2Nb powder. Ti6Al4V has excellent strength and plasticity at room temperature, while Ti47Al2Cr2Nb has excellent performance at high temperature, but is very brittle. A Ti6Al4V/Ti47Al2Cr2Nb gradient material was successfully fabricated by the developed system. The microstructures and chemical compositions were characterized by optical microscopy, scanning microscopy, and electron microprobe analysis. Results showed that the interface thickness was about 300 μm. The interface was free of cracks, and the chemical compositions exhibited a staircase-like change within the interface.

  7. Selecting and validating reference genes for quantitative real-time PCR in Plutella xylostella (L.).

    Science.gov (United States)

    You, Yanchun; Xie, Miao; Vasseur, Liette; You, Minsheng

    2018-05-01

    Gene expression analysis provides important clues regarding gene functions, and quantitative real-time PCR (qRT-PCR) is a widely used method in gene expression studies. Reference genes are essential for normalizing and accurately assessing gene expression. In the present study, 16 candidate reference genes (ACTB, CyPA, EF1-α, GAPDH, HSP90, NDPk, RPL13a, RPL18, RPL19, RPL32, RPL4, RPL8, RPS13, RPS4, α-TUB, and β-TUB) from Plutella xylostella were selected to evaluate gene expression stability across different experimental conditions using five statistical algorithms (geNorm, NormFinder, Delta Ct, BestKeeper, and RefFinder). The results suggest that different reference genes or combinations of reference genes are suitable for normalization in gene expression studies of P. xylostella according to the different developmental stages, strains, tissues, and insecticide treatments. Based on the given experimental sets, the most stable reference genes were RPS4 across different developmental stages, RPL8 across different strains and tissues, and EF1-α across different insecticide treatments. A comprehensive and systematic assessment of potential reference genes for gene expression normalization is essential for post-genomic functional research in P. xylostella, a notorious pest with worldwide distribution and a high capacity to adapt and develop resistance to insecticides.
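Of the five algorithms listed, geNorm is the most commonly reimplemented: its stability measure M for a gene is the mean standard deviation of that gene's pairwise log-ratios with every other candidate, with lower M indicating more stable expression. A sketch on simulated data (not the study's data):

```python
import numpy as np

def genorm_m(log2_expr):
    """geNorm expression-stability measure M for each candidate gene.

    log2_expr: samples x genes matrix of log2 expression values.
    M[j] = mean over other genes k of std(log2_expr[:, j] - log2_expr[:, k]);
    lower M means more stable expression."""
    n_genes = log2_expr.shape[1]
    m = np.zeros(n_genes)
    for j in range(n_genes):
        ratios = log2_expr[:, [j]] - log2_expr    # pairwise log-ratios
        sds = ratios.std(axis=0, ddof=1)
        m[j] = np.delete(sds, j).mean()           # exclude the gene itself
    return m

rng = np.random.default_rng(3)
base = rng.normal(0, 1, (40, 1))               # shared sample-to-sample variation
stable = base + rng.normal(0, 0.05, (40, 3))   # three stable candidates
unstable = base + rng.normal(0, 1.0, (40, 1))  # one unstable candidate
m = genorm_m(np.hstack([stable, unstable]))
print(m.argmax())  # prints 3: the unstable gene has the highest M
```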

  8. MRI and neuropathological validations of the involvement of air pollutants in cortical selective neuronal loss.

    Science.gov (United States)

    Ejaz, Sohail; Anwar, Khaleeq; Ashraf, Muhammad

    2014-03-01

    Vehicles are a major source of air pollution, especially particulate matter (PM) pollution, throughout the world, and auto-rickshaws are considered main contributors to this air pollution. PM, in addition to causing respiratory and cardiovascular disorders, has the potential to gain access to the brain and could induce neuroinflammation leading to different neurological disorders. Therefore, in the current project, MRI and immunohistochemistry techniques were adopted to ascertain the neurotoxic potential of chronic exposure to different PM generated by two-stroke auto-rickshaws (TSA), four-stroke auto-rickshaws (FSA), and aluminum sulfate (AS) solution in rats. The results highlighted that all treated groups followed a pattern of dose-dependent increase in pure cortical neuronal loss, selective neuronal loss (SNL), nuclear pyknosis, karyolysis, and karyorrhexis. Mild to moderate areas of penumbra were also observed, with an increase in the population of activated microglia and astrocytes, while no alteration in the intensities of T2W MRI signals was perceived in any group. Comparing the findings, TSA possesses more neurotoxic potential than FSA and AS, which could be associated with the increased concentration of certain elements in TSA emissions. The study concludes that chronic exposure to PM from TSA, FSA, and AS solution produces diverse neuropathies in the brain, which may lead to different life-threatening neurological disorders such as stroke, Alzheimer's disease, and Parkinson's disease. Government and environmental agencies should take serious notice of this alarming situation, and immediate steps should be implemented to improve the standards of PM emissions from auto-rickshaws.

  9. Measurement error correction in the least absolute shrinkage and selection operator model when validation data are available.

    Science.gov (United States)

    Vasquez, Monica M; Hu, Chengcheng; Roe, Denise J; Halonen, Marilyn; Guerra, Stefano

    2017-01-01

    Measurement of serum biomarkers by multiplex assays may be more variable than by single-biomarker assays. Measurement error in these data may bias parameter estimates in regression analysis, which could mask true associations of serum biomarkers with an outcome. The Least Absolute Shrinkage and Selection Operator (LASSO) can be used for variable selection in these high-dimensional data. Furthermore, when the distribution of measurement error is assumed to be known or estimated with replication data, a simple measurement error correction method can be applied to the LASSO. In practice, however, the distribution of the measurement error is unknown and is expensive to estimate through replication, both in monetary cost and in the need for a greater amount of sample, which is often limited in quantity. We adapt an existing bias correction approach by estimating the measurement error using validation data, in which a subset of serum biomarkers is re-measured on a random subset of the study sample. We evaluate this method using simulated data and data from the Tucson Epidemiological Study of Airway Obstructive Disease (TESAOD). We show that the bias in parameter estimation is reduced and variable selection is improved.
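A minimal sketch of the corrected-Gram idea behind measurement-error-adjusted lasso, in which an error variance estimated from validation re-measurements is subtracted from the diagonal of the Gram matrix. This follows the general corrected-lasso construction, not necessarily the authors' exact estimator, and the error-free re-measurement in the validation subset is an idealizing assumption:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 500, 10
X_true = rng.normal(0, 1, (n, p))
beta = np.zeros(p)
beta[:3] = [1.0, -0.8, 0.5]                        # three true signals
y = X_true @ beta + rng.normal(0, 0.5, n)
sigma_me = 0.6
X_obs = X_true + rng.normal(0, sigma_me, (n, p))   # error-prone measurements

# Validation subset: the first 100 subjects are re-measured with an
# (assumed) error-free assay, giving an estimate of the error variance.
val = np.arange(100)
sigma2_hat = np.mean((X_obs[val] - X_true[val]) ** 2, axis=0)

def corrected_lasso(X, y, sigma2, lam, iters=200):
    """Coordinate-descent lasso on the bias-corrected Gram matrix
    X'X/n - diag(sigma2), which removes the attenuation caused by
    additive measurement error."""
    n, p = X.shape
    G = X.T @ X / n - np.diag(sigma2)
    c = X.T @ y / n
    b = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            r = c[j] - G[j] @ b + G[j, j] * b[j]   # partial residual
            b[j] = np.sign(r) * max(abs(r) - lam, 0.0) / G[j, j]
    return b

b_corr = corrected_lasso(X_obs, y, sigma2_hat, lam=0.02)
b_naive = corrected_lasso(X_obs, y, np.zeros(p), lam=0.02)
# The correction reduces attenuation of the nonzero coefficients.
print(np.abs(b_corr[:3] - beta[:3]).sum() < np.abs(b_naive[:3] - beta[:3]).sum())
```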

  10. Empirical Validation of a Hypothesis of the Hormetic Selective Forces Driving the Evolution of Longevity Regulation Mechanisms

    Directory of Open Access Journals (Sweden)

    Alejandra Gomez-Perez

    2016-12-01

    Full Text Available Exogenously added lithocholic bile acid and some other bile acids slow down yeast chronological aging by eliciting a hormetic stress response and altering mitochondrial functionality. Unlike animals, yeast cells do not synthesize bile acids. We therefore hypothesized that bile acids released into an ecosystem by animals may act as interspecies chemical signals that generate selective pressure for the evolution of longevity regulation mechanisms in yeast within this ecosystem. To empirically verify our hypothesis, in this study we carried out a 3-step process for the selection of long-lived yeast species by a long-term exposure to exogenous lithocholic bile acid. Such experimental evolution yielded 20 long-lived mutants, 3 of which were capable of sustaining their considerably prolonged chronological lifespans after numerous passages in medium without lithocholic acid. The extended longevity of each of the 3 long-lived yeast species was a dominant polygenic trait caused by mutations in more than two nuclear genes. Each of the 3 mutants displayed considerable alterations to the age-related chronology of mitochondrial respiration and showed enhanced resistance to chronic oxidative, thermal and osmotic stresses. Our findings empirically validate the hypothesis suggesting that hormetic selective forces can drive the evolution of longevity regulation mechanisms within an ecosystem.

  11. Validation of KDRI/KDPI for the selection of expanded criteria kidney donors

    Directory of Open Access Journals (Sweden)

    Raimundo M. García del Moral Martín

    2018-05-01

    Full Text Available Introduction: KDRI/KDPI are tools used in kidney donor evaluation. They have been proposed as a substitute for, or complement to, preimplantation renal biopsy. These scores have not been validated in Spain. Objective: (1) To investigate the concordance between KDPI and histological scores (preimplantation renal biopsy) and (2) to assess the relationship between KDRI, KDPI and histological score on graft survival in the expanded criteria donor group. Methodology: Retrospective cohort study from 1 January 1998 to 31 December 2010. Results: During the study 120 donors were recruited, resulting in 220 preimplantation renal biopsies. 144 (65%) grafts were considered suitable for kidney transplantation; 76 (34.5%) were discarded. Median follow-up was 6.4 years (SD 3.9). Median age was 63.1 years (SD 8.2); donors were mostly male (145; 65.9%), non-diabetic (191; 86.8%) and without other cardiovascular risk factors (173; 78.6%). 153 (69.5%) donors died of cerebrovascular disease. There were significant differences in KDRI/KDPI score between the two groups: 1.56/89 (SD 0.22) vs 1.66/93 (SD 0.15), p < 0.01. The KDPI showed moderate concordance and correlation with the histological score (AUC 0.64/correlation coefficient 0.24, p < 0.01). KDPI (HR 24.3, p < 0.01) and KDRI (HR 23.3, p < 0.01) scores were associated with graft survival in multivariate analysis. Conclusion: (1) KDPI and histological scores show moderate concordance; the utility of both scores as combined tools has yet to be determined. (2) The KDPI score, and especially the KDRI score, are valid for estimating graft survival and, combined with the biopsy, can support individualized decision making in the expanded criteria donor pool.

  12. Bayesian model selection validates a biokinetic model for zirconium processing in humans

    Science.gov (United States)

    2012-01-01

    Background In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction for radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology. PMID:22863152
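Thermodynamic integration estimates a log marginal likelihood by integrating the expected log-likelihood under power posteriors over an inverse-temperature ladder; the log Bayes factor is then the difference of two such estimates. A 1-D toy sketch (importance sampling from the prior stands in for the paper's copula-based Metropolis-Hastings sampler; the data and priors are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
data = rng.normal(1.0, 1.0, 50)   # toy observations

def log_lik(mu, x):
    # Gaussian log-likelihood with known unit variance.
    return -0.5 * len(x) * np.log(2 * np.pi) - 0.5 * np.sum((x - mu) ** 2)

def log_evidence_ti(x, prior_sd, n_mc=50000):
    """Thermodynamic integration: log Z = integral over t in [0, 1] of
    E_{p_t}[log L], where p_t ∝ prior * likelihood^t (the power posterior).
    Expectations are estimated by self-normalized importance sampling
    from the prior, which is feasible for this 1-D model."""
    temps = np.linspace(0.0, 1.0, 41) ** 5   # ladder concentrated near t = 0
    mu = rng.normal(0.0, prior_sd, n_mc)     # draws from the prior
    ll = np.array([log_lik(m, x) for m in mu])
    means = np.empty_like(temps)
    for i, t in enumerate(temps):
        logw = t * ll
        w = np.exp(logw - logw.max())        # power-posterior weights
        means[i] = np.sum(w * ll) / np.sum(w)
    # Trapezoidal rule over the temperature ladder.
    return float(np.sum(np.diff(temps) * (means[1:] + means[:-1]) / 2.0))

# Bayes factor comparing a tight prior on the mean with a more diffuse one;
# the tighter prior that still covers the truth should be favoured.
log_bf = log_evidence_ti(data, prior_sd=1.0) - log_evidence_ti(data, prior_sd=10.0)
print(log_bf > 0)
```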

  13. Assessment of somatotype in young volleyball players: Validity as criteria to select young sports talents

    Directory of Open Access Journals (Sweden)

    Luís Carrasco Páez

    2008-06-01

    Full Text Available The anthropometric characteristics of athletes can determine their sporting performance. For this reason, we’ve defined the somatotype of young volleyball players in order to be able to control their sports training and to ensure their appropriate athletic development. In the present investigation 154 male and female volleyball players (aged from 12 to 14 years) were analyzed. Data were collected according to the ISAK protocol. The results show an endomesomorphic profile for male and female volleyball players agreeing with the predominant profile at these ages. However, after comparing these data with results obtained in other studies, we observed a certain homogeneity in the male somatotype, invalidating the current trend of using this parameter as criteria to select young sports talent. However, somatotype could be a factor to take into account with female athletes, since their profile is much more heterogeneous.

  14. Development and Validation of Marker-Aided Selection Methods for Wood Property Traits in Loblolly Pine and Hybrid Poplar; FINAL

    International Nuclear Information System (INIS)

    Tuskan, G.A.

    2001-01-01

    Wood properties influence pulp and paper quality. Certainly, overall pulp yields are directly related to the cellulose content, changes in hemicellulose content are associated with changes in pulp cohesiveness, and pulping efficiency is related to lignin content. Despite the importance of wood properties on product quality, little progress has been made in improving such traits because current methods of assessing wood and fiber characteristics are time-consuming, expensive, and often imprecise. Genetic improvement of wood and fiber properties has been further hampered by the large size of trees, delayed reproductive maturity and long harvest cycles. Recent developments in molecular genetics will help overcome the physical, economic and biological constraints in assessing and improving wood properties. Genetic maps consisting of numerous molecular markers are now available for loblolly pine and hybrid poplar. Such markers/maps may be used as part of a marker-aided selection and breeding effort or to expedite the isolation and characterization of genes and/or promoters that directly control wood properties. The objectives of this project are: (1) to apply new and rapid analytical techniques for assessing component wood properties to segregating F2 progeny populations of loblolly pine and hybrid poplar, (2) to map quantitative trait loci and identify molecular markers associated with wood properties in each of the above species and (3) to validate marker-aided selection methods for wood properties in loblolly pine and hybrid poplar.

  15. Development and Validation of Marker-Aided Selection Methods for Wood Property Traits in Loblolly Pine and Hybrid Poplar

    Energy Technology Data Exchange (ETDEWEB)

    Tuskan, G.A.

    2001-06-20

    Wood properties influence pulp and paper quality. Certainly, overall pulp yields are directly related to the cellulose content, changes in hemicellulose content are associated with changes in pulp cohesiveness, and pulping efficiency is related to lignin content. Despite the importance of wood properties on product quality, little progress has been made in improving such traits because current methods of assessing wood and fiber characteristics are time-consuming, expensive, and often imprecise. Genetic improvement of wood and fiber properties has been further hampered by the large size of trees, delayed reproductive maturity and long harvest cycles. Recent developments in molecular genetics will help overcome the physical, economic and biological constraints in assessing and improving wood properties. Genetic maps consisting of numerous molecular markers are now available for loblolly pine and hybrid poplar. Such markers/maps may be used as part of a marker-aided selection and breeding effort or to expedite the isolation and characterization of genes and/or promoters that directly control wood properties. The objectives of this project are: (1) to apply new and rapid analytical techniques for assessing component wood properties to segregating F2 progeny populations of loblolly pine and hybrid poplar, (2) to map quantitative trait loci and identify molecular markers associated with wood properties in each of the above species and (3) to validate marker-aided selection methods for wood properties in loblolly pine and hybrid poplar.

  16. Disentangling the Predictive Validity of High School Grades for Academic Success in University

    Science.gov (United States)

    Vulperhorst, Jonne; Lutz, Christel; de Kleijn, Renske; van Tartwijk, Jan

    2018-01-01

    To refine selective admission models, we investigate which measure of prior achievement has the best predictive validity for academic success in university. We compare the predictive validity of three core high school subjects to the predictive validity of high school grade point average (GPA) for academic achievement in a liberal arts university…

  17. The Prior-project

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Roued-Cunliffe, Henriette; Albretsen, Jørgen

    In this paper, we present a DH research infrastructure which relies heavily on a combination of domain knowledge with information technology. The general goal is to develop tools to aid scholars in their interpretations and understanding of temporal logic. This in turn is based on an extensive digitisation of Arthur Prior’s Nachlass kept in the Bodleian Library, Oxford. The DH infrastructure in question is the Prior Virtual Lab (PVL). PVL was established in 2011 in order to provide researchers in the field of temporal logic easy access to the papers of Arthur Norman Prior (1914-1969), and officially...

  18. Subcarinal lymph node in upper lobe non-small cell lung cancer patients: is selective lymph node dissection valid?

    Science.gov (United States)

    Aokage, Keiju; Yoshida, Junji; Ishii, Genichiro; Hishida, Tomoyuki; Nishimura, Mitsuyo; Nagai, Kanji

    2010-11-01

    Little is known about selective lymph node dissection in non-small cell lung cancer (NSCLC) patients. We sought to gain insight into subcarinal node involvement for its frequency and impact on outcome to evaluate whether it is valid to omit subcarinal lymph node dissection in upper lobe NSCLC patients. We reviewed node metastases distribution according to node region, tumor location, and histology among 1099 patients with upper lobe NSCLC. We paid special attention to subcarinal metastases patients without superior mediastinal node metastases, because their pathological stages would have been underdiagnosed if subcarinal node dissection had been omitted. We also assessed the outcome and the pattern of failure among subcarinal metastases patients. To identify subcarinal node involvement predictors, we analyzed 7 clinical factors. Subcarinal node metastases were found in 20 patients and were least frequent among squamous cell carcinoma patients (0.5%). Two of them were free from superior mediastinal metastases but died of the disease at 1 month and due to an unknown cause at 18 months, respectively. Seventeen of the 20 patients developed multi-site recurrence within 37 months. The 5-year survival rate of the 20 patients with subcarinal metastases was 9.0%, which was significantly lower than 32.0% of patients with only superior mediastinal metastases. Clinical diagnosis of node metastases was significantly predictive of subcarinal metastases. Subcarinal node metastases from upper lobe NSCLC were rare and predicted an extremely poor outcome. It appears valid to omit subcarinal node dissection in upper lobe NSCLC patients, especially in clinical N0 squamous cell carcinoma patients. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. Arthur Prior and 'Now'

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Jørgensen, Klaus Frovin

    2016-01-01

    Prior’s search led him through the work of Castañeda, and back to his own work on hybrid logic: the first made temporal reference philosophically respectable, the second made it technically feasible in a modal framework. With the aid of hybrid logic, Prior built a bridge from a two-dimensional UT calculus...

  20. Prior Knowledge Assessment Guide

    Science.gov (United States)

    2014-12-01

    assessment in a reasonable amount of time. Hands-on assessments can be extremely diverse in makeup and administration depending on the subject matter...

  1. Validation of candidate gene markers for marker-assisted selection of potato cultivars with improved tuber quality.

    Science.gov (United States)

    Li, Li; Tacke, Eckhard; Hofferbert, Hans-Reinhardt; Lübeck, Jens; Strahwald, Josef; Draffehn, Astrid M; Walkemeier, Birgit; Gebhardt, Christiane

    2013-04-01

    Tuber yield, starch content, starch yield and chip color are complex traits that are important for industrial uses and food processing of potato. Chip color depends on the quantity of the reducing sugars glucose and fructose in the tubers, which are generated by starch degradation. Reducing sugars accumulate when tubers are stored at low temperatures. Early and efficient selection of cultivars with superior yield, starch yield and chip color is hampered by the fact that reliable phenotypic selection requires multiple year and location trials. Application of DNA-based markers early in the breeding cycle, which are diagnostic for superior alleles of genes that control natural variation of tuber quality, will reduce the number of clones to be evaluated in field trials. Association mapping using genes functional in carbohydrate metabolism as markers has discovered alleles of invertases and starch phosphorylases that are associated with tuber quality traits. Here, we report on new DNA variants at loci encoding ADP-glucose pyrophosphorylase and the invertase Pain-1, which show positive or negative associations with chip color, tuber starch content and starch yield. Marker-assisted selection (MAS) and marker validation were performed in tetraploid breeding populations, using various combinations of 11 allele-specific markers associated with tuber quality traits. To facilitate MAS, user-friendly PCR assays were developed for specific candidate gene alleles. In a multi-parental population of advanced breeding clones, genotypes were selected for having different combinations of five positive and the corresponding negative marker alleles. Genotypes combining five positive marker alleles performed on average better than genotypes with four negative alleles and one positive allele. When tested individually, seven of eight markers showed an effect on at least one quality trait. The direction of effect was as expected. Combinations of two to three marker alleles were

  2. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Directory of Open Access Journals (Sweden)

    Ramos Hector

    2011-03-01

    Full Text Available Abstract Background: Since its inception, proteomics has essentially operated in a discovery mode with the goal of identifying and quantifying the maximal number of proteins in a sample. Increasingly, proteomic measurements are also supporting hypothesis-driven studies, in which a predetermined set of proteins is consistently detected and quantified in multiple samples. Selected reaction monitoring (SRM) is a targeted mass spectrometric technique that supports the detection and quantification of specific proteins in complex samples at high sensitivity and reproducibility. Here, we describe ATAQS, an integrated software platform that supports all stages of targeted, SRM-based proteomics experiments including target selection, transition optimization and post-acquisition data analysis. This software will significantly facilitate the use of targeted proteomic techniques and contribute to the generation of highly sensitive, reproducible and complete datasets that are particularly critical for the discovery and validation of targets in hypothesis-driven studies in systems biology. Results: We introduce a new open source software pipeline, ATAQS (Automated and Targeted Analysis with Quantitative SRM), which consists of a number of modules that collectively support the SRM assay development workflow for targeted proteomic experiments (project management; generation of proteins, peptides and transitions; and validation of peptide detection by SRM). ATAQS provides a flexible pipeline for end-users by allowing the workflow to start or end at any point of the pipeline, and for computational biologists, by enabling the easy extension of java algorithm classes for their own algorithm plug-in or connection via an external web site. This integrated system supports all steps in a SRM-based experiment and provides a user-friendly GUI that can be run by any operating system that allows the installation of the Mozilla Firefox web browser. Conclusions: Targeted

  3. Selection and validation of appropriate reference genes for quantitative real-time PCR analysis in Salvia hispanica.

    Directory of Open Access Journals (Sweden)

    Rahul Gopalam

    Full Text Available Quantitative real-time polymerase chain reaction (qRT-PCR) has become the most popular choice for gene expression studies. For accurate expression analysis, it is pertinent to select a stable reference gene to normalize the data. It is now known that the expression of internal reference genes varies considerably during developmental stages and under different experimental conditions. For Salvia hispanica, an economically important oilseed crop, there are no reports of stable reference genes to date. In this study, we chose 13 candidate reference genes, viz. Actin11 (ACT), Elongation factor 1-alpha (EF1-α), Eukaryotic translation initiation factor 3E (ETIF3E), alpha tubulin (α-TUB), beta tubulin (β-TUB), Glyceraldehyde 3-phosphate dehydrogenase (GAPDH), Cyclophilin (CYP), Clathrin adaptor complex (CAC), Serine/threonine-protein phosphatase 2A (PP2A), FtsH protease (FtsH), 18S ribosomal RNA (18S rRNA), S-adenosyl methionine decarboxylase (SAMDC) and Rubisco activase (RCA), and the expression levels of these genes were assessed in a diverse set of tissue samples representing vegetative stages, reproductive stages and various abiotic stress treatments. Two widely used software tools, geNorm and NormFinder, were used to evaluate the expression stabilities of these 13 candidate reference genes under different conditions. Results showed that GAPDH and CYP expression remained stable throughout the different abiotic stress treatments, CAC and PP2A expression were relatively stable in reproductive stages, and α-TUB, PP2A and ETIF3E were stably expressed in vegetative stages. Further, the expression levels of Diacylglycerol acyltransferase (DGAT1), a key enzyme in triacylglycerol synthesis, were analyzed to confirm the validity of the reference genes identified in the study. This is the first systematic study of selection of reference genes in S. hispanica, and it will benefit future expression studies in this crop.
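geNorm, one of the two tools used in the study above, ranks candidate reference genes by a stability value M: the average standard deviation of the pairwise log2 expression ratios of a gene with every other candidate (lower M = more stable). A minimal sketch with toy expression values, not the study's data:

```python
import math
from statistics import stdev

def genorm_m(expr):
    """geNorm-style stability value M for each candidate reference gene.

    expr maps gene name -> expression values across the same ordered samples.
    M(g) = mean over other genes k of stdev of log2(expr_g / expr_k).
    """
    genes = list(expr)
    m = {}
    for g in genes:
        variations = []
        for k in genes:
            if k == g:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[k])]
            variations.append(stdev(ratios))
        m[g] = sum(variations) / len(variations)
    return m

# toy data: the first two candidates scale together across samples (a stable
# pair), while the third fluctuates independently of them
expr = {
    "GAPDH": [1.0, 2.0, 4.0, 1.5],
    "CYP":   [2.0, 4.0, 8.0, 3.0],
    "RCA":   [1.0, 8.0, 2.0, 6.0],
}
m_values = genorm_m(expr)
```

Genes whose expression rises and falls in proportion across all samples keep constant pairwise ratios and therefore get low M; geNorm iteratively discards the highest-M gene until the most stable pair remains.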

  4. Design and Validation of a Cyclic Strain Bioreactor to Condition Spatially-Selective Scaffolds in Dual Strain Regimes

    Directory of Open Access Journals (Sweden)

    J. Matthew Goodhart

    2014-03-01

    Full Text Available The objective of this study was to design and validate a unique bioreactor design for applying spatially selective, linear, cyclic strain to degradable and non-degradable polymeric fabric scaffolds. This system uses a novel three-clamp design to apply cyclic strain via a computer-controlled linear actuator to a specified zone of a scaffold while isolating the remainder of the scaffold from strain. Image analysis of polyethylene terephthalate (PET) woven scaffolds subjected to a 3% mechanical stretch demonstrated that the stretched portion of the scaffold experienced 2.97% ± 0.13% strain (mean ± standard deviation), while the unstretched portion experienced 0.02% ± 0.18% strain. NIH-3T3 fibroblast cells were cultured on the PET scaffolds and half of each scaffold was stretched 5% at 0.5 Hz for one hour per day for 14 days in the bioreactor. Cells were checked for viability and proliferation at the end of the 14 day period, and levels of glycosaminoglycan (GAG) and collagen (hydroxyproline) were measured as indicators of extracellular matrix production. Scaffolds in the bioreactor showed a seven-fold increase in cell number over scaffolds cultured statically in tissue culture plastic petri dishes (control). Bioreactor scaffolds showed a lower concentration of GAG deposition per cell than the control scaffolds, largely due to the great increase in cell number. A 75% increase in hydroxyproline concentration per cell was seen in the bioreactor-stretched scaffolds compared to the control scaffolds. Surprisingly, few differences were observed between the stretched and unstretched portions of the scaffolds in this study. This was largely attributed to the conditioned and shared media effect. Results indicate that the bioreactor system is capable of applying spatially-selective, linear, cyclic strain to cells growing on polymeric fabric scaffolds and evaluating the cellular and matrix responses to the applied strains.

  5. Validation of American Thyroid Association Ultrasound Risk Assessment of Thyroid Nodules Selected for Ultrasound Fine-Needle Aspiration.

    Science.gov (United States)

    Tang, Alice L; Falciglia, Mercedes; Yang, Huaitao; Mark, Jonathan R; Steward, David L

    2017-08-01

    The aim of this study was to validate the American Thyroid Association (ATA) sonographic risk assessment of thyroid nodules. The ATA sonographic risk assessment was prospectively applied to 206 thyroid nodules selected for ultrasound-guided fine-needle aspiration (US-FNA), and analyzed with The Bethesda System for Reporting Thyroid Cytopathology (TBSRTC), as well as surgical pathology for the subset undergoing surgical excision. The analysis included 206 thyroid nodules averaging 2.4 cm (range 1-7 cm; standard error of the mean 0.07). Using the ATA US pattern risk assessment, nodules were classified as high (4%), intermediate (31%), low (38%), and very low (26%) risk of malignancy. Nodule size was inversely correlated with sonographic risk assessment, as lower risk nodules were larger on average (p < 0.05), and observed malignancy rates were consistent with ATA risk estimates (high 70-90%, intermediate 10-20%, low 5-10%, and very low 3%). ATA US pattern risk assessment also appropriately predicted the proportion of nodules classified as malignant or suspicious for malignancy through TBSRTC classification-high (77%), intermediate (6%), low (1%), and very low (0%)-as well as benign TBSRTC classification-high (0%), intermediate (47%), low (61%), and very low (70%) (p < 0.05). Malignancy on surgical pathology likewise followed ATA risk stratification (high 100%, intermediate 21%, low 17%, and very low 12%; p = 0.003). This prospective study supports the new ATA sonographic pattern risk assessment for selection of thyroid nodules for US-FNA based upon TBSRTC and surgical pathology results. In the setting of indeterminate cytopathology, nodules categorized as atypia of undetermined significance/follicular lesion of undetermined significance with ATA high-risk sonographic patterns have a high likelihood of being malignant.

  6. The predictive validity of a situational judgement test, a clinical problem solving test and the core medical training selection methods for performance in specialty training.

    Science.gov (United States)

    Patterson, Fiona; Lopes, Safiatu; Harding, Stephen; Vaux, Emma; Berkin, Liz; Black, David

    2017-02-01

    The aim of this study was to follow up a sample of physicians who began core medical training (CMT) in 2009. This paper examines the long-term validity of CMT and GP selection methods in predicting performance in the Membership of Royal College of Physicians (MRCP(UK)) examinations. We performed a longitudinal study, examining the extent to which the GP and CMT selection methods (T1) predict performance in the MRCP(UK) examinations (T2). A total of 2,569 applicants from 2008-09 who completed CMT and GP selection methods were included in the study. Looking at MRCP(UK) part 1, part 2 written and PACES scores, both CMT and GP selection methods show evidence of predictive validity for the outcome variables, and hierarchical regressions show the GP methods add significant value to the CMT selection process. CMT selection methods predict performance in important outcomes and have good evidence of validity; the GP methods may have an additional role alongside the CMT selection methods. © Royal College of Physicians 2017. All rights reserved.

  7. Quantitative investigation of Raman selection rules and validation of the secular equation for trigonal LiNbO3

    International Nuclear Information System (INIS)

    Pezzotti, Giuseppe; Hagihara, Hirofumi; Zhu Wenliang

    2013-01-01

    Some theoretical aspects of the vibrational behaviour of trigonal lithium niobate (LiNbO3) are studied and discussed in detail based on spectroscopic experimental assessments. Polarized Raman spectroscopy is systematically applied to retrieve the fundamental parameters governing the dependence of Raman intensity on crystallographic orientation, through quantitating the complete set of individual elements for the second-rank Raman tensors of the LiNbO3 cell (C3v (3m) point group, R3c space group). Moreover, computational algorithms are also explicitly constructed to describe the spectral shifts of the selected Raman bands when subjected to unknown stress fields. Accordingly, we have experimentally confirmed the validity of the secular equation for the trigonal cell and quantitatively substantiated its application through the determination of the full set of phonon deformation potentials for seven independent bands among those available in the LiNbO3 vibrational spectrum. Finally, a brief discussion is offered about the significance of the presented characterizations in the technological field of LiNbO3 devices, including the newly shown possibility of quantitatively and concurrently unfolding from polarized Raman spectra both crystallographic and mechanical information in their vectorial and tensorial nature, respectively. (paper)

  8. The predictive validity of the selection battery used for junior leader training within the South African national defence force

    Directory of Open Access Journals (Sweden)

    Johannes Muller

    2003-10-01

    Full Text Available The principal objective of the study was to determine the predictive validity of the test battery used for the selection of junior leaders in the South African National Defence Force. A sample of 96 respondents completed certain indices of the SPEEX-Battery as well as the Advanced Ravens Progressive Matrices test. The test results were compared with the course results. Using canonical correlation analysis, a highly significant relationship was found between the independent variables and the dependent variables (r = 0.787; p < 0.00005). The predictors with the highest loadings were cognitive ability, conceptualisation, reading comprehension, listening potential, physical stress, and mental stress.
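Canonical correlation analysis, as used in the study above, finds the linear combinations of the predictor set and the criterion set with maximal correlation. A minimal sketch of the standard QR-plus-SVD formulation (not the authors' code), using synthetic data in place of the SPEEX/Ravens scores and course results; all variables are hypothetical:

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two variable sets (rows = cases)."""
    Xc = X - X.mean(axis=0)          # center each predictor column
    Yc = Y - Y.mean(axis=0)          # center each criterion column
    Qx, _ = np.linalg.qr(Xc)         # orthonormal bases of the column spaces
    Qy, _ = np.linalg.qr(Yc)
    # singular values of Qx'Qy are the canonical correlations
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return np.clip(s, 0.0, 1.0)

rng = np.random.default_rng(0)
n = 500
latent = rng.normal(size=n)          # shared ability factor driving both sets
X = np.column_stack([latent + 0.3 * rng.normal(size=n) for _ in range(4)])  # "test indices"
Y = np.column_stack([latent + 0.3 * rng.normal(size=n) for _ in range(2)])  # "course results"
r = canonical_correlations(X, Y)     # r[0] is the first canonical correlation
```

Because every column loads on the same latent factor, the first canonical correlation is close to 1, analogous to the r = 0.787 reported above for the real battery.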

  9. Sets of priors reflecting prior-data conflict and agreement

    NARCIS (Netherlands)

    Walter, G.M.; Coolen, F.P.A.; Carvalho, J.P.; Lesot, M.-J.; Kaymak, U.; Vieira, S.; Bouchon-Meunier, B.; Yager, R.R.

    2016-01-01

    Bayesian inference enables combination of observations with prior knowledge in the reasoning process. The choice of a particular prior distribution to represent the available prior knowledge is, however, often debatable, especially when prior knowledge is limited or data are scarce, as then

  10. Validation of the ANSR(®) Listeria monocytogenes Method for Detection of Listeria monocytogenes in Selected Food and Environmental Samples.

    Science.gov (United States)

    Caballero, Oscar; Alles, Susan; Le, Quynh-Nhi; Gray, R Lucas; Hosking, Edan; Pinkava, Lisa; Norton, Paul; Tolan, Jerry; Mozola, Mark; Rice, Jennifer; Chen, Yi; Ryser, Elliot; Odumeru, Joseph

    2016-01-01

    Work was conducted to validate performance of the ANSR(®) Listeria monocytogenes method in selected food and environmental matrixes. This DNA-based assay involves amplification of nucleic acid via an isothermal reaction based on nicking enzyme amplification technology. Following single-step sample enrichment for 16-24 h for most matrixes, the assay is completed in 40 min using only simple instrumentation. When 50 distinct strains of L. monocytogenes were tested for inclusivity, 48 produced positive results, the exceptions being two strains confirmed by PCR to lack the assay target gene. Forty-seven nontarget strains (30 species), including multiple non-monocytogenes Listeria species as well as non-Listeria, Gram-positive bacteria, were tested, and all generated negative ANSR assay results. Performance of the ANSR method was compared with that of the U.S. Department of Agriculture, Food Safety and Inspection Service Microbiology Laboratory Guidebook reference culture procedure for detection of L. monocytogenes in hot dogs, pasteurized liquid egg, and sponge samples taken from an inoculated stainless steel surface. In addition, ANSR performance was measured against the U.S. Food and Drug Administration Bacteriological Analytical Manual reference method for detection of L. monocytogenes in Mexican-style cheese, cantaloupe, sprout irrigation water, and guacamole. With the single exception of pasteurized liquid egg at 16 h, ANSR method performance as quantified by the number of positives obtained was not statistically different from that of the reference methods. Robustness trials demonstrated that deliberate introduction of small deviations to the normal assay parameters did not affect ANSR method performance. Results of accelerated stability testing conducted using two manufactured lots of reagents predict stability at the specified storage temperature of 4°C for more than 1 year.

  11. High-throughput screening assay used in pharmacognosy: Selection, optimization and validation of methods of enzymatic inhibition by UV-visible spectrophotometry

    Directory of Open Access Journals (Sweden)

    Graciela Granados-Guzmán

    2014-02-01

    Full Text Available Research laboratories working in organic synthesis and in the extraction of natural products obtain, every day, many compounds that may exhibit biological activity. It is therefore necessary to have in vitro assays that provide reliable information for further evaluation in in vivo systems. From this point of view, the use of high-throughput screening assays has intensified in recent years. Such assays should be optimized and validated to give accurate and precise, i.e. reliable, results. The present review addresses the steps needed to develop and validate bioanalytical methods, emphasizing UV-Visible spectrophotometry as the detection system. It focuses in particular on the selection of the method, optimization to determine the best experimental conditions, validation, application of the optimized and validated method to real samples, and finally maintenance and possible transfer of the method to a new laboratory.

  12. A highly selective and sensitive ultrasonic assisted dispersive liquid phase microextraction based on deep eutectic solvent for determination of cadmium in food and water samples prior to electrothermal atomic absorption spectrometry.

    Science.gov (United States)

    Zounr, Rizwan Ali; Tuzen, Mustafa; Deligonul, Nihal; Khuhawar, Muhammad Yar

    2018-07-01

    A simple, fast, green, sensitive and selective ultrasonic-assisted deep eutectic solvent liquid-phase microextraction technique was used for preconcentration and extraction of cadmium (Cd) in water and food samples by electrothermal atomic absorption spectrometry (ETAAS). In this technique, a synthesized reagent, (Z)-N-(3,5-diphenyl-1H-pyrrol-2-yl)-3,5-diphenyl-2H-pyrrol-2-imine (Azo), was used as a complexing agent for Cd. The main factors affecting the preconcentration and extraction of Cd, such as pH, type and composition of the deep eutectic solvent (DES), volume of DES, volume of complexing agent, volume of tetrahydrofuran (THF) and ultrasonication time, were examined in detail. At optimum conditions the pH and the molar ratio of the DES were found to be 6.0 and 1:4 (ChCl:Ph), respectively. The detection limit (LOD), limit of quantification (LOQ), relative standard deviation (RSD) and preconcentration factor (PF) were 0.023 ng L-1, 0.161 ng L-1, 3.1% and 100, respectively. The developed technique was validated by extraction of Cd from certified reference materials (CRMs), and the observed results compared well with the certified values. The developed procedure was applied to various food, beverage and water samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
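The LOD and LOQ figures quoted above are typically derived from the scatter of blank measurements and the calibration slope. The sketch below uses the common ICH-style conventions (LOD = 3.3σ/S, LOQ = 10σ/S); the abstract does not state which convention the authors used, and the blank signals and slope below are hypothetical:

```python
import statistics

def detection_limits(blank_signals, slope):
    """ICH-style limits from blank scatter: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    sigma = statistics.stdev(blank_signals)   # standard deviation of blank replicates
    return 3.3 * sigma / slope, 10 * sigma / slope

blanks = [0.011, 0.013, 0.012, 0.010, 0.012, 0.011]  # hypothetical blank absorbances
slope = 0.045                                        # hypothetical calibration slope
lod, loq = detection_limits(blanks, slope)
```

Note that LOQ/LOD is fixed by the convention (10/3.3 ≈ 3.0 here), whereas the abstract's LOQ/LOD ratio is ~7, so the authors presumably used a different (e.g. signal-to-noise) criterion; the sketch shows only the generic calculation.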

  13. Prior indigenous technological species

    Science.gov (United States)

    Wright, Jason T.

    2018-01-01

    One of the primary open questions of astrobiology is whether there is extant or extinct life elsewhere in the solar system. Implicit in much of this work is that we are looking for microbial or, at best, unintelligent life, even though technological artefacts might be much easier to find. Search for Extraterrestrial Intelligence (SETI) work on searches for alien artefacts in the solar system typically presumes that such artefacts would be of extrasolar origin, even though life is known to have existed in the solar system, on Earth, for eons. But if a prior technological, perhaps spacefaring, species ever arose in the solar system, it might have produced artefacts or other technosignatures that have survived to present day, meaning solar system artefact SETI provides a potential path to resolving astrobiology's question. Here, I discuss the origins and possible locations for technosignatures of such a prior indigenous technological species, which might have arisen on ancient Earth or another body, such as a pre-greenhouse Venus or a wet Mars. In the case of Venus, the arrival of its global greenhouse and potential resurfacing might have erased all evidence of its existence on the Venusian surface. In the case of Earth, erosion and, ultimately, plate tectonics may have erased most such evidence if the species lived Gyr ago. Remaining indigenous technosignatures might be expected to be extremely old, limiting the places they might still be found to beneath the surfaces of Mars and the Moon, or in the outer solar system.

  14. On the selection and validation of biological treatment processes. The GDF experience; Le choix et la validation des procedes de traitement biologique. L`experience de GDF

    Energy Technology Data Exchange (ETDEWEB)

    Druelle, V. [Gaz de France (GDF), 75 - Paris (France)

    1996-12-31

    The biological treatment process was selected by Gaz de France (GDF), the French national gas utility, for the de-pollution of an old gas works where the main pollutants are coal tars containing polycyclic aromatic hydrocarbons. Microorganism-based biological treatment techniques may involve bio-reactors, static ground knolls (where oxygen is brought through drains) and dynamic knolls (where oxygenation is carried out by turning up the soil). Issues on sampling, sorting, process testing, site preparation, process control, etc. are reviewed

  15. On the selection and validation of biological treatment processes. The GDF experience; Le choix et la validation des procedes de traitement biologique. L`experience de GDF

    Energy Technology Data Exchange (ETDEWEB)

    Druelle, V [Gaz de France (GDF), 75 - Paris (France)

    1997-12-31

    The biological treatment process was selected by Gaz de France (GDF), the French national gas utility, for the de-pollution of an old gas works where the main pollutants are coal tars containing polycyclic aromatic hydrocarbons. Microorganism-based biological treatment techniques may involve bio-reactors, static ground knolls (where oxygen is brought through drains) and dynamic knolls (where oxygenation is carried out by turning up the soil). Issues on sampling, sorting, process testing, site preparation, process control, etc. are reviewed

  16. Attentional and Contextual Priors in Sound Perception.

    Science.gov (United States)

    Wolmetz, Michael; Elhilali, Mounya

    2016-01-01

    Behavioral and neural studies of selective attention have consistently demonstrated that explicit attentional cues to particular perceptual features profoundly alter perception and performance. The statistics of the sensory environment can also provide cues about what perceptual features to expect, but the extent to which these more implicit contextual cues impact perception and performance, as well as their relationship to explicit attentional cues, is not well understood. In this study, the explicit cues, or attentional prior probabilities, and the implicit cues, or contextual prior probabilities, associated with different acoustic frequencies in a detection task were simultaneously manipulated. Both attentional and contextual priors had similarly large but independent impacts on sound detectability, with evidence that listeners tracked and used contextual priors for a variety of sound classes (pure tones, harmonic complexes, and vowels). Further analyses showed that listeners updated their contextual priors rapidly and optimally, given the changing acoustic frequency statistics inherent in the paradigm. A Bayesian Observer model accounted for both attentional and contextual adaptations found with listeners. These results bolster the interpretation of perception as Bayesian inference, and suggest that some effects attributed to selective attention may be a special case of contextual prior integration along a feature axis.
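The Bayesian Observer invoked above can be sketched as iterated updating of a prior over candidate frequency channels: each trial's observation is scored against a likelihood per channel, and the normalized products become the next prior. The channel frequencies, noise width and observations below are hypothetical illustrations, not the study's parameters:

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian likelihood of an observed frequency x under channel center mu."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def update_prior(prior, observation, channels, sigma):
    """One Bayesian update of the contextual prior over frequency channels."""
    posterior = [p * normal_pdf(observation, mu, sigma) for p, mu in zip(prior, channels)]
    z = sum(posterior)
    return [p / z for p in posterior]

channels = [500.0, 1000.0, 2000.0]   # hypothetical probe frequencies (Hz)
prior = [1 / 3, 1 / 3, 1 / 3]        # flat contextual prior before any trials
# a run of trials in which targets cluster near the 1000 Hz channel
for obs in [1000.0, 1010.0, 995.0, 1005.0]:
    prior = update_prior(prior, obs, channels, sigma=50.0)
```

After a few trials the prior mass concentrates on the frequently probed channel, mirroring the rapid, near-optimal tracking of contextual priors reported for the listeners.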

  17. USAF Enlisted Air Traffic Controller Selection: Examination of the Predictive Validity of the FAA Air Traffic Selection and Training Battery versus Training Performance

    National Research Council Canada - National Science Library

    Carretta, Thomas R; King, Raymond E

    2008-01-01

    .... The current study examined the utility of the FAA Air Traffic Selection and Training (AT-SAT) battery for incrementing the predictiveness of the ASVAB versus several enlisted ATC training criteria...

  18. GENERAL ASPECTS REGARDING THE PRIOR DISCIPLINARY RESEARCH

    Directory of Open Access Journals (Sweden)

    ANDRA PURAN (DASCĂLU)

    2012-05-01

    Full Text Available Disciplinary research is the first phase of disciplinary action. According to art. 251 paragraph 1 of the Labour Code, no disciplinary sanction may be ordered before prior disciplinary research is performed. These regulations provide one exception: the sanction of written warning. The current regulations, retained from the old regulation, protect employees against abuses by employers, since sanctions affect the salary or the position held, or even the continuation of the individual employment contract. Thus, prior investigation of the act constituting misconduct, before a disciplinary sanction is applied, is an essential condition for the validity of the measure ordered. Through this study we try to highlight some general issues concerning the characteristics, procedures and effects of prior disciplinary research.

  19. Impact of entrainment and impingement on fish populations in the Hudson River estuary. Volume III. An analysis of the validity of the utilities' stock-recruitment curve-fitting exercise and prior estimation of beta technique. Environmental Sciences Division publication No. 1792

    International Nuclear Information System (INIS)

    Christensen, S.W.; Goodyear, C.P.; Kirk, B.L.

    1982-03-01

    This report addresses the validity of the utilities' use of the Ricker stock-recruitment model to extrapolate the combined entrainment-impingement losses of young fish to reductions in the equilibrium population size of adult fish. In our testimony, a methodology was developed and applied to address a single fundamental question: if the Ricker model really did apply to the Hudson River striped bass population, could the utilities' curve-fitting estimates of the parameter alpha (which controls the impact) be considered reliable? In addition, an analysis is included of the efficacy of an alternative means of estimating alpha, termed the technique of prior estimation of beta (used by the utilities in a report prepared for regulatory hearings on the Cornwall Pumped Storage Project). This validation methodology should also be useful in evaluating inferences drawn in the literature from fits of stock-recruitment models to data obtained from other fish stocks.

  20. A competency based selection procedure for Dutch postgraduate GP training: a pilot study on validity and reliability

    NARCIS (Netherlands)

    Vermeulen, M.I.; Tromp, F.; Zuithoff, N.P.; Pieters, R.H.; Damoiseaux, R.A.; Kuyvenhoven, M.M.

    2014-01-01

    Abstract Background: Historically, semi-structured interviews (SSI) have been the core of the Dutch selection for postgraduate general practice (GP) training. This paper describes a pilot study on a newly designed competency-based selection procedure that assesses whether candidates have the

  1. Validity of purchasing power parity for selected Latin American countries: Linear and non-linear unit root tests

    Directory of Open Access Journals (Sweden)

    Claudio Roberto Fóffano Vasconcelos

    2016-01-01

    Full Text Available The aim of this study is to examine empirically the validity of PPP in the context of unit root tests based on linear and non-linear models of the real effective exchange rate of Argentina, Brazil, Chile, Colombia, Mexico, Peru and Venezuela. For this purpose, we apply the Harvey et al. (2008) linearity test and the non-linear unit root test of Kruse (2011). The results show that the series with linear characteristics are Argentina, Brazil, Chile, Colombia and Peru and those with non-linear characteristics are Mexico and Venezuela. The linear unit root tests indicate that the real effective exchange rate is stationary for Chile and Peru, and the non-linear unit root tests evidence that Mexico is stationary. In the period analyzed, the results show support for the validity of PPP in only three of the seven countries.
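
    The linear unit root tests mentioned above regress the first difference of a series on its lagged level. A minimal Dickey-Fuller-style sketch on synthetic data illustrates the idea (an illustration only: this is not the Harvey et al. or Kruse procedure, and it omits the non-standard critical values needed for a formal test):

```python
import numpy as np

def df_tstat(y):
    """t-statistic on rho in: diff(y)_t = c + rho * y_{t-1} + e_t.
    A strongly negative t favours stationarity (against a unit root)."""
    dy, ylag = np.diff(y), y[:-1]
    x = ylag - ylag.mean()                 # demean for the slope formula
    rho = (x @ (dy - dy.mean())) / (x @ x)
    c = dy.mean() - rho * ylag.mean()
    resid = dy - c - rho * ylag
    s2 = (resid @ resid) / (len(dy) - 2)   # residual variance
    return rho / np.sqrt(s2 / (x @ x))

rng = np.random.default_rng(0)
e = rng.normal(size=500)
random_walk = np.cumsum(e)                 # unit root: t-stat near zero
ar1 = np.zeros(500)
for t in range(1, 500):                    # stationary AR(1), phi = 0.5
    ar1[t] = 0.5 * ar1[t - 1] + e[t]

print(df_tstat(random_walk))  # near zero: fail to reject a unit root
print(df_tstat(ar1))          # strongly negative: stationary
```

    Applied to a real effective exchange rate, rejecting the unit root is what the abstract calls evidence that the series is stationary, i.e. support for PPP.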

  2. Selecting short-statured children needing growth hormone testing: Derivation and validation of a clinical decision rule

    Directory of Open Access Journals (Sweden)

    Bréart Gérard

    2008-07-01

    Full Text Available Abstract Background: Numerous short-statured children are evaluated for growth hormone (GH) deficiency (GHD). In most patients, GH provocative tests are normal and are thus in retrospect unnecessary. Methods: A retrospective cohort study was conducted to identify predictors of GHD in children seen for short stature, and to construct a very sensitive and fairly specific predictive tool to avoid unnecessary GH provocative tests. GHD was defined by the presence of 2 GH concentration peaks Results: The initial study included 167 patients, 36 (22%) of whom had GHD, including 5 (3%) with certain GHD. Independent predictors of GHD were: growth rate Conclusion: We have derived and performed an internal validation of a highly sensitive decision rule that could safely help to avoid more than 2/3 of the unnecessary GH tests. External validation of this rule is needed before any application.

  3. Shuffling cross-validation-bee algorithm as a new descriptor selection method for retention studies of pesticides in biopartitioning micellar chromatography.

    Science.gov (United States)

    Zarei, Kobra; Atabati, Morteza; Ahmadi, Monire

    2017-05-04

    Bee algorithm (BA) is an optimization algorithm, inspired by the natural foraging behaviour of honey bees, that searches for an optimal solution and can be applied to feature selection. In this paper, shuffling cross-validation-BA (CV-BA) was applied to select the best descriptors to describe the retention factor (log k) in the biopartitioning micellar chromatography (BMC) of 79 heterogeneous pesticides. Six descriptors were obtained using BA, and the selected descriptors were then applied to model development using multiple linear regression (MLR). The descriptor selection was also performed using stepwise, genetic algorithm and simulated annealing methods, with MLR applied to model development, and the results were compared with those obtained from shuffling CV-BA. The results showed that shuffling CV-BA can be applied as a powerful descriptor selection method. Support vector machine (SVM) was also applied to model development using the six descriptors selected by BA. The statistical results obtained using SVM were better than those obtained using MLR: the root mean square error (RMSE) and correlation coefficient (R) for the whole data set (training and test) were 0.1863 and 0.9426, respectively, with shuffling CV-BA-MLR, versus 0.0704 and 0.9922 with shuffling CV-BA-SVM.

  4. Semi-physiologic model validation and bioequivalence trials simulation to select the best analyte for acetylsalicylic acid.

    Science.gov (United States)

    Cuesta-Gragera, Ana; Navarro-Fontestad, Carmen; Mangas-Sanjuan, Victor; González-Álvarez, Isabel; García-Arieta, Alfredo; Trocóniz, Iñaki F; Casabó, Vicente G; Bermejo, Marival

    2015-07-10

    The objective of this paper is to apply a previously developed semi-physiologic pharmacokinetic model implemented in NONMEM to simulate bioequivalence trials (BE) of acetylsalicylic acid (ASA) in order to validate the model performance against ASA human experimental data. ASA is a drug with first-pass hepatic and intestinal metabolism following Michaelis-Menten kinetics that leads to the formation of two main metabolites in two generations (first and second generation metabolites). The first aim was to adapt the semi-physiological model for ASA in NONMEM, using ASA pharmacokinetic parameters from the literature, reflecting its sequential metabolism. The second aim was to validate this model by comparing the results obtained in NONMEM simulations with published experimental data at a dose of 1000 mg. The validated model was used to simulate bioequivalence trials at 3 dose schemes (100, 1000 and 3000 mg) and with 6 test formulations with decreasing in vivo dissolution rate constants versus the reference formulation (kD 8-0.25 h(-1)). Finally, the third aim was to determine which analyte (parent drug, first generation or second generation metabolite) was more sensitive to changes in formulation performance. The validation results showed that the concentration-time curves obtained with the simulations closely reproduced the published experimental data, confirming model performance. The parent drug (ASA) was the analyte most sensitive to the decrease in pharmaceutical quality, showing the largest decrease in Cmax and AUC ratios between test and reference formulations. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. In-house validation of a liquid chromatography-tandem mass spectrometry method for the determination of selective androgen receptor modulators (SARMS) in bovine urine.

    Science.gov (United States)

    Schmidt, Kathrin S; Mankertz, Joachim

    2018-06-01

    A sensitive and robust LC-MS/MS method allowing the rapid screening and confirmation of selective androgen receptor modulators in bovine urine was developed and successfully validated according to Commission Decision 2002/657/EC, chapter 3.1.3 'alternative validation', by applying a matrix-comprehensive in-house validation concept. The confirmation of the analytes in the validation samples was achieved both on the basis of the MRM ion ratios as laid down in Commission Decision 2002/657/EC and by comparison of their enhanced product ion (EPI) spectra with a reference mass spectral library by making use of the QTRAP technology. Here, in addition to the MRM survey scan, EPI spectra were generated in a data-dependent way according to an information-dependent acquisition criterion. Moreover, stability studies of the analytes in solution and in matrix according to an isochronous approach proved the stability of the analytes in solution and in matrix for at least the duration of the validation study. To identify factors that have a significant influence on the test method in routine analysis, a factorial effect analysis was performed. To this end, factors considered to be relevant for the method in routine analysis (e.g. operator, storage duration of the extracts before measurement, different cartridge lots and different hydrolysis conditions) were systematically varied on two levels. The examination of the extent to which these factors influence the measurement results of the individual analytes showed that none of the validation factors exerts a significant influence on the measurement results.

  6. Validation of the thermal-hydraulic system code ATHLET based on selected pressure drop and void fraction BFBT tests

    Energy Technology Data Exchange (ETDEWEB)

    Di Marcello, Valentino, E-mail: valentino.marcello@kit.edu; Escalante, Javier Jimenez; Espinoza, Victor Sanchez

    2015-07-15

    Highlights: • Simulation of BFBT-BWR steady-state and transient tests with ATHLET. • Validation of thermal-hydraulic models based on pressure drops and void fraction measurements. • TRACE system code is used for the comparative study. • Predictions result in a good agreement with the experiments. • Discrepancies are smaller or comparable with respect to the measurements uncertainty. - Abstract: Validation and qualification of thermal-hydraulic system codes based on separate effect tests are essential for the reliability of numerical tools when applied to nuclear power plant analyses. To this purpose, the Institute for Neutron Physics and Reactor Technology (INR) at the Karlsruhe Institute of Technology (KIT) is involved in various validation and qualification activities of different CFD, sub-channel and system codes. In this paper, the capabilities of the thermal-hydraulic code ATHLET are assessed based on the experimental results provided within the NUPEC BFBT benchmark related to key Boiling Water Reactors (BWR) phenomena. Void fraction and pressure drops measurements in the BFBT bundle performed under steady-state and transient conditions which are representative for e.g. turbine trip and recirculation pump trip events, are compared with the numerical results of ATHLET. The comparison of code predictions with the BFBT data has shown good agreement given the experimental uncertainty and the results are consistent with the trends obtained with similar thermal-hydraulic codes.

  7. Accommodating Uncertainty in Prior Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-19

    A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.
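
    The concern raised above can be made concrete with a small sketch: two analysts start from different but equally defensible informative Beta priors, observe the same small data set, and report noticeably different posteriors, so quoting either posterior alone overstates the precision of the conclusion. All counts and prior parameters here are hypothetical:

```python
# Conjugate Beta-Binomial update: posterior mean and variance of a rate
# parameter under a Beta(alpha, beta) prior after observing the data.

def beta_posterior(alpha, beta, successes, failures):
    a, b = alpha + successes, beta + failures
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

successes, failures = 3, 7                 # small hypothetical data set

# Two defensible informative priors for the same rate parameter.
m1, v1 = beta_posterior(8, 2, successes, failures)   # prior mean 0.8
m2, v2 = beta_posterior(2, 8, successes, failures)   # prior mean 0.2

# The same 10 observations leave the analysts far apart: 0.55 vs 0.25.
print(round(m1, 3), round(m2, 3))  # → 0.55 0.25
```

    With only 10 observations the prior dominates; the gap between the two posterior means is exactly the kind of prior-driven uncertainty that a single "precisely defined" prior hides.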

  8. Multicenter Validation of a Customizable Scoring Tool for Selection of Trainees for a Residency or Fellowship Program. The EAST-IST Study.

    Science.gov (United States)

    Bosslet, Gabriel T; Carlos, W Graham; Tybor, David J; McCallister, Jennifer; Huebert, Candace; Henderson, Ashley; Miles, Matthew C; Twigg, Homer; Sears, Catherine R; Brown, Cynthia; Farber, Mark O; Lahm, Tim; Buckley, John D

    2017-04-01

    Few data have been published regarding scoring tools for selection of postgraduate medical trainee candidates that have wide applicability. The authors present a novel scoring tool developed to assist postgraduate programs in generating an institution-specific rank list derived from selected elements of the U.S. Electronic Residency Application System (ERAS) application. The authors developed and validated an ERAS and interview day scoring tool at five pulmonary and critical care fellowship programs: the ERAS Application Scoring Tool-Interview Scoring Tool. This scoring tool was then tested for intrarater correlation versus subjective rankings of ERAS applications. The process for development of the tool was performed at four other institutions, and it was performed alongside and compared with the "traditional" ranking methods at the five programs and compared with the submitted National Residency Match Program rank list. The ERAS Application Scoring Tool correlated highly with subjective faculty rankings at the primary institution (average Spearman's r = 0.77). The ERAS Application Scoring Tool-Interview Scoring Tool method correlated well with traditional ranking methodology at all five institutions (Spearman's r = 0.54, 0.65, 0.72, 0.77, and 0.84). This study validates a process for selecting and weighting components of the ERAS application and interview day to create a customizable, institution-specific tool for ranking candidates to postgraduate medical education programs. This scoring system can be used in future studies to compare the outcomes of fellowship training.
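
    Spearman's r, used above to compare the scoring tool against subjective faculty rankings, is Pearson correlation applied to ranks; with no tied ranks it reduces to a closed-form expression in the rank differences. A minimal sketch on hypothetical candidate rank lists:

```python
# Spearman rank correlation for rankings without ties:
# r = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), where d_i is the
# difference between the two ranks assigned to item i.

def spearman(rank_a, rank_b):
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical rank lists for six fellowship candidates:
tool_rank = [1, 2, 3, 4, 5, 6]       # scoring-tool order
faculty_rank = [2, 1, 3, 4, 6, 5]    # traditional subjective order

print(spearman(tool_rank, faculty_rank))  # ≈ 0.886
```

    Values near 1, like those reported across the five institutions (0.54 to 0.84), indicate that the tool reproduces the traditional rank order.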

  9. Use of Sensitivity and Uncertainty Analysis to Select Benchmark Experiments for the Validation of Computer Codes and Data

    International Nuclear Information System (INIS)

    Elam, K.R.; Rearden, B.T.

    2003-01-01

    Sensitivity and uncertainty analysis methodologies under development at Oak Ridge National Laboratory were applied to determine whether existing benchmark experiments adequately cover the area of applicability for the criticality code and data validation of PuO2 and mixed-oxide (MOX) powder systems. The study examined three PuO2 powder systems and four MOX powder systems that would be useful for establishing mass limits for a MOX fuel fabrication facility. Using traditional methods to choose experiments for criticality analysis validation, 46 benchmark critical experiments were identified as applicable to the PuO2 powder systems. However, only 14 experiments were thought to be within the area of applicability for dry MOX powder systems. The applicability of 318 benchmark critical experiments, including the 60 experiments initially identified, was assessed. Each benchmark and powder system was analyzed using the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) one-dimensional (TSUNAMI-1D) or three-dimensional (TSUNAMI-3D) sensitivity analysis sequences, which will be included in the next release of the SCALE code system. These sensitivity data and cross-section uncertainty data were then processed with TSUNAMI-IP to determine the correlation of each application to each experiment in the benchmarking set. Correlation coefficients are used to assess the similarity between systems and determine the applicability of one system for the code and data validation of another. The applicability of most of the experiments identified using traditional methods was confirmed by the TSUNAMI analysis. In addition, some PuO2 and MOX powder systems were determined to be within the area of applicability of several other benchmarks that would not have been considered using traditional methods. Therefore, the number of benchmark experiments useful for the validation of these systems exceeds the number previously expected.
The TSUNAMI analysis

  10. Prediction of cognitive and motor development in preterm children using exhaustive feature selection and cross-validation of near-term white matter microstructure.

    Science.gov (United States)

    Schadl, Kornél; Vassar, Rachel; Cahill-Rowley, Katelyn; Yeom, Kristin W; Stevenson, David K; Rose, Jessica

    2018-01-01

    Advanced neuroimaging and computational methods offer opportunities for more accurate prognosis. We hypothesized that near-term regional white matter (WM) microstructure, assessed on diffusion tensor imaging (DTI), using exhaustive feature selection with cross-validation would predict neurodevelopment in preterm children. Near-term MRI and DTI obtained at 36.6 ± 1.8 weeks postmenstrual age in 66 very-low-birth-weight preterm neonates were assessed. 60/66 had follow-up neurodevelopmental evaluation with Bayley Scales of Infant-Toddler Development, 3rd-edition (BSID-III) at 18-22 months. Linear models with exhaustive feature selection and leave-one-out cross-validation computed based on DTI identified sets of three brain regions most predictive of cognitive and motor function; logistic regression models were computed to classify high-risk infants scoring one standard deviation below mean. Cognitive impairment was predicted (100% sensitivity, 100% specificity; AUC = 1) by near-term right middle-temporal gyrus MD, right cingulate-cingulum MD, left caudate MD. Motor impairment was predicted (90% sensitivity, 86% specificity; AUC = 0.912) by left precuneus FA, right superior occipital gyrus MD, right hippocampus FA. Cognitive score variance was explained (29.6%, cross-validated R^2 = 0.296) by left posterior-limb-of-internal-capsule MD, Genu RD, right fusiform gyrus AD. Motor score variance was explained (31.7%, cross-validated R^2 = 0.317) by left posterior-limb-of-internal-capsule MD, right parahippocampal gyrus AD, right middle-temporal gyrus AD. Search in large DTI feature space more accurately identified neonatal neuroimaging correlates of neurodevelopment.
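
    The exhaustive search over three-region subsets with leave-one-out cross-validation described above can be sketched as follows. Synthetic data and a plain least-squares model stand in for the DTI features and the published pipeline, and the "informative" region indices are invented for the example:

```python
import numpy as np
from itertools import combinations

def loocv_r2(X, y):
    """Leave-one-out cross-validated R^2 for ordinary least squares."""
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i                      # hold out subject i
        Xi = np.column_stack([np.ones(mask.sum()), X[mask]])
        beta, *_ = np.linalg.lstsq(Xi, y[mask], rcond=None)
        preds[i] = np.concatenate([[1.0], X[i]]) @ beta
    ss_res = np.sum((y - preds) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

rng = np.random.default_rng(1)
n_subjects, n_regions = 60, 10
features = rng.normal(size=(n_subjects, n_regions))   # synthetic "DTI" features
# Outcome truly depends on regions 0, 3 and 7 plus noise:
scores = features[:, [0, 3, 7]] @ [1.0, 0.8, 0.6] + 0.5 * rng.normal(size=n_subjects)

# Exhaustive search over all 3-region subsets; keep the best cross-validated fit.
best = max(combinations(range(n_regions), 3),
           key=lambda s: loocv_r2(features[:, list(s)], scores))
print(best)  # expected to recover the informative regions (0, 3, 7)
```

    Holding each subject out before prediction is what lets the cross-validated R^2 values in the abstract be read as estimates of out-of-sample predictive accuracy rather than in-sample fit.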

  11. Improving selection of markers in nutrition research: evaluation of the criteria proposed by the ILSI Europe Marker Validation Initiative.

    Science.gov (United States)

    Calder, Philip C; Boobis, Alan; Braun, Deborah; Champ, Claire L; Dye, Louise; Einöther, Suzanne; Greyling, Arno; Matthys, Christophe; Putz, Peter; Wopereis, Suzan; Woodside, Jayne V; Antoine, Jean-Michel

    2017-06-01

    The conduct of high-quality nutrition research requires the selection of appropriate markers as outcomes, for example as indicators of food or nutrient intake, nutritional status, health status or disease risk. Such selection requires detailed knowledge of the markers, and consideration of the factors that may influence their measurement, other than the effects of nutritional change. A framework to guide selection of markers within nutrition research studies would be a valuable tool for researchers. A multidisciplinary Expert Group set out to test criteria designed to aid the evaluation of candidate markers for their usefulness in nutrition research and subsequently to develop a scoring system for markers. The proposed criteria were tested using thirteen markers selected from a broad range of nutrition research fields. The result of this testing was a modified list of criteria and a template for evaluating a potential marker against the criteria. Subsequently, a semi-quantitative system for scoring a marker and an associated template were developed. This system will enable the evaluation and comparison of different candidate markers within the same field of nutrition research in order to identify their relative usefulness. The ranking criteria of proven, strong, medium or low are likely to vary according to research setting, research field and the type of tool used to assess the marker and therefore the considerations for scoring need to be determined in a setting-, field- and tool-specific manner. A database of such markers, their interpretation and range of possible values would be valuable to nutrition researchers.

  12. Developing and Validating a Rapid Small-Scale Column Test Procedure for GAC Selection using Reconstituted Lyophilized NOM

    Science.gov (United States)

    Cost effective design and operation of Granular Activated Carbon (GAC) facilities requires the selection of GAC that is optimal for a specific site. Rapid small-scale column tests (RSSCTs) are widely used for GAC assessment due to several advantages, including the ability to simu...

  13. The Prior Internet Resources 2017

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Albretsen, Jørgen

    2017-01-01

    The Prior Internet Resources (PIR) are presented. Prior's unpublished scientific manuscripts and his vast letter correspondence with fellow researchers of his time, his Nachlass, is now being transcribed by Prior researchers worldwide and forms an integral part of PIR. It is demonstrated

  14. The Importance of Prior Knowledge.

    Science.gov (United States)

    Cleary, Linda Miller

    1989-01-01

    Recounts a college English teacher's experience of reading and rereading Noam Chomsky, building up a greater store of prior knowledge. Argues that Frank Smith provides a theory for the importance of prior knowledge and Chomsky's work provided a personal example with which to interpret and integrate that theory. (RS)

  15. Validating the Kinematic Wave Approach for Rapid Soil Erosion Assessment and Improved BMP Site Selection to Enhance Training Land Sustainability

    Science.gov (United States)

    2014-02-01

    installation based on a Euclidean distance allocation and assigned that installation’s threshold values. The second approach used a thin-plate spline ...installation critical nLS+ thresholds involved spatial interpolation. A thin-plate spline radial basis function (RBF) was selected as the...the interpolation of installation results using a thin-plate spline radial basis function technique. 6.5 OBJECTIVE #5: DEVELOP AND

  16. Validation of a New Method to Automatically Select Cases With Intraoperative Red Blood Cell Transfusion for Audit.

    Science.gov (United States)

    Dexter, Franklin; Epstein, Richard H; Ledolter, Johannes; Dasovich, Susan M; Herman, Jay H; Maga, Joni M; Schwenk, Eric S

    2018-05-01

    Hospitals review allogeneic red blood cell (RBC) transfusions for appropriateness. Audit criteria have been published that apply to 5 common procedures. We expanded on this work to study the management decision of selecting which cases involving transfusion of at least 1 RBC unit to audit (review) among all surgical procedures, including those previously studied. This retrospective, observational study included 400,000 cases among 1891 different procedures over an 11-year period. There were 12,616 cases with RBC transfusion. We studied the proportions of cases that would be audited based on criteria of nadir hemoglobin (Hb) greater than the hospital's selected transfusion threshold, or absent Hb or missing estimated blood loss (EBL) among procedures with median EBL 50%) that would be audited and most cases (>50%) with transfusion were among procedures with median EBL 9 g/dL, the procedure's median EBL was 9 g/dL and median EBL for the procedure ≥500 mL. An automated process to select cases for audit of intraoperative transfusion of RBC needs to consider the median EBL of the procedure, whether the nadir Hb is below the hospital's Hb transfusion threshold for surgical cases, and the absence of either a Hb or entry of the EBL for the case. This conclusion applies to all surgical cases and procedures.

  17. Validation, optimisation, and application data in support of the development of a targeted selected ion monitoring assay for degraded cardiac troponin T

    Directory of Open Access Journals (Sweden)

    Alexander S. Streng

    2016-06-01

    Full Text Available Cardiac troponin T (cTnT) fragmentation in human serum was investigated using a newly developed targeted selected ion monitoring assay, as described in the accompanying article: “Development of a targeted selected ion monitoring assay for the elucidation of protease induced structural changes in cardiac troponin T” [1]. This article presents data describing aspects of the validation and optimisation of this assay. The data consist of several figures, an Excel file containing the results of a sequence identity search, and a description of the raw mass spectrometry (MS) data files, deposited in the ProteomeXchange repository with id PRIDE: http://www.ebi.ac.uk/pride/archive/projects/PXD003187.

  18. Determination of validity and reliability of performance assessments tasks developed for selected topics in high school chemistry

    Science.gov (United States)

    Zichittella, Gail Eberhardt

    The primary purpose of this study was to validate performance assessments that can be used as teaching and assessment instruments in high school science classrooms. The study evaluated the classroom usability of these performance instruments and established the interrater reliability of the scoring rubrics when used by classroom teachers. The assessment instruments were designed to represent two levels of scientific inquiry: the high-inquiry tasks are relatively unstructured in terms of student directions, while the low-inquiry tasks provide more structure for the student. The tasks cover two content topics studied in chemistry (scientific observation and density). Students from a variety of Western New York school districts who were enrolled in chemistry classes and other science courses completed the tasks at the two levels of inquiry. The chemistry students completed the NYS Regents Examination in Chemistry. Their classroom teachers were interviewed and completed a questionnaire to help establish their epistemological views on the inclusion of inquiry-based learning in the science classroom. Data showed that the performance assessment tasks were reliable, valid and helpful for obtaining a more complete picture of the students' scientific understanding. The teacher participants reported no difficulty with the usability of the tasks in the high school chemistry setting. Collected data gave no evidence of gender bias with reference to the performance tasks or the NYS Regents Chemistry Examination. Additionally, it was shown that the instructors' classroom practices do have an effect upon the students' achievement on the performance tasks and the NYS Regents examination. Data also showed that achievement on the performance tasks was influenced by the number of years of science instruction students had received.

  19. Identification of selective inhibitors of RET and comparison with current clinical candidates through development and validation of a robust screening cascade [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Amanda J. Watson

    2016-05-01

    Full Text Available RET (REarranged during Transfection is a receptor tyrosine kinase, which plays pivotal roles in regulating cell survival, differentiation, proliferation, migration and chemotaxis. Activation of RET is a mechanism of oncogenesis in medullary thyroid carcinomas, where both germline and sporadic activating somatic mutations are prevalent. At present, there are no known specific RET inhibitors in clinical development, although many potent inhibitors of RET have been opportunistically identified through selectivity profiling of compounds initially designed to target other tyrosine kinases. Vandetanib and cabozantinib, both multi-kinase inhibitors with RET activity, are approved for use in medullary thyroid carcinoma, but additional pharmacological activities, most notably inhibition of vascular endothelial growth factor receptor 2 (VEGFR2/KDR), lead to dose-limiting toxicity. The recent identification of RET fusions present in ~1% of lung adenocarcinoma patients has renewed interest in the identification and development of more selective RET inhibitors lacking the toxicities associated with the current treatments. In an earlier publication [Newton et al, 2016; 1] we reported the discovery of a series of 2-substituted phenol quinazolines as potent and selective RET kinase inhibitors. Here we describe the development of the robust screening cascade which allowed the identification and advancement of this chemical series. Furthermore, we have profiled a panel of RET-active clinical compounds both to validate the cascade and to confirm that none display a RET-selective target profile.

  20. Novel and validated titrimetric method for determination of selected angiotensin-II-receptor antagonists in pharmaceutical preparations and its comparison with UV spectrophotometric determination

    Directory of Open Access Journals (Sweden)

    Shrikant H. Patil

    2012-12-01

    Full Text Available A novel and simple titrimetric method for the determination of commonly used angiotensin-II-receptor antagonists (ARA-IIs) is developed and validated. The direct acid-base titration of four ARA-IIs, namely eprosartan mesylate, irbesartan, telmisartan and valsartan, was carried out in a mixture of ethanol:water (1:1) as solvent, using standardized sodium hydroxide aqueous solution as titrant, either visually using phenolphthalein as an indicator or potentiometrically using a combined pH electrode. The method was found to be accurate and precise, having a relative standard deviation of less than 2% for all ARA-IIs studied. Also, it was shown that the method could be successfully applied to the assay of commercial pharmaceuticals containing the above-mentioned ARA-IIs. The validity of the method was tested by recovery studies of standard addition to pharmaceuticals, and the results were found to be satisfactory. Results obtained by this method were found to be in good agreement with those obtained by a UV spectrophotometric method. For UV spectrophotometric analysis, ethanol was used as the solvent, and wavelengths of 233 nm, 246 nm, 296 nm, and 250 nm were selected for the determination of eprosartan mesylate, irbesartan, telmisartan, and valsartan, respectively. The proposed titrimetric method is simple, rapid, convenient and sufficiently precise for quality control purposes. Keywords: Angiotensin-II-receptor antagonists, Titrimetric assay, UV spectrophotometry, Validation

  1. Construct-level predictive validity of educational attainment and intellectual aptitude tests in medical student selection: meta-regression of six UK longitudinal studies

    Science.gov (United States)

    2013-01-01

    Background Measures used for medical student selection should predict future performance during training. A problem for any selection study is that predictor-outcome correlations are known only in those who have been selected, whereas selectors need to know how measures would predict in the entire pool of applicants. That problem of interpretation can be solved by calculating construct-level predictive validity, an estimate of true predictor-outcome correlation across the range of applicant abilities. Methods Construct-level predictive validities were calculated in six cohort studies of medical student selection and training (student entry, 1972 to 2009) for a range of predictors, including A-levels, General Certificates of Secondary Education (GCSEs)/O-levels, and aptitude tests (AH5 and UK Clinical Aptitude Test (UKCAT)). Outcomes included undergraduate basic medical science and finals assessments, as well as postgraduate measures of Membership of the Royal Colleges of Physicians of the United Kingdom (MRCP(UK)) performance and entry in the Specialist Register. Construct-level predictive validity was calculated with the method of Hunter, Schmidt and Le (2006), adapted to correct for right-censorship of examination results due to grade inflation. Results Meta-regression analyzed 57 separate predictor-outcome correlations (POCs) and construct-level predictive validities (CLPVs). Mean CLPVs are substantially higher (.450) than mean POCs (.171). Mean CLPVs for first-year examinations were high for A-levels (.809; CI: .501 to .935), and lower for GCSEs/O-levels (.332; CI: .024 to .583) and UKCAT (mean = .245; CI: .207 to .276). A-levels had higher CLPVs for all undergraduate and postgraduate assessments than did GCSEs/O-levels and intellectual aptitude tests. CLPVs of educational attainment measures decline somewhat during training, but continue to predict postgraduate performance. Intellectual aptitude tests have lower CLPVs than A-levels or GCSEs
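The gap between observed POCs and CLPVs comes largely from range restriction in the selected group. As a hedged illustration, the classical Thorndike Case II correction for direct range restriction can be sketched as below; the full Hunter-Schmidt-Le method additionally corrects for measurement error and the right-censorship noted above, which this sketch omits, and the numbers are hypothetical:

```python
import math

def correct_range_restriction(r, u):
    """Thorndike Case II correction for direct range restriction.

    r: predictor-outcome correlation observed in the selected group
    u: SD(predictor, selected group) / SD(predictor, applicant pool)
    """
    return r / math.sqrt(r * r + u * u * (1.0 - r * r))

# Hypothetical numbers: observed r = .17 among entrants, and selection
# has halved the predictor's standard deviation (u = 0.5).
print(round(correct_range_restriction(0.17, 0.5), 3))  # 0.326
```

With u = 1 (no restriction) the formula returns r unchanged; as selection tightens (u → 0) the corrected estimate approaches 1, which is why corrected validities exceed the raw correlations.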

  2. Assessing food selection in a health promotion program: validation of a brief instrument for American Indian children in the southwest United States.

    Science.gov (United States)

    Koehler, K M; Cunningham-Sabo, L; Lambert, L C; McCalman, R; Skipper, B J; Davis, S M

    2000-02-01

    Brief dietary assessment instruments are needed to evaluate behavior changes of participants in dietary intervention programs. The purpose of this project was to design and validate an instrument for children participating in Pathways to Health, a culturally appropriate, cancer prevention curriculum. Validation of a brief food selection instrument, Yesterday's Food Choices (YFC), which contained 33 questions about foods eaten the previous day with response choices of yes, no, or not sure. Reference data for validation were 24-hour dietary recalls administered individually to 120 students selected randomly. The YFC and 24-hour dietary recalls were administered to American Indian children in fifth- and seventh-grade classes in the Southwest United States. Dietary recalls were coded for food items in the YFC and results were compared for each item using percentage agreement and the kappa statistic. Percentage agreement for all items was greater than 60%; for most items it was greater than 70%, and for several items it was greater than 80%. The amount of agreement beyond that explained by chance (kappa statistic) was generally small. Three items showed substantial agreement beyond chance (kappa ≥ 0.6); 2 items showed moderate agreement (kappa = 0.40 to 0.59); most items showed fair agreement (kappa = 0.20 to 0.39). The food items showing substantial agreement were hot or cold cereal, low-fat milk, and mutton or chile stew. Fried or scrambled eggs and deep-fried foods showed moderate agreement beyond chance. Previous development and validation of brief food selection instruments for children participating in health promotion programs has had limited success. In this study, instrument-related factors that apparently contributed to poor agreement between data from the YFC and 24-hour dietary recall were inclusion of categories of foods vs specific foods; food knowledge, preparation, and vocabulary; item length; and overreporting of attractive foods. Collecting and
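For a single yes/no item, both statistics reported above take only a few lines of Python. The responses below are invented for illustration, not the study's data:

```python
def percent_agreement(a, b):
    """Fraction of paired responses that match."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement beyond chance for two binary raters (1 = yes, 0 = no)."""
    n = len(a)
    po = percent_agreement(a, b)                # observed agreement
    p_yes = (sum(a) / n) * (sum(b) / n)         # chance both answer yes
    p_no = (1 - sum(a) / n) * (1 - sum(b) / n)  # chance both answer no
    pe = p_yes + p_no                           # total chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical responses for one food item: instrument vs. coded recall.
yfc    = [1, 1, 0, 0, 1, 0, 1, 0, 1, 1]
recall = [1, 1, 0, 1, 1, 0, 0, 0, 1, 1]
print(round(percent_agreement(yfc, recall), 2))  # 0.8
print(round(cohens_kappa(yfc, recall), 2))       # 0.58
```

Note how 80% raw agreement collapses to a kappa of ~0.58 once chance agreement is removed, mirroring the pattern the study reports.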

  3. Reliability and validity of selected measures associated with increased fall risk in females over the age of 45 years with distal radius fracture - A pilot study.

    Science.gov (United States)

    Mehta, Saurabh P; MacDermid, Joy C; Richardson, Julie; MacIntyre, Norma J; Grewal, Ruby

    2015-01-01

    Clinical measurement. This study examined test-retest reliability and convergent/divergent construct validity of selected tests and measures that assess balance impairment, fear of falling (FOF), impaired physical activity (PA), and lower extremity muscle strength (LEMS) in females >45 years of age after distal radius fracture (DRF). Twenty-one female participants with DRF were assessed on two occasions. Timed Up and Go, Functional Reach, and One Leg Standing tests assessed balance impairment. The Shortened Falls Efficacy Scale, Activity-specific Balance Confidence scale, and Fall Risk Perception Questionnaire assessed FOF. The International Physical Activity Questionnaire and Rapid Assessment of Physical Activity were administered to assess PA level. The chair stand test and isometric muscle strength testing for hip and knee assessed LEMS. Intraclass correlation coefficients (ICC) examined the test-retest reliability of the measures. Pearson correlation coefficients (r) examined concurrent relationships between the measures. The results demonstrated fair to excellent test-retest reliability (ICC between 0.50 and 0.96) and low to moderate concordance between the measures (low if r ≤ 0.4; moderate if r = 0.4-0.7). The results provide preliminary estimates of test-retest reliability and convergent/divergent construct validity of selected measures associated with increased risk for falling in females >45 years of age after DRF. Further research directions to advance knowledge regarding fall risk assessment in the DRF population have been identified. Copyright © 2015 Hanley & Belfus. Published by Elsevier Inc. All rights reserved.
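As a minimal sketch of the concordance analysis described above, Pearson's r can be computed directly. The scores below are invented, not the study's data; a negative r is plausible here, since slower Timed Up and Go times should accompany fewer chair-stand repetitions:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented scores: Timed Up and Go (seconds) vs. chair-stand repetitions.
timed_up_go = [8.2, 9.1, 10.4, 7.9, 11.0]
chair_stand = [12.0, 10.5, 11.0, 12.5, 8.2]
r = pearson_r(timed_up_go, chair_stand)
print(round(r, 2))
```

By the study's bands this invented pair would count as moderate-to-high concordance (|r| > 0.7).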

  4. Development and Validation of an ICP-MS Method for Simultaneous Determination of Selected Metals in Electronic Cigarette Aerosol

    Directory of Open Access Journals (Sweden)

    Ohashi Shintaro

    2018-04-01

    Full Text Available Safety and quality standards for electronic cigarettes (e-cigarettes) have been introduced regionally. In 2016, the U.S. Food and Drug Administration (FDA) issued a rule to regulate e-cigarettes, requiring reports of harmful and potentially harmful constituents (HPHCs). In the United Kingdom, the British Standards Institution (BSI) specified the metals to be monitored for e-cigarettes. In this study, a method was developed and validated for the simultaneous determination of 13 metals (Be, Al, Cr, Fe, Co, Ni, Cu, As, Se, Ag, Cd, Sn and Pb) in e-cigarette aerosol. Furthermore, matrix effects of major constituents in the aerosol were investigated using glycerol or 1,2-propylene glycol solutions. E-cigarette aerosol was generated by a rotary smoking machine according to CORESTA Recommended Method N° 81 and collected by an electrostatic precipitator coupled to an impinger containing nitric acid. The collected aerosol was dissolved in nitric acid and an aliquot of this solution was analyzed by inductively coupled plasma mass spectrometry (ICP-MS) equipped with a collision/reaction cell.

  5. Bench-top validation testing of selected immunological and molecular Renibacterium salmoninarum diagnostic assays by comparison with quantitative bacteriological culture

    Science.gov (United States)

    Elliott, D.G.; Applegate, L.J.; Murray, A.L.; Purcell, M.K.; McKibben, C.L.

    2013-01-01

    No gold standard assay exhibiting error-free classification of results has been identified for detection of Renibacterium salmoninarum, the causative agent of salmonid bacterial kidney disease. Validation of diagnostic assays for R. salmoninarum has been hindered by its unique characteristics and biology, and difficulties in locating suitable populations of reference test animals. Infection status of fish in test populations is often unknown, and it is commonly assumed that the assay yielding the most positive results has the highest diagnostic accuracy, without consideration of misclassification of results. In this research, quantification of R. salmoninarum in samples by bacteriological culture provided a standardized measure of viable bacteria to evaluate analytical performance characteristics (sensitivity, specificity and repeatability) of non-culture assays in three matrices (phosphate-buffered saline, ovarian fluid and kidney tissue). Non-culture assays included polyclonal enzyme-linked immunosorbent assay (ELISA), direct smear fluorescent antibody technique (FAT), membrane-filtration FAT, nested polymerase chain reaction (nested PCR) and three real-time quantitative PCR assays. Injection challenge of specific pathogen-free Chinook salmon, Oncorhynchus tshawytscha (Walbaum), with R. salmoninarum was used to estimate diagnostic sensitivity and specificity. Results did not identify a single assay demonstrating the highest analytical and diagnostic performance characteristics, but revealed strengths and weaknesses of each test.

  6. New valid spectrofluorimetric method for determination of selected cephalosporins in different pharmaceutical formulations using safranin as fluorophore

    Science.gov (United States)

    Derayea, Sayed M.; Ahmed, Hytham M.; Abdelmageed, Osama H.; Haredy, Ahmed M.

    2016-01-01

    A new validated spectrofluorimetric method has been developed for the determination of some cephalosporins, namely cefepime, cefaclor, cefadroxil, cefpodoxime and cefexime. The method was based on the reaction of these drugs with safranin in slightly alkaline medium (pH 8.0) to form ion-association complexes. The fluorescent products were extracted into chloroform and their fluorescence intensities were measured at 544-565 nm after excitation at 518-524 nm. The reaction conditions influencing the product formation and stability were investigated and optimized. The relative fluorescence intensity was proportional to the drug concentration in the linear ranges of 0.15-1.35, 0.35-1.25, 0.35-1.25, 0.20-1.44 and 0.20-1.25 μg/mL for cefepime, cefaclor, cefadroxil, cefpodoxime proxetil and cefexime, respectively. The detection limits were 40, 100, 100, 60 and 70 ng/mL, respectively. The performance of the developed method was evaluated in terms of Student's t-test and the variance ratio F-test to find out the significance of the proposed method over the reference spectrophotometric method. Various pharmaceutical formulations were successfully analyzed using the proposed method and the results were in good agreement with those of previously reported methods.
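Detection and quantification limits like those quoted above are conventionally derived from a calibration line as LOD = 3.3σ/slope and LOQ = 10σ/slope (the ICH-style definitions). The sketch below applies those formulas to invented blank-noise and slope values, since the paper's raw calibration data are not reproduced here:

```python
def limits(sigma_blank, slope):
    """ICH-style limits from calibration: sigma_blank is the SD of the
    blank response, slope is the calibration sensitivity."""
    lod = 3.3 * sigma_blank / slope
    loq = 10.0 * sigma_blank / slope
    return lod, loq

# Illustrative values in arbitrary fluorescence / concentration units.
lod, loq = limits(sigma_blank=1.2, slope=100.0)
print(round(lod, 4), round(loq, 3))  # 0.0396 0.12
```

The 3.3/10 ratio fixes LOQ at roughly three times LOD, which is the pattern usually seen in validated methods of this kind.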

  7. Selected ICAR Data from the SAPA-Project: Development and Initial Validation of a Public-Domain Measure

    Directory of Open Access Journals (Sweden)

    David M. Condon

    2016-01-01

    Full Text Available These data were collected during the initial evaluation of the International Cognitive Ability Resource (ICAR) project. ICAR is an international collaborative effort to develop open-source public-domain tools for cognitive ability assessment, including tools that can be administered in non-proctored environments (e.g., online administration) and those which are based on automatic item generation algorithms. These data provide initial validation of the first four ICAR item types as reported in Condon & Revelle [1]. The 4 item types contain a total of 60 items: 9 Letter and Number Series items, 11 Matrix Reasoning items, 16 Verbal Reasoning items and 24 Three-dimensional Rotation items. Approximately 97,000 individuals were administered random subsets of these 60 items using the Synthetic Aperture Personality Assessment method between August 18, 2010 and May 20, 2013. The data are available in rdata and csv formats and are accompanied by documentation stored as a text file. Re-use potential includes a wide range of structural and item-level analyses.

  8. Translation and validation of the assistive technology device predisposition assessment in Greek in order to assess satisfaction with use of the selected assistive device.

    Science.gov (United States)

    Koumpouros, Yiannis; Papageorgiou, Effie; Karavasili, Alexandra; Alexopoulou, Despoina

    2017-07-01

    To examine the Assistive Technology Device Predisposition Assessment scale and provide evidence of validity and reliability of the Greek version. We translated and adapted the original instrument into Greek according to the most well-known guideline recommendations. Field test studies were conducted in a rehabilitation hospital to validate the appropriateness of the final results. Ratings of the different items were statistically analyzed. We recruited 115 subjects who were administered Form E of the original questionnaire. The experimental analysis conducted revealed a three-subscale structure: (i) Adaptability, (ii) Fit to Use, and (iii) Socializing. According to the results of our study, the three subscales measure different constructs. Reliability measures (ICC = 0.981, Pearson's correlation = 0.963, Cronbach's α = 0.701) yielded high values. The test-retest outcome showed great stability. This is the first study, at least to the knowledge of the authors, that focuses solely on measuring users' satisfaction with the selected assistive device while exploring the Assistive Technology Device Predisposition Assessment - Device Form in such depth. According to the results, it is a stable, valid and reliable instrument, applicable to the Greek population. Thus, it can be used to measure the satisfaction of patients with assistive devices. Implications for Rehabilitation The paper explores the cultural adaptability and applicability of ATD PA - Device Form. ATD PA - Device Form can be used to assess user satisfaction with the selected assistive device. ATD PA - Device Form is a valid and reliable instrument for measuring users' satisfaction in the Greek reality.

  9. Selection and validation of endogenous reference genes for qRT-PCR analysis in leafy spurge (Euphorbia esula).

    Directory of Open Access Journals (Sweden)

    Wun S Chao

    Full Text Available Quantitative real-time polymerase chain reaction (qRT-PCR) is the most important tool in measuring levels of gene expression due to its accuracy, specificity, and sensitivity. However, the accuracy of qRT-PCR analysis strongly depends on transcript normalization using stably expressed reference genes. The aim of this study was to find internal reference genes for qRT-PCR analysis in various experimental conditions for seed, adventitious underground bud, and other organs of leafy spurge. Eleven candidate reference genes (BAM4, PU1, TRP-like, FRO1, ORE9, BAM1, SEU, ARF2, KAPP, ZTL, and MPK4) were selected from among 171 genes based on expression stabilities during seed germination and bud growth. The other ten candidate reference genes were selected from three different sources: (1) 3 stably expressed leafy spurge genes (60S, bZIP21, and MD-100) identified from the analyses of leafy spurge microarray data; (2) 3 orthologs of Arabidopsis "general purpose" traditional reference genes (GAPDH_1, GAPDH_2, and UBC); and (3) 4 orthologs of Arabidopsis stably expressed genes (UBC9, SAND, PTB, and F-box) identified from Affymetrix ATH1 whole-genome GeneChip studies. The expression stabilities of these 21 genes were ranked based on the C(T) values of 72 samples using four different computation programs, including geNorm, NormFinder, BestKeeper, and the comparative ΔC(T) method. Our analyses revealed SAND, PTB, ORE9, and ARF2 to be the most appropriate reference genes for accurate normalization of gene expression data. Since SAND and PTB were obtained from the 4 orthologs of Arabidopsis, while ORE9 and ARF2 were selected from the 171 leafy spurge genes, it was more efficient to identify good reference genes from orthologs of other plant species known to be stably expressed than to randomly test endogenous genes.
Nevertheless, the two newly identified leafy spurge genes, ORE9 and ARF2, can serve as orthologous candidates in the search for reference genes
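The comparative ΔC(T) method used above is straightforward to sketch: for each candidate gene, take ΔCT against every other candidate across all samples and average the standard deviations of those series; a lower mean SD indicates a more stable gene. The CT values below are invented for illustration (only the gene names are taken from the abstract):

```python
import statistics

def delta_ct_stability(ct):
    """Rank candidate reference genes by the comparative delta-CT method.

    ct: dict mapping gene name -> list of CT values (same sample order).
    Returns (gene, mean SD) pairs sorted most-stable first.
    """
    genes = list(ct)
    scores = {}
    for g in genes:
        sds = []
        for h in genes:
            if h == g:
                continue
            deltas = [a - b for a, b in zip(ct[g], ct[h])]
            sds.append(statistics.stdev(deltas))
        scores[g] = sum(sds) / len(sds)
    return sorted(scores.items(), key=lambda kv: kv[1])

# Invented CT values: SAND and PTB track each other; TUA6 fluctuates.
ct = {
    "SAND": [22.1, 22.3, 22.0, 22.2],
    "PTB":  [24.0, 24.2, 23.9, 24.1],
    "TUA6": [20.0, 23.5, 19.0, 25.0],
}
ranking = delta_ct_stability(ct)
print([g for g, _ in ranking])  # most stable first, unstable gene last
```

With these toy numbers the co-varying pair (SAND, PTB) ranks ahead of the erratic gene, which is exactly the behavior the method exploits.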

  10. Recruiting for Prior Service Market

    Science.gov (United States)

    2008-06-01

    perceptions, expectations and issues for re-enlistment • Develop potential marketing and advertising tactics and strategies targeted to the defined... Report dated 01 JUN 2008; prepared by MAJ Eric Givens / MAJ Brian

  11. A novel computational approach for development of highly selective fenitrothion imprinted polymer: theoretical predictions and experimental validations

    Energy Technology Data Exchange (ETDEWEB)

    Barros, Leonardo Augusto de; Pereira, Leandro Alves; Rath, Susanne [Universidade de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Dept. de Quimica Analitica; Custodio, Rogerio [Universidade de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Dept. de Fisico-Quimica

    2014-04-15

    The quality of molecularly imprinted recognition sites depends on the mechanisms and the extent of the functional monomer-template interactions present in the prepolymerization mixture. Thus, an understanding of the physical parameters governing these interactions is key for producing a highly selective molecularly imprinted polymer (MIP). In this paper, novel molecular modeling studies were performed to optimize the conditions for the molecular imprinting of fenitrothion. Four possible functional monomers were evaluated. Five porogenic solvents were investigated employing the polarizable continuum method. The MIP based on methacrylic acid (MAA-MIP) synthesized in the presence of toluene was shown to form the most thermodynamically stable complex. In contrast, the MIP based on p-vinylbenzoic acid (PVB-MIP) had the lowest binding energy. According to the adsorption parameters fitted by the Langmuir-Freundlich isotherm, MAA-MIP presented twice the number of binding sites compared to PVB-MIP (103.35 and 53.77 μmol g(-1), respectively)
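The Langmuir-Freundlich fit mentioned above uses the isotherm q = Nt·(Ka·C)^m / (1 + (Ka·C)^m), where Nt is the binding-site density, Ka the affinity constant and m the heterogeneity index. A sketch using the paper's Nt for MAA-MIP but otherwise invented Ka and m values:

```python
def langmuir_freundlich(c, nt, ka, m):
    """Bound amount q at free concentration c for the LF isotherm."""
    x = (ka * c) ** m
    return nt * x / (1.0 + x)

# At saturating concentration the bound amount approaches Nt
# (103.35 umol/g is the paper's fitted value for MAA-MIP; Ka and m
# below are illustrative, not the fitted parameters).
print(round(langmuir_freundlich(1e6, nt=103.35, ka=0.5, m=0.8), 1))  # 103.3
```

The saturation behavior is what makes Nt comparable across polymers: it is the plateau, independent of the affinity and heterogeneity parameters.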

  12. Validation of a simple distributed sediment delivery approach in selected sub-basins of the River Inn catchment area

    Science.gov (United States)

    Reid, Lucas; Kittlaus, Steffen; Scherer, Ulrike

    2015-04-01

    For large areas without highly detailed data, the empirical Universal Soil Loss Equation (USLE) is widely used to quantify soil loss. The difficulty, though, usually lies in quantifying the actual sediment influx into the rivers. As the USLE provides long-term mean soil loss rates, it is often combined with spatially lumped models to estimate the sediment delivery ratio (SDR). However, spatially lumped approaches become problematic in large catchment areas whose geographical properties vary widely. In this study we developed a simple but spatially distributed approach to quantify the sediment delivery ratio by considering the characteristics of the flow paths in the catchments. The sediment delivery ratio was determined using an empirical approach that considers the slope, morphology and land use properties along the flow path as an estimate of the travel time of the eroded particles. The model was tested against suspended solids measurements in selected sub-basins of the River Inn catchment area in Germany and Austria, ranging from the high alpine south to the Molasse basin in the northern part.
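Whatever form the distributed SDR estimate takes, the model above still rests on the basic delivery-ratio identity: sediment yield at the outlet is the gross (USLE) erosion scaled by the SDR. A trivial arithmetic sketch with invented numbers (the paper's flow-path-based SDR formula is not reproduced here):

```python
def sediment_yield(gross_erosion, sdr):
    """Sediment actually delivered to the stream network.

    gross_erosion: long-term mean USLE soil loss (e.g., t/yr)
    sdr: dimensionless sediment delivery ratio in [0, 1]
    """
    return sdr * gross_erosion

# Illustrative: 12,000 t/yr of gross erosion, 15% reaching the river.
print(sediment_yield(12000.0, 0.15))  # 1800.0
```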

  13. Do candidate reactions relate to job performance or affect criterion-related validity? A multistudy investigation of relations among reactions, selection test scores, and job performance.

    Science.gov (United States)

    McCarthy, Julie M; Van Iddekinge, Chad H; Lievens, Filip; Kung, Mei-Chuan; Sinar, Evan F; Campion, Michael A

    2013-09-01

    Considerable evidence suggests that how candidates react to selection procedures can affect their test performance and their attitudes toward the hiring organization (e.g., recommending the firm to others). However, very few studies of candidate reactions have examined one of the outcomes organizations care most about: job performance. We attempt to address this gap by developing and testing a conceptual framework that delineates whether and how candidate reactions might influence job performance. We accomplish this objective using data from 4 studies (total N = 6,480), 6 selection procedures (personality tests, job knowledge tests, cognitive ability tests, work samples, situational judgment tests, and a selection inventory), 5 key candidate reactions (anxiety, motivation, belief in tests, self-efficacy, and procedural justice), 2 contexts (industry and education), 3 continents (North America, South America, and Europe), 2 study designs (predictive and concurrent), and 4 occupational areas (medical, sales, customer service, and technological). Consistent with previous research, candidate reactions were related to test scores, and test scores were related to job performance. Further, there was some evidence that reactions affected performance indirectly through their influence on test scores. Finally, in no cases did candidate reactions affect the prediction of job performance by increasing or decreasing the criterion-related validity of test scores. Implications of these findings and avenues for future research are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved

  14. Ultra-preconcentration and determination of selected pharmaceutical and personal care products in different water matrices by solid-phase extraction combined with dispersive liquid-liquid microextraction prior to ultra high pressure liquid chromatography tandem mass spectrometry analysis.

    Science.gov (United States)

    Celano, Rita; Piccinelli, Anna Lisa; Campone, Luca; Rastrelli, Luca

    2014-08-15

    Pharmaceutical and personal care products (PPCPs) are one of the most important classes of emerging contaminants. The potential ecological and environmental impacts associated with PPCPs are of particular concern because they continually penetrate the aquatic environment. This work describes a novel ultra-preconcentration technique for the rapid and highly sensitive analysis of selected PPCPs in environmental water matrices at ppt levels. Selected PPCPs were rapidly extracted and concentrated from large volumes of aqueous solutions (500 and 250 mL) by solid-phase extraction combined with dispersive liquid-liquid microextraction (SPE-DLLME) and then analyzed using UHPLC-MS/MS. Experimental parameters were carefully investigated and optimized to achieve the best SPE-DLLME efficiency and higher enrichment factors. The best results were obtained using the ternary mixture acetonitrile/methanol/dichloromethane 3:3:4, v/v/v, both as SPE eluent and as DLLME extractant/dispersive mixture. The DLLME aqueous solution (5% NaCl, 10 mg L(-1) TBAB) was also modified to improve the extraction efficiency of the more hydrophilic PPCPs. Under the optimal conditions, an exhaustive extraction for most of the investigated analytes (recoveries >70%) with satisfactory precision (RSD) was achieved in all investigated water matrices (drinking, sea, river and wastewater). Method detection and quantification limits were at very low ppt levels, below 1 and 3 ng L(-1), respectively, for 15 of the selected PPCPs. The proposed analytical procedure offers numerous advantages such as simplicity of operation, rapidity, a high enrichment factor and sensitivity, and so it is suitable for monitoring and studies of the occurrence of PPCPs in different environmental compartments. Copyright © 2014 Elsevier B.V. All rights reserved.
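The "ultra-preconcentration" above boils down to simple volume arithmetic: the enrichment factor is the concentration ratio between the final extract and the original sample, and absolute recovery follows from the phase volumes. A minimal sketch with invented numbers (the paper's exact phase volumes are not given here):

```python
def enrichment_factor(c_final, c_initial):
    """Concentration ratio: final extract phase vs. original sample."""
    return c_final / c_initial

def recovery_percent(c_final, v_final_ml, c_initial, v_initial_ml):
    """Percentage of the analyte mass carried into the final phase."""
    return 100.0 * (c_final * v_final_ml) / (c_initial * v_initial_ml)

# Illustrative only: 500 mL of water at 0.1 ng/mL concentrated into a
# 0.1 mL sedimented phase measured at 400 ng/mL.
ef = enrichment_factor(c_final=400.0, c_initial=0.1)
rec = recovery_percent(400.0, 0.1, 0.1, 500.0)
print(round(ef), round(rec))  # 4000 80
```

This is why combining SPE (large sample volume) with DLLME (tiny final phase) multiplies the two enrichment steps and reaches ppt-level sensitivity.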

  15. Novel Selectivity-Based Forensic Toxicological Validation of a Paper Spray Mass Spectrometry Method for the Quantitative Determination of Eight Amphetamines in Whole Blood

    Science.gov (United States)

    Teunissen, Sebastiaan F.; Fedick, Patrick W.; Berendsen, Bjorn J. A.; Nielen, Michel W. F.; Eberlin, Marcos N.; Graham Cooks, R.; van Asten, Arian C.

    2017-12-01

    Paper spray tandem mass spectrometry is used to identify and quantify eight individual amphetamines in whole blood in 1.3 min. The method has been optimized and fully validated according to forensic toxicology guidelines for the quantification of amphetamine, methamphetamine, 3,4-methylenedioxyamphetamine (MDA), 3,4-methylenedioxy-N-methylamphetamine (MDMA), 3,4-methylenedioxy-N-ethylamphetamine (MDEA), para-methoxyamphetamine (PMA), para-methoxymethamphetamine (PMMA), and 4-fluoroamphetamine (4-FA). Additionally, a new concept of intrinsic and application-based selectivity is discussed, featuring increased confidence in the power to discriminate the amphetamines from other chemically similar compounds when applying an ambient mass spectrometric method without chromatographic separation. Accuracy was within ±15%, average precision was better than 15%, and better than 20% at the LLOQ. Detection limits between 15 and 50 ng/mL were obtained using only 12 μL of whole blood.

  16. Development, validation, and application of a method for selected avermectin determination in rural waters using high performance liquid chromatography and fluorescence detection.

    Science.gov (United States)

    Lemos, Maria Augusta Travassos; Matos, Camila Alves; de Resende, Michele Fabri; Prado, Rachel Bardy; Donagemma, Raquel Andrade; Netto, Annibal Duarte Pereira

    2016-11-01

    Avermectins (AVM) are macrocyclic lactones used in livestock and agriculture. A quantitative method of high performance liquid chromatography with fluorescence detection for the determination of eprinomectin, abamectin, doramectin and ivermectin in rural water samples was developed and validated. The method was employed to study samples collected in the Pito Aceso River microbasin, located in the Bom Jardim municipality, Rio de Janeiro State, Brazil. Samples were extracted by solid phase extraction using a polymeric stationary phase; the eluted fraction was re-concentrated under a gentle N2 flow and derivatized to allow AVM determination using liquid chromatography with fluorescence detection. The excitation and emission wavelengths of the derivatives were 365 and 470 nm, respectively, and a total chromatographic run of 12 min was achieved. Very low limits of quantification (22-58 ng L(-1)) were found after re-concentration using N2. Recovery values varied from 85.7% to 119.2% with standard deviations between 1.2% and 10.2%. The validated method was applied in the determination of AVM in 15 water samples collected in the Pito Aceso River microbasin, but most of them were free of AVM or showed only trace levels of these compounds, except for one sample that contained doramectin (9.11 µg L(-1)). The method is suitable for routine analysis with satisfactory recovery, sensitivity, and selectivity. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Selection and validation of reference genes for gene expression analysis in switchgrass (Panicum virgatum) using quantitative real-time RT-PCR.

    Directory of Open Access Journals (Sweden)

    Jacinta Gimeno

    Full Text Available Switchgrass (Panicum virgatum) has received a lot of attention as a forage and bioenergy crop during the past few years. Gene expression studies are in progress to improve new traits and develop new cultivars. Quantitative real-time PCR (qRT-PCR) has emerged as an important technique to study gene expression analysis. For accurate and reliable results, normalization of data with reference genes is essential. In this work, we evaluate the stability of expression of genes to use as reference for qRT-PCR in the grass P. virgatum. Eleven candidate reference genes, including eEF-1α, UBQ6, ACT12, TUB6, eIF-4a, GAPDH, SAMDC, TUA6, CYP5, U2AF, and FTSH4, were validated for qRT-PCR normalization in different plant tissues and under different stress conditions. The expression stability of these genes was verified by the use of two distinct algorithms, geNorm and NormFinder. Differences were observed after comparison of the ranking of the candidate reference genes identified by both programs, but eEF-1α, eIF-4a, CYP5 and U2AF are ranked as the most stable genes in the sample sets under study. Both programs discard the use of SAMDC and TUA6 for normalization. Validation of the reference genes proposed by geNorm and NormFinder was performed by normalization of transcript abundance of a group of target genes in different samples. Results show similar expression patterns when the best reference genes selected by both programs were used, but differences were detected in the transcript abundance of the target genes. Based on the above research, we recommend the use of different statistical algorithms to identify the best reference genes for expression data normalization. The best genes selected in this study will help to improve the quality of gene expression data in a wide variety of samples in switchgrass.

  18. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging.

    Science.gov (United States)

    Zhang, Shuanghui; Liu, Yongxiang; Li, Xiang; Bi, Guoan

    2016-04-28

    This paper presents a novel inverse synthetic aperture radar (ISAR) imaging algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which helps to achieve performance improvement on sparse representation. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve a better focused radar image. In the proposed method of ISAR imaging, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. The maximum a posteriori (MAP) estimation and the maximum likelihood estimation (MLE) are utilized to estimate the model parameters to avoid a manual tuning process. Additionally, the fast Fourier transform (FFT) and Hadamard product are used to reduce the required computation. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms the traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.
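The minimum-entropy autofocus criterion mentioned above has a compact form: treat the normalized image intensity as a probability distribution and minimize its Shannon entropy; a well-focused image concentrates energy into few pixels and so has low entropy. A toy sketch with 2x2 "images" (not the paper's data, and omitting the joint phase-error estimation entirely):

```python
import math

def image_entropy(img):
    """Shannon entropy of the normalized pixel intensity distribution."""
    power = [abs(v) ** 2 for row in img for v in row]
    total = sum(power)
    p = [v / total for v in power]
    return -sum(v * math.log(v) for v in p if v > 0)

focused   = [[10.0, 0.1], [0.1, 0.1]]   # energy concentrated in one pixel
defocused = [[5.0, 5.0], [5.0, 5.0]]    # energy smeared uniformly
print(image_entropy(focused) < image_entropy(defocused))  # True
```

An autofocus loop would search over candidate phase-error corrections and keep the one minimizing this entropy; the sketch shows only the criterion itself.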

  19. Example-driven manifold priors for image deconvolution.

    Science.gov (United States)

    Ni, Jie; Turaga, Pavan; Patel, Vishal M; Chellappa, Rama

    2011-11-01

    Image restoration methods that exploit prior information about images to be estimated have been extensively studied, typically using the Bayesian framework. In this paper, we consider the role of prior knowledge of the object class in the form of a patch manifold to address the deconvolution problem. Specifically, we incorporate unlabeled image data of the object class, say natural images, in the form of a patch-manifold prior for the object class. The manifold prior is implicitly estimated from the given unlabeled data. We show how the patch-manifold prior effectively exploits the available sample class data for regularizing the deblurring problem. Furthermore, we derive a generalized cross-validation (GCV) function to automatically determine the regularization parameter at each iteration without explicitly knowing the noise variance. Extensive experiments show that this method performs better than many competitive image deconvolution methods.
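The GCV function mentioned above can be sketched compactly if we assume the blur operator has been diagonalized (e.g., in a spectral basis), so the Tikhonov influence matrix reduces to scalar filter factors. All numbers below are illustrative, not from the paper, and the manifold-prior regularizer itself is not modeled here:

```python
def gcv(lam, a, b):
    """GCV(lam) = n * ||(I - A_lam) b||^2 / trace(I - A_lam)^2.

    a: spectrum (singular values) of the diagonalized blur operator
    b: observed (blurred + noisy) coefficients in the same basis
    """
    n = len(b)
    f = [ai * ai / (ai * ai + lam) for ai in a]   # Tikhonov filter factors
    resid = sum(((1.0 - fi) * bi) ** 2 for fi, bi in zip(f, b))
    trace = sum(1.0 - fi for fi in f)
    return n * resid / (trace * trace)

a = [1.0, 0.5, 0.1, 0.05]          # illustrative operator spectrum
b = [0.9, 0.4, 0.3, 0.28]          # illustrative observations
candidates = [10.0 ** k for k in range(-6, 1)]
best = min(candidates, key=lambda lam: gcv(lam, a, b))
print(best)
```

The key property, as the paper notes, is that the noise variance never appears: GCV trades residual size against the effective degrees of freedom directly.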

  20. Intraoperative validation of CT-based lymph nodal levels, sublevels IIa and IIb: Is it of clinical relevance in selective radiation therapy?

    International Nuclear Information System (INIS)

    Levendag, Peter; Gregoire, Vincent; Hamoir, Marc; Voet, Peter; Est, Henrie van der; Heijmen, Ben; Kerrebijn, Jeroen

    2005-01-01

    Purpose: The objectives of this study are to discuss the intraoperative validation of CT-based boundaries of lymph nodal levels in the neck, and in particular the clinical relevance of the delineation of sublevels IIa and IIb in the case of selective radiation therapy (RT). Methods and Materials: To validate the radiologically defined level contours, clips were positioned intraoperatively at the level boundaries defined by surgical anatomy. In 10 consecutive patients, clips were placed at the most cranial border of the neck at the time a neck dissection was performed. Anterior-posterior and lateral X-ray films were obtained intraoperatively. Next, in 3 patients, neck levels were contoured on preoperative contrast-enhanced CT scans according to the international consensus guidelines. From each of these 3 patients, an intraoperative CT scan was also obtained, with clips placed at the surgical-anatomy-based level boundaries. The preoperative (CT-based) and intraoperative (surgery-defined) CT scans were matched. Results: Clips placed at the most cranial part of the neck lined up at the caudal part of the transverse process of the cervical vertebra C-I. The posterior border of surgical level IIa (spinal accessory nerve [SAN]) did not match the posterior border of CT-based level IIa (internal jugular vein [IJV]). Other surgical boundaries and CT-based contours were in good agreement. Conclusions: The cranial border of the neck, i.e., the cranial border of level IIa/IIb, corresponds to the caudal edge of the lateral process of C-I. Except for the posterior border between level IIa and level IIb, a perfect match was observed between the other surgical-clip-identified level II-V boundaries (surgical anatomy) and the CT-based delineation contours. It is argued that (1) because of the parotid gland overlapping part of level II, and (2) the frequent infestation of occult metastatic cells in the lymph channels around the IJV, the division of level II into radiologic

  1. Selection and Validation of Reference Genes for Quantitative Real-Time PCR Normalization Under Ethanol Stress Conditions in Oenococcus oeni SD-2a

    Directory of Open Access Journals (Sweden)

    Shuai Peng

    2018-05-01

    Full Text Available Quantitative real-time PCR (RT-qPCR) is widely used to assess gene expression levels, which requires optimal reference genes for normalization. Oenococcus oeni (O. oeni), one of the most important microorganisms in the wine industry and the lactic acid bacteria (LAB) species most resistant to ethanol, has not been investigated regarding the selection of stable reference genes for RT-qPCR normalization under ethanol stress conditions. In this study, nine candidate reference genes (proC, dnaG, rpoA, ldhD, ddlA, rrs, gyrA, gyrB, and dpoIII) were analyzed to determine the most stable reference genes for RT-qPCR in O. oeni SD-2a under different ethanol stress conditions (8, 12, and 16% (v/v) ethanol). The transcript stabilities of these genes were evaluated using the algorithms geNorm, NormFinder, and BestKeeper. The results showed that dnaG and dpoIII were the best reference genes across all experimental ethanol conditions. Considering single-stress experimental modes, dpoIII and dnaG would be suitable to normalize expression levels for the 8% ethanol shock treatment, while the combination of gyrA, gyrB, and rrs would be suitable for the 12% ethanol shock treatment. proC and gyrB showed the most stable expression in the 16% ethanol shock treatment. This study selected and validated for the first time reference genes for RT-qPCR normalization in O. oeni SD-2a under ethanol stress conditions.
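
    The geNorm stability measure used in studies like this one ranks a candidate reference gene by the average variability of its log-ratio against every other candidate. A minimal sketch of that measure, with hypothetical expression values (not the study's data):

```python
import numpy as np

def genorm_stability(expr, genes):
    """geNorm-style expression stability M for each candidate reference gene.

    expr: (n_samples, n_genes) array of relative expression values (linear scale).
    For gene j, M_j is the mean, over all other genes k, of the standard
    deviation across samples of log2(expr_j / expr_k); lower M = more stable.
    """
    log_expr = np.log2(np.asarray(expr, dtype=float))
    n = log_expr.shape[1]
    m_values = {}
    for j in range(n):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n) if k != j]
        m_values[genes[j]] = float(np.mean(sds))
    return m_values

# Hypothetical relative-expression values for three candidate genes:
# dnaG and dpoIII covary perfectly, gyrA fluctuates.
expr = np.array([[1.0, 2.0, 1.1],
                 [1.2, 2.4, 3.0],
                 [0.9, 1.8, 0.4]])
m = genorm_stability(expr, ["dnaG", "dpoIII", "gyrA"])
```

    Lower M indicates a more stable gene; geNorm itself additionally removes the least stable gene iteratively and recomputes M.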

  2. The critical success factors and impact of prior knowledge to nursing students when transferring nursing knowledge during nursing clinical practise.

    Science.gov (United States)

    Tsai, Ming-Tien; Tsai, Ling-Long

    2005-11-01

    Nursing practice plays an important role in transferring nursing knowledge to nursing students. The related literature indicates that prior knowledge affects how learners gain new knowledge. There has been no direct examination of the interaction effect of prior knowledge on students' performance and its influence on nursing students when evaluating the knowledge transfer success factors. This study explores (1) the critical success factors in transferring nursing knowledge and (2) the impact of prior knowledge when evaluating the success factors for transferring nursing knowledge. This research utilizes in-depth interviews to probe success factors in the initial phase. A total of 422 valid questionnaires were collected by the authors. The data were analysed by comparing mean scores and performing t-tests between the two groups. Seventeen critical success factors were identified by the two groups of students. Twelve items were selected to examine the diversity between the two groups. Students with prior knowledge were more independent than the other group. They also preferred self-directed learning more than students without prior knowledge. Students who did not have prior knowledge were eager to take every opportunity to gain experience and more readily adopted new knowledge.

  3. Quantum steganography using prior entanglement

    International Nuclear Information System (INIS)

    Mihara, Takashi

    2015-01-01

    Steganography is the hiding of secret information within innocent-looking information (e.g., text, audio, image, video, etc.). A quantum version of steganography is a method based on quantum physics. In this paper, we propose quantum steganography that combines quantum error-correcting codes with prior entanglement. In many steganographic techniques, embedding secret messages in error-correcting codes may cause damage to them if the embedded part is corrupted. However, our proposed steganography can create secret messages separately from the content of cover messages. The intrinsic form of the cover message does not have to be modified for embedding secret messages. - Highlights: • Our steganography combines quantum error-correcting codes with prior entanglement. • Our steganography can create secret messages separately from the content of cover messages. • Errors in cover messages do not affect the recovery of secret messages. • We embed a secret message in the Steane code as an example of our steganography.

  4. Quantum steganography using prior entanglement

    Energy Technology Data Exchange (ETDEWEB)

    Mihara, Takashi, E-mail: mihara@toyo.jp

    2015-06-05

    Steganography is the hiding of secret information within innocent-looking information (e.g., text, audio, image, video, etc.). A quantum version of steganography is a method based on quantum physics. In this paper, we propose quantum steganography that combines quantum error-correcting codes with prior entanglement. In many steganographic techniques, embedding secret messages in error-correcting codes may cause damage to them if the embedded part is corrupted. However, our proposed steganography can create secret messages separately from the content of cover messages. The intrinsic form of the cover message does not have to be modified for embedding secret messages. - Highlights: • Our steganography combines quantum error-correcting codes with prior entanglement. • Our steganography can create secret messages separately from the content of cover messages. • Errors in cover messages do not affect the recovery of secret messages. • We embed a secret message in the Steane code as an example of our steganography.

  5. Prior information in structure estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Nedoma, Petr; Khailova, Natalia; Pavelková, Lenka

    2003-01-01

    Roč. 150, č. 6 (2003), s. 643-653 ISSN 1350-2379 R&D Projects: GA AV ČR IBS1075102; GA AV ČR IBS1075351; GA ČR GA102/03/0049 Institutional research plan: CEZ:AV0Z1075907 Keywords : prior knowledge * structure estimation * autoregressive models Subject RIV: BC - Control Systems Theory Impact factor: 0.745, year: 2003 http://library.utia.cas.cz/separaty/historie/karny-0411258.pdf

  6. Derivation and validation of two decision instruments for selective chest CT in blunt trauma: a multicenter prospective observational study (NEXUS Chest CT).

    Science.gov (United States)

    Rodriguez, Robert M; Langdorf, Mark I; Nishijima, Daniel; Baumann, Brigitte M; Hendey, Gregory W; Medak, Anthony J; Raja, Ali S; Allen, Isabel E; Mower, William R

    2015-10-01

    Unnecessary diagnostic imaging leads to higher costs, longer emergency department stays, and increased patient exposure to ionizing radiation. We sought to prospectively derive and validate two decision instruments (DIs) for selective chest computed tomography (CT) in adult blunt trauma patients. From September 2011 to May 2014, we prospectively enrolled blunt trauma patients over 14 years of age presenting to eight urban US level 1 trauma centers in this observational study. During the derivation phase, physicians recorded the presence or absence of 14 clinical criteria before viewing chest imaging results. We determined injury outcomes by CT radiology readings and categorized injuries as major or minor according to an expert-panel-derived clinical classification scheme. We then employed recursive partitioning to derive two DIs: Chest CT-All maximized sensitivity for all injuries, and Chest CT-Major maximized sensitivity for only major thoracic injuries (while increasing specificity). In the validation phase, we employed similar methodology to prospectively test the performance of both DIs. We enrolled 11,477 patients: 6,002 in the derivation phase and 5,475 in the validation phase. The derived Chest CT-All DI consisted of (1) abnormal chest X-ray, (2) rapid deceleration mechanism, (3) distracting injury, (4) chest wall tenderness, (5) sternal tenderness, (6) thoracic spine tenderness, and (7) scapular tenderness. The Chest CT-Major DI had the same criteria without rapid deceleration mechanism. In the validation phase, Chest CT-All had a sensitivity of 99.2% (95% CI 95.4%-100%), a specificity of 20.8% (95% CI 19.2%-22.4%), and a negative predictive value (NPV) of 99.8% (95% CI 98.9%-100%) for major injury, and a sensitivity of 95.4% (95% CI 93.6%-96.9%), a specificity of 25.5% (95% CI 23.5%-27.5%), and a NPV of 93.9% (95% CI 91.5%-95.8%) for either major or minor injury.
Chest CT-Major had a sensitivity of 99.2% (95% CI 95.4%-100%), a specificity of
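
    Both instruments are "one-or-more" rules: CT is indicated if any criterion is positive, and only a patient negative on all criteria is classified as low risk. A minimal sketch (criterion names paraphrased from the abstract; the published DIs carry clinical definitions not reproduced here):

```python
# Hypothetical encoding of the derived decision instruments.
CHEST_CT_ALL = [
    "abnormal_chest_xray",
    "rapid_deceleration_mechanism",
    "distracting_injury",
    "chest_wall_tenderness",
    "sternal_tenderness",
    "thoracic_spine_tenderness",
    "scapular_tenderness",
]
# Chest CT-Major drops the rapid-deceleration criterion.
CHEST_CT_MAJOR = [c for c in CHEST_CT_ALL if c != "rapid_deceleration_mechanism"]

def ct_indicated(findings, criteria):
    """findings: dict of criterion -> bool for one patient.
    CT is indicated whenever at least one criterion is present."""
    return any(findings.get(c, False) for c in criteria)

patient = {"sternal_tenderness": True}
needs_ct = ct_indicated(patient, CHEST_CT_ALL)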

  7. Assessment of Prior Learning in Adult Vocational Education and Training

    Directory of Open Access Journals (Sweden)

    Vibe Aarkrog

    2015-04-01

    Full Text Available The article deals with the results of a study of school-based Assessment of Prior Learning of adults who have enrolled as students in a VET college in order to qualify for occupations as skilled workers. Based on examples of VET teachers' methods for assessing the students' prior learning in the programs for gastronomes and child care assistants, respectively, the article discusses two issues in relation to Assessment of Prior Learning: the encounter of practical experience and school-based knowledge, and the validity and reliability of the assessment procedures. Through focusing on the students' knowing that and knowing why, the assessment is based on a scholastic perception of the students' needs for training, reflecting one of the most important challenges in Assessment of Prior Learning: how can practical experience be transformed into credits for the knowledge parts of the programs? The study shows that by combining several Assessment of Prior Learning methods and comparing the teachers' assessments, the teachers respond to the issues of validity and reliability. However, validity and reliability might be strengthened even further if the competencies are well defined, if the education system is aware of securing a reasonable balance between knowing how, knowing that, and knowing why, and if the teachers are adequately trained for the assessment procedures.

  8. Selection and validation of reference genes for quantitative gene expression analyses in various tissues and seeds at different developmental stages in Bixa orellana L.

    Science.gov (United States)

    Moreira, Viviane S; Soares, Virgínia L F; Silva, Raner J S; Sousa, Aurizangela O; Otoni, Wagner C; Costa, Marcio G C

    2018-05-01

    Bixa orellana L., popularly known as annatto, produces several secondary metabolites of pharmaceutical and industrial interest, including bixin, whose molecular basis of biosynthesis remains to be determined. Gene expression analysis by quantitative real-time PCR (qPCR) is an important tool to advance such knowledge. However, correct interpretation of qPCR data requires the use of suitable reference genes in order to reduce experimental variation. In the present study, we selected four candidate reference genes in B. orellana, coding for 40S ribosomal protein S9 (RPS9), histone H4 (H4), 60S ribosomal protein L38 (RPL38) and 18S ribosomal RNA (18SrRNA). Their expression stabilities in different tissues (e.g., flower buds, flowers, leaves and seeds at different developmental stages) were analyzed using five statistical tools (NormFinder, geNorm, BestKeeper, the ΔCt method and RefFinder). The results indicated that RPL38 is the most stable gene across the different tissues and stages of seed development, and 18SrRNA is the most unstable among the analyzed genes. In order to validate the candidate reference genes, we analyzed the relative expression of a target gene coding for carotenoid cleavage dioxygenase 1 (CCD1) using the stable RPL38 and the least stable gene, 18SrRNA, for normalization of the qPCR data. The results demonstrated significant differences in the interpretation of the CCD1 gene expression data depending on the reference gene used, reinforcing the importance of the correct selection of reference genes for normalization.
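
    The abstract does not state which quantification model was used; a common choice for computing a target gene's expression relative to a reference gene and a calibrator sample is the Livak 2^-ΔΔCt method, sketched here with hypothetical Ct values:

```python
def relative_expression(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Livak 2^-ΔΔCt relative quantification.

    Inputs are mean qPCR cycle-threshold (Ct) values for the target and
    reference gene, in the sample of interest and in a calibrator sample.
    Returns the fold change of the target relative to the calibrator.
    """
    delta_sample = ct_target - ct_ref        # ΔCt in the sample
    delta_cal = ct_target_cal - ct_ref_cal   # ΔCt in the calibrator
    return 2.0 ** -(delta_sample - delta_cal)

# Hypothetical Ct values: the target crosses threshold 2 cycles earlier
# (relative to the reference) in the sample than in the calibrator,
# corresponding to a 4-fold up-regulation.
fold = relative_expression(22.0, 18.0, 26.0, 20.0)  # ΔΔCt = 4 - 6 = -2
```

    An unstable reference gene shifts ΔCt unpredictably between conditions, which is exactly how normalizing to 18SrRNA distorts the CCD1 result.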

  9. Acquisition of multiple prior distributions in tactile temporal order judgment

    Directory of Open Access Journals (Sweden)

    Yasuhito eNagai

    2012-08-01

    Full Text Available The Bayesian estimation theory proposes that the brain acquires the prior distribution of a task and integrates it with sensory signals to minimize the effect of sensory noise. Psychophysical studies have demonstrated that our brain actually implements Bayesian estimation in a variety of sensory-motor tasks. However, these studies only imposed one prior distribution on participants within a task period. In this study, we investigated the conditions that enable the acquisition of multiple prior distributions in temporal order judgment (TOJ) of two tactile stimuli across the hands. In Experiment 1, stimulation intervals were randomly selected from one of two prior distributions (biased to right hand earlier and biased to left hand earlier) in association with color cues (green and red, respectively). Although the acquisition of the two priors was not enabled by the color cues alone, it was significant when participants shifted their gaze (above or below) in response to the color cues. However, the acquisition of multiple priors was not significant when participants moved their mouths (opened or closed). In Experiment 2, spatial cues (above and below) were used to identify which eye position or retinal cue position was crucial for the eye-movement-dependent acquisition of multiple priors in Experiment 1. The acquisition of the two priors was significant when participants moved their gaze to the cues (i.e., the cue positions on the retina were constant across the priors), as well as when participants did not shift their gaze (i.e., the cue positions on the retina changed according to the priors). Thus, both eye and retinal cue positions were effective in acquiring multiple priors. Based on previous neurophysiological reports, we discuss possible neural correlates that contribute to the acquisition of multiple priors.
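
    For Gaussian priors and Gaussian sensory noise, the Bayesian estimate the abstract refers to is a precision-weighted compromise between the prior mean and the observation. A minimal sketch with hypothetical numbers (negative intervals meaning "right hand earlier"):

```python
def bayes_fuse(prior_mean, prior_var, obs, obs_var):
    """Posterior mean and variance for a Gaussian prior and Gaussian
    likelihood: the estimate is pulled toward the prior in proportion
    to how noisy the sensory observation is."""
    w = prior_var / (prior_var + obs_var)       # weight on the observation
    post_mean = prior_mean + w * (obs - prior_mean)
    post_var = prior_var * obs_var / (prior_var + obs_var)
    return post_mean, post_var

# Prior "right hand earlier" (mean interval -50 ms), noisy observation +30 ms.
mean, var = bayes_fuse(prior_mean=-50.0, prior_var=100.0, obs=30.0, obs_var=300.0)
```

    With two priors in play, the same observation yields different estimates depending on which prior the cue (e.g., gaze position) activates.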

  10. On the prior probabilities for two-stage Bayesian estimates

    International Nuclear Information System (INIS)

    Kohut, P.

    1992-01-01

    The method of Bayesian inference is reexamined for its applicability and for the required underlying assumptions in obtaining and using prior probability estimates. Two different approaches are suggested to determine the first-stage priors in the two-stage Bayesian analysis which avoid certain assumptions required by other techniques. In the first scheme, the prior is obtained through a true frequency-based distribution generated at selected intervals utilizing actual sampling of the failure rate distributions. The population variability distribution is generated as the weighted average of the frequency distributions. The second method is based on a non-parametric Bayesian approach using the Maximum Entropy Principle. Specific features such as integral properties or selected parameters of prior distributions may be obtained with minimal assumptions. It is indicated how various quantiles may also be generated with a least-squares technique.
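
    The first scheme's population-variability distribution, described above as the weighted average of per-source frequency distributions, can be sketched as follows (the grid, histograms, and weights below are hypothetical):

```python
import numpy as np

def population_variability(freq_dists, weights):
    """Population-variability prior as the weighted average of per-source
    frequency distributions over a common failure-rate grid.

    freq_dists: (n_sources, n_bins) array, each row a normalized histogram.
    weights: (n_sources,) nonnegative weights (e.g. source sample sizes).
    """
    freq_dists = np.asarray(freq_dists, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                     # normalize the weights
    prior = w @ freq_dists              # weighted average, bin by bin
    return prior / prior.sum()          # renormalize against rounding error

# Two hypothetical failure-rate histograms on a 4-bin grid, weighted 3:1.
prior = population_variability([[0.7, 0.2, 0.1, 0.0],
                                [0.1, 0.2, 0.3, 0.4]],
                               weights=[3, 1])
```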

  11. Validation of the catalytic properties of Cu-Os/13X using single fixed bed reactor in selective catalytic reduction of NO

    International Nuclear Information System (INIS)

    Oh, Kwang Seok; Woo, Seong Ihl

    2007-01-01

    Catalytic decomposition of NO over Cu-Os/13X has been carried out in a tubular fixed-bed reactor at atmospheric pressure, and the results were compared with literature data obtained by high-throughput screening (HTS). The activity and durability of Cu-Os/13X prepared by a conventional ion-exchange method have been investigated in the presence of H2O and SO2. It was found that Cu-Os/13X prepared by ion exchange shows high activity over a wide temperature range in the selective catalytic reduction (SCR) of NO with C3H6 compared to Cu/13X, indicating the existence of more NO adsorption sites on Cu-Os/13X. However, Cu-Os/13X exhibited low activity in the presence of water, quite different from the result reported in the literature. Its SO2 resistance is also low, and it does not recover its original activity when SO2 is removed from the feed gas stream. This result suggests that catalytic activity from combinatorial screening and from conventional testing should be compared to confirm the validity of high-throughput screening.

  12. VALIDATION OF 13C-UREA BREATH TEST WITH NON DISPERSIVE ISOTOPE SELECTIVE INFRARED SPECTROSCOPY FOR THE DIAGNOSIS OF HELICOBACTER PYLORI INFECTION: A SURVEY IN IRANIAN POPULATION

    Directory of Open Access Journals (Sweden)

    Davood Beiki

    2005-04-01

    Full Text Available The urea breath test (UBT), which is carried out with 13C- or 14C-labeled urea, is one of the most important non-invasive methods for detection of Helicobacter pylori infection. Application of the 13C-UBT is becoming increasingly popular because of its non-radioactive nature, which makes it suitable for diagnostic purposes in children and in women of childbearing age. While an isotope ratio mass spectrometer (IRMS) is generally used to detect 13C in expired breath, this instrument is expensive; recently, non-dispersive isotope-selective infrared (NDIR) spectroscopy, a lower-cost technique, has been employed as a reliable counterpart to IRMS in small clinics. The aim of this study was to assess the validity of the NDIR spectroscopy technique in an Iranian population in comparison with histological examination, the rapid urease test, and the 14C-urea breath test as gold standards. Seventy-six patients with dyspepsia underwent the 13C-UBT for diagnosis of Helicobacter pylori infection. Good agreement was found between the 13C-UBT and the gold standard methods. The 13C-UBT showed 100% sensitivity, 97.3% specificity, 97.56% positive predictive value, 100% negative predictive value, and 98.65% accuracy. On the basis of these results, it can be concluded that the 13C-UBT performed with NDIR spectroscopy is a reliable, accurate, and non-invasive diagnostic tool for detection of Helicobacter pylori infection in the Iranian population.
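
    The reported performance figures are the standard 2x2 diagnostic measures. A sketch of their computation (the counts below are hypothetical, since the abstract reports only the derived percentages):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-performance measures from a 2x2 confusion matrix:
    true/false positives (tp, fp) and false/true negatives (fn, tn)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts for a 76-patient cohort: 40 infected, 36 not,
# with one false positive and no false negatives.
m = diagnostic_metrics(tp=40, fp=1, fn=0, tn=35)
```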

  13. Clustering and Bayesian hierarchical modeling for the definition of informative prior distributions in hydrogeology

    Science.gov (United States)

    Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.

    2017-12-01

    In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all available data. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and to combine it with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists of two steps, both relying on statistical and machine learning methods. The first step is data selection: sites similar to the target site are selected using clustering methods based on observable hydrogeological features. The second step is data assimilation: data from the selected similar sites are assimilated into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of these methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R package to facilitate easy use by other practitioners.
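
    The data-selection step can be sketched as ranking candidate sites by feature-space distance to the target site. The paper uses clustering on observable hydrogeological features, so the nearest-neighbour stand-in and the feature values below are illustrative assumptions only:

```python
import numpy as np

def select_similar_sites(target, site_features, k=2):
    """Rank candidate sites by Euclidean distance to the target site in
    observable-feature space and keep the k closest. This is a stand-in
    for the clustering step; the paper's actual feature set is not given."""
    X = np.asarray(site_features, dtype=float)
    d = np.linalg.norm(X - np.asarray(target, dtype=float), axis=1)
    return np.argsort(d)[:k]

# Hypothetical sites described by (log-permeability mean, porosity).
idx = select_similar_sites([0.0, 0.3],
                           [[0.1, 0.28], [5.0, 0.10], [-0.2, 0.33], [4.0, 0.05]])
```

    Data from the selected sites would then feed the hierarchical model that pools them into an informative prior for the target site.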

  14. Depth image enhancement using perceptual texture priors

    Science.gov (United States)

    Bang, Duhyeon; Shim, Hyunjung

    2015-03-01

    A depth camera is widely used in various applications because it provides a depth image of the scene in real time. However, due to its limited power consumption, the depth camera presents severe noise and is incapable of providing high-quality 3D data. Although a smoothness prior is often employed to suppress the depth noise, it discards geometric details, degrading the distance resolution and hindering realism in 3D content. In this paper, we propose a perception-based depth image enhancement technique that automatically recovers the depth details of various textures, using a statistical framework inspired by the human mechanism of perceiving surface details through texture priors. We construct a database composed of high-quality normals. Based on recent studies in human visual perception (HVP), we select pattern density as the primary feature to classify textures. Based on the classification results, we match and substitute the noisy input normals with high-quality normals from the database. As a result, our method provides a high-quality depth image preserving the surface details. We expect our work to be effective in enhancing the details of depth images from 3D sensors and in providing a high-fidelity virtual reality experience.

  15. Introduction to the Monte Carlo project and the approach to the validation of probabilistic models of dietary exposure to selected food chemicals

    NARCIS (Netherlands)

    Gibney, M.J.; Voet, van der H.

    2003-01-01

    The Monte Carlo project was established to allow an international collaborative effort to define conceptual models for food chemical and nutrient exposure, to define and validate the software code to govern these models, to provide new or reconstructed databases for validation studies, and to use

  16. 7 CFR 4290.480 - Prior approval of changes to RBIC's business plan.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Prior approval of changes to RBIC's business plan... § 4290.480 Prior approval of changes to RBIC's business plan. Without the Secretary's prior written approval, no change in your business plan, upon which you were selected and licensed as a RBIC, may take...

  17. Bayesian Image Restoration Using a Large-Scale Total Patch Variation Prior

    Directory of Open Access Journals (Sweden)

    Yang Chen

    2011-01-01

    Full Text Available Edge-preserving Bayesian restorations using nonquadratic priors are often inefficient in restoring continuous variations and tend to produce block artifacts around edges in ill-posed inverse image restorations. To overcome this, we have proposed a spatial adaptive (SA) prior with improved performance. However, restoration with this SA prior suffers from high computational cost and an unguaranteed convergence problem. Concerning these issues, this paper proposes a Large-scale Total Patch Variation (LS-TPV) prior model for Bayesian image restoration. In this model, the prior for each pixel is defined as a singleton conditional probability, which is in a mixture prior form of one patch-similarity prior and one weight-entropy prior. A joint MAP estimation is thus built to ensure the monotonicity of the iteration. The intensive calculation of patch distances is greatly alleviated by parallelization with the Compute Unified Device Architecture (CUDA). Experiments with both simulated and real data validate the good performance of the proposed restoration.

  18. Divergent Priors and well Behaved Bayes Factors

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2011-01-01

    Divergent priors are improper when defined on unbounded supports. Bartlett's paradox has been taken to imply that using improper priors results in ill-defined Bayes factors, preventing model comparison by posterior probabilities. However, many improper priors have attractive properties

  19. Iterated random walks with shape prior

    DEFF Research Database (Denmark)

    Pujadas, Esmeralda Ruiz; Kjer, Hans Martin; Piella, Gemma

    2016-01-01

    We propose a new framework for image segmentation using random walks where a distance shape prior is combined with a region term. The shape prior is weighted by a confidence map to reduce the influence of the prior in high-gradient areas, and the region term is computed with k-means to estimate the parametric probability density function. Then, random walks is performed iteratively, aligning the prior with the current segmentation in every iteration. We tested the proposed approach with natural and medical images and compared it with the latest techniques using random walks and shape priors. The experiments suggest that this method gives promising results for medical and natural images.

  20. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging

    Directory of Open Access Journals (Sweden)

    Shuanghui Zhang

    2016-04-01

    Full Text Available This paper presents a novel Inverse Synthetic Aperture Radar (ISAR) imaging algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which improves sparse representation performance. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve a better-focused radar image. In the proposed method, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. Maximum a posteriori (MAP) estimation and maximum likelihood estimation (MLE) are utilized to estimate the model parameters, avoiding a manual tuning process. Additionally, the fast Fourier transform (FFT) and the Hadamard product are used to reduce the required computation. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.
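
    As context for the comparison, MAP estimation under the ordinary Laplacian prior with Gaussian noise reduces to elementwise soft-thresholding; the logarithmic Laplacian prior reshapes this penalty (its exact functional form is not reproduced in this record). A sketch of the Laplacian baseline:

```python
import numpy as np

def laplacian_map(y, lam):
    """MAP estimate under y = x + Gaussian noise with a Laplacian prior on x,
    i.e. argmin_x 0.5*(y - x)**2 + lam*|x|, solved elementwise by
    soft-thresholding: small coefficients are zeroed, large ones shrunk."""
    y = np.asarray(y, dtype=float)
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Coefficients below the threshold lam are suppressed (sparsity),
# larger ones are shrunk toward zero by lam.
x_hat = laplacian_map([3.0, -0.5, 0.2, -4.0], lam=1.0)
```

    A heavier-tailed prior such as the logarithmic Laplacian shrinks large coefficients less, which is what preserves strong scatterers while suppressing noise.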

  1. Novel Selectivity-Based Forensic Toxicological Validation of a Paper Spray Mass Spectrometry Method for the Quantitative Determination of Eight Amphetamines in Whole Blood

    NARCIS (Netherlands)

    Teunissen, Sebastiaan F.; Fedick, Patrick W.; Berendsen, Bjorn J.A.; Nielen, Michel W.F.; Eberlin, Marcos N.; Graham Cooks, R.; Asten, van Arian C.

    2017-01-01

    Paper spray tandem mass spectrometry is used to identify and quantify eight individual amphetamines in whole blood in 1.3 min. The method has been optimized and fully validated according to forensic toxicology guidelines, for the quantification of amphetamine, methamphetamine,

  2. The development of Assessment of SpondyloArthritis international Society classification criteria for axial spondyloarthritis (part II): validation and final selection

    NARCIS (Netherlands)

    Rudwaleit, M.; van der Heijde, D.; Landewé, R.; Listing, J.; Akkoc, N.; Brandt, J.; Braun, J.; Chou, C. T.; Collantes-Estevez, E.; Dougados, M.; Huang, F.; Gu, J.; Khan, M. A.; Kirazli, Y.; Maksymowych, W. P.; Mielants, H.; Sørensen, I. J.; Ozgocmen, S.; Roussou, E.; Valle-Oñate, R.; Weber, U.; Wei, J.; Sieper, J.

    2009-01-01

    To validate and refine two sets of candidate criteria for the classification/diagnosis of axial spondyloarthritis (SpA). All Assessment of SpondyloArthritis international Society (ASAS) members were invited to include consecutively new patients with chronic (> or =3 months) back pain of unknown

  3. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  4. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-05-25

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  5. A Single Image Dehazing Method Using Average Saturation Prior

    Directory of Open Access Journals (Sweden)

    Zhenfei Gu

    2017-01-01

    Outdoor images captured in bad weather are prone to poor visibility, which is a serious problem for most computer vision applications. The majority of existing dehazing methods rely on an atmospheric scattering model and therefore share a common limitation: the model is only valid when the atmosphere is homogeneous. In this paper, we propose an improved atmospheric scattering model to overcome this inherent limitation. Based on the proposed model, a corresponding dehazing method is also presented. In this method, we first create a haze density distribution map of a hazy image, which enables us to segment the hazy image into scenes according to haze density similarity. Then, in order to improve the accuracy of atmospheric light estimation, we define an effective weight assignment function to locate a candidate scene based on the scene segmentation results, thereby avoiding most potential errors. Next, we propose a simple but powerful prior named the average saturation prior (ASP), which is a statistic of extensive high-definition outdoor images. Using this prior combined with the improved atmospheric scattering model, we can directly estimate the scene atmospheric scattering coefficient and restore the scene albedo. The experimental results verify that our model is physically valid, and the proposed method outperforms several state-of-the-art single image dehazing methods in terms of both robustness and effectiveness.
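
    The homogeneous atmospheric scattering model that the paper generalises, and its inversion for the scene radiance, can be sketched as follows; the parameter values and the transmission clamp t0 are illustrative choices, not the authors' method:

```python
import numpy as np

def hazy(J, depth, A=1.0, beta=1.0):
    """Homogeneous atmospheric scattering model: I = J*t + A*(1 - t),
    with transmission t = exp(-beta * depth) for scattering
    coefficient beta and scene depth."""
    t = np.exp(-beta * depth)
    return J * t + A * (1.0 - t), t

def dehaze(I, t, A=1.0, t0=0.1):
    """Invert the model for the scene radiance; t is clamped below by
    t0 to avoid amplifying noise where transmission is tiny."""
    return (I - A) / np.maximum(t, t0) + A
```

    Wherever the clamp is inactive the round trip is exact, since (J*t + A*(1-t) - A) / t + A = J.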

  6. Single-laboratory validation of a method for the determination of select volatile organic compounds in foods by using vacuum distillation with gas chromatography/mass spectrometry.

    Science.gov (United States)

    Nyman, Patricia J; Limm, William; Begley, Timothy H; Chirtel, Stuart J

    2014-01-01

    Recent studies showed that headspace and purge-and-trap methods have limitations when used to determine volatile organic compounds (VOCs) in foods, including matrix effects and artifact formation from precursors present in the sample matrix or from thermal decomposition. U.S. Environmental Protection Agency Method 8261A liberates VOCs from the sample matrix by using vacuum distillation at room temperature. The method was modified and validated for the determination of furan, chloroform, benzene, trichloroethene, toluene, and styrene in infant formula, canned tuna (in water), peanut butter, and an orange beverage (orange-flavored noncarbonated beverage). The validation studies showed that the LOQ values ranged from 0.05 ng/g toluene in infant formula to 5.10 ng/g toluene in peanut butter. Fortified recoveries were determined at the first, second, and third standard additions, at concentrations ranging from 0.07 to 6.9 ng/g. When quantified by the method of standard additions, the recoveries ranged from 56 to 218% at the first standard addition and 89 to 117% at the third. The validated method was used to conduct a survey of the targeted VOCs in 18 foods. The amounts found ranged from none detected to 73.8 ng/g furan in sweet potato baby food.
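
    Quantification by the method of standard additions, used above, extrapolates a linear fit of signal versus added analyte back to the x-axis; a sketch with invented numbers:

```python
import numpy as np

def standard_additions(added, signal):
    """Fit signal = m*added + b and return the x-intercept magnitude
    b/m, the estimated analyte amount in the unspiked sample
    (assumes a linear instrument response across the spiked range)."""
    m, b = np.polyfit(added, signal, 1)
    return b / m
```

    For example, spikes of 0, 1 and 2 ng/g producing signals of 6, 9 and 12 imply 2 ng/g of analyte in the original sample.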

  7. Advanced prior modeling for 3D bright field electron tomography

    Science.gov (United States)

    Sreehari, Suhas; Venkatakrishnan, S. V.; Drummy, Lawrence F.; Simmons, Jeffrey P.; Bouman, Charles A.

    2015-03-01

    Many important imaging problems in material science involve reconstruction of images containing repetitive non-local structures. Model-based iterative reconstruction (MBIR) could in principle exploit such redundancies through the selection of a log prior probability term. However, in practice, determining such a log prior term that accounts for the similarity between distant structures in the image is quite challenging. Much progress has been made in the development of denoising algorithms like non-local means and BM3D, and these are known to successfully capture non-local redundancies in images. But the fact that these denoising operations are not explicitly formulated as cost functions makes it unclear as to how to incorporate them in the MBIR framework. In this paper, we formulate a solution to bright field electron tomography by augmenting the existing bright field MBIR method to incorporate any non-local denoising operator as a prior model. We accomplish this using a framework we call plug-and-play priors that decouples the log likelihood and the log prior probability terms in the MBIR cost function. We specifically use 3D non-local means (NLM) as the prior model in the plug-and-play framework, and showcase high quality tomographic reconstructions of a simulated aluminum spheres dataset, and two real datasets of aluminum spheres and ferritin structures. We observe that streak and smear artifacts are visibly suppressed, and that edges are preserved. Also, we report lower RMSE values compared to the conventional MBIR reconstruction using qGGMRF as the prior model.

  8. Explicating Validity

    Science.gov (United States)

    Kane, Michael T.

    2016-01-01

    How we choose to use a term depends on what we want to do with it. If "validity" is to be used to support a score interpretation, validation would require an analysis of the plausibility of that interpretation. If validity is to be used to support score uses, validation would require an analysis of the appropriateness of the proposed…

  9. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described

  10. Improving Open Access through Prior Learning Assessment

    Science.gov (United States)

    Yin, Shuangxu; Kawachi, Paul

    2013-01-01

    This paper explores and presents new data on how to improve open access in distance education through using prior learning assessments. Broadly there are three types of prior learning assessment (PLAR): Type-1 for prospective students to be allowed to register for a course; Type-2 for current students to avoid duplicating work-load to gain…

  11. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.|info:eu-repo/dai/nl/322847796

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior, which allows for subjective input and thereby provides an alternative to deal with the difficulties associated with the joint power prior
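
    For the conjugate beta-binomial case, the conditional power prior has a closed form: the historical likelihood is raised to a discounting power a0 in [0, 1]. A sketch (the Beta(1, 1) initial prior and all counts are illustrative, not taken from the thesis):

```python
def power_prior_beta(y0, n0, y, n, a0, a=1.0, b=1.0):
    """Conditional power prior, beta-binomial case: historical data
    (y0 successes in n0 trials) enter the posterior raised to the
    power a0, so after current data (y, n) the posterior is
    Beta(a + a0*y0 + y,  b + a0*(n0 - y0) + (n - y))."""
    return a + a0 * y0 + y, b + a0 * (n0 - y0) + (n - y)
```

    Setting a0 = 0 discards the historical data entirely, while a0 = 1 pools it fully with the current data.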

  12. Selecting Future Teachers: The Predictive Validity of Communication Skills, Personality and Academic Achievement in the Admission Process at an Asian University

    Directory of Open Access Journals (Sweden)

    Richard J. Holmes

    2009-06-01

    This paper studies the relationship between communication skills, personality factors and performance in secondary school, and academic success in a Teaching English as a Second Language (TESL) programme at a Malaysian university. It was found that three specific skills (fluency, clarity and language use) were modestly predictive of success over the first six semesters of the degree programme, but that personality traits and general and educational knowledge were not. Performance on the Malaysian secondary school examination, especially in maths, also predicted academic success. It was also found that the qualities assessed at the interview were barely detectable by lecturers a little more than two years later, although communicative skills were somewhat more so than the others. The findings suggest that when students are studying in the medium of a second language, communicative competence and prior academic achievement, possibly reflective of underlying general intelligence, are important factors contributing to academic success.

  13. Investigating the treatment of missing data in an Olympiad-type test – the case of the selection validity in the South African Mathematics Olympiad

    Directory of Open Access Journals (Sweden)

    Caroline Long

    2016-10-01

    The purpose of the South African Mathematics Olympiad is to generate interest in mathematics and to identify the most talented mathematical minds. Our focus is on how the handling of missing data affects the selection of the ‘best’ contestants. Two approaches to handling missing data, applying the Rasch model, are described. The issue of guessing is investigated through a tailored analysis. We present two microanalyses to illustrate how missing data may impact selection; the first investigates groups of contestants that may miss selection under particular conditions; the second focuses on two contestants, each of whom answered 14 items correctly. This comparison raises questions about the proportion of correct to incorrect answers. Recommendations are made for future scoring of the test, which include reconsideration of negative marking and weighting, as well as considering the inclusion of 150 or 200 contestants, as opposed to 100, for participation in the final round.
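
    The impact of the two missing-data treatments can already be seen at the raw-score level; a toy sketch (not the authors' Rasch-based analysis):

```python
def proportion_correct(responses):
    """Raw proportion correct under two treatments of missing
    responses (None): score the omission as wrong, or drop the item
    and rescale by the number of items actually attempted."""
    as_wrong = sum(r if r is not None else 0 for r in responses) / len(responses)
    attempted = [r for r in responses if r is not None]
    as_not_administered = sum(attempted) / len(attempted)
    return as_wrong, as_not_administered
```

    A contestant answering 1, 1, 0 and omitting two items scores 0.40 under the first treatment but about 0.67 under the second, exactly the kind of gap that can change who makes the final selection.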

  14. How to select aspirant laparoscopic surgical trainees: establishing concurrent validity comparing Xitact LS500 index performance scores with standardized psychomotor aptitude test battery scores

    NARCIS (Netherlands)

    Schijven, Marlies P.; Jakimowicz, Jack J.; Carter, Fiona J.

    2004-01-01

    BACKGROUND: Although a controversial topic in medical education, the selection of aspirant surgical trainees is a subject that needs to be addressed. With a view to preventing surgical trainee drop-outs and to allocating limited resources appropriately, it is an issue critical to the profession.

  15. The test chemical selection procedure of the European Centre for the Validation of Alternative Methods for the EU Project ReProTect.

    Science.gov (United States)

    Pazos, Patricia; Pellizzer, Cristian; Stummann, Tina C; Hareng, Lars; Bremer, Susanne

    2010-08-01

    The selection of reference compounds is crucial for successful in vitro test development in order to prove the relevance of the test system. This publication describes the criteria and the selection strategy leading to a list of more than 130 chemicals suitable for test development within the ReProTect project. The presented chemical inventory aims to support the development and optimization of in vitro tests that seek to fulfill ECVAM's criteria for entering prevalidation. In order to select appropriate substances, a primary database was established compiling information from existing databases. In a second step, predefined selection criteria were applied to obtain a comprehensive list ready to undergo peer review by independent experts with industrial, academic and regulatory backgrounds. The resulting peer-reviewed chemical list contains 13 substances challenging endocrine disrupter tests, a further 50 substances serving as reference chemicals for various tests evaluating effects on male and female fertility, and 61 substances known to provoke effects on the early development of mammalian offspring. The final list aims to cover relevant and specific modes/sites of action known to be relevant for various substance classes. However, the recommended list should not be interpreted as a list of reproductive toxicants, because such a description requires proven associations with adverse effects on mammalian reproduction, which are the subject of regulatory decisions by the competent authorities involved. Copyright 2010 Elsevier Inc. All rights reserved.

  16. Systematic Model for Validating Equipment Uses in Selected Marketing and Distribution Education Programs. Final Report, February 1, 1980-June 30, 1981.

    Science.gov (United States)

    Gildan, Kate; Buckner, Leroy

    Research was conducted to provide a model for selecting equipment for marketing and distributive education programs that is required for the development of the skills or competencies needed to perform in marketing and distribution occupations. A review of the literature identified both competency statements for three program areas--Fashion…

  17. The Prediction of Training Proficiency in Firefighters: A Study of Predictive Validity in Spain

    Directory of Open Access Journals (Sweden)

    Alfredo Berges

    2018-02-01

    The present study provides results on criterion validity in the selection of firefighters in Spain. The predictors were cognitive skills, job knowledge, and physical aptitudes, and the criterion was training proficiency. The process involved 639 candidates, of whom only 44 successfully completed the selection process. Our results support previous evidence showing that general cognitive ability is the best predictor of training proficiency, with an operational validity of .57. With respect to the other predictors, job knowledge presented an operational validity of .55 and the physical tests of .49. In addition, multiple regression analysis showed that cognitive aptitude explains 33% of the variance; when physical aptitudes are included, the explained variance increases to 50%, and when job knowledge is added as well, it increases to 55%. Our study offers recent results on criterion validity for a rarely investigated job, gathered in a country other than the one where prior research had been carried out.
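
    The incremental variance figures (33%, then 50%, then 55%) come from a hierarchical regression, entering predictor blocks in sequence and tracking R-squared at each step; a sketch on invented synthetic data:

```python
import numpy as np

def r_squared(X, y):
    """In-sample R^2 from an ordinary least squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # cognitive, physical, knowledge
y = X @ np.array([0.6, 0.4, 0.3]) + rng.normal(size=200)

r2_cog = r_squared(X[:, :1], y)               # block 1 only
r2_phys = r_squared(X[:, :2], y)              # blocks 1-2
r2_all = r_squared(X, y)                      # blocks 1-3
```

    In-sample R-squared can only grow as blocks are added; the interesting quantity is the size of each increment.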

  18. Genomic selection in maritime pine.

    Science.gov (United States)

    Isik, Fikret; Bartholomé, Jérôme; Farjat, Alfredo; Chancerel, Emilie; Raffin, Annie; Sanchez, Leopoldo; Plomion, Christophe; Bouffier, Laurent

    2016-01-01

    A two-generation maritime pine (Pinus pinaster Ait.) breeding population (n = 661) was genotyped using 2500 SNP markers. The extent of linkage disequilibrium and the utility of genomic selection for growth and stem straightness improvement were investigated. The overall intra-chromosomal linkage disequilibrium was r² = 0.01. Linkage disequilibrium corrected for genomic relationships derived from markers was smaller (rV² = 0.006). Genomic BLUP, Bayesian ridge regression and Bayesian LASSO regression statistical models were used to obtain genomic estimated breeding values. Two validation methods (random sampling of 50% of the population, or of 10% of the progeny generation, as validation sets) were used with 100 replications. The average predictive ability across statistical models and validation methods was about 0.49 for stem sweep, and 0.47 and 0.43 for total height and tree diameter, respectively. The sensitivity analysis suggested that prior densities (variance explained by markers) had little or no discernible effect on posterior means (residual variance) in Bayesian prediction models. Sampling from the progeny generation for model validation increased the predictive ability of markers for tree diameter and stem sweep but not for total height. The results are promising despite low linkage disequilibrium and low marker coverage of the genome (∼1.39 markers/cM). Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
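
    The genomic prediction workflow validated above can be sketched with a ridge-regression analogue of genomic BLUP on simulated SNP data; the sample sizes, shrinkage parameter and simulation settings are arbitrary, not the study's:

```python
import numpy as np

def ridge_gebv(Z_train, y_train, Z_test, lam=1.0):
    """Shrinkage estimates of marker effects, then genomic estimated
    breeding values (GEBVs) for a held-out validation set."""
    p = Z_train.shape[1]
    beta = np.linalg.solve(Z_train.T @ Z_train + lam * np.eye(p),
                           Z_train.T @ y_train)
    return Z_test @ beta

rng = np.random.default_rng(1)
Z = rng.integers(0, 3, size=(300, 100)).astype(float)   # SNP dosages 0/1/2
effects = rng.normal(scale=0.1, size=100)               # additive effects
y = Z @ effects + rng.normal(scale=0.8, size=300)       # phenotype, h2 ~ 0.5

pred = ridge_gebv(Z[:200], y[:200], Z[200:])
ability = np.corrcoef(pred, y[200:])[0, 1]              # predictive ability
```

    Predictive ability is just the correlation between predicted and observed values in the validation set, the same statistic reported in the abstract.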

  19. Terminology for pregnancy loss prior to viability

    DEFF Research Database (Denmark)

    Kolte, A M; Bernardi, L A; Christiansen, O B

    2015-01-01

    Pregnancy loss prior to viability is common and research in the field is extensive. Unfortunately, terminology in the literature is inconsistent. The lack of consensus regarding nomenclature and classification of pregnancy loss prior to viability makes it difficult to compare study results from different centres. In our opinion, terminology and definitions should be based on clinical findings, and when possible, transvaginal ultrasound. With this Early Pregnancy Consensus Statement, it is our goal to provide clear and consistent terminology for pregnancy loss prior to viability.

  20. Androstenedione response to recombinant human FSH is the most valid predictor of the number of selected follicles in polycystic ovarian syndrome: (a case-control study).

    Science.gov (United States)

    Ozyurek, Eser Sefik; Yoldemir, Tevfik; Artar, Gokhan

    2017-05-12

    We aimed to test the hypothesis that the correlation of changes in blood androstenedione (A4) levels to the number of selected follicles during ovulation induction with low-dose recombinant human follicle stimulating hormone (rhFSH) is as strong as the correlation to changes in blood estradiol (E2) levels in polycystic ovary syndrome (PCOS). This was a prospective case-control study conducted from October 2014 to January 2016 on 61 non-PCOS control (Group I) and 46 PCOS (Group II) patients treated with the chronic low-dose step-up protocol with rhFSH. A4, E2 and progesterone blood levels and follicular growth were monitored. Univariate and hierarchical multivariable analyses were performed for age, BMI, HOMA-IR, A4 and E2, with the number of selected follicles as the dependent variable in both groups. ROC analysis was performed to define threshold values for the significant determinants of the number of selected follicles, to predict cycle cancellations due to excessive ovarian response. The control group (Group I) comprised 61 cycles from a group of primary infertile non-PCOS patients, and the study group (Group II) 46 cycles of PCOS patients. The analysis revealed that the strongest independent predictor of the total number of selected follicles in Group I was the E2 (AUC) (B = 0.0006 [0.0003-0.001]; P ovarian response and accurate titration of the rhFSH doses. The study was registered as a prospective case-control study in the ClinicalTrials.gov registry with the identifier NCT02329483.

  1. Selection and Validation of Reference Genes for qRT-PCR Expression Analysis of Candidate Genes Involved in Olfactory Communication in the Butterfly Bicyclus anynana

    OpenAIRE

    Arun, Alok; Baumlé, Véronique; Amelot, Gaël; Nieberding, Caroline M.

    2015-01-01

    Real-time quantitative reverse transcription PCR (qRT-PCR) is a technique widely used to quantify the transcriptional expression level of candidate genes. qRT-PCR requires the selection of one or several suitable reference genes, whose expression profiles remain stable across conditions, to normalize the qRT-PCR expression profiles of candidate genes. Although several butterfly species (Lepidoptera) have become important models in molecular evolutionary ecology, so far no study aimed at ident...

  2. A Simulation of Pell Grant Awards and Costs Using Prior-Prior Year Financial Data

    Science.gov (United States)

    Kelchen, Robert; Jones, Gigi

    2015-01-01

    We examine the likely implications of switching from a prior year (PY) financial aid system, the current practice in which students file the Free Application for Federal Student Aid (FAFSA) using income data from the previous tax year, to prior-prior year (PPY), in which data from two years before enrollment is used. While PPY allows students to…

  3. Prior Authorization of PMDs Demonstration - Status Update

    Data.gov (United States)

    U.S. Department of Health & Human Services — CMS implemented a Prior Authorization process for scooters and power wheelchairs for people with Fee-For-Service Medicare who reside in seven states with high...

  4. Short Report Biochemical derangements prior to emergency ...

    African Journals Online (AJOL)

    Malawi Medical Journal 29 (1): March 2017. Biochemical derangements prior to emergency laparotomy at QECH. Venepuncture was performed preoperatively for urgent cases, defined as those requiring.

  5. Deriving proper uniform priors for regression coefficients, Parts I, II, and III

    NARCIS (Netherlands)

    van Erp, H.R.N.; Linger, R.O.; van Gelder, P.H.A.J.M.

    2017-01-01

    It is a relatively well-known fact that in problems of Bayesian model selection, improper priors should, in general, be avoided. In this paper we will derive and discuss a collection of four proper uniform priors which lie on an ascending scale of informativeness. It will turn out that these

  6. Varying prior information in Bayesian inversion

    International Nuclear Information System (INIS)

    Walker, Matthew; Curtis, Andrew

    2014-01-01

    Bayes' rule is used to combine likelihood and prior probability distributions. The former represents knowledge derived from new data, the latter represents pre-existing knowledge; the Bayesian combination is the so-called posterior distribution, representing the resultant new state of knowledge. While varying the likelihood due to differing data observations is common, there are also situations where the prior distribution must be changed or replaced repeatedly. For example, in mixture density neural network (MDN) inversion, using current methods the neural network employed for inversion needs to be retrained every time prior information changes. We develop a method of prior replacement to vary the prior without re-training the network. Thus the efficiency of MDN inversions can be increased, typically by orders of magnitude when applied to geophysical problems. We demonstrate this for the inversion of seismic attributes in a synthetic subsurface geological reservoir model. We also present results which suggest that prior replacement can be used to control the statistical properties (such as variance) of the final estimate of the posterior in more general (e.g., Monte Carlo based) inverse problem solutions. (paper)
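
    The prior-replacement idea (divide the old prior out of the posterior, multiply the new prior in, and renormalise, leaving the likelihood untouched) can be checked on a grid; the Gaussian shapes below are arbitrary examples, not the seismic application:

```python
import numpy as np

theta = np.linspace(-5.0, 5.0, 1001)                 # parameter grid

def normalise(p):
    """Normalise an unnormalised density on the uniform grid."""
    return p / p.sum()

likelihood = np.exp(-0.5 * (theta - 1.0) ** 2)       # from the observed data
prior_a = np.exp(-0.5 * (theta / 2.0) ** 2)          # original prior
prior_b = np.exp(-0.5 * ((theta - 2.0) / 2.0) ** 2)  # replacement prior

post_a = normalise(likelihood * prior_a)

# Prior replacement: reuse post_a instead of re-evaluating the likelihood.
post_b_replaced = normalise(post_a / prior_a * prior_b)
post_b_direct = normalise(likelihood * prior_b)
```

    The replaced and directly computed posteriors agree, which is why a trained MDN (whose output encodes likelihood times old prior) need not be retrained when the prior changes.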

  7. Prospective regularization design in prior-image-based reconstruction

    International Nuclear Information System (INIS)

    Dang, Hao; Siewerdsen, Jeffrey H; Stayman, J Webster

    2015-01-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in

  8. A comparison of selected MMPI-2 and MMPI-2-RF validity scales in assessing effort on cognitive tests in a military sample.

    Science.gov (United States)

    Jones, Alvin; Ingram, M Victoria

    2011-10-01

    Using a relatively new statistical paradigm, Optimal Data Analysis (ODA; Yarnold & Soltysik, 2005), this research demonstrated that newly developed scales for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and MMPI-2 Restructured Form (MMPI-2-RF) specifically designed to assess over-reporting of cognitive and/or somatic symptoms were more effective than the MMPI-2 F-family of scales in predicting effort status on tests of cognitive functioning in a sample of 288 military members. ODA demonstrated that when all scales were performing at their theoretical maximum possible level of classification accuracy, the Henry Heilbronner Index (HHI), Response Bias Scale (RBS), Fake Bad Scale (FBS), and the Symptom Validity Scale (FBS-r) outperformed the F-family of scales on a variety of ODA indexes of classification accuracy, including an omnibus measure (effect strength total, EST) of the descriptive and prognostic utility of ODA models developed for each scale. Based on the guidelines suggested by Yarnold and Soltysik for evaluating effect strengths of ODA models, the newly developed scales had effect strengths that were moderate (37.66 to 45.68), whereas the F-family scales had effect strengths that ranged from weak to moderate (15.42 to 32.80). In addition, traditional analysis demonstrated that HHI, RBS, FBS, and FBS-r had large effect sizes (0.98 to 1.16), based on Cohen's (1988) suggested categorization of effect size, when comparing mean scores for adequate versus inadequate effort groups, whereas the F-family of scales had small to medium effect sizes (0.25 to 0.76). The MMPI-2-RF Infrequent Somatic Responses Scale (F(S)) tended to perform in a fashion similar to F, the best performing F-family scale.
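
    At its univariate core, ODA searches exhaustively for the cutpoint on a scale that maximizes classification accuracy; a simplified sketch (ODA proper also supports weighted and chance-corrected accuracy, which this omits):

```python
import numpy as np

def best_cutpoint(scores, labels):
    """Return the score threshold maximizing overall accuracy of the
    rule 'predict positive when score >= threshold', by exhaustive
    search over the observed score values."""
    best_c, best_acc = None, -1.0
    for c in np.unique(scores):
        acc = float(np.mean((scores >= c) == labels))
        if acc > best_acc:
            best_c, best_acc = c, acc
    return best_c, best_acc
```

    For scores 1, 2, 3, 10, 11, 12 with the last three labelled positive, the two groups are separated perfectly at a threshold of 10.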

  9. Heuristics as Bayesian inference under extreme priors.

    Science.gov (United States)

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
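
    The continuum described above can be reproduced with penalized regression that shrinks coefficients toward a heuristic's weights: lam = 0 recovers ordinary regression, and a very large lam recovers the heuristic (equal unit weights corresponding to tallying). A sketch with invented data:

```python
import numpy as np

def shrink_toward(X, y, prior_mean, lam):
    """Penalized least squares: minimize ||y - X b||^2 + lam ||b - prior_mean||^2,
    i.e. Bayesian linear regression with prior mean prior_mean and
    prior strength lam."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p),
                           X.T @ y + lam * prior_mean)

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])
tallying = np.ones(2)                              # equal unit weights

b_ols = shrink_toward(X, y, tallying, 0.0)         # ordinary regression
b_heuristic = shrink_toward(X, y, tallying, 1e8)   # effectively tallying
```

    Intermediate values of lam give the intermediate models that the paper finds perform best.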

  10. Prior processes and their applications nonparametric Bayesian estimation

    CERN Document Server

    Phadia, Eswar G

    2016-01-01

    This book presents a systematic and comprehensive treatment of various prior processes that have been developed over the past four decades for dealing with Bayesian approach to solving selected nonparametric inference problems. This revised edition has been substantially expanded to reflect the current interest in this area. After an overview of different prior processes, it examines the now pre-eminent Dirichlet process and its variants including hierarchical processes, then addresses new processes such as dependent Dirichlet, local Dirichlet, time-varying and spatial processes, all of which exploit the countable mixture representation of the Dirichlet process. It subsequently discusses various neutral to right type processes, including gamma and extended gamma, beta and beta-Stacy processes, and then describes the Chinese Restaurant, Indian Buffet and infinite gamma-Poisson processes, which prove to be very useful in areas such as machine learning, information retrieval and featural modeling. Tailfree and P...
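
    The Chinese Restaurant process mentioned above induces the Dirichlet-process partition: customer i joins an occupied table with probability proportional to its occupancy, or opens a new table with probability proportional to the concentration parameter alpha. A small sampler sketch:

```python
import random

def chinese_restaurant(n, alpha, seed=0):
    """Sample table assignments for n customers under CRP(alpha):
    customer i joins table k with probability n_k / (i + alpha), or a
    new table with probability alpha / (i + alpha)."""
    rng = random.Random(seed)
    counts = []                      # occupancy of each table
    assignment = []
    for i in range(n):
        r = rng.uniform(0.0, i + alpha)
        table = len(counts)          # default: open a new table
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r <= acc:
                table = k
                break
        if table == len(counts):
            counts.append(1)
        else:
            counts[table] += 1
        assignment.append(table)
    return assignment, counts
```

    The first customer always opens table 0, and occupancies always sum to n; larger alpha yields more tables on average.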

  11. External Prior Guided Internal Prior Learning for Real-World Noisy Image Denoising

    Science.gov (United States)

    Xu, Jun; Zhang, Lei; Zhang, David

    2018-06-01

    Most existing image denoising methods learn image priors from either external data or the noisy image itself to remove noise. However, priors learned from external data may not be adaptive to the image to be denoised, while priors learned from the given noisy image may not be accurate due to the interference of corrupted noise. Meanwhile, the noise in real-world noisy images is very complex and hard to describe by simple distributions such as the Gaussian, making real noisy image denoising a very challenging problem. We propose to exploit the information in both external data and the given noisy image, and develop an external prior guided internal prior learning method for real noisy image denoising. We first learn external priors from an independent set of clean natural images. With the aid of the learned external priors, we then learn internal priors from the given noisy image to refine the prior model. The external and internal priors are formulated as a set of orthogonal dictionaries to efficiently reconstruct the desired image. Extensive experiments are performed on several real noisy image datasets. The proposed method demonstrates highly competitive denoising performance, outperforming state-of-the-art denoising methods including those designed for real noisy images.
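
    The orthogonal-dictionary formulation makes reconstruction inexpensive: analysis is a transpose multiply and noise suppression reduces to coefficient shrinkage. A toy sketch in which an identity dictionary and a hard threshold stand in for the learned external/internal dictionaries:

```python
import numpy as np

def denoise(patch, D, tau):
    """Project a vectorized patch onto an orthonormal dictionary D
    (columns orthonormal), hard-threshold coefficients smaller than
    tau, and reconstruct."""
    coef = D.T @ patch
    coef[np.abs(coef) < tau] = 0.0
    return D @ coef

D = np.eye(4)                              # stand-in orthonormal dictionary
noisy = np.array([5.0, 0.1, -0.2, 4.0])   # two strong atoms plus noise
clean = denoise(noisy, D, tau=0.5)
```

    Because D is orthonormal, analysis and synthesis are exact inverses, so only the thresholding changes the signal.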

  12. Offending prior to first psychiatric contact

    DEFF Research Database (Denmark)

    Stevens, H; Agerbo, E; Dean, K

    2012-01-01

    There is a well-established association between psychotic disorders and subsequent offending, but the extent to which those who develop psychosis might have a prior history of offending is less clear. Little is known about whether the association between illness and offending exists in non-psychotic disorders. The aim of this study was to determine whether the association between mental disorder and offending is present prior to illness onset in psychotic and non-psychotic disorders.

  13. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
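
    A grid-based sketch of Bayesian estimation under a log-normal prior with Gaussian measurement error (all numbers are illustrative, not the Los Alamos parameters, and the paper's 'alpha' prior is not shown):

```python
import numpy as np

def posterior_mean(measured, sigma_meas, mu_ln, sigma_ln,
                   grid=np.linspace(0.01, 10.0, 2000)):
    """Bayes on a grid: log-normal prior on the true (positive)
    amount, Gaussian measurement likelihood; returns the posterior
    mean of the true amount."""
    prior = np.exp(-0.5 * ((np.log(grid) - mu_ln) / sigma_ln) ** 2) / grid
    like = np.exp(-0.5 * ((measured - grid) / sigma_meas) ** 2)
    post = prior * like
    post /= post.sum()
    return float(np.sum(grid * post))
```

    A measurement of 2 against a prior with median 1 yields a posterior mean between the two, illustrating the shrinkage toward the prior that motivates careful prior choice in dosimetry.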

  14. Selection and validation of a set of reliable reference genes for quantitative RT-PCR studies in the brain of the Cephalopod Mollusc Octopus vulgaris

    Directory of Open Access Journals (Sweden)

    Biffali Elio

    2009-07-01

    Full Text Available Abstract Background Quantitative real-time polymerase chain reaction (RT-qPCR) is valuable for studying the molecular events underlying physiological and behavioral phenomena. Normalization of real-time PCR data is critical for reliable mRNA quantification. Here we identify reference genes to be utilized in RT-qPCR experiments to normalize and monitor the expression of target genes in the brain of the cephalopod mollusc Octopus vulgaris, an invertebrate. Such an approach is novel for this taxon and of advantage in future experiments given the complexity of the behavioral repertoire of this species when compared with its relatively simple neural organization. Results We chose 16S and 18S rRNA, actB, EEF1A, tubA and ubi as candidate reference genes (housekeeping genes, HKG). The expression of 16S and 18S was highly variable and did not meet the requirements of a candidate HKG. The expression of the other genes was almost stable and uniform among samples. We analyzed the expression of the HKGs in two different sets of animals using tissues taken from the central nervous system (brain parts) and mantle (here considered as control tissue) with BestKeeper, geNorm and NormFinder. We found that HKG expression differed considerably with respect to brain area and octopus sample in an HKG-specific manner. However, when the mantle is treated as control tissue and the entire central nervous system is considered, NormFinder revealed tubA and ubi as the most suitable HKG pair. These two genes were utilized to evaluate the relative expression of the genes FoxP, creb, dat and TH in O. vulgaris. Conclusion We analyzed the expression profiles of some genes here identified for O. vulgaris by applying RT-qPCR analysis for the first time in cephalopods. We validated candidate reference genes and found the expression of ubi and tubA to be the most appropriate to evaluate the expression of target genes in the brain of different octopuses. Our results also underline the
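
    The pairwise-ratio idea behind geNorm's stability ranking can be sketched in a few lines. This is a simplified re-implementation of the published M measure on toy data (gene names and values are ours), not the geNorm software itself.

```python
import math
import statistics

def genorm_m(expression):
    """geNorm-style stability measure M for candidate reference genes.

    `expression` maps gene name -> linear-scale expression values, one per
    sample.  For each gene pair, take the log2 ratios across samples; M for
    a gene is the mean standard deviation of its ratios with all other
    genes.  Lower M means more stable expression.
    """
    genes = list(expression)
    m = {}
    for g in genes:
        sds = []
        for h in genes:
            if h == g:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expression[g], expression[h])]
            sds.append(statistics.stdev(ratios))
        m[g] = sum(sds) / len(sds)
    return m
```

    Two genes whose expression rises and falls together (constant ratio) get low M, while an erratically expressed candidate, like the 16S/18S genes above, is ranked unstable.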

  15. A Retrospective Propensity Score-Matched Early Thromboembolic Event Analysis of Prothrombin Complex Concentrate vs Fresh Frozen Plasma for Warfarin Reversal Prior to Emergency Neurosurgical Procedures.

    Science.gov (United States)

    Agarwal, Prateek; Abdullah, Kalil G; Ramayya, Ashwin G; Nayak, Nikhil R; Lucas, Timothy H

    2017-06-29

    Reversal of therapeutic anticoagulation prior to emergency neurosurgical procedures is required in the setting of intracranial hemorrhage. Multifactor prothrombin complex concentrate (PCC) promises rapid efficacy but may increase the probability of thrombotic complications compared to fresh frozen plasma (FFP). To compare the rate of thrombotic complications in patients treated with PCC or FFP to reverse therapeutic anticoagulation prior to emergency neurosurgical procedures in the setting of intracranial hemorrhage at a level I trauma center. Sixty-three consecutive patients on warfarin therapy presenting with intracranial hemorrhage who received anticoagulation reversal prior to emergency neurosurgical procedures were retrospectively identified between 2007 and 2016. They were divided into 2 cohorts based on reversal agent, either PCC (n = 28) or FFP (n = 35). The thrombotic complication rates within 72 h of reversal were compared using the χ2 test. A multivariate propensity score matching analysis was used to limit the threat to internal validity from selection bias arising from differences in demographics, laboratory values, history, and clinical status. Thrombotic complications were uncommon in this neurosurgical population, occurring in 1.59% (1/63) of treated patients. There was no significant difference in the thrombotic complication rate between groups, 3.57% (1/28; PCC group) vs 0% (0/35; FFP group). Propensity score matching analysis validated this finding after controlling for any selection bias. In this limited sample, thrombotic complication rates were similar between use of PCC and FFP for anticoagulation reversal in the management of intracranial hemorrhage prior to emergency neurosurgical procedures. Copyright © 2017 by the Congress of Neurological Surgeons
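
    With counts this small (1/28 vs 0/35), the group comparison is easy to reproduce. The sketch below uses Fisher's exact test rather than the paper's chi-square test, purely to illustrate how such a 2 × 2 comparison is computed; the counts are taken from the abstract.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table
    [[a, b], [c, d]] (events / non-events in each group): sum the
    probabilities of all tables with the same margins that are no more
    likely than the observed one."""
    n = a + b + c + d
    r1 = a + b          # size of group 1
    c1 = a + c          # total number of events
    def p(x):           # hypergeometric probability of x events in group 1
        return comb(c1, x) * comb(n - c1, r1 - x) / comb(n, r1)
    p_obs = p(a)
    lo, hi = max(0, r1 + c1 - n), min(r1, c1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs + 1e-12)

# PCC: 1 thrombotic event in 28 patients; FFP: 0 events in 35 patients
p_value = fisher_exact_2x2(1, 27, 0, 35)
```

    The resulting p-value of about 0.44 is consistent with the paper's conclusion of no significant difference between the reversal agents.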

  16. FACTAR validation

    International Nuclear Information System (INIS)

    Middleton, P.B.; Wadsworth, S.L.; Rock, R.C.; Sills, H.E.; Langman, V.J.

    1995-01-01

    A detailed strategy to validate fuel channel thermal-mechanical behaviour codes for use in current power reactor safety analysis is presented. The strategy is derived from a validation process that has recently been adopted industry wide. The discussion focuses on the validation plan for the code FACTAR, as applied to assessing fuel channel integrity safety concerns during a large-break loss-of-coolant accident (LOCA). (author)

  17. Near surface geotechnical and geophysical data cross validated for site characterization applications. The cases of selected accelerometric stations in Crete island (Greece)

    Science.gov (United States)

    Loupasakis, Constantinos; Tsangaratos, Paraskevas; Rozos, Dimitrios; Rondoyianni, Theodora; Vafidis, Antonis; Steiakakis, Emanouil; Agioutantis, Zacharias; Savvaidis, Alexandros; Soupios, Pantelis; Papadopoulos, Ioannis; Papadopoulos, Nikos; Sarris, Apostolos; Mangriotis, Maria-Dafni; Dikmen, Unal

    2015-04-01

    The near surface ground conditions are highly important for the design of civil constructions. These conditions determine primarily the ability of the foundation formations to bear loads, the stress - strain relations and the corresponding deformations, as well as the soil amplification and corresponding peak ground motion in case of dynamic loading. The static and dynamic geotechnical parameters as well as the ground-type/soil-category can be determined by combining geotechnical and geophysical methods, such as engineering geological surface mapping, geotechnical drilling, in situ and laboratory testing and geophysical investigations. The above mentioned methods were combined for the site characterization in selected sites of the Hellenic Accelerometric Network (HAN) in the area of Crete Island. The combination of the geotechnical and geophysical methods in thirteen (13) sites provided sufficient information about their limitations, setting up the minimum tests requirements in relation to the type of the geological formations. The reduced accuracy of the surface mapping in urban sites, the uncertainties introduced by the geophysical survey in sites with complex geology and the 1-D data provided by the geotechnical drills are some of the causes affecting the right order and the quantity of the necessary investigation methods. Through this study the gradual improvement on the accuracy of the site characterization data in regards to the applied investigation techniques is presented by providing characteristic examples from the total number of thirteen sites. As an example of the gradual improvement of the knowledge about the ground conditions the case of AGN1 strong motion station, located at Agios Nikolaos city (Eastern Crete), is briefly presented. According to the medium scale geological map of IGME the station was supposed to be founded over limestone. 
The detailed geological mapping revealed that a few meters of loose alluvial deposits occupy the area, expected

  18. Selection and validation of potato candidate genes for maturity corrected resistance to Phytophthora infestans based on differential expression combined with SNP association and linkage mapping

    Directory of Open Access Journals (Sweden)

    Meki Shehabu Muktar

    2015-09-01

    Full Text Available Late blight of potato (Solanum tuberosum L.), caused by the oomycete Phytophthora infestans (Mont.) de Bary, is one of the most important bottlenecks of potato production worldwide. Cultivars with high levels of durable, race-unspecific, quantitative resistance are part of a solution to this problem. However, breeding for quantitative resistance is hampered by the correlation between resistance and late plant maturity, which is an undesirable agricultural attribute. The objectives of our research are (i) the identification of genes that condition quantitative resistance to P. infestans not compromised by late plant maturity and (ii) the discovery of diagnostic single nucleotide polymorphism (SNP) markers to be used as molecular tools to increase efficiency and precision of resistance breeding. Twenty-two novel candidate genes were selected based on comparative transcript profiling by SuperSAGE (serial analysis of gene expression) in groups of plants with contrasting levels of maturity-corrected resistance (MCR). Reproducibility of differential expression was tested by quantitative real-time PCR and allele-specific pyrosequencing in four new sets of genotype pools with contrasting late blight resistance levels, at three infection time points and in three independent infection experiments. Reproducibility of expression patterns ranged from 28% to 97%. Association mapping in a panel of 184 tetraploid cultivars identified SNPs in five candidate genes that were associated with MCR. These SNPs can be used in marker-assisted resistance breeding. Linkage mapping in two half-sib families (n = 111) identified SNPs in three candidate genes that were linked with MCR. The differentially expressed genes that showed association and/or linkage with MCR putatively function in phytosterol synthesis, fatty acid synthesis, asparagine synthesis, chlorophyll synthesis, cell wall modification and in the response to pathogen elicitors.

  19. Verification and Validation of TMAP7

    Energy Technology Data Exchange (ETDEWEB)

    James Ambrosek

    2008-12-01

    The Tritium Migration Analysis Program, Version 7 (TMAP7) code is an update of TMAP4, an earlier version that was verified and validated in support of the International Thermonuclear Experimental Reactor (ITER) program and of the intermediate version TMAP2000. It has undergone several revisions. The current one includes radioactive decay, multiple trap capability, more realistic treatment of heteronuclear molecular formation at surfaces, processes that involve surface-only species, and a number of other improvements. Prior to code utilization, it needed to be verified and validated to ensure that the code is performing as it was intended and that its predictions are consistent with physical reality. To that end, the demonstration and comparison problems cited here show that the code results agree with analytical solutions for select problems where analytical solutions are straightforward or with results from other verified and validated codes, and that actual experimental results can be accurately replicated using reasonable models with this code. These results and their documentation in this report are necessary steps in the qualification of TMAP7 for its intended service.

  20. Recognition of Prior Learning: The Participants' Perspective

    Science.gov (United States)

    Miguel, Marta C.; Ornelas, José H.; Maroco, João P.

    2016-01-01

    The current narrative on lifelong learning goes beyond formal education and training, including learning at work, in the family and in the community. Recognition of prior learning is a process of evaluation of those skills and knowledge acquired through life experience, allowing them to be formally recognized by the qualification systems. It is a…

  1. PET reconstruction via nonlocal means induced prior.

    Science.gov (United States)

    Hou, Qingfeng; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Ma, Jianhua

    2015-01-01

    The traditional Bayesian priors for maximum a posteriori (MAP) reconstruction methods usually incorporate local neighborhood interactions that penalize large deviations in parameter estimates for adjacent pixels; therefore, only local pixel differences are utilized. This limits their ability to penalize image roughness. To achieve high-quality PET image reconstruction, this study investigates a MAP reconstruction strategy that incorporates a nonlocal means induced (NLMi) prior (NLMi-MAP), which exploits global similarity information in the image. The present NLMi prior approximates the derivative of the Gibbs energy function by an NLM filtering process. Specifically, the NLMi prior is obtained by subtracting the current image estimate from its NLM-filtered version and feeding the residual error back to the reconstruction filter to yield the new image estimate. We tested the present NLMi-MAP method with simulated and real PET datasets. Comparison studies with conventional filtered backprojection (FBP) and several iterative reconstruction methods clearly demonstrate that the present NLMi-MAP method performs better at lowering noise and preserving image edges, and achieves a higher signal-to-noise ratio (SNR). Extensive experimental results show that the NLMi-MAP method outperforms the existing methods in terms of cross profile, noise reduction, SNR, root mean square error (RMSE) and correlation coefficient (CORR).
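
    The feedback step described above (subtract the NLM-filtered estimate from the current estimate and feed the residual back) can be illustrated on a 1-D signal. The filter parameters, step size and toy signal are illustrative, not taken from the paper.

```python
import math

def nlm_filter(x, patch=1, search=5, h=0.5):
    """Plain nonlocal-means filter for a 1-D signal: each sample becomes a
    weighted average of samples whose surrounding patches look similar."""
    n = len(x)
    out = []
    for i in range(n):
        wsum = vsum = 0.0
        for j in range(max(0, i - search), min(n, i + search + 1)):
            # squared distance between the patches centred at i and j
            d = sum((x[min(max(i + k, 0), n - 1)] - x[min(max(j + k, 0), n - 1)]) ** 2
                    for k in range(-patch, patch + 1))
            w = math.exp(-d / (h * h))
            wsum += w
            vsum += w * x[j]
        out.append(vsum / wsum)
    return out

def nlmi_prior_step(x, beta=0.5):
    """One NLM-induced-prior update: feed the residual between the current
    estimate and its NLM-filtered version back into the estimate."""
    filtered = nlm_filter(x)
    return [xi - beta * (xi - fi) for xi, fi in zip(x, filtered)]
```

    On a noisy step signal, the update damps fluctuations within flat regions while the patch-based weights keep the edge from being smoothed away.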

  2. Prior learning assessment and quality assurance practice ...

    African Journals Online (AJOL)

    The use of RPL (Recognition of Prior Learning) in higher education to assess RPL candidates for admission into programmes of study met with a lot of criticism from faculty academics. Lecturers viewed the possibility of admitting large numbers of under-qualified adult learners, as a threat to the institution's reputation, or an ...

  3. Action priors for learning domain invariances

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2015-04-01

    Full Text Available behavioural invariances in the domain, by identifying actions to be prioritised in local contexts, invariant to task details. This information has the effect of greatly increasing the speed of solving new problems. We formalise this notion as action priors...

  4. Selection and validation of reference genes for qRT-PCR expression analysis of candidate genes involved in olfactory communication in the butterfly Bicyclus anynana.

    Directory of Open Access Journals (Sweden)

    Alok Arun

    Full Text Available Real-time quantitative reverse transcription PCR (qRT-PCR) is a technique widely used to quantify the transcriptional expression level of candidate genes. qRT-PCR requires the selection of one or several suitable reference genes, whose expression profiles remain stable across conditions, to normalize the qRT-PCR expression profiles of candidate genes. Although several butterfly species (Lepidoptera) have become important models in molecular evolutionary ecology, so far no study aimed at identifying reference genes for accurate data normalization for any butterfly is available. The African bush brown butterfly Bicyclus anynana has drawn considerable attention owing to its suitability as a model for evolutionary ecology, and we here provide a first extensive study to identify suitable reference genes in this species. We monitored the expression profile of twelve reference genes: eEF-1α, FK506, UBQL40, RpS8, RpS18, HSP, GAPDH, VATPase, ACT3, TBP, eIF2 and G6PD. We tested the stability of their expression profiles in three different tissues (wings, brains, antennae), two developmental stages (pupal and adult) and two sexes (male and female), all of which were subjected to two food treatments (food stress and control feeding ad libitum). The expression stability and ranking of the twelve reference genes were assessed using two algorithm-based methods, NormFinder and geNorm. Both methods identified RpS8 as the most suitable reference gene for expression data normalization. We also showed that the use of two reference genes is sufficient to effectively normalize the qRT-PCR data under the varying tissues and experimental conditions that we used in B. anynana. Finally, we tested the effect of choosing reference genes with different stability on the normalization of the transcript abundance of a candidate gene involved in olfactory communication in B. anynana, the Fatty Acyl Reductase 2, and we confirmed that using an unstable reference gene can drastically alter the

  5. Selection and validation of reference genes for qRT-PCR expression analysis of candidate genes involved in olfactory communication in the butterfly Bicyclus anynana.

    Science.gov (United States)

    Arun, Alok; Baumlé, Véronique; Amelot, Gaël; Nieberding, Caroline M

    2015-01-01

    Real-time quantitative reverse transcription PCR (qRT-PCR) is a technique widely used to quantify the transcriptional expression level of candidate genes. qRT-PCR requires the selection of one or several suitable reference genes, whose expression profiles remain stable across conditions, to normalize the qRT-PCR expression profiles of candidate genes. Although several butterfly species (Lepidoptera) have become important models in molecular evolutionary ecology, so far no study aimed at identifying reference genes for accurate data normalization for any butterfly is available. The African bush brown butterfly Bicyclus anynana has drawn considerable attention owing to its suitability as a model for evolutionary ecology, and we here provide a first extensive study to identify suitable reference genes in this species. We monitored the expression profile of twelve reference genes: eEF-1α, FK506, UBQL40, RpS8, RpS18, HSP, GAPDH, VATPase, ACT3, TBP, eIF2 and G6PD. We tested the stability of their expression profiles in three different tissues (wings, brains, antennae), two developmental stages (pupal and adult) and two sexes (male and female), all of which were subjected to two food treatments (food stress and control feeding ad libitum). The expression stability and ranking of the twelve reference genes were assessed using two algorithm-based methods, NormFinder and geNorm. Both methods identified RpS8 as the most suitable reference gene for expression data normalization. We also showed that the use of two reference genes is sufficient to effectively normalize the qRT-PCR data under the varying tissues and experimental conditions that we used in B. anynana. Finally, we tested the effect of choosing reference genes with different stability on the normalization of the transcript abundance of a candidate gene involved in olfactory communication in B. anynana, the Fatty Acyl Reductase 2, and we confirmed that using an unstable reference gene can drastically alter the expression

  6. Automation of cellular therapy product manufacturing: results of a split validation comparing CD34 selection of peripheral blood stem cell apheresis product with a semi-manual vs. an automatic procedure.

    Science.gov (United States)

    Hümmer, Christiane; Poppe, Carolin; Bunos, Milica; Stock, Belinda; Wingenfeld, Eva; Huppert, Volker; Stuth, Juliane; Reck, Kristina; Essl, Mike; Seifried, Erhard; Bonig, Halvard

    2016-03-16

    Automation of cell therapy manufacturing promises higher productivity of cell factories, more economical use of highly trained (and costly) manufacturing staff, facilitation of processes requiring manufacturing steps at inconvenient hours, improved consistency of processing steps and other benefits. One of the most broadly disseminated engineered cell therapy products is immunomagnetically selected CD34+ hematopoietic "stem" cells (HSCs). As the GMP-compliant clinical automated device CliniMACS Prodigy is being programmed to perform ever more complex sequential manufacturing steps, we developed a CD34+ selection module for comparison with the standard semi-automatic CD34 "normal scale" selection process on CliniMACS Plus, applicable for 600 × 10(6) target cells out of 60 × 10(9) total cells. Three split-validation processing runs with healthy-donor G-CSF-mobilized apheresis products were performed; feasibility, time consumption and product quality were assessed. All processes proceeded uneventfully. Prodigy runs took about 1 h longer than CliniMACS Plus runs, albeit with markedly less hands-on operator time, making them also suitable for less experienced operators. Recovery of target cells was the same for both technologies. Although impurities, specifically T- and B-cells, were 5 ± 1.6-fold and 4 ± 0.4-fold higher in the Prodigy products (p = ns and p = 0.013 for T- and B-cell depletion, respectively), the T cell content per kg of a virtual recipient receiving 4 × 10(6) CD34+ cells/kg was below 10 × 10(3)/kg even in the worst Prodigy product and thus more than fivefold below the specification of CD34+ selected mismatched-donor stem cell products. The products' theoretical clinical usability is thus confirmed. This split-validation exercise of a relatively short and simple process exemplifies the potential of automatic cell manufacturing. Automation will further gain in attractiveness when applied to more complex processes, requiring frequent interventions or handling at

  7. Statistical Analysis and validation

    NARCIS (Netherlands)

    Hoefsloot, H.C.J.; Horvatovich, P.; Bischoff, R.

    2013-01-01

    In this chapter, guidelines are given for the selection of a few biomarker candidates from a large number of compounds with a relatively low number of samples. The main concepts concerning the statistical validation of the search for biomarkers are discussed. These complicated methods and concepts are

  8. Preoperative physical examination and imaging of femoroacetabular impingement prior to hip arthroscopy-a systematic review.

    Science.gov (United States)

    Haldane, Chloe E; Ekhtiari, Seper; de Sa, Darren; Simunovic, Nicole; Ayeni, Olufemi R

    2017-08-01

    The purpose of this systematic review is to report current preoperative assessment for femoroacetabular impingement (FAI) including physical examination and imaging modalities prior to hip arthroscopy, and report current imaging measures used in the diagnosis of FAI. The electronic databases MEDLINE, EMBASE and PubMed were searched and screened in duplicate for relevant studies. Data regarding patient demographics, non-operative treatment, preoperative assessment including physical examination and imaging prior to hip arthroscopy were abstracted. Study quality was assessed in duplicate using the Methodological Index for Non-Randomized Studies criteria. Sixty-eight studies of fair quality evidence that involved a total of 5125 patients (5400 hips) were included. In total, 56% of all patients were male and mean age was 36 years (SD ± 10.0). Within physical examination, FADIR impingement testing was reported in 57% of patients. All included studies reported plain radiographic imaging as a component of preoperative assessment with anterior-posterior pelvis view being the most commonly reported view, followed by the cross-table lateral and Dunn views. Magnetic resonance imaging was obtained for 52% of included patients and computed tomography for 26% of patients. The most commonly reported measure within imaging for the diagnosis of cam type impingement was alpha angle (66%), whereas for pincer type impingement, the cross-over sign (48%) was most reported. Preoperative assessment is underreported in the FAI literature. Improved reporting is warranted to develop a more consistent and validated diagnostic algorithm for FAI to enhance patient selection. Level of evidence : Level IV, Systematic Review of Level I-IV Studies.

  9. Preoperative physical examination and imaging of femoroacetabular impingement prior to hip arthroscopy—a systematic review

    Science.gov (United States)

    Haldane, Chloe E.; Ekhtiari, Seper; de SA, Darren; Simunovic, Nicole

    2017-01-01

    Abstract The purpose of this systematic review is to report current preoperative assessment for femoroacetabular impingement (FAI) including physical examination and imaging modalities prior to hip arthroscopy, and report current imaging measures used in the diagnosis of FAI. The electronic databases MEDLINE, EMBASE and PubMed were searched and screened in duplicate for relevant studies. Data regarding patient demographics, non-operative treatment, preoperative assessment including physical examination and imaging prior to hip arthroscopy were abstracted. Study quality was assessed in duplicate using the Methodological Index for Non-Randomized Studies criteria. Sixty-eight studies of fair quality evidence that involved a total of 5125 patients (5400 hips) were included. In total, 56% of all patients were male and mean age was 36 years (SD ± 10.0). Within physical examination, FADIR impingement testing was reported in 57% of patients. All included studies reported plain radiographic imaging as a component of preoperative assessment with anterior–posterior pelvis view being the most commonly reported view, followed by the cross-table lateral and Dunn views. Magnetic resonance imaging was obtained for 52% of included patients and computed tomography for 26% of patients. The most commonly reported measure within imaging for the diagnosis of cam type impingement was alpha angle (66%), whereas for pincer type impingement, the cross-over sign (48%) was most reported. Preoperative assessment is underreported in the FAI literature. Improved reporting is warranted to develop a more consistent and validated diagnostic algorithm for FAI to enhance patient selection. Level of evidence: Level IV, Systematic Review of Level I–IV Studies. PMID:28948032

  10. Bayesian Model Comparison With the g-Prior

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Cemgil, Ali Taylan

    2014-01-01

    ’s asymptotic MAP rule was an improvement, and in this paper we extend the work by Djuric in several ways. Specifically, we consider the elicitation of proper prior distributions, treat the case of real- and complex-valued data simultaneously in a Bayesian framework similar to that considered by Djuric......, and develop new model selection rules for a regression model containing both linear and non-linear parameters. Moreover, we use this framework to give a new interpretation of the popular information criteria and relate their performance to the signal-to-noise ratio of the data. By use of simulations, we also...... demonstrate that our proposed model comparison and selection rules outperform the traditional information criteria both in terms of detecting the true model and in terms of predicting unobserved data. The simulation code is available online....

  11. Maximum entropy reconstruction of spin densities involving non uniform prior

    International Nuclear Information System (INIS)

    Schweizer, J.; Ressouche, E.; Papoular, R.J.; Zheludev, A.I.

    1997-01-01

    Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one that has the highest prior (intrinsic) probability. Considering that all the points of the map are equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. This method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data as well as for distributions of spin density. The density maps obtained by this method, as compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most cases, before the measurements are performed, some knowledge exists about the distribution which is investigated. It can range from simple information on the type of scattering electrons to an elaborate theoretical model. In these cases, the uniform prior, which considers all the different pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the Maximum Entropy formalism through a model m(r), via a new definition for the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for ρ(r) = m(r). Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing
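
    The non-uniform-prior entropy referred to in the abstract (Skilling's form) can be written explicitly; here ρ is the reconstructed density and m the model:

```latex
S[\rho; m] = \int \left[ \rho(\mathbf{r}) - m(\mathbf{r})
    - \rho(\mathbf{r}) \ln \frac{\rho(\mathbf{r})}{m(\mathbf{r})} \right] \mathrm{d}^3 r
```

    S is non-positive and vanishes only for ρ = m, so in the absence of data the MaxEnt map reproduces the model, and any departure from it must be paid for by the data.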

  12. Evolution of Industry Knowledge in the Public Domain: Prior Art Searching for Software Patents

    Directory of Open Access Journals (Sweden)

    Jinseok Park

    2005-03-01

    Full Text Available Searching prior art is a key part of the patent application and examination processes. A comprehensive prior art search gives the inventor ideas as to how he can improve or circumvent existing technology by providing up to date knowledge on the state of the art. It also enables the patent applicant to minimise the likelihood of an objection from the patent office. This article explores the characteristics of prior art associated with software patents, dealing with difficulties in searching prior art due to the lack of resources, and considers public contribution to the formation of prior art databases. It addresses the evolution of electronic prior art in line with technological development, and discusses laws and practices in the EPO, USPTO, and the JPO in relation to the validity of prior art resources on the Internet. This article also investigates the main features of searching sources and tools in the three patent offices as well as non-patent literature databases. Based on the analysis of various searching databases, it provides some strategies of efficient prior art searching that should be considered for software-related inventions.

  13. Random template placement and prior information

    International Nuclear Information System (INIS)

    Roever, Christian

    2010-01-01

    In signal detection problems, one is usually faced with the task of searching a parameter space for peaks in the likelihood function which indicate the presence of a signal. Random searches have proven to be very efficient as well as easy to implement, compared e.g. to searches along regular grids in parameter space. Knowledge of the parameterised shape of the signal searched for adds structure to the parameter space, i.e., there are usually regions that need to be densely searched while in other regions a coarser search is sufficient. On the other hand, prior information identifies the regions in which a search will actually be promising or may likely be in vain. Defining specific figures of merit allows one to combine both template metric and prior distribution and devise optimal sampling schemes over the parameter space. We show an example related to the gravitational wave signal from a binary inspiral event. Here the template metric and prior information are particularly contradictory, since signals from low-mass systems tolerate the least mismatch in parameter space while high-mass systems are far more likely, as they imply a greater signal-to-noise ratio (SNR) and hence are detectable to greater distances. The derived sampling strategy is implemented in a Markov chain Monte Carlo (MCMC) algorithm, where it improves convergence.
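
    The idea of combining the template metric and the prior into a single sampling density can be sketched with a toy 1-D example; the particular densities and the inverse-CDF sampler below are illustrative stand-ins for the paper's figures of merit, not its actual construction.

```python
import bisect
import random

def sample_points(density, grid, n, seed=0):
    """Draw n template locations from an arbitrary 1-D sampling density
    tabulated on a grid, using the inverse-CDF method."""
    rng = random.Random(seed)
    cdf, total = [], 0.0
    for d in density:
        total += d
        cdf.append(total)
    cdf = [c / total for c in cdf]  # normalized cumulative distribution
    return [grid[bisect.bisect_left(cdf, rng.random())] for _ in range(n)]

# Toy trade-off: the metric wants dense coverage at low "mass", while the
# prior (SNR-weighted detectability) favours high mass.
grid = [0.1 * (i + 1) for i in range(100)]        # masses 0.1 .. 10.0
metric_density = [1.0 / m for m in grid]          # finer spacing needed at low mass
prior_weight = [m ** 2 for m in grid]             # high-mass signals more detectable
combined = [a * b for a, b in zip(metric_density, prior_weight)]
samples = sample_points(combined, grid, 1000)
```

    With these choices the combined density grows linearly with mass, so the sampler places most templates at the high-mass end while still covering the low-mass region.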

  14. Prior knowledge of category size impacts visual search.

    Science.gov (United States)

    Wu, Rachel; McGee, Brianna; Echiverri, Chelsea; Zinszer, Benjamin D

    2018-03-30

    Prior research has shown that category search can be similar to one-item search (as measured by the N2pc ERP marker of attentional selection) for highly familiar, smaller categories (e.g., letters and numbers) because the finite set of items in a category can be grouped into one unit to guide search. Other studies have shown that larger, more broadly defined categories (e.g., healthy food) also can elicit N2pc components during category search, but the amplitude of these components is typically attenuated. Two experiments investigated whether the perceived size of a familiar category impacts category and exemplar search. We presented participants with 16 familiar company logos: 8 from a smaller category (social media companies) and 8 from a larger category (entertainment/recreation manufacturing companies). The ERP results from Experiment 1 revealed that, in a two-item search array, search was more efficient for the smaller category of logos compared to the larger category. In a four-item search array (Experiment 2), where two of the four items were placeholders, search was largely similar between the category types, but there was more attentional capture by nontarget members from the same category as the target for smaller rather than larger categories. These results support a growing literature on how prior knowledge of categories affects attentional selection and capture during visual search. We discuss the implications of these findings in relation to assessing cognitive abilities across the lifespan, given that prior knowledge typically increases with age. © 2018 Society for Psychophysiological Research.

  15. Emergency Medicine Myths: Computed Tomography of the Head Prior to Lumbar Puncture in Adults with Suspected Bacterial Meningitis - Due Diligence or Antiquated Practice?

    Science.gov (United States)

    April, Michael D; Long, Brit; Koyfman, Alex

    2017-09-01

    Various sources purport an association between lumbar puncture and brainstem herniation in patients with intracranial mass effect lesions. Several organizations and texts recommend head computed tomography (CT) prior to lumbar puncture in selected patients. To review the evidence regarding the utility of obtaining head CT prior to lumbar puncture in adults with suspected bacterial meningitis. Observational studies report a risk of post-lumbar puncture brainstem herniation in the presence of intracranial mass effect (1.5%) that is significantly lower than that reported among all patients with bacterial meningitis (up to 13.3%). It is unclear from existing literature whether identifying patients with intracranial mass effect decreases herniation risk. Up to 80% of patients with bacterial meningitis experiencing herniation have no CT abnormalities, and approximately half of patients with intracranial mass effect not undergoing lumbar puncture herniate. Decision rules to selectively perform CT on only those individuals most likely to have intracranial mass effect lesions have not undergone validation. Despite recommendations for immediate antimicrobial therapy prior to imaging, data indicate an association between pre-lumbar puncture CT and antibiotic delays. Recent data demonstrate shortened door-to-antibiotic times and lower mortality from bacterial meningitis after implementation of new national guidelines, which restricted generally accepted CT indications by removing impaired mental status as imaging criterion. Data supporting routine head CT prior to lumbar puncture are limited. Physicians should consider selective CT for those patients at risk for intracranial mass effect lesions based on decision rules or clinical gestalt. Patients undergoing head CT must receive immediate antibiotic therapy. Published by Elsevier Inc.

  16. Shape prior modeling using sparse representation and online dictionary learning.

    Science.gov (United States)

    Zhang, Shaoting; Zhan, Yiqiang; Zhou, Yan; Uzunbas, Mustafa; Metaxas, Dimitris N

    2012-01-01

    The recently proposed sparse shape composition (SSC) opens a new avenue for shape prior modeling. Instead of assuming any parametric model of shape statistics, SSC incorporates shape priors on the fly by approximating a shape instance (usually derived from appearance cues) with a sparse combination of shapes in a training repository. Theoretically, one can increase the modeling capability of SSC by including as many training shapes as possible in the repository. However, this strategy confronts two limitations in practice. First, since SSC involves an iterative sparse optimization at run-time, the more shape instances the repository contains, the lower the run-time efficiency of SSC. Therefore, a compact and informative shape dictionary is preferable to a large shape repository. Second, in medical imaging applications, training shapes seldom come in one batch. It is very time-consuming, and sometimes infeasible, to reconstruct the shape dictionary every time new training shapes appear. In this paper, we propose an online learning method to address these two limitations. Our method starts by constructing an initial shape dictionary using the K-SVD algorithm. When new training shapes arrive, instead of reconstructing the dictionary from the ground up, we update the existing one using a block-coordinate descent approach. Using the dynamically updated dictionary, sparse shape composition can be gracefully scaled up to model shape priors from a large number of training shapes without sacrificing run-time efficiency. Our method is validated on lung localization in X-ray and cardiac segmentation in MRI time series. Compared to the original SSC, it shows comparable performance while being significantly more efficient.
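
    The core idea of approximating a shape instance by a sparse combination of training shapes can be sketched with plain matching pursuit, used here as a simpler stand-in for the iterative sparse optimization and K-SVD/block-coordinate machinery described above. The 4-D "shapes" are hypothetical.

```python
# Toy dictionary: each row is one training shape, flattened to a 4-D vector.
D = [
    [1.0, 0.0, 0.0, 1.0],   # training shape 1
    [0.0, 1.0, 1.0, 0.0],   # training shape 2
    [1.0, 1.0, 0.0, 0.0],   # training shape 3
]
target = [2.0, 0.0, 0.0, 2.0]  # input shape instance (hypothetical)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matching_pursuit(D, y, n_iter=3):
    """Greedily build a sparse combination of dictionary shapes approximating y."""
    coeffs = [0.0] * len(D)
    residual = list(y)
    for _ in range(n_iter):
        # pick the dictionary shape most correlated with the current residual
        scores = [dot(d, residual) / dot(d, d) for d in D]
        k = max(range(len(D)), key=lambda i: abs(scores[i]))
        coeffs[k] += scores[k]
        residual = [r - scores[k] * d for r, d in zip(residual, D[k])]
    return coeffs, residual

coeffs, residual = matching_pursuit(D, target)
# target is exactly 2x shape 1, so the sparse code is [2, 0, 0] with zero residual
```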

  17. Racial/Ethnic Differences in Dietary Intake among WIC Families Prior to Food Package Revisions

    Science.gov (United States)

    Kong, Angela; Odoms-Young, Angela M.; Schiffer, Linda A.; Berbaum, Michael L.; Porter, Summer J.; Blumstein, Lara; Fitzgibbon, Marian L.

    2013-01-01

    Objective: To compare the diets of African American and Hispanic families in the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) prior to the 2009 food package revisions. Methods: Mother-child dyads were recruited from 12 WIC sites in Chicago, IL. Individuals with 1 valid 24-hour recall were included in the analyses…

  18. Validation of multisource electronic health record data: an application to blood transfusion data.

    Science.gov (United States)

    Hoeven, Loan R van; Bruijne, Martine C de; Kemper, Peter F; Koopman, Maria M W; Rondeel, Jan M M; Leyte, Anja; Koffijberg, Hendrik; Janssen, Mart P; Roes, Kit C B

    2017-07-14

    Although data from electronic health records (EHR) are often used for research purposes, systematic validation of these data prior to their use is not standard practice. Existing validation frameworks discuss validity concepts without translating them into practical implementation steps or addressing the potential influence of linking multiple sources. We therefore developed a practical approach for validating routinely collected data from multiple sources and applied it to a blood transfusion data warehouse to evaluate its usability in practice. The approach consists of identifying existing validation frameworks for EHR data or linked data, selecting validity concepts from these frameworks and establishing quantifiable validity outcomes for each concept. The approach distinguishes external validation concepts (e.g. concordance with external reports, previous literature and expert feedback) from internal consistency concepts, which use expected associations within the dataset itself (e.g. completeness, uniformity and plausibility). In an example case, the selected concepts were applied to a transfusion dataset and specified in more detail. Application of the approach to the transfusion dataset resulted in a structured overview of data validity aspects. This allowed these aspects to be improved through further processing of the data and, in some cases, adjustment of the data extraction. For example, the proportion of transfused products that could not be linked to the corresponding issued products was initially 2.2% but could be reduced to 0.17% by adjusting the data extraction criteria. This stepwise approach for validating linked multisource data provides a basis for evaluating data quality and enhancing interpretation. If the process of data validation is adopted more broadly, it will contribute to increased transparency and greater reliability of research based on routinely collected electronic health records.
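
    Quantifiable internal-consistency outcomes of the kind described above (completeness of a field, proportion of records linkable across sources) can be sketched as follows. The field names and records are hypothetical, not the warehouse schema used in the study.

```python
# Hypothetical transfusion records after linking two sources.
records = [
    {"product_id": "P1", "issued_id": "I1", "blood_group": "A"},
    {"product_id": "P2", "issued_id": None, "blood_group": "O"},   # unlinked
    {"product_id": "P3", "issued_id": "I3", "blood_group": None},  # missing field
    {"product_id": "P4", "issued_id": "I4", "blood_group": "B"},
]

def completeness(records, field):
    """Proportion of records with a non-missing value for `field`."""
    return sum(r[field] is not None for r in records) / len(records)

def linkage_rate(records):
    """Proportion of transfused products linkable to an issued product."""
    return sum(r["issued_id"] is not None for r in records) / len(records)

print(completeness(records, "blood_group"))  # 0.75
print(linkage_rate(records))                 # 0.75
```

    Tracking such proportions before and after adjusting the extraction criteria is exactly how the 2.2% → 0.17% improvement in the study would be measured.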

  19. Use of prior mammograms in the transition to digital mammography: A performance and cost analysis

    International Nuclear Information System (INIS)

    Taylor-Phillips, S.; Wallis, M.G.; Duncan, A.; Gale, A.G.

    2012-01-01

    Breast screening in Europe is gradually changing from film to digital imaging and reporting of cases. In the transition period, prior mammograms (from the preceding screening round) are films, potentially causing difficulties in comparison with current digital mammograms. To examine this, breast screening performance was measured at a digital mammography workstation with prior mammograms displayed in different formats, and the associated costs were calculated. 160 selected difficult cases (41% malignant) were read by eight UK qualified mammography readers in three conditions: with film prior mammograms; with digitised prior mammograms; or without prior mammograms. Lesion location and probability of malignancy were recorded, alongside a decision on whether to recall each case for further tests. JAFROC analysis showed a difference between conditions (p = .006); performance with prior mammograms in either film or digitised format was superior to that without prior mammograms (p < .05). There was no difference in performance when the prior mammograms were presented in film or digitised form. The number of benign or normal cases recalled was 26% higher without prior mammograms than with digitised or film prior mammograms (p < .05). This would correspond to an increase in recall rate at the study hospital from 4.3% to 5.5% with no associated increase in cancer detection rate. The cost of this increase was estimated to be £11,581 (€13,666) per 10,000 women screened, which is higher than the cost of digitised (£11,114/€13,115) or film display (£6451/€7612) of the prior mammograms. It is recommended that, where available, prior mammograms are used in the transition to digital breast screening.

  20. Leveraging Prior Calculus Study with Embedded Review

    Science.gov (United States)

    Nikolov, Margaret C.; Withers, Wm. Douglas

    2016-01-01

    We propose a new course structure to address the needs of college students with previous calculus study but no course validations as an alternative to repeating the first year of calculus. Students are introduced directly to topics from Calculus III unpreceded by a formal review of topics from Calculus I or II, but with additional syllabus time…

  1. Automated segmentation of dental CBCT image with prior-guided sequential random forests

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gao, Yaozong; Shi, Feng; Li, Gang [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599-7513 (United States); Chen, Ken-Chung; Tang, Zhen [Surgical Planning Laboratory, Department of Oral and Maxillofacial Surgery, Houston Methodist Research Institute, Houston, Texas 77030 (United States); Xia, James J., E-mail: dgshen@med.unc.edu, E-mail: JXia@HoustonMethodist.org [Surgical Planning Laboratory, Department of Oral and Maxillofacial Surgery, Houston Methodist Research Institute, Houston, Texas 77030 (United States); Department of Surgery (Oral and Maxillofacial Surgery), Weill Medical College, Cornell University, New York, New York 10065 (United States); Department of Oral and Craniomaxillofacial Surgery, Shanghai Jiao Tong University School of Medicine, Shanghai Ninth People’s Hospital, Shanghai 200011 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu, E-mail: JXia@HoustonMethodist.org [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599-7513 and Department of Brain and Cognitive Engineering, Korea University, Seoul 02841 (Korea, Republic of)

    2016-01-15

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step in generating 3D models for the diagnosis and treatment planning of patients with CMF deformities. However, due to image artifacts caused by beam hardening, imaging noise, inhomogeneity, truncation, and maximal intercuspation, it is difficult to segment the CBCT images. Methods: In this paper, the authors present a new automatic segmentation method to address these problems. Specifically, the authors first employ a majority voting method to estimate the initial segmentation probability maps of both the mandible and maxilla based on multiple aligned expert-segmented CBCT images. These probability maps provide an important prior guidance for CBCT segmentation. The authors then extract both appearance features from the CBCTs and context features from the initial probability maps to train the first layer of the random forest classifier, which can select discriminative features for segmentation. Based on the first layer of the trained classifier, the probability maps are updated and then employed to further train the next layer of the random forest classifier. By iteratively training the subsequent random forest classifiers using both the original CBCT features and the updated segmentation probability maps, a sequence of classifiers can be derived for accurate segmentation of CBCT images. Results: Segmentation results on CBCTs of 30 subjects were both quantitatively and qualitatively validated against manually labeled ground truth. The average Dice ratios for the mandible and maxilla obtained by the authors’ method were 0.94 and 0.91, respectively, which are significantly better than those of the state-of-the-art method based on sparse representation (p-value < 0.001). Conclusions: The authors have developed and validated a novel fully automated method
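
    The first step described above, estimating a prior probability map by majority voting over aligned expert segmentations, can be sketched on a toy 1-D "image"; the expert masks below are hypothetical.

```python
# Three aligned expert segmentations of the same (toy, 1-D) image:
# 1 = foreground (e.g. mandible) voxel, 0 = background.
expert_masks = [
    [0, 1, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [1, 1, 1, 1, 0],
]

def voting_probability_map(masks):
    """Per-voxel prior probability = fraction of experts labelling it foreground."""
    n = len(masks)
    return [sum(col) / n for col in zip(*masks)]

prob_map = voting_probability_map(expert_masks)
# prob_map ≈ [0.33, 1.0, 1.0, 0.67, 0.0]
```

    In the full method these per-voxel probabilities become context features that are refined by each successive classifier layer.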

  2. Automated segmentation of dental CBCT image with prior-guided sequential random forests

    International Nuclear Information System (INIS)

    Wang, Li; Gao, Yaozong; Shi, Feng; Li, Gang; Chen, Ken-Chung; Tang, Zhen; Xia, James J.; Shen, Dinggang

    2016-01-01

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step in generating 3D models for the diagnosis and treatment planning of patients with CMF deformities. However, due to image artifacts caused by beam hardening, imaging noise, inhomogeneity, truncation, and maximal intercuspation, it is difficult to segment the CBCT images. Methods: In this paper, the authors present a new automatic segmentation method to address these problems. Specifically, the authors first employ a majority voting method to estimate the initial segmentation probability maps of both the mandible and maxilla based on multiple aligned expert-segmented CBCT images. These probability maps provide an important prior guidance for CBCT segmentation. The authors then extract both appearance features from the CBCTs and context features from the initial probability maps to train the first layer of the random forest classifier, which can select discriminative features for segmentation. Based on the first layer of the trained classifier, the probability maps are updated and then employed to further train the next layer of the random forest classifier. By iteratively training the subsequent random forest classifiers using both the original CBCT features and the updated segmentation probability maps, a sequence of classifiers can be derived for accurate segmentation of CBCT images. Results: Segmentation results on CBCTs of 30 subjects were both quantitatively and qualitatively validated against manually labeled ground truth. The average Dice ratios for the mandible and maxilla obtained by the authors’ method were 0.94 and 0.91, respectively, which are significantly better than those of the state-of-the-art method based on sparse representation (p-value < 0.001). Conclusions: The authors have developed and validated a novel fully automated method

  3. Order-Constrained Reference Priors with Implications for Bayesian Isotonic Regression, Analysis of Covariance and Spatial Models

    Science.gov (United States)

    Gong, Maozhen

    Selecting an appropriate prior distribution is a fundamental issue in Bayesian Statistics. In this dissertation, under the framework provided by Berger and Bernardo, I derive the reference priors for several models which include: Analysis of Variance (ANOVA)/Analysis of Covariance (ANCOVA) models with a categorical variable under common ordering constraints, the conditionally autoregressive (CAR) models and the simultaneous autoregressive (SAR) models with a spatial autoregression parameter rho considered. The performances of reference priors for ANOVA/ANCOVA models are evaluated by simulation studies with comparisons to Jeffreys' prior and Least Squares Estimation (LSE). The priors are then illustrated in a Bayesian model of the "Risk of Type 2 Diabetes in New Mexico" data, where the relationship between the type 2 diabetes risk (through Hemoglobin A1c) and different smoking levels is investigated. In both simulation studies and real data set modeling, the reference priors that incorporate internal order information show good performances and can be used as default priors. The reference priors for the CAR and SAR models are also illustrated in the "1999 SAT State Average Verbal Scores" data with a comparison to a Uniform prior distribution. Due to the complexity of the reference priors for both CAR and SAR models, only a portion (12 states in the Midwest) of the original data set is considered. The reference priors can give a different marginal posterior distribution compared to a Uniform prior, which provides an alternative for prior specifications for areal data in Spatial statistics.
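
    In the regular one-parameter case, the Berger-Bernardo reference prior reduces to Jeffreys' prior, p(θ) ∝ √I(θ). A small numerical sketch for a Bernoulli parameter (not the order-constrained ANOVA/CAR/SAR priors derived in the dissertation) is:

```python
import math

def fisher_information_bernoulli(theta):
    # For one Bernoulli observation, I(theta) = 1 / (theta * (1 - theta)).
    return 1.0 / (theta * (1.0 - theta))

def jeffreys_density(theta):
    # Jeffreys prior p(theta) ∝ sqrt(I(theta)); normalised, this is the
    # Beta(1/2, 1/2) density with normalising constant 1/pi.
    return math.sqrt(fisher_information_bernoulli(theta)) / math.pi

# Midpoint-rule check that the density integrates to ~1 over (0, 1),
# despite the singularities at the endpoints.
n = 100000
total = sum(jeffreys_density((i + 0.5) / n) / n for i in range(n))
```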

  4. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan

    2006-01-24

    data to constrain model input parameters is shown for the second case study using a Bayesian approach known as Markov Chain Monte Carlo. The approach shows a great potential to be helpful in the validation process and in incorporating prior knowledge with new field data to derive posterior distributions for both model input and output.

  5. Sparse Multivariate Modeling: Priors and Applications

    DEFF Research Database (Denmark)

    Henao, Ricardo

    This thesis presents a collection of statistical models that attempt to take advantage of every piece of prior knowledge available to provide the models with as much structure as possible. The main motivation for introducing these models is interpretability since in practice we want to be able...... a general yet self-contained description of every model in terms of generative assumptions, interpretability goals, probabilistic formulation and target applications. Case studies, benchmark results and practical details are also provided as appendices published elsewhere, containing reprints of peer...

  6. Genome position specific priors for genomic prediction

    DEFF Research Database (Denmark)

    Brøndum, Rasmus Froberg; Su, Guosheng; Lund, Mogens Sandø

    2012-01-01

    causal mutation is different between the populations but affects the same gene. Proportions of a four-distribution mixture for SNP effects in segments of fixed size along the genome are derived from one population and set as location specific prior proportions of distributions of SNP effects...... for the target population. The model was tested using dairy cattle populations of different breeds: 540 Australian Jersey bulls, 2297 Australian Holstein bulls and 5214 Nordic Holstein bulls. The traits studied were protein-, fat- and milk yield. Genotypic data was Illumina 777K SNPs, real or imputed. Results...

  7. Validation philosophy

    International Nuclear Information System (INIS)

    Vornehm, D.

    1994-01-01

    To determine when a set of calculations falls under the umbrella of existing validation documentation, it is necessary to generate a quantitative definition of range of applicability (our definition is only qualitative) for two reasons: (1) the current trend in our regulatory environment will soon make it impossible to support the legitimacy of a validation without quantitative guidelines; and (2) in my opinion, the lack of support by DOE for further critical experiment work is directly tied to our inability to draw a quantitative "line-in-the-sand" beyond which we will not use computer-generated values

  8. Development, validation and application of a micro-liquid chromatography-tandem mass spectrometry based method for simultaneous quantification of selected protein biomarkers of endothelial dysfunction in murine plasma.

    Science.gov (United States)

    Suraj, Joanna; Kurpińska, Anna; Olkowicz, Mariola; Niedzielska-Andres, Ewa; Smolik, Magdalena; Zakrzewska, Agnieszka; Jasztal, Agnieszka; Sitek, Barbara; Chlopicki, Stefan; Walczak, Maria

    2018-02-05

    The objective of this study was to develop and validate a method based on micro-liquid chromatography-tandem mass spectrometry (microLC/MS-MRM) for simultaneous determination of adiponectin (ADN), von Willebrand factor (vWF), soluble form of vascular cell adhesion molecule 1 (sVCAM-1), soluble form of intercellular adhesion molecule 1 (sICAM-1) and syndecan-1 (SDC-1) in mouse plasma. The calibration range was established from 2.5 pmol/mL to 5000 pmol/mL for ADN; 5 pmol/mL to 5000 pmol/mL for vWF; 0.375 pmol/mL to 250 pmol/mL for sVCAM-1 and sICAM-1; and 0.25 pmol/mL to 250 pmol/mL for SDC-1. The method was applied to measure the plasma concentration of selected proteins in mice fed a high-fat diet (HFD), and revealed a pro-thrombotic status by increased concentration of vWF (1.31±0.17 nmol/mL (Control) vs 1.98±0.09 nmol/mL (HFD), p < 0.05) and dysregulation of adipose tissue metabolism by decreased concentration of ADN (0.62±0.08 nmol/mL (Control) vs 0.37±0.06 nmol/mL (HFD), p < 0.05). In conclusion, the microLC/MS-MRM-based method allows for reliable measurements of selected protein biomarkers of endothelial dysfunction in mouse plasma. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Serological and genetic characterisation of bovine respiratory syncytial virus (BRSV) indicates that Danish isolates belong to the intermediate subgroup: no evidence of a selective effect on the variability of G protein nucleotide sequence by prior cell culture adaption and passages in cell culture

    DEFF Research Database (Denmark)

    Larsen, Lars Erik; Uttenthal, Åse; Arctander, P.

    1998-01-01

    on the nucleotide sequence of the G protein. These findings indicated that the previously established variabilities of the G protein of RS virus isolates were not attributable to mutations induced during the propagation of the virus. The reactivity of the Danish isolates with G protein-specific MAbs were similar......Danish isolates of bovine respiratory syncytial virus (BRSV) were characterised by nucleotide sequencing of the G glycoprotein and by their reactivity with a panel of monoclonal antibodies (MAbs). Among the six Danish isolates, the overall sequence divergence ranged between 0 and 3...... part of the G gene of additional 11 field BRSV viruses, processed directly from lung samples without prior adaption to cell culture growth. revealed sequence variabilities in the range obtained with the propagated virus. In addition, several passages in cell culture and in calves had no major impact...

  10. Extended Linear Models with Gaussian Priors

    DEFF Research Database (Denmark)

    Quinonero, Joaquin

    2002-01-01

    In extended linear models the input space is projected onto a feature space by means of an arbitrary non-linear transformation. A linear model is then applied to the feature space to construct the model output. The dimension of the feature space can be very large, or even infinite, giving the model...... a very big flexibility. Support Vector Machines (SVM's) and Gaussian processes are two examples of such models. In this technical report I present a model in which the dimension of the feature space remains finite, and where a Bayesian approach is used to train the model with Gaussian priors...... on the parameters. The Relevance Vector Machine, introduced by Tipping, is a particular case of such a model. I give the detailed derivations of the expectation-maximisation (EM) algorithm used in the training. These derivations are not found in the literature, and might be helpful for newcomers....
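
    For a single scalar weight, the Gaussian-prior treatment sketched above has a closed-form Gaussian posterior whose mean is the ridge estimate. A minimal sketch with hypothetical data and an assumed known noise variance:

```python
# Bayesian linear regression y = w*x + noise, with prior w ~ N(0, tau2)
# and noise ~ N(0, sigma2). For one weight the posterior is Gaussian with
# closed-form mean (the ridge estimate) and variance.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 1.9, 4.1, 5.9]   # roughly y = 2x (hypothetical data)
sigma2 = 0.25               # noise variance (assumed known)
tau2 = 10.0                 # prior variance on the weight

sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

posterior_precision = sxx / sigma2 + 1.0 / tau2
posterior_mean = (sxy / sigma2) / posterior_precision
posterior_var = 1.0 / posterior_precision
```

    The prior term 1/tau2 shrinks the estimate slightly toward zero; as tau2 grows the posterior mean approaches the least-squares solution.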

  11. Savings for visuomotor adaptation require prior history of error, not prior repetition of successful actions.

    Science.gov (United States)

    Leow, Li-Ann; de Rugy, Aymar; Marinovic, Welber; Riek, Stephan; Carroll, Timothy J

    2016-10-01

    When we move, perturbations to our body or the environment can elicit discrepancies between predicted and actual outcomes. We readily adapt movements to compensate for such discrepancies, and the retention of this learning is evident as savings, or faster readaptation to a previously encountered perturbation. The mechanistic processes contributing to savings, or even the necessary conditions for savings, are not fully understood. One theory suggests that savings requires increased sensitivity to previously experienced errors: when perturbations evoke a sequence of correlated errors, we increase our sensitivity to the errors experienced, which subsequently improves error correction (Herzfeld et al. 2014). An alternative theory suggests that a memory of actions is necessary for savings: when an action becomes associated with successful target acquisition through repetition, that action is more rapidly retrieved at subsequent learning (Huang et al. 2011). In the present study, to better understand the necessary conditions for savings, we tested how savings is affected by prior experience of similar errors and prior repetition of the action required to eliminate errors using a factorial design. Prior experience of errors induced by a visuomotor rotation in the savings block was either prevented at initial learning by gradually removing an oppositely signed perturbation or enforced by abruptly removing the perturbation. Prior repetition of the action required to eliminate errors in the savings block was either deprived or enforced by manipulating target location in preceding trials. The data suggest that prior experience of errors is both necessary and sufficient for savings, whereas prior repetition of a successful action is neither necessary nor sufficient for savings. Copyright © 2016 the American Physiological Society.

  12. Negotiating Multicollinearity with Spike-and-Slab Priors.

    Science.gov (United States)

    Ročková, Veronika; George, Edward I

    2014-08-01

    In multiple regression under the normal linear model, the presence of multicollinearity is well known to lead to unreliable and unstable maximum likelihood estimates. This can be particularly troublesome for the problem of variable selection where it becomes more difficult to distinguish between subset models. Here we show how adding a spike-and-slab prior mitigates this difficulty by filtering the likelihood surface into a posterior distribution that allocates the relevant likelihood information to each of the subset model modes. For identification of promising high posterior models in this setting, we consider three EM algorithms, the fast closed form EMVS version of Rockova and George (2014) and two new versions designed for variants of the spike-and-slab formulation. For a multimodal posterior under multicollinearity, we compare the regions of convergence of these three algorithms. Deterministic annealing versions of the EMVS algorithm are seen to substantially mitigate this multimodality. A single simple running example is used for illustration throughout.
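
    The filtering effect of a continuous spike-and-slab mixture can be sketched for a single coefficient: given its current value, the conditional probability that it belongs to the slab follows from Bayes' rule. The variances and prior inclusion probability below are hypothetical, chosen only to illustrate the mechanism.

```python
import math

def normal_pdf(x, var):
    return math.exp(-0.5 * x * x / var) / math.sqrt(2.0 * math.pi * var)

def inclusion_probability(beta, v_spike=0.01, v_slab=10.0, theta=0.5):
    """P(coefficient comes from the slab | its current value beta).

    Continuous spike-and-slab: spike N(0, v_spike) concentrated near zero,
    diffuse slab N(0, v_slab), prior inclusion probability theta.
    """
    slab = theta * normal_pdf(beta, v_slab)
    spike = (1.0 - theta) * normal_pdf(beta, v_spike)
    return slab / (slab + spike)

# A near-zero coefficient is absorbed by the spike; a large one by the slab.
p_small = inclusion_probability(0.05)
p_large = inclusion_probability(2.0)
```

    In EMVS these conditional probabilities are exactly the E-step weights that reallocate likelihood information among the subset-model modes.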

  13. Quantitative modeling of selective lysosomal targeting for drug design

    DEFF Research Database (Denmark)

    Trapp, Stefan; Rosania, G.; Horobin, R.W.

    2008-01-01

    log K ow. These findings were validated with experimental results and by a comparison to the properties of antimalarial drugs in clinical use. For ten active compounds, nine were predicted to accumulate to a greater extent in lysosomes than in other organelles, six of these were in the optimum range...... predicted by the model and three were close. Five of the antimalarial drugs were lipophilic weak dibasic compounds. The predicted optimum properties for a selective accumulation of weak bivalent bases in lysosomes are consistent with experimental values and are more accurate than any prior calculation...

  14. Variable selection methods in PLS regression - a comparison study on metabolomics data

    DEFF Research Database (Denmark)

    Karaman, İbrahim; Hedemann, Mette Skou; Knudsen, Knud Erik Bach

    . The aim of the metabolomics study was to investigate the metabolic profile in pigs fed various cereal fractions with special attention to the metabolism of lignans using LC-MS based metabolomic approach. References 1. Lê Cao KA, Rossouw D, Robert-Granié C, Besse P: A Sparse PLS for Variable Selection when...... integrated approach. Due to the high number of variables in data sets (both raw data and after peak picking) the selection of important variables in an explorative analysis is difficult, especially when different data sets of metabolomics data need to be related. Variable selection (or removal of irrelevant...... different strategies for variable selection on PLSR method were considered and compared with respect to selected subset of variables and the possibility for biological validation. Sparse PLSR [1] as well as PLSR with Jack-knifing [2] was applied to data in order to achieve variable selection prior...

  15. Putting Priors in Mixture Density Mercer Kernels

    Science.gov (United States)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2004-01-01

    This paper presents a new methodology for automatic knowledge-driven data mining based on the theory of Mercer kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn kernel functions directly from data, rather than using predefined kernels. These data-adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing physical information to be encoded in the model. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS). The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms, such as different versions of EM, and numeric optimization methods such as conjugate gradient methods. Template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code. The results show that the Mixture Density Mercer Kernel described here outperforms tree-based classification in distinguishing high-redshift galaxies from low-redshift galaxies by approximately 16% on test data, bagged trees by approximately 7%, and bagged trees built on a much larger sample of data by approximately 2%.
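
    The kernel construction can be sketched as follows: if a fitted mixture model supplies membership probabilities P(k|x), then K(x, y) = Σ_k P(k|x)P(k|y) is an inner product in "cluster space" and hence a valid Mercer kernel. The responsibilities below are hypothetical, and this sketch omits the ensemble averaging over bootstrapped mixture fits used in the actual method.

```python
# Posterior mixture-membership probabilities P(k | x) for four points,
# e.g. from a fitted two-component mixture (values hypothetical).
responsibilities = {
    "a": [0.9, 0.1],
    "b": [0.8, 0.2],
    "c": [0.1, 0.9],
    "d": [0.2, 0.8],
}

def mixture_kernel(rx, ry):
    """K(x, y) = sum_k P(k|x) P(k|y): an inner product of membership
    vectors, hence a symmetric positive semi-definite (Mercer) kernel."""
    return sum(px * py for px, py in zip(rx, ry))

k_ab = mixture_kernel(responsibilities["a"], responsibilities["b"])  # same cluster
k_ac = mixture_kernel(responsibilities["a"], responsibilities["c"])  # different
```

    Points assigned to the same mixture component get a large kernel value, so prior knowledge encoded in the mixture model shapes the induced similarity.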

  16. Prior expectations facilitate metacognition for perceptual decision.

    Science.gov (United States)

    Sherman, M T; Seth, A K; Barrett, A B; Kanai, R

    2015-09-01

    The influential framework of 'predictive processing' suggests that prior probabilistic expectations influence, or even constitute, perceptual contents. This notion is evidenced by the facilitation of low-level perceptual processing by expectations. However, whether expectations can facilitate high-level components of perception remains unclear. We addressed this question by considering the influence of expectations on perceptual metacognition. To isolate the effects of expectation from those of attention we used a novel factorial design: expectation was manipulated by changing the probability that a Gabor target would be presented; attention was manipulated by instructing participants to perform or ignore a concurrent visual search task. We found that, independently of attention, metacognition improved when yes/no responses were congruent with expectations of target presence/absence. Results were modeled under a novel Bayesian signal detection theoretic framework which integrates bottom-up signal propagation with top-down influences, to provide a unified description of the mechanisms underlying perceptual decision and metacognition. Copyright © 2015 Elsevier Inc. All rights reserved.
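
    How a prior expectation shifts the yes/no decision rule can be sketched with standard equal-variance signal detection theory (the textbook optimal-criterion formula, not the paper's full Bayesian metacognition model):

```python
import math

def optimal_criterion(d_prime, p_present):
    """Optimal equal-variance SDT criterion: respond 'yes' when the evidence
    exceeds c = d'/2 + ln((1 - p) / p) / d', where p is the prior
    probability that the target is present."""
    log_prior_odds_against = math.log((1.0 - p_present) / p_present)
    return d_prime / 2.0 + log_prior_odds_against / d_prime

c_neutral = optimal_criterion(1.5, 0.5)   # no expectation bias: c = d'/2
c_expect = optimal_criterion(1.5, 0.75)   # target expected: criterion lowers
```

    Expecting the target lowers the criterion, so responses congruent with the expectation become more likely, mirroring the congruency effect reported above.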

  17. Washing of waste prior to landfilling.

    Science.gov (United States)

    Cossu, Raffaello; Lai, Tiziana

    2012-05-01

    The main impact produced by landfills is represented by the release of leachate emissions. Waste washing treatment has been investigated to evaluate its efficiency in reducing the waste leaching fraction prior to landfilling. The results of laboratory-scale washing tests applied to several significant residues from integrated management of solid waste are presented in this study, specifically: non-recyclable plastics from source separation, mechanical-biological treated municipal solid waste and a special waste, automotive shredded residues. Results obtained demonstrate that washing treatment contributes towards combating the environmental impacts of raw wastes. Accordingly, a leachate production model was applied, leading to the consideration that the concentrations of chemical oxygen demand (COD) and total Kjeldahl nitrogen (TKN), parameters of fundamental importance in the characterization of landfill leachate, from a landfill containing washed wastes, are comparable to those that would only be reached between 90 and 220 years later in the presence of raw wastes. The findings obtained demonstrated that washing of waste may represent an effective means of reducing the leachable fraction resulting in a consequent decrease in landfill emissions. Further studies on pilot scale are needed to assess the potential for full-scale application of this treatment. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Pitch perception prior to cortical maturation

    Science.gov (United States)

    Lau, Bonnie K.

    Pitch perception plays an important role in many complex auditory tasks including speech perception, music perception, and sound source segregation. Because of the protracted and extensive development of the human auditory cortex, pitch perception might be expected to mature, at least over the first few months of life. This dissertation investigates complex pitch perception in 3-month-olds, 7-month-olds and adults -- time points when the organization of the auditory pathway is distinctly different. Using an observer-based psychophysical procedure, a series of four studies were conducted to determine whether infants (1) discriminate the pitch of harmonic complex tones, (2) discriminate the pitch of unresolved harmonics, (3) discriminate the pitch of missing fundamental melodies, and (4) have comparable sensitivity to pitch and spectral changes as adult listeners. The stimuli used in these studies were harmonic complex tones, with energy missing at the fundamental frequency. Infants at both three and seven months of age discriminated the pitch of missing fundamental complexes composed of resolved and unresolved harmonics as well as missing fundamental melodies, demonstrating perception of complex pitch by three months of age. More surprisingly, infants in both age groups had lower pitch and spectral discrimination thresholds than adult listeners. Furthermore, no differences in performance on any of the tasks presented were observed between infants at three and seven months of age. These results suggest that subcortical processing is not only sufficient to support pitch perception prior to cortical maturation, but provides adult-like sensitivity to pitch by three months.

  19. Febrile seizures prior to sudden cardiac death

    DEFF Research Database (Denmark)

    Stampe, Niels Kjær; Glinge, Charlotte; Jabbari, Reza

    2018-01-01

    Aims: Febrile seizure (FS) is a common disorder affecting 2-5% of children up to 5 years of age. The aim of this study was to determine whether FS in early childhood are over-represented in young adults dying from sudden cardiac death (SCD). Methods and results: We included all deaths (n = 4595) nationwide and, through review of all death certificates, we identified 245 SCD in Danes aged 1-30 years in 2000-09. Through the usage of nationwide registries, we identified all persons admitted with first FS among SCD cases (14/245; 5.7%) and in the corresponding living Danish population (71 027/2 369 785...). ... with FS was sudden arrhythmic death syndrome (5/8; 62.5%). Conclusion: This study demonstrates a significantly two-fold increase in the frequency of FS prior to death in young SCD cases compared with the two control groups, suggesting that FS could potentially contribute in a risk...

  20. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    Science.gov (United States)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions: the prior and the posterior. The posterior distribution is influenced by the selection of the prior distribution. Jeffreys’ prior distribution is a kind of non-informative prior distribution, used when information about the parameter is not available. The non-informative Jeffreys’ prior distribution is combined with the sample information, resulting in the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of the multivariate regression model using the Bayesian method with the non-informative Jeffreys’ prior distribution. Based on the results and discussion, the parameter estimates of β and Σ were obtained from the expected values of the marginal posterior distribution functions. The marginal posterior distributions for β and Σ are multivariate normal and inverse Wishart, respectively. However, the calculation of the expected value involves the integral of a function whose value is difficult to determine. Therefore, an approach is needed that generates random samples according to the posterior distribution characteristics of each parameter, using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
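    The Gibbs scheme described in this record can be sketched for the simpler single-response case. Under the Jeffreys prior p(β, σ²) ∝ 1/σ², the full conditionals are normal for β and inverse-gamma for σ², so the sampler alternates two exact draws. This is a minimal illustration, not the paper's multivariate-response implementation (which yields the inverse-Wishart marginal for Σ).

```python
import numpy as np

def gibbs_linear_regression(X, y, n_iter=2000, burn=500, seed=0):
    """Gibbs sampler for y = X beta + eps under the Jeffreys prior
    p(beta, sigma^2) proportional to 1/sigma^2."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ (X.T @ y)          # OLS estimate = conditional mean
    beta = beta_hat.copy()
    draws = []
    for it in range(n_iter):
        # sigma^2 | beta, y ~ Inverse-Gamma(n/2, RSS/2)
        rss = float(np.sum((y - X @ beta) ** 2))
        sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / rss)
        # beta | sigma^2, y ~ N(beta_hat, sigma^2 (X'X)^{-1})
        beta = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
        if it >= burn:
            draws.append(beta)
    return np.asarray(draws)
```

The retained draws approximate the marginal posterior of β; their mean converges to the OLS estimate, as the theory predicts for this prior.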

  1. Selection and Serial Entrepreneurs

    DEFF Research Database (Denmark)

    Chen, Jing

    2013-01-01

    There is substantial evidence that serial entrepreneurs outperform de novo entrepreneurs. But is this positive association between prior experience and performance the result of learning by doing or of selection on ability? This paper proposes a strategy that combines the fixed-effects model and IV... when the analysis focuses on founding new startups in sectors closely related to entrepreneurs' previous ventures....

  2. Selective Mutism: Phenomenological Characteristics.

    Science.gov (United States)

    Ford, Mary Ann; Sladeczek, Ingrid E.; Carlson, John; Kratochwill, Thomas R.

    1998-01-01

    To explore factors related to selective mutism (SM), a survey of persons (N=153, including 135 children) with SM was undertaken. Three theoretical assumptions are supported: (1) variant talking behaviors prior to identification of SM; (2) link between SM and social anxiety; (3) potential link between temperament and SM. (EMK)

  3. Bayesian Group Bridge for Bi-level Variable Selection.

    Science.gov (United States)

    Mallick, Himel; Yi, Nengjun

    2017-06-01

    A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.

  4. Spectrophotometric determination of boron in water with prior distillation and hydrolysis of the methyl borate

    International Nuclear Information System (INIS)

    Monzo, J.; Pomares, F.; Guardia, M. de la

    1988-01-01

    A procedure for the determination of boron in irrigation waters is proposed, involving the prior distillation and hydrolysis of methyl borate and subsequent spectrophotometric determination with azomethine-H. The selectivity is better than that of the direct analysis method. (author)

  5. Figure-ground segmentation based on class-independent shape priors

    Science.gov (United States)

    Li, Yang; Liu, Yang; Liu, Guojun; Guo, Maozu

    2018-01-01

    We propose a method to generate figure-ground segmentation by incorporating shape priors into the graph-cuts algorithm. Given an image, we first obtain a linear representation of an image and then apply directional chamfer matching to generate class-independent, nonparametric shape priors, which provide shape clues for the graph-cuts algorithm. We then enforce shape priors in a graph-cuts energy function to produce object segmentation. In contrast to previous segmentation methods, the proposed method shares shape knowledge for different semantic classes and does not require class-specific model training. Therefore, the approach obtains high-quality segmentation for objects. We experimentally validate that the proposed method outperforms previous approaches using the challenging PASCAL VOC 2010/2012 and Berkeley (BSD300) segmentation datasets.

  6. Exchange and Interest Rates prior to EMU: The Case of Greece

    OpenAIRE

    Antzoulatos, Angelos A.; Wilfling, Bernd

    2003-01-01

    Recently a variety of exchange and interest rate models capturing the dynamics during the transition from an exchange rate arrangement of floating rates into a currency union have been derived. While these stochastic equilibrium models in continuous time are theoretically rigorous, a systematic and extensive empirical validation is still lacking. Using exchange and interest rate data collected prior to the Greek EMU-entrance on 1 January 2001 this paper tries to fill the gap between theory and...

  7. Prior Mental Fatigue Impairs Marksmanship Decision Performance

    Directory of Open Access Journals (Sweden)

    James Head

    2017-09-01

    Purpose: Mental fatigue has been shown to impair subsequent physical performance in continuous and discontinuous exercise. However, its influence on subsequent fine-motor performance in an applied setting (e.g., marksmanship) for trained soldiers is relatively unknown. The purpose of this study was to investigate whether prior mental fatigue influences subsequent marksmanship performance as measured by shooting accuracy and judgment of soldiers in a live-fire scenario. Methods: Twenty trained infantry soldiers engaged targets after completing either a mental fatigue or control intervention in a repeated-measures design. Heart rate variability and the NASA-TLX were used to gauge physiological and subjective effects of the interventions. Target hit proportion, projectile group accuracy, and precision were used to measure marksmanship accuracy. Marksmanship accuracy was assessed by measuring bullet group accuracy (i.e., how close a group of shots is relative to the center of mass) and bullet group precision (i.e., how close each individual shot is to the others). Additionally, marksmanship decision accuracy (correctly shooting vs. correctly withholding a shot when engaging targets) was used to examine marksmanship performance. Results: Soldiers rated the mentally fatiguing task (59.88 ± 23.7) as having greater mental workload relative to the control intervention (31.29 ± 12.3), t(19) = 1.72, p < 0.001. Additionally, soldiers completing the mental fatigue intervention (96.04 ± 37.1) also had lower time-domain (standard deviation of normal-to-normal R-R intervals) heart rate variability relative to the control (134.39 ± 47.4), t(18) = 3.59, p < 0.001. Projectile group accuracy and group precision failed to show differences between interventions (t(19) = 0.98, p = 0.34; t(19) = 0.18, p = 0.87, respectively). Marksmanship decision errors significantly increased after soldiers completed the mental fatigue intervention (48% ± 22.4) relative to the control

  8. Digital communication constraints in prior space missions

    Science.gov (United States)

    Yassine, Nathan K.

    2004-01-01

    Digital communication is crucial for space endeavors. It transmits scientific and command data between earth stations and the spacecraft crew. It facilitates communications between astronauts, and provides live coverage during all phases of the mission. Digital communications provide ground stations and spacecraft crew precise data on the spacecraft position throughout the entire mission. Lessons learned from prior space missions are valuable for the new lunar and Mars missions set out in the president's speech. These data will save our agency time and money, and help set the course for our current developing technologies. Limitations on digital communications equipment pertaining to mass, volume, data rate, frequency, antenna type and size, modulation, format, and power in past space missions are of particular interest. This activity is in support of ongoing communication architectural studies pertaining to robotic and human lunar exploration. The design capabilities and functionalities will depend on the space and power allocated for digital communication equipment. My contribution will be gathering these data, writing a report, and presenting it to the Communications Technology Division Staff. Antenna design is very carefully studied for each mission scenario. Currently, phased array antennas are being developed for the lunar mission. Phased array antennas use little power, and electronically steer a beam instead of using DC motors. There are 615 patches in the phased array antenna. These patches have to be modified to have high yield. 50 patches were created for testing. My part is to assist in the characterization of these patch antennas, and determine whether or not certain modifications to quartz micro-strip patch radiators result in a significant yield to warrant proceeding with repairs to the prototype 19 GHz ferroelectric reflect-array antenna. This work requires learning how to calibrate an automatic network, and mounting and testing antennas in coaxial fixtures.

  9. Algorithms and tools for system identification using prior knowledge

    International Nuclear Information System (INIS)

    Lindskog, P.

    1994-01-01

    One of the hardest problems in system identification is that of model structure selection. In this thesis two different kinds of a priori process knowledge are used to address this fundamental problem. Concentrating on linear model structures, the first prior taken advantage of is knowledge about the system's dominating time constants and resonance frequencies. The idea is to generalize FIR modelling by replacing the usual delay operator with discrete so-called Laguerre or Kautz filters. The generalization is such that stability, the linear regression structure and the approximation ability of the FIR model structure are retained, whereas the prior is used to reduce the number of parameters needed to arrive at a reasonable model. Tailored and efficient system identification algorithms for these model structures are detailed in this work. The usefulness of the proposed methods is demonstrated through concrete simulation and application studies. The other approach is referred to as semi-physical modelling. The main idea is to use simple physical insight into the application, often in terms of a set of unstructured equations, in order to come up with suitable nonlinear transformations of the raw measurements, so as to allow for a good model structure. Semi-physical modelling is less ''ambitious'' than physical modelling in that no complete physical structure is sought, just combinations of inputs and outputs that can be subjected to more or less standard model structures, such as linear regressions. The suggested modelling procedure starts with a first step where symbolic computations are employed to determine a suitable model structure - a set of regressors. We show how constructive methods from commutative and differential algebra can be applied for this. Subsequently, different numerical schemes for finding a subset of ''good'' regressors and for estimating the corresponding linear-in-the-parameters model are discussed. 107 refs, figs, tabs
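    The Laguerre-filter idea in this record can be sketched as follows: a first-order low-pass section with pole a, followed by repeated all-pass sections, generates a regressor bank; a prior guess of the dominating time constant fixes a, so far fewer parameters than plain FIR are needed. This is a hypothetical minimal sketch following the standard discrete Laguerre construction, not the thesis code.

```python
import numpy as np

def laguerre_regressors(u, a, n_filters):
    """Filter input u through a discrete Laguerre bank with pole a (|a| < 1).

    Filter 1: sqrt(1 - a^2) / (1 - a q^-1)                   (low-pass)
    Filters 2..n: previous output through (q^-1 - a) / (1 - a q^-1)  (all-pass)
    Returns a (len(u), n_filters) regressor matrix, zero initial conditions.
    """
    u = np.asarray(u, dtype=float)
    N = len(u)
    X = np.zeros((N, n_filters))
    g = np.sqrt(1.0 - a * a)
    w = u
    for k in range(n_filters):
        y = np.zeros(N)
        for t in range(N):
            prev = a * y[t - 1] if t > 0 else 0.0
            if k == 0:
                y[t] = prev + g * w[t]
            else:
                y[t] = prev - a * w[t] + (w[t - 1] if t > 0 else 0.0)
        X[:, k] = y
        w = y
    return X
```

With a = 0 the bank degenerates to an ordinary FIR delay line, which makes the generalization of FIR modelling explicit.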

  10. Targeted Memory Reactivation during Sleep Depends on Prior Learning.

    Science.gov (United States)

    Creery, Jessica D; Oudiette, Delphine; Antony, James W; Paller, Ken A

    2015-05-01

    When sounds associated with learning are presented again during slow-wave sleep, targeted memory reactivation (TMR) can produce improvements in subsequent location recall. Here we used TMR to investigate memory consolidation during an afternoon nap as a function of prior learning. Twenty healthy individuals (8 male, 19-23 y old). Participants learned to associate each of 50 common objects with a unique screen location. When each object appeared, its characteristic sound was played. After electroencephalography (EEG) electrodes were applied, location recall was assessed for each object, followed by a 90-min interval for sleep. During EEG-verified slow-wave sleep, half of the sounds were quietly presented over white noise. Recall was assessed 3 h after initial learning. A beneficial effect of TMR was found in the form of higher recall accuracy for cued objects compared to uncued objects when pre-sleep accuracy was used as an explanatory variable. An analysis of individual differences revealed that this benefit was greater for participants with higher pre-sleep recall accuracy. In an analysis for individual objects, cueing benefits were apparent as long as initial recall was not highly accurate. Sleep physiology analyses revealed that the cueing benefit correlated with delta power and fast spindle density. These findings substantiate the use of targeted memory reactivation (TMR) methods for manipulating consolidation during sleep. TMR can selectively strengthen memory storage for object-location associations learned prior to sleep, except for those near-perfectly memorized. Neural measures found in conjunction with TMR-induced strengthening provide additional evidence about mechanisms of sleep consolidation. © 2015 Associated Professional Sleep Societies, LLC.

  11. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, M.; Frehen, R.; Schotman, P.C.; Bauer, R.

    2010-01-01

    This paper proposes a novel approach for estimating time-varying betas of individual stocks that incorporates prior information based on fundamentals. We shrink the rolling window estimate of beta towards a firm-specific prior that is motivated by asset pricing theory. The prior captures structural
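    The shrinkage described in this record has a simple precision-weighted form. The sketch below is generic (the paper's actual weights come from a Bayesian setup with fundamentals-based priors; the equal-variance example values are hypothetical):

```python
def shrink_beta(beta_rolling, beta_prior, var_rolling, var_prior):
    """Precision-weighted combination of a rolling-window beta estimate and
    a firm-specific prior beta. Noisier window estimates are pulled harder
    toward the prior."""
    w = var_prior / (var_prior + var_rolling)   # weight on the sample estimate
    return w * beta_rolling + (1.0 - w) * beta_prior
```

When the two variances are equal the estimate lands halfway between sample and prior; as the rolling estimate's variance grows, the prior dominates.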

  12. Correlation set analysis: detecting active regulators in disease populations using prior causal knowledge

    Directory of Open Access Journals (Sweden)

    Huang Chia-Ling

    2012-03-01

    Full Text Available Abstract Background Identification of active causal regulators is a crucial problem in understanding mechanism of diseases or finding drug targets. Methods that infer causal regulators directly from primary data have been proposed and successfully validated in some cases. These methods necessarily require very large sample sizes or a mix of different data types. Recent studies have shown that prior biological knowledge can successfully boost a method's ability to find regulators. Results We present a simple data-driven method, Correlation Set Analysis (CSA, for comprehensively detecting active regulators in disease populations by integrating co-expression analysis and a specific type of literature-derived causal relationships. Instead of investigating the co-expression level between regulators and their regulatees, we focus on coherence of regulatees of a regulator. Using simulated datasets we show that our method performs very well at recovering even weak regulatory relationships with a low false discovery rate. Using three separate real biological datasets we were able to recover well known and as yet undescribed, active regulators for each disease population. The results are represented as a rank-ordered list of regulators, and reveals both single and higher-order regulatory relationships. Conclusions CSA is an intuitive data-driven way of selecting directed perturbation experiments that are relevant to a disease population of interest and represent a starting point for further investigation. Our findings demonstrate that combining co-expression analysis on regulatee sets with a literature-derived network can successfully identify causal regulators and help develop possible hypothesis to explain disease progression.

  13. Validation of the sensitivity of the National Emergency X-Radiography Utilization Study (NEXUS) Head computed tomographic (CT) decision instrument for selective imaging of blunt head injury patients: An observational study.

    Directory of Open Access Journals (Sweden)

    William R Mower

    2017-07-01

    Clinicians, afraid of missing intracranial injuries, liberally obtain computed tomographic (CT) head imaging in blunt trauma patients. Prior work suggests that clinical criteria (the National Emergency X-Radiography Utilization Study [NEXUS] Head CT decision instrument [DI]) can reliably identify patients with important injuries, while excluding injury, and the need for imaging in many patients. Validating this DI requires confirmation of the hypothesis that the lower 95% confidence limit for its sensitivity in detecting serious injury exceeds 99.0%. A secondary goal of the study was to complete an independent validation and comparison of the Canadian and NEXUS Head CT rules among the subgroup of patients meeting the inclusion and exclusion criteria. We conducted a prospective observational study of the NEXUS Head CT DI in 4 hospital emergency departments between April 2006 and December 2015. Implementation of the rule requires that patients satisfy 8 criteria to achieve "low-risk" classification. Patients are excluded from "low-risk" classification and assigned "high-risk" status if they fail to meet 1 or more criteria. We examined the instrument's performance in assigning "high-risk" status to patients requiring neurosurgical intervention among a cohort of 11,770 blunt head injury patients. The NEXUS Head CT DI assigned high-risk status to 420 of 420 patients requiring neurosurgical intervention (sensitivity, 100.0% [95% confidence interval [CI]: 99.1%-100.0%]). The instrument assigned low-risk status to 2,823 of 11,350 patients who did not require neurosurgical intervention (specificity, 24.9% [95% CI: 24.1%-25.7%]). None of the 2,823 low-risk patients required neurosurgical intervention (negative predictive value [NPV], 100.0% [95% CI: 99.9%-100.0%]). The DI assigned high-risk status to 759 of 767 patients with significant intracranial injuries (sensitivity, 99.0% [95% CI: 98.0%-99.6%]). The instrument assigned low-risk status to 2,815 of 11
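    The headline figures in this record follow directly from the reported counts. A quick check (counts taken from the abstract above; the paper's confidence-interval machinery is omitted):

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and negative predictive value from a 2x2 table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, npv

# Neurosurgical-intervention outcome: 420/420 flagged high-risk,
# 2,823 of 11,350 non-cases assigned low-risk (so 8,527 false positives).
sens, spec, npv = diagnostic_metrics(tp=420, fn=0, tn=2823, fp=8527)
```

This reproduces the abstract's sensitivity of 100.0%, specificity of 24.9%, and NPV of 100.0%; the 759/767 figure for significant intracranial injuries likewise rounds to 99.0% sensitivity.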

  14. Generalized Bayesian inference with sets of conjugate priors for dealing with prior-data conflict : course at Lund University

    NARCIS (Netherlands)

    Walter, G.

    2015-01-01

    In the Bayesian approach to statistical inference, possibly subjective knowledge on model parameters can be expressed by so-called prior distributions. A prior distribution is updated, via Bayes’ Rule, to the so-called posterior distribution, which combines prior information and information from
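    The conjugate updating described in this record, and the sets-of-priors idea for diagnosing prior-data conflict, can be sketched with the Beta-Binomial pair. This is a minimal illustration of the idea only, not the specific generalized conjugate framework taught in the course:

```python
def beta_update(a, b, successes, failures):
    """Conjugate update: Beta(a, b) prior + Binomial data -> Beta posterior."""
    return a + successes, b + failures

def posterior_mean_range(prior_set, successes, failures):
    """Update every prior in a set; the spread of posterior means stays wide
    under prior-data conflict and narrows when prior and data agree."""
    means = [(a + successes) / (a + b + successes + failures)
             for a, b in prior_set]
    return min(means), max(means)
```

For example, observing 7 successes in 10 trials moves both an optimistic Beta(9, 1) and a pessimistic Beta(1, 9) prior toward the data, but the interval between their posterior means reveals how much the conclusion still depends on the prior.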

  15. The Influence of Prior Knowledge on the Retrieval-Directed Function of Note Taking in Prior Knowledge Activation

    Science.gov (United States)

    Wetzels, Sandra A. J.; Kester, Liesbeth; van Merrienboer, Jeroen J. G.; Broers, Nick J.

    2011-01-01

    Background: Prior knowledge activation facilitates learning. Note taking during prior knowledge activation (i.e., note taking directed at retrieving information from memory) might facilitate the activation process by enabling learners to build an external representation of their prior knowledge. However, taking notes might be less effective in…

  16. Prior Sensitivity Analysis in Default Bayesian Structural Equation Modeling.

    Science.gov (United States)

    van Erp, Sara; Mulder, Joris; Oberski, Daniel L

    2017-11-27

    Bayesian structural equation modeling (BSEM) has recently gained popularity because it enables researchers to fit complex models and solve some of the issues often encountered in classical maximum likelihood estimation, such as nonconvergence and inadmissible solutions. An important component of any Bayesian analysis is the prior distribution of the unknown model parameters. Often, researchers rely on default priors, which are constructed in an automatic fashion without requiring substantive prior information. However, the prior can have a serious influence on the estimation of the model parameters, which affects the mean squared error, bias, coverage rates, and quantiles of the estimates. In this article, we investigate the performance of three different default priors: noninformative improper priors, vague proper priors, and empirical Bayes priors, with the latter being novel in the BSEM literature. Based on a simulation study, we find that these three default BSEM methods may perform very differently, especially with small samples. A careful prior sensitivity analysis is therefore needed when performing a default BSEM analysis. For this purpose, we provide a practical step-by-step guide for practitioners to conducting a prior sensitivity analysis in default BSEM. Our recommendations are illustrated using a well-known case study from the structural equation modeling literature, and all code for conducting the prior sensitivity analysis is available in the online supplemental materials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Impact of formulary restriction with prior authorization by an antimicrobial stewardship program.

    Science.gov (United States)

    Reed, Erica E; Stevenson, Kurt B; West, Jessica E; Bauer, Karri A; Goff, Debra A

    2013-02-15

    In an era of increasing antimicrobial resistance and few antimicrobials in the developmental pipeline, many institutions have developed antimicrobial stewardship programs (ASPs) to help implement evidence-based (EB) strategies for ensuring appropriate utilization of these agents. EB strategies for accomplishing this include formulary restriction with prior authorization. Potential limitations to this particular strategy include delays in therapy, prescriber pushback, and unintended increases in use of unrestricted antimicrobials; however, our ASP found that implementing prior authorization for select antimicrobials, along with a significant effort to educate clinicians on criteria for use, ensured more appropriate prescribing of these agents, hopefully helping to preserve their utility for years to come.

  18. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Background: The Bayesian Network (BN) is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results: We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses the Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge. In this study we included co-citation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation the Markov chain Monte Carlo sampling algorithm is adopted, sampling from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including data from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in the false positive rate. In contrast, without the prior knowledge BN modeling is not always better than random selection, demonstrating the necessity in network modeling of supplementing the gene expression data with additional information. Conclusions: Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves the performance.
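    The candidate-edge reservoir described in this record can be sketched directly: each edge appears with multiplicity proportional to its prior likelihood of functional linkage, so uniform draws from the reservoir propose edges in proportion to prior support. This is a hypothetical simplification of the paper's MCMC proposal step (the `scale` resolution and gene names are invented for illustration):

```python
import random

def build_reservoir(edge_scores, scale=100):
    """edge_scores: {(gene_a, gene_b): prior likelihood in (0, 1]}.
    Returns a list in which each edge's copy number is proportional to its
    score, ready for uniform sampling during MCMC network proposals."""
    reservoir = []
    for edge, score in edge_scores.items():
        reservoir.extend([edge] * max(1, round(score * scale)))
    return reservoir

def propose_edge(reservoir, rng=random):
    """Uniform draw from the reservoir = prior-weighted edge proposal."""
    return rng.choice(reservoir)
```

An edge with prior likelihood 0.8 is then proposed four times as often as one with likelihood 0.2, which is how the prior biases the network search without hard-excluding weakly supported edges.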

  19. The Role of Prior Knowledge in International Franchise Partner Recruitment

    OpenAIRE

    Wang, Catherine; Altinay, Levent

    2006-01-01

    Purpose: To investigate the role of prior knowledge in the international franchise partner recruitment process and to evaluate how cultural distance influences the role of prior knowledge in this process. Design/Methodology/Approach: A single embedded case study of an international hotel firm was the focus of the enquiry. Interviews, observations and document analysis were used as the data collection techniques. Findings: Findings reveal that prior knowledge of the franchisor enab...

  20. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented: a framework purely based on the sensor physics and on prior assumptions on the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed..., even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing....

  1. Training shortest-path tractography: Automatic learning of spatial priors

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Liptrot, Matthew George; Reislev, Nina Linde

    2016-01-01

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior...... knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we...

  2. Crowdsourcing prior information to improve study design and data analysis.

    Directory of Open Access Journals (Sweden)

    Jeffrey S Chrabaszcz

    Full Text Available Though Bayesian methods are being used more frequently, many still struggle with the best method for setting priors with novel measures or task environments. We propose a method for setting priors by eliciting continuous probability distributions from naive participants. This allows us to include any relevant information participants have for a given effect. Even when prior means are near-zero, this method provides a principled way to estimate dispersion and produce shrinkage, reducing the occurrence of overestimated effect sizes. We demonstrate this method with a number of published studies and compare the effect of different prior estimation and aggregation methods.
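The elicitation-and-shrinkage idea above can be sketched with a simple conjugate normal model; the participant summaries, the linear pooling rule, and the study numbers below are illustrative assumptions, not values from the paper.

```python
import statistics

# Hypothetical elicited distributions from three naive participants,
# each summarised as a (mean, standard deviation) pair.
elicited = [(0.1, 0.5), (0.0, 0.4), (0.2, 0.6)]

# Linear opinion pool: average the means, and fold each participant's
# spread plus between-participant disagreement into the prior variance.
prior_mean = statistics.mean(m for m, _ in elicited)
prior_var = statistics.mean(s ** 2 + (m - prior_mean) ** 2 for m, s in elicited)

# Conjugate normal update: the posterior mean shrinks a large observed
# effect toward the near-zero prior mean, countering overestimation.
obs_effect, obs_se = 0.8, 0.3   # illustrative study estimate and its SE
w = prior_var / (prior_var + obs_se ** 2)
posterior_mean = w * obs_effect + (1 - w) * prior_mean
posterior_var = prior_var * obs_se ** 2 / (prior_var + obs_se ** 2)
```

With these numbers the observed effect of 0.8 is pulled toward the near-zero pooled prior mean, which is the shrinkage behaviour the abstract describes.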

  3. Prior knowledge in recalling arguments in bioethical dilemmas

    Directory of Open Access Journals (Sweden)

    Hiemke Katharina Schmidt

    2015-09-01

    Full Text Available Prior knowledge is known to facilitate learning new information. Normally in studies confirming this outcome the relationship between prior knowledge and the topic to be learned is obvious: the information to be acquired is part of the domain or topic to which the prior knowledge belongs. This raises the question as to whether prior knowledge of various domains facilitates recalling information. In this study 79 eleventh-grade students completed a questionnaire on their prior knowledge of seven different domains related to the bioethical dilemma of prenatal diagnostics. The students read a text containing arguments for and arguments against prenatal diagnostics. After one week and again 12 weeks later they were asked to write down all the arguments they remembered. Prior knowledge helped them recall the arguments one week (r = .350) and 12 weeks (r = .316) later. Prior knowledge of three of the seven domains significantly helped them recall the arguments one week later (correlations between r = .194 and r = .394). Partial correlations with interest as a control item revealed that interest did not explain the relationship between prior knowledge and recall. Prior knowledge of different domains jointly supports the recall of arguments related to bioethical topics.

  4. Chromosomal differences between acute nonlymphocytic leukemia in patients with prior solid tumors and prior hematologic malignancies. A study of 14 cases with prior breast cancer

    International Nuclear Information System (INIS)

    Mamuris, Z.; Dumont, J.; Dutrillaux, B.; Aurias, A.

    1989-01-01

    A cytogenetic study of 14 patients with secondary acute nonlymphocytic leukemia (S-ANLL) after prior treatment for breast cancer is reported. The chromosomes recurrently involved in numerical or structural anomalies are chromosomes 7, 5, 17, and 11, in decreasing order of frequency. The distribution of the anomalies detected in this sample of patients is similar to that observed in published cases with prior breast or other solid tumors (though anomalies of chromosome 11 had not been pointed out), but it significantly differs from that of the S-ANLL with prior hematologic malignancies. This difference is principally due to a higher involvement of chromosome 7 in patients with prior hematologic malignancies and of chromosomes 11 and 17 in patients with prior solid tumors. A genetic determinism involving abnormal recessive alleles located on chromosomes 5, 7, 11, and 17 uncovered by deletions of the normal homologs may be a cause of S-ANLL. The difference between patients with prior hematologic malignancies or solid tumors may be explained by different constitutional mutations of recessive genes in the two groups of patients.

  5. Further Validation of the Coach Identity Prominence Scale

    Science.gov (United States)

    Pope, J. Paige; Hall, Craig R.

    2014-01-01

    This study was designed to examine select psychometric properties of the Coach Identity Prominence Scale (CIPS), including the reliability, factorial validity, convergent validity, discriminant validity, and predictive validity. Coaches (N = 338) who averaged 37 (SD = 12.27) years of age, had a mean of 13 (SD = 9.90) years of coaching experience,…

  6. Post-prior discrepancies in the continuum distorted wave-eikonal initial state approximation for ion-helium ionization

    Energy Technology Data Exchange (ETDEWEB)

    Ciappina, M F [CONICET and Departamento de Fisica, Universidad Nacional del Sur, 8000 Bahia Blanca (Argentina); Cravero, W R [CONICET and Departamento de Fisica, Universidad Nacional del Sur, 8000 Bahia Blanca (Argentina); Garibotti, C R [CONICET and Division Colisiones Atomicas, Centro Atomico Bariloche, 8400 Bariloche (Argentina)

    2003-09-28

    We have explored post-prior discrepancies within continuum distorted wave-eikonal initial state theory for ion-atom ionization. Although there are no post-prior discrepancies when electron-target initial and final states are exact solutions of the respective Hamiltonians, discrepancies do arise for multielectronic targets, when a hydrogenic continuum with effective charge is used for the final electron-residual target wavefunction. We have found that the prior version calculations give better results than the post version, particularly for highly charged projectiles. We have explored the reasons for this behaviour and found that the prior version shows less sensitivity to the choice of the final state. The fact that the perturbation potentials operate upon the initial state suggests that the selection of the initial bound state is relatively more important than the final continuum state for the prior version.

  7. Testability evaluation using prior information of multiple sources

    Directory of Open Access Journals (Sweden)

    Wang Chao

    2014-08-01

    Full Text Available Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.
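The core of the method, a credibility-weighted mixed prior updated with success-fail test data, can be sketched for a binomial fault detection rate. The Beta parameters, credibilities, and test counts below are invented for illustration, and the maximum-entropy fitting and data consistency check steps are omitted.

```python
from math import lgamma, exp

# Two hypothetical prior sources for a fault detection rate (FDR), each a
# Beta(a, b) density with a credibility weight (all numbers are made up;
# the paper derives them via maximum entropy / empirical Bayes methods).
priors = [((20.0, 2.0), 0.6),   # e.g. from an expert interval estimate
          ((15.0, 3.0), 0.4)]   # e.g. from earlier success-fail records

s, f = 18, 2   # demonstration test data: 18 faults detected, 2 missed

def log_beta(a, b):
    # log of the Beta function B(a, b)
    return lgamma(a) + lgamma(b) - lgamma(a + b)

# Under binomial data the posterior of a Beta-mixture prior is again a
# Beta mixture; each component weight is rescaled by its marginal likelihood.
post = []
for (a, b), w in priors:
    log_ml = log_beta(a + s, b + f) - log_beta(a, b)
    post.append(((a + s, b + f), w * exp(log_ml)))
z = sum(w for _, w in post)
post = [(ab, w / z) for ab, w in post]

# Point estimate: posterior mean of the mixture.
fdr_hat = sum(w * a / (a + b) for (a, b), w in post)
```

Because each mixture component is conjugate, the posterior is again a Beta mixture and the point estimate follows in closed form; interval estimates would come from the mixture's quantiles.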

  8. Testability evaluation using prior information of multiple sources

    Institute of Scientific and Technical Information of China (English)

    Wang Chao; Qiu Jing; Liu Guanjun; Zhang Yong

    2014-01-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems with a small sample of testability demonstration test data (TDTD) such as low evaluation confidence and inaccurate result, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to the prior information with the mean and interval estimate forms on the testability index to obtain the parameters of prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for the prior information with a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and TDTD. For the prior information to pass the check, the prior credibility is calculated. A mixed prior distribution is formed based on the prior PDFs and the corresponding credibility. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flying control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.

  9. Eliciting hyperparameters of prior distributions for the parameters of paired comparison models

    Directory of Open Access Journals (Sweden)

    Nasir Abbas

    2013-02-01

    Full Text Available In the study of paired comparisons (PC), items may be ranked or issues may be prioritized through subjective assessment of certain judges. PC models are developed and then used to serve the purpose of ranking. The PC models may be studied through the classical or the Bayesian approach. Bayesian inference is a modern statistical technique used to draw conclusions about the population parameters. Its beauty lies in incorporating prior information about the parameters into the analysis in addition to current information (i.e. data). The prior and current information are formally combined to yield a posterior distribution about the population parameters, which is the workbench of the Bayesian statisticians. However, the problems the Bayesians face concern the selection and formal utilization of the prior distribution. Once the type of prior distribution to be used is decided, the problem of estimating the parameters of the prior distribution (i.e. elicitation) still persists. Different methods are devised to serve the purpose. In this study an attempt is made to use Minimum Chi-square (henceforth MCS) for the elicitation purpose; though it is a classical estimation technique, it is used here for elicitation. The entire elicitation procedure is illustrated through a numerical data set.
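The MCS elicitation step can be sketched for a Beta prior on a single proportion; the paired-comparison models treated in the paper are more involved, and the judge's bin probabilities and the search grid below are purely illustrative.

```python
from math import lgamma, exp, log

# Hypothetical probabilities a judge assigns to three bins of a proportion.
elicited = {(0.0, 0.4): 0.2, (0.4, 0.6): 0.5, (0.6, 1.0): 0.3}

def beta_pdf(x, a, b):
    # Beta(a, b) density evaluated at x in (0, 1)
    logc = lgamma(a + b) - lgamma(a) - lgamma(b)
    return exp(logc + (a - 1) * log(x) + (b - 1) * log(1 - x))

def bin_prob(lo, hi, a, b, n=200):
    # probability mass in (lo, hi); the midpoint rule suffices for a sketch
    h = (hi - lo) / n
    return sum(beta_pdf(lo + (i + 0.5) * h, a, b) for i in range(n)) * h

def chi_square(a, b):
    # chi-square discrepancy between elicited and model bin probabilities
    return sum((p - bin_prob(lo, hi, a, b)) ** 2 / bin_prob(lo, hi, a, b)
               for (lo, hi), p in elicited.items())

# Minimum chi-square over a coarse hyperparameter grid.
grid = [0.5 + 0.5 * k for k in range(20)]
a_hat, b_hat = min(((a, b) for a in grid for b in grid),
                   key=lambda ab: chi_square(*ab))
```

The fitted (a_hat, b_hat) are the hyperparameters whose implied bin probabilities sit closest, in the chi-square sense, to what the judge stated.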

  10. Construction and test of the PRIOR proton microscope; Aufbau und Test des Protonenmikroskops PRIOR

    Energy Technology Data Exchange (ETDEWEB)

    Lang, Philipp-Michael

    2015-01-15

    The study of High Energy Density Matter (HEDM) in the laboratory makes great demands on the diagnostics because these states can usually only be created for a short time, and the usual diagnostic techniques with visible light or X-rays reach their limits because of the high density. The high-energy proton radiography technique developed in the 1990s at the Los Alamos National Laboratory is a very promising way to overcome those limits, so that one can measure the density of HEDM with high spatial and time resolution. For this purpose the proton microscope PRIOR (Proton Radiography for FAIR) was set up at GSI, which not only reproduces the image but also magnifies it by a factor of 4.2, and thereby penetrates matter with a density up to 20 g/cm². Straightaway a spatial resolution of less than 30 μm and a time resolution on the nanosecond scale were achieved. This work describes details of the principle, design and construction of the proton microscope as well as first measurements and simulations of essential components such as magnetic lenses, a collimator and a scintillator screen. For the latter it was possible to show that plastic scintillators can be used as a converter as an alternative to the slower but more radiation-resistant crystals, so that it is possible to reach a time resolution of 10 ns. Moreover, the characteristics of the system were investigated at its commissioning in April 2014. The changes in the magnetic field due to radiation damage were also studied. Besides that, an overview of future applications is given. First experiments with Warm Dense Matter created using a pulsed-power setup have already been performed. Furthermore, the promising concept of combining proton radiography with particle therapy has been investigated in the context of the PaNTERA project. An outlook on the possibilities of future experiments at the FAIR accelerator facility is given as well. Because of the higher beam intensity and energy one can expect even...

  11. Multi-diameter pigging: factors affecting the design and selection of pigging tools for multi-diameter pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Karl [Pipeline Engineering and Supply Co. Ltd., Richmond, NY (United States)

    2009-07-01

    This paper will consider the process involved in pigging tool selection for pipelines with two or more significant internal diameters which require pigging tools capable of negotiating the different internal diameters whilst also carrying out the necessary pipeline cleaning operation. The paper will include an analysis of pipeline features that affect pigging tool selection and then go on to look at other variables that determine the pigging tool design; this will include a step by step guide outlining how the tool is designed, the development of prototype pigs and the importance of testing and validation prior to final deployment in operational pigging programmes. (author)

  12. Relations among conceptual knowledge, procedural knowledge, and procedural flexibility in two samples differing in prior knowledge.

    Science.gov (United States)

    Schneider, Michael; Rittle-Johnson, Bethany; Star, Jon R

    2011-11-01

    Competence in many domains rests on children developing conceptual and procedural knowledge, as well as procedural flexibility. However, research on the developmental relations between these different types of knowledge has yielded unclear results, in part because little attention has been paid to the validity of the measures or to the effects of prior knowledge on the relations. To overcome these problems, we modeled the three constructs in the domain of equation solving as latent factors and tested (a) whether the predictive relations between conceptual and procedural knowledge were bidirectional, (b) whether these interrelations were moderated by prior knowledge, and (c) how both constructs contributed to procedural flexibility. We analyzed data from 2 measurement points each from two samples (Ns = 228 and 304) of middle school students who differed in prior knowledge. Conceptual and procedural knowledge had stable bidirectional relations that were not moderated by prior knowledge. Both kinds of knowledge contributed independently to procedural flexibility. The results demonstrate how changes in complex knowledge structures contribute to competence development.

  13. Adaptive nonparametric Bayesian inference using location-scale mixture priors

    NARCIS (Netherlands)

    Jonge, de R.; Zanten, van J.H.

    2010-01-01

    We study location-scale mixture priors for nonparametric statistical problems, including multivariate regression, density estimation and classification. We show that a rate-adaptive procedure can be obtained if the prior is properly constructed. In particular, we show that adaptation is achieved if...

  14. Nudging toward Inquiry: Awakening and Building upon Prior Knowledge

    Science.gov (United States)

    Fontichiaro, Kristin, Comp.

    2010-01-01

    "Prior knowledge" (sometimes called schema or background knowledge) is information one already knows that helps him/her make sense of new information. New learning builds on existing prior knowledge. In traditional reporting-style research projects, students bypass this crucial step and plow right into answer-finding. It's no wonder that many…

  15. Drunkorexia: Calorie Restriction Prior to Alcohol Consumption among College Freshman

    Science.gov (United States)

    Burke, Sloane C.; Cremeens, Jennifer; Vail-Smith, Karen; Woolsey, Conrad

    2010-01-01

    Using a sample of 692 freshmen at a southeastern university, this study examined caloric restriction among students prior to planned alcohol consumption. Participants were surveyed for self-reported alcohol consumption, binge drinking, and caloric intake habits prior to drinking episodes. Results indicated that 99 of 695 (14%) of first year…

  16. Personality, depressive symptoms and prior trauma exposure of new ...

    African Journals Online (AJOL)

    Background. Police officers are predisposed to trauma exposure. The development of depression and post-traumatic stress disorder (PTSD) may be influenced by personality style, prior exposure to traumatic events and prior depression. Objectives. To describe the personality profiles of new Metropolitan Police Service ...

  17. 34 CFR 303.403 - Prior notice; native language.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Prior notice; native language. 303.403 Section 303.403... TODDLERS WITH DISABILITIES Procedural Safeguards General § 303.403 Prior notice; native language. (a... file a complaint and the timelines under those procedures. (c) Native language. (1) The notice must be...

  18. On the use of a pruning prior for neural networks

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1996-01-01

    We address the problem of using a regularization prior that prunes unnecessary weights in a neural network architecture. This prior provides a convenient alternative to traditional weight-decay. Two examples are studied to support this method and illustrate its use. First we use the sunspots...

  19. Bayesian Inference for Structured Spike and Slab Priors

    DEFF Research Database (Denmark)

    Andersen, Michael Riis; Winther, Ole; Hansen, Lars Kai

    2014-01-01

    Sparse signal recovery addresses the problem of solving underdetermined linear inverse problems subject to a sparsity constraint. We propose a novel prior formulation, the structured spike and slab prior, which makes it possible to incorporate a priori knowledge of the sparsity pattern by imposing a spatial...
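A minimal sketch of sampling from such a prior: each coefficient is exactly zero (spike) or Gaussian (slab), with spatially smooth inclusion probabilities. The paper imposes the structure through a transformed latent field; the fixed Gaussian bump below merely stands in for that, so everything here is an illustrative assumption.

```python
import math
import random

random.seed(0)

# Spike-and-slab prior over n coefficients. "Structured" here means the
# inclusion probabilities follow a smooth spatial profile rather than
# being independent and identical across coefficients.
n = 50
incl_prob = [math.exp(-((i - 25) / 6.0) ** 2) for i in range(n)]  # smooth bump

def sample_weights(slab_sd=1.0):
    w = []
    for p in incl_prob:
        if random.random() < p:            # slab: coefficient is active
            w.append(random.gauss(0.0, slab_sd))
        else:                              # spike: exactly zero
            w.append(0.0)
    return w

w = sample_weights()
active = [i for i, x in enumerate(w) if x != 0.0]
```

Draws concentrate their nonzero entries around index 25, i.e. the prior encodes that active coefficients cluster spatially.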

  20. 5 CFR 6201.103 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Prior approval for outside employment. 6201.103 Section 6201.103 Administrative Personnel EXPORT-IMPORT BANK OF THE UNITED STATES SUPPLEMENTAL STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE EXPORT-IMPORT BANK OF THE UNITED STATES § 6201.103 Prior...

  1. Prior authorisation schemes: trade barriers in need of scientific justification

    NARCIS (Netherlands)

    Meulen, van der B.M.J.

    2010-01-01

    Case C-333/08 Commission v. French Republic ‘processing aids’ [2010] ECR-0000. The French prior authorisation scheme for processing aids in food production infringes upon Article 34 TFEU. 1. A prior authorisation scheme not complying with the principle of proportionality infringes upon Article 34 TFEU.

  2. [Treatment of selective mutism].

    Science.gov (United States)

    Melfsen, Siebke; Warnke, Andreas

    2007-11-01

    Selective mutism is a communication disorder of childhood in which the child does not speak in specific social situations despite the ability to speak in other situations. A literature review was completed in order to provide practical guidelines for the assessment and treatment of children with selective mutism. There are many different behavioral approaches in the treatment of this disorder, e.g. contingency management, shaping, stimulus fading, escape-avoidance, self-modeling, and learning theory approaches. A clearer diagnostic understanding of the disorder as part of anxiety or oppositional disorders needs to be reached prior to generalizing an effective treatment for this disorder.

  3. Investigating Attitudes toward Physical Education: Validation across Two Instruments

    Science.gov (United States)

    Donovan, Corinne Baron; Mercier, Kevin; Phillips, Sharon R.

    2015-01-01

    The Centers for Disease Control have suggested that physical education plays a role in promoting healthy lifestyles. Prior research suggests a link between attitudes toward physical education and physical activity outside school. The current study provides additional evidence of construct validity through a validation across two instruments…

  4. Certification and Validation of Prior Learning and Competences

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2004-01-01

    The article examines forms of recognition and validation of prior learning and competences in Europe and Australia, and discusses the need to create validation systems that connect formal education, non-formal training and informal learning as a tool for lifelong learning policies in Colombia...

  5. Cross Cultural Validation Of Perceived Work-Family Facilitation Scale ...

    African Journals Online (AJOL)

    The work-family interface contains four unique factors based on studies from western countries. However, some of these studies have questioned the cross-cultural adoption of psychological concepts, and called for a re-validation prior to adoption. The main purpose of this study is to re-validate the four-factor structure that ...

  6. How do You Select an Anesthesia Method Prior to Tympanostomy Tube Insertion for a Child?

    OpenAIRE

    Lee, Dong-Hee

    2016-01-01

    The use of general (face-mask inhalation and intravenous) anesthesia has been the method of choice for tympanostomy tube insertion in children. However, there is no exact guideline for the choice of anesthesia method and there is no evidence to support the use of one anesthesia method over another. Clinically, the anesthesia method used to be decided by old customs and the surgeon's blind faith that children cannot bear tympanostomy tube insertion under local anesthesia. Clinicians should kee...

  7. A Representation Theorem and Applications to Measure Selection and Noninformative Priors

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    We introduce a set of transformations on the set of all probability distributions over a finite state space, and show that these transformations are the only ones that preserve certain elementary probabilistic relationships. This result provides a new perspective on a variety of probabilistic inf...

  8. Investigation of thermal treatment on selective separation of post consumer plastics prior to froth flotation.

    Science.gov (United States)

    Guney, Ali; Poyraz, M Ibrahim; Kangal, Olgac; Burat, Firat

    2013-09-01

    Plastics have become widely used materials because of their advantages, such as cheapness, endurance, lightness, and hygiene. However, they cause waste and soil pollution and they do not easily decompose. Many promising technologies are being investigated for separating mixed thermoplastics, but they are still uneconomical and unreliable. Depending on their surface characteristics, these plastics can be separated from each other by flotation, a useful mineral processing technique noted for its low cost and simplicity. The main objective of this study is to investigate the flotation characteristics of PET and PVC and determine the effect of plasticizer reagents on efficient plastic separation. For that purpose, various parameters such as pH, plasticizer concentration, plasticizer type, conditioning temperature and thermal conditioning were investigated. As a result, PET particles were floated with 95.1% purity and 65.3% efficiency while PVC particles were obtained with 98.1% purity and 65.3% efficiency.

  9. An evaluation of selected methods for the decontamination of cattle hides prior to skinning.

    Science.gov (United States)

    Small, A; Wells-Burr, B; Buncic, S

    2005-02-01

    The effectiveness of different decontamination treatments in reducing microbial loads on cattle hides was assessed. The 10-s hide treatments were conducted using a wet-and-dry vacuum cleaner filled with one of the liquids (heated to 50 °C) indicated below, followed or not by 10-min drying in the air. Also, the hide was clipped, followed or not by 10-s singeing using a hand-held blowtorch. Before and after each decontamination treatment, the hide was sampled (100 cm² areas) by a sponge-swabbing method to compare the total viable counts of bacteria (TVC). The largest bacterial reduction (P < 0.05) was obtained with the sanitizer solution (10% Betane Plus), which resulted in significant reductions of 1.80 log units (P < 0.05); the remaining treatments produced smaller effects. Since hide contamination is associated with microbial contamination of the carcasses, the results indicate that post-killing/pre-skinning hide decontamination (used alone, or in combination with carcass decontamination) has a potential to improve microbial meat safety. Nevertheless, further research is required to optimise the efficacy of these treatments in the reduction of specific pathogens under commercial conditions.

  10. Assessing Prior Experience in the Selection of Air Traffic Control Specialists

    Science.gov (United States)

    2013-04-01

    Federal Aviation Administration (FAA) human resources personnel. In addition to general and specialized experience and education requirements, the ATC... experience. The alternate requirements also include the general experience requirement, accompanied by an additional written test requirement... had attended the Academy and did not have IFR experience. Alternate Requirement 2: Hold or have held an FAA certificate as a dispatcher for an air...

  11. Oracle convergence rate of posterior under projection prior and Bayesian model selection

    NARCIS (Netherlands)

    Babenko, A.; Belitser, E.N.

    2010-01-01

    We apply the Bayes approach to the problem of projection estimation of a signal observed in the Gaussian white noise model and we study the rate at which the posterior distribution concentrates about the true signal from the space ℓ2 as the information in observations tends to infinity. A benchmark

  12. Oracle convergence rate of posterior under projection prior and Bayesian model selection

    NARCIS (Netherlands)

    Babenko, A.; Belitser, E.

    2010-01-01

    We apply the Bayes approach to the problem of projection estimation of a signal observed in the Gaussian white noise model and we study the rate at which the posterior distribution concentrates about the true signal from the space ℓ2 as the information in observations tends to infinity. A benchmark

  13. Variational segmentation problems using prior knowledge in imaging and vision

    DEFF Research Database (Denmark)

    Fundana, Ketut

    This dissertation addresses variational formulation of segmentation problems using prior knowledge. Variational models are among the most successful approaches for solving many Computer Vision and Image Processing problems. The models aim at finding the solution to a given energy functional defined..., prior knowledge is needed to obtain the desired solution. The introduction of shape priors in particular has proven to be an effective way to segment objects of interest. Firstly, we propose a prior-based variational segmentation model to segment objects of interest in image sequences, that can deal... Many objects have high variability in shape and orientation. This often leads to unsatisfactory results when using a segmentation model with a single shape template. One way to solve this is by using more sophisticated shape models. We propose to incorporate shape priors from a shape sub...

  14. Total Variability Modeling using Source-specific Priors

    DEFF Research Database (Denmark)

    Shepstone, Sven Ewan; Lee, Kong Aik; Li, Haizhou

    2016-01-01

    sequence of an utterance. In both cases the prior for the latent variable is assumed to be non-informative, since for homogeneous datasets there is no gain in generality in using an informative prior. This work shows, in the heterogeneous case, that using informative priors for computing the posterior can lead to favorable results. We focus on modeling the priors using minimum divergence criterion or factor analysis techniques. Tests on the NIST 2008 and 2010 Speaker Recognition Evaluation (SRE) dataset show that our proposed method beats four baselines: For i-vector extraction using an already... trained matrix, for the short2-short3 task in SRE’08, five out of eight female and four out of eight male common conditions were improved. For the core-extended task in SRE’10, four out of nine female and six out of nine male common conditions were improved. When incorporating prior information...

  15. Own and Others’ Prior Experiences Influence Children’s Imitation of Causal Acts

    OpenAIRE

    Williamson, Rebecca A.; Meltzoff, Andrew N.

    2011-01-01

    Young children learn from others’ examples, and they do so selectively. We examine whether the efficacy of prior experiences influences children’s imitation. Thirty-six-month-olds had initial experience on a causal learning task either by performing the task themselves or by watching an adult perform it. The nature of the experience was manipulated such that the actor had either an easy or a difficult experience completing the task. Next, a second adult demonstrated an innovative technique fo...

  16. Inflammatory pathways are central to posterior cerebrovascular artery remodelling prior to the onset of congenital hypertension.

    Science.gov (United States)

    Walas, Dawid; Nowicki-Osuch, Karol; Alibhai, Dominic; von Linstow Roloff, Eva; Coghill, Jane; Waterfall, Christy; Paton, Julian Fr

    2018-01-01

    Cerebral artery hypoperfusion may provide the basis for linking ischemic stroke with hypertension. Brain hypoperfusion may induce hypertension that may serve as an auto-protective mechanism to prevent ischemic stroke. We hypothesised that hypertension is caused by remodelling of the cerebral arteries, which is triggered by inflammation. We used a congenital rat model of hypertension and examined age-related changes in gene expression of the cerebral arteries using RNA sequencing. Prior to hypertension, we found changes in signalling pathways associated with the immune system and fibrosis. Validation studies using second harmonics generation microscopy revealed upregulation of collagen type I and IV in both tunica externa and media. These changes in the extracellular matrix of cerebral arteries pre-empted hypertension accounting for their increased stiffness and resistance, both potentially conducive to stroke. These data indicate that inflammatory driven cerebral artery remodelling occurs prior to the onset of hypertension and may be a trigger elevating systemic blood pressure in genetically programmed hypertension.

  17. Content validation applied to job simulation and written examinations

    International Nuclear Information System (INIS)

    Saari, L.M.; McCutchen, M.A.; White, A.S.; Huenefeld, J.C.

    1984-08-01

    The application of content validation strategies in work settings has become increasingly popular over the last few years, perhaps spurred by an acknowledgment in the courts of content validation as a method for validating employee selection procedures (e.g., Bridgeport Guardians v. Bridgeport Police Dept., 1977). Since criterion-related validation is often difficult to conduct, content validation methods should be investigated as an alternative for determining job-related selection procedures. However, there is not yet consensus among scientists and professionals concerning how content validation should be conducted. This may be because there is a lack of clear-cut operations for conducting content validation for different types of selection procedures. The purpose of this paper is to discuss two content validation approaches being used for the development of a licensing examination that involves a job simulation exam and a written exam. These represent variations in methods for applying content validation. 12 references

  18. The Treatment Validity of Autism Screening Instruments

    Science.gov (United States)

    Livanis, Andrew; Mouzakitis, Angela

    2010-01-01

    Treatment validity is a frequently neglected topic of screening instruments used to identify autism spectrum disorders. Treatment validity, however, should represent an important aspect of these instruments to link the resulting data to the selection of interventions as well as make decisions about treatment length and intensity. Research…

  19. Construct Validity and Case Validity in Assessment

    Science.gov (United States)

    Teglasi, Hedwig; Nebbergall, Allison Joan; Newman, Daniel

    2012-01-01

    Clinical assessment relies on both "construct validity", which focuses on the accuracy of conclusions about a psychological phenomenon drawn from responses to a measure, and "case validity", which focuses on the synthesis of the full range of psychological phenomena pertaining to the concern or question at hand. Whereas construct validity is…

  20. The influence of prior knowledge on the retrieval-directed function of note taking in prior knowledge activation

    NARCIS (Netherlands)

    Wetzels, Sandra; Kester, Liesbeth; Van Merriënboer, Jeroen; Broers, Nick

    2010-01-01

    Wetzels, S. A. J., Kester, L., Van Merriënboer, J. J. G., & Broers, N. J. (2011). The influence of prior knowledge on the retrieval-directed function of note taking in prior knowledge activation. British Journal of Educational Psychology, 81(2), 274-291. doi: 10.1348/000709910X517425

  1. Learning priors for Bayesian computations in the nervous system.

    Directory of Open Access Journals (Sweden)

    Max Berniker

    Full Text Available Our nervous system continuously combines new information from our senses with information it has acquired throughout life. Numerous studies have found that human subjects manage this by integrating their observations with their previous experience (priors) in a way that is close to the statistical optimum. However, little is known about the way the nervous system acquires or learns priors. Here we present results from experiments where the underlying distribution of target locations in an estimation task was switched, manipulating the prior subjects should use. Our experimental design allowed us to measure a subject's evolving prior while they learned. We confirm that through extensive practice subjects learn the correct prior for the task. We found that subjects can rapidly learn the mean of a new prior while the variance is learned more slowly and with a variable learning rate. In addition, we found that a Bayesian inference model could predict the time course of the observed learning while offering an intuitive explanation for the findings. The evidence suggests the nervous system continuously updates its priors to enable efficient behavior.
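    The kind of sequential prior updating described above can be sketched with a conjugate Normal-Inverse-Gamma model, in which an observer's belief about the mean and variance of the target distribution is revised after each observation. This is a minimal illustration, not the paper's model: the hyperparameters and the deterministic stand-in "observations" below are invented for the sketch.

```python
import math

# Normal-Inverse-Gamma hyperparameters (mu, kappa, alpha, beta) for an
# unknown-mean, unknown-variance Gaussian belief over target locations.
def nig_update(mu, kappa, alpha, beta, x):
    """Conjugate update after observing one target location x."""
    mu_new = (kappa * mu + x) / (kappa + 1)
    beta_new = beta + kappa * (x - mu) ** 2 / (2 * (kappa + 1))
    return mu_new, kappa + 1, alpha + 0.5, beta_new

# Deterministic stand-in for targets drawn from the switched distribution
# (true mean 2.0); a real experiment would sample these at random.
targets = [2.0 + 0.5 * math.sin(17.0 * i) for i in range(200)]

mu, kappa, alpha, beta = 0.0, 1.0, 1.0, 1.0  # initial (old) prior belief
for x in targets:
    mu, kappa, alpha, beta = nig_update(mu, kappa, alpha, beta, x)

post_mean = mu                 # current belief about the prior's mean
post_var = beta / (alpha - 1)  # current belief about the prior's variance
print(post_mean, post_var)
```

    After 200 observations the belief about the mean has moved close to the new value, while the variance estimate reflects the spread of the stand-in data.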

  2. Implicit Priors in Galaxy Cluster Mass and Scaling Relation Determinations

    Science.gov (United States)

    Mantz, A.; Allen, S. W.

    2011-01-01

    Deriving the total masses of galaxy clusters from observations of the intracluster medium (ICM) generally requires some prior information, in addition to the assumptions of hydrostatic equilibrium and spherical symmetry. Often, this information takes the form of particular parametrized functions used to describe the cluster gas density and temperature profiles. In this paper, we investigate the implicit priors on hydrostatic masses that result from this fully parametric approach, and the implications of such priors for scaling relations formed from those masses. We show that the application of such fully parametric models of the ICM naturally imposes a prior on the slopes of the derived scaling relations, favoring the self-similar model, and argue that this prior may be influential in practice. In contrast, this bias does not exist for techniques which adopt an explicit prior on the form of the mass profile but describe the ICM non-parametrically. Constraints on the slope of the cluster mass-temperature relation in the literature show a separation based on the approach employed, with the results from fully parametric ICM modeling clustering nearer the self-similar value. Given that a primary goal of scaling relation analyses is to test the self-similar model, the application of methods subject to strong, implicit priors should be avoided. Alternative methods and best practices are discussed.

  3. Validation of the Vanderbilt Holistic Face Processing Test

    OpenAIRE

    Wang, Chao-Chih; Ross, David A.; Gauthier, Isabel; Richler, Jennifer J.

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the ...

  5. Validation of a Radiosensitivity Molecular Signature in Breast Cancer

    Science.gov (United States)

    Eschrich, Steven A.; Fulp, William J.; Pawitan, Yudi; Foekens, John A.; Smid, Marcel; Martens, John W. M.; Echevarria, Michelle; Kamath, Vidya; Lee, Ji-Hyun; Harris, Eleanor E.; Bergh, Jonas; Torres-Roca, Javier F.

    2014-01-01

    Purpose Previously, we developed a radiosensitivity molecular signature (RSI) that was clinically-validated in three independent datasets (rectal, esophageal, head and neck) in 118 patients. Here, we test RSI in radiotherapy (RT) treated breast cancer patients. Experimental Design RSI was tested in two previously published breast cancer datasets. Patients were treated at the Karolinska University Hospital (n=159) and Erasmus Medical Center (n=344). RSI was applied as previously described. Results We tested RSI in RT-treated patients (Karolinska). Patients predicted to be radiosensitive (RS) had an improved 5 yr relapse-free survival when compared with radioresistant (RR) patients (95% vs. 75%, p=0.0212) but there was no difference between RS/RR patients treated without RT (71% vs. 77%, p=0.6744), consistent with RSI being RT-specific (interaction term RSIxRT, p=0.05). Similarly, in the Erasmus dataset RT-treated RS patients had an improved 5-year distant-metastasis-free survival over RR patients (77% vs. 64%, p=0.0409) but no difference was observed in patients treated without RT (RS vs. RR, 80% vs. 81%, p=0.9425). Multivariable analysis showed RSI is the strongest variable in RT-treated patients (Karolinska, HR=5.53, p=0.0987, Erasmus, HR=1.64, p=0.0758) and in backward selection (removal alpha of 0.10) RSI was the only variable remaining in the final model. Finally, RSI is an independent predictor of outcome in RT-treated ER+ patients (Erasmus, multivariable analysis, HR=2.64, p=0.0085). Conclusions RSI is validated in two independent breast cancer datasets totaling 503 patients. Including prior data, RSI is validated in five independent cohorts (621 patients) and represents, to our knowledge, the most extensively validated molecular signature in radiation oncology. PMID:22832933

  6. MARS Validation Plan and Status

    International Nuclear Information System (INIS)

    Ahn, Seung-hoon; Cho, Yong-jin

    2008-01-01

    The KINS Reactor Thermal-hydraulic Analysis System (KINS-RETAS) under development is directed toward a realistic analysis approach of best-estimate (BE) codes and realistic assumptions. In this system, MARS is pivotal, providing the BE thermal-hydraulic (T-H) response of the core and reactor coolant system to various operational transients and accident conditions. As with other BE codes, qualification is essential to ensure reliable and reasonable accuracy for a targeted MARS application. Validation is a key element of code qualification, and determines the capability of a computer code in predicting the major phenomena expected to occur. MARS validation was performed by its developer, KAERI, on the basic premise that its backbone code RELAP5/MOD3.2 is well qualified against analytical solutions and test or operational data. A screening was made to select the test data for MARS validation; some models transplanted from RELAP5, if already validated and found to be acceptable, were screened out from assessment. This seems reasonable, but it does not demonstrate that code adequacy complies with the software QA guidelines. In particular, validating life-cycle products such as code updates or modifications may be difficult. This paper presents the plan for MARS validation, and the current implementation status

  7. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce the mass of accumulation.

  8. Estimating Functions with Prior Knowledge, (EFPK) for diffusions

    DEFF Research Database (Denmark)

    Nolsøe, Kim; Kessler, Mathieu; Madsen, Henrik

    2003-01-01

    In this paper a method is formulated in an estimating function setting for parameter estimation, which allows the use of prior information. The main idea is to use prior knowledge of the parameters, either specified as moment restrictions or as a distribution, and use it in the construction of an estimating function. It may be useful when the full Bayesian analysis is difficult to carry out for computational reasons. This is almost always the case for diffusions, which is the focus of this paper, though the method applies in other settings.
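    The idea of building prior knowledge into an estimating function can be illustrated in a deliberately simple Gaussian-mean setting rather than a diffusion; the data, the prior mean `mu0` and the weight `lam` below are invented for the sketch and are not from the paper.

```python
# Toy instance of an estimating function augmented with prior knowledge:
# the usual score for a Gaussian mean plus a term pulling the estimate
# toward a prior mean mu0. The weight lam (prior precision relative to
# the data) is a modelling choice.
data = [1.8, 2.2, 2.5, 1.9, 2.1, 2.3]
mu0, lam, sigma2 = 1.0, 4.0, 0.25

def estimating_function(theta):
    score = sum((x - theta) / sigma2 for x in data)  # data part
    prior_term = lam * (mu0 - theta)                 # prior restriction
    return score + prior_term

# The root has a closed form here: a shrinkage of the sample mean toward mu0.
n, xbar = len(data), sum(data) / len(data)
theta_hat = (n * xbar / sigma2 + lam * mu0) / (n / sigma2 + lam)
print(theta_hat, xbar, mu0)
```

    Solving the augmented estimating equation yields an estimate that lies between the prior mean and the sample mean, which is exactly the compromise the prior restriction is meant to enforce.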

  9. 29 CFR 452.40 - Prior office holding.

    Science.gov (United States)

    2010-07-01

    ... DISCLOSURE ACT OF 1959, Candidacy for Office; Reasonable Qualifications, § 452.40 Prior office holding. ... Wirtz v. Hotel, Motel and Club Employees Union, Local 6, 391 U.S. 492 at 504. The Court stated...

  10. Form of prior for constrained thermodynamic processes with uncertainty

    Science.gov (United States)

    Aneja, Preety; Johal, Ramandeep S.

    2015-05-01

    We consider quasi-static thermodynamic processes with constraints, but with additional uncertainty about the control parameters. Motivated by inductive reasoning, we assign a prior distribution that provides a rational guess about likely values of the uncertain parameters. The priors are derived explicitly for both the entropy-conserving and the energy-conserving processes. The proposed form is useful when the constraint equation cannot be treated analytically. The inference is performed using spin-1/2 systems as models for heat reservoirs. Analytical results are derived in the high-temperature limit. An agreement beyond linear response is found between the estimates of thermal quantities and their optimal values obtained from extremum principles. We also seek an intuitive interpretation for the prior and the estimated value of temperature obtained therefrom. We find that the prior over temperature becomes uniform over the quantity kept conserved in the process.

  11. Prior Expectations Bias Sensory Representations in Visual Cortex

    NARCIS (Netherlands)

    Kok, P.; Brouwer, G.J.; Gerven, M.A.J. van; Lange, F.P. de

    2013-01-01

    Perception is strongly influenced by expectations. Accordingly, perception has sometimes been cast as a process of inference, whereby sensory inputs are combined with prior knowledge. However, despite a wealth of behavioral literature supporting an account of perception as probabilistic inference,

  12. Bayesian optimal experimental design for priors of compact support

    KAUST Repository

    Long, Quan

    2016-01-01

    to account for the bounded domain of the uniform prior pdf of the parameters. The underlying Gaussian distribution is obtained in the spirit of the Laplace method, more precisely, the mode is chosen as the maximum a posteriori (MAP) estimate
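    The Laplace-at-the-MAP construction this (truncated) record refers to can be illustrated for a one-dimensional posterior under a uniform prior on a bounded domain; the binomial example below is ours, not the paper's.

```python
# Binomial likelihood with a uniform prior on [0, 1]: the exact posterior
# is Beta(k+1, n-k+1). The Laplace method replaces it by a Gaussian whose
# mode is the maximum a posteriori (MAP) estimate and whose variance comes
# from the curvature of the log-posterior at the mode.
n, k = 50, 20

p_map = k / n                                          # MAP under the flat prior
curvature = k / p_map**2 + (n - k) / (1 - p_map)**2    # -d2/dp2 of log posterior
laplace_var = 1.0 / curvature                          # Gaussian approximation

# Exact posterior variance of Beta(k+1, n-k+1) for comparison.
a, b = k + 1, n - k + 1
exact_var = a * b / ((a + b) ** 2 * (a + b + 1))
print(p_map, laplace_var, exact_var)
```

    For moderate sample sizes the curvature-based variance is already close to the exact posterior variance, which is what makes the Laplace approximation attractive in optimal experimental design.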

  13. What good are actions? Accelerating learning using learned action priors

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2012-11-01

    Full Text Available The computational complexity of learning in sequential decision problems grows exponentially with the number of actions available to the agent at each state. We present a method for accelerating this process by learning action priors that express...

  14. Assessment of prior learning in vocational education and training

    DEFF Research Database (Denmark)

    Wahlgren, Bjarne; Aarkrog, Vibe

    ’ knowledge, skills and competences during the students’ performances and the methods that the teachers apply in order to assess the students’ prior learning in relation to the regulations of the current VET-program. In particular the study focuses on how to assess not only the students’ explicated knowledge......The article deals about the results of a study of the assessment of prior learning among adult workers who want to obtain formal qualifications as skilled workers. The study contributes to developing methods for assessing prior learning including both the teachers’ ways of eliciting the students...... and skills but also their competences, i.e. the way the students use their skills and knowledge to perform in practice. Based on a description of the assessment procedures the article discusses central issues in relation to the assessment of prior learning. The empirical data have been obtained in the VET...

  15. Generalized multiple kernel learning with data-dependent priors.

    Science.gov (United States)

    Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li

    2015-06-01

    Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.

  16. Fractional Gaussian noise: Prior specification and model comparison

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    Fractional Gaussian noise (fGn) is a stationary stochastic process used to model antipersistent or persistent dependency structures in observed time series. Properties of the autocovariance function of fGn are characterised by the Hurst exponent (H), which, in Bayesian contexts, typically has been assigned a uniform prior on the unit interval. This paper argues why a uniform prior is unreasonable and introduces the use of a penalised complexity (PC) prior for H. The PC prior is computed to penalise divergence from the special case of white noise and is invariant to reparameterisations. An immediate advantage is that the exact same prior can be used for the autocorrelation coefficient φ of a first-order autoregressive process AR(1), as this model also reflects a flexible version of white noise. Within the general setting of latent Gaussian models, this allows us to compare an fGn model component with AR(1) using Bayes factors, avoiding the confounding effects of prior choices for the two hyperparameters H and φ. Among others, this is useful in climate regression models where inference for underlying linear or smooth trends depends heavily on the assumed noise model.
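    The distance on which a PC prior is built can be sketched for the AR(1) case mentioned above, with white noise as the base model: d(φ) = sqrt(2 KLD) between the flexible and the base model. The sample length n and coefficient φ below are arbitrary choices for the sketch; for an AR(1) correlation matrix the log-determinant has the closed form (n − 1)·log(1 − φ²), which we check against a direct numeric computation.

```python
import numpy as np

n, phi = 30, 0.6

# AR(1) correlation matrix (unit diagonal).
idx = np.arange(n)
Sigma = phi ** np.abs(idx[:, None] - idx[None, :])

# KLD( N(0, Sigma) || N(0, I) ) = 0.5 * (tr(Sigma) - n - logdet(Sigma)).
sign, logdet = np.linalg.slogdet(Sigma)
kld_numeric = 0.5 * (np.trace(Sigma) - n - logdet)
d_numeric = np.sqrt(2.0 * kld_numeric)

# Closed form using logdet = (n - 1) * log(1 - phi**2) and tr(Sigma) = n.
d_analytic = np.sqrt(-(n - 1) * np.log(1.0 - phi**2))
print(d_numeric, d_analytic)
```

    A PC prior then places an exponential density on this distance, so the prior mass decays as the model moves away from white noise.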

  18. SUSPENSION OF THE PRIOR DISCIPLINARY INVESTIGATION ACCORDING TO LABOR LAW

    Directory of Open Access Journals (Sweden)

    Nicolae, GRADINARU

    2014-11-01

    Full Text Available In order to conduct the prior disciplinary investigation, the employee shall be convoked in writing by the person authorized by the employer to carry out the investigation, specifying the subject, date, time and place of the meeting. For this purpose the employer shall appoint a committee charged with conducting the prior disciplinary investigation. The prior disciplinary investigation cannot be carried out without giving the accused person the possibility to defend himself; it would be an abuse for the employer to violate these provisions. Since the employee is entitled to formulate and sustain a defence proving innocence or a lesser degree of guilt than imputed, a reasonable term must elapse between the moment the charges are disclosed to the employee and the moment the prior disciplinary investigation is performed, so that the employee is able to prepare a defence. The employee's failure to appear at the convocation without an objective reason entitles the employer to impose the sanction without conducting the prior disciplinary investigation. The objective reason that prevents the employee subject to prior disciplinary investigation from appearing must exist at the time of the investigation in question.

  19. Identification of subsurface structures using electromagnetic data and shape priors

    Energy Technology Data Exchange (ETDEWEB)

    Tveit, Svenn, E-mail: svenn.tveit@uni.no [Uni CIPR, Uni Research, Bergen 5020 (Norway); Department of Mathematics, University of Bergen, Bergen 5020 (Norway); Bakr, Shaaban A., E-mail: shaaban.bakr1@gmail.com [Department of Mathematics, Faculty of Science, Assiut University, Assiut 71516 (Egypt); Uni CIPR, Uni Research, Bergen 5020 (Norway); Lien, Martha, E-mail: martha.lien@octio.com [Uni CIPR, Uni Research, Bergen 5020 (Norway); Octio AS, Bøhmergaten 44, Bergen 5057 (Norway); Mannseth, Trond, E-mail: trond.mannseth@uni.no [Uni CIPR, Uni Research, Bergen 5020 (Norway); Department of Mathematics, University of Bergen, Bergen 5020 (Norway)

    2015-03-01

    We consider the inverse problem of identifying large-scale subsurface structures using the controlled source electromagnetic method. To identify structures in the subsurface where the contrast in electric conductivity can be small, regularization is needed to bias the solution towards preserving structural information. We propose to combine two approaches for regularization of the inverse problem. In the first approach we utilize a model-based, reduced, composite representation of the electric conductivity that is highly flexible, even for a moderate number of degrees of freedom. With a low number of parameters, the inverse problem is efficiently solved using a standard, second-order gradient-based optimization algorithm. Further regularization is obtained using structural prior information, available, e.g., from interpreted seismic data. The reduced conductivity representation is suitable for incorporation of structural prior information. Such prior information cannot, however, be accurately modeled with a Gaussian distribution. To alleviate this, we incorporate the structural information using shape priors. The shape prior technique requires the choice of kernel function, which is application dependent. We argue for using the conditionally positive definite kernel, which is shown to have computational advantages over the commonly applied Gaussian kernel for our problem. Numerical experiments on various test cases show that the methodology is able to identify fairly complex subsurface electric conductivity distributions while preserving structural prior information during the inversion.

  20. Compositional-prior-guided image reconstruction algorithm for multi-modality imaging

    Science.gov (United States)

    Fang, Qianqian; Moore, Richard H.; Kopans, Daniel B.; Boas, David A.

    2010-01-01

    The development of effective multi-modality imaging methods typically requires an efficient information fusion model, particularly when combining structural images with a complementary imaging modality that provides functional information. We propose a composition-based image segmentation method for X-ray digital breast tomosynthesis (DBT) and a structural-prior-guided image reconstruction for a combined DBT and diffuse optical tomography (DOT) breast imaging system. Using the 3D DBT images from 31 clinically measured healthy breasts, we create an empirical relationship between the X-ray intensities for adipose and fibroglandular tissue. We use this relationship to segment another 58 healthy breast DBT images from 29 subjects into compositional maps of different tissue types. For each breast, we build a weighted graph in the compositional space and construct a regularization matrix to incorporate the structural priors into a finite-element-based DOT image reconstruction. Use of the compositional priors enables us to fuse tissue anatomy into optical images with less restriction than when using a binary segmentation. This allows us to recover the image contrast captured by DOT but not by DBT. We show that it is possible to fine-tune the strength of the structural priors by changing a single regularization parameter. By estimating the optical properties for adipose and fibroglandular tissue using the proposed algorithm, we found the results to be comparable or superior to those estimated with expert segmentations, while avoiding the time-consuming manual selection of regions-of-interest. PMID:21258460
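    One plausible reading of the weighted-graph construction above is a graph Laplacian built from similarities in composition, used as a Tikhonov-style penalty; the toy tissue fractions and the similarity scale `sigma` below are hypothetical, not values from the paper.

```python
import numpy as np

# Nodes are image voxels; edge weights decay with the difference in a
# (hypothetical) fibroglandular fraction. The resulting graph Laplacian L
# penalises optical-property differences between compositionally similar
# voxels through the quadratic form x^T L x.
comp = np.array([0.1, 0.15, 0.8, 0.85, 0.5])  # toy tissue fractions
sigma = 0.2                                   # similarity scale (a choice)

diff = comp[:, None] - comp[None, :]
W = np.exp(-diff**2 / sigma**2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W                # graph Laplacian

# A single scalar then tunes how strongly the structural prior is
# enforced, e.g. in normal equations of the form (J^T J + beta * L) x = J^T y.
beta = 0.1
print(L.shape, beta)
```

    The single parameter `beta` plays the role of the "single regularization parameter" the abstract refers to: increasing it pulls the reconstruction toward compositional smoothness.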

  1. Does attention speed up processing? Decreases and increases of processing rates in visual prior entry.

    Science.gov (United States)

    Tünnermann, Jan; Petersen, Anders; Scharlau, Ingrid

    2015-03-02

    Selective visual attention improves performance in many tasks. Among others, it leads to "prior entry": earlier perception of an attended stimulus compared to an unattended one. Whether this phenomenon is based purely on an increase in the processing rate of the attended stimulus, or whether a decrease in the processing rate of the unattended stimulus also contributes to the effect, has up to now been unanswered. Here we describe a novel approach to this question based on Bundesen's Theory of Visual Attention, which we use to overcome the limitations of earlier prior-entry assessment with temporal order judgments (TOJs) that only allow relative statements regarding the processing speed of attended and unattended stimuli. Prevalent models of prior entry in TOJs either indirectly predict a pure acceleration or cannot model the difference between acceleration and deceleration. In a paradigm that combines a letter-identification task with TOJs, we show that acceleration of the attended and deceleration of the unattended stimuli indeed conjointly cause prior entry. © 2015 ARVO.

  2. PREFERENCE OF PRIOR FOR BAYESIAN ANALYSIS OF THE MIXED BURR TYPE X DISTRIBUTION UNDER TYPE I CENSORED SAMPLES

    Directory of Open Access Journals (Sweden)

    Tabassum Naz Sindhu

    2014-05-01

    Full Text Available The paper is concerned with the preference of prior for the Bayesian analysis of the shape parameter of the mixture of Burr type X distribution using censored data. We modeled the heterogeneous population using a two-component mixture of the Burr type X distribution. A comprehensive simulation scheme, through probabilistic mixing, has been followed to highlight the properties and behavior of the estimates in terms of sample size, corresponding risks and the proportion of the component of the mixture. The Bayes estimators of the parameters have been evaluated under the assumption of informative and non-informative priors using symmetric and asymmetric loss functions. The model selection criterion for the preference of the prior has been introduced. The hazard rate function of the mixture distribution has been discussed. The Bayes estimates under the exponential prior and the precautionary loss function exhibit the minimum posterior risks, with some exceptions.
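    The contrast between symmetric and asymmetric loss functions mentioned above can be sketched with posterior draws for a positive parameter: under squared-error loss the Bayes estimator is the posterior mean, while under the precautionary loss L(est, θ) = (est − θ)²/est it is the square root of the posterior second moment. The deterministic stand-in "draws" below are invented; a real analysis would use MCMC output from the mixture model.

```python
import math

# Deterministic stand-in for posterior draws of a positive parameter.
posterior_draws = [0.5 + 0.4 * math.sin(3.1 * i) ** 2 for i in range(1000)]

mean = sum(posterior_draws) / len(posterior_draws)
second_moment = sum(t * t for t in posterior_draws) / len(posterior_draws)

est_squared_error = mean                      # Bayes estimator, squared-error loss
est_precautionary = math.sqrt(second_moment)  # Bayes estimator, precautionary loss
print(est_squared_error, est_precautionary)
```

    The precautionary estimator is always at least as large as the posterior mean (by Jensen's inequality), reflecting the asymmetric penalty on underestimation.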

  3. Neutrino masses and their ordering: global data, priors and models

    Science.gov (United States)

    Gariazzo, S.; Archidiacono, M.; de Salas, P. F.; Mena, O.; Ternes, C. A.; Tórtola, M.

    2018-03-01

    We present a full Bayesian analysis of the combination of current neutrino oscillation, neutrinoless double beta decay and Cosmic Microwave Background observations. Our major goal is to carefully investigate the possibility to single out one neutrino mass ordering, namely Normal Ordering or Inverted Ordering, with current data. Two possible parametrizations (three neutrino masses versus the lightest neutrino mass plus the two oscillation mass splittings) and priors (linear versus logarithmic) are exhaustively examined. We find that the preference for NO is only driven by neutrino oscillation data. Moreover, the values of the Bayes factor indicate that the evidence for NO is strong only when the scan is performed over the three neutrino masses with logarithmic priors; for every other combination of parameterization and prior, the preference for NO is only weak. As a by-product of our Bayesian analyses, we are able to (a) compare the Bayesian bounds on the neutrino mixing parameters to those obtained by means of frequentist approaches, finding a very good agreement; (b) determine that the lightest neutrino mass plus the two mass splittings parametrization, motivated by the physical observables, is strongly preferred over the three neutrino mass eigenstates scan and (c) find that logarithmic priors guarantee a weakly-to-moderately more efficient sampling of the parameter space. These results establish the optimal strategy to successfully explore the neutrino parameter space, based on the use of the oscillation mass splittings and a logarithmic prior on the lightest neutrino mass, when combining neutrino oscillation data with cosmology and neutrinoless double beta decay. We also show that the limits on the total neutrino mass ∑ mν can change dramatically when moving from one prior to the other. These results have profound implications for future studies on the neutrino mass ordering, as they crucially state the need for self-consistent analyses which explore the
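    The sensitivity of mass bounds to linear versus logarithmic priors noted above can be sketched with a toy one-dimensional posterior: the same likelihood, which only bounds the parameter from above, yields different 95% upper limits under the two priors. The Gaussian-shaped toy likelihood, its width, and the lower cutoff of the log prior are all invented for the sketch.

```python
import numpy as np

# Grid over a positive parameter (think of the lightest neutrino mass in eV).
m = np.linspace(1e-4, 1.0, 20000)
likelihood = np.exp(-0.5 * (m / 0.1) ** 2)  # toy: data prefer small m

def upper_limit_95(prior):
    """95% credible upper limit of posterior ∝ likelihood * prior."""
    post = likelihood * prior
    cdf = np.cumsum(post)
    cdf /= cdf[-1]
    return m[np.searchsorted(cdf, 0.95)]

lim_linear = upper_limit_95(np.ones_like(m))  # uniform (linear) prior
lim_log = upper_limit_95(1.0 / m)             # flat in log(m), cut at 1e-4
print(lim_linear, lim_log)
```

    The logarithmic prior concentrates mass at small values, so its upper limit is markedly tighter, illustrating why self-consistent prior choices matter when quoting bounds on the neutrino mass.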

  4. Finding A Minimally Informative Dirichlet Prior Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
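    The conjugate Dirichlet-multinomial update this record relies on is simple to state: posterior parameters are the prior alphas plus the observed counts. The sketch below contrasts a minimally informative prior (small total alpha), which lets sparse common-cause-failure data dominate, with a strong prior that overwhelms the same data; the numbers are illustrative, not from the paper.

```python
def dirichlet_posterior_mean(alphas, counts):
    """Posterior mean of a Dirichlet prior updated with multinomial counts."""
    post = [a + c for a, c in zip(alphas, counts)]
    total = sum(post)
    return [p / total for p in post]

counts = [8, 1, 0]             # sparse failure counts by multiplicity
weak = [0.5, 0.3, 0.2]         # total alpha = 1: minimally informative
strong = [50.0, 30.0, 20.0]    # total alpha = 100: dominates the data

mean_weak = dirichlet_posterior_mean(weak, counts)
mean_strong = dirichlet_posterior_mean(strong, counts)
print(mean_weak, mean_strong)
```

    With the weak prior the posterior mean tracks the empirical frequencies (8/9, 1/9, 0), whereas the strong prior barely moves from its prior mean (0.5, 0.3, 0.2); the least-squares construction in the paper is a way of choosing the alphas so that this responsiveness is retained while matching specified prior means.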

  5. Prior-based artifact correction (PBAC) in computed tomography

    International Nuclear Information System (INIS)

    Heußer, Thorsten; Brehm, Marcus; Ritschl, Ludwig; Sawall, Stefan; Kachelrieß, Marc

    2014-01-01

    Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in form of a planning CT of the same patient or in form of a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected and data completion of the patient projections is performed using smooth sinogram inpainting. The obtained projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy at the same time. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method makes use of a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data

  6. Finding a minimally informative Dirichlet prior distribution using least squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  7. Finding a Minimally Informative Dirichlet Prior Distribution Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in closed form, and so an approximate beta distribution is used in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial aleatory model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  8. Automation of cellular therapy product manufacturing: results of a split validation comparing CD34 selection of peripheral blood stem cell apheresis product with a semi-manual vs. an automatic procedure

    OpenAIRE

    Hümmer, Christiane; Poppe, Carolin; Bunos, Milica; Stock, Belinda; Wingenfeld, Eva; Huppert, Volker; Stuth, Juliane; Reck, Kristina; Essl, Mike; Seifried, Erhard; Bonig, Halvard

    2016-01-01

    Background Automation of cell therapy manufacturing promises higher productivity of cell factories, more economical use of highly-trained (and costly) manufacturing staff, facilitation of processes requiring manufacturing steps at inconvenient hours, improved consistency of processing steps and other benefits. One of the most broadly disseminated engineered cell therapy products is immunomagnetically selected CD34+ hematopoietic "stem" cells (HSCs). Methods As the clinical GMP-compliant autom...

  9. Process validation for radiation processing

    International Nuclear Information System (INIS)

    Miller, A.

    1999-01-01

    Process validation concerns the establishment of the irradiation conditions that will lead to the desired changes of the irradiated product. Process validation therefore establishes the link between absorbed dose and the characteristics of the product, such as degree of crosslinking in a polyethylene tube, prolongation of shelf life of a food product, or degree of sterility of the medical device. Detailed international standards are written for the documentation of radiation sterilization, such as EN 552 and ISO 11137, and the steps of process validation that are described in these standards are discussed in this paper. They include material testing for the documentation of the correct functioning of the product, microbiological testing for selection of the minimum required dose and dose mapping for documentation of attainment of the required dose in all parts of the product. The process validation must be maintained by reviews and repeated measurements as necessary. This paper presents recommendations and guidance for the execution of these components of process validation. (author)

  10. Source Localization by Entropic Inference and Backward Renormalization Group Priors

    Directory of Open Access Journals (Sweden)

    Nestor Caticha

    2015-04-01

    A systematic method of transferring information from coarser to finer resolution based on renormalization group (RG) transformations is introduced. It permits building informative priors in finer scales from posteriors in coarser scales since, under some conditions, RG transformations in the space of hyperparameters can be inverted. These priors are updated using renormalized data into posteriors by maximum entropy. The resulting inference method, backward RG (BRG) priors, is tested by simulating a functional magnetic resonance imaging (fMRI) experiment. Its results are compared with a Bayesian approach working in the finest available resolution. Using BRG priors, sources can be partially identified even when signal-to-noise ratio levels are as low as ~ -25 dB, improving vastly on the single-step Bayesian approach. For low levels of noise the BRG prior is not an improvement over the single-scale Bayesian method. Analysis of the histograms of hyperparameters can show how to distinguish whether the method is failing due to very high levels of noise, or whether the identification of the sources is at least partially possible.

  11. A Noninformative Prior on a Space of Distribution Functions

    Directory of Open Access Journals (Sweden)

    Alexander Terenin

    2017-07-01

    In a given problem, the Bayesian statistical paradigm requires the specification of a prior distribution that quantifies relevant information about the unknowns of main interest external to the data. In cases where little such information is available, the problem under study may possess an invariance under a transformation group that encodes a lack of information, leading to a unique prior; this idea was explored at length by E. T. Jaynes. Previous successful examples have included location-scale invariance under linear transformation, multiplicative invariance of the rate at which events in a counting process are observed, and the derivation of the Haldane prior for a Bernoulli success probability. In this paper we show that this method can be extended, by generalizing Jaynes, in two ways: (1) to yield families of approximately invariant priors; and (2) to the infinite-dimensional setting, yielding families of priors on spaces of distribution functions. Our results can be used to describe conditions under which a particular Dirichlet process posterior arises from an optimal Bayesian analysis, in the sense that invariances in the prior and likelihood lead to one and only one posterior distribution.

  12. On Bayesian reliability analysis with informative priors and censoring

    International Nuclear Information System (INIS)

    Coolen, F.P.A.

    1996-01-01

    In the statistical literature many methods have been presented to deal with censored observations, both within the Bayesian and non-Bayesian frameworks, and such methods have been successfully applied to, e.g., reliability problems. Also, in reliability theory it is often emphasized that, through shortage of statistical data and possibilities for experiments, one often needs to rely heavily on judgements of engineers, or other experts, for which means Bayesian methods are attractive. It is therefore important that such judgements can be elicited easily to provide informative prior distributions that reflect the knowledge of the engineers well. In this paper we focus on this aspect, especially on the situation that the judgements of the consulted engineers are based on experiences in environments where censoring has also been present previously. We suggest the use of the attractive interpretation of hyperparameters of conjugate prior distributions when these are available for assumed parametric models for lifetimes, and we show how one may go beyond the standard conjugate priors, using similar interpretations of hyper-parameters, to enable easier elicitation when censoring has been present in the past. This may even lead to more flexibility for modelling prior knowledge than when using standard conjugate priors, whereas the disadvantage of more complicated calculations that may be needed to determine posterior distributions play a minor role due to the advanced mathematical and statistical software that is widely available these days

  13. Novel use of pleural ultrasound can identify malignant entrapped lung prior to effusion drainage.

    Science.gov (United States)

    Salamonsen, Matthew R; Lo, Ada K C; Ng, Arnold C T; Bashirzadeh, Farzad; Wang, William Y S; Fielding, David I K

    2014-11-01

    The presence of entrapped lung changes the appropriate management of malignant pleural effusion from pleurodesis to insertion of an indwelling pleural catheter. No methods currently exist to identify entrapped lung prior to effusion drainage. Our objectives were to develop a method to identify entrapped lung using tissue movement and deformation (strain) analysis with ultrasonography and compare it to the existing technique of pleural elastance (PEL). Prior to drainage, 81 patients with suspected malignant pleural effusion underwent thoracic ultrasound using an echocardiogram machine. Images of the atelectatic lower lobe were acquired during breath hold, allowing motion and strain related to the cardiac impulse to be analyzed using motion mode (M mode) and speckle-tracking imaging, respectively. PEL was measured during effusion drainage. The gold-standard diagnosis of entrapped lung was the consensus opinion of two interventional pulmonologists according to postdrainage imaging. Participants were randomly divided into development and validation sets. Both total movement and strain were significantly reduced in entrapped lung. Using data from the development set, the area under the receiver-operating curves for the diagnosis of entrapped lung was 0.86 (speckle tracking), 0.79 (M mode), and 0.69 (PEL). Using respective cutoffs of 6%, 1 mm, and 19 cm H2O on the validation set, the sensitivity/specificity was 71%/85% (speckle tracking), 50%/85% (M mode), and 40%/100% (PEL). This novel ultrasound technique can identify entrapped lung prior to effusion drainage, which could allow appropriate choice of definitive management (pleurodesis vs indwelling catheter), reducing the number of interventions required to treat malignant pleural effusion.

  14. Selection of Celebrity Endorsers

    DEFF Research Database (Denmark)

    Hollensen, Svend; Schimmelpfennig, Christian

    2013-01-01

    Purpose – This research aims at shedding some light on the various avenues marketers can undertake until finally an endorsement contract is signed. The focus of the study lies on verifying the generally held assumption that endorser selection is usually taken care of by creative agencies, vetting several candidates by means of subtle evaluation procedures. Design/methodology/approach – A case study research has been carried out among companies experienced in celebrity endorsements to learn more about the endorser selection process in practice. Based on these cases, theory is inductively developed. Findings – Our research suggests that the generally held assumption that endorsers are selected and thoroughly vetted by a creative agency may not be universally valid. A normative model to illustrate the continuum of the selection process in practice is suggested, and the two polar case studies (Swiss brand...

  15. The Validation Challenge: How Close Is Europe to Recognising All Learning? Briefing Note

    Science.gov (United States)

    Cedefop - European Centre for the Development of Vocational Training, 2014

    2014-01-01

    The European inventory on validation of non-formal and informal learning provides an unrivaled source of information detailing how validation of prior learning is developing across Europe. It shows that validation strategies and legislation, despite the complexity of the task before them, have been developing slowly but steadily. However, there is…

  16. Superposing pure quantum states with partial prior information

    Science.gov (United States)

    Dogra, Shruti; Thomas, George; Ghosh, Sibasish; Suter, Dieter

    2018-05-01

    The principle of superposition is an intriguing feature of quantum mechanics, which is regularly exploited in many different circumstances. A recent work [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403] shows that the fundamentals of quantum mechanics restrict the process of superimposing two unknown pure states, even though it is possible to superimpose two quantum states with partial prior knowledge. The prior knowledge imposes geometrical constraints on the choice of input states. We discuss an experimentally feasible protocol to superimpose multiple pure states of a d -dimensional quantum system and carry out an explicit experimental realization for two single-qubit pure states with partial prior information on a two-qubit NMR quantum information processor.

  17. Transcriptional analysis of abdominal fat in chickens divergently selected on bodyweight at two ages reveals novel mechanisms controlling adiposity: validating visceral adipose tissue as a dynamic endocrine and metabolic organ.

    Science.gov (United States)

    Resnyk, C W; Carré, W; Wang, X; Porter, T E; Simon, J; Le Bihan-Duval, E; Duclos, M J; Aggrey, S E; Cogburn, L A

    2017-08-16

    Decades of intensive genetic selection in the domestic chicken (Gallus gallus domesticus) have enabled the remarkable rapid growth of today's broiler (meat-type) chickens. However, this enhanced growth rate was accompanied by several unfavorable traits (i.e., increased visceral fatness, leg weakness, and disorders of metabolism and reproduction). The present descriptive analysis of the abdominal fat transcriptome aimed to identify functional genes and biological pathways that likely contribute to an extreme difference in visceral fatness of divergently selected broiler chickens. We used the Del-Mar 14 K Chicken Integrated Systems microarray to take time-course snapshots of global gene transcription in abdominal fat of juvenile [1-11 weeks of age (wk)] chickens divergently selected on bodyweight at two ages (8 and 36 wk). Further, a RNA sequencing analysis was completed on the same abdominal fat samples taken from high-growth (HG) and low-growth (LG) cockerels at 7 wk, the age with the greatest divergence in body weight (3.2-fold) and visceral fatness (19.6-fold). Time-course microarray analysis revealed 312 differentially expressed genes (FDR ≤ 0.05) as the main effect of genotype (HG versus LG), 718 genes in the interaction of age and genotype, and 2918 genes as the main effect of age. The RNA sequencing analysis identified 2410 differentially expressed genes in abdominal fat of HG versus LG chickens at 7 wk. The HG chickens are fatter and over-express numerous genes that support higher rates of visceral adipogenesis and lipogenesis. In abdominal fat of LG chickens, we found higher expression of many genes involved in hemostasis, energy catabolism and endocrine signaling, which likely contribute to their leaner phenotype and slower growth. Many transcription factors and their direct target genes identified in HG and LG chickens could be involved in their divergence in adiposity and growth rate. The present analyses of the visceral fat transcriptome in

  18. Understanding sleep disturbance in athletes prior to important competitions.

    Science.gov (United States)

    Juliff, Laura E; Halson, Shona L; Peiffer, Jeremiah J

    2015-01-01

    Anecdotally many athletes report worse sleep in the nights prior to important competitions. Despite sleep being acknowledged as an important factor for optimal athletic performance and overall health, little is understood about athlete sleep around competition. The aims of this study were to identify sleep complaints of athletes prior to competitions and determine whether complaints were confined to competition periods. Cross-sectional study. A sample of 283 elite Australian athletes (129 male, 157 female, age 24±5 y) completed two questionnaires; Competitive Sport and Sleep questionnaire and the Pittsburgh Sleep Quality Index. 64.0% of athletes indicated worse sleep on at least one occasion in the nights prior to an important competition over the past 12 months. The main sleep problem specified by athletes was problems falling asleep (82.1%) with the main reasons responsible for poor sleep indicated as thoughts about the competition (83.5%) and nervousness (43.8%). Overall 59.1% of team sport athletes reported having no strategy to overcome poor sleep compared with individual athletes (32.7%, p=0.002) who utilised relaxation and reading as strategies. Individual sport athletes had increased likelihood of poor sleep as they aged. The poor sleep reported by athletes prior to competition was situational rather than a global sleep problem. Poor sleep is common prior to major competitions in Australian athletes, yet most athletes are unaware of strategies to overcome the poor sleep experienced. It is essential coaches and scientists monitor and educate both individual and team sport athletes to facilitate sleep prior to important competitions. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  19. Neutrino mass priors for cosmology from random matrices

    Science.gov (United States)

    Long, Andrew J.; Raveri, Marco; Hu, Wayne; Dodelson, Scott

    2018-02-01

    Cosmological measurements of structure are placing increasingly strong constraints on the sum of the neutrino masses, Σ mν, through Bayesian inference. Because these constraints depend on the choice for the prior probability π (Σ mν), we argue that this prior should be motivated by fundamental physical principles rather than the ad hoc choices that are common in the literature. The first step in this direction is to specify the prior directly at the level of the neutrino mass matrix Mν, since this is the parameter appearing in the Lagrangian of the particle physics theory. Thus by specifying a probability distribution over Mν, and by including the known squared mass splittings, we predict a theoretical probability distribution over Σ mν that we interpret as a Bayesian prior probability π (Σ mν). Assuming a basis-invariant probability distribution on Mν, also known as the anarchy hypothesis, we find that π (Σ mν) peaks close to the smallest Σ mν allowed by the measured mass splittings, roughly 0.06 eV (0.1 eV) for normal (inverted) ordering, due to the phenomenon of eigenvalue repulsion in random matrices. We consider three models for neutrino mass generation: Dirac, Majorana, and Majorana via the seesaw mechanism; differences in the predicted priors π (Σ mν) allow for the possibility of having indications about the physical origin of neutrino masses once sufficient experimental sensitivity is achieved. We present fitting functions for π (Σ mν), which provide a simple means for applying these priors to cosmological constraints on the neutrino masses or marginalizing over their impact on other cosmological parameters.
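    The eigenvalue-repulsion effect described above can be illustrated with a short Monte Carlo sketch. This is not the paper's full construction (which also conditions on the measured mass-squared splittings and works in physical units); it simply draws Majorana-type mass matrices from a basis-invariant Gaussian ensemble and histograms the implied sum of masses, showing that spectra with a tiny mass sum are rare.

```python
# Illustrative anarchy-hypothesis sketch: random complex symmetric
# (Majorana-type) mass matrices; masses = singular values (arb. units).
import numpy as np

rng = np.random.default_rng(0)
n_samples, sums = 20000, []
for _ in range(n_samples):
    a = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    m_matrix = (a + a.T) / 2                            # complex symmetric
    masses = np.linalg.svd(m_matrix, compute_uv=False)  # non-negative masses
    sums.append(masses.sum())
sums = np.array(sums)

# Eigenvalue repulsion makes nearly degenerate or vanishing spectra rare,
# so the induced prior on the mass sum peaks away from zero.
print("median sum:", np.median(sums), " P(sum < 0.5):", np.mean(sums < 0.5))
```

    Conditioning such draws on the observed splittings, as the paper does, then yields a physically normalized prior over the mass sum.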

  20. Physical examination prior to initiating hormonal contraception: a systematic review.

    Science.gov (United States)

    Tepper, Naomi K; Curtis, Kathryn M; Steenland, Maria W; Marchbanks, Polly A

    2013-05-01

    Provision of contraception is often linked with physical examination, including clinical breast examination (CBE) and pelvic examination. This review was conducted to evaluate the evidence regarding outcomes among women with and without physical examination prior to initiating hormonal contraceptives. The PubMed database was searched from database inception through March 2012 for all peer-reviewed articles in any language concerning CBE and pelvic examination prior to initiating hormonal contraceptives. The quality of each study was assessed using the United States Preventive Services Task Force grading system. The search did not identify any evidence regarding outcomes among women screened versus not screened with CBE prior to initiation of hormonal contraceptives. The search identified two case-control studies of fair quality which compared women who did or did not undergo pelvic examination prior to initiating oral contraceptives (OCs) or depot medroxyprogesterone acetate (DMPA). No differences in risk factors for cervical neoplasia, incidence of sexually transmitted infections, incidence of abnormal Pap smears or incidence of abnormal wet mount findings were observed. Although women with breast cancer should not use hormonal contraceptives, there is little utility in screening prior to initiation, due to the low incidence of breast cancer and uncertain value of CBE among women of reproductive age. Two fair quality studies demonstrated no differences between women who did or did not undergo pelvic examination prior to initiating OCs or DMPA with respect to risk factors or clinical outcomes. In addition, pelvic examination is not likely to detect any conditions for which hormonal contraceptives would be unsafe. Published by Elsevier Inc.

  1. Selection - factors and influences on training

    International Nuclear Information System (INIS)

    Bruno, R.J.; Mascitti, A.P.

    1987-01-01

    Personnel performance is certainly the goal of training programs, and the impact of personnel performance on plant performance indicators is well known. This presentation discusses the selection of personnel prior to training and emphasizes the need for selection criteria to include aptitude, intelligence, mechanical ability, work ethic, and emotional stability. Selected data from Point Beach are presented that support a rigorous selection and screening program to ensure that training successfully prepares these personnel for job assignments.

  2. Compressive Online Robust Principal Component Analysis with Multiple Prior Information

    DEFF Research Database (Denmark)

    Van Luong, Huynh; Deligiannis, Nikos; Seiler, Jürgen

    …low-rank components. Unlike conventional batch RPCA, which processes all the data directly, our method considers a small set of measurements taken per data vector (frame). Moreover, our method incorporates multiple prior information signals, namely previously reconstructed frames, to improve the separation… and thereafter update the prior information for the next frame. Using experiments on synthetic data, we evaluate the separation performance of the proposed algorithm. In addition, we apply the proposed algorithm to online video foreground and background separation from compressive measurements. The results show…
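    For intuition about the separation problem this record addresses, here is a minimal *batch* robust PCA solver (inexact augmented-Lagrangian principal component pursuit). It is not the authors' compressive, online algorithm with multiple prior-information signals; it only shows the low-rank-plus-sparse decomposition on synthetic data.

```python
# Basic batch RPCA via an inexact augmented Lagrangian (illustrative only;
# the cited paper's method is an online, compressive variant with priors).
import numpy as np

def rpca_alm(d, lam=None, tol=1e-7, max_iter=500):
    """Split d into low_rank + sparse components."""
    m, n = d.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    norm_two = np.linalg.norm(d, 2)
    y = d / max(norm_two, np.abs(d).max() / lam)  # scaled dual variable
    mu = 1.25 / norm_two
    mu_bar, rho = mu * 1e7, 1.5
    sparse = np.zeros_like(d)
    low_rank = np.zeros_like(d)
    for _ in range(max_iter):
        # low-rank step: singular-value thresholding at level 1/mu
        u, s, vt = np.linalg.svd(d - sparse + y / mu, full_matrices=False)
        low_rank = (u * np.maximum(s - 1.0 / mu, 0.0)) @ vt
        # sparse step: entrywise soft-thresholding at level lam/mu
        r = d - low_rank + y / mu
        sparse = np.sign(r) * np.maximum(np.abs(r) - lam / mu, 0.0)
        resid = d - low_rank - sparse
        y = y + mu * resid
        mu = min(mu * rho, mu_bar)
        if np.linalg.norm(resid) / np.linalg.norm(d) < tol:
            break
    return low_rank, sparse

# synthetic demo: rank-2 "background" plus a 5%-dense "foreground"
rng = np.random.default_rng(1)
bg = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 80))
mask = rng.random((60, 80)) < 0.05
fg = np.where(mask, rng.normal(scale=10.0, size=(60, 80)), 0.0)
low_rank, sparse = rpca_alm(bg + fg)
print("background recovery error:",
      np.linalg.norm(low_rank - bg) / np.linalg.norm(bg))
```

    The online variant in the record replaces the full-data SVD with per-frame compressive measurements and uses previously reconstructed frames as prior information.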

  3. Prior knowledge processing for initial state of Kalman filter

    Czech Academy of Sciences Publication Activity Database

    Suzdaleva, Evgenia

    2010-01-01

    Vol. 24, No. 3 (2010), pp. 188-202 ISSN 0890-6327 R&D Projects: GA ČR(CZ) GP201/06/P434 Institutional research plan: CEZ:AV0Z10750506 Keywords : Kalman filtering * prior knowledge * state-space model * initial state distribution Subject RIV: BC - Control Systems Theory Impact factor: 0.729, year: 2010 http://library.utia.cas.cz/separaty/2009/AS/suzdaleva-prior knowledge processing for initial state of kalman filter.pdf
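    The record above concerns constructing the Kalman filter's initial state distribution from prior knowledge. As a minimal sketch of where that prior enters, here is a textbook linear Kalman filter for a constant-velocity model in which the prior knowledge is encoded entirely in the initial mean `x` and covariance `p`; all numbers are illustrative, not from the paper.

```python
# Textbook Kalman filter; the prior enters through the initial N(x, p).
import numpy as np

def kalman_step(x, p, z, f, h, q, r):
    # predict
    x_pred = f @ x
    p_pred = f @ p @ f.T + q
    # update with measurement z
    s = h @ p_pred @ h.T + r
    k = p_pred @ h.T @ np.linalg.inv(s)
    x_new = x_pred + k @ (z - h @ x_pred)
    p_new = (np.eye(len(x)) - k @ h) @ p_pred
    return x_new, p_new

f = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity dynamics
h = np.array([[1.0, 0.0]])               # we observe position only
q, r = 0.01 * np.eye(2), np.array([[0.5]])

# prior knowledge: start near position 0 with velocity ~1, fairly certain
x, p = np.array([0.0, 1.0]), 0.1 * np.eye(2)
rng = np.random.default_rng(2)
for t in range(1, 11):                   # true position moves as t * 1.0
    z = np.array([t * 1.0 + rng.normal(scale=0.5)])
    x, p = kalman_step(x, p, z, f, h, q, r)
print("final estimate:", x, "(true final position: 10.0)")
```

    A poorly chosen initial distribution (biased `x`, overconfident `p`) degrades the early estimates, which is why principled construction of this prior, the topic of the cited paper, matters.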

  4. Role of strategies and prior exposure in mental rotation.

    Science.gov (United States)

    Cherney, Isabelle D; Neff, Nicole L

    2004-06-01

    The purpose of these two studies was to examine sex differences in strategy use and the effect of prior exposure on the performance on Vandenberg and Kuse's 1978 Mental Rotation Test. A total of 152 participants completed the spatial task and self-reported their strategy use. Consistent with previous studies, men outperformed women. Strategy usage did not account for these differences, although guessing did. Previous exposure to the Mental Rotation Test, American College Test scores and frequent computer or video game play predicted performance on the test. These results suggest that prior exposure to spatial tasks may provide cues to improve participants' performance.

  5. Phase transitions in restricted Boltzmann machines with generic priors

    Science.gov (United States)

    Barra, Adriano; Genovese, Giuseppe; Sollich, Peter; Tantari, Daniele

    2017-10-01

    We study generalized restricted Boltzmann machines with generic priors for units and weights, interpolating between Boolean and Gaussian variables. We present a complete analysis of the replica symmetric phase diagram of these systems, which can be regarded as generalized Hopfield models. We underline the role of the retrieval phase for both inference and learning processes and we show that retrieval is robust for a large class of weight and unit priors, beyond the standard Hopfield scenario. Furthermore, we show how the paramagnetic phase boundary is directly related to the optimal size of the training set necessary for good generalization in a teacher-student scenario of unsupervised learning.

  6. C. A. Meredith, A. N. Prior, and Possible Worlds

    DEFF Research Database (Denmark)

    Hasle, Per Frederik Vilhelm; Rybaříková, Zuzana

    …their understanding of the relevant formal representations and indeed their general approach to modal logic considerably differed. These differences should be pointed out in order to more precisely appreciate the contribution of each of these authors. To neglect the differences could cause the misinterpretation of Meredith’s and Prior’s work. On the one hand, it might cause corruption of Meredith’s system of logic and lead to paradoxes, as Prior pointed out in ‘Modal Logic with Functorial Variables and a Contingent Constant’. On the other hand, considering Prior as a mere follower of Meredith could cause…

  7. Ultrasonic techniques validation on shell

    International Nuclear Information System (INIS)

    Navarro, J.; Gonzalez, E.

    1998-01-01

    Due to the results obtained in several international RRTs during the 1980s, it has been necessary to prove the effectiveness of NDT techniques. For this reason it has been imperative to verify the goodness of the inspection procedure over different mock-ups, representative of the inspection area and containing real defects. Prior to the revision of the inspection procedure, and with the aim of updating the techniques used, it is good practice to perform different scans on the mock-ups until validation is achieved. It is at this point that all the parameters of the inspection at hand are defined (transducer, step, scan direction, ...) and, more importantly, that the technique to be used for the area requiring inspection is demonstrated to be suitable for evaluating the degradation phenomena that could appear. (Author)

  8. Challenges in Implementing National Systems of Competency Validation with Regard to Adult Learning Professionals: Perspectives from Romania and India

    Science.gov (United States)

    Sava, Simona Lidia; Shah, S. Y.

    2015-01-01

    Validation of prior learning (VPL), also referred to as recognition, validation and accreditation of prior learning (RVA), is becoming an increasingly important political issue at both European and international levels. In 2012, the European Council, the UNESCO Institute for Lifelong Learning (UIL) and the Organisation for Economic Co-operation…

  9. The Prior Can Often Only Be Understood in the Context of the Likelihood

    Directory of Open Access Journals (Sweden)

    Andrew Gelman

    2017-10-01

    Full Text Available A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.
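
    The tension the abstract describes can be made concrete with a toy conjugate model (an illustrative sketch, not an example from the paper): whether a prior counts as "informative" only becomes clear once we know how much information the likelihood itself contributes.

```python
# Hedged sketch (not from the paper): the same prior can be decisive or
# nearly irrelevant depending on how much information the likelihood carries.
# Conjugate Beta-Binomial updating keeps the arithmetic exact.

def posterior_mean(alpha, beta, successes, trials):
    """Posterior mean of theta under a Beta(alpha, beta) prior
    and a Binomial(trials, theta) likelihood."""
    return (alpha + successes) / (alpha + beta + trials)

flat = (1, 1)      # uniform prior
strong = (50, 50)  # prior concentrated near 0.5

# Weak likelihood: 7 successes in 10 trials.
small_flat = posterior_mean(*flat, 7, 10)      # 8/12   ~ 0.667
small_strong = posterior_mean(*strong, 7, 10)  # 57/110 ~ 0.518, pulled to 0.5

# Strong likelihood: 700 successes in 1000 trials.
big_flat = posterior_mean(*flat, 700, 1000)      # 701/1002 ~ 0.6996
big_strong = posterior_mean(*strong, 700, 1000)  # 750/1100 ~ 0.6818
```

    With ten observations the two priors give clearly different answers; with a thousand they nearly agree, which is one concrete sense in which a prior can only be judged relative to the likelihood.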

  10. Extending Prior Posts in Dyadic Online Text Chat

    Science.gov (United States)

    Tudini, Vincenza

    2015-01-01

    This study explores whether chat users are able to extend prior, apparently completed posts in the dyadic online text chat context. Dyadic text chat has a unique turn-taking system, and most chat software does not permit users to monitor one another's written messages-in-progress. This is likely to impact on their use of online extensions as an…

  11. 13 CFR 305.14 - Occupancy prior to completion.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Occupancy prior to completion. 305.14 Section 305.14 Business Credit and Assistance ECONOMIC DEVELOPMENT ADMINISTRATION, DEPARTMENT OF... the Recipient's risk and must follow the requirements of local and State law. ...

  12. Do managers manipulate earnings prior to management buyouts?

    NARCIS (Netherlands)

    Mao, Yaping; Renneboog, Luc

    2015-01-01

    To address the question as to whether managers intending to purchase their company by means of a levered buyout transaction manipulate earnings in order to buy their firm on the cheap, we study the different types of earnings management prior to the transaction: accrual management, real earnings

  13. Do Managers Manipulate Earnings Prior to Management Buyouts?

    NARCIS (Netherlands)

    Mao, Y.; Renneboog, L.D.R.

    2013-01-01

    Abstract: To address the question as to whether managers manipulate accounting numbers downwards prior to management buyouts (MBOs), we implement an industry-adjusted buyout-specific approach and receive an affirmative answer. In UK buyout companies, negative earnings manipulation (understating the

  14. Recognition of Prior Learning as an integral component of ...

    African Journals Online (AJOL)

    This is irrespective of whether that learning has been acquired through unstructured learning, performance development, off-the-job assessment, or skills and knowledge that meet workplace needs but have been gained through various previous learning experiences. The concept Recognition of Prior Learning (RPL) is ...

  15. Investigation into alternative sludge conditioning prior to dewatering

    CSIR Research Space (South Africa)

    Smollen, M

    1997-01-01

    Full Text Available have proven that the mixture of char and a small quantity of polyelectrolyte (0.5 to 1kg per ton of dry solids), used as a conditioner prior to centrifugation and filtration tests, produced cake solids concentration superior to that obtained by using...

  16. An Adaptively Accelerated Bayesian Deblurring Method with Entropy Prior

    Directory of Open Access Journals (Sweden)

    Yong-Hoon Kim

    2008-05-01

    Full Text Available The development of an efficient, adaptively accelerated iterative deblurring algorithm based on Bayesian statistical concepts is reported. The entropy of the image is used as the prior distribution, and instead of the additive form used in conventional acceleration methods, an exponent form of the relaxation constant is used for acceleration. The proposed method is therefore hereafter called adaptively accelerated maximum a posteriori with entropy prior (AAMAPE). Based on empirical observations in different experiments, the exponent is computed adaptively using first-order derivatives of the deblurred images from the previous two iterations. This exponent improves the speed of the AAMAPE method in the early stages and ensures stability in the later stages of iteration. The AAMAPE method also enforces the constraints of nonnegativity and flux conservation. The paper discusses the fundamental idea of Bayesian image deblurring with entropy as the prior, and analyzes the superresolution and noise-amplification characteristics of the proposed method. The experimental results show that the proposed AAMAPE method gives lower RMSE and higher SNR in 44% fewer iterations than the nonaccelerated maximum a posteriori with entropy prior (MAPE) method. Moreover, AAMAPE followed by wavelet Wiener filtering gives better results than state-of-the-art methods.
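
    The exponent-based acceleration idea can be sketched in simplified form. The following 1-D toy deconvolution is an assumption-laden illustration, not the authors' implementation: the kernel, the test signal, and a fixed exponent `q` are invented here, whereas AAMAPE adapts the exponent per iteration. It shows a multiplicative update raised to an exponent, with the nonnegativity and flux-conservation constraints the abstract mentions.

```python
# Hedged 1-D sketch: a multiplicative (Richardson-Lucy-style) update raised
# to an exponent q. q = 1 is the plain update; q > 1 accelerates. The
# multiplicative form keeps estimates nonnegative, and flux is restored by
# renormalisation after each accelerated step.
import numpy as np

def blur(a, h=(0.25, 0.5, 0.25)):
    """Circular convolution with a normalised symmetric kernel."""
    return h[0] * np.roll(a, 1) + h[1] * a + h[2] * np.roll(a, -1)

def deblur(y, iters, q):
    x = np.full_like(y, y.sum() / y.size)  # flat start with correct flux
    for _ in range(iters):
        ratio = blur(y / blur(x))          # multiplicative correction term
        x = x * ratio ** q                 # exponent-form relaxation
        x *= y.sum() / x.sum()             # flux-conservation constraint
    return x

true_x = np.array([0.0, 0.0, 10.0, 0.0, 0.0])
y = blur(true_x)                           # noiseless blurred observation

x_plain = deblur(y, iters=20, q=1.0)       # non-accelerated baseline
x_accel = deblur(y, iters=20, q=1.3)       # exponent-accelerated variant
```

    Both runs sharpen the blurred peak while conserving total flux; the accelerated variant applies a larger effective step per iteration.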

  17. Structure of NCI Cooperative Groups Program Prior to NCTN

    Science.gov (United States)

    Learn how the National Cancer Institute’s Cooperative Groups Program was structured prior to its being replaced by NCI’s National Clinical Trials Network (NCTN). The NCTN gives funds and other support to cancer research organizations to conduct cancer clinical trials.

  18. Reference Priors For Non-Normal Two-Sample Problems

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

    The reference prior algorithm (Berger and Bernardo, 1992) is applied to location-scale models with any regular sampling density. A number of two-sample problems are analyzed in this general context, extending the difference, ratio and product of Normal means problems outside Normality, while explicitly

  19. Reference Priors for the General Location-Scale Model

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

    The reference prior algorithm (Berger and Bernardo 1992) is applied to multivariate location-scale models with any regular sampling density, where we establish the irrelevance of the usual assumption of Normal sampling if our interest is in either the location or the scale. This result immediately
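
    In the univariate case, the result this line of work builds on has a well-known closed form; the following is the standard location-scale reference prior, stated here as background rather than quoted from the paper:

```latex
% Location-scale model: p(x \mid \mu, \sigma) = \frac{1}{\sigma}\, f\!\left(\frac{x - \mu}{\sigma}\right)
% For a regular density f, with either \mu or \sigma as the parameter of
% interest, the Berger--Bernardo reference prior is the usual independence
% (right Haar) prior -- notably, it does not depend on the particular f:
\pi(\mu, \sigma) \propto \frac{1}{\sigma}, \qquad \mu \in \mathbb{R},\ \sigma > 0.
```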

  20. Prior-to-Exam: What Activities Enhance Performance?

    Science.gov (United States)

    Rhoads, C. J.; Healy, Therese

    2013-01-01

    Can instructors impact their student performance by recommending an activity just prior to taking an exam? In this study, college students were randomly assigned to one of three treatment groups (study, exercise, or meditation) or a control group. Each group was given two different types of tests; a traditional concept exam, and a non-traditional…

  1. 5 CFR 6401.103 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    .... 6401.103 Section 6401.103 Administrative Personnel ENVIRONMENTAL PROTECTION AGENCY SUPPLEMENTAL STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE ENVIRONMENTAL PROTECTION AGENCY § 6401.103 Prior approval... her Deputy Ethics Official before engaging in outside employment, with or without compensation, that...

  2. Anxiety and blood pressure prior to dental treatment.

    NARCIS (Netherlands)

    Benjamins, C.; Schuurs, A.H.; Asscheman, H.; Hoogstraten, J.

    1990-01-01

    Assessed dental anxiety and blood pressure immediately prior to a dental appointment in 24 patients attending a university dental clinic or a clinic for anxious dental patients in the Netherlands. Blood pressure was assessed by 2 independent methods, and the interchangeability of the blood-pressure

  3. Morbidity prior to a Diagnosis of Sleep-Disordered Breathing

    DEFF Research Database (Denmark)

    Jennum, Poul; Ibsen, Rikke Falkner; Kjellberg, Jakob

    2013-01-01

    Sleep-disordered breathing (SDB) causes burden to the sufferer, the healthcare system, and society. Most studies have focused on cardiovascular diseases (CVDs) after a diagnosis of obstructive sleep apnea (OSA) or obesity hypoventilation syndrome (OHS); however, the overall morbidity prior...

  4. Recognising Health Care Assistants' Prior Learning through a Caring Ideology

    Science.gov (United States)

    Sandberg, Fredrik

    2010-01-01

    This article critically appraises a process of recognising prior learning (RPL) using analytical tools from Habermas' theory of communicative action. The RPL process is part of an in-service training program for health care assistants where the goal is to become a licensed practical nurse. Data about the RPL process were collected using interviews…

  5. Source-specific Informative Prior for i-Vector Extraction

    DEFF Research Database (Denmark)

    Shepstone, Sven Ewan; Lee, Kong Aik; Li, Haizhou

    2015-01-01

    An i-vector is a low-dimensional fixed-length representation of a variable-length speech utterance, and is defined as the posterior mean of a latent variable conditioned on the observed feature sequence of an utterance. The assumption is that the prior for the latent variable is non...
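
    The definition above (posterior mean of a latent variable) can be sketched with synthetic matrices. Everything below — the dimensions, `T`, the statistics `N` and `F`, and the informative prior mean `mu0` — is an invented illustration of how a non-zero prior mean shifts the posterior mean; it is not the paper's model.

```python
# Hedged numpy sketch of a posterior-mean latent variable under a Gaussian
# prior, in the spirit of i-vector extraction. All matrices are synthetic
# stand-ins, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
D, R = 6, 3                      # feature-space and latent ("i-vector") dims
T = rng.normal(size=(D, R))      # total-variability matrix (assumed)
Sigma_inv = np.eye(D)            # residual precision (assumed identity)
N = 5.0 * np.eye(D)              # zero-order statistics (assumed)
F = rng.normal(size=D)           # first-order statistics (assumed)

A = T.T @ Sigma_inv @ (N @ T)    # precision contribution from the likelihood
b = T.T @ Sigma_inv @ F          # linear term from the likelihood

# Standard prior N(0, I): the classical posterior mean.
w_standard = np.linalg.solve(np.eye(R) + A, b)

# Informative prior N(mu0, I): the posterior mean shifts toward mu0.
mu0 = np.array([1.0, -0.5, 0.25])
w_informative = np.linalg.solve(np.eye(R) + A, b + mu0)
```

    By linearity, the informative prior shifts the posterior mean by exactly `(I + A)^{-1} mu0` relative to the zero-mean case.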

  6. Risk for malnutrition in patients prior to vascular surgery

    NARCIS (Netherlands)

    Beek, Lies Ter; Banning, Louise B D; Visser, Linda; Roodenburg, Jan L N; Krijnen, Wim P; van der Schans, Cees P; Pol, Robert A; Jager-Wittenaar, Harriët

    2017-01-01

    BACKGROUND: Malnutrition is an important risk factor for adverse post-operative outcomes. The prevalence of risk for malnutrition is unknown in patients prior to vascular surgery. We aimed to assess prevalence and associated factors of risk for malnutrition in this patient group. METHODS: Patients

  7. Imprecision and prior-data conflict in generalized Bayesian inference

    NARCIS (Netherlands)

    Walter, Gero; Augustin, T. (Thomas)

    2009-01-01

    A great advantage of imprecise probability models over models based on precise, traditional probabilities is the potential to reflect the amount of knowledge they stand for. Consequently, imprecise probability models promise to offer a vivid tool for handling situations of prior-data conflict in
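
    How an imprecise model "reflects the amount of knowledge it stands for" under prior-data conflict can be illustrated with a set of Beta priors (a generic imprecise-Beta sketch with invented parameter ranges, not the model of the paper): data that conflict with the prior set widen the interval of posterior means, so surprise shows up as increased imprecision.

```python
# Hedged sketch: generalized Bayesian updating with a *set* of Beta priors
# Beta(s*t, s*(1-t)), prior mean t in [t_lo, t_hi], strength s in [s_lo, s_hi].
# Ranges below are illustrative, not from the paper.

def posterior_mean_bounds(s_range, t_range, successes, n):
    """Range of posterior means over the prior set. Corners suffice because
    the posterior mean (s*t + k)/(s + n) is monotone in s and in t."""
    means = [(s * t + successes) / (s + n)
             for s in s_range for t in t_range]
    return min(means), max(means)

s_range, t_range = (2.0, 10.0), (0.7, 0.8)

# Data agreeing with the prior set: 7 successes in 10 trials.
lo_a, hi_a = posterior_mean_bounds(s_range, t_range, 7, 10)    # [0.70, 0.75]

# Conflicting data: 1 success in 10 trials.
lo_c, hi_c = posterior_mean_bounds(s_range, t_range, 1, 10)    # [0.20, 0.45]

width_agree, width_conflict = hi_a - lo_a, hi_c - lo_c
```

    With agreeing data the posterior-mean interval stays narrow; under conflict it widens markedly, which is the diagnostic behaviour the abstract points to.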

  8. 5 CFR 7901.102 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... ETHICAL CONDUCT FOR EMPLOYEES OF THE TENNESSEE VALLEY AUTHORITY § 7901.102 Prior approval for outside... or designee. The written request shall be submitted through the employee's supervisor or human resource office and shall, at a minimum, identify the employer or other person for whom the services are to...

  9. The Transformation of Higher Education through Prior Learning Assessment

    Science.gov (United States)

    Kamenetz, Anya

    2011-01-01

    Providing college credit for prior learning is nothing new. The American Council on Education's Credit Recommendation Service (CREDIT), the largest national program making credit recommendations for workplace and other training, dates to 1974. Several colleges that specialize in the practice--Excelsior and Empire State in New York, Thomas Edison…

  10. Simultaneous tomographic reconstruction and segmentation with class priors

    DEFF Research Database (Denmark)

    Romanov, Mikhail; Dahl, Anders Bjorholm; Dong, Yiqiu

    2015-01-01

    are combined to produce a reconstruction that is identical to the segmentation. We consider instead a hybrid approach that simultaneously produces both a reconstructed image and segmentation. We incorporate priors about the desired classes of the segmentation through a Hidden Markov Measure Field Model, and we...

  11. 47 CFR 25.118 - Modifications not requiring prior authorization.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Modifications not requiring prior authorization. 25.118 Section 25.118 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES SATELLITE COMMUNICATIONS Applications and Licenses General Application Filing Requirements § 25...

  12. The epizootiology of the highly pathogenic avian influenza prior to ...

    African Journals Online (AJOL)

    The epizootiology of the highly pathogenic avian influenza prior to the anticipated pandemic of the early twenty first century. ... Transmission of highly pathogenic H5N1 from domestic fowls back to migratory waterfowl in western China has increased the geographic spread. This has grave consequences for the poultry ...

  13. 18 CFR 415.51 - Prior non-conforming structures.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 2 2010-04-01 2010-04-01 false Prior non-conforming structures. 415.51 Section 415.51 Conservation of Power and Water Resources DELAWARE RIVER BASIN COMMISSION... damaged by any means, including a flood, to the extent of 50 percent or more of its market value at that...

  14. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, Mathijs; Frehen, Rik; Schotman, Peter; Bauer, Rob

    We propose a hybrid approach for estimating beta that shrinks rolling window estimates towards firm-specific priors motivated by economic theory. Our method yields superior forecasts of beta that have important practical implications. First, hybrid betas carry a significant price of risk in the

  15. Estimating Security Betas Using Prior Information Based on Firm Fundamentals

    NARCIS (Netherlands)

    Cosemans, Mathijs; Frehen, Rik; Schotman, Peter; Bauer, Rob

    2016-01-01

    We propose a hybrid approach for estimating beta that shrinks rolling window estimates toward firm-specific priors motivated by economic theory. Our method yields superior forecasts of beta that have important practical implications. First, unlike standard rolling window betas, hybrid betas carry a
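
    Shrinking a rolling-window estimate toward a prior can be sketched as a simple precision-weighted average (an illustrative stand-in: the paper's estimator and weights are more elaborate, and all numbers here are invented).

```python
# Hedged sketch of shrinking a noisy rolling-window beta toward a
# fundamentals-based prior, weighting each by its precision.

def hybrid_beta(beta_rolling, se_rolling, beta_prior, se_prior):
    """Precision-weighted combination: the less noisy input gets more weight
    (precision = inverse squared standard error)."""
    prec_roll, prec_prior = 1.0 / se_rolling**2, 1.0 / se_prior**2
    w = prec_roll / (prec_roll + prec_prior)
    return w * beta_rolling + (1.0 - w) * beta_prior

# Noisy rolling-window estimate vs a more precise fundamentals-implied prior.
b = hybrid_beta(beta_rolling=1.6, se_rolling=0.4, beta_prior=1.0, se_prior=0.2)
# b = 0.2 * 1.6 + 0.8 * 1.0 = 1.12: between the two, closer to the prior.
```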

  16. Incorporating priors for EEG source imaging and connectivity analysis

    Directory of Open Access Journals (Sweden)

    Xu eLei

    2015-08-01

    Full Text Available Electroencephalography source imaging (ESI) is a useful technique to localize the generators from a given scalp electric measurement and to investigate the temporal dynamics of large-scale neural circuits. By introducing reasonable priors from other modalities, ESI reveals the most probable sources and communication structures at every moment in time. Here, we review the available priors from such techniques as magnetic resonance imaging (MRI), functional MRI (fMRI), and positron emission tomography (PET). Each modality's specific contribution is analyzed from the perspective of source reconstruction. For spatial priors, techniques such as EEG-correlated fMRI, temporally coherent networks, and resting-state fMRI are systematically introduced into the ESI. Moreover, fiber tracking (diffusion tensor imaging, DTI) and neuro-stimulation techniques (transcranial magnetic stimulation, TMS) are also introduced as potential priors, which can help to draw inferences about the neuroelectric connectivity in the source space. We conclude that combining EEG source imaging with other complementary modalities is a promising approach towards the study of brain networks in cognitive and clinical neurosciences.

  17. Nonextensive Entropy, Prior PDFs and Spontaneous Symmetry Breaking

    OpenAIRE

    Shafee, Fariel

    2008-01-01

    We show that using nonextensive entropy can lead to spontaneous symmetry breaking when a parameter changes its value from that applicable for a symmetric domain, as in field theory. We give the physical reasons and also show that even for symmetric Dirichlet priors, such a definition of the entropy and the parameter value can lead to asymmetry when entropy is maximized.

  18. Bayesian nonparametric system reliability using sets of priors

    NARCIS (Netherlands)

    Walter, G.M.; Aslett, L.J.M.; Coolen, F.P.A.

    2016-01-01

    An imprecise Bayesian nonparametric approach to system reliability with multiple types of components is developed. This allows modelling partial or imperfect prior knowledge on component failure distributions in a flexible way through bounds on the functioning probability. Given component level test

  19. Effects of prior interpretation on situation assessment in crime analysis

    NARCIS (Netherlands)

    Kerstholt, J.H.; Eikelboom, A.R.

    2007-01-01

    Purpose - To investigate the effects of prior case knowledge on the judgement of crime analysts. Design/methodology/approach - Explains that crime analysts assist when an investigation team has converged/agreed on a probable scenario, attributes this convergence to group-think, but points out this

  20. Prior experience, cognitive perceptions and psychological skills of ...

    African Journals Online (AJOL)

    The objective of this study was to investigate the interaction between the prior experience, cognitive perceptions and psychological skills of senior rugby players in South Africa. The study population included 139 trans-national players, 106 provincial players and 95 club rugby players (N=340). A cross-sectional design was ...

  1. Mountain bike racing - the influence of prior glycogen-inducing ...

    African Journals Online (AJOL)

    Objective. To investigate the effect of pre-exercise glutamine supplementation and the influence of a prior acute bout of glycogen-reducing exercise on the general stress and immune response to acute high-intensity cycling. Design. Randomised, double-blind, cross-over supplementation study. Setting and intervention.

  2. Preparing learners with partly incorrect intuitive prior knowledge for learning

    Directory of Open Access Journals (Sweden)

    Andrea eOhst

    2014-07-01

    Full Text Available Learners sometimes have incoherent and fragmented intuitive prior knowledge that is (partly) ‘incompatible’ with the to-be-learned contents. Such knowledge in pieces can cause conceptual disorientation and cognitive overload while learning. We hypothesized that a pre-training intervention providing a generalized schema as a structuring framework for such knowledge in pieces would support (re)organizing-processes of prior knowledge and thus reduce unnecessary cognitive load during subsequent learning. Fifty-six student teachers participated in the experiment. A framework group underwent a pre-training intervention providing a generalized, categorical schema for categorizing primary learning strategies and related but different strategies as a cognitive framework for (re-)organizing their prior knowledge. Our control group received comparable factual information but no framework. Afterwards, all participants learned about primary learning strategies. The framework group claimed to possess higher levels of interest and self-efficacy, achieved higher learning outcomes, and learned more efficiently. Hence, providing a categorical framework can help overcome the barrier of incorrect prior knowledge in pieces.

  3. Preparing learners with partly incorrect intuitive prior knowledge for learning

    Science.gov (United States)

    Ohst, Andrea; Fondu, Béatrice M. E.; Glogger, Inga; Nückles, Matthias; Renkl, Alexander

    2014-01-01

    Learners sometimes have incoherent and fragmented intuitive prior knowledge that is (partly) “incompatible” with the to-be-learned contents. Such knowledge in pieces can cause conceptual disorientation and cognitive overload while learning. We hypothesized that a pre-training intervention providing a generalized schema as a structuring framework for such knowledge in pieces would support (re)organizing-processes of prior knowledge and thus reduce unnecessary cognitive load during subsequent learning. Fifty-six student teachers participated in the experiment. A framework group underwent a pre-training intervention providing a generalized, categorical schema for categorizing primary learning strategies and related but different strategies as a cognitive framework for (re-)organizing their prior knowledge. Our control group received comparable factual information but no framework. Afterwards, all participants learned about primary learning strategies. The framework group claimed to possess higher levels of interest and self-efficacy, achieved higher learning outcomes, and learned more efficiently. Hence, providing a categorical framework can help overcome the barrier of incorrect prior knowledge in pieces. PMID:25071638

  4. 5 CFR 6701.106 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE GENERAL SERVICES ADMINISTRATION § 6701.106 Prior approval... to be performed; (4) The name and address of the prospective outside employer for which work will be... affects the outside employer and will disqualify himself from future participation in matters that could...

  5. 5 CFR 7401.102 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE MERIT SYSTEMS PROTECTION BOARD § 7401.102 Prior approval... written approval from the employee's supervisor and the concurrence of the Designated Agency Ethics... name of the employer or organization; (ii) The nature of the legal activity or other work to be...

  6. 5 CFR 7101.102 - Prior approval for outside employment.

    Science.gov (United States)

    2010-01-01

    ... STANDARDS OF ETHICAL CONDUCT FOR EMPLOYEES OF THE NATIONAL LABOR RELATIONS BOARD § 7101.102 Prior approval... forth, at a minimum: (i) The name of the employer; (ii) The nature of the legal activity or other work... designee may consult with the Designated Agency Ethics Official to ensure that the request for outside...

  7. Effects of regularisation priors on dynamic PET Data

    International Nuclear Information System (INIS)

    Caldeira, Liliana; Scheins, Juergen; Silva, Nuno da; Gaens, Michaela; Shah, N Jon

    2014-01-01

    Dynamic PET provides temporal information about tracer uptake. However, each PET frame usually has low statistics, resulting in noisy images. The goal is to study the effects of prior regularisation on dynamic PET data. Quantification and noise in the image domain and the time domain, as well as the impact on parametric images, are assessed.

  8. Exploiting prior knowledge of English, Mathematics and Chemistry ...

    African Journals Online (AJOL)

    This paper explores prior knowledge with the view to enhancing the study of French. Juxtaposing sentences in French and English to underscore syntactic differences and similarities, the paper attributes numerical values to nouns and adjectives in French in order to demonstrate the mathematical imbalance and lack of ...

  9. Using Students' Prior Knowledge to Teach Social Penetration Theory

    Science.gov (United States)

    Chornet-Roses, Daniel

    2010-01-01

    Bransford, Brown, and Cocking argue that acknowledging students' prior ideas and beliefs about a subject and incorporating them into the classroom enhances student learning. This article presents an activity which serves to hone three student learning outcomes: analysis of communication, inductive reasoning, and self-reflection. The goal of this…

  10. 40 CFR 266.101 - Management prior to burning.

    Science.gov (United States)

    2010-07-01

    ... storage units that store mixtures of hazardous waste and the primary fuel to the boiler or industrial... MANAGEMENT FACILITIES Hazardous Waste Burned in Boilers and Industrial Furnaces § 266.101 Management prior to burning. (a) Generators. Generators of hazardous waste that is burned in a boiler or industrial furnace...

  11. The Cost of Prior Restraint: "U. S. v. The Progressive."

    Science.gov (United States)

    Soloski, John; Dyer, Carolyn Stewart

    Increased litigation and rising litigation costs threaten the future of newspapers and magazines. A case study was conducted to determine the costs and effects of "United States v. 'The Progressive,'" a prior restraint case over the publication in 1979 of an article on the hydrogen bomb. "The Progressive," which operates at a…

  12. Class II correction prior to orthodontics with the carriere distalizer.

    Science.gov (United States)

    McFarlane, Bruce

    2013-01-01

    Class II correction is a challenge in orthodontics with many existing devices being complex, too compliance-driven, or too prone to breakage. The Carriere Distalizer allows for straightforward Class II correction prior to orthodontics (fixed or clear aligners) at a time when no other mechanics interfere, and compliance is at its best.

  13. Prior Exposure and Educational Environment towards Entrepreneurial Intention

    Directory of Open Access Journals (Sweden)

    Karla Soria-Barreto

    2017-07-01

    Full Text Available This research is based on the responses to a questionnaire applied to 351 students of business management in Chile and Colombia. Through the analysis of structural equations based on Ajzen’s model, we found that entrepreneurial education, the university environment, and prior entrepreneurial exposure are mediated by the factors of Ajzen’s model to generate entrepreneurial intention in higher education students. The results show that entrepreneurial education strengthens the perceived control of behavior and, with it, albeit in a differentiated way, the entrepreneurial intention of men and women. The university environment affects entrepreneurial intention through attitude towards entrepreneurship; and finally, work experience, used as one of the variables that measure prior entrepreneurial exposure, explains entrepreneurial intention inversely through the subjective norms. We found that gender has a moderating effect on perceived control of behavior and entrepreneurial education. The scarce studies on the impact of the university environment, and the mixed results on entrepreneurial education and prior entrepreneurial exposure toward entrepreneurial intention, show the necessity for further research. A second contribution is the opportunity to present new evidence about the relationship between university environment, entrepreneurial education, and prior exposure in developing countries of South America, including the gender (moderator) effect on entrepreneurial intention. It is important to note that most of the research in this area applies to developed countries, and some scholars suggest that extrapolating the results is not convenient.

  14. Prior implicit knowledge shapes human threshold for orientation noise

    DEFF Research Database (Denmark)

    Christensen, Jeppe H; Bex, Peter J; Fiser, József

    2015-01-01

    , resulting in an image-class-specific threshold that changes the shape and position of the dipper function according to image class. These findings do not fit a filter-based feed-forward view of orientation coding, but can be explained by a process that utilizes an experience-based perceptual prior...

  15. Hypofractionated stereotactic radiotherapy in five daily fractions for post-operative surgical cavities in brain metastases patients with and without prior whole brain radiation.

    Science.gov (United States)

    Al-Omair, Ameen; Soliman, Hany; Xu, Wei; Karotki, Aliaksandr; Mainprize, Todd; Phan, Nicolas; Das, Sunit; Keith, Julia; Yeung, Robert; Perry, James; Tsao, May; Sahgal, Arjun

    2013-12-01

    Our purpose was to report efficacy of hypofractionated cavity stereotactic radiotherapy (HCSRT) in patients with and without prior whole brain radiotherapy (WBRT). 32 surgical cavities in 30 patients (20 patients/21 cavities had no prior WBRT and 10 patients/11 cavities had prior WBRT) were treated with image-guided linac stereotactic radiotherapy. 7 of the 10 prior WBRT patients had "resistant" local disease given prior surgery, post-operative WBRT and a re-operation, followed by salvage HCSRT. The clinical target volume was the post-surgical cavity, and a 2-mm margin was applied as the planning target volume. The median total dose was 30 Gy (range: 25-37.5 Gy) in 5 fractions. In the no prior and prior WBRT cohorts, the median follow-up was 9.7 months (range: 3.0-23.6) and 15.3 months (range: 2.9-39.7), the median survival was 23.6 months and 39.7 months, and the 1-year cavity local recurrence progression-free survival (LRFS) was 79% and 100%, respectively. At 18 months the LRFS dropped to 29% in the prior WBRT cohort. Grade 3 radiation necrosis occurred in 3 prior WBRT patients. We report favorable outcomes with HCSRT, and well-selected patients with prior WBRT and "resistant" disease may have an extended survival favoring aggressive salvage HCSRT at a moderate risk of radiation necrosis.

  16. Clinical utility of carotid duplex ultrasound prior to cardiac surgery.

    Science.gov (United States)

    Lin, Judith C; Kabbani, Loay S; Peterson, Edward L; Masabni, Khalil; Morgan, Jeffrey A; Brooks, Sara; Wertella, Kathleen P; Paone, Gaetano

    2016-03-01

    Clinical utility and cost-effectiveness of carotid duplex examination prior to cardiac surgery have been questioned by the multidisciplinary committee creating the 2012 Appropriate Use Criteria for Peripheral Vascular Laboratory Testing. We report the clinical outcomes and postoperative neurologic symptoms in patients who underwent carotid duplex ultrasound prior to open heart surgery at a tertiary institution. Using the combined databases from our clinical vascular laboratory and the Society of Thoracic Surgery, a retrospective analysis of all patients who underwent carotid duplex ultrasound within 13 months prior to open heart surgery from March 2005 to March 2013 was performed. The outcomes between those who underwent carotid duplex scanning (group A) and those who did not (group B) were compared. Among 3233 patients in the cohort who underwent cardiac surgery, 515 (15.9%) patients underwent a carotid duplex ultrasound preoperatively, and 2718 patients did not (84.1%). Among the patients who underwent carotid screening vs no screening, there was no statistically significant difference in the risk factors of cerebrovascular disease (10.9% vs 12.7%; P = .26), prior stroke (8.2% vs 7.2%; P = .41), and prior transient ischemic attack (2.9% vs 3.3%; P = .24). For those undergoing isolated coronary artery bypass grafting (CABG), 306 (17.8%) of 1723 patients underwent preoperative carotid duplex ultrasound. Among patients who had carotid screening prior to CABG, the incidence of carotid disease was low: 249 (81.4%) had minimal or mild stenosis (duplex scanning and those who did not. Primary outcomes of patients who underwent open heart surgery also showed no difference in the perioperative mortality (5.1% vs 6.9%; P = .14) and stroke (2.6% vs 2.4%; P = .85) between patients undergoing preoperative duplex scanning and those who did not. Operative intervention of severe carotid stenosis prior to isolated CABG occurred in 2 of the 17 patients (11.8%) identified who

  17. Lesson 6: Signature Validation

    Science.gov (United States)

    Checklist items 13 through 17 are grouped under the Signature Validation Process, and represent CROMERR requirements that the system must satisfy as part of ensuring that electronic signatures it receives are valid.

  18. Development and validation of a rapid, selective, and sensitive LC-MS/MS method for simultaneous determination of D- and L-amino acids in human serum: application to the study of hepatocellular carcinoma.

    Science.gov (United States)

    Han, Minlu; Xie, Mengyu; Han, Jun; Yuan, Daoyi; Yang, Tian; Xie, Ying

    2018-04-01

    A validated liquid chromatography-tandem mass spectrometry method was developed for the simultaneous determination of D- and L-amino acids in human serum. Under the optimum conditions, except for DL-proline, L-glutamine, and D-lysine, the enantioseparation of the other 19 enantiomeric pairs of proteinogenic amino acids and nonchiral glycine was achieved with a CROWNPAK CR-I(+) chiral column within 13 min. The lower limits of quantitation for L-amino acids (including glycine) and D-amino acids were 5-56.25 μM and 0.625-500 nM, respectively, in human serum. The intraday precision and interday precision for all the analytes were less than 15%, and the accuracy ranged from -12.84% to 12.37% at three quality control levels. The proposed method, exhibiting high rapidity, enantioresolution, and sensitivity, was successfully applied to the quantification of D- and L-amino acid levels in serum from hepatocellular carcinoma patients and healthy individuals. The serum concentrations of L-arginine, L-isoleucine, L-aspartate, L-tryptophan, L-alanine, L-methionine, L-serine, glycine, L-valine, L-leucine, L-phenylalanine, L-threonine, D-isoleucine, D-alanine, D-glutamate, D-glutamine, D-methionine, and D-threonine were significantly reduced in the hepatocellular carcinoma patients compared with the healthy individuals (P hepatocellular carcinoma research. Graphical abstract Simultaneous determination of D- and L-amino acids in human serum from hepatocellular carcinoma patients and healthy individuals. AA amino acid, HCC hepatocellular carcinoma, LC liquid chromatography, MS/MS tandem mass spectrometry, NC normal control, TIC total ion chromatogram.

  19. 'The words will pass with the blowing wind': staff and parent views of the deferred consent process, with prior assent, used in an emergency fluids trial in two African hospitals.

    Directory of Open Access Journals (Sweden)

    Sassy Molyneux

    Full Text Available To document and explore the views and experiences of key stakeholders regarding the consent procedures of an emergency research clinical trial examining immediate fluid resuscitation strategies, and to discuss the implications for similar trials in future. A social science sub-study of the FEAST (Fluid Expansion As Supportive Therapy) trial. Interviews were held with trial team members (n = 30), health workers (n = 15), and parents (n = 51) from two purposively selected hospitals in Soroti, Uganda, and Kilifi, Kenya. Overall, deferred consent with prior assent was seen by staff and parents as having the potential to protect the interests of both patients and researchers, and to avoid delays in starting treatment. An important challenge is that the validity of verbal assent is undermined when the initial information provided is inadequate or poorly understood. This concern needs to be balanced against the possibility that full prior consent on admission potentially causes harm through introducing delays. Full prior consent also potentially imposes worries on parents that clinicians are uncertain about how to proceed and that clinicians want to absolve themselves of any responsibility for the child's outcome (some parents' interpretation of the need for signed consent). Voluntariness is clearly compromised for both verbal assent and full prior consent in a context of such vulnerability and stress. Further challenges in obtaining verbal assent were: what to do in the absence of the household decision-maker (often the father); and how medical staff handle parents not giving a clear agreement or refusal. While the challenges identified are faced in all research in low-income settings, they are magnified for emergency trials by the urgency of decision making and treatment needs. Consent options will need to be tailored to particular studies and settings, and might best be informed by consultation with staff members and community representatives using a deliberative

  20. Biased and inadequate citation of prior research in reports of cardiovascular trials is a continuing source of waste in research.

    Science.gov (United States)

    Sawin, Veronica I; Robinson, Karen A

    2016-01-01

    We assessed citation of prior research over time and the association of citation with the agreement of results between the trial being reported and the prior trial. Groups of pharmacologic trials in cardiovascular disease were created using meta-analyses, and we assessed citation within these groups. We calculated the proportion of prior trials cited, the proportion of study participants captured in citations, and agreement of results between citing and cited trials. Analysis included 86 meta-analyses with 580 trials published between 1982 and 2011. Reports of trials cited 25% (median; 95% confidence interval [CI], 23-27%) of prior trials, capturing 31% (95% CI, 25-36%) of trial participants. Neither measure differed by publication of the citing trial before vs. after 2005. Prior trials with results that agreed with the reports of trials (supportive trials) were significantly more likely to be cited than nonsupportive trials (relative risk 1.45; 95% CI, 1.30-1.61, P < 0.001). Selective undercitation of prior research continues; three quarters of existing evidence is ignored. This source of waste may result in unnecessary, unethical, and unscientific studies. Copyright © 2016 Elsevier Inc. All rights reserved.
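    The two headline metrics in this record, the proportion of prior trials cited and the relative risk of citation for supportive versus nonsupportive trials, are simple ratios; the toy sketch below uses invented trial identifiers and counts purely for illustration.

```python
# Toy sketch of the citation metrics described in the abstract; the trial
# identifiers and counts below are invented for illustration.
def proportion_cited(cited, prior):
    """Fraction of prior trials that the citing report actually cites."""
    return len(cited & prior) / len(prior)

def relative_risk(cited_a, n_a, cited_b, n_b):
    """Ratio of citation probabilities between two groups of prior trials."""
    return (cited_a / n_a) / (cited_b / n_b)

prior_trials = {"t1", "t2", "t3", "t4"}
cited_trials = {"t1", "t3"}
p = proportion_cited(cited_trials, prior_trials)   # fraction of prior trials cited
rr = relative_risk(6, 10, 4, 10)                   # supportive vs. nonsupportive
```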

  1. Imaging performance of a hybrid x-ray computed tomography-fluorescence molecular tomography system using priors.

    Science.gov (United States)

    Ale, Angelique; Schulz, Ralf B; Sarantopoulos, Athanasios; Ntziachristos, Vasilis

    2010-05-01

    We study the performance of two newly introduced and previously suggested methods that incorporate priors into inversion schemes associated with data from a recently developed hybrid x-ray computed tomography and fluorescence molecular tomography system, the latter based on CCD camera photon detection. The unique data set studied contains accurately registered data of highly spatially sampled photon fields propagating through tissue along 360-degree projections. Approaches that incorporate structural prior information were included in the inverse problem by adding a penalty term to the minimization function utilized for image reconstructions. The methods were compared, using simulated and experimental data from a lung inflammation animal model, against the inversions achieved when not using priors. The importance of using priors over stand-alone inversions is also showcased with highly spatially sampled simulated and experimental data. The approach offering optimal performance in resolving fluorescent biodistribution in small animals is also discussed. Inclusion of prior information from x-ray CT data in the reconstruction of the fluorescence biodistribution leads to improved agreement between the reconstruction and validation images for both simulated and experimental data.
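    The penalty-term formulation mentioned above can be sketched, under simplifying linear-model assumptions, as a quadratically penalized least-squares problem; A, b, L, and the weight lam below are toy stand-ins, not the paper's actual operators.

```python
import numpy as np

# Sketch: solve min_x ||A x - b||^2 + lam * ||L x||^2 via the normal
# equations (A^T A + lam L^T L) x = A^T b. A structural prior (e.g., from
# co-registered CT) would enter through the weighting operator L.
def penalized_inversion(A, b, L, lam):
    return np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 10))                 # toy forward model
x_true = np.ones(10)
b = A @ x_true + 0.01 * rng.normal(size=20)   # noisy "measurements"
L = np.eye(10)                                # identity penalty (plain Tikhonov)
x_hat = penalized_inversion(A, b, L, lam=0.1)
```

    With the identity penalty this is ordinary Tikhonov regularization; replacing L with a structure-aware operator is what turns it into a prior-informed inversion.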

  2. [Selective mutism].

    Science.gov (United States)

    Ytzhak, A; Doron, Y; Lahat, E; Livne, A

    2012-10-01

    Selective mutism is an uncommon disorder of young children, in which they selectively do not speak in certain social situations while being capable of speaking easily in others. Many etiologies have been proposed for selective mutism, including psychodynamic, behavioral, and familial ones. A developmental etiology that includes insights from all of the above is gaining support. Accordingly, a mild language impairment in a child with an anxiety trait may be at the root of developing selective mutism, and the behavior is then reinforced by an avoidant pattern in the family. Early treatment and follow-up for children with selective mutism are important. The treatment includes non-pharmacological therapy (psychodynamic, behavioral and familial) and pharmacologic therapy, mainly selective serotonin reuptake inhibitors (SSRIs).

  3. Evidências de validade da bateria de provas de raciocínio (BPR-5) para seleção de pessoal [Evidences on the validity of the battery of reasoning tests (BPR-5) for employment selection]

    Directory of Open Access Journals (Sweden)

    Viviane de Oliveira Baumgartl

    2006-01-01

    Full Text Available The use of psychological tests is a common practice among Brazilian companies. These evaluations, however, are often carried out without taking into account the efficiency of the instruments used in discriminating criteria relevant for good employee performance at work. The purpose of this study was to check evidence on the validity of the BPR-5 test in an organizational context. The sample consisted of 79 employees of a Brazilian electric company. The number of injuries at the workplace was used as the criterion. The data were analyzed using descriptive statistics and correlation analyses. The results showed that the workplace-injury criterion presented significant correlations with the tests, indicating a correlation with intelligence, especially for employees with less job experience (-0.39; p < 0.05). The implications of these results for both research and practice are discussed.

  4. Prior-Based Quantization Bin Matching for Cloud Storage of JPEG Images.

    Science.gov (United States)

    Liu, Xianming; Cheung, Gene; Lin, Chia-Wen; Zhao, Debin; Gao, Wen

    2018-07-01

    Millions of user-generated images are uploaded to social media sites like Facebook daily, which translates to a large storage cost. However, there exists an asymmetry in upload and download data: only a fraction of the uploaded images are subsequently retrieved for viewing. In this paper, we propose a cloud storage system that reduces the storage cost of all uploaded JPEG photos, at the expense of a controlled increase in computation, mainly during download of the requested image subset. Specifically, the system first selectively re-encodes code blocks of uploaded JPEG images using coarser quantization parameters for smaller storage sizes. Then, during download, the system exploits known signal priors (a sparsity prior and a graph-signal smoothness prior) for reverse mapping to recover the original fine quantization bin indices, with either a deterministic guarantee (lossless mode) or a statistical guarantee (near-lossless mode). For fast reverse mapping, we use small dictionaries and sparse graphs that are tailored for specific clusters of similar blocks, which are classified via a tree-structured vector quantizer. During image upload, cluster indices identifying the appropriate dictionaries and graphs for the re-quantized blocks are encoded as side information using a differential distributed source coding scheme to facilitate reverse mapping during image download. Experimental results show that our system can reap significant storage savings (up to 12.05%) at roughly the same image PSNR (within 0.18 dB).
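    The coarse re-quantization and prior-guided reverse mapping can be illustrated with a toy 1D sketch (not the paper's algorithm): each coarse bin is consistent with several fine bins, and a prior estimate picks the most plausible one. All names and values below are invented.

```python
import numpy as np

# Toy sketch: store fine quantization-bin indices at a coarser step, then
# recover them by choosing, among the fine bins consistent with each coarse
# bin, the candidate closest to a smooth prior-based estimate.
def requantize(fine_idx, ratio):
    return fine_idx // ratio  # coarser bins -> smaller storage

def reverse_map(coarse_idx, ratio, prior):
    recovered = []
    for c, p in zip(coarse_idx, prior):
        candidates = np.arange(c * ratio, (c + 1) * ratio)  # fine bins in this coarse bin
        recovered.append(int(candidates[np.argmin(np.abs(candidates - p))]))
    return np.array(recovered)

fine = np.array([10, 11, 12, 13, 14])
coarse = requantize(fine, 2)                       # stored representation
prior = np.array([10.2, 11.1, 11.9, 13.0, 14.2])   # smoothness-based estimate
recovered = reverse_map(coarse, 2, prior)
```

    When the prior lands in the correct fine bin for every sample, the mapping is exact, loosely analogous to the lossless mode; otherwise recovery is only approximate, as in the near-lossless mode.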

  5. Recognition of prior learning candidates’ experiences in a nurse training programme

    Directory of Open Access Journals (Sweden)

    Nomathemba B. Mothokoa

    2018-06-01

    Full Text Available Recognition of prior learning (RPL in South Africa is critical to the development of an equitable education and training system. Historically, nursing has been known as one of the professions that provides access to the training and education of marginalised groups who have minimal access to formal education. The advent of implementing RPL in nursing has, however, not been without challenges. The purpose of this study was to explore and describe the experiences of RPL nursing candidates related to a 4-year comprehensive nursing training programme at a nursing education institution in Gauteng. An exploratory, descriptive and contextual qualitative research design was undertaken. The research sample comprised 13 purposefully selected participants. Face-to-face individual interviews, using open-ended questions, were used to collect data, which were analysed using Tesch’s approach. Recognition of prior learning candidates experienced a number of realities as adult learners. On a positive note, their prior knowledge and experience supported them in their learning endeavours. Participants, however, experienced a number of challenges on personal, interpersonal and socialisation, and educational levels. It is important that opportunities are created to support and assist RPL candidates to complete their nursing training. This support structure, among others, should include the provision of RPL-related information, giving appropriate advice, coaching and mentoring, effective administration services, integrated curriculum design, and a variety of formative and summative assessment practices.

  6. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

    International Nuclear Information System (INIS)

    Chan, M.T.; Herman, G.T.; Levitan, E.

    1996-01-01

    We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate from it the original image (based on a model of distortion and on a Gibbs distribution as the prior) and (ii) by selecting an "image-modeling" prior distribution (i.e., one which is such that it is likely that a random sample from it shares important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach
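    A minimal numerical sketch of the Bayesian idea in (i), assuming a Gaussian distortion model and a quadratic pairwise Gibbs prior (much simpler than the paper's image-modeling prior): under these assumptions the MAP estimate reduces to a linear solve.

```python
import numpy as np

# MAP estimate under y = x + noise with a pairwise quadratic Gibbs prior:
# minimize ||y - x||^2 + beta * sum_i (x_i - x_{i+1})^2, which yields the
# linear system (I + beta D^T D) x = y, with D the difference operator.
def map_estimate(y, beta):
    n = len(y)
    D = np.diff(np.eye(n), axis=0)  # rows are e_{i+1} - e_i
    return np.linalg.solve(np.eye(n) + beta * D.T @ D, y)

y = np.array([0.0, 0.1, -0.1, 5.0, 0.0])  # a spike the prior should smooth
x_map = map_estimate(y, beta=10.0)
```

    The pairwise prior pulls neighboring values together, spreading the spike out while preserving the total intensity; an image-modeling prior would replace the quadratic energy with one learned from typical images.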

  7. Hysteroscopic findings in patients with post-menstrual spotting with prior cesarean section

    Directory of Open Access Journals (Sweden)

    Valdely Helena Talamonte

    2012-03-01

    Full Text Available Objective: To identify uterine hysteroscopic findings among patients with prior cesarean section who had post-menstrual bleeding of the spotting type. Methods: We conducted a descriptive and prospective study between June 2008 and December 2009 involving women admitted to our clinic in Ji-Paraná (RO), Brazil, who complained of prolonged genital bleeding after the menstrual period. A total of 20 women with the following simultaneous characteristics were selected: at least one prior cesarean section, aged between 18 and 45 years, no use of hormonal contraceptives, and no history of uterine surgery that could change the cavity anatomy. All participants underwent a hysteroscopic examination. Results: During hysteroscopy, in 90% of the patients, the presence of a cesarean section scar was observed in the last third of the cervix. This scarring causes an anomaly in the uterine cavity anatomy, characterized by the viewing of an enlargement followed by a retraction of the anterior wall, which gives rise to a pseudocavity with depth and lumen narrowing in variable degrees. Two patients did not present the pseudocavity. Conclusion: Pseudocavities in the cesarean section scar are usually found on hysteroscopic examination of patients with prior cesarean section and abnormal uterine spotting.

  8. The partner selection process : Steps, effectiveness, governance

    NARCIS (Netherlands)

    Duisters, D.; Duijsters, G.M.; de Man, A.P.

    2011-01-01

    Selecting the right partner is important for creating value in alliances. Even though prior research suggests that a structured partner selection process increases alliance success, empirical research remains scarce. This paper presents an explorative empirical study that shows that some steps in

  9. The partner selection process : steps, effectiveness, governance

    NARCIS (Netherlands)

    Duisters, D.; Duysters, G.M.; Man, de A.P.

    2011-01-01

    Selecting the right partner is important for creating value in alliances. Even though prior research suggests that a structured partner selection process increases alliance success, empirical research remains scarce. This paper presents an explorative empirical study that shows that some steps in

  10. Principles of Proper Validation

    DEFF Research Database (Denmark)

    Esbensen, Kim; Geladi, Paul

    2010-01-01

    to suffer from the same deficiencies. The PPV are universal and can be applied to all situations in which the assessment of performance is desired: prediction, classification, time series forecasting, and modeling validation. The key element of PPV is the Theory of Sampling (TOS), which allows insight......) is critically necessary for the inclusion of the sampling errors incurred in all 'future' situations in which the validated model must perform. Logically, therefore, all one-data-set re-sampling approaches for validation, especially cross-validation and leverage-corrected validation, should be terminated...

  11. Satellite Infrared Radiation Measurements Prior to the Major Earthquakes

    Science.gov (United States)

    Ouzounov, Dimitar; Pulintes, S.; Bryant, N.; Taylor, Patrick; Freund, F.

    2005-01-01

    This work describes our search for a relationship between tectonic stresses and increases in mid-infrared (IR) flux as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of Jan 22, 2003 Colima (M6.7) Mexico, Sept. 28, 2004 near Parkfield (M6.0) in California, and Northern Sumatra (M8.5) Dec. 26, 2004. Previous analysis of earthquake events has indicated the presence of an IR anomaly, where temperatures increased or did not return to their usual nighttime values. Our procedures analyze nighttime satellite data that record the general condition of the ground after sunset. We have found from the MODIS instrument data that five days before the Colima earthquake the IR land surface nighttime temperature rose up to +4 degrees C in a 100 km radius around the epicenter. The IR transient field recorded by MODIS in the vicinity of Parkfield, also with a cloud-free environment, was around +1 degree C and is significantly smaller than the IR anomaly around the Colima epicenter. Ground surface temperatures near the Parkfield epicenter four days prior to the earthquake show a steady increase. However, on the night preceding the quake, a significant drop in relative humidity was indicated, a process similar to those registered prior to the Colima event. Recent analyses of continuous ongoing long-wavelength Earth radiation (OLR) indicate significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of a triggering by an interaction between the lithosphere-hydrosphere and atmosphere related to changes in the near-surface electrical field and/or gas composition prior to the earthquake. The OLR anomaly usually covers large areas surrounding the main epicenter.
We have found strong anomalous signals (two sigma) along the epicentral area on Dec 21

  12. Prior image constrained image reconstruction in emerging computed tomography applications

    Science.gov (United States)

    Brunner, Stephen T.

    Advances have been made in computed tomography (CT), especially in the past five years, by incorporating prior images into the image reconstruction process. In this dissertation, we investigate prior image constrained image reconstruction in three emerging CT applications: dual-energy CT, multi-energy photon-counting CT, and cone-beam CT in image-guided radiation therapy. First, we investigate the application of Prior Image Constrained Compressed Sensing (PICCS) in dual-energy CT, which has been called "one of the hottest research areas in CT." Phantom and animal studies are conducted using a state-of-the-art 64-slice GE Discovery 750 HD CT scanner to investigate the extent to which PICCS can enable radiation dose reduction in material density and virtual monochromatic imaging. Second, we extend the application of PICCS from dual-energy CT to multi-energy photon-counting CT, which has been called "one of the 12 topics in CT to be critical in the next decade." Numerical simulations are conducted to generate multiple energy bin images for a photon-counting CT acquisition and to investigate the extent to which PICCS can enable radiation dose efficiency improvement. Third, we investigate the performance of a newly proposed prior image constrained scatter correction technique to correct scatter-induced shading artifacts in cone-beam CT, which, when used in image-guided radiation therapy procedures, can assist in patient localization, and potentially, dose verification and adaptive radiation therapy. Phantom studies are conducted using a Varian 2100 EX system with an on-board imager to investigate the extent to which the prior image constrained scatter correction technique can mitigate scatter-induced shading artifacts in cone-beam CT. 
Results show that these prior image constrained image reconstruction techniques can reduce radiation dose in dual-energy CT by 50% in phantom and animal studies in material density and virtual monochromatic imaging, can lead to radiation
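    For context, the PICCS objective referenced above is commonly written in the literature as follows (notation is a common convention, not taken from this abstract): x is the image, x_P the prior image, Ψ1 and Ψ2 sparsifying transforms, α a weighting parameter, A the system matrix, and y the projection data.

```latex
\min_{x} \;\; \alpha \left\| \Psi_1 \left( x - x_{\mathrm{P}} \right) \right\|_1
  + (1 - \alpha) \left\| \Psi_2\, x \right\|_1
  \quad \text{subject to} \quad A x = y
```

    The first term keeps the reconstruction close to the prior image in a sparse domain, which is what enables the dose reductions reported for dual-energy and photon-counting CT.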

  13. Site selection

    CERN Multimedia

    CERN PhotoLab

    1968-01-01

    To help resolve the problem of site selection for the proposed 300 GeV machine, the Council selected "three wise men" (left to right, J H Bannier of the Netherlands, A Chavanne of Switzerland and L K Boggild of Denmark).

  14. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms; efficiency and comprehensive monotonicity characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...... in order to obtain a unique selection...

  15. Post-prior equivalence for transfer reactions with complex potentials

    Science.gov (United States)

    Lei, Jin; Moro, Antonio M.

    2018-01-01

    In this paper, we address the problem of the post-prior equivalence in the calculation of inclusive breakup and transfer cross sections. For that, we employ the model proposed by Ichimura et al. [Phys. Rev. C 32, 431 (1985), 10.1103/PhysRevC.32.431], conveniently generalized to include the part of the cross section corresponding to the transfer to bound states. We pay particular attention to the case in which the unobserved particle is left in a bound state of the residual nucleus, in which case the theory prescribes the use of a complex potential, responsible for the spreading width of the populated single-particle states. We see that the introduction of this complex potential gives rise to an additional term in the prior cross-section formula, not present in the usual case of real binding potentials. The equivalence is numerically tested for the 58Ni(d,pX) reaction.

  16. Bayesian Image Segmentations by Potts Prior and Loopy Belief Propagation

    Science.gov (United States)

    Tanaka, Kazuyuki; Kataoka, Shun; Yasuda, Muneki; Waizumi, Yuji; Hsu, Chiou-Ting

    2014-12-01

    This paper presents a Bayesian image segmentation model based on a Potts prior and loopy belief propagation. The proposed Bayesian model involves several terms, including the pairwise interactions of Potts models, and the mean vectors and covariance matrices of Gaussian distributions in color image modeling. These terms are often referred to as hyperparameters in statistical machine learning theory. In order to determine these hyperparameters, we propose a new scheme for hyperparameter estimation based on conditional maximization of entropy in the Potts prior. The algorithm is given based on loopy belief propagation. In addition, we compare our conditional maximum entropy framework with the conventional maximum likelihood framework, and also clarify how the first-order phase transitions in loopy belief propagation for Potts models influence our hyperparameter estimation procedures.
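    For intuition only, here is a sketch of the Potts smoothing idea on a 1D signal, using simple ICM updates rather than the paper's loopy belief propagation; the signal, class means, and coupling gamma are invented.

```python
import numpy as np

# Illustrative sketch (ICM, not loopy BP): segment a 1D signal with a Potts
# prior that penalizes neighboring pixels taking different labels. The data
# cost is squared distance to class means; gamma is the Potts coupling
# (a hyperparameter, in the sense used by the paper).
def potts_icm(signal, means, gamma, n_iter=10):
    labels = np.argmin((signal[:, None] - means[None, :]) ** 2, axis=1)
    n, k = len(signal), len(means)
    for _ in range(n_iter):
        for i in range(n):
            cost = (signal[i] - means) ** 2
            for j in (i - 1, i + 1):          # Potts penalty from neighbors
                if 0 <= j < n:
                    cost = cost + gamma * (np.arange(k) != labels[j])
            labels[i] = int(np.argmin(cost))
    return labels

signal = np.array([0.0, 0.1, 0.9, 0.2, 1.0, 1.1, 0.95])
labels = potts_icm(signal, means=np.array([0.0, 1.0]), gamma=0.5)
```

    Note how the isolated value 0.9 is absorbed into the surrounding low-valued segment: the Potts coupling outweighs its data cost, which is the smoothing behavior the prior is designed to produce.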

  17. The Effect of Prior Knowledge and Gender on Physics Achievement

    Science.gov (United States)

    Stewart, John; Henderson, Rachel

    2017-01-01

    Gender differences on the Conceptual Survey in Electricity and Magnetism (CSEM) have been extensively studied. Ten semesters (N=1621) of CSEM data are presented, showing that male students outperform female students on the CSEM posttest by 5% (p qualitative in-semester test questions by 3% (p = .004), but no significant difference between male and female students was found on quantitative test questions. Male students enter the class with superior prior preparation in the subject and score 4% higher on the CSEM pretest (p questions correctly (N=822), male and female differences on the CSEM and qualitative test questions cease to be significant. This suggests no intrinsic gender bias exists in the CSEM itself and that gender differences are the result of prior preparation as measured by CSEM pretest score. Gender differences between male and female students increase with pretest score. Regression analyses are presented to further explore interactions between preparation, gender, and achievement.

  18. Assessment of unskilled adults' prior learning – fair to whom?

    DEFF Research Database (Denmark)

    Aarkrog, Vibe

    This paper discusses research that examined the meeting between, on the one hand, the adults’ prior learning and, on the other, the school system and curricular standards. Applying a theoretical frame that includes concepts of communities of practice (Wenger), the development from novice to expert (Dreyfus & Dreyfus), and Bernstein’s distinction between horizontal and vertical learning, the paper gives an account of the students’ development in relation to assessment of their prior learning. The study includes a number of VET-programs. The paper focuses on two of them: Social and health care, and childcare assistant. It addresses questions of what is a fair APL, perceived in relation to both the adults’ knowing in practice and the qualification standards, formulated in the learning outcome descriptions of the programs...

  19. Assessment of unskilled adults’ prior learning – fair to whom?

    DEFF Research Database (Denmark)

    Aarkrog, Vibe

    2014-01-01

    As in many other countries, Danish adult education policy focuses on how to encourage adults for education; the most important and challenging group of adults being those with few or no formal qualifications. Assessment of prior learning (APL) is perceived as an important tool for motivating adults...... the school system and curricular standards. Applying a theoretical frame that includes concepts of communities of practice (Wenger), the development from novice to expert (Dreyfus & Dreyfus), and Bernstein’s distinction between horizontal and vertical learning, the paper gives an account of the students’ development in relation to assessment of their prior learning. The study includes a number of VET-programs. The paper focuses on one of them: Social and health care and clerical assistant. It addresses questions of what is a fair APL, perceived in relation to both the adults’ knowing in practice...

  20. Incorporating outcome uncertainty and prior outcome beliefs in stated preferences

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Jacobsen, Jette Bredahl; Hanley, Nick

    2015-01-01

    Stated preference studies tell respondents that policies create environmental changes with varying levels of uncertainty. However, respondents may include their own a priori assessments of uncertainty when making choices among policy options. Using a choice experiment eliciting respondents......’ preferences for conservation policies under climate change, we find that higher outcome uncertainty reduces utility. When accounting for endogeneity, we find that prior beliefs play a significant role in this cost of uncertainty. Thus, merely stating “objective” levels of outcome uncertainty...

  1. Morbidity in early Parkinson's disease and prior to diagnosis

    DEFF Research Database (Denmark)

    Frandsen, Rune; Kjellberg, Jakob; Ibsen, Rikke

    2014-01-01

    BACKGROUND: Nonmotor symptoms are probably present prior to, early on, and following a diagnosis of Parkinson's disease. Nonmotor symptoms may hold important information about the progression of Parkinson's disease. OBJECTIVE: To evaluate the total early and prediagnostic morbidities in the 3......, poisoning and certain other external causes, and other factors influencing health status and contact with health services. It was negatively associated with neoplasm, cardiovascular, and respiratory diseases. CONCLUSIONS: Patients with a diagnosis of Parkinson's disease present significant differences...

  2. Anticipatory parental care: acquiring resources for offspring prior to conception.

    OpenAIRE

    Boutin, S; Larsen, K W; Berteaux, D

    2000-01-01

    Many organisms acquire and defend resources outside the breeding season and this is thought to be for immediate survival and reproductive benefits. Female red squirrels (Tamiasciurus hudsonicus) acquire traditional food cache sites up to four months prior to the presence of any physiological or behavioural cues associated with mating or offspring dependency. They subsequently relinquish these resources to one of their offspring at independence (ten months later). We experimentally show that a...

  3. Rapid sampling of molecular motions with prior information constraints.

    Science.gov (United States)

    Raveh, Barak; Enosh, Angela; Schueler-Furman, Ora; Halperin, Dan

    2009-02-01

    Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.
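    The RRT-with-constraints idea described here can be illustrated with a toy 2D planner (not PathRover itself, and far from protein conformational spaces): prior-information constraints enter as arbitrary predicates that every accepted node must satisfy, which is how they prune the search. All parameters below are invented.

```python
import random

# Minimal RRT sketch in 2D: grow a tree toward random samples, accepting a
# new node only if all prior-information predicates hold for it.
def rrt(start, goal, constraints, step=0.2, n_iter=2000, seed=1):
    rnd = random.Random(seed)
    tree = [start]
    for _ in range(n_iter):
        q = (rnd.uniform(0, 1), rnd.uniform(0, 1))          # random sample
        near = min(tree, key=lambda p: (p[0]-q[0])**2 + (p[1]-q[1])**2)
        d = ((q[0]-near[0])**2 + (q[1]-near[1])**2) ** 0.5
        if d == 0:
            continue
        new = (near[0] + step*(q[0]-near[0])/d,
               near[1] + step*(q[1]-near[1])/d)              # extend by one step
        if all(c(new) for c in constraints):                 # prior-information check
            tree.append(new)
            if (new[0]-goal[0])**2 + (new[1]-goal[1])**2 < step**2:
                return tree, True
    return tree, False

in_box = lambda p: 0 <= p[0] <= 1 and 0 <= p[1] <= 1        # a toy constraint
tree, reached = rrt((0.1, 0.1), (0.9, 0.9), [in_box])
```

    In the paper's setting the nodes are low-energy, clash-free conformations and the predicates encode experimental restraints such as distance constraints or loop closure; here they are simple geometric checks.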

  4. Real earnings management activities prior to bond issuance

    Directory of Open Access Journals (Sweden)

    Cristhian Mellado-Cid

    2017-07-01

    Full Text Available We examine real activities manipulation by firms prior to their debt issuances and how such manipulation activities affect bond yield spreads. We find that bond-issuing firms increase their real activities manipulation in the five quarters leading up to a bond issuance. We document an inverse association between yield spread and pre-issue real activities manipulation, i.e., firms engaged in abnormally high levels of real activities manipulation are associated with a subsequently lower cost of debt.

  5. Astronauts Parise and Jernigan check helmets prior to training session

    Science.gov (United States)

    1994-01-01

    Attired in training versions of the Shuttle partial-pressure launch and entry suits, payload specialist Dr. Ronald A Parise (left) and astronaut Tamara E. Jernigan, payload commander, check over their helmets prior to a training session. Holding the helmets is suit expert Alan M. Rochford, of NASA. The two were about to join their crew mates in a session of emergency bailout training at JSC's Weightless Environment Training Facility (WETF).

  6. Rapid sampling of molecular motions with prior information constraints.

    Directory of Open Access Journals (Sweden)

    Barak Raveh

    2009-02-01

    Full Text Available Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.
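
    The tree-growing loop that PathRover builds on can be sketched generically: grow an RRT through a (here, 2-D toy) configuration space, discard any new node that violates a constraint predicate standing in for the paper's prior-information constraints, and stop when a goal predicate is met. A minimal sketch; the function names, the toy constraint, and the goal predicate are illustrative, not taken from PathRover or Rosetta:

```python
import math
import random

random.seed(0)

def constraint_ok(p):
    # Stand-in for a prior-information constraint: forbid a disc around (0.5, 0.5)
    return math.hypot(p[0] - 0.5, p[1] - 0.5) > 0.2

def rrt(start, goal_pred, step=0.05, max_iters=10000):
    """Grow a rapidly exploring random tree in the unit square; return the
    path to the first node satisfying goal_pred, or None on failure."""
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        sample = (random.random(), random.random())
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        d = math.dist(nodes[i], sample)
        if d == 0:
            continue
        # Steer from the nearest tree node toward the sample by one step
        new = (nodes[i][0] + step * (sample[0] - nodes[i][0]) / d,
               nodes[i][1] + step * (sample[1] - nodes[i][1]) / d)
        if not constraint_ok(new):      # clash / constraint rejection
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if goal_pred(new):              # e.g., a distance or loop-closure predicate
            path, j = [], len(nodes) - 1
            while j is not None:
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]
    return None

path = rrt((0.05, 0.05), lambda p: math.dist(p, (0.95, 0.95)) < 0.1)
```

    Because rejected nodes are never added to the tree, every node on any returned path satisfies the constraint by construction, which mirrors how constraint predicates prune the conformational search.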

  7. Hysteresis as an Implicit Prior in Tactile Spatial Decision Making

    Science.gov (United States)

    Thiel, Sabrina D.; Bitzer, Sebastian; Nierhaus, Till; Kalberlah, Christian; Preusser, Sven; Neumann, Jane; Nikulin, Vadim V.; van der Meer, Elke; Villringer, Arno; Pleger, Burkhard

    2014-01-01

    Perceptual decisions not only depend on the incoming information from sensory systems but constitute a combination of current sensory evidence and internally accumulated information from past encounters. Although recent evidence emphasizes the fundamental role of prior knowledge for perceptual decision making, only a few studies have quantified the relevance of such priors on perceptual decisions and examined their interplay with other decision-relevant factors, such as the stimulus properties. In the present study we asked whether hysteresis, describing the stability of a percept despite a change in stimulus property and known to occur at perceptual thresholds, also acts as a form of implicit prior in tactile spatial decision making, supporting the stability of a decision across successively presented random stimuli (i.e., decision hysteresis). We applied a variant of the classical 2-point discrimination task and found that hysteresis influenced perceptual decision making: Participants were more likely to decide ‘same’ rather than ‘different’ on successively presented pin distances. In a direct comparison between the influence of applied pin distances (explicit stimulus property) and hysteresis, we found that on average, stimulus property explained significantly more variance of participants’ decisions than hysteresis. However, when focusing on pin distances at threshold, we found a trend for hysteresis to explain more variance. Furthermore, the less variance was explained by the pin distance on a given decision, the more variance was explained by hysteresis, and vice versa. Our findings suggest that hysteresis acts as an implicit prior in tactile spatial decision making that becomes increasingly important when explicit stimulus properties provide decreasing evidence. PMID:24587045

  8. Analysis of Extracting Prior BRDF from MODIS BRDF Data

    OpenAIRE

    Hu Zhang; Ziti Jiao; Yadong Dong; Peng Du; Yang Li; Yi Lian; Tiejun Cui

    2016-01-01

    Many previous studies have attempted to extract prior reflectance anisotropy knowledge from the historical MODIS Bidirectional Reflectance Distribution Function (BRDF) product based on land cover or Normalized Difference Vegetation Index (NDVI) data. In this study, the feasibility of the method is discussed based on MODIS data and archetypal BRDFs. The BRDF is simplified into six archetypal BRDFs that represent different reflectance anisotropies. Five-year time series of MODIS BRDF data over ...

  9. The search for Infrared radiation prior to major earthquakes

    Science.gov (United States)

    Ouzounov, D.; Taylor, P.; Pulinets, S.

    2004-12-01

    This work describes our search for a relationship between tectonic stresses and electro-chemical and thermodynamic processes in the Earth and increases in mid-IR flux as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. Recent analysis of continuous outgoing long-wavelength Earth radiation (OLR) indicates significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of triggering by an interaction between the lithosphere, hydrosphere, and atmosphere related to changes in the near-surface electrical field and gas composition prior to the earthquake. The OLR anomaly covers large areas surrounding the main epicenter. We have used the NOAA IR data to differentiate between the global and seasonal variability and these transient local anomalies. Indeed, on the basis of a temporal and spatial distribution analysis, an anomaly pattern is found to occur several days prior to some major earthquakes. The significance of these observations was explored using data sets of some recent worldwide events.

  10. Generalized species sampling priors with latent Beta reinforcements

    Science.gov (United States)

    Airoldi, Edoardo M.; Costa, Thiago; Bassetti, Federico; Leisen, Fabrizio; Guindani, Michele

    2014-01-01

    Many popular Bayesian nonparametric priors can be characterized in terms of exchangeable species sampling sequences. However, in some applications, exchangeability may not be appropriate. We introduce a novel and probabilistically coherent family of non-exchangeable species sampling sequences characterized by a tractable predictive probability function with weights driven by a sequence of independent Beta random variables. We compare their theoretical clustering properties with those of the Dirichlet process and the two-parameter Poisson-Dirichlet process. The proposed construction provides a complete characterization of the joint process, in contrast to existing work. We then propose the use of such a process as a prior distribution in a hierarchical Bayes modeling framework, and we describe a Markov chain Monte Carlo sampler for posterior inference. We evaluate the performance of the prior and the robustness of the resulting inference in a simulation study, providing a comparison with popular Dirichlet process mixtures and hidden Markov models. Finally, we develop an application to the detection of chromosomal aberrations in breast cancer by leveraging array CGH data. PMID:25870462
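
    The exchangeable baseline against which the authors compare, the Dirichlet process, has the familiar Chinese-restaurant predictive rule: observation n + 1 joins an existing cluster with probability proportional to its size, or opens a new cluster with probability proportional to the concentration α. A minimal sketch of that baseline (in the paper's non-exchangeable construction, the predictive weights are instead driven by independent Beta random variables):

```python
import random

random.seed(1)

def crp(n, alpha):
    """Cluster sizes after n sequential draws from the Dirichlet-process
    predictive rule (Chinese restaurant process) with concentration alpha."""
    sizes = []
    for i in range(n):
        # Existing cluster k is chosen w.p. sizes[k] / (i + alpha),
        # a new cluster w.p. alpha / (i + alpha).
        r = random.uniform(0, i + alpha)
        acc = 0.0
        for k, s in enumerate(sizes):
            acc += s
            if r < acc:
                sizes[k] += 1
                break
        else:
            sizes.append(1)
    return sizes

sizes = crp(100, alpha=1.0)
```

    Under this rule the draw order does not change the joint distribution (exchangeability); dropping that property is exactly what the Beta-reinforced construction is designed to allow.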

  11. Evaluation of the macula prior to cataract surgery.

    Science.gov (United States)

    McKeague, Marta; Sharma, Priya; Ho, Allen C

    2018-01-01

    To describe recent evidence regarding methods of evaluation of retinal structure and function prior to cataract surgery. Studies in patients with cataract but no clinically detectable retinal disease have shown that routine use of optical coherence tomography (OCT) prior to cataract surgery can detect subtle macular disease, which may alter the course of treatment or lead to modification of consent. The routine use of OCT has been especially useful in patients being considered for advanced-technology intraocular lenses (IOLs), as subtle macular disease can be a contraindication to the use of these lenses. The cost-effectiveness of routine use of OCT prior to cataract surgery has not been studied. Other technologies that assess retinal function rather than structure, such as microperimetry and electroretinogram (ERG), need further study to determine whether they can predict retinal potential in cataract patients. There is growing evidence for the importance of more detailed retinal evaluation of cataract patients even with a clinically normal exam. OCT has been the most established and studied method for retinal evaluation in cataract patients, but other technologies such as microperimetry and ERG are beginning to be studied.

  12. Natural priors, CMSSM fits and LHC weather forecasts

    International Nuclear Information System (INIS)

    Allanach, Benjamin C.; Cranmer, Kyle; Lester, Christopher G.; Weber, Arne M.

    2007-01-01

    Previous LHC forecasts for the constrained minimal supersymmetric standard model (CMSSM), based on current astrophysical and laboratory measurements, have used priors that are flat in the parameter tan β, while being constrained to postdict the central experimental value of M_Z. We construct a different, new and more natural prior with a measure in μ and B (the more fundamental MSSM parameters from which tan β and M_Z are actually derived). We find that as a consequence this choice leads to a well-defined fine-tuning measure in the parameter space. We investigate the effect of this prior choice on global CMSSM fits to indirect constraints, providing posterior probability distributions for Large Hadron Collider (LHC) sparticle production cross sections. The change in priors has a significant effect, strongly suppressing the pseudoscalar Higgs boson dark matter annihilation region and diminishing the probable values of sparticle masses. We also show how to interpret fit information from a Markov Chain Monte Carlo in a frequentist fashion, namely by using the profile likelihood. Bayesian and frequentist interpretations of CMSSM fits are compared and contrasted.
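
    The sensitivity to the choice of prior comes from a general fact: a density that is flat in one parametrization is not flat in a derived parameter, because a Jacobian factor appears under a change of variables. A toy numeric illustration of this general point (nothing CMSSM-specific):

```python
import random

random.seed(2)

# Sample x uniformly on [0, 1); the derived parameter y = x**2 is *not*
# uniform: its density is 1 / (2 * sqrt(y)), so mass piles up near 0.
ys = [random.random() ** 2 for _ in range(100_000)]

bins = [0] * 10
for y in ys:
    bins[min(int(y * 10), 9)] += 1
# bins[0] now holds roughly sqrt(0.1) ~ 32% of the samples, bins[9] ~ 5%.
```

    The same mechanism is at work in the abstract: a prior flat in tan β induces a decidedly non-flat prior over the more fundamental parameters μ and B, and vice versa.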

  13. SU-E-J-68: Adaptive Radiotherapy of Head and Neck Cancer: Re-Planning Based On Prior Dose

    Energy Technology Data Exchange (ETDEWEB)

    Dogan, N; Padgett, K [University of Miami Miller School of Medicine, Miami, FL (United States); Evans, J; Sleeman, W; Song, S [Virginia Commonwealth University, Richmond, VA (United States); Fatyga, M [Mayo Clinic Arizona, Phoenix, AZ (United States)

    2015-06-15

    Purpose: Adaptive Radiotherapy (ART) with frequent CT imaging has been used to improve dosimetric accuracy by accounting for anatomical variations, such as primary tumor shrinkage and/or body weight loss, in Head and Neck (H&N) patients. In most ART strategies, the difference between the planned and the delivered dose is estimated by generating new plans on repeated CT scans using dose-volume constraints used with the initial planning CT without considering already delivered dose. The aim of this study was to assess the dosimetric gains achieved by re-planning based on prior dose by comparing them to re-planning not based-on prior dose for H&N patients. Methods: Ten locally-advanced H&N cancer patients were selected for this study. For each patient, six weekly CT imaging were acquired during the course of radiotherapy. PTVs, parotids, cord, brainstem, and esophagus were contoured on both planning and six weekly CT images. ART with weekly re-plans were done by two strategies: 1) Generating a new optimized IMRT plan without including prior dose from previous fractions (NoPriorDose) and 2) Generating a new optimized IMRT plan based on the prior dose given from previous fractions (PriorDose). Deformable image registration was used to accumulate the dose distributions between planning and six weekly CT scans. The differences in accumulated doses for both strategies were evaluated using the DVH constraints for all structures. Results: On average, the differences in accumulated doses for PTV1, PTV2 and PTV3 for NoPriorDose and PriorDose strategies were <2%. The differences in Dmean to the cord and brainstem were within 3%. The esophagus Dmean was reduced by 2% using PriorDose. PriorDose strategy, however, reduced the left parotid D50 and Dmean by 15% and 14% respectively. Conclusion: This study demonstrated significant parotid sparing, potentially reducing xerostomia, by using ART with IMRT optimization based on prior dose for weekly re-planning of H&N cancer patients.

  14. Calibration and validation of coarse-grained models of atomic systems: application to semiconductor manufacturing

    Science.gov (United States)

    Farrell, Kathryn; Oden, J. Tinsley

    2014-07-01

    Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models has arisen. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among the major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability to predict key quantities of interest and (2), above all, the determination of the coarse-grained model itself; that is, the characterization of the molecular architecture and the choice of interaction potentials, and thus parameters, which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized by an appropriate statistical mechanics framework, in this work canonical ensembles involving only configurational energies. The all-atom model thus supplies data for Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data. We demonstrate the theory and
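
    The calibration step described above can be sketched in generic form: a maximum-entropy prior on a bounded interval (i.e., uniform) updated through a Gaussian likelihood for the parameter-to-observation error, evaluated on a grid. The observation values, interval, and error scale below are illustrative, not from the paper:

```python
import math

# Illustrative "all-atom" observations of a quantity governed by theta
observations = [1.9, 2.1, 2.0, 2.2, 1.8]
sigma = 0.2  # assumed Gaussian scale of the parameter-to-observation error

# Maximum-entropy prior on the bounded interval [0, 4]: uniform on a grid
thetas = [i * 0.01 for i in range(401)]
prior = [1.0 / len(thetas)] * len(thetas)

def log_likelihood(theta):
    return sum(-0.5 * ((y - theta) / sigma) ** 2 for y in observations)

unnorm = [p * math.exp(log_likelihood(t)) for p, t in zip(prior, thetas)]
z = sum(unnorm)
posterior = [u / z for u in unnorm]

# With a flat prior and Gaussian errors, the posterior peaks at the
# sample mean of the observations (here 2.0)
theta_map = thetas[max(range(len(thetas)), key=lambda i: posterior[i])]
```

    Model plausibility for selection among candidate coarse-grained models would then compare the normalizing constants z (the model evidences) of competing models, rather than their individual posteriors.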

  15. Assessment of prior learning in adult vocational education and training

    DEFF Research Database (Denmark)

    Aarkrog, Vibe; Wahlgren, Bjarne

    2015-01-01

    in the programs for gastronomes and child care assistants, respectively, the article discusses two issues in relation to APL: the encounter of practical experience and school-based knowledge, and the validity and reliability of the assessment procedures. Through focusing on the students’ knowing that and knowing why...... the assessment is based on a scholastic perception of the students’ needs for training, reflecting one of the most important challenges in APL: how can practical experience be transformed into credits for the knowledge parts of the programs? The study shows that by combining several APL methods and comparing...... the teachers’ assessments the teachers respond to the issues of validity and reliability. However, validity and reliability might be even further strengthened if the competencies are well defined, if the education system is aware of securing a reasonable balance between knowing how, knowing that, and knowing...

  16. New Riemannian Priors on the Univariate Normal Model

    Directory of Open Access Journals (Sweden)

    Salem Said

    2014-07-01

    Full Text Available The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as “Riemannian priors”. Precisely, if {pθ ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d²(θ, θ̄) is the square of Rao’s Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that this distribution gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, one thinks of members of the class represented by G(θ̄, γ) as being centered around θ̄ and lying within a typical distance determined by γ. The paper defines rigorously the Gaussian distributions G(θ̄, γ) and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, it is shown that this leads to an improvement in performance over the use of conjugate priors.
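
    For the univariate normal model, Rao's distance has a closed form: under the Fisher information metric, the (μ, σ) half-plane is a scaled copy of the hyperbolic plane, giving d = √2 · arccosh(1 + ((μ₁−μ₂)²/2 + (σ₁−σ₂)²)/(2σ₁σ₂)). That makes the unnormalized density of G(θ̄, γ) straightforward to evaluate. A sketch (normalizing constant omitted):

```python
import math

def rao_distance(mu1, sig1, mu2, sig2):
    """Rao's Riemannian (Fisher-metric) distance between the normal
    populations N(mu1, sig1**2) and N(mu2, sig2**2)."""
    num = (mu1 - mu2) ** 2 / 2.0 + (sig1 - sig2) ** 2
    return math.sqrt(2.0) * math.acosh(1.0 + num / (2.0 * sig1 * sig2))

def riemannian_prior_density(mu, sig, mu_bar, sig_bar, gamma):
    """Unnormalized density of G(theta_bar, gamma) at theta = (mu, sig),
    i.e. exp(-d^2(theta, theta_bar) / (2 * gamma^2))."""
    d = rao_distance(mu, sig, mu_bar, sig_bar)
    return math.exp(-d * d / (2.0 * gamma * gamma))
```

    The density equals 1 at the center θ̄ (where d = 0) and decays with the squared Rao distance at a rate set by the dispersion hyperparameter γ, matching the intuition of a cluster centered at θ̄.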

  17. Mediastinal lymph node detection and station mapping on chest CT using spatial priors and random forest

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn; Yao, Jianhua; Lu, Le; Kim, Lauren; Turkbey, Evrim B.; Summers, Ronald M., E-mail: rms@nih.gov [Imaging Biomarkers and Computer-aided Diagnosis Laboratory, Radiology and Imaging Sciences, National Institutes of Health Clinical Center Building, 10 Room 1C224 MSC 1182, Bethesda, Maryland 20892-1182 (United States)

    2016-07-15

    Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by a subsequent curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of the mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into a patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.
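
    The spatial priors described above are relative offset vectors from a candidate voxel to each segmented anchor structure, concatenated in a fixed order alongside intensity and shape features before being handed to the random forest. A minimal sketch of that feature-assembly step; the structure names and centroid coordinates are illustrative, not from the paper's pipeline:

```python
def spatial_prior_features(voxel, structure_centroids):
    """Relative (dz, dy, dx) offsets from a voxel to each segmented
    anatomical structure, concatenated into one feature vector."""
    feats = []
    for name in sorted(structure_centroids):  # fixed order for the classifier
        cz, cy, cx = structure_centroids[name]
        feats.extend([voxel[0] - cz, voxel[1] - cy, voxel[2] - cx])
    return feats

# Illustrative centroids (z, y, x) taken from a prior segmentation step
centroids = {"trachea": (40, 60, 60), "spine": (40, 90, 60)}
vec = spatial_prior_features((42, 70, 55), centroids)
# vec == [2, -20, -5, 2, 10, -5]  (spine offsets first, then trachea)
```

    Keeping the structure order fixed matters: a random forest treats each feature position as a distinct variable, so the k-th triple must always refer to the same anatomical structure across training and test voxels.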

  18. Mediastinal lymph node detection and station mapping on chest CT using spatial priors and random forest

    International Nuclear Information System (INIS)

    Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn; Yao, Jianhua; Lu, Le; Kim, Lauren; Turkbey, Evrim B.; Summers, Ronald M.

    2016-01-01

    Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by a subsequent curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of the mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into a patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.

  19. Prior elicitation and Bayesian analysis of the Steroids for Corneal Ulcers Trial.

    Science.gov (United States)

    See, Craig W; Srinivasan, Muthiah; Saravanan, Somu; Oldenburg, Catherine E; Esterberg, Elizabeth J; Ray, Kathryn J; Glaser, Tanya S; Tu, Elmer Y; Zegans, Michael E; McLeod, Stephen D; Acharya, Nisha R; Lietman, Thomas M

    2012-12-01

    To elicit expert opinion on the use of adjunctive corticosteroid therapy in bacterial corneal ulcers. To perform a Bayesian analysis of the Steroids for Corneal Ulcers Trial (SCUT), using expert opinion as a prior probability. The SCUT was a placebo-controlled trial assessing visual outcomes in patients receiving topical corticosteroids or placebo as adjunctive therapy for bacterial keratitis. Questionnaires were conducted at scientific meetings in India and North America to gauge expert consensus on the perceived benefit of corticosteroids as adjunct treatment. Bayesian analysis, using the questionnaire data as a prior probability and the primary outcome of SCUT as a likelihood, was performed. For comparison, an additional Bayesian analysis was performed using the results of the SCUT pilot study as a prior distribution. Indian respondents believed there to be a 1.21 Snellen line improvement, and North American respondents believed there to be a 1.24 line improvement with corticosteroid therapy. The SCUT primary outcome found a non-significant 0.09 Snellen line benefit with corticosteroid treatment. The results of the Bayesian analysis estimated a slightly greater benefit than did the SCUT primary analysis (0.19 lines versus 0.09 lines). Indian and North American experts had similar expectations on the effectiveness of corticosteroids in bacterial corneal ulcers, namely that corticosteroids would markedly improve visual outcomes. Bayesian analysis produced results very similar to those produced by the SCUT primary analysis. The similarity in result is likely due to the large sample size of SCUT and helps validate the results of SCUT.
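
    The pattern reported above (an expert prior near 1.2 Snellen lines, a trial estimate of 0.09 lines, and a posterior of 0.19 lines) is what a precision-weighted conjugate-normal update produces when the data term is much more precise than the prior. A sketch with hypothetical standard errors, since the elicited prior variance and trial standard error are not given here:

```python
def normal_posterior(prior_mean, prior_sd, data_mean, data_se):
    """Conjugate update for a normal mean with known variances: the
    posterior mean is the precision-weighted average of prior and data."""
    w_prior = 1.0 / prior_sd ** 2
    w_data = 1.0 / data_se ** 2
    post_mean = (w_prior * prior_mean + w_data * data_mean) / (w_prior + w_data)
    post_sd = (w_prior + w_data) ** -0.5
    return post_mean, post_sd

# Hypothetical scales: a diffuse expert prior against a tight trial estimate
m, s = normal_posterior(prior_mean=1.2, prior_sd=1.0,
                        data_mean=0.09, data_se=0.3)
# m lands between 0.09 and 1.2, pulled strongly toward the trial estimate
```

    The larger the trial, the smaller data_se becomes and the more the posterior collapses onto the trial result, which is exactly the explanation the abstract offers for the closeness of the Bayesian and primary analyses.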

  20. 3D variational brain tumor segmentation using Dirichlet priors on a clustered feature set.

    Science.gov (United States)

    Popuri, Karteek; Cobzas, Dana; Murtha, Albert; Jägersand, Martin

    2012-07-01

    Brain tumor segmentation is a required step before any radiation treatment or surgery. When performed manually, segmentation is time consuming and prone to human errors. Therefore, there have been significant efforts to automate the process. But, automatic tumor segmentation from MRI data is a particularly challenging task. Tumors have a large diversity in shape and appearance with intensities overlapping the normal brain tissues. In addition, an expanding tumor can also deflect and deform nearby tissue. In our work, we propose an automatic brain tumor segmentation method that addresses these last two difficult problems. We use the available MRI modalities (T1, T1c, T2) and their texture characteristics to construct a multidimensional feature set. Then, we extract clusters which provide a compact representation of the essential information in these features. The main idea in this work is to incorporate these clustered features into the 3D variational segmentation framework. In contrast to previous variational approaches, we propose a segmentation method that evolves the contour in a supervised fashion. The segmentation boundary is driven by the learned region statistics in the cluster space. We incorporate prior knowledge about the normal brain tissue appearance during the estimation of these region statistics. In particular, we use a Dirichlet prior that discourages the clusters from the normal brain region to be in the tumor region. This leads to a better disambiguation of the tumor from brain tissue. We evaluated the performance of our automatic segmentation method on 15 real MRI scans of brain tumor patients, with tumors that are inhomogeneous in appearance, small in size and in proximity to the major structures in the brain. Validation with the expert segmentation labels yielded encouraging results: Jaccard (58%), Precision (81%), Recall (67%), Hausdorff distance (24 mm). Using priors on the brain/tumor appearance, our proposed automatic 3D variational
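
    The Dirichlet prior's role here is to make cluster-proportion vectors that put much mass, inside the tumor region, on clusters learned from normal brain tissue improbable a priori: small concentration parameters on those clusters penalize such configurations. A sketch of evaluating such a prior; the cluster layout and parameter values are illustrative, not the paper's:

```python
import math

def dirichlet_logpdf(x, alpha):
    """Log density of Dirichlet(alpha) at a proportion vector x (sums to 1)."""
    assert abs(sum(x) - 1.0) < 1e-9
    return (math.lgamma(sum(alpha))
            - sum(math.lgamma(a) for a in alpha)
            + sum((a - 1.0) * math.log(xi) for a, xi in zip(alpha, x)))

# Small concentrations on the first two ("normal-brain") clusters make
# tumor-region proportion vectors that favour them improbable a priori.
alpha = [0.5, 0.5, 5.0, 5.0]
tumor_like = [0.05, 0.05, 0.45, 0.45]  # mass on tumor clusters
brain_like = [0.45, 0.45, 0.05, 0.05]  # mass on normal-brain clusters

better = dirichlet_logpdf(tumor_like, alpha) > dirichlet_logpdf(brain_like, alpha)
```

    In the variational segmentation, a term of this form added to the region statistics biases the evolving contour away from labeling normal-brain clusters as tumor, which is the disambiguation effect the abstract describes.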