WorldWideScience

Sample records for analyses predict a20

  1. Assessing the reliability, predictive and construct validity of historical, clinical and risk management-20 (HCR-20) in Mexican psychiatric inpatients.

    Science.gov (United States)

    Sada, Andrea; Robles-García, Rebeca; Martínez-López, Nicolás; Hernández-Ramírez, Rafael; Tovilla-Zarate, Carlos-Alfonso; López-Munguía, Fernando; Suárez-Alvarez, Enrique; Ayala, Xochitl; Fresán, Ana

    2016-08-01

    Assessing dangerousness to gauge the likelihood of future violent behaviour has become an integral part of clinical mental health practice in forensic and non-forensic psychiatric settings; one of the most effective instruments for this purpose is the Historical, Clinical and Risk Management-20 (HCR-20). Our aim was to examine the HCR-20 factor structure in Mexican psychiatric inpatients and to establish its predictive validity and reliability for use in this population. In total, 225 patients diagnosed with psychotic, affective or personality disorders were included. The HCR-20 was applied at hospital admission, and violent behaviours were assessed during psychiatric hospitalization using the Overt Aggression Scale (OAS). Construct validity, predictive validity and internal consistency were determined. Violent behaviour remained more severe during hospitalization in patients classified in the high-risk group. Fifteen items displayed adequate communalities in their originally designated HCR-20 domains, and the internal consistency of the instrument was high. The HCR-20 is a suitable instrument for predicting violence risk in Mexican psychiatric inpatients.

  2. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma.

    Science.gov (United States)

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-06-01

    The objective was to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed, and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P < 0.05). MR histogram analyses, in particular the 1st percentile of the PVP images, held promise for prediction of MVI of HCC.
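    The first-order histogram parameters named above (mean, variance, skewness, kurtosis and selected percentiles over the voxels of a region of interest) can be computed directly. Below is a minimal stdlib-only Python sketch; the nearest-rank percentile convention and population-variance estimator are illustrative assumptions, since the abstract does not specify the exact estimators used.

```python
import math

def histogram_parameters(values, percentiles=(1, 10, 50, 90, 99)):
    """First-order histogram statistics of a flat list of voxel values
    (e.g. an ADC map over a lesion ROI). Percentiles use nearest-rank."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n      # population variance
    sd = math.sqrt(var)
    skew = sum((v - mean) ** 3 for v in values) / (n * sd ** 3) if sd else 0.0
    kurt = sum((v - mean) ** 4 for v in values) / (n * var ** 2) if var else 0.0
    ranked = sorted(values)
    pct = {p: ranked[max(0, math.ceil(p * n / 100) - 1)] for p in percentiles}
    return {"mean": mean, "variance": var, "skewness": skew,
            "kurtosis": kurt, "percentiles": pct}
```

    Comparing these parameters between MVI-positive and MVI-negative lesions, followed by ROC analysis, mirrors the general workflow described in the abstract.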

  3. On the use of uncertainty analyses to test hypotheses regarding deterministic model predictions of environmental processes

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bittner, E.A.; Essington, E.H.

    1995-01-01

    This paper illustrates the use of Monte Carlo parameter uncertainty and sensitivity analyses to test hypotheses regarding predictions of deterministic models of environmental transport, dose, risk and other phenomena. The methodology is illustrated by testing whether 238Pu is transferred more readily than 239+240Pu from the gastrointestinal (GI) tract of cattle to their tissues (muscle, liver and blood). This illustration is based on a study wherein beef cattle grazed for up to 1064 days on a fenced plutonium (Pu)-contaminated arid site in Area 13 near the Nevada Test Site in the United States. Periodically, cattle were sacrificed and their tissues analyzed for Pu and other radionuclides. Conditional sensitivity analyses of the model predictions were also conducted. These analyses indicated that the Pu concentrations measured in cattle tissues had the largest impact of any model parameter on the pdf of predicted Pu fractional transfers. Issues that arise in conducting uncertainty and sensitivity analyses of deterministic models are discussed. (author)
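    The general pattern of such a Monte Carlo hypothesis test, sampling uncertain parameters, propagating each draw through the deterministic model, and comparing the resulting distributions of predicted fractional transfer, can be sketched as follows. The toy one-parameter model and Gaussian parameter distributions are illustrative assumptions, not the study's actual transport model:

```python
import random

def simulate_fractional_transfer(gi_intake, tissue_mean, tissue_sd,
                                 n_trials=10_000, seed=1):
    """Propagate parameter uncertainty through a toy GI-tract-to-tissue
    transfer model: fractional transfer = tissue concentration / intake.
    The model form and Gaussian pdfs are illustrative, not the study's."""
    rng = random.Random(seed)
    return [max(rng.gauss(tissue_mean, tissue_sd), 0.0) / gi_intake
            for _ in range(n_trials)]

def exceedance_probability(samples_a, samples_b):
    """Estimate P(A > B) by pairing the two Monte Carlo streams; values
    near 1 support the hypothesis that isotope A transfers more readily."""
    pairs = list(zip(samples_a, samples_b))
    return sum(a > b for a, b in pairs) / len(pairs)
```

    In this toy setup, centring the 238Pu tissue-concentration parameter above the 239+240Pu one drives the paired exceedance probability toward 1, which is the shape of evidence the hypothesis test looks for.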

  4. Neonatal Sleep-Wake Analyses Predict 18-month Neurodevelopmental Outcomes.

    Science.gov (United States)

    Shellhaas, Renée A; Burns, Joseph W; Hassan, Fauziya; Carlson, Martha D; Barks, John D E; Chervin, Ronald D

    2017-11-01

    The neurological examination of critically ill neonates is largely limited to reflexive behavior. The exam often ignores sleep-wake physiology that may reflect brain integrity and influence long-term outcomes. We assessed whether polysomnography and concurrent cerebral near-infrared spectroscopy (NIRS) might improve prediction of 18-month neurodevelopmental outcomes. Term newborns with suspected seizures underwent standardized neurologic examinations to generate Thompson scores and had 12-hour bedside polysomnography with concurrent cerebral NIRS. For each infant, the distribution of sleep-wake stages and electroencephalogram (EEG) delta power were computed. NIRS-derived fractional tissue oxygen extraction (FTOE) was calculated across sleep-wake stages. At age 18-22 months, surviving participants were evaluated with the Bayley Scales of Infant Development, 3rd edition (Bayley-III). Twenty-nine participants completed the Bayley-III. Increased newborn time in quiet sleep predicted worse 18-month cognitive and motor scores (robust regression models, adjusted r2 = 0.22, p = .007, and 0.27, p = .004, respectively). Decreased 0.5-2 Hz EEG power during quiet sleep predicted worse 18-month language and motor scores (adjusted r2 = 0.25, p = .0005, and 0.33, p = .001, respectively). Predictive values remained significant after adjustment for neonatal Thompson scores or exposure to phenobarbital. Similarly, an attenuated difference in FTOE between neonatal wakefulness and quiet sleep predicted worse 18-month cognitive, language, and motor scores in adjusted analyses (each p < .05). Characteristics of neonatal sleep, as quantified by increased time in quiet sleep, lower EEG delta power during that stage, and muted differences in FTOE between quiet sleep and wakefulness, may improve prediction of adverse long-term outcomes for newborns with neurological dysfunction. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved.

  5. Quantitative Prediction of Coalbed Gas Content Based on Seismic Multiple-Attribute Analyses

    Directory of Open Access Journals (Sweden)

    Renfang Pan

    2015-09-01

    Accurate prediction of the planar distribution of gas is crucial to the selection and development of new CBM exploration areas. Based on seismic attributes, well logging and testing data, we found that seismic absorption attenuation, after eliminating the effects of burial depth, shows an evident correlation with CBM gas content; positive structure curvature has a negative correlation with gas content; and density has a negative correlation with gas content. It is feasible to use the hydrocarbon index (P*G) and pseudo-Poisson ratio attributes for detection of gas enrichment zones. Based on seismic multiple-attribute analyses, a multiple linear regression equation was established between the seismic attributes and gas content at the drilling wells. Applying this equation to the seismic attributes at locations other than the drilling wells yielded a quantitative prediction of the planar gas distribution. Prediction calculations were performed for two models, one using pre-stack inversion and the other disregarding it. A comparison of the results indicates that both models predicted a similar trend for gas content distribution, except that the model using pre-stack inversion yielded a prediction with considerably higher precision.
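    The regression step, fitting a multiple linear equation from attribute values at the wells to measured gas content and then applying it away from the wells, can be sketched with ordinary least squares. This stdlib-only version solves the normal equations by Gaussian elimination; the attribute values and coefficients below are placeholders, not the study's:

```python
def fit_linear_regression(X, y):
    """Least-squares fit of gas content on seismic attributes via the
    normal equations, solved by Gaussian elimination with pivoting."""
    rows = [[1.0] + list(x) for x in X]        # prepend intercept column
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for col in range(k):                        # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * k                               # back substitution
    for i in reversed(range(k)):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, k))) / A[i][i]
    return w                                    # [intercept, coefficients...]

def predict(w, x):
    """Evaluate the fitted equation at a new attribute vector."""
    return w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
```

    Each row of X would hold the attribute values (e.g. absorption attenuation, curvature, density) at one well and y the measured gas content there; predict() then maps attributes at any other location to an estimated gas content.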

  6. Transport and stability analyses supporting disruption prediction in high beta KSTAR plasmas

    Science.gov (United States)

    Ahn, J.-H.; Sabbagh, S. A.; Park, Y. S.; Berkery, J. W.; Jiang, Y.; Riquezes, J.; Lee, H. H.; Terzolo, L.; Scott, S. D.; Wang, Z.; Glasser, A. H.

    2017-10-01

    KSTAR plasmas have reached high stability parameters in dedicated experiments, with normalized beta βN exceeding 4.3 at relatively low plasma internal inductance li (βN/li>6). Transport and stability analyses have begun on these plasmas to best understand a disruption-free path toward the design target of βN = 5 while aiming to maximize the non-inductive fraction of these plasmas. Initial analysis using the TRANSP code indicates that the non-inductive current fraction in these plasmas has exceeded 50 percent. The advent of KSTAR kinetic equilibrium reconstructions now allows more accurate computation of the MHD stability of these plasmas. Attention is placed on code validation of mode stability using the PEST-3 and resistive DCON codes. Initial evaluation of these analyses for disruption prediction is made using the disruption event characterization and forecasting (DECAF) code. The present global mode kinetic stability model in DECAF developed for low aspect ratio plasmas is evaluated to determine modifications required for successful disruption prediction of KSTAR plasmas. Work supported by U.S. DoE under contract DE-SC0016614.

  7. Predicting thermally stressful events in rivers with a strategy to evaluate management alternatives

    Science.gov (United States)

    Maloney, K.O.; Cole, J.C.; Schmid, M.

    2016-01-01

    Water temperature is an important factor in river ecology. Numerous models have been developed to predict river temperature, but many were not designed to predict thermally stressful periods, and because such events are rare, traditionally applied analyses are inappropriate. Here, we developed two logistic regression models to predict thermally stressful events in the Delaware River at the US Geological Survey gage near Lordville, New York: one predicted the probability of an event >20.0 °C, the second an event >22.2 °C. Both models performed strongly on independent test data (sensitivity 0.94 and 1.00; specificity 0.96 and 0.96), predicting 63 of 67 events in the >20.0 °C model and all 15 events in the >22.2 °C model. Both showed negative relationships with release volume from the upstream Cannonsville Reservoir and positive relationships with the difference between air temperature and the previous day's water temperature at Lordville. We further predicted how increasing release volumes from Cannonsville Reservoir affected the probabilities of correctly predicted events. For the >20.0 °C model, an increase of 0.5 to a proportionally adjusted release (one that accounts for other sources) resulted in 35.9% of events in the training data falling below cutoffs; increasing this adjustment by 1.0 resulted in 81.7% falling below cutoffs. For the >22.2 °C model, these adjustments resulted in 71.1% and 100.0% of events falling below cutoffs. Results from these analyses can help managers make informed decisions on alternative release scenarios.
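    A logistic exceedance model of this kind maps predictors to the probability that water temperature crosses a threshold, and is evaluated by sensitivity and specificity at a probability cutoff. A minimal gradient-descent sketch, with toy single-predictor data standing in for the study's predictors (release volume, air-water temperature difference):

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Logistic regression by plain gradient descent:
    P(event) = sigmoid(w0 + w . x)."""
    k = len(X[0])
    w = [0.0] * (k + 1)
    for _ in range(epochs):
        grad = [0.0] * (k + 1)
        for x, t in zip(X, y):
            z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            err = 1.0 / (1.0 + math.exp(-z)) - t   # predicted prob - label
            grad[0] += err
            for j, xj in enumerate(x):
                grad[j + 1] += err * xj
        for j in range(k + 1):
            w[j] -= lr * grad[j] / len(X)
    return w

def sensitivity_specificity(w, X, y, cutoff=0.5):
    """Classify at a probability cutoff and score against true events."""
    tp = fp = tn = fn = 0
    for x, t in zip(X, y):
        z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
        pred = 1 if 1.0 / (1.0 + math.exp(-z)) > cutoff else 0
        if pred and t:
            tp += 1
        elif pred and not t:
            fp += 1
        elif not pred and t:
            fn += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)
```

    In the study's models the fitted coefficient on release volume would come out negative and the coefficient on the air-water temperature difference positive, matching the reported relationships.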

  8. Large-scale binding ligand prediction by improved patch-based method Patch-Surfer2.0.

    Science.gov (United States)

    Zhu, Xiaolei; Xiong, Yi; Kihara, Daisuke

    2015-03-01

    Ligand binding is a key aspect of the function of many proteins. Thus, binding ligand prediction provides important insight into the biological function of proteins, and is also useful for drug design and for examining potential drug side effects. We present a computational method named Patch-Surfer2.0, which predicts binding ligands for a protein pocket. By representing and comparing pockets at the level of small local surface patches that characterize physicochemical properties of the local regions, the method can identify binding pockets of the same ligand even if they do not share globally similar shapes. Properties of local patches are represented by an efficient mathematical representation, the 3D Zernike Descriptor. Patch-Surfer2.0 has significant technical improvements over our previous prototype, including a new feature that captures approximate patch position with a geodesic distance histogram. Moreover, we constructed a large comprehensive database of ligand binding pockets to be searched against by a query. The benchmark shows better performance of Patch-Surfer2.0 over existing methods. Availability: http://kiharalab.org/patchsurfer2.0/. Contact: dkihara@purdue.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Validation of the online prediction tool PREDICT v. 2.0 in the Dutch breast cancer population

    NARCIS (Netherlands)

    Maaren, M.C. van; Steenbeek, C.D. van; Pharoah, P.D.; Witteveen, A.; Sonke, G.S.; Strobbe, L.J.A.; Poortmans, P.; Siesling, S.

    2017-01-01

    BACKGROUND: PREDICT version 2.0 is increasingly used to estimate prognosis in breast cancer. This study aimed to validate this tool in specific prognostic subgroups in the Netherlands. METHODS: All operated women with non-metastatic primary invasive breast cancer, diagnosed in 2005, were selected

  10. Validation of the online prediction tool PREDICT v. 2.0 in the Dutch breast cancer population

    NARCIS (Netherlands)

    van Maaren, M. C.; van Steenbeek, C. D.; Pharoah, P. D.P.; Witteveen, A.; Sonke, Gabe S.; Strobbe, L.J.A.; Poortmans, P.M.P.; Siesling, S.

    2017-01-01

    Background PREDICT version 2.0 is increasingly used to estimate prognosis in breast cancer. This study aimed to validate this tool in specific prognostic subgroups in the Netherlands. Methods All operated women with non-metastatic primary invasive breast cancer, diagnosed in 2005, were selected from

  11. Calculational criticality analyses of 10- and 20-MW UF6 freezer/sublimer vessels

    International Nuclear Information System (INIS)

    Jordan, W.C.

    1993-02-01

    Calculational criticality analyses have been performed for 10- and 20-MW UF6 freezer/sublimer vessels. The freezer/sublimers have been analyzed over a range of conditions that encompass normal operation and abnormal conditions. The effects of HF moderation of the UF6 in each vessel have been considered for uranium enriched between 2 and 5 wt% 235U. The results indicate that the nuclearly safe enrichments originally established for the operation of a 10-MW freezer/sublimer, based on a hydrogen-to-uranium moderation ratio of 0.33, are acceptable. If strict moderation control can be demonstrated for hydrogen-to-uranium moderation ratios less than 0.33, then the enrichment limits for the 10-MW freezer/sublimer may be increased slightly. The calculations performed also allow safe enrichment limits to be established for a 20-MW freezer/sublimer under moderation control

  12. Circulating biomarkers for predicting cardiovascular disease risk; a systematic review and comprehensive overview of meta-analyses.

    Directory of Open Access Journals (Sweden)

    Thijs C van Holten

    BACKGROUND: Cardiovascular disease is one of the major causes of death worldwide. Assessing the risk for cardiovascular disease is an important aspect of clinical decision making and setting a therapeutic strategy, and the use of serological biomarkers may improve this. Despite an overwhelming number of studies and meta-analyses on biomarkers and cardiovascular disease, there are no comprehensive studies comparing the relevance of each biomarker. We performed a systematic review of meta-analyses on levels of serological biomarkers for atherothrombosis to compare the relevance of the most commonly studied biomarkers. METHODS AND FINDINGS: Medline and Embase were screened with search terms related to "arterial ischemic events" and "meta-analyses". The meta-analyses were sorted by patient group: without pre-existing cardiovascular disease, with cardiovascular disease, and heterogeneous groups (general populations, groups with and without cardiovascular disease, or miscellaneous). These were subsequently sorted by endpoint (cardiovascular disease or stroke) and summarized in tables. We identified 85 relevant full-text articles, with 214 meta-analyses. Markers for primary cardiovascular events, ranked from strongest to weakest association, include C-reactive protein, fibrinogen, cholesterol, apolipoprotein B, the apolipoprotein A/apolipoprotein B ratio, high-density lipoprotein, and vitamin D. Markers for secondary cardiovascular events, ranked likewise, include cardiac troponins I and T, C-reactive protein, serum creatinine, and cystatin C. For primary stroke, fibrinogen and serum uric acid are strong risk markers. A limitation is that there is no acknowledged search strategy for prognostic studies or meta-analyses. CONCLUSIONS: For primary cardiovascular events, markers with strong predictive potential are mainly associated with lipids. For secondary cardiovascular events, markers are more associated with ischemia.

  13. Potential of MR histogram analyses for prediction of response to chemotherapy in patients with colorectal hepatic metastases.

    Science.gov (United States)

    Liang, He-Yue; Huang, Ya-Qin; Yang, Zhao-Xia; Ying-Ding; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-07-01

    To determine whether magnetic resonance imaging (MRI) histogram analyses can help predict response to chemotherapy in patients with colorectal hepatic metastases, using the Response Evaluation Criteria In Solid Tumours (RECIST 1.1) as the reference standard. Standard MRI including diffusion-weighted imaging (b = 0, 500 s/mm2) was performed before chemotherapy in 53 patients with colorectal hepatic metastases. Histogram analyses were performed for apparent diffusion coefficient (ADC) maps and arterial and portal venous phase images; thereafter, the mean, percentiles (1st, 10th, 50th, 90th, 99th), skewness, kurtosis, and variance were generated. Quantitative histogram parameters were compared between responders (partial and complete response, n=15) and non-responders (progressive and stable disease, n=38). Receiver operating characteristic (ROC) analyses were then performed for the significant parameters. The mean and 1st, 10th, 50th, 90th, and 99th percentiles of the ADC maps were significantly lower in the responding group than in the non-responding group (p=0.000-0.002), with areas under the ROC curve (AUCs) of 0.76-0.82. The histogram parameters of the arterial and portal venous phases showed no significant difference (p>0.05) between the two groups. Histogram-derived parameters for ADC maps seem to be a promising tool for predicting response to chemotherapy in patients with colorectal hepatic metastases. • ADC histogram analyses can potentially predict chemotherapy response in colorectal liver metastases. • Lower histogram-derived parameters (mean, percentiles) for ADC tend to indicate good response. • MR enhancement histogram analyses are not reliable for predicting response.
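    The reported AUCs summarize how well a single histogram parameter separates responders from non-responders. The area under the ROC curve equals the Mann-Whitney probability that a randomly chosen positive case scores higher than a randomly chosen negative one (ties counting half), which gives a compact stdlib-only implementation; because responders here had lower ADC values, one would negate the parameter before scoring:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the probability that a positive
    case scores higher than a negative one, ties counted half. For a
    marker that is lower in the positive class (as ADC was in responders
    here), negate the values first."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))
```

    An AUC of 0.5 means no discrimination and 1.0 perfect discrimination, so the 0.76-0.82 range reported above indicates moderate-to-good separation.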

  14. A policy hackathon for analysing impacts and solutions up to 20 metres sea-level rise

    Science.gov (United States)

    Haasnoot, Marjolijn; Bouwer, Laurens; Kwadijk, Jaap

    2017-04-01

    We organised a policy hackathon in order to quantify the impacts of accelerated and high-end sea-level rise of up to 20 metres on the coast of the Netherlands, and to develop possible solutions. This was done during one day, with 20 experts from a wide variety of disciplines, including hydrology, geology, coastal engineering, economics, and public policy. During the process the problem was divided into several sub-sets of issues that were analysed and solved within small teams of 4 to 8 people. Both a top-down impact analysis and a bottom-up vulnerability analysis were done by answering the questions: What is the impact of sea-level rise of x metres? And how much sea-level rise can be accommodated before transformative actions are needed? Next, adaptation tipping points were identified that describe conditions under which the coastal system starts to perform unacceptably. Reasons for an adaptation tipping point can be technical (technically not possible), economic (cost-benefits are negative), or resource-related (available space, sand, energy production, finance). The results are presented in a summary document and through an infographic displaying the different adaptation tipping points and milestones that occur as the sea level rises up to 20 m. No technical limitations were found for adaptation, but many important decisions need to be taken. Although accelerated sea-level rise seems far away, it can have important consequences for short-term decisions that are required for transformative actions, since such extensive actions require more time for implementation. Other actions may also become ineffective before the end of their design life. This hackathon exercise shows that it is possible to map, within a short time frame, the issues at hand as well as potentially effective solutions. This approach can be replicated for other problems and can be useful for decision-makers who require quick but in-depth analysis of their long-term planning problems.

  15. IMP 2.0: a multi-species functional genomics portal for integration, visualization and prediction of protein functions and networks.

    Science.gov (United States)

    Wong, Aaron K; Krishnan, Arjun; Yao, Victoria; Tadych, Alicja; Troyanskaya, Olga G

    2015-07-01

    IMP (Integrative Multi-species Prediction), originally released in 2012, is an interactive web server that enables molecular biologists to interpret experimental results and to generate hypotheses in the context of a large cross-organism compendium of functional predictions and networks. The system provides biologists with a framework to analyze their candidate gene sets in the context of functional networks, expanding or refining their sets using functional relationships predicted from integrated high-throughput data. IMP 2.0 integrates updated prior knowledge and data collections from the last three years in the seven supported organisms (Homo sapiens, Mus musculus, Rattus norvegicus, Drosophila melanogaster, Danio rerio, Caenorhabditis elegans, and Saccharomyces cerevisiae) and extends function prediction coverage to include human disease. IMP identifies homologs with conserved functional roles for disease knowledge transfer, allowing biologists to analyze disease contexts and predictions across all organisms. Additionally, IMP 2.0 implements a new flexible platform for experts to generate custom hypotheses about biological processes or diseases, making sophisticated data-driven methods easily accessible to researchers. IMP does not require any registration or installation and is freely available for use at http://imp.princeton.edu. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. Predicting U.S. food demand in the 20th century: a new look at system dynamics

    Science.gov (United States)

    Moorthy, Mukund; Cellier, Francois E.; LaFrance, Jeffrey T.

    1998-08-01

    The paper describes a new methodology for predicting the behavior of macroeconomic variables. The approach is based on System Dynamics and Fuzzy Inductive Reasoning. A four-layer pseudo-hierarchical model is proposed. The bottom layer makes predictions about population dynamics, age distributions among the populace, and demographics. The second layer makes predictions about the general state of the economy, including such variables as inflation and unemployment. The third layer makes predictions about the demand for certain goods or services, such as milk products, used cars, mobile telephones, or internet services. The fourth and top layer makes predictions about the supply of such goods and services and about their prices. Each layer can be influenced by control variables whose values are only determined at higher levels. In this sense, the model is not strictly hierarchical. For example, the demand for goods at level three depends on the prices of these goods, which are only determined at level four; yet the prices are themselves influenced by the expected demand. The methodology is exemplified by means of a macroeconomic model that makes predictions about US food demand during the 20th century.

  17. The 20-20-20 Package. Reform of the ETS and cost analysis of scenarios

    International Nuclear Information System (INIS)

    Clo, S.

    2008-01-01

    This article first analyses the important improvements that the European Commission intends to bring to the ETS, its framework and its functioning. It then assesses the impact of the 20-20-20 Package on Italian industry by analysing the different cost scenarios presented in the European Commission's Impact Assessment.

  18. Analyses of electron and proton scattering to low excitation isoscalar states in 20Ne, 24Mg and 28Si

    International Nuclear Information System (INIS)

    Amos, K.; Bauhoff, W.

    1983-01-01

    Intermediate-energy inelastic proton scattering differential cross-section and polarization data from the first 2+ states in 24Mg and 28Si and from the first 4+ state in 28Si have been analysed using the Distorted Wave Approximation with large-basis models of nuclear structure. These structure models were tested by use in analyses of the longitudinal form factors obtained from inelastic electron scattering, so that analyses of the intermediate-energy (p,p') data from the same transitions are then sensitive tests of the two-nucleon t-matrix. Data from these and other first 2+ transitions in 12C and 20Ne at 49 MeV (24 MeV in the case of 20Ne) were also analysed to compare models of t-matrices at lower energies. An ancillary study of the momentum-transfer dependence of effective charges has been made, as both s-d shell and large-basis structure models have been used for comparison with form-factor data up to momentum transfers of 2.5 fm^-1. The deduced momentum dependence of the effective charges is significant

  19. Combined visual and motor evoked potentials predict multiple sclerosis disability after 20 years.

    Science.gov (United States)

    Schlaeger, Regina; Schindler, Christian; Grize, Leticia; Dellas, Sophie; Radue, Ernst W; Kappos, Ludwig; Fuhr, Peter

    2014-09-01

    The development of predictors of multiple sclerosis (MS) disability is difficult due to the complex interplay of pathophysiological and adaptive processes. The purpose of this study was to investigate whether combined evoked potential (EP) measures allow prediction of MS disability after 20 years. We examined 28 patients with clinically definite MS according to Poser's criteria with Expanded Disability Status Scale (EDSS) scores and combined visual and motor EPs at entry (T0) and at 6 (T1), 12 (T2) and 24 (T3) months, and a cranial magnetic resonance imaging (MRI) scan at T0 and T2. EDSS testing was repeated at year 14 (T4) and year 20 (T5). Spearman rank correlation was used. We performed a multivariable regression analysis to examine predictive relationships of the sum of z-transformed EP latencies (s-EPT0) and other baseline variables with EDSS at T5. We found that s-EPT0 correlated with EDSS at T5 (rho = 0.72) and allowed prediction of long-term disability in MS. © The Author(s) 2014.
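    The composite predictor used here, a sum of z-transformed EP latencies, and the Spearman correlation relating it to later EDSS are both simple to express. In this sketch the normative means and SDs are placeholders (the abstract does not give the study's normative values), and the Spearman implementation omits tie correction:

```python
def z_transform_sum(latencies, norms):
    """Sum of z-scores: each EP latency standardized against a normative
    (mean, sd) for its modality, then summed into one composite (s-EP)."""
    return sum((latencies[k] - m) / sd for k, (m, sd) in norms.items())

def spearman_rho(xs, ys):
    """Spearman rank correlation without tie correction."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

    The appeal of the z-transform is that latencies from different modalities (visual, motor) become dimensionless and can be added into a single baseline score per patient.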

  20. Validation of a Web-Based Tool to Predict the Ipsilateral Breast Tumor Recurrence (IBTR! 2.0) after Breast-Conserving Therapy for Korean Patients.

    Science.gov (United States)

    Jung, Seung Pil; Hur, Sung Mo; Lee, Se Kyung; Kim, Sangmin; Choi, Min-Young; Bae, Soo Youn; Kim, Jiyoung; Kim, Min Kuk; Kil, Won Ho; Choe, Jun-Ho; Kim, Jung-Han; Kim, Jee Soo; Nam, Seok Jin; Bae, Jeoung Won; Lee, Jeong Eon

    2013-03-01

    IBTR! 2.0 is a web-based nomogram that predicts the 10-year ipsilateral breast tumor recurrence (IBTR) rate after breast-conserving therapy. We validated this nomogram in Korean patients. The nomogram was tested on 520 Korean patients who underwent breast-conserving surgery followed by radiation therapy. Predicted and observed 10-year outcomes were compared for the entire cohort and for each group predefined by nomogram-predicted risk, from group 1 (lowest risk) to group 4 (highest risk, >10%). Overall, the 10-year predicted and observed estimates of IBTR were 5.22% and 5.70% (p=0.68). In group 1 (n=124), the predicted and observed estimates were 2.25% and 1.80% (p=0.73); in group 2 (n=177), 3.95% and 3.90% (p=0.97); in group 3 (n=181), 7.14% and 8.80% (p=0.42); and in group 4 (n=38), 11.66% and 14.90% (p=0.73), respectively. In a previous validation of this nomogram based on American patients, nomogram-predicted IBTR rates were overestimated in the high-risk subgroup. However, our results based on Korean patients showed that the observed IBTR was higher than the predicted estimates in groups 3 and 4. This difference may arise from ethnic differences, as well as from the methods used to detect IBTR and the healthcare environment. IBTR! 2.0 may be considered an acceptable nomogram for Korean patients with low-to-moderate risk of in-breast recurrence. Before widespread use, the IBTR! 2.0 nomogram needs a larger validation study and continuous modification.

  1. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta

    Directory of Open Access Journals (Sweden)

    Barbara Gasse

    2017-06-01

    Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene (MMP20) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva, and MMP20 exons and exon-intron boundaries were sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  2. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta.

    Science.gov (United States)

    Gasse, Barbara; Prasad, Megana; Delgado, Sidney; Huckert, Mathilde; Kawczynski, Marzena; Garret-Bernardin, Annelyse; Lopez-Cazaux, Serena; Bailleul-Forestier, Isabelle; Manière, Marie-Cécile; Stoetzel, Corinne; Bloch-Zupan, Agnès; Sire, Jean-Yves

    2017-01-01

    Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene (MMP20) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva, and MMP20 exons and exon-intron boundaries were sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  3. Analysing News for Stock Market Prediction

    Science.gov (United States)

    Ramalingam, V. V.; Pandian, A.; Dwivedi, shivam; Bhatt, Jigar P.

    2018-04-01

The stock market is the aggregation of all buyers and sellers of stocks, which represent ownership claims on businesses. To invest in these stocks with confidence, sound knowledge of the stocks themselves and of their present and future pricing is essential. Large amounts of data are collected and parsed to obtain this information about fluctuations in the stock market; the data can be news or public opinion in general. Recently many methods, especially methods for big unstructured data, have been used to predict stock market values. We introduce another method, focused on deriving the best statistical learning model for predicting future values. The data set used is a very large unstructured collection gathered from an online social platform, commonly known as Quindl. The data from this platform is then linked to a csv file and cleaned to obtain the information essential for stock market prediction. The method consists of carrying out NLP (Natural Language Processing) on the data to make it easier for the system to understand, and then finding and identifying the correlation between this data and stock market fluctuations. The model is implemented in the Python programming language throughout the entire project for flexibility and convenience.
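As a minimal illustration of the kind of text-to-signal pipeline the abstract describes (the word lists and scoring below are hypothetical stand-ins, not the authors' learned model), headline sentiment can be aggregated into a direction call:

```python
import re

# Hypothetical sentiment lexicons; a real model would learn weights
# from labelled historical data rather than use fixed word lists.
POSITIVE = {"beats", "growth", "record", "surge", "upgrade"}
NEGATIVE = {"miss", "lawsuit", "recall", "plunge", "downgrade"}

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def headline_score(headline):
    """Crude NLP step: net count of positive minus negative words."""
    tokens = tokenize(headline)
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

def predict_direction(headlines):
    """Predict next-day movement from aggregate headline sentiment."""
    total = sum(headline_score(h) for h in headlines)
    return "up" if total > 0 else "down" if total < 0 else "flat"

print(predict_direction(["Quarterly profit beats estimates",
                         "Analysts see record growth ahead"]))  # up
```

In practice the correlation step would be fitted statistically against observed price moves; this sketch only shows the tokenize-score-aggregate shape of such a system.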

  4. Structural Dynamic Analyses And Test Predictions For Spacecraft Structures With Non-Linearities

    Science.gov (United States)

    Vergniaud, Jean-Baptiste; Soula, Laurent; Newerla, Alfred

    2012-07-01

The overall objective of the mechanical development and verification process is to ensure that the spacecraft structure is able to sustain the mechanical environments encountered during launch. In general, spacecraft structures are a priori assumed to behave linearly, i.e. the responses to a static load or dynamic excitation will increase or decrease in proportion to the amplitude of the load or excitation induced. However, past experience has shown that various non-linearities can exist in spacecraft structures, and the consequences of their dynamic effects can significantly affect the development and verification process. Current processes are mainly adapted to linear spacecraft structural behaviour; no clear rules exist for dealing with major structural non-linearities. They are handled outside the process by individual analysis and margin policy, and by analyses after tests to justify the CLA coverage. Non-linearities primarily affect the current spacecraft development and verification process in two respects. Prediction of flight loads by launcher/satellite coupled loads analyses (CLA): only linear satellite models are delivered for performing CLA, and no well-established rules exist for properly linearizing a model when non-linearities are present. The potential impact of the linearization on the results of the CLA has not yet been properly analyzed, so it is difficult to assess whether CLA results will cover actual flight levels. Management of satellite verification tests: the CLA results generated with a linear satellite FEM are assumed to be flight representative. If internal non-linearities are present in the tested satellite, it may be difficult to determine which input level must be applied to cover satellite internal loads. The non-linear behaviour can also disturb the shaker control, putting the satellite at risk by potentially imposing excessive levels.
This paper presents the results of a test campaign performed in

  5. Prediction methods environmental-effect reporting

    International Nuclear Information System (INIS)

    Jonker, R.J.; Koester, H.W.

    1987-12-01

This report provides a survey of prediction methods which can be applied to the calculation of emissions in nuclear-reactor accidents, in the framework of environmental-effect reports (Dutch m.e.r.) or risk analyses. Emissions during normal operation are also important for m.e.r.; these can be derived from the measured emissions of power plants in operation, and data concerning the latter are reported. The report consists of an introduction to reactor technology, including a description of some reactor types, the corresponding fuel cycle and dismantling scenarios; a discussion of risk analyses for nuclear power plants and the physical processes which can play a role during accidents; a discussion of the prediction methods to be employed and the expected developments in this area; and some background information. (author). 145 refs.; 21 figs.; 20 tabs

  6. Combined analyses of 20 common obesity susceptibility variants

    DEFF Research Database (Denmark)

    Sandholt, Camilla Helene; Sparsø, Thomas; Grarup, Niels

    2010-01-01

Genome-wide association studies and linkage studies have identified 20 validated genetic variants associated with obesity and/or related phenotypes. The variants are common, and they individually exhibit small-to-modest effect sizes.

  7. Genomic Prediction of Gene Bank Wheat Landraces

    Directory of Open Access Journals (Sweden)

    José Crossa

    2016-07-01

This study examines genomic prediction within 8416 Mexican landrace accessions and 2403 Iranian landrace accessions stored in gene banks. The Mexican and Iranian collections were evaluated in separate field trials, including an optimum environment for several traits, and in two separate environments (drought, D, and heat, H) for the highly heritable traits days to heading (DTH) and days to maturity (DTM). Analyses accounting and not accounting for population structure were performed. Genomic prediction models include genotype × environment interaction (G × E). Two alternative prediction strategies were studied: (1) random cross-validation of the data into 20% training (TRN) and 80% testing (TST) sets (TRN20-TST80), and (2) two types of core sets, "diversity" and "prediction", including 10% and 20%, respectively, of the total collections. Accounting for population structure decreased prediction accuracy by 15-20% as compared to the accuracy obtained when not accounting for population structure. Accounting for population structure gave prediction accuracies for traits evaluated in one environment for TRN20-TST80 that ranged from 0.407 to 0.677 for Mexican landraces, and from 0.166 to 0.662 for Iranian landraces. Prediction accuracy of the 20% diversity core set was similar to the accuracies obtained for TRN20-TST80, ranging from 0.412 to 0.654 for Mexican landraces, and from 0.182 to 0.647 for Iranian landraces. The predictive core set gave prediction accuracy similar to the diversity core set for the Mexican collections, but slightly lower for the Iranian collections. Prediction accuracy when incorporating G × E for DTH and DTM for Mexican landraces for TRN20-TST80 was around 0.60, which is greater than without the G × E term. For Iranian landraces, accuracies were 0.55 for the G × E model with TRN20-TST80. Results show promising prediction accuracies for potential use in germplasm enhancement and rapid introgression of exotic germplasm
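The TRN20-TST80 scheme amounts to a random 20/80 partition of the accessions. A minimal sketch of that split (generic accession IDs assumed):

```python
import random

def trn_tst_split(accessions, trn_frac=0.2, seed=1):
    """Randomly assign a fraction of accessions to training (TRN) and
    the remainder to testing (TST), as in the TRN20-TST80 scheme."""
    rng = random.Random(seed)
    shuffled = accessions[:]
    rng.shuffle(shuffled)
    n_trn = round(trn_frac * len(shuffled))
    return shuffled[:n_trn], shuffled[n_trn:]

trn, tst = trn_tst_split(list(range(100)))
print(len(trn), len(tst))  # 20 80
```

In the study this partition is repeated over many random draws and the genomic prediction model is refit each time; the sketch shows only a single draw.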

  8. Measurement of the analysing power T20 in the backward elastic scattering d-vector.p in the region of Δ-excitation and theoretical analysis of this reaction

    International Nuclear Information System (INIS)

    Boudard, A.

    1984-03-01

We have measured the analysing power T20 in the backward elastic scattering d-vector p for 16 deuteron energies from 300 MeV to 2300 MeV. This is the region of the observed bump in the backward excitation function of the cross section, usually thought to be a signature of a Δ(3/2,3/2⁺) dynamically excited in the intermediate state. We have also measured Ay and Ayy from 70° to 180° for T_d = 1200 MeV. We have compared both T20 and the backward cross section with a coherent sum of direct neutron exchange (ONT) and Δ excitation by intermediate exchange of π and ρ mesons (TME). The overall shape of the cross section is reproduced. Unlike the earlier measurement from Argonne, there is a deep minimum in T20 at T_d = 600 MeV, in agreement with the predictions of direct exchange models. However, an additional structure producing a second minimum at T_d = 1400 MeV (√s = 3240 MeV) is never reproduced by our calculations. This suggests either that refinements in the Δ treatment are needed or that a new reaction mechanism (resonance) takes place in that region

  9. A Wear Rule and Cutter Life Prediction Model of a 20-in. TBM Cutter for Granite: A Case Study of a Water Conveyance Tunnel in China

    Science.gov (United States)

    Liu, Quansheng; Liu, Jianping; Pan, Yucong; Zhang, Xiaoping; Peng, Xingxin; Gong, Qiuming; Du, Lijie

    2017-05-01

Disc cutter wear is one of the comprehensive results of the rock-machine interaction in tunnel boring machine (TBM) tunneling. Replacement of a disc cutter is a time-consuming and costly activity that can significantly reduce TBM utilization (U) and advance rate (AR), and it has a major effect on the total time and cost of TBM tunneling projects. Accurate prediction of cutter life is therefore essential. Most cutter wear prediction models are only suitable for 17-in. or smaller disc cutters, yet the use of large-diameter disc cutters has become an irresistible trend for large-section hard rock TBMs. This study attempts to reveal the genuine wear rule of a 20-in. disc cutter and to develop a new empirical model for predicting cutter life in granite, based on field data collected from a water conveyance tunnel constructed by the TBM tunneling method in China. The field data, including the actual cutter wear and the geological parameters along the studied tunnel, were compiled in a special database and subjected to statistical analysis to derive correlations between common intact rock parameters and disc cutter life. The resulting equations were developed from data on massive to very massive granite with a UCS range of 40-100 MPa, and they can be applied to assess the life of a 20-in. disc cutter in similar hard rock projects with similar rock strengths and rock abrasivities.

  10. Application of ASTEC V2.0 to severe accident analyses for German KONVOI type reactors

    International Nuclear Information System (INIS)

    Nowack, H.; Erdmann, W.; Reinke, N.

    2011-01-01

    The integral code ASTEC is jointly developed by IRSN (Institut de Radioprotection et de Surete Nucleaire, France) and GRS (Germany). Its main objective is to simulate severe accident scenarios in PWRs from the initiating event up to the release of radioactive material into the environment. This paper describes the ASTEC modeling approach and the nodalisation of a KONVOI type PWR as an application example. Results from an integral severe accident study are presented and shortcomings as well as advantages are outlined. As a conclusion, the applicability of ASTEC V2.0 for deterministic severe accident analyses used for PSA level 2 and Severe Accident Management studies will be assessed. (author)

  11. AUTO-MUTE 2.0: A Portable Framework with Enhanced Capabilities for Predicting Protein Functional Consequences upon Mutation.

    Science.gov (United States)

    Masso, Majid; Vaisman, Iosif I

    2014-01-01

    The AUTO-MUTE 2.0 stand-alone software package includes a collection of programs for predicting functional changes to proteins upon single residue substitutions, developed by combining structure-based features with trained statistical learning models. Three of the predictors evaluate changes to protein stability upon mutation, each complementing a distinct experimental approach. Two additional classifiers are available, one for predicting activity changes due to residue replacements and the other for determining the disease potential of mutations associated with nonsynonymous single nucleotide polymorphisms (nsSNPs) in human proteins. These five command-line driven tools, as well as all the supporting programs, complement those that run our AUTO-MUTE web-based server. Nevertheless, all the codes have been rewritten and substantially altered for the new portable software, and they incorporate several new features based on user feedback. Included among these upgrades is the ability to perform three highly requested tasks: to run "big data" batch jobs; to generate predictions using modified protein data bank (PDB) structures, and unpublished personal models prepared using standard PDB file formatting; and to utilize NMR structure files that contain multiple models.

  13. Factors predicting labor induction success: a critical analysis.

    Science.gov (United States)

    Crane, Joan M G

    2006-09-01

Because of the risk of failed induction of labor, a variety of maternal and fetal factors, as well as screening tests, have been suggested to predict labor induction success. Certain characteristics of the woman (including parity, age, weight, height and body mass index) and of the fetus (including birth weight and gestational age) are associated with the success of labor induction, with young, parous women of greater height and lower weight having a higher rate of induction success. Lower fetal birth weight and greater gestational age are also associated with increased induction success. The condition of the cervix at the start of induction is an important predictor, the modified Bishop score being a widely used scoring system; the most important element of the Bishop score is dilatation. Other predictors, including transvaginal ultrasound (TVUS) and biochemical markers such as fetal fibronectin (fFN), have been suggested. Meta-analyses of studies identified from MEDLINE, PubMed, and EMBASE and published from 1990 to October 2005 were performed to evaluate the use of TVUS and fFN in predicting labor induction success in women at term with singleton gestations. Both TVUS and Bishop score predicted successful induction [likelihood ratio (LR)=1.82, 95% confidence interval (CI)=1.51-2.20 and LR=2.10, 95%CI=1.67-2.64, respectively]. Likewise, fFN and Bishop score predicted successful induction (LR=1.49, 95%CI=1.20-1.85, and LR=2.62, 95%CI=1.88-3.64, respectively). Although TVUS and fFN predicted successful labor induction, neither has been shown to be superior to the Bishop score. Further research is needed to evaluate these potential predictors and insulin-like growth factor binding protein-1 (IGFBP-1), another potential biochemical marker.
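The likelihood ratios quoted above relate a test's characteristics to the change in the odds of a successful induction. A small sketch of that arithmetic, using illustrative numbers rather than the meta-analysis values:

```python
def positive_likelihood_ratio(sensitivity, specificity):
    """LR+ = sensitivity / (1 - specificity): how much a positive test
    result multiplies the odds of the outcome (here, induction success)."""
    return sensitivity / (1.0 - specificity)

def post_test_probability(pre_test_prob, lr):
    """Convert pre-test probability to post-test probability via odds."""
    odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)

# Illustrative values only (not taken from the meta-analysis):
print(round(positive_likelihood_ratio(0.70, 0.65), 2))  # 2.0
print(round(post_test_probability(0.5, 2.0), 2))        # 0.67
```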

  14. Condition based maintenance of a 20 kV-PE/XLPE-insulated cable network using the IRC-Analysis; Zustandsorientierte Instandhaltung eines polymerisolierten 20-kV-Kabelnetzes mit der IRC-Analyse. Moderne Diagnostik reduziert Stoerungsrate

    Energy Technology Data Exchange (ETDEWEB)

    Hoff, G.; Kranz, H.G. [BUGH Wuppertal (Germany). Labs. fuer Hochspannungstechnik; Beigert, M.; Petzold, F. [Seba Dynatronic Mess- und Ortungstechnik GmbH, Baunach (Germany); Kneissl, C. [Bereich Konzeption und Messtechnik, Lech Elektrizitaetswerke AG, Augsburg (Germany)

    2001-10-22

Preventive maintenance of a polymer-insulated cable network requires a non-destructive assessment of the condition of the buried PE/XLPE cables. This contribution presents a condition-based maintenance concept based on the IRC-Analysis. With this concept, a major German utility was able to reduce the number of failures in part of its 20 kV cable network, breaking the previous trend of steadily increasing faults. (orig.)

  15. Visual Versus Fully Automated Analyses of 18F-FDG and Amyloid PET for Prediction of Dementia Due to Alzheimer Disease in Mild Cognitive Impairment.

    Science.gov (United States)

    Grimmer, Timo; Wutz, Carolin; Alexopoulos, Panagiotis; Drzezga, Alexander; Förster, Stefan; Förstl, Hans; Goldhardt, Oliver; Ortner, Marion; Sorg, Christian; Kurz, Alexander

    2016-02-01

    Biomarkers of Alzheimer disease (AD) can be imaged in vivo and can be used for diagnostic and prognostic purposes in people with cognitive decline and dementia. Indicators of amyloid deposition such as (11)C-Pittsburgh compound B ((11)C-PiB) PET are primarily used to identify or rule out brain diseases that are associated with amyloid pathology but have also been deployed to forecast the clinical course. Indicators of neuronal metabolism including (18)F-FDG PET demonstrate the localization and severity of neuronal dysfunction and are valuable for differential diagnosis and for predicting the progression from mild cognitive impairment (MCI) to dementia. It is a matter of debate whether to analyze these images visually or using automated techniques. Therefore, we compared the usefulness of both imaging methods and both analyzing strategies to predict dementia due to AD. In MCI participants, a baseline examination, including clinical and imaging assessments, and a clinical follow-up examination after a planned interval of 24 mo were performed. Of 28 MCI patients, 9 developed dementia due to AD, 2 developed frontotemporal dementia, and 1 developed moderate dementia of unknown etiology. The positive and negative predictive values and the accuracy of visual and fully automated analyses of (11)C-PiB for the prediction of progression to dementia due to AD were 0.50, 1.00, and 0.68, respectively, for the visual and 0.53, 1.00, and 0.71, respectively, for the automated analyses. Positive predictive value, negative predictive value, and accuracy of fully automated analyses of (18)F-FDG PET were 0.37, 0.78, and 0.50, respectively. Results of visual analyses were highly variable between raters but were superior to automated analyses. Both (18)F-FDG and (11)C-PiB imaging appear to be of limited use for predicting the progression from MCI to dementia due to AD in short-term follow-up, irrespective of the strategy of analysis. 
On the other hand, amyloid PET is extremely useful to
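The positive predictive value, negative predictive value, and accuracy reported above follow directly from a 2×2 table of predicted versus observed progression. A minimal sketch with illustrative counts (not the study's raw data):

```python
def predictive_values(tp, fp, tn, fn):
    """PPV, NPV and accuracy from a 2x2 table of predicted vs. observed
    progression to dementia (tp/fp/tn/fn = true/false positives/negatives)."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    acc = (tp + tn) / (tp + fp + tn + fn)
    return ppv, npv, acc

# Illustrative counts only: 5 true positives, 5 false positives,
# 9 true negatives, 0 false negatives.
ppv, npv, acc = predictive_values(5, 5, 9, 0)
print(round(ppv, 2), round(npv, 2), round(acc, 2))
```

A pattern like the one reported (PPV 0.50, NPV 1.00) arises when every non-progressor with a negative scan stays stable but only half of the positive scans progress within follow-up.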

  16. Predictive analyses of flow-induced vibration and fretting wear in steam generator tubes

    International Nuclear Information System (INIS)

    Axisa, F.

    1989-01-01

Maintaining the service life of PWR steam generators under highly reliable conditions requires a complex design to prevent various damaging processes, including those related to flow-induced vibration. Predictive analyses have to rely on numerical tools to compute the vibratory response of multi-supported tubes, in association with experimental data and semi-empirical relationships for quantifying flow-induced excitation mechanisms and tube damaging processes. In the presence of loose supports, tube dynamics becomes highly nonlinear in nature. To deal with such problems, CEA and FRAMATOME developed a computer program called GERBOISE. This paper provides a short description of an experimental program currently in progress at CEN Saclay to validate the numerical methods implemented in GERBOISE. According to the results obtained so far, reasonable agreement is obtained between experiment and numerical simulation, especially where averaged quantities are concerned

  17. Comparison of the prognostic and predictive utilities of the 21-gene Recurrence Score assay and Adjuvant! for women with node-negative, ER-positive breast cancer: results from NSABP B-14 and NSABP B-20.

    Science.gov (United States)

    Tang, Gong; Shak, Steven; Paik, Soonmyung; Anderson, Stewart J; Costantino, Joseph P; Geyer, Charles E; Mamounas, Eleftherios P; Wickerham, D Lawrence; Wolmark, Norman

    2011-05-01

    The Oncotype DX Recurrence Score (RS) is a validated genomic predictor of outcome and response to adjuvant chemotherapy in ER-positive breast cancer. Adjuvant! was developed using SEER registry data and results from the Early Breast Cancer Clinical Trialists' overview analyses to estimate outcome and benefit from adjuvant hormonal therapy and chemotherapy. In this report we compare the prognostic and predictive utility of these two tools in node-negative, ER-positive breast cancer. RS and Adjuvant! results were available from 668 tamoxifen-treated NSABP B-14 patients, 227 tamoxifen-treated NSABP B-20 patients, and 424 chemotherapy plus tamoxifen-treated B-20 patients. Adjuvant! results were also available from 1952 B-20 patients. The primary endpoint was distant recurrence-free interval (DRFI). Cox proportional hazards models were used to compare the prognostic and predictive utility of RS and Adjuvant!. Both RS (P < 0.001) and Adjuvant! (P = 0.002) provided strong independent prognostic information in tamoxifen-treated patients. Combining RS and individual clinicopathologic characteristics provided greater prognostic discrimination than combining RS and the composite Adjuvant!. In the B-20 cohort with RS results (n = 651), RS was significantly predictive of chemotherapy benefit (interaction P = 0.031 for DRFI, P = 0.011 for overall survival [OS], P = 0.082 for disease-free survival [DFS]), but Adjuvant! was not (interaction P = 0.99, P = 0.311, and P = 0.357, respectively). However, in the larger B-20 sub-cohort (n = 1952), Adjuvant! was significantly predictive of chemotherapy benefit for OS (interaction P = 0.009) but not for DRFI (P = 0.219) or DFS (P = 0.099). Prognostic estimates can be optimized by combining RS and clinicopathologic information instead of simply combining RS and Adjuvant!. RS should be used for estimating relative chemotherapy benefit.

  18. A cycle simulation model for predicting the performance of a diesel engine fuelled by diesel and biodiesel blends

    International Nuclear Information System (INIS)

    Gogoi, T.K.; Baruah, D.C.

    2010-01-01

Among the alternative fuels, biodiesel and its blends are considered suitable and the most promising fuels for diesel engines, and the properties of biodiesel are similar to those of diesel. Many researchers have experimentally evaluated the performance characteristics of conventional diesel engines fuelled by biodiesel and its blends. However, experiments require enormous effort, money and time. Hence, a cycle simulation model incorporating a thermodynamics-based single-zone combustion model was developed to predict diesel engine performance. The effects of engine speed and compression ratio on brake power and brake thermal efficiency are analysed through the model. The fuels considered for the analysis are diesel and 20%, 40% and 60% blends of diesel with biodiesel derived from Karanja oil (Pongamia glabra). The model predicts similar performance for diesel and the 20% and 40% blends; with the 60% blend, however, it reveals better performance in terms of brake power and brake thermal efficiency.
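Single-zone cycle simulations typically parameterize combustion with a prescribed heat-release law, the Wiebe function being one common choice. The sketch below is a generic illustration with assumed parameter values, not the authors' model:

```python
import math

def wiebe_burn_fraction(theta, theta0=-10.0, duration=60.0, a=5.0, m=2.0):
    """Cumulative mass fraction burned versus crank angle theta (degrees)
    from the Wiebe function: x = 1 - exp(-a * ((theta-theta0)/duration)^(m+1)).
    theta0 is start of combustion; a and m are shape parameters."""
    if theta <= theta0:
        return 0.0
    x = (theta - theta0) / duration
    if x >= 1.0:
        x = 1.0
    return 1.0 - math.exp(-a * x ** (m + 1.0))

# Near the end of combustion the burned fraction approaches 1 - exp(-a).
print(round(wiebe_burn_fraction(50.0), 3))  # 0.993
```

In a full cycle model this burn fraction drives the heat-release term of the single-zone energy balance, from which cylinder pressure, brake power and thermal efficiency are then computed.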

  19. On the requirement for remodelling the spent nuclear fuel transportation casks for research reactors. A review of the drop impact analyses of JRC-80Y-20T

    International Nuclear Information System (INIS)

    2005-07-01

The Japan Atomic Energy Research Institute (JAERI) constructed two stainless steel transportation casks, JRC-80Y-20T, for spent nuclear fuels of research reactors and had used them for transportation since 1981. An application for a modification of the design, to allow transportation of silicide fuels, was submitted to the United States of America (USA). Additional analyses employing the impact analysis code LS-DYNA, which is often used for safety analysis, were submitted by JAERI to the USA in 2003 to show the integrity of the packages; the casks were still not approved, because inelastic deformation occurred on the surface of the lid in contact with the body. To resolve this problem concerning design approval of the transportation casks, a review group was formed in June 2004. The group examined the impact analyses by reviewing the input data and performing sensitivity analyses. As the drop impact analyses were found to be practically reasonable, it was concluded that approval by the USA for the transportation casks could not be obtained merely by revising the analyses; remodelling of the casks is therefore required. (author)

  20. Analyses of hypothetical nuclear criticality excursions in 10- and 20-MW freezer/sublimer vessels

    International Nuclear Information System (INIS)

    Haught, C.F.; Jordan, W.C.; Basoglu, B.; Dodds, H.L.; Wilkinson, A.D.

    1995-01-01

A theoretical model is used to predict the consequences of a postulated hypothetical nuclear criticality excursion in a freezer/sublimer (F/S). Previous work has shown that an intrusion of water into a F/S may result in a critical configuration. A first attempt is made to model the neutronic and thermal-hydraulic phenomena occurring during a criticality excursion involving both uranium hexafluoride (UF6) and uranyl fluoride (UO2F2) solution, which is present in the F/S during upset conditions. The model employs point neutronics coupled with simple thermal hydraulics. Reactivity feedback from changes in the properties of the system is included in the model. The excursion is studied in a 10-MW F/S with an initial load of 3,500 kg of 5 wt% enriched UF6 and in a 20-MW F/S with an initial load of 6,800 kg of 2 wt% enriched UF6. The magnitude of the fission release determined in this work is 5.93 x 10^18 fissions in the 10-MW F/S and 4.21 x 10^18 fissions in the 20-MW F/S. In order to demonstrate the reliability of the techniques used in this work, a limited validation study was conducted by comparing the fission release and peak fission rate determined in this work with experimental results for a limited number of experiments. The agreement between calculations and experiments in the validation study is considered satisfactory. The calculational results for the hypothetical accidents in the two F/S vessels appear reasonable
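Point neutronics of the kind used here reduces to the point-kinetics equations. A one-delayed-group forward-Euler sketch (generic parameter values, no reactivity feedback or thermal hydraulics) looks like:

```python
def point_kinetics(rho, beta=0.0065, lam=0.08, gen_time=1e-4,
                   t_end=1.0, dt=1e-4):
    """Integrate the one-delayed-group point-kinetics equations with
    forward Euler for constant reactivity rho; returns the relative
    neutron density n(t_end), starting from equilibrium at n = 1."""
    n = 1.0
    c = beta * n / (gen_time * lam)   # equilibrium precursor concentration
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / gen_time) * n + lam * c
        dc = (beta / gen_time) * n - lam * c
        n += dn * dt
        c += dc * dt
    return n

print(round(point_kinetics(0.0), 6))  # critical: n stays at 1.0
```

An excursion model like the one in the abstract would make rho a function of system temperature, void and geometry, closing the loop between the kinetics and the thermal-hydraulic state.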

  1. Functional enrichment analyses and construction of functional similarity networks with high confidence function prediction by PFP

    Directory of Open Access Journals (Sweden)

    Kihara Daisuke

    2010-05-01

Background: A new paradigm of biological investigation takes advantage of technologies that produce large high-throughput datasets, including genome sequences, interactions of proteins, and gene expression. The ability of biologists to analyze and interpret such data relies on functional annotation of the included proteins, but even in highly characterized organisms many proteins can lack the functional evidence necessary to infer their biological relevance. Results: Here we have applied high-confidence function predictions from our automated prediction system, PFP, to three genome sequences, Escherichia coli, Saccharomyces cerevisiae, and Plasmodium falciparum (malaria). The number of annotated genes is increased by PFP to over 90% for all of the genomes. Using the large coverage of the function annotation, we introduce functional similarity networks which represent the functional space of the proteomes. Four different functional similarity networks are constructed for each proteome, one each by considering similarity in a single Gene Ontology (GO) category, i.e. Biological Process, Cellular Component, and Molecular Function, and another one by considering overall similarity with the funSim score. The functional similarity networks are shown to have higher modularity than the protein-protein interaction network. Moreover, the funSim score network is distinct from the single GO-score networks in showing a higher clustering degree exponent value, and thus has a higher tendency to be hierarchical. In addition, examining function assignments to the protein-protein interaction network and local regions of genomes has identified numerous cases where subnetworks or local regions have functionally coherent proteins. These results will help in interpreting interactions of proteins and gene orders in a genome. Several examples of both analyses are highlighted. Conclusion: The analyses demonstrate that applying high confidence predictions from PFP

  2. Prediction of the systemic exposure to oral 9-amino-20(S)-camptothecin using single-sample analysis

    NARCIS (Netherlands)

    Sparreboom, A.; de Jonge, M. J.; Punt, C. J.; Loos, W. J.; Nooter, K.; Stoter, G.; Porro, M. G.; Verweij, J.

    1999-01-01

The purpose of this study was to develop and validate limited-sampling strategies for prediction of the areas under the plasma concentration-time curves (AUCs) of the lactone and total (i.e., lactone plus carboxylate) forms of the novel topoisomerase-I inhibitor 9-amino-20(S)-camptothecin (9-AC).
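A limited-sampling strategy of this kind is typically a regression of the full AUC on one (or a few) timed plasma concentrations. A minimal ordinary-least-squares sketch with hypothetical training data (not the study's pharmacokinetic model):

```python
def fit_limited_sampling(concentrations, aucs):
    """Ordinary least squares for AUC ~ a + b * C(t_single), the usual
    form of a single-sample limited-sampling model."""
    n = len(concentrations)
    mean_c = sum(concentrations) / n
    mean_a = sum(aucs) / n
    sxy = sum((c - mean_c) * (y - mean_a) for c, y in zip(concentrations, aucs))
    sxx = sum((c - mean_c) ** 2 for c in concentrations)
    b = sxy / sxx
    a = mean_a - b * mean_c
    return a, b

# Hypothetical training data: single-timepoint plasma levels vs. observed AUCs.
a, b = fit_limited_sampling([10.0, 20.0, 30.0], [110.0, 210.0, 310.0])
print(round(a, 1), round(b, 1))  # 10.0 10.0
```

Validation then consists of applying the fitted (a, b) to an independent patient set and comparing predicted with observed AUCs.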

  3. Predictive value of ocular trauma score in open globe combat eye injuries

    International Nuclear Information System (INIS)

    Islam, Q.

    2016-01-01

Prediction of the final visual outcome in ocular injuries is of paramount importance, and various prognostic models have been proposed to predict final visual outcome. The objective of this study was to validate the predictive value of the ocular trauma score (OTS) in patients with combat-related open globe injuries and to evaluate the factors affecting the final visual outcome. Methods: Data of 93 patients admitted to AFIO Rawalpindi between Jan 2010 and June 2014 with combat-related open globe ocular injuries were analysed. Initial and final best corrected visual acuity (BCVA) was categorized as No Light Perception (NLP), Light Perception (LP) to Hand Movement (HM), 1/200-19/200, 20/200-20/50, and ≥20/40. The OTS was calculated for each eye by assigning numerical raw points to six variables, and the scores were then stratified into five OTS categories. Results: Mean age of the study population was 28.77 ± 8.37 years. Presenting visual acuity was <20/200 (6/60) in 103 (96.23%) eyes. However, a final BCVA of ≥20/40 (6/12) was achieved in 18 (16.82%) eyes, while 72 (67.28%) eyes had a final BCVA of <20/200 (6/60). Final visual outcomes in our study were similar to those in the OTS study, except for NLP in category 1 (81% vs. 74%) and ≥20/40 in category 3 (30% vs. 41%). The OTS model predicted visual survival (LP or better) with a sensitivity of 94.80% and predicted no vision (NLP) with a specificity of 100%. Conclusion: The OTS is a reliable tool for assessing ocular injuries and predicting the final visual outcome at the outset. (author)
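The OTS stratification sums raw points for six variables and maps the sum to a category. The cut-points below are the commonly tabulated OTS ranges; they are an assumption of this sketch and should be checked against the original OTS publication before any clinical use:

```python
import bisect

# Assumed OTS category upper bounds (raw-point sums), as commonly tabulated:
# category 1: 0-44, 2: 45-65, 3: 66-80, 4: 81-91, 5: 92-100.
_UPPER_BOUNDS = [44, 65, 80, 91, 100]

def ots_category(raw_points):
    """Map an OTS raw-point sum (0-100) to a category from 1 to 5."""
    if not 0 <= raw_points <= 100:
        raise ValueError("raw points must be in 0-100")
    return bisect.bisect_left(_UPPER_BOUNDS, raw_points) + 1

print(ots_category(70), ots_category(95))  # 3 5
```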

  4. Autonomous and controlled motivational regulations for multiple health-related behaviors: between- and within-participants analyses

    Science.gov (United States)

    Hagger, M.S.; Hardcastle, S.J.; Chater, A.; Mallett, C.; Pal, S.; Chatzisarantis, N.L.D.

    2014-01-01

    Self-determination theory has been applied to the prediction of a number of health-related behaviors with self-determined or autonomous forms of motivation generally more effective in predicting health behavior than non-self-determined or controlled forms. Research has been confined to examining the motivational predictors in single health behaviors rather than comparing effects across multiple behaviors. The present study addressed this gap in the literature by testing the relative contribution of autonomous and controlling motivation to the prediction of a large number of health-related behaviors, and examining individual differences in self-determined motivation as a moderator of the effects of autonomous and controlling motivation on health behavior. Participants were undergraduate students (N = 140) who completed measures of autonomous and controlled motivational regulations and behavioral intention for 20 health-related behaviors at an initial occasion with follow-up behavioral measures taken four weeks later. Path analysis was used to test a process model for each behavior in which motivational regulations predicted behavior mediated by intentions. Some minor idiosyncratic findings aside, between-participants analyses revealed significant effects for autonomous motivational regulations on intentions and behavior across the 20 behaviors. Effects for controlled motivation on intentions and behavior were relatively modest by comparison. Intentions mediated the effect of autonomous motivation on behavior. Within-participants analyses were used to segregate the sample into individuals who based their intentions on autonomous motivation (autonomy-oriented) and controlled motivation (control-oriented). Replicating the between-participants path analyses for the process model in the autonomy- and control-oriented samples did not alter the relative effects of the motivational orientations on intention and behavior. Results provide evidence for consistent effects

  5. Cloning and molecular analyses of a gibberellin 20-oxidase gene expressed specifically in developing seeds of watermelon.

    Science.gov (United States)

    Kang, H G; Jun, S H; Kim, J; Kawaide, H; Kamiya, Y; An, G

    1999-10-01

    To understand the biosynthesis and functional role of gibberellins (GAs) in developing seeds, we isolated Cv20ox, a cDNA clone from watermelon (Citrullus lanatus) that shows significant amino acid homology with GA 20-oxidases. The complementary DNA clone was expressed in Escherichia coli as a fusion protein, which oxidized GA12 at C-20 to the C19 compound GA9, a precursor of bioactive GAs. RNA-blot analysis showed that the Cv20ox gene was expressed specifically in developing seeds. The gene was strongly expressed in the integument tissues, and it was also expressed weakly in inner seed tissues. In parthenocarpic fruits induced by 1-(2-chloro-4-pyridyl)-3-phenylurea treatment, the expression pattern of Cv20ox did not change, indicating that the GA 20-oxidase gene is expressed primarily in the maternal cells of developing seeds. The promoter of Cv20ox was isolated and fused to the beta-glucuronidase (GUS) gene. In a transient expression system, beta-glucuronidase staining was detectable only in the integument tissues of developing watermelon seeds.

  6. A PREDICTABLE MULTI-THREADED MAIN-MEMORY STORAGE MANAGER

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper introduces the design and implementation of a predictable multi-threaded main-memory storage manager (CS20), and emphasizes the database service mediator (DSM), an operation prediction model using exponential averaging. The memory manager, indexing, and lock manager in CS20 are also presented briefly. CS20 has been embedded in a mobile telecommunication service system. In practice, DSM effectively controls system load and hence improves the real-time characteristics of data access.
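
The paper does not give DSM's exact formula; a generic exponential-averaging predictor of the kind described would look like the following sketch (the smoothing factor and the latency stream are illustrative):

```python
class ExpAveragePredictor:
    """One-step predictor using exponential averaging:
    s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""

    def __init__(self, alpha=0.3, initial=0.0):
        self.alpha = alpha
        self.estimate = initial

    def observe(self, x):
        # Blend the new observation into the running estimate.
        self.estimate = self.alpha * x + (1 - self.alpha) * self.estimate
        return self.estimate

    def predict(self):
        # The prediction for the next value is the current smoothed estimate.
        return self.estimate

# Feed a stream of illustrative per-operation service times (ms).
pred = ExpAveragePredictor(alpha=0.3)
for latency_ms in [12.0, 10.5, 11.2, 30.0, 10.8]:
    pred.observe(latency_ms)
```

Higher `alpha` weights recent operations more heavily, reacting faster to load spikes; lower `alpha` smooths transient bursts, which is the usual trade-off in admission-control predictors.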

  7. When Bitcoin encounters information in an online forum: Using text mining to analyse user opinions and predict value fluctuation.

    Directory of Open Access Journals (Sweden)

    Young Bin Kim

    Bitcoin is an online currency that is used worldwide to make online payments. It has consequently become an investment vehicle in itself and is traded in a way similar to other open currencies. The ability to predict the price fluctuation of Bitcoin would therefore facilitate future investment and payment decisions. In order to predict the price fluctuation of Bitcoin, we analyse the comments posted in the Bitcoin online forum. Unlike most research on Bitcoin-related online forums, which is limited to simple sentiment analysis and does not pay sufficient attention to note-worthy user comments, our approach involved extracting keywords from Bitcoin-related user comments posted on the online forum with the aim of analytically predicting the price and extent of transaction fluctuation of the currency. The effectiveness of the proposed method is validated based on Bitcoin online forum data ranging over a period of 2.8 years from December 2013 to September 2016.

  8. When Bitcoin encounters information in an online forum: Using text mining to analyse user opinions and predict value fluctuation.

    Science.gov (United States)

    Kim, Young Bin; Lee, Jurim; Park, Nuri; Choo, Jaegul; Kim, Jong-Hyun; Kim, Chang Hun

    2017-01-01

    Bitcoin is an online currency that is used worldwide to make online payments. It has consequently become an investment vehicle in itself and is traded in a way similar to other open currencies. The ability to predict the price fluctuation of Bitcoin would therefore facilitate future investment and payment decisions. In order to predict the price fluctuation of Bitcoin, we analyse the comments posted in the Bitcoin online forum. Unlike most research on Bitcoin-related online forums, which is limited to simple sentiment analysis and does not pay sufficient attention to note-worthy user comments, our approach involved extracting keywords from Bitcoin-related user comments posted on the online forum with the aim of analytically predicting the price and extent of transaction fluctuation of the currency. The effectiveness of the proposed method is validated based on Bitcoin online forum data ranging over a period of 2.8 years from December 2013 to September 2016.
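
As a rough illustration of keyword-based scoring (not the authors' actual pipeline), one can count bullish and bearish keywords in forum comments and derive a direction signal; the keyword sets below are invented, whereas a real system would learn them from data:

```python
import re
from collections import Counter

# Invented keyword lexicons, for illustration only.
BULLISH = {"buy", "rally", "moon", "up"}
BEARISH = {"sell", "crash", "dump", "down"}

def keyword_signal(comments):
    """Score forum comments: a positive score suggests upward fluctuation."""
    counts = Counter()
    for text in comments:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word in BULLISH:
                counts["bullish"] += 1
            elif word in BEARISH:
                counts["bearish"] += 1
    total = counts["bullish"] + counts["bearish"]
    score = 0.0 if total == 0 else (counts["bullish"] - counts["bearish"]) / total
    return score, counts

score, counts = keyword_signal(
    ["Time to buy before the rally", "I might sell tomorrow"])
```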

  9. Predictive Engineering Tools for Injection-Molded Long-Carbon-Thermoplastic Composites: Weight and Cost Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fifield, Leonard S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gandhi, Umesh N. [Toyota Research Inst. North America, Ann Arbor, MI (United States); Mori, Steven [MAGNA Exteriors and Interiors Corporation, Aurora, ON (Canada); Wollan, Eric J. [PlastiComp, Inc., Winona, MN (United States)

    2016-08-01

    This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, PNNL, in collaboration with Toyota and Magna, developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis the manufacturing cost, including the material cost, of making the equivalent part in steel was estimated and compared to the cost of making the LCF/PA66 part to determine the cost per “saved” pound.

  10. Cumulative risk, cumulative outcome: a 20-year longitudinal study.

    Directory of Open Access Journals (Sweden)

    Leslie Atkinson

    Cumulative risk (CR) models provide some of the most robust findings in the developmental literature, predicting numerous and varied outcomes. Typically, however, these outcomes are predicted one at a time, across different samples, using concurrent designs, longitudinal designs of short duration, or retrospective designs. We predicted that a single CR index, applied within a single sample, would prospectively predict diverse outcomes, i.e., depression, intelligence, school dropout, arrest, smoking, and physical disease from childhood to adulthood. Further, we predicted that number of risk factors would predict number of adverse outcomes (cumulative outcome; CO). We also predicted that early CR (assessed at age 5/6) explains variance in CO above and beyond that explained by subsequent risk (assessed at ages 12/13 and 19/20). The sample consisted of 284 individuals, 48% of whom were diagnosed with a speech/language disorder. Cumulative risk, assessed at 5/6-, 12/13-, and 19/20-years-old, predicted the aforementioned outcomes at age 25/26 in every instance. Furthermore, number of risk factors was positively associated with number of negative outcomes. Finally, early risk accounted for variance beyond that explained by later risk in the prediction of CO. We discuss these findings in terms of five criteria posed by these data, positing a "mediated net of adversity" model, suggesting that CR may increase some central integrative factor, simultaneously augmenting risk across cognitive, quality of life, psychiatric and physical health outcomes.

  11. Climate Prediction Center (CPC) US daily temperature analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The U.S. daily temperature analyses are maps depicting various temperature quantities utilizing daily maximum and minimum temperature data across the US. Maps are...

  12. Computational fluid dynamics analyses of lateral heat conduction, coolant azimuthal mixing and heat transfer predictions in a BR2 fuel assembly geometry

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Dionne, B.

    2011-01-01

    To support the analyses related to the conversion of the BR2 core from highly-enriched (HEU) to low-enriched (LEU) fuel, the thermal-hydraulics codes PLTEMP and RELAP-3D are used to evaluate the safety margins during steady-state operation (PLTEMP), as well as after a loss-of-flow, loss-of-pressure, or loss-of-coolant event (RELAP). In the 1-D PLTEMP and RELAP simulations, conduction in the azimuthal and axial directions is not accounted for. The very good thermal conductivity of the cladding and the fuel meat, together with significant temperature gradients in the lateral (axial and azimuthal) directions, could lead to a heat flux distribution that is significantly different from the power distribution. To evaluate the significance of the lateral heat conduction, 3-D computational fluid dynamics (CFD) simulations, using the CFD code STAR-CD, were performed. Safety margin calculations are typically performed for a hot stripe, i.e., an azimuthal region of the fuel plates/coolant channel containing the power peak. In a RELAP model, for example, a channel between two plates could be divided into a number of RELAP channels (stripes) in the azimuthal direction. In a PLTEMP model, the effect of azimuthal power peaking could be taken into account by using engineering factors. However, if the thermal mixing in the azimuthal direction of a coolant channel is significant, a striping approach could be overly conservative by not taking this mixing into account. STAR-CD simulations were also performed to study the thermal mixing in the coolant. Section II of this document presents the results of the analyses of the lateral heat conduction and azimuthal thermal mixing in a coolant channel. Finally, PLTEMP and RELAP simulations rely on the use of correlations to determine heat transfer coefficients. Previous analyses showed that the Dittus-Boelter correlation gives significantly more conservative (lower) predictions than the correlations of Sieder-Tate and Petukhov. STAR-CD 3-D
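
The heat-transfer correlations mentioned are standard. A sketch comparing Dittus-Boelter (Nu = 0.023 Re^0.8 Pr^0.4, fluid heated) with Sieder-Tate (Nu = 0.027 Re^0.8 Pr^(1/3) (μ/μw)^0.14) shows why the former is the more conservative (lower) choice for a heated channel; the Reynolds, Prandtl, viscosity, and geometry values are illustrative, not BR2 data:

```python
def dittus_boelter(re, pr):
    """Nu = 0.023 Re^0.8 Pr^0.4 (fluid being heated)."""
    return 0.023 * re ** 0.8 * pr ** 0.4

def sieder_tate(re, pr, mu_bulk, mu_wall):
    """Nu = 0.027 Re^0.8 Pr^(1/3) (mu_bulk / mu_wall)^0.14."""
    return 0.027 * re ** 0.8 * pr ** (1 / 3) * (mu_bulk / mu_wall) ** 0.14

def heat_transfer_coeff(nu, k, d_h):
    """h = Nu * k / D_h."""
    return nu * k / d_h

# Illustrative water-like conditions in a narrow heated channel.
re_num, pr_num = 5.0e4, 4.0
mu_bulk, mu_wall = 5.5e-4, 3.7e-4   # Pa*s; wall hotter, so mu_wall < mu_bulk
nu_db = dittus_boelter(re_num, pr_num)
nu_st = sieder_tate(re_num, pr_num, mu_bulk, mu_wall)
h_db = heat_transfer_coeff(nu_db, k=0.64, d_h=0.004)  # W/(m^2*K)
```

Because the Sieder-Tate viscosity-ratio term exceeds 1 for a heated wall, it predicts a higher Nusselt number, so Dittus-Boelter yields the lower (more conservative) heat transfer coefficient here.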

  13. Interactions between risk factors in the prediction of onset of eating disorders: Exploratory hypothesis generating analyses.

    Science.gov (United States)

    Stice, Eric; Desjardins, Christopher D

    2018-06-01

    Because no study has tested for interactions between risk factors in the prediction of future onset of each eating disorder, this exploratory study addressed this lacuna to generate hypotheses to be tested in future confirmatory studies. Data from three prevention trials that targeted young women at high risk for eating disorders due to body dissatisfaction (N = 1271; M age 18.5, SD 4.2) and collected diagnostic interview data over 3-year follow-up were combined to permit sufficient power to predict onset of anorexia nervosa (AN), bulimia nervosa (BN), binge eating disorder (BED), and purging disorder (PD) using classification tree analyses, an analytic technique uniquely suited to detecting interactions. Low BMI was the most potent predictor of AN onset, and body dissatisfaction amplified this relation. Overeating was the most potent predictor of BN onset, and positive expectancies for thinness and body dissatisfaction amplified this relation. Body dissatisfaction was the most potent predictor of BED onset, and overeating, low dieting, and thin-ideal internalization amplified this relation. Dieting was the most potent predictor of PD onset, and negative affect and positive expectancies for thinness amplified this relation. Results provided evidence of amplifying interactions between risk factors suggestive of cumulative risk processes that were distinct for each disorder; future confirmatory studies should test the interactive hypotheses generated by these analyses. If hypotheses are confirmed, results may allow interventionists to target ultra high-risk subpopulations with more intensive prevention programs that are uniquely tailored for each eating disorder, potentially improving the yield of prevention efforts. Copyright © 2018 Elsevier Ltd. All rights reserved.
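
A classification-tree analysis detects this kind of amplification by splitting first on the most potent predictor and then re-splitting the high-risk branch on the amplifier. The stratified-rate sketch below mimics that logic on an invented sample (the factor names echo the purging-disorder finding above, but the rates are fabricated for illustration):

```python
def onset_rate(rows, **conditions):
    """Onset proportion among rows matching all given factor values."""
    sub = [r for r in rows if all(r[k] == v for k, v in conditions.items())]
    return sum(r["onset"] for r in sub) / len(sub) if sub else 0.0

# Invented sample: 'dieting' as the potent predictor of onset and
# 'negative_affect' as the amplifier within the dieting branch.
sample = (
      [{"dieting": 1, "negative_affect": 1, "onset": 1}] * 6
    + [{"dieting": 1, "negative_affect": 1, "onset": 0}] * 4
    + [{"dieting": 1, "negative_affect": 0, "onset": 1}] * 2
    + [{"dieting": 1, "negative_affect": 0, "onset": 0}] * 8
    + [{"dieting": 0, "negative_affect": 1, "onset": 1}] * 1
    + [{"dieting": 0, "negative_affect": 1, "onset": 0}] * 9
    + [{"dieting": 0, "negative_affect": 0, "onset": 0}] * 10
)
rate_both = onset_rate(sample, dieting=1, negative_affect=1)   # amplified
rate_diet_only = onset_rate(sample, dieting=1, negative_affect=0)
rate_no_diet = onset_rate(sample, dieting=0)
```

An amplifying interaction shows up as the onset rate in the combined-factor cell exceeding the rate under the potent predictor alone.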

  14. Influence of the Human Skin Tumor Type in Photodynamic Therapy Analysed by a Predictive Model

    Directory of Open Access Journals (Sweden)

    I. Salas-García

    2012-01-01

    Full Text Available Photodynamic Therapy (PDT modeling allows the prediction of the treatment results depending on the lesion properties, the photosensitizer distribution, or the optical source characteristics. We employ a predictive PDT model and apply it to different skin tumors. It takes into account optical radiation distribution, a nonhomogeneous topical photosensitizer spatial temporal distribution, and the time-dependent photochemical interaction. The predicted singlet oxygen molecular concentrations with varying optical irradiance are compared and could be directly related with the necrosis area. The results show a strong dependence on the particular lesion. This suggests the need to design optimal PDT treatment protocols adapted to the specific patient and lesion.

  15. Validation of the Web-Based IBTR! 2.0 Nomogram to Predict for Ipsilateral Breast Tumor Recurrence After Breast-Conserving Therapy.

    Science.gov (United States)

    Kindts, Isabelle; Laenen, Annouschka; Peeters, Stephanie; Janssen, Hilde; Depuydt, Tom; Nevelsteen, Ines; Van Limbergen, Erik; Weltens, Caroline

    2016-08-01

    To evaluate the IBTR! 2.0 nomogram, which predicts 10-year ipsilateral breast tumor recurrence (IBTR) after breast-conserving therapy with and without radiation therapy for breast cancer, by using a large, external, and independent cancer center database. We retrospectively identified 1898 breast cancer cases, treated with breast-conserving therapy and radiation therapy at the University Hospital Leuven from 2000 to 2007, with requisite data for the nomogram variables. Clinicopathologic factors were assessed. Two definitions of IBTR were considered where simultaneous regional or distant recurrence were either censored (conform IBTR! 2.0) or included as event. Validity of the prediction algorithm was tested in terms of discrimination and calibration. Discrimination was assessed by the concordance probability estimate and Harrell's concordance index. The mean predicted and observed 10-year estimates were compared for the entire cohort and for 4 risk groups predefined by nomogram-predicted IBTR risks, and a calibration plot was drawn. Median follow-up was 10.9 years. The 10-year IBTR rates were 1.3% and 2.1%, according to the 2 definitions of IBTR. The validation cohort differed from the development cohort with respect to the administration of hormonal therapy, surgical section margins, lymphovascular invasion, and tumor size. In univariable analysis, younger age (P=.002) and a positive nodal status (P=.048) were significantly associated with IBTR, with a trend for the omission of hormonal therapy (P=.061). The concordance probability estimate and concordance index varied between 0.57 and 0.67 for the 2 definitions of IBTR. In all 4 risk groups the model overestimated the IBTR risk. In particular, between the lowest-risk groups a limited differentiation was suggested by the calibration plot. The IBTR! 2.0 predictive model for IBTR in breast cancer patients shows substandard discriminative ability, with an overestimation of the risk in all subgroups.
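
Harrell's concordance index, used above to assess discrimination, can be computed directly from follow-up times, event indicators, and predicted risks. A minimal sketch on invented survival data (not the Leuven cohort):

```python
def harrell_c(times, events, risks):
    """Harrell's concordance index for right-censored data: among
    comparable pairs (the earlier time is an observed event), the
    fraction where the higher predicted risk fails first; ties count 0.5."""
    concordant = tied = comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    tied += 1
    return (concordant + 0.5 * tied) / comparable

# Invented follow-up data: times (years), event indicators, predicted risks.
c_good = harrell_c([2, 4, 6, 8], [1, 1, 0, 1], [0.9, 0.7, 0.5, 0.3])
c_bad  = harrell_c([2, 4, 6, 8], [1, 1, 0, 1], [0.1, 0.3, 0.5, 0.7])
```

A value of 0.5 is chance-level ranking; the 0.57-0.67 range reported above is why the discriminative ability is called substandard.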

  16. Validation of the Web-Based IBTR! 2.0 Nomogram to Predict for Ipsilateral Breast Tumor Recurrence After Breast-Conserving Therapy

    International Nuclear Information System (INIS)

    Kindts, Isabelle; Laenen, Annouschka; Peeters, Stephanie; Janssen, Hilde; Depuydt, Tom; Nevelsteen, Ines; Van Limbergen, Erik; Weltens, Caroline

    2016-01-01

    Purpose: To evaluate the IBTR! 2.0 nomogram, which predicts 10-year ipsilateral breast tumor recurrence (IBTR) after breast-conserving therapy with and without radiation therapy for breast cancer, by using a large, external, and independent cancer center database. Methods and Materials: We retrospectively identified 1898 breast cancer cases, treated with breast-conserving therapy and radiation therapy at the University Hospital Leuven from 2000 to 2007, with requisite data for the nomogram variables. Clinicopathologic factors were assessed. Two definitions of IBTR were considered where simultaneous regional or distant recurrence were either censored (conform IBTR! 2.0) or included as event. Validity of the prediction algorithm was tested in terms of discrimination and calibration. Discrimination was assessed by the concordance probability estimate and Harrell's concordance index. The mean predicted and observed 10-year estimates were compared for the entire cohort and for 4 risk groups predefined by nomogram-predicted IBTR risks, and a calibration plot was drawn. Results: Median follow-up was 10.9 years. The 10-year IBTR rates were 1.3% and 2.1%, according to the 2 definitions of IBTR. The validation cohort differed from the development cohort with respect to the administration of hormonal therapy, surgical section margins, lymphovascular invasion, and tumor size. In univariable analysis, younger age (P=.002) and a positive nodal status (P=.048) were significantly associated with IBTR, with a trend for the omission of hormonal therapy (P=.061). The concordance probability estimate and concordance index varied between 0.57 and 0.67 for the 2 definitions of IBTR. In all 4 risk groups the model overestimated the IBTR risk. In particular, between the lowest-risk groups a limited differentiation was suggested by the calibration plot. Conclusions: The IBTR! 2.0 predictive model for IBTR in breast cancer patients shows substandard discriminative ability, with an overestimation of the risk in all subgroups.

  17. Validation of the Web-Based IBTR! 2.0 Nomogram to Predict for Ipsilateral Breast Tumor Recurrence After Breast-Conserving Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Kindts, Isabelle, E-mail: Isabelle.kindts@uzleuven.be [Department of Oncology, KU Leuven - University of Leuven, Leuven (Belgium); Department of Radiation Oncology, University Hospitals Leuven, Leuven (Belgium); Laenen, Annouschka [Leuven Biostatistics and Statistical Bioinformatics Center (L-Biostat), KU Leuven - University of Leuven, Leuven (Belgium); Peeters, Stephanie; Janssen, Hilde; Depuydt, Tom [Department of Oncology, KU Leuven - University of Leuven, Leuven (Belgium); Department of Radiation Oncology, University Hospitals Leuven, Leuven (Belgium); Nevelsteen, Ines [Department of Oncology, KU Leuven - University of Leuven, Leuven (Belgium); Department of Surgical Oncology, University Hospitals Leuven, Leuven (Belgium); Van Limbergen, Erik; Weltens, Caroline [Department of Oncology, KU Leuven - University of Leuven, Leuven (Belgium); Department of Radiation Oncology, University Hospitals Leuven, Leuven (Belgium)

    2016-08-01

    Purpose: To evaluate the IBTR! 2.0 nomogram, which predicts 10-year ipsilateral breast tumor recurrence (IBTR) after breast-conserving therapy with and without radiation therapy for breast cancer, by using a large, external, and independent cancer center database. Methods and Materials: We retrospectively identified 1898 breast cancer cases, treated with breast-conserving therapy and radiation therapy at the University Hospital Leuven from 2000 to 2007, with requisite data for the nomogram variables. Clinicopathologic factors were assessed. Two definitions of IBTR were considered where simultaneous regional or distant recurrence were either censored (conform IBTR! 2.0) or included as event. Validity of the prediction algorithm was tested in terms of discrimination and calibration. Discrimination was assessed by the concordance probability estimate and Harrell's concordance index. The mean predicted and observed 10-year estimates were compared for the entire cohort and for 4 risk groups predefined by nomogram-predicted IBTR risks, and a calibration plot was drawn. Results: Median follow-up was 10.9 years. The 10-year IBTR rates were 1.3% and 2.1%, according to the 2 definitions of IBTR. The validation cohort differed from the development cohort with respect to the administration of hormonal therapy, surgical section margins, lymphovascular invasion, and tumor size. In univariable analysis, younger age (P=.002) and a positive nodal status (P=.048) were significantly associated with IBTR, with a trend for the omission of hormonal therapy (P=.061). The concordance probability estimate and concordance index varied between 0.57 and 0.67 for the 2 definitions of IBTR. In all 4 risk groups the model overestimated the IBTR risk. In particular, between the lowest-risk groups a limited differentiation was suggested by the calibration plot. Conclusions: The IBTR! 2.0 predictive model for IBTR in breast cancer patients shows substandard discriminative ability, with an overestimation of the risk in all subgroups.

  18. Tickled to death: analysing public perceptions of 'cute' videos of threatened species (slow lorises - Nycticebus spp.) on Web 2.0 sites.

    Science.gov (United States)

    Nekaris, K Anne-Isola; Campbell, Nicola; Coggins, Tim G; Rode, E Johanna; Nijman, Vincent

    2013-01-01

    The internet is gaining importance in global wildlife trade and changing perceptions of threatened species. There is little data available to examine the impact that popular Web 2.0 sites have on public perceptions of threatened species. YouTube videos portraying wildlife allow us to quantify these perceptions. Focussing on a group of threatened and globally protected primates, slow lorises, we quantify public attitudes towards wildlife conservation by analysing 12,411 comments and associated data posted on a viral YouTube video 'tickling slow loris' over a 33-month period. In the initial months a quarter of commentators indicated wanting a loris as a pet, but as facts about their conservation and ecology became more prevalent this dropped significantly. Endorsements, where people were directed to the site by celebrities, resulted mostly in numerous neutral responses with few links to conservation or awareness. Two conservation-related events, linked to Wikipedia and the airing of a television documentary, led to an increase in awareness, and ultimately to the removal of the analysed video. Slow loris videos that have gone viral have introduced these primates to a large cross-section of society that would not normally come into contact with them. Analyses of webometric data posted on the internet allow us quickly to gauge societal sentiments. We showed a clear temporal change in some views expressed but without an apparent increase in knowledge about the conservation plight of the species, or the illegal nature of slow loris trade. Celebrity endorsement of videos showing protected wildlife increases visits to such sites, but does not educate about conservation issues. The commentators' strongly expressed desire to own one as a pet demonstrates the need for Web 2.0 sites to provide a mechanism via which illegal animal material can be identified and policed.

  19. Tickled to death: analysing public perceptions of 'cute' videos of threatened species (slow lorises - Nycticebus spp.) on Web 2.0 sites.

    Directory of Open Access Journals (Sweden)

    K Anne-Isola Nekaris

    BACKGROUND: The internet is gaining importance in global wildlife trade and changing perceptions of threatened species. There is little data available to examine the impact that popular Web 2.0 sites have on public perceptions of threatened species. YouTube videos portraying wildlife allow us to quantify these perceptions. METHODOLOGY/PRINCIPAL FINDINGS: Focussing on a group of threatened and globally protected primates, slow lorises, we quantify public attitudes towards wildlife conservation by analysing 12,411 comments and associated data posted on a viral YouTube video 'tickling slow loris' over a 33-month period. In the initial months a quarter of commentators indicated wanting a loris as a pet, but as facts about their conservation and ecology became more prevalent this dropped significantly. Endorsements, where people were directed to the site by celebrities, resulted mostly in numerous neutral responses with few links to conservation or awareness. Two conservation-related events, linked to Wikipedia and the airing of a television documentary, led to an increase in awareness, and ultimately to the removal of the analysed video. CONCLUSIONS/SIGNIFICANCE: Slow loris videos that have gone viral have introduced these primates to a large cross-section of society that would not normally come into contact with them. Analyses of webometric data posted on the internet allow us quickly to gauge societal sentiments. We showed a clear temporal change in some views expressed but without an apparent increase in knowledge about the conservation plight of the species, or the illegal nature of slow loris trade. Celebrity endorsement of videos showing protected wildlife increases visits to such sites, but does not educate about conservation issues. The commentators' strongly expressed desire to own one as a pet demonstrates the need for Web 2.0 sites to provide a mechanism via which illegal animal material can be identified and policed.

  20. Potential Predictability of the Sea-Surface Temperature Forced Equatorial East Africa Short Rains Interannual Variability in the 20th Century

    Science.gov (United States)

    Bahaga, T. K.; Gizaw, G.; Kucharski, F.; Diro, G. T.

    2014-12-01

    In this article, the predictability of the 20th century sea-surface temperature (SST) forced East African short rains variability is analyzed using observational data and ensembles of long atmospheric general circulation model (AGCM) simulations. To our knowledge, such an analysis for the whole 20th century using a series of AGCM ensemble simulations is carried out here for the first time. The physical mechanisms that govern the influence of SST on East African short rains in the model are also investigated. It is found that there is substantial skill in reproducing the East African short rains variability, given that the SSTs are known. Consistent with previous recent studies, it is found that the Indian Ocean and in particular the western pole of the Indian Ocean dipole (IOD) play a dominant role for the prediction skill, whereas SSTs outside the Indian Ocean play a minor role. The physical mechanism for the influence of the western Indian Ocean on East African rainfall in the model is consistent with previous findings and consists of a Gill-type response to a warm (cold) anomaly that induces a westerly (easterly) low-level flow anomaly over equatorial Africa and leads to moisture flux convergence (divergence) over East Africa. On the other hand, a positive El Niño-Southern Oscillation (ENSO) anomaly leads to a spatially non-coherent reducing effect over parts of East Africa, but the relationship is not strong enough to provide any predictive skill in our model. The East African short rains prediction skill is also analyzed within a model-derived potential predictability framework and it is shown that the actual prediction skill is broadly consistent with the model potential prediction skill. Low-frequency variations of the prediction skill are mostly related to SSTs outside the Indian Ocean region and are likely due to an increased interference of ENSO with the Indian Ocean influence on East African short rains after the mid-1970s climate shift.

  1. Analyses of hypothetical FCIs in a fast reactor

    International Nuclear Information System (INIS)

    Padilla, A. Jr.; Martin, F.J.; Niccoli, L.G.

    1981-01-01

    Parametric analyses using the SIMMER code were performed to evaluate the potential for a severe recriticality from a pressure-driven recompaction caused by an energetic fuel-coolant interaction (FCI) during the transition phase of a hypothetical accident in a fast reactor. For realistic and reasonable estimates of the assumed accident conditions, a severe recriticality was not predicted. The conditions under which a severe recriticality would be obtained or averted were identified. 10 figures, 2 tables

  2. Cloning and Molecular Analyses of a Gibberellin 20-Oxidase Gene Expressed Specifically in Developing Seeds of Watermelon

    Science.gov (United States)

    Kang, Hong-Gyu; Jun, Sung-Hoon; Kim, Junyul; Kawaide, Hiroshi; Kamiya, Yuji; An, Gynheung

    1999-01-01

    To understand the biosynthesis and functional role of gibberellins (GAs) in developing seeds, we isolated Cv20ox, a cDNA clone from watermelon (Citrullus lanatus) that shows significant amino acid homology with GA 20-oxidases. The complementary DNA clone was expressed in Escherichia coli as a fusion protein, which oxidized GA12 at C-20 to the C19 compound GA9, a precursor of bioactive GAs. RNA-blot analysis showed that the Cv20ox gene was expressed specifically in developing seeds. The gene was strongly expressed in the integument tissues, and it was also expressed weakly in inner seed tissues. In parthenocarpic fruits induced by 1-(2-chloro-4-pyridyl)-3-phenylurea treatment, the expression pattern of Cv20ox did not change, indicating that the GA 20-oxidase gene is expressed primarily in the maternal cells of developing seeds. The promoter of Cv20ox was isolated and fused to the β-glucuronidase (GUS) gene. In a transient expression system, β-glucuronidase staining was detectable only in the integument tissues of developing watermelon seeds. PMID:10517828

  3. Weight Suppression Predicts Bulimic Symptoms at 20-year Follow-up: The Mediating Role of Drive for Thinness

    Science.gov (United States)

    Bodell, Lindsay P.; Brown, Tiffany A.; Keel, Pamela K.

    2016-01-01

Weight suppression predicts the onset and maintenance of bulimic syndromes. Despite this finding, no study has examined psychological mechanisms contributing to these associations using a longitudinal design. Given societal pressures to be thin and an actual history of higher weight, it is possible that greater weight suppression contributes to increased fear of gaining weight and preoccupation with being thin, which increase vulnerability to eating disorders. The present study investigated whether greater drive for thinness mediates associations between weight suppression and bulimic symptoms over long-term follow-up. Participants were women (n = 1190) and men (n = 509) who completed self-report surveys in college and 10 and 20 years later. Higher weight suppression at baseline predicted higher bulimic symptoms at 20-year follow-up, controlling for baseline bulimic symptoms, body mass index, and drive for thinness. Increased drive for thinness at 10-year follow-up mediated this effect. Findings highlight the long-lasting effect of weight suppression on bulimic symptoms and suggest that preoccupation with thinness may help maintain this association. Future studies would benefit from incorporating other hypothesized consequences of weight suppression, including biological factors, into risk models. PMID:27808544
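
The mediation analysis this record describes follows the standard product-of-coefficients logic: the indirect effect of weight suppression (WS) on bulimic symptoms via drive for thinness (DT) is the product of the WS→DT path and the DT→symptoms path with WS held constant. A minimal sketch, with purely hypothetical coefficients (not the study's estimates):

```python
# Product-of-coefficients mediation sketch. All coefficient values are
# invented for illustration; they are not the study's results.

a = 0.30          # path a: WS -> DT (change in DT per unit WS)
b = 0.45          # path b: DT -> bulimic symptoms, controlling for WS
c_prime = 0.10    # direct effect: WS -> symptoms, controlling for DT

indirect = a * b              # indirect (mediated) effect
total = c_prime + indirect    # total effect of WS on symptoms
proportion_mediated = indirect / total
```

In practice the paths a, b, and c' come from two regressions fitted to the longitudinal data, and the indirect effect's significance is usually tested with bootstrapped confidence intervals.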

  4. Adolescent oligomenorrhea (age 14-19) tracks into the third decade of life (age 20-28) and predicts increased cardiovascular risk factors and metabolic syndrome.

    Science.gov (United States)

    Glueck, Charles J; Woo, Jessica G; Khoury, Philip R; Morrison, John A; Daniels, Stephen R; Wang, Ping

    2015-04-01

Assess whether adolescent oligomenorrhea (age 14-19) tracks into young adulthood (age 20-28) and predicts increased cardiometabolic risk factors, metabolic syndrome (MetS), and impaired fasting glucose-type II diabetes mellitus (IFG+T2DM). Prospective study of menstrual cyclicity and its metabolic effects in 865 black and white schoolgirls from age 9 to 19, and 605 of these 865 girls from age 20 to 28. Patterns of menstrual delays (oligomenorrhea) during ages 14-19 and ages 20-28 were closely related, and polycystic ovary syndrome (PCOS, p=.049) predicted ages 20-28 menses delay. Menses delays during ages 14-19 and 20-28, and their interaction product, were correlated with IFG+T2DM and MetS at ages 20-28, as was waist circumference at ages 20-28. Adolescent oligomenorrhea is a risk factor for future development of young adult IFG+T2DM, MetS, oligomenorrhea, and polycystic ovary syndrome. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses.

    Science.gov (United States)

    Brown, Jason L; Bennett, Joseph R; French, Connor M

    2017-01-01

SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS software and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is the careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.

  6. A case study of GWE satellite data impact on GLA assimilation analyses of two ocean cyclones

    Science.gov (United States)

    Gallimore, R. G.; Johnson, D. R.

    1986-01-01

    The effects of the Global Weather Experiment (GWE) data obtained on January 18-20, 1979 on Goddard Laboratory for Atmospheres assimilation analyses of simultaneous cyclones in the western Pacific and Atlantic oceans are examined. The ability of satellite data within assimilation models to determine the baroclinic structures of developing extratropical cyclones is evaluated. The impact of the satellite data on the amplitude and phase of the temperature structure within the storm domain, potential energy, and baroclinic growth rate is studied. The GWE data are compared with Data Systems Test results. It is noted that it is necessary to characterize satellite effects on the baroclinic structure of cyclone waves which degrade numerical weather predictions of cyclogenesis.

  7. Exploring Factors that Predict Preservice Teachers' Intentions to Use Web 2.0 Technologies Using Decomposed Theory of Planned Behavior

    Science.gov (United States)

    Sadaf, Ayesha; Newby, Timothy J.; Ertmer, Peggy A.

    2013-01-01

    This study investigated factors that predict preservice teachers' intentions to use Web 2.0 technologies in their future classrooms. The researchers used a mixed-methods research design and collected qualitative interview data (n = 7) to triangulate quantitative survey data (n = 286). Results indicate that positive attitudes and perceptions of…

  8. Analysing Twitter and web queries for flu trend prediction.

    Science.gov (United States)

    Santos, José Carlos; Matos, Sérgio

    2014-05-07

Social media platforms encourage people to share diverse aspects of their daily life. Among these, shared health related information might be used to infer health status and incidence rates for specific conditions or symptoms. In this work, we present an infodemiology study that evaluates the use of Twitter messages and search engine query logs to estimate and predict the incidence rate of influenza-like illness in Portugal. Based on a manually classified dataset of 2704 tweets from Portugal, we selected a set of 650 textual features to train a Naïve Bayes classifier to identify tweets mentioning flu or flu-like illness or symptoms. We obtained a precision of 0.78 and an F-measure of 0.83, based on cross validation over the complete annotated set. Furthermore, we trained a multiple linear regression model to estimate the health-monitoring data from the Influenzanet project, using as predictors the relative frequencies obtained from the tweet classification results and from query logs, and achieved a correlation ratio of 0.89. Previous studies of user-generated content have mostly focused on the English language. Our results further validate those studies and show that by changing the initial steps of data preprocessing and feature extraction and selection, the proposed approaches can be adapted to other languages. Additionally, we investigated whether the predictive model created can be applied to data from the subsequent flu season. In this case, although the prediction result was good, an initial phase to adapt the regression model could be necessary to achieve more robust results.
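
The second modeling step in this record, regressing surveillance incidence on tweet frequencies, can be sketched with ordinary least squares. The weekly values below are invented for demonstration; this is not the authors' code or data:

```python
# Illustrative sketch: fit a simple linear regression mapping weekly
# flu-tweet relative frequency to an ILI incidence rate, then check
# the correlation between observed and fitted values.

def fit_ols(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def pearson_r(x, y):
    """Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sx = sum((xi - mx) ** 2 for xi in x) ** 0.5
    sy = sum((yi - my) ** 2 for yi in y) ** 0.5
    return cov / (sx * sy)

# weekly relative frequency of flu-related tweets (hypothetical)
tweet_freq = [0.01, 0.03, 0.05, 0.08, 0.06, 0.02]
# weekly ILI incidence per 100k from sentinel surveillance (hypothetical)
ili_rate = [12.0, 25.0, 41.0, 60.0, 47.0, 18.0]

a, b = fit_ols(tweet_freq, ili_rate)
fitted = [a + b * x for x in tweet_freq]
r = pearson_r(ili_rate, fitted)
```

The study's actual model is multivariate (tweet frequencies plus query-log frequencies as predictors), but the fitting principle is the same.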

  9. Monitoring and predicting crop growth and analysing agricultural ecosystems by remote sensing

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Akiyama

    1996-05-01

LANDSAT/TM data, which are characterized by high spectral/spatial resolutions, are able to contribute to practical agricultural management. In the first part of the paper, the authors review some recent applications of satellite remote sensing in agriculture. Techniques for crop discrimination and mapping have made such rapid progress that we can classify crop types with more than 80% accuracy. The estimation of crop biomass using satellite data, including leaf area, dry and fresh weights, and the prediction of grain yield, has been attempted using various spectral vegetation indices. Plant stresses caused by nutrient deficiency and water deficit have also been analysed successfully. Such information may be useful for farm management. In the latter half of the paper, we introduce the Arctic Science Project, which was carried out under the Science and Technology Agency of Japan collaborating with Finnish scientists. In this project, monitoring of the boreal forest was carried out using LANDSAT data. Changes in the phenology of subarctic ground vegetation, based on spectral properties, were measured by a boom-mounted, four-band spectroradiometer. The turning point dates of the seasonal near-infrared (NIR) and red (R) reflectance factors might indicate the end of growth and the beginning of autumnal tints, respectively.

  10. Tooth width predictions in a sample of Black South Africans.

    Science.gov (United States)

    Khan, M I; Seedat, A K; Hlongwa, P

    2007-07-01

Space analysis during the mixed dentition requires prediction of the mesiodistal widths of the unerupted permanent canines and premolars, and prediction tables and equations may be used for this purpose. The Tanaka and Johnston prediction equations, which were derived from a North American White sample, are one widely used example. These equations may be inapplicable to other race groups due to racial tooth size variability. Therefore the purpose of this study was to derive prediction equations that would be applicable to Black South African subjects. One hundred and ten pre-treatment study casts of Black South African subjects were analysed from the Department of Orthodontics' records at the University of Limpopo. The sample was equally divided by gender with all subjects having Class I molar relationship and relatively well aligned teeth. The mesiodistal widths of the maxillary and mandibular canines and premolars were measured with a digital vernier calliper and compared with the measurements predicted with the Tanaka and Johnston equations. The relationship between the measured and predicted values was analysed by correlation and regression analyses. The results indicated that the Tanaka and Johnston prediction equations were not fully applicable to the Black South African sample. The equations tended to underpredict the male sample, while slight overprediction was observed in the female sample. Therefore, new equations were formulated and proposed that would be accurate for Black subjects.
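
The Tanaka and Johnston equations referenced here are commonly quoted as a constant plus half the summed mesiodistal widths of the four mandibular incisors. A minimal sketch of applying them and measuring prediction bias; the measured values are invented for illustration:

```python
# Commonly quoted Tanaka-Johnston form (all widths in mm):
#   mandibular quadrant: 10.5 + 0.5 * (sum of four lower incisors)
#   maxillary quadrant:  11.0 + 0.5 * (sum of four lower incisors)

def tanaka_johnston(incisor_sum_mm, arch="mandibular"):
    """Predict summed canine + premolar widths for one quadrant."""
    constant = 10.5 if arch == "mandibular" else 11.0
    return constant + 0.5 * incisor_sum_mm

incisor_sum = 23.0  # hypothetical sum of the four lower incisors
predicted = tanaka_johnston(incisor_sum, "mandibular")  # 22.0 mm
measured = 22.8     # hypothetical measured canine + premolar sum

# positive bias => the equation underpredicts for this subject,
# the pattern this study reports for its male sample
bias = measured - predicted
```

Deriving population-specific equations, as the study does, amounts to re-estimating the constant and slope by linear regression on the new sample.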

  11. High neuroticism at age 20 predicts history of mental disorders and low self-esteem at age 35.

    Science.gov (United States)

    Lönnqvist, Jan-Erik; Verkasalo, Markku; Mäkinen, Seppo; Henriksson, Markus

    2009-07-01

    The authors assessed whether neuroticism in emerging adulthood predicts mental disorders and self-esteem in early adulthood after controlling for possible confounding variables. A sample of 69 male military conscripts was initially assessed at age 20 and again as civilians at age 35. The initial assessment included a psychiatric interview, objective indicators of conscript competence, an intellectual performance test, and neuroticism questionnaires. The follow-up assessment included a Structured Clinical Interview for DSM-IV (SCID; First, Spitzer, Gibbon, & Williams, 1996) and the Rosenberg Self-Esteem Scale (Rosenberg, 1965). Neuroticism predicted future mental disorders and low self-esteem beyond more objective indicators of adjustment. The results support the use of neuroticism as a predictor of future mental disorders, even over periods of time when personality is subject to change.

  12. Prediction of Seismic Slope Displacements by Dynamic Stick-Slip Analyses

    International Nuclear Information System (INIS)

    Ausilio, Ernesto; Costanzo, Antonio; Silvestri, Francesco; Tropeano, Giuseppe

    2008-01-01

A good working balance between simplicity and reliability in assessing seismic slope stability is represented by displacement-based methods, in which the effects of deformability and ductility can be either decoupled or coupled in the dynamic analyses. In this paper, a 1D lumped-mass 'stick-slip' model is developed, accounting for soil heterogeneity and non-linear behaviour, with a base sliding mechanism at a potential rupture surface. The results of the preliminary calibration show a good agreement with frequency-domain site response analysis in no-slip conditions. The comparison with rigid sliding block analyses and with the decoupled approach proves that the stick-slip procedure can become increasingly unconservative for soft soils and deep sliding depths.

  13. Factors predicting the development of pressure ulcers in an at-risk population who receive standardized preventive care: secondary analyses of a multicentre randomised controlled trial.

    Science.gov (United States)

    Demarre, Liesbet; Verhaeghe, Sofie; Van Hecke, Ann; Clays, Els; Grypdonck, Maria; Beeckman, Dimitri

    2015-02-01

To identify predictive factors associated with the development of pressure ulcers in patients at risk who receive standardized preventive care. Numerous studies have examined factors that predict risk for pressure ulcer development. Only a few studies identified risk factors associated with pressure ulcer development in hospitalized patients receiving standardized preventive care. Secondary analyses of data collected in a multicentre randomized controlled trial. The sample consisted of 610 consecutive patients at risk for pressure ulcer development according to the Braden score. Pressure ulcers in category II-IV were significantly associated with non-blanchable erythema, urogenital disorders and higher body temperature. Predictive factors significantly associated with superficial pressure ulcers were admission to an internal medicine ward, incontinence-associated dermatitis, non-blanchable erythema and a lower Braden score. Superficial sacral pressure ulcers were significantly associated with incontinence-associated dermatitis. Despite the standardized preventive measures they received, hospitalized patients with non-blanchable erythema, urogenital disorders and a higher body temperature were at increased risk for developing pressure ulcers. Improved identification of at-risk patients can be achieved by taking into account specific predictive factors. Even if preventive measures are in place, continuous assessment and tailoring of interventions is necessary in all patients at risk. Daily skin observation can be used to continuously monitor the effectiveness of the intervention. © 2014 John Wiley & Sons Ltd.

  14. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses

    Directory of Open Access Journals (Sweden)

    Jason L. Brown

    2017-12-01

SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS software and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is the careful parameterization of species distribution models (SDMs) to maximize each model’s discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have ‘universal’ analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.

  15. Semen analysis and prediction of natural conception

    NARCIS (Netherlands)

    Leushuis, Esther; van der Steeg, Jan Willem; Steures, Pieternel; Repping, Sjoerd; Bossuyt, Patrick M. M.; Mol, Ben Willem J.; Hompes, Peter G. A.; van der Veen, Fulco

    2014-01-01

Do two semen analyses predict natural conception better than a single semen analysis, and will adding the results of repeated semen analyses to a prediction model for natural pregnancy improve predictions? A second semen analysis does not add helpful information for predicting natural conception.

  16. A model for predicting lung cancer response to therapy

    International Nuclear Information System (INIS)

    Seibert, Rebecca M.; Ramsey, Chester R.; Hines, J. Wesley; Kupelian, Patrick A.; Langen, Katja M.; Meeks, Sanford L.; Scaperoth, Daniel D.

    2007-01-01

Purpose: Volumetric computed tomography (CT) images acquired by image-guided radiation therapy (IGRT) systems can be used to measure tumor response over the course of treatment. Predictive adaptive therapy is a novel treatment technique that uses volumetric IGRT data to actively predict the future tumor response to therapy during the first few weeks of IGRT treatment. The goal of this study was to develop and test a model for predicting lung tumor response during IGRT treatment using serial megavoltage CT (MVCT). Methods and Materials: Tumor responses were measured for 20 lung cancer lesions in 17 patients that were imaged and treated with helical tomotherapy with doses ranging from 2.0 to 2.5 Gy per fraction. Five patients were treated with concurrent chemotherapy, and 1 patient was treated with neoadjuvant chemotherapy. Tumor response to treatment was retrospectively measured by contouring 480 serial MVCT images acquired before treatment. A nonparametric, memory-based locally weighted regression (LWR) model was developed for predicting tumor response using the retrospective tumor response data. This model predicts future tumor volumes and the associated confidence intervals based on limited observations during the first 2 weeks of treatment. The predictive accuracy of the model was tested using a leave-one-out cross-validation technique with the measured tumor responses. Results: The predictive algorithm was used to compare predicted versus measured tumor volume response for all 20 lesions. The average error for the predictions of the final tumor volume was 12%, with the true volumes always bounded by the 95% confidence interval. The greatest model uncertainty occurred near the middle of the course of treatment, in which the tumor response relationships were more complex, the model had less information, and the predictors were more varied.
The optimal days for measuring the tumor response on the MVCT images were on elapsed Days 1, 2, 5, 9, 11, 12, 17, and 18 during treatment.
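
The memory-based LWR approach described above fits a weighted local regression around each query point. A minimal sketch, with a Gaussian kernel and an invented, shrinking tumor-volume series (not the study's data or code):

```python
# Minimal locally weighted regression (LWR): each prediction is a
# kernel-weighted least-squares linear fit centred on the query point.
import math

def lwr_predict(xs, ys, x_query, bandwidth=3.0):
    """Predict y at x_query from a weighted linear fit of (xs, ys)."""
    w = [math.exp(-((x - x_query) ** 2) / (2 * bandwidth ** 2)) for x in xs]
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, xs)) / sw
    my = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    num = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, xs, ys))
    den = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, xs))
    b = num / den
    a = my - b * mx
    return a + b * x_query

# elapsed treatment day -> relative tumor volume (hypothetical series)
days = [1, 2, 5, 9, 11, 12]
volumes = [1.00, 0.97, 0.90, 0.80, 0.76, 0.74]

# extrapolate the early response to a later treatment day
v_day18 = lwr_predict(days, volumes, 18.0)
```

The study's model additionally draws on a library of previously observed response curves and reports confidence intervals; this sketch shows only the local-fit core.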

  17. Changes in environmental tobacco smoke (ETS) exposure over a 20-year period: cross-sectional and longitudinal analyses

    Science.gov (United States)

    Jefferis, Barbara J; Thomson, Andrew G; Lennon, Lucy T; Feyerabend, Colin; Doig, Mira; McMeekin, Laura; Wannamethee, S Goya; Cook, Derek G; Whincup, Peter H

    2009-01-01

Aims To examine long-term changes in environmental tobacco smoke (ETS) exposure in British men between 1978 and 2000, using serum cotinine. Design Prospective cohort: British Regional Heart Study. Setting General practices in 24 towns in England, Wales and Scotland. Participants Non-smoking men: 2125 studied at baseline [questionnaire (Q1): 1978–80, aged 40–59 years], 3046 studied 20 years later (Q20: 1998–2000, aged 60–79 years) and 1208 studied at both times. Non-smokers were men reporting no current smoking with cotinine < 15 ng/ml at Q1 and/or Q20. Measurements Serum cotinine to assess ETS exposure. Findings In cross-sectional analysis, geometric mean cotinine level declined from 1.36 ng/ml [95% confidence interval (CI): 1.31, 1.42] at Q1 to 0.19 ng/ml (95% CI: 0.18, 0.19) at Q20. The prevalence of cotinine levels ≤ 0.7 ng/ml [associated with low coronary heart disease (CHD) risk] rose from 27.1% at Q1 to 83.3% at Q20. Manual social class and northern region of residence were associated with higher mean cotinine levels both at Q1 and Q20; older age was associated with lower cotinine level at Q20 only. Among 1208 persistent non-smokers, cotinine fell by 1.47 ng/ml (95% CI: 1.37, 1.57), an 86% decline. Absolute falls in cotinine were greater in manual occupational groups, in the Midlands and Scotland compared to southern England, although percentage decline was very similar across groups. Conclusions A marked decline in ETS exposure occurred in Britain between 1978 and 2000, which is likely to have reduced ETS-related disease risks appreciably before the introduction of legislation banning smoking in public places. PMID:19207361
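
The summary statistic used throughout this record, the geometric mean of serum cotinine, is the exponential of the mean log concentration. A small sketch with invented values (not the study's data):

```python
# Geometric mean: exp(mean(log(x))) -- the standard summary for
# right-skewed biomarker concentrations such as serum cotinine.
import math

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

baseline = [2.1, 0.9, 1.8, 1.2, 1.5]        # ng/ml at Q1 (hypothetical)
follow_up = [0.25, 0.12, 0.30, 0.15, 0.20]  # ng/ml at Q20 (hypothetical)

gm1 = geometric_mean(baseline)
gm20 = geometric_mean(follow_up)
percent_decline = 100 * (1 - gm20 / gm1)
```

Confidence intervals for geometric means are usually obtained on the log scale (mean ± 1.96 SE of log values) and then back-transformed.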

  18. Analyses of MMP20 Missense Mutations in Two Families with Hypomaturation Amelogenesis Imperfecta.

    Science.gov (United States)

    Kim, Youn Jung; Kang, Jenny; Seymen, Figen; Koruyucu, Mine; Gencay, Koray; Shin, Teo Jeon; Hyun, Hong-Keun; Lee, Zang Hee; Hu, Jan C-C; Simmer, James P; Kim, Jung-Wook

    2017-01-01

Amelogenesis imperfecta is a group of rare inherited disorders that affect tooth enamel formation, quantitatively and/or qualitatively. The aim of this study was to identify the genetic etiologies of two families presenting with hypomaturation amelogenesis imperfecta. DNA was isolated from peripheral blood samples obtained from participating family members. Whole exome sequencing was performed using DNA samples from the two probands. Sequencing data was aligned to the NCBI human reference genome (NCBI build 37.2, hg19) and sequence variations were annotated with the dbSNP build 138. Mutations in MMP20 were identified in both probands. A homozygous missense mutation (c.678T>A; p.His226Gln) was identified in the consanguineous Family 1. Compound heterozygous MMP20 mutations (c.540T>A, p.Tyr180* and c.389C>T, p.Thr130Ile) were identified in the non-consanguineous Family 2. Affected persons in Family 1 showed hypomaturation AI with dark brown discoloration, which is similar to the clinical phenotype in a previous report with the same mutation. However, the dentition of the Family 2 proband exhibited slight yellowish discoloration with reduced transparency. Functional analysis showed that the p.Thr130Ile mutant protein had reduced activity of MMP20, while there was no functional MMP20 in the Family 1 proband. These results expand the mutational spectrum of MMP20 and broaden our understanding of genotype-phenotype correlations in amelogenesis imperfecta.

  19. Analyses of MMP20 Missense Mutations in Two Families with Hypomaturation Amelogenesis Imperfecta

    Directory of Open Access Journals (Sweden)

    Jung-Wook Kim

    2017-04-01

Amelogenesis imperfecta is a group of rare inherited disorders that affect tooth enamel formation, quantitatively and/or qualitatively. The aim of this study was to identify the genetic etiologies of two families presenting with hypomaturation amelogenesis imperfecta. DNA was isolated from peripheral blood samples obtained from participating family members. Whole exome sequencing was performed using DNA samples from the two probands. Sequencing data was aligned to the NCBI human reference genome (NCBI build 37.2, hg19) and sequence variations were annotated with the dbSNP build 138. Mutations in MMP20 were identified in both probands. A homozygous missense mutation (c.678T>A; p.His226Gln) was identified in the consanguineous Family 1. Compound heterozygous MMP20 mutations (c.540T>A, p.Tyr180* and c.389C>T, p.Thr130Ile) were identified in the non-consanguineous Family 2. Affected persons in Family 1 showed hypomaturation AI with dark brown discoloration, which is similar to the clinical phenotype in a previous report with the same mutation. However, the dentition of the Family 2 proband exhibited slight yellowish discoloration with reduced transparency. Functional analysis showed that the p.Thr130Ile mutant protein had reduced activity of MMP20, while there was no functional MMP20 in the Family 1 proband. These results expand the mutational spectrum of MMP20 and broaden our understanding of genotype-phenotype correlations in amelogenesis imperfecta.

  20. Sea surface temperature predictions using a multi-ocean analysis ensemble scheme

    Science.gov (United States)

    Zhang, Ying; Zhu, Jieshun; Li, Zhongxian; Chen, Haishan; Zeng, Gang

    2017-08-01

This study examined the global sea surface temperature (SST) predictions by a so-called multiple-ocean analysis ensemble (MAE) initialization method which was applied in the National Centers for Environmental Prediction (NCEP) Climate Forecast System Version 2 (CFSv2). Different from most operational climate prediction practices which are initialized by a specific ocean analysis system, the MAE method is based on multiple ocean analyses. In the paper, the MAE method was first justified by analyzing the ocean temperature variability in four ocean analyses which all are/were applied for operational climate predictions either at the European Centre for Medium-range Weather Forecasts or at NCEP. It was found that these systems exhibit substantial uncertainties in estimating the ocean states, especially at the deep layers. Further, a set of MAE hindcasts was conducted based on the four ocean analyses with CFSv2, starting from each April during 1982-2007. The MAE hindcasts were verified against a subset of hindcasts from the NCEP CFS Reanalysis and Reforecast (CFSRR) Project. Comparisons suggested that MAE shows better SST predictions than CFSRR over most regions where ocean dynamics plays a vital role in SST evolution, such as the El Niño and Atlantic Niño regions. Furthermore, significant improvements were also found in summer precipitation predictions over the equatorial eastern Pacific and Atlantic oceans, which is likely attributable to the local SST prediction improvements. The prediction improvements by MAE imply a problem for most current climate predictions which are based on a specific ocean analysis system. That is, their predictions would drift towards states biased by errors inherent in their ocean initialization system, and thus have large prediction errors. In contrast, MAE arguably has an advantage by sampling such structural uncertainties, and could efficiently cancel these errors out in their predictions.
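
The error-cancellation argument above can be shown with a toy sketch: run the same forecast model from several ocean analyses and average the results, so that initialization errors of opposite sign partially cancel. All numbers are invented for illustration:

```python
# Toy multi-analysis ensemble: four forecasts of the same SST value,
# each carrying a bias inherited from its own ocean initialization.

truth = 27.0  # reference SST in degrees C (hypothetical)

# forecasts initialized from four different ocean analyses (hypothetical)
member_forecasts = [27.6, 26.5, 27.3, 26.7]

ensemble_mean = sum(member_forecasts) / len(member_forecasts)

mean_member_error = sum(abs(f - truth) for f in member_forecasts) \
    / len(member_forecasts)
ensemble_error = abs(ensemble_mean - truth)
# the ensemble mean lands much closer to truth than a typical member
```

This cancellation only helps when member errors are not all of the same sign, which is the point of sampling structurally different ocean analyses rather than perturbing a single one.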

  1. Tickled to Death: Analysing Public Perceptions of ‘Cute’ Videos of Threatened Species (Slow Lorises – Nycticebus spp.) on Web 2.0 Sites

    Science.gov (United States)

    Nekaris, By K. Anne-Isola; Campbell, Nicola; Coggins, Tim G.; Rode, E. Johanna; Nijman, Vincent

    2013-01-01

Background The internet is gaining importance in global wildlife trade and changing perceptions of threatened species. There is little data available to examine the impact that popular Web 2.0 sites play on public perceptions of threatened species. YouTube videos portraying wildlife allow us to quantify these perceptions. Methodology/Principal Findings Focussing on a group of threatened and globally protected primates, slow lorises, we quantify public attitudes towards wildlife conservation by analysing 12,411 comments and associated data posted on a viral YouTube video ‘tickling slow loris’ over a 33-month period. In the initial months a quarter of commentators indicated wanting a loris as a pet, but as facts about their conservation and ecology became more prevalent this dropped significantly. Endorsements, where people were directed to the site by celebrities, resulted mostly in numerous neutral responses with few links to conservation or awareness. Two conservation-related events, linked to Wikipedia and the airing of a television documentary, led to an increase in awareness, and ultimately to the removal of the analysed video. Conclusions/Significance Slow loris videos that have gone viral have introduced these primates to a large cross-section of society that would not normally come into contact with them. Analyses of webometric data posted on the internet allow us quickly to gauge societal sentiments. We showed a clear temporal change in some views expressed but without an apparent increase in knowledge about the conservation plight of the species, or the illegal nature of slow loris trade. Celebrity endorsement of videos showing protected wildlife increases visits to such sites, but does not educate about conservation issues. The strong desire of commentators to express their want for one as a pet demonstrates the need for Web 2.0 sites to provide a mechanism via which illegal animal material can be identified and policed. PMID:23894432

  2. Taxometric analyses and predictive accuracy of callous-unemotional traits regarding quality of life and behavior problems in non-conduct disorder diagnoses.

    Science.gov (United States)

    Herpers, Pierre C M; Klip, Helen; Rommelse, Nanda N J; Taylor, Mark J; Greven, Corina U; Buitelaar, Jan K

    2017-07-01

Callous-unemotional (CU) traits have mainly been studied in relation to conduct disorder (CD), but can also occur in other disorder groups. However, it is unclear whether there is a clinically relevant cut-off value of levels of CU traits in predicting reduced quality of life (QoL) and clinical symptoms, and whether CU traits better fit a categorical (taxonic) or dimensional model. Parents of 979 youths referred to a child and adolescent psychiatric clinic rated their child's CU traits on the Inventory of Callous-Unemotional traits (ICU), QoL on the Kidscreen-27, and clinical symptoms on the Child Behavior Checklist. Experienced clinicians conferred DSM-IV-TR diagnoses of ADHD, ASD, anxiety/mood disorders and DBD-NOS/ODD. The ICU was also used to score the DSM-5 specifier 'with limited prosocial emotions' (LPE) of Conduct Disorder. Receiver operating characteristic (ROC) analyses revealed that the predictive accuracy of the ICU and LPE regarding QoL and clinical symptoms was poor to fair, and similar across diagnoses. A clinical cut-off point could not be defined. Taxometric analyses suggested that callous-unemotional traits on the ICU best reflect a dimension rather than a taxon. More research is needed on the impact of CU traits on the functional adaptation, course, and response to treatment of non-CD conditions. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
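
The ROC analysis this record relies on reduces to a simple probabilistic quantity: the area under the curve (AUC) equals the probability that a randomly chosen case scores higher than a randomly chosen non-case, with ties counting one half. A minimal sketch with invented scores and outcomes (not the study's data):

```python
# Rank-based AUC (Mann-Whitney formulation): count pairwise "wins"
# of cases over non-cases.

def roc_auc(scores, labels):
    """AUC for binary labels (1 = case, 0 = non-case)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

icu_scores = [10, 32, 35, 41, 18, 29, 44, 12]  # hypothetical ICU totals
poor_qol   = [0,  0,  1,  1,  0,  1,  1,  0]   # hypothetical outcome

auc = roc_auc(icu_scores, poor_qol)
```

An AUC of 0.5 is chance-level discrimination; the study's "poor to fair" accuracy corresponds to AUCs well below the near-perfect separation that would be needed to justify a clinical cut-off.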

  3. What does a worm want with 20,000 genes?

    OpenAIRE

    Hodgkin, Jonathan

    2001-01-01

    The number of genes predicted for the Caenorhabditis elegans genome is remarkably high: approximately 20,000, if both protein-coding and RNA-coding genes are counted. This article discusses possible explanations for such a high value.

  4. Long-term fertilization alters chemically-separated soil organic carbon pools: Based on stable C isotope analyses

    Science.gov (United States)

    Dou, Xiaolin; He, Ping; Cheng, Xiaoli; Zhou, Wei

    2016-01-01

    Quantification of dynamics of soil organic carbon (SOC) pools under the influence of long-term fertilization is essential for predicting carbon (C) sequestration. We combined soil chemical fractionation with stable C isotope analyses to investigate the C dynamics of the various SOC pools after 25 years of fertilization. Five types of soil samples (0-20, 20-40 cm) including the initial level (CK) and four fertilization treatments (inorganic nitrogen fertilizer, IN; balanced inorganic fertilizer, NPK; inorganic fertilizer plus farmyard manure, MNPK; inorganic fertilizer plus corn straw residue, SNPK) were separated into recalcitrant and labile fractions, and the fractions were analysed for C content, C:N ratios, δ13C values, and soil C and N recalcitrance indexes (RIC and RIN). Chemical fractionation showed long-term MNPK fertilization strongly increased the SOC storage in both soil layers (0-20 cm = 1492.4 g C m^-2 and 20-40 cm = 1770.6 g C m^-2) because of enhanced recalcitrant C (RC) and labile C (LC). The 25 years of inorganic fertilizer treatment did not increase the SOC storage mainly because of the offsetting effects of enhanced RC and decreased LC, whereas the lack of a clear SOC increase under the SNPK fertilization resulted from the fast decay rates of soil C.

  5. Designing Geometry 2.0 learning environments: a preliminary study with primary school students

    Science.gov (United States)

    Joglar Prieto, Nuria; María Sordo Juanena, José; Star, Jon R.

    2014-04-01

    The information and communication technologies of Web 2.0 are arriving in our schools, allowing the design and implementation of new learning environments with great educational potential. This article proposes a pedagogical model based on a new geometry technology-integrated learning environment, called Geometry 2.0, which was tested with 39 sixth grade students from a public school in Madrid (Spain). The main goals of the study presented here were to describe an optimal role for the mathematics teacher within Geometry 2.0, and to analyse how dynamic mathematics and communication might affect young students' learning of basic figural concepts in a real setting. The analyses offered in this article illustrate how our Geometry 2.0 model facilitates deeply mathematical tasks which encourage students' exploration, cooperation and communication, improving their learning while fostering geometrical meanings.

  6. PC-PVT 2.0: An updated platform for psychomotor vigilance task testing, analysis, prediction, and visualization.

    Science.gov (United States)

    Reifman, Jaques; Kumar, Kamal; Khitrov, Maxim Y; Liu, Jianbo; Ramakrishnan, Sridhar

    2018-07-01

    The psychomotor vigilance task (PVT) has been widely used to assess the effects of sleep deprivation on human neurobehavioral performance. To facilitate research in this field, we previously developed the PC-PVT, a freely available software system analogous to the "gold-standard" PVT-192 that, in addition to allowing for simple visual reaction time (RT) tests, also allows for near real-time PVT analysis, prediction, and visualization in a personal computer (PC). Here we present the PC-PVT 2.0 for Windows 10 operating system, which has the capability to couple PVT tests of a study protocol with the study's sleep/wake and caffeine schedules, and make real-time individualized predictions of PVT performance for such schedules. We characterized the accuracy and precision of the software in measuring RT, using 44 distinct combinations of PC hardware system configurations. We found that 15 system configurations measured RTs with an average delay of less than 10 ms, an error comparable to that of the PVT-192. To achieve such small delays, the system configuration should always use a gaming mouse as the means to respond to visual stimuli. We recommend using a discrete graphical processing unit for desktop PCs and an external monitor for laptop PCs. This update integrates a study's sleep/wake and caffeine schedules with the testing software, facilitating testing and outcome visualization, and provides near-real-time individualized PVT predictions for any sleep-loss condition considering caffeine effects. The software, with its enhanced PVT analysis, visualization, and prediction capabilities, can be freely downloaded from https://pcpvt.bhsai.org. Published by Elsevier B.V.

  7. Thermal analyses. Information on the expected baking process; Thermische analyses. Informatie over een te verwachten bakgedrag

    Energy Technology Data Exchange (ETDEWEB)

    Van Wijck, H. [Stichting Technisch Centrum voor de Keramische Industrie TCKI, Velp (Netherlands)

    2009-09-01

    The design process and the drying process for architectural ceramics and pottery partly determine the characteristics of the final product, but the largest changes occur during the baking process. An overview is provided of the different thermal analyses and how the information from these analyses can predict the process in practice. (mk)

  8. Deflection type energy analyser for energetic electron beams in a beam-plasma system

    International Nuclear Information System (INIS)

    Michel, J.A.; Hogge, J.P.

    1988-11-01

    An energy analyser for the study of electron beam distribution functions in unmagnetized plasmas is described. This analyser is designed to avoid the large electric fields which are created in multi-grid analysers and to measure the beam distribution function directly, without differentiation. As an example of an application, we present results on the propagation of an energetic beam (E_b = 2.0 keV) in a plasma (n_0 = 1×10^10 cm^-3, T_e = 1.4 eV). (author) 7 figs., 10 refs.

  9. RELAP5 thermal-hydraulic analyses of overcooling sequences in a pressurized water reactor

    International Nuclear Information System (INIS)

    Bolander, M.A.; Fletcher, C.D.; Davis, C.B.; Kullberg, C.M.; Stitt, B.D.; Waterman, M.E.; Burtt, J.D.

    1984-01-01

    In support of the Pressurized Thermal Shock Integration Study, sponsored by the United States Nuclear Regulatory Commission, the Idaho National Engineering Laboratory has performed analyses of overcooling transients using the RELAP5/MOD1.6 and MOD2.0 computer codes. These analyses were performed for the H.B. Robinson Unit 2 pressurized water reactor, which is a Westinghouse 3-loop design plant. Results of the RELAP5 analyses are presented. The capabilities of the RELAP5 computer code as a tool for analyzing integral plant transients requiring a detailed plant model, including complex trip logic and major control systems, are examined

  10. A systematic review of the quality and impact of anxiety disorder meta-analyses.

    Science.gov (United States)

    Ipser, Jonathan C; Stein, Dan J

    2009-08-01

    Meta-analyses are seen as representing the pinnacle of a hierarchy of evidence used to inform clinical practice. Therefore, the potential importance of differences in the rigor with which they are conducted and reported warrants consideration. In this review, we use standardized instruments to describe the scientific and reporting quality of meta-analyses of randomized controlled trials of the treatment of anxiety disorders. We also use traditional and novel metrics of article impact to assess the influence of meta-analyses across a range of research fields in the anxiety disorders. Overall, although the meta-analyses that we examined had some flaws, their quality of reporting was generally acceptable. Neither the scientific nor reporting quality of the meta-analyses was predicted by any of the impact metrics. The finding that treatment meta-analyses were cited less frequently than quantitative reviews of studies in current "hot spots" of research (ie, genetics, imaging) points to the multifactorial nature of citation patterns. A list of the meta-analyses included in this review is available on an evidence-based website of anxiety and trauma-related disorders.

  11. NetMHCpan, a method for MHC class I binding prediction beyond humans

    DEFF Research Database (Denmark)

    Hoof, Ilka; Peters, B; Sidney, J

    2009-01-01

    molecules. We show that the NetMHCpan-2.0 method can accurately predict binding to uncharacterized HLA molecules, including HLA-C and HLA-G. Moreover, NetMHCpan-2.0 is demonstrated to accurately predict peptide binding to chimpanzee and macaque MHC class I molecules. The power of NetMHCpan-2.0 to guide...

  12. Comparative and Predictive Multimedia Assessments Using Monte Carlo Uncertainty Analyses

    Science.gov (United States)

    Whelan, G.

    2002-05-01

    Multiple-pathway frameworks (sometimes referred to as multimedia models) provide a platform for combining medium-specific environmental models and databases so that they can be utilized in a more holistic assessment of contaminant fate and transport in the environment. These frameworks provide a relatively seamless transfer of information from one model to the next and from databases to models. Within these frameworks, multiple models are linked, resulting in models that consume information from upstream models and produce information to be consumed by downstream models. The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) is an example, which allows users to link their models to other models and databases. FRAMES is an icon-driven, site-layout platform and an open-architecture, object-oriented system that interacts with environmental databases; helps the user construct a Conceptual Site Model that is real-world based; allows the user to choose the most appropriate models to solve simulation requirements; solves the standard risk paradigm of release, transport and fate, and exposure/risk assessment for people and ecology; and presents graphical packages for analyzing results. FRAMES is specifically designed to allow users to link their own models into a system which contains models developed by others. This paper will present the use of FRAMES to evaluate potential human health exposures, using real site data and realistic assumptions, from sources through the vadose and saturated zones to exposure and risk assessment at three real-world sites, using the Multimedia Environmental Pollutant Assessment System (MEPAS), a multimedia model contained within FRAMES. These real-world examples use predictive and comparative approaches coupled with a Monte Carlo analysis. A predictive analysis is one in which models are calibrated to monitored site data prior to the assessment, and a comparative analysis is one in which models are not calibrated but
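The Monte Carlo uncertainty analysis described above can be sketched with a toy fate-and-transport step. The model, parameter names, and distributions below are illustrative assumptions only, not the actual MEPAS/FRAMES equations or site data: uncertain inputs are sampled repeatedly and propagated through the model to yield a distribution of exposure concentrations.

```python
# Hedged sketch of Monte Carlo uncertainty propagation through a toy
# contaminant-dilution model (not the real MEPAS/FRAMES formulation).
import random
import statistics

random.seed(42)  # reproducible sampling

def exposure_concentration(source_mg, dilution, decay_frac):
    """Toy fate-and-transport step: source mass diluted and partially decayed."""
    return source_mg * (1.0 - decay_frac) / dilution

# Sample uncertain inputs and propagate each draw through the model.
samples = []
for _ in range(10_000):
    source = random.gauss(100.0, 10.0)      # source term (mg), with uncertainty
    dilution = random.uniform(50.0, 150.0)  # dilution factor along the pathway
    decay = random.betavariate(2.0, 5.0)    # fraction lost in transit
    samples.append(exposure_concentration(source, dilution, decay))

samples.sort()
print(f"median concentration  = {statistics.median(samples):.3f}")
print(f"95th percentile value = {samples[int(0.95 * len(samples))]:.3f}")
```

Reporting an upper percentile alongside the median is the usual way such analyses express the uncertainty band around a predicted exposure.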

  13. Can current moisture responses predict soil CO2 efflux under altered precipitation regimes? A synthesis of manipulation experiments

    DEFF Research Database (Denmark)

    Vicca, S.; Bahn, M.; Estiarte, M.

    2014-01-01

    to fluctuations in soil temperature and soil water content can be used to predict SCE under altered rainfall patterns. Of the 58 experiments for which we gathered SCE data, 20 were discarded because either too few data were available or inconsistencies precluded their incorporation in the analyses. The 38...... remaining experiments were used to test the hypothesis that a model parameterized with data from the control plots (using soil temperature and water content as predictor variables) could adequately predict SCE measured in the manipulated treatment. Only for 7 of these 38 experiments was this hypothesis...... strongly on establishing response functions across a broader range of precipitation regimes and soil moisture conditions. Such experiments should make accurate measurements of water availability, should conduct high-frequency SCE measurements, and should consider both instantaneous responses...
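The control-plot model being tested above, with soil temperature and water content as predictor variables, can be sketched in a common generic form: a Q10 temperature response multiplied by a moisture-limitation curve. All parameter values below are illustrative assumptions, not fitted values from any of the synthesized experiments.

```python
# Hedged sketch of a soil CO2 efflux (SCE) model driven by soil temperature
# (Q10 response) and soil water content (Gaussian optimum curve).
# Parameter values are made up for illustration.
import math

def predict_sce(temp_c, swc, r_ref=2.0, q10=2.5, t_ref=10.0,
                swc_opt=0.3, width=0.15):
    """SCE (umol CO2 m-2 s-1) = reference rate * temperature response
    * moisture limitation centred on an optimal water content."""
    f_temp = q10 ** ((temp_c - t_ref) / 10.0)
    f_moist = math.exp(-((swc - swc_opt) / width) ** 2)
    return r_ref * f_temp * f_moist

# At the reference temperature and optimal moisture the model returns r_ref.
print(predict_sce(10.0, 0.3))
```

The synthesis's test amounts to fitting such a response surface on control-plot data and checking whether it still predicts SCE once the precipitation regime, and hence the soil-moisture range, is altered.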

  14. A positive relationship between spring temperature and productivity in 20 songbird species in the boreal zone.

    Science.gov (United States)

    Meller, Kalle; Piha, Markus; Vähätalo, Anssi V; Lehikoinen, Aleksi

    2018-03-01

    Anthropogenic climate warming has already affected the population dynamics of numerous species and is predicted to do so also in the future. To predict the effects of climate change, it is important to know whether productivity is linked to temperature, and whether species' traits affect responses to climate change. To address these objectives, we analysed monitoring data from the Finnish constant effort site ringing scheme collected in 1987-2013 for 20 common songbird species together with climatic data. Warm spring temperature had a positive linear relationship with productivity across the community of 20 species independent of species' traits (realized thermal niche or migration behaviour), suggesting that even the warmest spring temperatures remained below the thermal optimum for reproduction, possibly due to our boreal study area being closer to the cold edge of all study species' distributions. The result also suggests a lack of mismatch between the timing of breeding and peak availability of invertebrate food of the study species. Productivity was positively related to annual growth rates in long-distance migrants, but not in short-distance migrants. Across the 27-year study period, temporal trends in productivity were mostly absent. The population sizes of species with colder thermal niches had decreasing trends, which were not related to temperature responses or temporal trends in productivity. The positive connection between spring temperature and productivity suggests that climate warming has potential to increase the productivity in bird species in the boreal zone, at least in the short term.
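The positive linear temperature-productivity relationship reported above can be illustrated with an ordinary least-squares fit. The spring temperatures and productivity index below are invented toy values, not data from the Finnish constant effort site ringing scheme.

```python
# Hedged sketch: OLS slope of productivity on spring temperature,
# using made-up illustrative data.

def ols(x, y):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

temps = [4.0, 5.5, 6.0, 7.2, 8.1, 9.0]          # spring temperature, deg C
productivity = [1.1, 1.3, 1.2, 1.6, 1.7, 1.9]   # productivity index
slope, intercept = ols(temps, productivity)
print(f"slope = {slope:.3f} productivity units per deg C")
```

A positive slope across the species community, as found in the study, is what suggests spring temperatures remained below the thermal optimum for reproduction.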

  15. 1D/2D analyses of the lower head vessel in contact with high temperature melt

    International Nuclear Information System (INIS)

    Chang, Jong Eun; Cho, Jae Seon; Suh, Kune Y.; Chung, Chang H.

    1998-01-01

    One- and two-dimensional analyses were performed for the ceramic/metal melt and the vessel to interpret the temperature history of the outer surface of the vessel wall measured from typical Al2O3/Fe thermite melt tests LAVA (Lower-plenum Arrested Vessel Attack) spanning heatup and cooldown periods. The LAVA tests were conducted at the Korea Atomic Energy Research Institute (KAERI) during the process of high temperature molten material relocation from the delivery duct down into the water in the test vessel pressurized to 2.0 MPa. Both analyses demonstrated reasonable predictions of the temperature history of the LHV (Lower Head Vessel). The comparison sheds light on the thermal hydraulic and material behavior of the high temperature melt within the hemispherical vessel

  16. Numerical prediction of augmented turbulent heat transfer in an annular fuel channel with repeated two-dimensional square ribs

    International Nuclear Information System (INIS)

    Takase, K.

    1996-01-01

    The square-ribbed fuel rod for high temperature gas-cooled reactors was designed and developed so as to enhance the turbulent heat transfer in comparison with the previous standard fuel rod. The turbulent heat transfer characteristics in an annular fuel channel with repeated two-dimensional square ribs were analysed numerically on a fully developed incompressible flow using the k-ε turbulence model and the two-dimensional axisymmetrical coordinate system. Numerical analyses were carried out under the conditions of Reynolds numbers from 3000 to 20000 and ratios of square-rib pitch to height of 10, 20 and 40 respectively. The predictions of the heat transfer coefficients agreed well within an error of 10% for the square-rib pitch to height ratio of 10, 20% for 20 and 25% for 40 respectively, with the heat transfer empirical correlations obtained from the experimental data due to the simulated square-ribbed fuel rods. Therefore it was found that the effect of heat transfer augmentation due to the square ribs could be predicted by the present numerical simulations and the mechanism could be explained by the change in the turbulence kinematic energy distribution along the flow direction. (orig.)
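As a point of reference for the kind of turbulent heat-transfer prediction being checked above, a smooth-channel baseline Nusselt number can be sketched with the classical Dittus-Boelter correlation. This is a generic textbook correlation, not the square-rib correlations or the k-ε predictions from the paper; rib roughening augments heat transfer above such a smooth-wall baseline.

```python
# Hedged sketch: smooth-channel baseline Nu from the classical
# Dittus-Boelter correlation, Nu = 0.023 * Re^0.8 * Pr^0.4 (heating).
# Not the paper's square-ribbed correlations.

def nusselt_dittus_boelter(re, pr):
    """Valid roughly for Re > 1e4, 0.6 < Pr < 160, fully developed flow."""
    return 0.023 * re ** 0.8 * pr ** 0.4

for re in (10_000, 20_000):  # upper part of the Reynolds range analysed above
    nu = nusselt_dittus_boelter(re, 0.7)  # Pr ~ 0.7 for gas coolants
    print(f"Re = {re:6d}  ->  Nu = {nu:.1f}")
```

Comparing such a baseline against rib-roughened data is one conventional way to express the augmentation that the square ribs provide.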

  17. Predicting child maltreatment: A meta-analysis of the predictive validity of risk assessment instruments.

    Science.gov (United States)

    van der Put, Claudia E; Assink, Mark; Boekhout van Solinge, Noëlle F

    2017-11-01

    Risk assessment is crucial in preventing child maltreatment since it can identify high-risk cases in need of child protection intervention. Despite widespread use of risk assessment instruments in child welfare, it is unknown how well these instruments predict maltreatment and what instrument characteristics are associated with higher levels of predictive validity. Therefore, a multilevel meta-analysis was conducted to examine the predictive accuracy of (characteristics of) risk assessment instruments. A literature search yielded 30 independent studies (N=87,329) examining the predictive validity of 27 different risk assessment instruments. From these studies, 67 effect sizes could be extracted. Overall, a medium significant effect was found (AUC=0.681), indicating a moderate predictive accuracy. Moderator analyses revealed that onset of maltreatment can be better predicted than recurrence of maltreatment, which is a promising finding for early detection and prevention of child maltreatment. In addition, actuarial instruments were found to outperform clinical instruments. To bring risk and needs assessment in child welfare to a higher level, actuarial instruments should be further developed and strengthened by distinguishing risk assessment from needs assessment and by integrating risk assessment with case management. Copyright © 2017 Elsevier Ltd. All rights reserved.
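The pooling step behind the overall AUC reported above can be illustrated with a simple inverse-variance weighted mean. The study-level AUCs and standard errors below are invented for illustration, and this fixed-effect sketch is a simplification of the multilevel (three-level) meta-analytic model the study actually used.

```python
# Hedged sketch: fixed-effect (inverse-variance) pooling of study-level
# effect sizes. AUC values and standard errors are hypothetical.

def pooled_effect(effects, ses):
    """Inverse-variance weighted mean and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return mean, se

aucs = [0.65, 0.71, 0.62, 0.70, 0.68]   # hypothetical per-study AUCs
ses = [0.04, 0.05, 0.03, 0.06, 0.04]    # hypothetical standard errors
mean, se = pooled_effect(aucs, ses)
print(f"pooled AUC = {mean:.3f} +/- {1.96 * se:.3f} (95% CI half-width)")
```

Precise studies (small standard errors) dominate the weighted mean, which is why the pooled estimate here sits closest to the most precise of the hypothetical inputs.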

  18. Distinct coping strategies differentially predict urge levels and lapses in a smoking cessation attempt.

    Science.gov (United States)

    Brodbeck, Jeannette; Bachmann, Monica S; Znoj, Hansjörg

    2013-06-01

    This study analysed mechanisms through which stress-coping and temptation-coping strategies were associated with lapses. Furthermore, we explored whether distinct coping strategies differentially predicted reduced lapse risk, lower urge levels, or a weaker association between urge levels and lapses during the first week of an unassisted smoking cessation attempt. Participants were recruited via the internet and mass media in Switzerland. Ecological momentary assessment (EMA) with mobile devices was used to assess urge levels and lapses. Online questionnaires were used to measure smoking behaviours and coping variables at baseline, as well as smoking behaviour at the three-month follow-up. The sample consisted of 243 individuals, aged 20 to 40, who reported 4199 observations. Findings of multilevel regression analyses show that coping was mainly associated with a reduced lapse risk and not with lower urge levels or a weaker association between urge levels and lapses. 'Calming down' and 'commitment to change' predicted a lower lapse risk and also a weaker relation between urge levels and lapses. 'Stimulus control' predicted a lower lapse risk and lower urge levels. Conversely, 'task-orientation' and 'risk assessment' were related to higher lapse risk and 'risk assessment' also to higher urge levels. Disengagement coping i.e. 'eating or shopping', 'distraction', and 'mobilising social support' did not affect lapse risk. Promising coping strategies during the initial stage of smoking cessation attempt are targeted directly at reducing the lapse risk and are characterised by engagement with the stressor or one's reactions towards the stressor and a focus on positive consequences instead of health risks. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. ERANOS 2.0, Modular code and data system for fast reactor neutronics analyses

    International Nuclear Information System (INIS)

    2008-01-01

    1 - Description of program or function: The European Reactor Analysis Optimized calculation System, ERANOS, has been developed and validated with the aim of providing a suitable basis for reliable neutronic calculations of current as well as advanced fast reactor cores. It consists of data libraries, deterministic codes and calculation procedures which have been developed within the European Collaboration on Fast Reactors over the past 20 years or so, in order to answer the needs of both industrial and R and D organisations. The whole system counts roughly 250 functions and 3000 subroutines totalling 450000 lines of FORTRAN-77 and ESOPE instructions. ERANOS is written using the ALOS software which requires only standard FORTRAN compilers and includes advanced programming features. A modular structure was adopted for easier evolution and incorporation of new functionalities. Blocks of data (SETs) can be created or used by the modules themselves or by the user via the LU control language. Programming, and dynamic memory allocation, are performed by means of the ESOPE language. External temporary storage and permanent storage capabilities are provided by the GEMAT and ARCHIVE functions, respectively. ESOPE, LU, GEMAT and ARCHIVE are all part of the ALOS software. This modular structure allows different modules to be linked together in procedures corresponding to recommended calculation routes ranging from fast-running and moderately-accurate 'routine' procedures to slow-running but highly-accurate 'reference' procedures. The main contents of the ERANOS-2.0 package are: nuclear data libraries (multigroup cross-sections from the JEF-2.2 evaluated nuclear data file, and other specific data files), a cell and lattice code (ECCO), reactor flux solvers (diffusion, Sn transport, nodal variational transport), a burn-up module, various processing modules (material and neutron balance, breeding gains,...), tools related to perturbation theory and sensitivity analysis, core

  20. The structure of late-life depressive symptoms across a 20-year span: a taxometric investigation.

    Science.gov (United States)

    Holland, Jason M; Schutte, Kathleen K; Brennan, Penny L; Moos, Rudolf H

    2010-03-01

    Past studies of the underlying structure of depressive symptoms have yielded mixed results, with some studies supporting a continuous conceptualization and others supporting a categorical one. However, no study has examined this research question with an exclusively older adult sample, despite the potential uniqueness of late-life depressive symptoms. In the present study, the underlying structure of late-life depressive symptoms was examined among a sample of 1,289 individuals across 3 waves of data collection spanning 20 years. The authors employed a taxometric methodology using indicators of depression derived from the Research Diagnostic Criteria (R. L. Spitzer, J. Endicott, & E. Robins, 1978). Maximum eigenvalue analyses and inchworm consistency tests generally supported a categorical conceptualization and identified a group that was primarily characterized by thoughts about death and suicide. However, compared to a categorical depression variable, depressive symptoms treated continuously were generally better predictors of relevant criterion variables. These findings suggest that thoughts of death and suicide may characterize a specific type of late-life depression, yet a continuous conceptualization still typically maximizes the predictive utility of late-life depressive symptoms.

  1. Inelastic hadron-nucleus interactions at 20 and 37 GeV/c

    International Nuclear Information System (INIS)

    Faessler, M.A.; Lynen, U.; Niewisch, J.; Pietrzyk, B.; Povh, B.; Schroeder, H.; Gugelot, P.C.

    1979-01-01

    The experiment studies charged particle production for π⁻, K⁻ and anti-p interactions on nuclei at 20 and 37 GeV/c at the CERN SPS. A non-magnetic detector, consisting of CsI(Tl) scintillation and lucite Cerenkov counters, distinguishes between fast particles, mainly pions, and slow particles, mainly nucleons, with a cut at velocity β ≈ 0.7. Angular distributions, multiplicity distributions and correlations of slow and fast particles were analysed. It is shown that the measurement of the correlations can provide a critical test for different theoretical models of the hadron-nucleus interaction. At the energies studied so far a systematic deviation from KNO scaling is observed. This gives further support to the 'standard picture' of the hadron-nucleus interaction and it contrasts with predictions of the coherent-tube model. The regularity observed for the angular distribution of fast secondaries as a function of the number of slow particles can only be explained by combining features predicted by different models. (Auth.)

  2. Sequence analyses and 3D structure prediction of two Type III ...

    African Journals Online (AJOL)

    Internet

    2012-04-17

    Apr 17, 2012 ... analyses were performed using the sequence data of growth hormone gene (gh) ... used as a phylogenetic marker for different taxonomic ..... structural changes have been observed in some parts of ..... of spatial restraints.

  3. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC) as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
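The probabilistic contrail forecast described above is a logistic regression, which can be sketched on toy data. The two predictors below (an upper-tropospheric humidity fraction and a temperature anomaly) and all values are invented assumptions; the real SURFACE and OUTBREAK models were fit to RUC/ARPS analysis fields and GOES water vapor measurements.

```python
# Hedged sketch: a two-predictor logistic regression trained by stochastic
# gradient descent on made-up data, mimicking a probabilistic contrail forecast.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy observations: [humidity fraction, temperature anomaly], label = persisted.
X = [[0.9, -1.0], [0.8, -0.5], [0.3, 0.5], [0.2, 1.0], [0.7, -0.8], [0.4, 0.7]]
y = [1, 1, 0, 0, 1, 0]

w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(2000):                 # SGD epochs over the tiny training set
    for xi, yi in zip(X, y):
        p = sigmoid(w[0] * xi[0] + w[1] * xi[1] + b)
        err = p - yi                  # gradient of the log-loss w.r.t. z
        w[0] -= lr * err * xi[0]
        w[1] -= lr * err * xi[1]
        b -= lr * err

preds = [sigmoid(w[0] * xi[0] + w[1] * xi[1] + b) >= 0.5 for xi in X]
accuracy = sum(p == bool(t) for p, t in zip(preds, y)) / len(y)
print(f"training accuracy = {accuracy:.2f}")
```

On real data the models are of course evaluated on held-out observations, which is how the >75% accuracies cited above were obtained; training accuracy on a separable toy set is only a sanity check of the fitting procedure.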

  4. The groningen longitudinal glaucoma study III. The predictive value of frequency-doubling perimetry and GDx nerve fibre analyser test results for the development of glaucomatous visual field loss

    NARCIS (Netherlands)

    Heeg, G. P.; Jansonius, N. M.

    Purpose To investigate whether frequency-doubling perimetry (FDT) and nerve fibre analyser (GDx) test results are able to predict glaucomatous visual field loss in glaucoma suspect patients. Methods A large cohort of glaucoma suspect patients (patients with ocular hypertension or a positive family

  5. Predicting Psychotherapy Dropouts: A Multilevel Approach.

    Science.gov (United States)

    Kegel, Alexander F; Flückiger, Christoph

    2015-01-01

    The role of therapeutic processes in predicting premature termination of psychotherapy has been a particular focus of recent research. The purpose of this study was to contrast outpatients who completed therapy and those who dropped out with respect to their self-reported in-session experiences of self-esteem, mastery, clarification and the therapeutic alliance. The 296 patients with mixed disorders were treated with an integrative form of cognitive-behavioural therapy without pre-determined time limit (M = 20.2 sessions). Multilevel analyses indicated that patients who did not complete treatment reported, on average, lower levels of self-esteem, mastery and clarification and lower ratings of their therapeutic alliance in treatment in contrast to patients who completed therapy. Patient-reported change in self-esteem experiences over the course of treatment turned out to be the strongest predictor of dropout from psychotherapy or successful completion. When dropout occurred before the average treatment length was reached, patients reported fewer clarifying experiences as early as the first session and their ratings of the therapeutic alliance were characterized by an absence of positive development. Both of these aspects seem to be involved in patients' decisions to leave treatment early. The findings underscore the importance of the therapeutic process in understanding the mechanisms behind treatment dropout. Analyses data from 296 patients at a private outpatient clinic in a routine practice setting (CBT). Completer/dropout definition: presence or absence of measurement battery at post-assessment. Focuses on change in therapy processes by investigating post-session reports. Finds that positive change in self-esteem experiences is the most robust predictor of dropout, followed by ratings of clarification experiences and the global alliance. In line with recent dropout research, these process indicators might help to detect therapeutic situations that are

  6. Phylemon 2.0: a suite of web-tools for molecular evolution, phylogenetics, phylogenomics and hypotheses testing.

    Science.gov (United States)

    Sánchez, Rubén; Serra, François; Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Pulido, Luis; de María, Alejandro; Capella-Gutíerrez, Salvador; Huerta-Cepas, Jaime; Gabaldón, Toni; Dopazo, Joaquín; Dopazo, Hernán

    2011-07-01

    Phylemon 2.0 is a new release of the suite of web tools for molecular evolution, phylogenetics, phylogenomics and hypotheses testing. It has been designed as a response to the increasing demand of molecular sequence analyses for experts and non-expert users. Phylemon 2.0 has several unique features that differentiates it from other similar web resources: (i) it offers an integrated environment that enables evolutionary analyses, format conversion, file storage and edition of results; (ii) it suggests further analyses, thereby guiding the users through the web server; and (iii) it allows users to design and save phylogenetic pipelines to be used over multiple genes (phylogenomics). Altogether, Phylemon 2.0 integrates a suite of 30 tools covering sequence alignment reconstruction and trimming; tree reconstruction, visualization and manipulation; and evolutionary hypotheses testing.

  8. Severe accident recriticality analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. E-mail: wiktor.frid@ski.se; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H

    2001-11-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s⁻¹ injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g⁻¹, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s⁻¹. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel.
The highest calculated

  9. Severe accident recriticality analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s⁻¹ injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g⁻¹, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s⁻¹. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel.
The highest calculated quasi steady

  10. Validation of the sperm class analyser CASA system for sperm counting in a busy diagnostic semen analysis laboratory.

    Science.gov (United States)

    Dearing, Chey G; Kilburn, Sally; Lindsay, Kevin S

    2014-03-01

    Sperm counts have been linked to several fertility outcomes, making them an essential parameter of semen analysis. It has become increasingly recognised that Computer-Assisted Semen Analysis (CASA) provides improved precision over manual methods, but systems are seldom validated robustly for use. The objective of this study was to gather the evidence to validate or reject the Sperm Class Analyser (SCA) as a tool for routine sperm counting in a busy laboratory setting. The criteria examined were comparison with the Improved Neubauer and Leja 20-μm chambers, within- and between-field precision, sperm concentration linearity from a stock diluted in semen and media, accuracy against internal and external quality material, assessment of uneven flow effects, and a receiver operating characteristic (ROC) analysis to predict fertility in comparison with the Neubauer method. This work demonstrates that SCA CASA technology is not a standalone 'black box', but rather a tool for well-trained staff that allows rapid, high-number sperm counting, provided that errors are identified and corrected. The system will produce accurate, linear, precise results that correlate well with the Improved Neubauer chamber, with less analytical variance than manual methods. The system provides superior predictive potential for diagnosing fertility problems.

  11. Antiferromagnetism in a 20% Ho-80% Tb alloy single crystal

    DEFF Research Database (Denmark)

    Lebech, Bente

    1968-01-01

    20% Ho-80% Tb exhibits two magnetic phases, similar to those of Tb. The spiral turn angle varies from 31.1° to 21.4°. A minimum effective spin for the occurrence of stable simple ferromagnetic structure at low temperatures is predicted....

  12. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    International Nuclear Information System (INIS)

    Moore, Kevin L.; Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  13. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California 92093 (United States); Kagadis, George C. [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 26504 (Greece); McNutt, Todd R. [Department of Radiation Oncology and Molecular Radiation Science, School of Medicine, Johns Hopkins University, Baltimore, Maryland 21231 (United States); Mutic, Sasa [Department of Radiation Oncology, Washington University in St. Louis, St. Louis, Missouri 63110 (United States)

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  14. Vision 20/20: Automation and advanced computing in clinical radiation oncology.

    Science.gov (United States)

    Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  15. Combining Results from Distinct MicroRNA Target Prediction Tools Enhances the Performance of Analyses

    Directory of Open Access Journals (Sweden)

    Arthur C. Oliveira

    2017-05-01

    Full Text Available Target prediction is generally the first step toward recognition of bona fide microRNA (miRNA)-target interactions in living cells. Several target prediction tools are now available, which use distinct criteria and stringency to provide the best set of candidate targets for a single miRNA or a subset of miRNAs. However, there are many false-negative predictions, and consensus about the optimum strategy to select and use the output information provided by the target prediction tools is lacking. We compared the performance of four tools cited in the literature: TargetScan (TS), miRanda-mirSVR (MR), PITA, and RNA22 (R22). We also determined the most effective approach for analyzing target prediction data (individual, union, or intersection). For this purpose, we calculated the sensitivity, specificity, precision, and correlation of these approaches using 10 miRNAs (miR-1-3p, miR-17-5p, miR-21-5p, miR-24-3p, miR-29a-3p, miR-34a-5p, miR-124-3p, miR-125b-5p, miR-145-5p, and miR-155-5p) and 1,400 genes (700 validated and 700 non-validated) as targets of these miRNAs. The four tools provided a subset of high-quality predictions and returned few false-positive predictions; however, they could not identify several known true targets. We demonstrate that the union of TS/MR and of TS/MR/R22 enhanced the quality of in silico prediction analysis of miRNA targets. We conclude that the union rather than the intersection of the aforementioned tools is the best strategy for maximizing performance while minimizing the loss of time and resources in subsequent in vivo and in vitro experiments for functional validation of miRNA-target interactions.
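    The individual/union/intersection comparison reduces to set arithmetic over predicted and validated target lists. A minimal sketch, using two hypothetical tool outputs and toy gene identifiers (none of these are the study's data):

```python
# Toy comparison of union vs. intersection of two target-prediction outputs.
# Gene IDs and tool outputs are invented for illustration.

def metrics(predicted, validated, universe):
    """Sensitivity, precision and specificity of a predicted target set."""
    tp = len(predicted & validated)
    fp = len(predicted - validated)
    fn = len(validated - predicted)
    tn = len(universe - predicted - validated)
    sensitivity = tp / (tp + fn)
    precision = tp / (tp + fp) if predicted else 0.0
    specificity = tn / (tn + fp)
    return sensitivity, precision, specificity

universe = {f"g{i}" for i in range(20)}        # all candidate genes
validated = {"g1", "g2", "g3", "g4", "g5"}     # experimentally confirmed targets
tool_a = {"g1", "g2", "g3", "g10"}             # output of a TS-like tool
tool_b = {"g2", "g4", "g11"}                   # output of an MR-like tool

union = tool_a | tool_b    # recovers more true targets (higher sensitivity)
inter = tool_a & tool_b    # keeps only agreed targets (higher precision)

print(metrics(union, validated, universe))   # (0.8, 0.666..., 0.866...)
print(metrics(inter, validated, universe))   # (0.2, 1.0, 1.0)
```

    On this toy data the union recovers most validated targets while the intersection is perfectly precise but misses most of them, mirroring the trade-off the authors report.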

  16. Pensions at a glance 2015 OECD and G20 indicators

    CERN Document Server

    2016-01-01

    The 10-year anniversary edition of Pensions at a Glance highlights the pension reforms undertaken by OECD and G20 countries over the last two years. Two special chapters provide deeper analysis of first-tier pension schemes and of the impact of short or interrupted careers, due to late entry into employment, childcare or unemployment, on pension entitlements. Another chapter analyses the sensitivity of long-term pension replacement rates to various parameters. A range of indicators for comparing pension policies and their outcomes between OECD and G20 countries is also provided.

  17. Analysing Culture and Interculture in Saudi EFL Textbooks: A Corpus Linguistic Approach

    Science.gov (United States)

    Almujaiwel, Sultan

    2018-01-01

    This paper combines corpus processing tools to investigate the cultural elements of Saudi education of English as a foreign language (EFL). The latest Saudi EFL textbooks (2016 onwards) are available in researchable PDF formats. This helps process them through corpus search software tools. The method adopted is based on analysing 20 cultural…

  18. The Sources of Life Chances: Does Education, Class Category, Occupation, or Short-Term Earnings Predict 20-Year Long-Term Earnings?

    Directory of Open Access Journals (Sweden)

    ChangHwan Kim

    2018-03-01

    Full Text Available In sociological studies of economic stratification and intergenerational mobility, occupation has long been presumed to reflect lifetime earnings better than do short-term earnings. However, few studies have actually tested this critical assumption. In this study, we investigate the cross-sectional determinants of 20-year accumulated earnings using data that match respondents in the Survey of Income and Program Participation to their longitudinal earnings records based on administrative tax information from 1990 to 2009. Fit statistics of regression models are estimated to assess the predictive power of various proxy variables, including occupation, education, and short-term earnings, for cumulative earnings over the 20-year time period. Contrary to the popular assumption in sociology, our results find that cross-sectional earnings have greater predictive power for long-term earnings than occupation-based class classifications, including three-digit detailed occupations, for both men and women. The model based on educational attainment, including field of study, has slightly better fit than models based on one-digit occupation or the Erikson, Goldthorpe, and Portocarero class scheme. We discuss the theoretical implications of these findings for the sociology of stratification and intergenerational mobility.

  19. Prediction of infarction development after endovascular stroke therapy with dual-energy computed tomography

    International Nuclear Information System (INIS)

    Djurdjevic, Tanja; Gizewski, Elke Ruth; Grams, Astrid Ellen; Rehwald, Rafael; Glodny, Bernhard; Knoflach, Michael; Matosevic, Benjamin; Kiechl, Stefan

    2017-01-01

    After intraarterial recanalisation (IAR), the haemorrhage and the blood-brain barrier (BBB) disruption can be distinguished using dual-energy computed tomography (DECT). The aim of the present study was to investigate whether future infarction development can be predicted from DECT. DECT scans of 20 patients showing 45 BBB-disrupted areas after IAR were assessed and compared with follow-up examinations. Receiver operating characteristic (ROC) analyses using densities from the iodine map (IM) and virtual non-contrast (VNC) series were performed. Future infarction areas are denser than future non-infarction areas on IM series (23.44 ± 24.86 vs. 5.77 ± 2.77; p < 0.0001) and more hypodense on VNC series (29.71 ± 3.33 vs. 35.33 ± 3.50; p < 0.0001). ROC analyses for the IM series showed an area under the curve (AUC) of 0.99 (cut-off: <9.97 HU; p < 0.05; sensitivity 91.18 %; specificity 100.00 %; accuracy 0.93) for the prediction of future infarctions. The AUC for the prediction of haemorrhagic infarctions was 0.78 (cut-off >17.13 HU; p < 0.05; sensitivity 90.00 %; specificity 62.86 %; accuracy 0.69). The VNC series allowed prediction of infarction volume. Future infarction development after IAR can be reliably predicted with the IM series. The prediction of haemorrhages and of infarction size is less reliable. (orig.)
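    The ROC reasoning in this record can be reproduced with a rank-based AUC (the Mann-Whitney form) plus sensitivity/specificity at a fixed density cutoff. A sketch with invented iodine-map densities, not the study's measurements:

```python
# Rank-based AUC and fixed-cutoff sensitivity/specificity, the two
# quantities a ROC analysis of density values boils down to.
# All HU values below are invented for illustration.

def auc(pos, neg):
    """P(random positive scores above random negative); ties count half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(pos, neg, cutoff):
    """Treat 'density above cutoff' as a positive (future infarction) call."""
    sens = sum(p > cutoff for p in pos) / len(pos)
    spec = sum(n <= cutoff for n in neg) / len(neg)
    return sens, spec

infarct = [12.0, 18.5, 25.1, 40.2, 11.3]    # hypothetical IM densities (HU)
no_infarct = [4.1, 5.5, 6.2, 7.8, 9.0]

print(auc(infarct, no_infarct))              # 1.0: groups fully separated
print(sens_spec(infarct, no_infarct, 9.97))  # (1.0, 1.0) at this cutoff
```

    With real data the two groups overlap, the AUC drops below 1, and the cutoff is chosen as the point on the ROC curve trading sensitivity against specificity.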

  20. Prediction of infarction development after endovascular stroke therapy with dual-energy computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Djurdjevic, Tanja; Gizewski, Elke Ruth; Grams, Astrid Ellen [Medical University of Innsbruck, Department of Neuroradiology, Innsbruck (Austria); Rehwald, Rafael; Glodny, Bernhard [Medical University of Innsbruck, Department of Radiology, Innsbruck (Austria); Knoflach, Michael; Matosevic, Benjamin; Kiechl, Stefan [Medical University of Innsbruck, Department of Neurology, Innsbruck (Austria)

    2017-03-15

    After intraarterial recanalisation (IAR), the haemorrhage and the blood-brain barrier (BBB) disruption can be distinguished using dual-energy computed tomography (DECT). The aim of the present study was to investigate whether future infarction development can be predicted from DECT. DECT scans of 20 patients showing 45 BBB disrupted areas after IAR were assessed and compared with follow-up examinations. Receiver operator characteristic (ROC) analyses using densities from the iodine map (IM) and virtual non-contrast (VNC) were performed. Future infarction areas are denser than future non-infarction areas on IM series (23.44 ± 24.86 vs. 5.77 ± 2.77; p < 0.0001) and more hypodense on VNC series (29.71 ± 3.33 vs. 35.33 ± 3.50; p < 0.0001). ROC analyses for the IM series showed an area under the curve (AUC) of 0.99 (cut-off: <9.97 HU; p < 0.05; sensitivity 91.18 %; specificity 100.00 %; accuracy 0.93) for the prediction of future infarctions. The AUC for the prediction of haemorrhagic infarctions was 0.78 (cut-off >17.13 HU; p < 0.05; sensitivity 90.00 %; specificity 62.86 %; accuracy 0.69). The VNC series allowed prediction of infarction volume. Future infarction development after IAR can be reliably predicted with the IM series. The prediction of haemorrhages and of infarction size is less reliable. (orig.)

  1. Simulation, prediction, and genetic analyses of daily methane emissions in dairy cattle.

    Science.gov (United States)

    Yin, T; Pinent, T; Brügemann, K; Simianer, H; König, S

    2015-08-01

    probability, there are more important effects contributing to variations of MEm that are not explained or are independent from these functions. Furthermore, autocorrelations exist between indicator traits and predicted MEm. Nevertheless, this integrative approach, combining information from dairy cattle nutrition with dairy cattle genetics, generated novel traits which are difficult to record on a large scale. The simulated data basis for MEm was used to determine the size of a cow calibration group for genomic selection. A calibration group including 2,581 cows with MEm phenotypes was competitive with conventional breeding strategies.

  2. Analyses and predictions of the thermodynamic properties and phase diagrams of silicate systems

    Energy Technology Data Exchange (ETDEWEB)

    Blander, M. (Argonne National Lab., IL (United States)); Pelton, A.; Eriksson, G. (Ecole Polytechnique, Montreal, PQ (Canada). Dept. of Metallurgy and Materials Engineering)

    1992-01-01

    Molten silicates are ordered solutions which cannot be well represented by the usual polynomial representation of deviations from ideal solution behavior (i.e. excess free energies of mixing). An adaptation of quasichemical theory which is capable of describing the properties of ordered solutions represents the measured properties of binary silicates over broad ranges of composition and temperature. For simple silicates such as the MgO-FeO-SiO{sub 2} ternary system, in which silica is the only acid component, a combining rule generally leads to good predictions of ternary solutions from those of the binaries. In basic solutions, these predictions are consistent with those of the conformal ionic solution theory. Our results indicate that our approach could provide a powerful tool for representing and predicting the properties of multicomponent molten silicates.

  3. Analyses and predictions of the thermodynamic properties and phase diagrams of silicate systems

    Energy Technology Data Exchange (ETDEWEB)

    Blander, M. [Argonne National Lab., IL (United States); Pelton, A.; Eriksson, G. [Ecole Polytechnique, Montreal, PQ (Canada). Dept. of Metallurgy and Materials Engineering

    1992-07-01

    Molten silicates are ordered solutions which cannot be well represented by the usual polynomial representation of deviations from ideal solution behavior (i.e. excess free energies of mixing). An adaptation of quasichemical theory which is capable of describing the properties of ordered solutions represents the measured properties of binary silicates over broad ranges of composition and temperature. For simple silicates such as the MgO-FeO-SiO{sub 2} ternary system, in which silica is the only acid component, a combining rule generally leads to good predictions of ternary solutions from those of the binaries. In basic solutions, these predictions are consistent with those of the conformal ionic solution theory. Our results indicate that our approach could provide a powerful tool for representing and predicting the properties of multicomponent molten silicates.

  4. Loss of epithelial FAM20A in mice causes amelogenesis imperfecta, tooth eruption delay and gingival overgrowth.

    Science.gov (United States)

    Li, Li-Li; Liu, Pei-Hong; Xie, Xiao-Hua; Ma, Su; Liu, Chao; Chen, Li; Qin, Chun-Lin

    2016-06-30

    FAM20A has been studied to a very limited extent. Mutations in human FAM20A cause amelogenesis imperfecta, gingival fibromatosis and kidney problems. It would be desirable to systematically analyse the expression of FAM20A in dental tissues and to assess the pathological changes when this molecule is specifically nullified in individual tissues. Recently, we generated mice with a Fam20A-floxed allele containing the beta-galactosidase reporter gene. We analysed FAM20A expression in dental tissues using X-Gal staining, immunohistochemistry and in situ hybridization, which showed that the ameloblasts in the mouse mandibular first molar began to express FAM20A at 1 day after birth, and that the reduced enamel epithelium in erupting molars expressed a significant level of FAM20A. By breeding K14-Cre mice with Fam20A(flox/flox) mice, we created K14-Cre;Fam20A(flox/flox) (conditional knockout, cKO) mice, in which Fam20A was inactivated in the epithelium. We analysed the dental tissues of cKO mice using X-ray radiography, histology and immunohistochemistry. The molar enamel matrix in cKO mice was much thinner than normal and was often separated from the dentinoenamel junction. The Fam20A-deficient ameloblasts were non-polarized and disorganized and were detached from the enamel matrix. The enamel abnormality in cKO mice was consistent with the diagnosis of amelogenesis imperfecta. The levels of enamelin and matrix metalloproteinase 20 were lower in the ameloblasts and enamel of cKO mice than in normal mice. The cKO mice had remarkable delays in the eruption of molars and hyperplasia of the gingival epithelium. These findings emphasize the essential roles of FAM20A in the development of dental and oral tissues.

  5. Response to a combination of oxygen and a hypnotic as treatment for obstructive sleep apnoea is predicted by a patient's therapeutic CPAP requirement.

    Science.gov (United States)

    Landry, Shane A; Joosten, Simon A; Sands, Scott A; White, David P; Malhotra, Atul; Wellman, Andrew; Hamilton, Garun S; Edwards, Bradley A

    2017-08-01

    Upper airway collapsibility predicts the response to several non-continuous positive airway pressure (CPAP) interventions for obstructive sleep apnoea (OSA). Measures of upper airway collapsibility cannot be easily performed in a clinical context; however, a patient's therapeutic CPAP requirement may serve as a surrogate measure of collapsibility. The present work aimed to compare the predictive use of CPAP level with detailed physiological measures of collapsibility. Therapeutic CPAP levels and gold-standard pharyngeal collapsibility measures (passive pharyngeal critical closing pressure (Pcrit) and ventilation at a CPAP level of 0 cmH2O (Vpassive)) were retrospectively analysed from a randomized controlled trial (n = 20) comparing the combination of oxygen and eszopiclone (treatment) versus placebo/air control. Responders (9/20) to treatment were defined as those who exhibited a 50% reduction in the apnoea/hypopnoea index (AHI). Responders had a lower therapeutic CPAP requirement than non-responders (6.6 (5.4-8.1) cmH2O vs 8.9 (8.4-10.4) cmH2O, P = 0.007), consistent with their reduced collapsibility (lower Pcrit, P = 0.017; higher Vpassive, P = 0.025). Therapeutic CPAP level provided the highest predictive accuracy for differentiating responders from non-responders (area under the curve (AUC) = 0.86 ± 0.9, 95% CI: 0.68-1.00, P = 0.007). However, both Pcrit (AUC = 0.83 ± 0.11, 95% CI: 0.62-1.00, P = 0.017) and Vpassive (AUC = 0.77 ± 0.12, 95% CI: 0.53-1.00, P = 0.44) performed well, and the difference in AUC between these three metrics was not statistically significant. A therapeutic CPAP level ≤8 cmH2O provided 78% sensitivity and 82% specificity (positive predictive value = 78%, negative predictive value = 82%) for predicting a response to these therapies. Therapeutic CPAP requirement, as a surrogate measure of pharyngeal collapsibility, predicts the response to non-anatomical therapy (oxygen and
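    The reported 78%/82% figures follow from a 2x2 confusion table. The counts below are a hypothetical reconstruction, chosen only because they reproduce the abstract's numbers under an assumed split of 9 responders and 11 non-responders; they are not taken from the paper:

```python
# The four metrics of a binary test from its 2x2 confusion table.
# tp/fp/fn/tn are a hypothetical reconstruction matching the abstract's
# 78%/82% figures (9 responders, 11 non-responders).

def predictive_values(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # responders correctly flagged
        "specificity": tn / (tn + fp),   # non-responders correctly excluded
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Positive test: therapeutic CPAP <= 8 cmH2O.
m = predictive_values(tp=7, fp=2, fn=2, tn=9)
print({k: round(v, 2) for k, v in m.items()})
# {'sensitivity': 0.78, 'specificity': 0.82, 'ppv': 0.78, 'npv': 0.82}
```

    Unlike sensitivity and specificity, PPV and NPV shift with the responder prevalence, which is why all four are worth reporting.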

  6. Pahute Mesa Well Development and Testing Analyses for Wells ER-20-8 and ER-20-4, Nevada National Security Site, Nye County, Nevada, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Ruskauff and Sam Marutzky

    2012-09-01

    Wells ER-20-4 and ER-20-8 were drilled during fiscal year (FY) 2009 and FY 2010 (NNSA/NSO, 2011a and b). The closest underground nuclear test detonations to the area of investigation are TYBO (U-20y), BELMONT (U-20as), MOLBO (U-20ag), BENHAM (U-20c), and HOYA (U-20be) (Figure 1-1). The TYBO, MOLBO, and BENHAM detonations had working points located below the regional water table. The BELMONT and HOYA detonation working points were located just above the water table, and the cavities for these detonations are calculated to extend below the water table (Pawloski et al., 2002). The broad purpose of Wells ER-20-4 and ER-20-8 is to determine the extent of radionuclide-contaminated groundwater, the geologic formations, the groundwater geochemistry as an indicator of age and origin, and the water-bearing properties and hydraulic conditions that influence radionuclide migration. Well development and testing is performed to determine the hydraulic properties at the well and between other wells, and to obtain groundwater samples at the well that are representative of the formation at the well. The area location, wells, underground nuclear detonations, and other features are shown in Figure 1-1. Hydrostratigraphic cross sections A-A’, B-B’, C-C’, and D-D’ are shown in Figures 1-2 through 1-5, respectively.

  7. Complex behaviour and predictability of the European dry spell regimes

    Directory of Open Access Journals (Sweden)

    X. Lana

    2010-09-01

    Full Text Available The complex spatial and temporal characteristics of European dry spell lengths (DSL; sequences of consecutive days with rainfall amount below a certain threshold) and their randomness and predictive instability are analysed from daily pluviometric series recorded at 267 rain gauges along the second half of the 20th century. DSL are obtained by considering four thresholds, R0, of 0.1, 1.0, 5.0 and 10.0 mm/day. A proper quantification of the complexity, randomness and predictive instability of the different DSL regimes in Europe is achieved on the basis of fractal analyses and dynamic system theory, including the reconstruction theorem. First, the concept of lacunarity is applied to the series of daily rainfall, and the lacunarity curves are well fitted to Cantor and random Cantor sets. Second, the rescaled range analysis reveals that randomness, persistence and anti-persistence are present in the European DSL series. Third, the complexity of the physical process governing the DSL series is quantified by the minimum number of nonlinear equations determined by the correlation dimension. And fourth, the loss of memory of the physical process, which is one of the reasons for the complex predictability, is characterized by the values of the Kolmogorov entropy, and the predictive instability is directly associated with positive Lyapunov exponents. In this way, new bases for a better prediction of DSLs in Europe, sometimes leading to drought episodes, are established. Concretely, three predictive strategies are proposed in Sect. 5. It is worth mentioning that the spatial distribution of all fractal parameters does not solely depend on latitude and longitude but also reflects the effects of orography, continental climate or vicinity to the Atlantic and Arctic Oceans and the Mediterranean Sea.
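    The rescaled-range statistic behind this kind of analysis is simple to compute: the Hurst exponent is the slope of log(R/S) against log(window size), with H ≈ 0.5 signalling randomness, H > 0.5 persistence and H < 0.5 anti-persistence. A minimal sketch on synthetic white noise; window sizes and data are illustrative, not the paper's:

```python
# Minimal rescaled-range (R/S) analysis and Hurst-exponent estimate.
# Data are synthetic Gaussian white noise, used only for illustration.
import math
import random

def rescaled_range(series):
    """R/S of one window: range of cumulative deviations over std deviation."""
    n = len(series)
    mean = sum(series) / n
    dev = [x - mean for x in series]
    cum, z = 0.0, []
    for d in dev:
        cum += d
        z.append(cum)
    r = max(z) - min(z)
    s = math.sqrt(sum(d * d for d in dev) / n)
    return r / s

def hurst(series, sizes=(8, 16, 32, 64)):
    """Slope of log(mean R/S) vs log(n) over non-overlapping windows."""
    xs, ys = [], []
    for n in sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        rs = sum(rescaled_range(c) for c in chunks) / len(chunks)
        xs.append(math.log(n))
        ys.append(math.log(rs))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

random.seed(0)
white = [random.gauss(0, 1) for _ in range(512)]
print(hurst(white))   # near 0.5 for uncorrelated noise
```

    A persistent series (long dry runs followed by long wet runs) would push the estimate above 0.5; small-sample bias makes short-window estimates run slightly high.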

  8. HostPhinder: A Phage Host Prediction Tool

    Directory of Open Access Journals (Sweden)

    Julia Villarroel

    2016-05-01

    Full Text Available The current dramatic increase of antibiotic-resistant bacteria has revitalised interest in bacteriophages as an alternative antibacterial treatment. Meanwhile, the development of bioinformatics methods for analysing genomic data places high-throughput approaches for phage characterization within reach. Here, we present HostPhinder, a tool aimed at predicting the bacterial host of phages by examining the phage genome sequence. Using a reference database of 2196 phages with known hosts, HostPhinder predicts the host species of a query phage as the host of the most genomically similar reference phages. As a measure of genomic similarity, the number of co-occurring k-mers (DNA sequences of length k) is used. Using an independent evaluation set, HostPhinder was able to correctly predict the host genus and species for 81% and 74% of the phages respectively, giving predictions for more phages than BLAST and significantly outperforming BLAST on phages for which both had predictions. HostPhinder predictions on phage draft genomes from the INTESTI phage cocktail corresponded well with the advertised targets of the cocktail. Our study indicates that for most phages genomic similarity correlates well with related bacterial hosts. HostPhinder is available as an interactive web service [1] and as a stand-alone download from the Docker registry [2].
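    The co-occurring k-mer measure is easy to sketch: count distinct k-mers shared between query and reference sequences. Toy sequences and a small k are used below; the tool itself works on whole genomes with a longer k:

```python
# Toy co-occurring k-mer similarity: the count of distinct shared k-mers
# between two sequences. Sequences and k are invented for illustration.

def kmers(seq, k):
    """Set of all distinct substrings of length k."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def shared_kmers(a, b, k):
    return len(kmers(a, k) & kmers(b, k))

query = "ATGCGTACGTTAGC"
ref_close = "ATGCGTACGTTAGG"   # differs from the query in its last base
ref_far = "TTTTAAAACCCCGGG"    # unrelated composition

print(shared_kmers(query, ref_close, 4))   # 10 of the query's 11 4-mers
print(shared_kmers(query, ref_far, 4))     # 0
```

    A host prediction in this scheme is simply the known host of the reference with the highest shared-k-mer count.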

  9. Predicting well-being longitudinally for mothers rearing offspring with intellectual and developmental disabilities.

    Science.gov (United States)

    Grein, K A; Glidden, L M

    2015-07-01

    Well-being outcomes for parents of children with intellectual and developmental disabilities (IDD) may vary from positive to negative at different times and for different measures of well-being. Predicting and explaining this variability has been a major focus of family research for reasons that have both theoretical and applied implications. The current study used data from a 23-year longitudinal investigation of adoptive and birth parents of children with IDD to determine which early child, mother and family characteristics would predict the variance in maternal outcomes 20 years after their original measurement. Using hierarchical regression analyses, we tested the predictive power of variables measured when children were 7 years old on outcomes of maternal well-being when children were 26 years old. Outcome variables included maternal self-report measures of depression and well-being. Final models of well-being accounted for 20% to 34% of variance. For most outcomes, Family Accord and/or the personality variable of Neuroticism (emotional stability/instability) were significant predictors, but some variables demonstrated a different pattern. These findings confirm that (1) characteristics of the child, mother and family during childhood can predict outcomes of maternal well-being 20 years later; and (2) different predictor-outcome relationships can vary substantially, highlighting the importance of using multiple measures to gain a more comprehensive understanding of maternal well-being. These results have implications for refining prognoses for parents and for tailoring service delivery to individual child, parent and family characteristics. © 2014 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.
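    The hierarchical-regression logic (how much variance a block of childhood predictors adds over a base model) can be sketched with ordinary least squares: fit the base model, add predictors, compare R². A pure-stdlib sketch on synthetic data; the variable names are only loosely inspired by the study's measures:

```python
# Incremental-R^2 sketch of hierarchical regression on synthetic data.
# OLS is solved via the normal equations with Gaussian elimination.
import random

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def r_squared(rows, y):
    """OLS R^2 with an intercept; rows is a list of predictor tuples."""
    X = [[1.0, *row] for row in rows]
    k = len(X[0])
    XtX = [[sum(r[a] * r[b] for r in X) for b in range(k)] for a in range(k)]
    Xty = [sum(r[a] * yi for r, yi in zip(X, y)) for a in range(k)]
    beta = solve(XtX, Xty)
    yhat = [sum(b * v for b, v in zip(beta, r)) for r in X]
    ybar = sum(y) / len(y)
    sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    sst = sum((yi - ybar) ** 2 for yi in y)
    return 1 - sse / sst

random.seed(1)
x1 = [random.gauss(0, 1) for _ in range(200)]   # e.g. a family-accord score
x2 = [random.gauss(0, 1) for _ in range(200)]   # e.g. a neuroticism score
y = [2 * a + 0.5 * b + random.gauss(0, 1) for a, b in zip(x1, x2)]

base = r_squared([(a,) for a in x1], y)         # step 1: base predictor
full = r_squared(list(zip(x1, x2)), y)          # step 2: add the second block
print(base, full, full - base)   # the increment is the block's added power
```

    Because the models are nested, the full model's R² can never fall below the base model's; the increment is what the added block contributes.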

  10. Tree Species Abundance Predictions in a Tropical Agricultural Landscape with a Supervised Classification Model and Imbalanced Data

    Directory of Open Access Journals (Sweden)

    Sarah J. Graves

    2016-02-01

    Mapping species through classification of imaging spectroscopy data is facilitating research to understand tree species distributions at increasingly greater spatial scales. Classification requires a dataset of field observations matched to the image, which will often reflect natural species distributions, resulting in an imbalanced dataset with many samples for common species and few samples for less common species. Despite the high prevalence of imbalanced datasets in multiclass species predictions, the effect on species prediction accuracy and landscape species abundance has not yet been quantified. First, we trained and assessed the accuracy of a support vector machine (SVM) model with a highly imbalanced dataset of 20 tropical species and one mixed-species class of 24 species identified in a hyperspectral image mosaic (350–2500 nm) of Panamanian farmland and secondary forest fragments. The model, with an overall accuracy of 62% ± 2.3% and F-score of 59% ± 2.7%, was applied to the full image mosaic (23,000 ha) at a 2-m resolution to produce a species prediction map, which suggested that this tropical agricultural landscape is more diverse than what has been presented in field-based studies. Second, we quantified the effect of class imbalance on model accuracy. Model assessment showed a trend where species with more samples were consistently overpredicted while species with fewer samples were underpredicted. Standardizing sample size reduced model accuracy, but also reduced the level of species over- and under-prediction. This study advances operational species mapping of diverse tropical landscapes by detailing the effect of imbalanced data on classification accuracy and providing estimates of tree species abundance in an agricultural landscape. Species maps using data and methods presented here can be used in landscape analyses of species distributions to understand human or environmental effects, in addition to focusing conservation
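
    The class-imbalance effect described above (common classes overpredicted, rare ones underpredicted) can be illustrated with per-class weighting. The study used an SVM on hyperspectral features; as a minimal stand-in, the sketch below uses a plain-NumPy weighted logistic regression on hypothetical 1-D data to show how inverse-frequency class weights shift the decision boundary toward the rare class.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D feature: 500 majority samples vs. 25 minority samples
# (the study's real inputs were hyperspectral pixel features).
x_maj = rng.normal(0.0, 1.0, 500)
x_min = rng.normal(2.0, 1.0, 25)
X = np.concatenate([x_maj, x_min])
y = np.concatenate([np.zeros(500), np.ones(25)])

def fit_logistic(X, y, sample_w, lr=0.1, steps=3000):
    """Weighted logistic regression fitted by gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(w * X + b)))
        w -= lr * np.sum(sample_w * (p - y) * X) / np.sum(sample_w)
        b -= lr * np.sum(sample_w * (p - y)) / np.sum(sample_w)
    return w, b

def minority_recall(w, b):
    pred = (w * X + b) > 0.0
    return np.mean(pred[y == 1])

# Unweighted fit vs. inverse-frequency ("balanced") class weights.
unit_w = np.ones_like(y)
bal_w = np.where(y == 1, len(y) / (2 * 25), len(y) / (2 * 500))

r_plain = minority_recall(*fit_logistic(X, y, unit_w))
r_bal = minority_recall(*fit_logistic(X, y, bal_w))
print(r_plain, r_bal)  # balancing typically raises recall on the rare class
```

    The same weighting idea is what `class_weight='balanced'` does in common SVM implementations; it trades some majority-class accuracy for less systematic under-prediction of rare species.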

  11. Investigation of the tensor analysing power T{sub 20} in the vector d+p→{sup 3}He+η reaction with the COSY-ANKE experiment

    Energy Technology Data Exchange (ETDEWEB)

    Papenbrock, Michael Paul

    2016-07-01

    The work presented here is a contribution to the ongoing search for quasi-bound states between mesons and nuclei. Early theoretical predictions estimated that an η-meson may form a quasi-bound state with nuclei with a mass number of A≥12. Many measurements in this field have revealed evidence that such a state may indeed exist. More interestingly, it may already form at much lower mass numbers, e.g. between an η-meson and a {sup 3}He nucleus. Of special note is an unpolarised measurement of the d+p→{sup 3}He+η reaction, which has been studied in unprecedented detail in the excess energy range starting at -5 MeV. Evidence for a quasi-bound η{sup 3}He state was found by parametrising total and differential cross sections and extracting the final state interaction between the two ejectiles, which was found to be surprisingly strong. However, with both beam and target being unpolarised, possible contributions of the initial state to the cross section could not be investigated. In the d+p→{sup 3}He+η reaction the final state can be produced from two different initial spin states with S=1/2 and S=3/2. If the production amplitudes of these respective states exhibit a different energy dependence, it could manifest itself in the shape of the total cross section in an unpolarised measurement, thereby affecting the determination of the final state interaction. A new measurement of the vector d+p→{sup 3}He+η reaction with a vector and tensor polarised deuteron beam was performed with COSY-ANKE in the same excess energy range, i.e. starting at -5 MeV, in order to determine the tensor analysing power T{sub 20}. The extracted values of T{sub 20} are compatible with a constant across this energy range. A linear fit revealed only a rather small possible energy dependence. Hence, the possible impact of the initial spin states on the extraction of the final state interaction was found to be of very little significance. Combining

  12. HIV-1 DNA predicts disease progression and post-treatment virological control

    Science.gov (United States)

    Williams, James P; Hurst, Jacob; Stöhr, Wolfgang; Robinson, Nicola; Brown, Helen; Fisher, Martin; Kinloch, Sabine; Cooper, David; Schechter, Mauro; Tambussi, Giuseppe; Fidler, Sarah; Carrington, Mary; Babiker, Abdel; Weber, Jonathan

    2014-01-01

    In HIV-1 infection, a population of latently infected cells facilitates viral persistence despite antiretroviral therapy (ART). With the aim of identifying individuals in whom ART might induce a period of viraemic control on stopping therapy, we hypothesised that quantification of the pool of latently infected cells in primary HIV-1 infection (PHI) would predict clinical progression and viral replication following ART. We measured HIV-1 DNA in a highly characterised randomised population of individuals with PHI. We explored associations between HIV-1 DNA and immunological and virological markers of clinical progression, including viral rebound in those interrupting therapy. In multivariable analyses, HIV-1 DNA was more predictive of disease progression than plasma viral load and, at treatment interruption, predicted time to plasma virus rebound. HIV-1 DNA may help identify individuals who could safely interrupt ART in future HIV-1 eradication trials. Clinical trial registration: ISRCTN76742797 and EudraCT2004-000446-20 DOI: http://dx.doi.org/10.7554/eLife.03821.001 PMID:25217531

  13. Human Factors Risk Analyses of a Doffing Protocol for Ebola-Level Personal Protective Equipment: Mapping Errors to Contamination.

    Science.gov (United States)

    Mumma, Joel M; Durso, Francis T; Ferguson, Ashley N; Gipson, Christina L; Casanova, Lisa; Erukunuakpor, Kimberly; Kraft, Colleen S; Walsh, Victoria L; Zimring, Craig; DuBose, Jennifer; Jacob, Jesse T

    2018-03-05

    Doffing protocols for personal protective equipment (PPE) are critical for keeping healthcare workers (HCWs) safe during care of patients with Ebola virus disease. We assessed the relationship between errors and self-contamination during doffing. Eleven HCWs experienced with doffing Ebola-level PPE participated in simulations in which HCWs donned PPE marked with surrogate viruses (ɸ6 and MS2), completed a clinical task, and were assessed for contamination after doffing. Simulations were video recorded, and a failure modes and effects analysis and fault tree analyses were performed to identify errors during doffing, quantify their risk (risk index), and predict contamination data. Fifty-one types of errors were identified, many having the potential to spread contamination. Hand hygiene and removing the powered air purifying respirator (PAPR) hood had the highest total risk indexes (111 and 70, respectively) and number of types of errors (9 and 13, respectively). ɸ6 was detected on 10% of scrubs and the fault tree predicted a 10.4% contamination rate, likely occurring when the PAPR hood inadvertently contacted scrubs during removal. MS2 was detected on 10% of hands, 20% of scrubs, and 70% of inner gloves and the predicted rates were 7.3%, 19.4%, 73.4%, respectively. Fault trees for MS2 and ɸ6 contamination suggested similar pathways. Ebola-level PPE can both protect and put HCWs at risk for self-contamination throughout the doffing process, even among experienced HCWs doffing with a trained observer. Human factors methodologies can identify error-prone steps, delineate the relationship between errors and self-contamination, and suggest remediation strategies.
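
    The "risk index" ranking of doffing steps reported above follows the general failure modes and effects analysis (FMEA) pattern of scoring each failure mode and ranking by the product of the scores. The sketch below uses the common severity × occurrence × detection convention with hypothetical step names and scores; it is not the study's actual scoring scheme or data.

```python
# Generic FMEA-style risk ranking. The step names and the 1-10 scores are
# hypothetical illustrations, not the values measured in the study.
failure_modes = {
    "skip hand hygiene":        (9, 4, 3),  # (severity, occurrence, detection)
    "PAPR hood touches scrubs": (8, 5, 4),
    "glove tear unnoticed":     (7, 2, 6),
}

def risk_priority(sod):
    """Risk priority number: product of severity, occurrence, detection."""
    severity, occurrence, detection = sod
    return severity * occurrence * detection

ranked = sorted(failure_modes.items(),
                key=lambda kv: risk_priority(kv[1]), reverse=True)
for step, sod in ranked:
    print(f"{step}: RPN={risk_priority(sod)}")
```

    Ranking by such an index is what lets an analysis single out steps like hand hygiene and PAPR hood removal as the highest-risk parts of a protocol.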

  14. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B{sub 4}C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. 
The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  15. Severe Accident Recriticality Analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Puska, E.K.; Nilsson, Lars; Sjoevall, H.

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B 4 C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  16. Amikacin Concentrations Predictive of Ototoxicity in Multidrug-Resistant Tuberculosis Patients.

    Science.gov (United States)

    Modongo, Chawangwa; Pasipanodya, Jotam G; Zetola, Nicola M; Williams, Scott M; Sirugo, Giorgio; Gumbo, Tawanda

    2015-10-01

    Aminoglycosides, such as amikacin, are used to treat multidrug-resistant tuberculosis. However, ototoxicity is a common problem and is monitored using peak and trough amikacin concentrations based on World Health Organization recommendations. Our objective was to identify clinical factors predictive of ototoxicity using an agnostic machine learning method. We used classification and regression tree (CART) analyses to identify clinical factors, including amikacin concentration thresholds that predicted audiometry-confirmed ototoxicity among 28 multidrug-resistant pulmonary tuberculosis patients in Botswana. Amikacin concentrations were measured for all patients. The quantitative relationship between predictive factors and the probability of ototoxicity were then identified using probit analyses. The primary predictors of ototoxicity on CART analyses were cumulative days of therapy, followed by cumulative area under the concentration-time curve (AUC), which improved on the primary predictor by 87%. The area under the receiver operating curve was 0.97 on the test set. Peak and trough were not predictors in any tree. When algorithms were forced to pick peak and trough as primary predictors, the area under the receiver operating curve fell to 0.46. Probit analysis revealed that the probability of ototoxicity increased sharply starting after 6 months of therapy to near maximum at 9 months. A 10% probability of ototoxicity occurred with a threshold cumulative AUC of 87,232 days · mg · h/liter, while that of 20% occurred at 120,000 days · mg · h/liter. Thus, cumulative amikacin AUC and duration of therapy, and not peak and trough concentrations, should be used as the primary decision-making parameters to minimize the likelihood of ototoxicity in multidrug-resistant tuberculosis. Copyright © 2015, Modongo et al.
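
    The abstract pins down two points on the probit curve: a 10% ototoxicity probability at a cumulative AUC of 87,232 days·mg·h/liter and 20% at 120,000. Assuming a probit that is linear in cumulative AUC (a simplification of the paper's probit analysis), those two points determine the whole curve, as in this standard-library sketch:

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal

# (cumulative AUC, probability) pairs quoted in the abstract.
x1, p1 = 87_232.0, 0.10
x2, p2 = 120_000.0, 0.20

# Probit model p = Phi(a + b * AUC): two points give a 2x2 linear system.
z1, z2 = nd.inv_cdf(p1), nd.inv_cdf(p2)
b = (z2 - z1) / (x2 - x1)
a = z1 - b * x1

def ototox_prob(cum_auc):
    """Predicted ototoxicity probability at a given cumulative amikacin AUC."""
    return nd.cdf(a + b * cum_auc)

print(ototox_prob(100_000))  # risk at an intermediate cumulative exposure
```

    The fitted curve then supports exactly the kind of decision-making the authors propose: reading off the cumulative AUC at which the predicted probability crosses an acceptable threshold.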

  17. Predicting Word Reading Ability: A Quantile Regression Study

    Science.gov (United States)

    McIlraith, Autumn L.

    2018-01-01

    Predictors of early word reading are well established. However, it is unclear if these predictors hold for readers across a range of word reading abilities. This study used quantile regression to investigate predictive relationships at different points in the distribution of word reading. Quantile regression analyses used preschool and…
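
    Quantile regression fits different points of the outcome distribution by minimizing the pinball (quantile) loss rather than squared error. A minimal NumPy sketch with an intercept-only model and hypothetical reading scores (not the study's data) shows why: the empirical τ-quantile, not the mean, minimizes this loss, so the fit can describe poor readers separately from average ones.

```python
import numpy as np

rng = np.random.default_rng(42)
scores = rng.normal(100.0, 15.0, 400)  # hypothetical word-reading scores

def pinball(c, y, tau):
    """Pinball (quantile) loss of the constant predictor c at quantile tau."""
    r = y - c
    return np.mean(np.where(r >= 0, tau * r, (tau - 1) * r))

tau = 0.1  # the low end of the word-reading distribution
q = np.quantile(scores, tau)

# The empirical tau-quantile beats the sample mean as a constant predictor
# under pinball loss; full quantile regression replaces the constant with
# a linear function of the predictors.
print(pinball(q, scores, tau), pinball(scores.mean(), scores, tau))
```

    With covariates added, each chosen τ yields its own coefficient vector, which is how such a study can ask whether predictors of word reading differ for weak versus strong readers.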

  18. BEAN 2.0: an integrated web resource for the identification and functional analysis of type III secreted effectors.

    Science.gov (United States)

    Dong, Xiaobao; Lu, Xiaotian; Zhang, Ziding

    2015-01-01

    Gram-negative pathogenic bacteria inject type III secreted effectors (T3SEs) into host cells to sabotage their immune signaling networks. Because T3SEs constitute a meeting-point of pathogen virulence and host defense, they are of keen interest to the host-pathogen interaction research community. To accelerate the identification and functional understanding of T3SEs, we present BEAN 2.0 as an integrated web resource to predict, analyse and store T3SEs. BEAN 2.0 includes three major components. First, it provides an accurate T3SE predictor based on a hybrid approach. Using independent testing data, we show that BEAN 2.0 achieves a sensitivity of 86.05% and a specificity of 100%. Second, it integrates a set of online sequence analysis tools. Users can further perform functional analysis of putative T3SEs in a seamless way, such as subcellular location prediction, functional domain scan and disorder region annotation. Third, it compiles a database covering 1215 experimentally verified T3SEs and constructs two T3SE-related networks that can be used to explore the relationships among T3SEs. Taken together, by presenting a one-stop T3SE bioinformatics resource, we hope BEAN 2.0 can promote comprehensive understanding of the function and evolution of T3SEs. © The Author(s) 2015. Published by Oxford University Press.

  19. Applying a radiomics approach to predict prognosis of lung cancer patients

    Science.gov (United States)

    Emaminejad, Nastaran; Yan, Shiju; Wang, Yunzhi; Qian, Wei; Guan, Yubao; Zheng, Bin

    2016-03-01

    Radiomics is an emerging technology to decode tumor phenotype based on quantitative analysis of image features computed from radiographic images. In this study, we applied the radiomics concept to investigate the association among the CT image features of lung tumors, which are either quantitatively computed or subjectively rated by radiologists, and two genomic biomarkers, namely protein expression of the excision repair cross-complementing 1 (ERCC1) genes and a regulatory subunit of ribonucleotide reductase (RRM1), in predicting disease-free survival (DFS) of lung cancer patients after surgery. An image dataset involving 94 patients was used. Among them, 20 had cancer recurrence within 3 years, while 74 patients remained DFS. After tumor segmentation, 35 image features were computed from CT images. Using the Weka data mining software package, we selected 10 non-redundant image features. Applying a SMOTE algorithm to generate synthetic data to balance case numbers in the two DFS ("yes" and "no") groups, together with a leave-one-case-out training/testing method, we optimized and compared a number of machine learning classifiers using (1) quantitative image (QI) features, (2) subjective rated (SR) features, and (3) genomic biomarkers (GB). Data analyses showed relatively lower correlation among the QI, SR and GB prediction results (with Pearson correlation coefficients < 0.5). Among them, using QI yielded the highest performance.
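
    The SMOTE step above balances the 20 recurrence cases against the 74 disease-free cases by interpolating synthetic minority samples. The study used Weka's implementation; the sketch below is a minimal NumPy re-creation of the core interpolation idea on hypothetical 2-D features, not the exact algorithm or data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 2-D radiomic features: 74 "no recurrence" vs. 20 "recurrence".
X_major = rng.normal(0.0, 1.0, size=(74, 2))
X_minor = rng.normal(1.5, 1.0, size=(20, 2))

def smote_like(X, n_new, k=5, rng=rng):
    """Make n_new synthetic points by moving a random minority point part of
    the way toward one of its k nearest minority-class neighbours."""
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        d = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]   # k nearest, skipping the point itself
        j = rng.choice(nbrs)
        gap = rng.random()              # random position along the segment
        synth.append(X[i] + gap * (X[j] - X[i]))
    return np.array(synth)

X_synth = smote_like(X_minor, n_new=74 - 20)
X_balanced_minor = np.vstack([X_minor, X_synth])
print(X_balanced_minor.shape)  # (74, 2): minority class now matches majority
```

    Because the synthetic points lie on segments between real minority samples, the classifier sees a denser minority region rather than mere duplicates.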

  20. Potential predictability of a Colombian river flow

    Science.gov (United States)

    Córdoba-Machado, Samir; Palomino-Lemus, Reiner; Quishpe-Vásquez, César; García-Valdecasas-Ojeda, Matilde; Raquel Gámiz-Fortis, Sonia; Castro-Díez, Yolanda; Jesús Esteban-Parra, María

    2017-04-01

    In this study the predictability of an important Colombian river (Cauca) has been analysed based on the use of climatic variables as potential predictors. The Cauca River is considered one of the most important rivers of Colombia because its basin supports important productive activities related to agriculture, such as the production of coffee or sugar. Potential relationships between the Cauca River seasonal streamflow anomalies and different climatic variables such as sea surface temperature (SST), precipitation (Pt), temperature over land (Tm) and soil water (Sw) have been analysed for the period 1949-2009. To this end, moving correlation analyses over 30-year windows have been carried out for lags from one to four seasons for the global SST, and from one to two seasons for South American Pt, Tm and Sw. The stability of the significant correlations has also been studied, identifying the regions used as potential predictors of streamflow. Finally, in order to establish a prediction scheme based on the previous stable correlations, a Principal Component Analysis (PCA) has been applied to the potential predictor regions in order to obtain a representative time series for each predictor field. Significant and stable correlations between the seasonal streamflow and the tropical Pacific SST (El Niño region) are found for lags from one to four seasons (one year). Additionally, some regions in the Indian and Atlantic Oceans also show significant and stable correlations at different lags, highlighting the influence that the Atlantic SST exerts on the hydrology of Colombia. Significant and stable correlations are also found with Pt, Tm and Sw for some regions over South America, at lags of one and two seasons. The prediction of the Cauca seasonal streamflow based on this scheme shows an acceptable skill and represents a relative improvement over the predictability obtained using the teleconnection indices associated with El Niño.
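
    The PCA step in the scheme above condenses each predictor field (e.g. an SST region) into a single representative time series. A NumPy sketch on a synthetic anomaly grid (hypothetical data, one dominant spatial pattern plus noise) shows the standard SVD route to that leading principal-component series:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic predictor field: 60 seasons x 100 grid points of SST anomalies,
# built as one dominant spatial pattern plus noise (hypothetical data).
n_t, n_s = 60, 100
pattern = rng.normal(size=n_s)          # fixed spatial pattern
signal = rng.normal(size=n_t)           # its amplitude through time
field = np.outer(signal, pattern) + 0.3 * rng.normal(size=(n_t, n_s))

# PCA via SVD of the time-centred anomaly matrix.
anom = field - field.mean(axis=0)
U, S, Vt = np.linalg.svd(anom, full_matrices=False)
explained = S**2 / np.sum(S**2)
pc1 = U[:, 0] * S[0]   # leading PC time series -> candidate predictor

print(round(float(explained[0]), 2))  # variance share of the first mode
```

    In the prediction scheme, a series like `pc1` computed for each stable predictor region would then enter the regression for seasonal streamflow.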

  1. Sydlangeland Fjernvarme A.m.b.a. Feasibility study on 10-20,000 m{sup 3} pond heat storage [Denmark]; Sydlangeland Fjernvarme A.m.b.a. Forundersoegelse for 10-20.000 m{sup 3} damvarmelager

    Energy Technology Data Exchange (ETDEWEB)

    Visbjerg, L. [Sydlangeland Fjernvarme A.m.b.a. (Denmark); Wesenberg, C.; Bliksted, T. [NIRAS Raadgivende Ingenioerer og Planlaeggere A/S (Denmark); Porsvig, M. [Geoteknisk Institut (Denmark); Duer, K. [DTU, Solenergi Center Danmark (Denmark)

    2000-11-01

    The project includes siting analyses, geotechnical feasibility studies and evaluations of operating economy for establishing a seasonal heat storage carried out as a 10-20,000 m{sup 3} pilot pond heat storage at Sydlangeland straw-fired district heating plant. (EHS)

  2. Modeling a Predictive Energy Equation Specific for Maintenance Hemodialysis.

    Science.gov (United States)

    Byham-Gray, Laura D; Parrott, J Scott; Peters, Emily N; Fogerite, Susan Gould; Hand, Rosa K; Ahrens, Sean; Marcus, Andrea Fleisch; Fiutem, Justin J

    2017-03-01

    Hypermetabolism is theorized in patients diagnosed with chronic kidney disease who are receiving maintenance hemodialysis (MHD). We aimed to distinguish key disease-specific determinants of resting energy expenditure to create a predictive energy equation that more precisely establishes energy needs with the intent of preventing protein-energy wasting. For this 3-year multisite cross-sectional study (N = 116), eligible participants were diagnosed with chronic kidney disease and were receiving MHD for at least 3 months. Predictors for the model included weight, sex, age, C-reactive protein (CRP), glycosylated hemoglobin, and serum creatinine. The outcome variable was measured resting energy expenditure (mREE). Regression modeling was used to generate predictive formulas and Bland-Altman analyses to evaluate accuracy. The majority were male (60.3%), black (81.0%), and non-Hispanic (76.7%), and 23% were ≥65 years old. After screening for multicollinearity, the best predictive model of mREE (R{sup 2} = 0.67) included weight, age, sex, and CRP. Two alternative models with acceptable predictability (R{sup 2} = 0.66) were derived with glycosylated hemoglobin or serum creatinine. Based on Bland-Altman analyses, the maintenance hemodialysis equation that included CRP had the best precision, with the highest proportion of participants' predicted energy expenditure classified as accurate (61.2%) and with the lowest number of individuals with underestimation or overestimation. This study confirms disease-specific factors as key determinants of mREE in patients on MHD and provides a preliminary predictive energy equation. Further prospective research is necessary to test the reliability and validity of this equation across diverse populations of patients who are receiving MHD.
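
    The Bland-Altman analysis used above summarizes agreement between predicted and measured REE by the mean difference (bias) and the limits of agreement, bias ± 1.96 × SD of the differences. A minimal sketch with hypothetical kcal/day values (not the study's data):

```python
import numpy as np

# Hypothetical measured vs. predicted resting energy expenditure (kcal/day);
# illustrative numbers only, not the study's data.
mree = np.array([1450.0, 1620.0, 1390.0, 1710.0, 1540.0, 1480.0])
pree = np.array([1500.0, 1580.0, 1420.0, 1690.0, 1600.0, 1450.0])

diff = pree - mree
bias = diff.mean()             # systematic over/under-prediction
half_width = 1.96 * diff.std(ddof=1)  # half-width of limits of agreement

print(f"bias={bias:.1f} kcal/day, "
      f"limits of agreement=({bias - half_width:.1f}, {bias + half_width:.1f})")
```

    A precise equation shows a bias near zero and narrow limits; counting how many subjects fall within a clinically acceptable band is what yields an "accurate prediction" percentage like the 61.2% reported.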

  3. Invited review: A commentary on predictive cheese yield formulas.

    Science.gov (United States)

    Emmons, D B; Modler, H W

    2010-12-01

    Predictive cheese yield formulas have evolved from one based only on casein and fat in 1895. Refinements have included moisture and salt in cheese and whey solids as separate factors, paracasein instead of casein, and exclusion of whey solids from moisture associated with cheese protein. The General, Barbano, and Van Slyke formulas were tested critically using yield and composition of milk, whey, and cheese from 22 vats of Cheddar cheese. The General formula is based on the sum of cheese components: fat, protein, moisture, salt, whey solids free of fat and protein, as well as milk salts associated with paracasein. The testing yielded unexpected revelations. It was startling that the sum of components (SofC) in cheese was less than 100%. The apparent low estimation of SofC led to the idea of adjusting upwards, for each vat, the 5 measured components in the formula by the observed SofC, as a fraction. The mean of the adjusted predicted yields as percentages of actual yields was 99.99%. The adjusted forms of the General, Barbano, and Van Slyke formulas gave predicted yields equal to the actual yields. It was apparent that unadjusted yield formulas did not accurately predict yield; however, the unadjusted predicted yield as a percentage of actual yield (PY%AY) can be useful as a control tool for analyses of cheese and milk. It was unexpected that total milk protein in the adjusted General formula gave the same predicted yields as casein and paracasein, indicating that casein or paracasein may not always be necessary for successful yield prediction. The use of constants for recovery of fat and protein in the adjusted General formula gave adjusted predicted yields equal to actual yields, indicating that analyses of cheese for protein and fat may not always be necessary for yield prediction. Composition of cheese was estimated using a predictive formula; actual yield was needed for estimation of composition. Adjusted formulas are recommended for estimating target yields and cheese yield efficiency.
Constants for solute exclusion
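
    For concreteness, the Van Slyke formula discussed above has a classic textbook form for Cheddar: yield per 100 kg milk = [(0.93 × fat) + (casein − 0.1)] × 1.09 / (1 − moisture fraction), where 0.93 is fat recovery, 0.1 the casein lost to whey, and 1.09 the allowance for salt and other solutes. The sketch below uses that standard form; the exact constants in a given plant's version may differ.

```python
def van_slyke_yield(fat_pct, casein_pct, cheese_moisture=0.37):
    """Classic Van Slyke Cheddar yield formula (kg cheese per 100 kg milk).

    0.93 - fraction of milk fat recovered in the cheese
    0.1  - casein lost to the whey (percentage points)
    1.09 - allowance for salt, whey solids and other solutes
    """
    dry_matter = (0.93 * fat_pct) + (casein_pct - 0.1)
    return dry_matter * 1.09 / (1.0 - cheese_moisture)

# Typical milk at 3.5% fat and 2.5% casein, 37% cheese moisture:
print(round(van_slyke_yield(3.5, 2.5), 2))
```

    The "adjustment" advocated in the review amounts to scaling such a prediction by the observed sum-of-components fraction for each vat.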

  4. Recent plant studies using Victoria 2.0

    International Nuclear Information System (INIS)

    Bixler, Nathan E.; Gasser, Ronald D.

    2000-01-01

    VICTORIA 2.0 is a mechanistic computer code designed to analyze fission product behavior within the reactor coolant system (RCS) during a severe nuclear reactor accident. It provides detailed predictions of the release of radioactive and nonradioactive materials from the reactor core and transport and deposition of these materials within the RCS and secondary circuits. These predictions account for the chemical and aerosol processes that affect radionuclide behavior. VICTORIA 2.0 was released in early 1999; a new version, VICTORIA 2.1, is now under development. The largest improvements in VICTORIA 2.1 are connected with the thermochemical database, which is being revised and expanded following the recommendations of a peer review. Three risk-significant severe accident sequences have recently been investigated using the VICTORIA 2.0 code. The focus here is on how various chemistry options affect the predictions. Additionally, the VICTORIA predictions are compared with ones made using the MELCOR code. The three sequences are a station blackout in a GE BWR and steam generator tube rupture (SGTR) and pump-seal LOCA sequences in a 3-loop Westinghouse PWR. These sequences cover a range of system pressures, from fully depressurized to full system pressure. The chief results of this study are the fission product fractions that are retained in the core, RCS, secondary, and containment and the fractions that are released into the environment.

  5. Area under the curve predictions of dalbavancin, a new lipoglycopeptide agent, using the end of intravenous infusion concentration data point by regression analyses such as linear, log-linear and power models.

    Science.gov (United States)

    Bhamidipati, Ravi Kanth; Syed, Muzeeb; Mullangi, Ramesh; Srinivas, Nuggehally

    2018-02-01

    1. Dalbavancin, a lipoglycopeptide, is approved for treating gram-positive bacterial infections. The area under the plasma concentration versus time curve (AUC{sub inf}) of dalbavancin is a key parameter and the AUC{sub inf}/MIC ratio is a critical pharmacodynamic marker. 2. Using the end of intravenous infusion concentration (i.e. C{sub max}), the C{sub max} versus AUC{sub inf} relationship for dalbavancin was established by regression analyses (i.e. linear, log-log, log-linear and power models) using 21 pairs of subject data. 3. The predictions of the AUC{sub inf} were performed using published C{sub max} data by application of the regression equations. The quotient of observed/predicted values rendered the fold difference. The mean absolute error (MAE)/root mean square error (RMSE) and correlation coefficient (r) were used in the assessment. 4. MAE and RMSE values for the various models were comparable. The C{sub max} versus AUC{sub inf} relationship exhibited excellent correlation (r > 0.9488). The internal data evaluation showed narrow confinement (0.84-1.14-fold difference), and the models predicted AUC{sub inf} with a RMSE of 3.02-27.46%, with the fold difference largely contained within 0.64-1.48. 5. Regardless of the regression model, a single time point strategy of using C{sub max} (i.e. end of the 30-min infusion) is amenable as a prospective tool for predicting the AUC{sub inf} of dalbavancin in patients.
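
    One of the model families above, the power model AUC = a·Cmax^b, reduces to ordinary linear regression in log-log space. The sketch below fits it on synthetic (Cmax, AUC) pairs and computes the observed/predicted fold difference and RMSE% used as assessment metrics; the data and the assumed exponent are hypothetical, not the paper's 21 subject pairs.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical (C_max, AUC_inf) pairs following a power law with noise.
cmax = rng.uniform(200.0, 400.0, 21)               # mg/L at end of infusion
auc_obs = 0.9 * cmax**1.1 * np.exp(rng.normal(0.0, 0.05, 21))

# Power model AUC = a * Cmax^b  <=>  ln AUC = ln a + b * ln Cmax.
b, ln_a = np.polyfit(np.log(cmax), np.log(auc_obs), 1)

def predict(c):
    """AUC predicted from the end-of-infusion concentration."""
    return np.exp(ln_a) * c**b

fold = auc_obs / predict(cmax)                     # observed / predicted
rmse_pct = 100.0 * np.sqrt(np.mean((1.0 - predict(cmax) / auc_obs) ** 2))
print(round(float(b), 2), round(float(rmse_pct), 1))
```

    With the model fitted once, a single measured Cmax yields a predicted AUCinf, which is the single-time-point strategy the paper advocates.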

  6. Neutron radiative capture by the 241Am nucleus in the energy range 1 keV-20 MeV

    International Nuclear Information System (INIS)

    Zolotarev, K.I.; Ignatyuk, A.V.; Tolstikov, V.A.; Tertychnyj, G.Ya.

    1998-01-01

    Production of high actinides leads to many technological problems in nuclear power. The 241 Am(n,γ) 242 Am reaction is one of the sources of high actinide buildup, so knowledge of the radiative capture cross-section of 241 Am for neutron energies up to 20 MeV is of considerable importance for present-day fission reactors and future advanced reactors. The main goal of this paper is the evaluation of the excitation function for the reaction 241 Am(n,γ) 242 Am in the energy range 1 keV-20 MeV. The evaluation was done on the basis of analysed experimental data, data from theoretical model calculations and systematic predictions for 14.5 MeV and 20 MeV. Data from the present evaluation are compared with the cross-section values given in the evaluations carried out earlier. (author)

  7. Predictive Genomic Analyses Inform the Basis for Vitamin Metabolism and Provisioning in Bacteria-Arthropod Endosymbioses.

    Science.gov (United States)

    Serbus, Laura R; Rodriguez, Brian Garcia; Sharmin, Zinat; Momtaz, A J M Zehadee; Christensen, Steen

    2017-06-07

    The requirement of vitamins for core metabolic processes creates a unique set of pressures for arthropods subsisting on nutrient-limited diets. While endosymbiotic bacteria carried by arthropods have been widely implicated in vitamin provisioning, the underlying molecular mechanisms are not well understood. To address this issue, standardized predictive assessment of vitamin metabolism was performed in 50 endosymbionts of insects and arachnids. The results predicted that arthropod endosymbionts overall have little capacity for complete de novo biosynthesis of conventional or active vitamin forms. Partial biosynthesis pathways were commonly predicted, suggesting a substantial role in vitamin provisioning. Neither taxonomic relationships between host and symbiont, nor the mode of host-symbiont interaction were clear predictors of endosymbiont vitamin pathway capacity. Endosymbiont genome size and the synthetic capacity of nonsymbiont taxonomic relatives were more reliable predictors. We developed a new software application that also predicted that last-step conversion of intermediates into active vitamin forms may contribute further to vitamin biosynthesis by endosymbionts. Most instances of predicted vitamin conversion were paralleled by predictions of vitamin use. This is consistent with achievement of provisioning in some cases through upregulation of pathways that were retained for endosymbiont benefit. The predicted absence of other enzyme classes further suggests a baseline of vitamin requirement by the majority of endosymbionts, as well as some instances of putative mutualism. Adaptation of this workflow to analysis of other organisms and metabolic pathways will provide new routes for considering the molecular basis for symbiosis on a comprehensive scale. Copyright © 2017 Serbus et al.

  8. Predictive Genomic Analyses Inform the Basis for Vitamin Metabolism and Provisioning in Bacteria-Arthropod Endosymbioses

    Directory of Open Access Journals (Sweden)

    Laura R. Serbus

    2017-06-01

    Full Text Available The requirement of vitamins for core metabolic processes creates a unique set of pressures for arthropods subsisting on nutrient-limited diets. While endosymbiotic bacteria carried by arthropods have been widely implicated in vitamin provisioning, the underlying molecular mechanisms are not well understood. To address this issue, standardized predictive assessment of vitamin metabolism was performed in 50 endosymbionts of insects and arachnids. The results predicted that arthropod endosymbionts overall have little capacity for complete de novo biosynthesis of conventional or active vitamin forms. Partial biosynthesis pathways were commonly predicted, suggesting a substantial role in vitamin provisioning. Neither taxonomic relationships between host and symbiont, nor the mode of host-symbiont interaction were clear predictors of endosymbiont vitamin pathway capacity. Endosymbiont genome size and the synthetic capacity of nonsymbiont taxonomic relatives were more reliable predictors. We developed a new software application that also predicted that last-step conversion of intermediates into active vitamin forms may contribute further to vitamin biosynthesis by endosymbionts. Most instances of predicted vitamin conversion were paralleled by predictions of vitamin use. This is consistent with achievement of provisioning in some cases through upregulation of pathways that were retained for endosymbiont benefit. The predicted absence of other enzyme classes further suggests a baseline of vitamin requirement by the majority of endosymbionts, as well as some instances of putative mutualism. Adaptation of this workflow to analysis of other organisms and metabolic pathways will provide new routes for considering the molecular basis for symbiosis on a comprehensive scale.

  9. MED: a new non-supervised gene prediction algorithm for bacterial and archaeal genomes

    Directory of Open Access Journals (Sweden)

    Yang Yi-Fan

    2007-03-01

    Full Text Available Abstract Background Despite a remarkable success in the computational prediction of genes in Bacteria and Archaea, the lack of a comprehensive understanding of prokaryotic gene structures prevents further elucidation of differences among genomes. It remains worthwhile to develop new ab initio algorithms that not only predict genes accurately, but also facilitate comparative studies of prokaryotic genomes. Results This paper describes a new prokaryotic genefinding algorithm based on a comprehensive statistical model of protein-coding Open Reading Frames (ORFs) and Translation Initiation Sites (TISs). The former is based on a linguistic "Entropy Density Profile" (EDP) model of coding DNA sequence and the latter comprises several relevant features related to translation initiation. They are combined to form a so-called Multivariate Entropy Distance (MED) algorithm, MED 2.0, that incorporates several strategies in an iterative program. The iterations enable a non-supervised learning process that yields a set of genome-specific parameters for the gene structure before genes are predicted. Conclusion Results of extensive tests show that MED 2.0 achieves competitively high performance in gene prediction for both 5' and 3' end matches, compared to the current best prokaryotic gene finders. The advantage of MED 2.0 is particularly evident for GC-rich genomes and archaeal genomes. Furthermore, the genome-specific parameters given by MED 2.0 match the current understanding of prokaryotic genomes and may serve as tools for comparative genomic studies. In particular, MED 2.0 is shown to reveal divergent translation initiation mechanisms in archaeal genomes while making a more accurate prediction of TISs compared to the existing gene finders and the current GenBank annotation.
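    As a toy illustration of the entropy statistics such coding models build on (not the actual EDP implementation), one can compute the Shannon entropy of the nucleotide distribution at each of the three codon positions of a candidate ORF; coding frames typically show position-dependent entropies:

```python
from collections import Counter
from math import log2

def entropy(counts):
    """Shannon entropy (bits) of a distribution given as raw counts."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c)

def codon_position_entropies(orf):
    """Entropy of the nucleotide distribution at each of the 3 codon positions."""
    positions = [orf[i::3] for i in range(3)]  # every 3rd base, offsets 0,1,2
    return [entropy(list(Counter(p).values())) for p in positions]

orf = "ATGGCTGCAGCTGGTGCTTAA"  # toy ORF (length divisible by 3)
ents = codon_position_entropies(orf)
```

For four equiprobable bases the entropy is 2 bits; deviations from that ceiling at specific codon positions are the kind of signal a coding/non-coding discriminator can exploit.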

  10. Analysing harmonic motions with an iPhone’s magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.

  11. Maladaptive social information processing in childhood predicts young men's atypical amygdala reactivity to threat.

    Science.gov (United States)

    Choe, Daniel Ewon; Shaw, Daniel S; Forbes, Erika E

    2015-05-01

    Maladaptive social information processing, such as hostile attributional bias and aggressive response generation, is associated with childhood maladjustment. Although social information processing problems are correlated with heightened physiological responses to social threat, few studies have examined their associations with neural threat circuitry, specifically amygdala activation to social threat. A cohort of 310 boys participated in an ongoing longitudinal study and completed questionnaires and laboratory tasks assessing their social and cognitive characteristics when they were between 10 and 12 years of age. At age 20, 178 of these young men underwent functional magnetic resonance imaging and a social threat task. At age 22, adult criminal arrest records and self-reports of impulsiveness were obtained. Path models indicated that maladaptive social information processing at ages 10 and 11 predicted increased left amygdala reactivity to fear faces, an ambiguous threat, at age 20 while accounting for childhood antisocial behavior, empathy, IQ, and socioeconomic status. Exploratory analyses indicated that aggressive response generation - the tendency to respond to threat with reactive aggression - predicted left amygdala reactivity to fear faces and was concurrently associated with empathy, antisocial behavior, and hostile attributional bias, whereas hostile attributional bias correlated with IQ. Although unrelated to social information processing problems, bilateral amygdala reactivity to anger faces at age 20 was unexpectedly predicted by low IQ at age 11. Amygdala activation did not mediate associations between social information processing and number of criminal arrests, but both impulsiveness at age 22 and arrests were correlated with right amygdala reactivity to anger facial expressions at age 20. Childhood social information processing and IQ predicted young men's amygdala response to threat a decade later, which suggests that childhood social

  12. Cell division cycle 20 overexpression predicts poor prognosis for patients with lung adenocarcinoma.

    Science.gov (United States)

    Shi, Run; Sun, Qi; Sun, Jing; Wang, Xin; Xia, Wenjie; Dong, Gaochao; Wang, Anpeng; Jiang, Feng; Xu, Lin

    2017-03-01

    The cell division cycle 20, a key component of the spindle assembly checkpoint, is an essential activator of the anaphase-promoting complex. Aberrant expression of cell division cycle 20 has been detected in various human cancers. However, its clinical significance has never been deeply investigated in non-small-cell lung cancer. By analyzing The Cancer Genome Atlas database and several online databases, we validated the overexpression of cell division cycle 20 at both the messenger RNA and protein levels, explored its clinical significance, and evaluated its prognostic role in non-small-cell lung cancer. Cell division cycle 20 expression was significantly correlated with sex (p = 0.003), histological classification (p overexpression of cell division cycle 20 was significantly associated with larger primary tumor size (p = 0.0023), higher MKI67 level (r = 0.7618, p Overexpression of cell division cycle 20 is associated with poor prognosis in lung adenocarcinoma patients, and its overexpression can also be used to identify high-risk groups. In conclusion, cell division cycle 20 might serve as a potential biomarker for lung adenocarcinoma patients.

  13. Usage Analysis of Web 2.0 and Library 2.0 Tools by Librarians in Kwara State Academic Libraries

    Science.gov (United States)

    Tella, Adeyinka; Soluoku, Taofeeqat

    2016-01-01

    This study analysed the usage of Web 2.0 and Library 2.0 tools by librarians in Kwara State academic libraries. A sample of 40 librarians from four tertiary education institution libraries in Kwara State, Nigeria, was surveyed using a total enumeration sampling technique. A questionnaire was used for data collection. The collected…

  14. Predicting the multi-domain progression of Parkinson's disease: a Bayesian multivariate generalized linear mixed-effect model.

    Science.gov (United States)

    Wang, Ming; Li, Zheng; Lee, Eun Young; Lewis, Mechelle M; Zhang, Lijun; Sterling, Nicholas W; Wagner, Daymond; Eslinger, Paul; Du, Guangwei; Huang, Xuemei

    2017-09-25

    It is challenging for current statistical models to predict the clinical progression of Parkinson's disease (PD) because of the involvement of multiple domains and longitudinal data. Past univariate longitudinal analyses, and multivariate analyses from cross-sectional trials, have limited power to predict individual outcomes beyond a single moment. A multivariate generalized linear mixed-effect model (GLMM) under the Bayesian framework was proposed to study multi-domain longitudinal outcomes obtained at baseline, 18, and 36 months. The outcomes included motor, non-motor, and postural instability scores from the MDS-UPDRS, and demographic and standardized clinical data were utilized as covariates. Dynamic prediction was performed for both internal and external subjects using samples from the posterior distributions of the parameter estimates and random effects, and predictive accuracy was evaluated based on the root mean square error (RMSE), absolute bias (AB) and the area under the receiver operating characteristic (ROC) curve. First, our prediction model identified clinical data that were differentially associated with motor, non-motor, and postural stability scores. Second, the predictive accuracy of our model for the training data was assessed, and improved prediction was gained, in particular for non-motor scores (RMSE and AB: 2.89 and 2.20), compared to univariate analysis (RMSE and AB: 3.04 and 2.35). Third, individual-level predictions of longitudinal trajectories for the testing data were performed, with ~80% of observed values falling within the 95% credible intervals. Multivariate generalized linear mixed models hold promise for predicting the clinical progression of individual outcomes in PD. The data were obtained from Dr. Xuemei Huang's NIH grant R01 NS060722 , part of the NINDS PD Biomarker Program (PDBP). All data were entered within 24 h of collection into the Data Management Repository (DMR), which is publicly available ( https://pdbp.ninds.nih.gov/data-management ).
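    The accuracy metrics quoted above are straightforward to reproduce from held-out predictions. A small sketch, assuming "absolute bias" (AB) denotes the mean absolute difference between observed and predicted values, as the paired RMSE/AB figures suggest:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between observed and predicted values."""
    d = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

def absolute_bias(y_true, y_pred):
    """Mean absolute difference (interpreted here as the abstract's 'AB')."""
    d = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(d)))

# Hypothetical observed vs. predicted scores for four visits
y_true = [10, 12, 15, 11]
y_pred = [11, 11, 14, 13]
r = rmse(y_true, y_pred)           # sqrt(7/4)
b = absolute_bias(y_true, y_pred)  # 1.25
```

RMSE penalises large individual errors more heavily than AB, which is why the two numbers are reported together.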

  15. Practices for predicting and preventing preterm birth in Ireland: a national survey.

    LENUS (Irish Health Repository)

    Smith, V

    2011-03-01

    Preterm birth can result in adverse outcomes for the neonate and/or his/her family. The accurate prediction and prevention of preterm birth is paramount. This study describes and critically analyses practices for predicting and preventing preterm birth in Ireland.

  16. Older but not wiser—Predicting a partner's preferences gets worse with age

    OpenAIRE

    Scheibehenne, Benjamin; Todd, Peter M.; Mata, Jutta

    2011-01-01

    To test the influence of relationship length on ability to predict a partner's preferences, 58 younger (M = 24.1 years) and 20 older (M = 68.7 years) couples made predictions in three domains that varied in daily importance. While prediction accuracy was generally better than chance, longer relationship length correlated with lower prediction accuracy and greater overconfidence. The difference in accuracy between older and younger couples increased for strong preferences and when controlling ...

  17. A novel health indicator for on-line lithium-ion batteries remaining useful life prediction

    Science.gov (United States)

    Zhou, Yapeng; Huang, Miaohua; Chen, Yupu; Tao, Ye

    2016-07-01

    Prediction of lithium-ion battery remaining useful life (RUL) plays an important role in an intelligent battery management system. Capacity and internal resistance are often used as the battery health indicator (HI) for quantifying degradation and predicting RUL. However, neither is easily measured on-line: batteries are rarely fully charged and discharged in service, so capacity cannot be tracked directly, and internal resistance measurement is extremely expensive. There is therefore a great need for an alternative. In this work, a novel HI is extracted from the operating parameters of lithium-ion batteries for degradation modeling and RUL prediction. Moreover, a Box-Cox transformation is employed to improve HI performance. Pearson and Spearman correlation analyses are then utilized to evaluate the similarity between the real capacity and the estimated capacity derived from the HI. Next, both a simple statistical regression technique and an optimized relevance vector machine are employed to predict the RUL based on the presented HI. The correlation analyses and prediction results show the efficiency and effectiveness of the proposed HI for battery degradation modeling and RUL prediction.
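    The Box-Cox step and the two correlation checks can be sketched with scipy on a synthetic degradation history; the series below is an illustrative stand-in for a real extracted HI and measured capacity, not the paper's data:

```python
import numpy as np
from scipy import stats

# Synthetic degradation history over 200 cycles (hypothetical stand-ins)
rng = np.random.default_rng(1)
cycles = np.arange(1, 201)
capacity = 2.0 - 0.004 * cycles + rng.normal(0.0, 0.01, cycles.size)   # measured capacity (Ah)
hi_raw = np.exp(-0.002 * cycles) + rng.normal(0.0, 0.005, cycles.size)  # extracted HI (positive)

# Box-Cox requires strictly positive data; lambda is chosen by maximum likelihood.
hi_bc, lam = stats.boxcox(hi_raw)

# Similarity between the (transformed) HI and the real capacity fade
pearson_r, _ = stats.pearsonr(hi_bc, capacity)
spearman_rho, _ = stats.spearmanr(hi_bc, capacity)
```

A high Pearson coefficient indicates a linear relationship while a high Spearman coefficient indicates a monotonic one; an HI scoring well on both is a usable proxy for capacity.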

  18. Analysing the length of care episode after hip fracture: a nonparametric and a parametric Bayesian approach.

    Science.gov (United States)

    Riihimäki, Jaakko; Sund, Reijo; Vehtari, Aki

    2010-06-01

    Effective utilisation of limited resources is a challenge for health care providers. Accurate and relevant information extracted from length-of-stay distributions is useful for management purposes. Patient care episodes can be reconstructed from comprehensive health registers, and in this paper we develop a Bayesian approach to analyse the length of the care episode after a fractured hip. We model the large-scale data with a flexible nonparametric multilayer perceptron network and with a parametric Weibull mixture model. To assess the performance of the models, we estimate expected utilities using predictive density as a utility measure. Since the model parameters cannot be directly compared, we focus on observables, and estimate the relevance of patient explanatory variables in predicting the length of stay. To demonstrate how the use of the flexible nonparametric model is advantageous for this complex health care data, we also study joint effects of variables in predictions, and visualise nonlinearities and interactions found in the data.
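    The parametric side of such an analysis, a Weibull mixture for length of stay, can be sketched directly with scipy; the component weights, shapes and scales below are hypothetical, not estimates from the register data:

```python
import numpy as np
from scipy import stats

def weibull_mixture_pdf(x, weights, shapes, scales):
    """Density of a K-component Weibull mixture for length of stay x (days)."""
    x = np.asarray(x, dtype=float)
    dens = np.zeros_like(x)
    for w, k, lam in zip(weights, shapes, scales):
        dens += w * stats.weibull_min.pdf(x, k, scale=lam)
    return dens

# Hypothetical two-component mixture: shorter rehabilitation stays vs. long-term care
weights, shapes, scales = [0.7, 0.3], [1.5, 2.0], [20.0, 90.0]
pdf_vals = weibull_mixture_pdf([5.0, 30.0, 120.0], weights, shapes, scales)
```

Mixtures of this kind capture the multi-modal stay distributions that a single Weibull cannot, which is why the paper compares the mixture against a flexible nonparametric network.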

  19. Predicting behavior change from persuasive messages using neural representational similarity and social network analyses.

    Science.gov (United States)

    Pegors, Teresa K; Tompson, Steven; O'Donnell, Matthew Brook; Falk, Emily B

    2017-08-15

    Neural activity in medial prefrontal cortex (MPFC), identified as engaging in self-related processing, predicts later health behavior change. However, it is unknown to what extent individual differences in neural representation of content and lived experience influence this brain-behavior relationship. We examined whether the strength of content-specific representations during persuasive messaging relates to later behavior change, and whether these relationships change as a function of individuals' social network composition. In our study, smokers viewed anti-smoking messages while undergoing fMRI and we measured changes in their smoking behavior one month later. Using representational similarity analyses, we found that the degree to which message content (i.e. health, social, or valence information) was represented in a self-related processing MPFC region was associated with later smoking behavior, with increased representations of negatively valenced (risk) information corresponding to greater message-consistent behavior change. Furthermore, the relationship between representations and behavior change depended on social network composition: smokers who had proportionally fewer smokers in their network showed increases in smoking behavior when social or health content was strongly represented in MPFC, whereas message-consistent behavior (i.e., less smoking) was more likely for those with proportionally more smokers in their social network who represented social or health consequences more strongly. These results highlight the dynamic relationship between representations in MPFC and key outcomes such as health behavior change; a complete understanding of the role of MPFC in motivation and action should take into account individual differences in neural representation of stimulus attributes and social context variables such as social network composition. Copyright © 2017 Elsevier Inc. All rights reserved.
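    A representational similarity analysis of the kind described compares a model dissimilarity matrix (built from stimulus labels) against a neural one (built from activity patterns). A minimal sketch, with a synthetic pattern matrix and content labels standing in for the MPFC data:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical data: 8 messages x 50 voxels of activity patterns
rng = np.random.default_rng(2)
patterns = rng.normal(size=(8, 50))

# Model RDM: messages 0-3 carry one content type, 4-7 another
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
model_rdm = (labels[:, None] != labels[None, :]).astype(float)

# Neural RDM: 1 - Pearson correlation between message patterns
neural_rdm = 1.0 - np.corrcoef(patterns)

# Compare the off-diagonal entries with a rank correlation (standard RSA practice)
iu = np.triu_indices(8, k=1)
rho, p = spearmanr(model_rdm[iu], neural_rdm[iu])
```

A positive rho means patterns for same-content messages are more alike than patterns for different-content messages, i.e. that content type is represented in the region.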

  20. DMINDA: an integrated web server for DNA motif identification and analyses.

    Science.gov (United States)

    Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying

    2014-07-01

    DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important to elucidation of the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences along with statistical scores for the predicted motifs derived based on information extracted from a control set, (ii) scanning motif instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
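    Function (ii), scanning motif instances of a query motif in provided sequences, typically scores each sequence window against a position weight matrix. A minimal log-odds scan with an illustrative matrix (not DMINDA's actual model or scoring scheme):

```python
# Toy position weight matrix (log-odds vs. a uniform background) for a
# 4-base motif. The values are illustrative, not derived from DMINDA.
PWM = {
    "A": [1.2, -0.5, -1.0, 0.8],
    "C": [-0.7, 1.0, -0.5, -0.9],
    "G": [-0.9, -0.4, 1.3, -0.6],
    "T": [-0.5, -0.8, -0.7, 0.9],
}
WIDTH = 4

def score(window):
    """Sum of per-position log-odds scores for one window."""
    return sum(PWM[base][i] for i, base in enumerate(window))

def scan(seq, threshold=2.0):
    """Return (offset, window, score) for every window scoring >= threshold."""
    hits = []
    for i in range(len(seq) - WIDTH + 1):
        w = seq[i:i + WIDTH]
        s = score(w)
        if s >= threshold:
            hits.append((i, w, s))
    return hits

hits = scan("TTACGATACGAT")  # finds the ACGA occurrences
```

Real servers additionally assign statistical significance to each hit against a background model; the threshold here is a crude stand-in for that step.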

  1. Microstructure Evolution and Flow Stress Model of a 20Mn5 Hollow Steel Ingot during Hot Compression.

    Science.gov (United States)

    Liu, Min; Ma, Qing-Xian; Luo, Jian-Bin

    2018-03-21

    20Mn5 steel is widely used in the manufacture of heavy hydro-generator shafts due to its good strength, toughness and wear resistance. However, the hot deformation and recrystallization behaviors of 20Mn5 steel compressed at high temperature had not been studied. In this study, hot compression experiments at temperatures of 850-1200 °C and strain rates of 0.01/s-1/s were conducted using a Gleeble thermal-mechanical simulation machine, and the flow stress curves and post-compression microstructures were obtained. Effects of temperature and strain rate on the microstructure are analyzed. Based on the classical stress-dislocation relation and the kinetics of dynamic recrystallization, a two-stage constitutive model is developed to predict the flow stress of 20Mn5 steel. Comparisons between experimental and predicted flow stress show that the predicted values are in good agreement with the experimental values, which indicates that the proposed constitutive model is reliable and can be used for numerical simulation of hot forging of 20Mn5 hollow steel ingots.
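    The two-stage structure described (a stress-dislocation work-hardening/dynamic-recovery law up to a critical strain, then Avrami-type dynamic recrystallization softening toward a steady state) can be sketched as below. All material constants are illustrative placeholders, not fitted 20Mn5 values:

```python
import math

def flow_stress(strain, sigma0=60.0, sigma_sat=160.0, sigma_ss=120.0,
                omega=8.0, eps_c=0.15, eps_p=0.4, k=0.8, n=1.5):
    """Two-stage flow stress (MPa): work hardening / dynamic recovery up to
    the critical strain eps_c, then dynamic-recrystallization softening.
    All constants are hypothetical, not fitted to 20Mn5 data."""
    # Stage 1: hardening/recovery from the stress-dislocation relation
    swh = math.sqrt(sigma_sat ** 2
                    + (sigma0 ** 2 - sigma_sat ** 2) * math.exp(-omega * strain))
    if strain <= eps_c:
        return swh
    # Stage 2: Avrami-type DRX fraction drives softening to the steady state
    x_drx = 1.0 - math.exp(-k * ((strain - eps_c) / eps_p) ** n)
    return swh - (sigma_sat - sigma_ss) * x_drx

# A flow curve at one (temperature, strain rate) condition
curve = [flow_stress(e / 100.0) for e in range(0, 201, 5)]
```

The curve rises from the yield stress, peaks, then relaxes toward the steady-state stress, which is the characteristic single-peak shape of DRX-controlled hot deformation. Temperature and strain-rate dependence would enter through a Zener-Hollomon parameter in the constants.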

  2. Mutations in SLC20A2 are a major cause of familial idiopathic basal ganglia calcification

    Science.gov (United States)

    Hsu, Sandy Chan; Sears, Renee L.; Lemos, Roberta R.; Quintáns, Beatriz; Huang, Alden; Spiteri, Elizabeth; Nevarez, Lisette; Mamah, Catherine; Zatz, Mayana; Pierce, Kerrie D.; Fullerton, Janice M.; Adair, John C.; Berner, Jon E.; Bower, Matthew; Brodaty, Henry; Carmona, Olga; Dobricić, Valerija; Fogel, Brent L.; García-Estevez, Daniel; Goldman, Jill; Goudreau, John L.; Hopfer, Suellen; Janković, Milena; Jaumà, Serge; Jen, Joanna C.; Kirdlarp, Suppachok; Klepper, Joerg; Kostić, Vladimir; Lang, Anthony E.; Linglart, Agnès; Maisenbacher, Melissa K.; Manyam, Bala V.; Mazzoni, Pietro; Miedzybrodzka, Zofia; Mitarnun, Witoon; Mitchell, Philip B.; Mueller, Jennifer; Novaković, Ivana; Paucar, Martin; Paulson, Henry; Simpson, Sheila A.; Svenningsson, Per; Tuite, Paul; Vitek, Jerrold; Wetchaphanphesat, Suppachok; Williams, Charles; Yang, Michele; Schofield, Peter R.; de Oliveira, João R. M.; Sobrido, María-Jesús

    2014-01-01

    Familial idiopathic basal ganglia calcification (IBGC) or Fahr’s disease is a rare neurodegenerative disorder characterized by calcium deposits in the basal ganglia and other brain regions, which is associated with neuropsychiatric and motor symptoms. Familial IBGC is genetically heterogeneous and typically transmitted in an autosomal dominant fashion. We performed a mutational analysis of SLC20A2, the first gene found to cause IBGC, to assess its genetic contribution to familial IBGC. We recruited 218 subjects from 29 IBGC-affected families of varied ancestry and collected medical history, neurological exam, and head CT scans to characterize each patient’s disease status. We screened our patient cohort for mutations in SLC20A2. Twelve novel (nonsense, deletions, missense, and splice site) potentially pathogenic variants, one synonymous variant, and one previously reported mutation were identified in 13 families. Variants predicted to be deleterious cosegregated with disease in five families. Three families showed nonsegregation with clinical disease of such variants, but retrospective review of clinical and neuroimaging data strongly suggested previous misclassification. Overall, mutations in SLC20A2 account for as many as 41 % of our familial IBGC cases. Our screen in a large series expands the catalog of SLC20A2 mutations identified to date and demonstrates that mutations in SLC20A2 are a major cause of familial IBGC. Non-perfect segregation patterns of predicted deleterious variants highlight the challenges of phenotypic assessment in this condition with highly variable clinical presentation. PMID:23334463

  3. Availability simulation software adaptation to the IFMIF accelerator facility RAMI analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bargalló, Enric, E-mail: enric.bargallo-font@upc.edu [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Sureda, Pere Joan [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Arroyo, Jose Manuel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain); Abal, Javier; De Blas, Alfredo; Dies, Javier; Tapia, Carlos [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Mollá, Joaquín; Ibarra, Ángel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain)

    2014-10-15

    Highlights: • The reason why IFMIF RAMI analyses need a simulation is explained. • Changes, modifications and software validations done to AvailSim are described. • First IFMIF RAMI results obtained with AvailSim 2.0 are shown. • Implications of AvailSim 2.0 for IFMIF RAMI analyses are evaluated. - Abstract: Several problems were found when using generic reliability tools to perform RAMI (Reliability, Availability, Maintainability, Inspectability) studies for the IFMIF (International Fusion Materials Irradiation Facility) accelerator. A dedicated simulation tool was necessary to model the complexity of the accelerator facility properly. AvailSim, the availability simulation software used for the International Linear Collider (ILC), became an excellent option to fulfill the RAMI analysis needs. Nevertheless, this software needed to be adapted and modified to simulate the IFMIF accelerator facility in a way useful for the RAMI analyses in the current design phase. Furthermore, some improvements and new features have been added to the software. This software has become a great tool for simulating the peculiarities of the IFMIF accelerator facility, allowing a realistic availability simulation to be obtained. Degraded operation simulation and maintenance strategies are the most relevant features. In this paper, the necessity of this software, the main modifications made to improve it, and its adaptation to IFMIF RAMI analysis are described. Moreover, first results obtained with AvailSim 2.0 and a comparison with previous results are shown.

  4. Availability simulation software adaptation to the IFMIF accelerator facility RAMI analyses

    International Nuclear Information System (INIS)

    Bargalló, Enric; Sureda, Pere Joan; Arroyo, Jose Manuel; Abal, Javier; De Blas, Alfredo; Dies, Javier; Tapia, Carlos; Mollá, Joaquín; Ibarra, Ángel

    2014-01-01

    Highlights: • The reason why IFMIF RAMI analyses need a simulation is explained. • Changes, modifications and software validations done to AvailSim are described. • First IFMIF RAMI results obtained with AvailSim 2.0 are shown. • Implications of AvailSim 2.0 for IFMIF RAMI analyses are evaluated. - Abstract: Several problems were found when using generic reliability tools to perform RAMI (Reliability, Availability, Maintainability, Inspectability) studies for the IFMIF (International Fusion Materials Irradiation Facility) accelerator. A dedicated simulation tool was necessary to model the complexity of the accelerator facility properly. AvailSim, the availability simulation software used for the International Linear Collider (ILC), became an excellent option to fulfill the RAMI analysis needs. Nevertheless, this software needed to be adapted and modified to simulate the IFMIF accelerator facility in a way useful for the RAMI analyses in the current design phase. Furthermore, some improvements and new features have been added to the software. This software has become a great tool for simulating the peculiarities of the IFMIF accelerator facility, allowing a realistic availability simulation to be obtained. Degraded operation simulation and maintenance strategies are the most relevant features. In this paper, the necessity of this software, the main modifications made to improve it, and its adaptation to IFMIF RAMI analysis are described. Moreover, first results obtained with AvailSim 2.0 and a comparison with previous results are shown.

  5. Effect of bubble interface parameters on predicted of bubble departure diameter in a narrow channel

    International Nuclear Information System (INIS)

    Xu Jianjun; Xie Tianzhou; Zhou Wenbin; Chen Bingde; Huang Yanping

    2014-01-01

    A model for predicting the bubble departure diameter in a narrow channel is built by analysing the forces acting on the bubble, and the effects of bubble interface parameters, such as the bubble inclination angle, upstream contact angle, downstream contact angle and bubble contact diameter, on the predicted bubble departure diameter are analysed by comparison with visual experimental data. Based on these results, the bubble interface parameters used as input to obtain the bubble departure diameter in a narrow channel are determined, and the bubble departure diameter is predicted by solving the force balance equation. The predictions are verified against 58 bubble departure diameters obtained from vertical and inclined visual experiments, and the predicted results agree with the experimental results. The different forces acting on the bubble are obtained and the effect of the thermal parameters in this experiment on bubble departure diameter is analysed. (authors)
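    The force-balance solution amounts to root-finding: the departure diameter is the bubble size at which the detaching forces first overcome the retaining ones. The two-force model and all constants below are a hypothetical simplification (a real narrow-channel model also includes drag, shear lift and the inclination-angle terms analysed in the paper):

```python
import math
from scipy.optimize import brentq

# Hypothetical water/steam-like properties near atmospheric pressure
RHO_L, RHO_V = 958.0, 0.6        # liquid / vapour density, kg/m^3
SIGMA = 0.059                     # surface tension, N/m
G = 9.81                          # gravitational acceleration, m/s^2
THETA_U = math.radians(40.0)      # upstream contact angle
THETA_D = math.radians(70.0)      # downstream contact angle
D_W_RATIO = 0.2                   # contact diameter as a fraction of bubble diameter

def net_force(d):
    """Detachment-direction force on a bubble of diameter d (positive = departs).
    Simplified balance: buoyancy vs. surface-tension retention from
    contact-angle asymmetry."""
    buoyancy = math.pi / 6.0 * d ** 3 * (RHO_L - RHO_V) * G
    d_w = D_W_RATIO * d
    retention = math.pi * d_w * SIGMA * (math.cos(THETA_U) - math.cos(THETA_D))
    return buoyancy - retention

# Departure diameter: the root of the force balance, bracketed between 1 um and 5 cm
d_dep = brentq(net_force, 1e-6, 5e-2)
```

Because buoyancy grows with d^3 while surface-tension retention grows only linearly, a unique crossover diameter exists; with these placeholder values it falls in the millimetre range.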

  6. A mixed-method research to investigate the adoption of mobile devices and Web2.0 technologies among medical students and educators.

    Science.gov (United States)

    Fan, Si; Radford, Jan; Fabian, Debbie

    2016-04-19

    The past decade has witnessed the increasing adoption of Web 2.0 technologies in medical education. Recently, the notion of digital habitats, Web 2.0 supported learning environments, has also come onto the scene. While there has been initial research on the use of digital habitats for educational purposes, very limited research has examined the adoption of digital habitats by medical students and educators on mobile devices. This paper reports the Stage 1 findings of a two-stage study. The whole study aimed to develop and implement a personal digital habitat, namely digiMe, for medical students and educators at an Australian university. The first stage, however, examined the types of Web 2.0 tools and mobile devices being used by potential digiMe users, and the reasons for their adoption. In this first stage of research, data were collected through a questionnaire and semi-structured interviews. Questionnaire data collected from 104 participants were analysed using the Predictive Analytics SoftWare (PASW). Frequencies, median and mean values were pursued. Kruskal-Wallis tests were then performed to examine variations between the views of different participant groups. Notes from the 6 interviews, together with responses to the open-ended section of the questionnaire, were analysed using the constructivist grounded theory approach to generate key themes relevant to the adoption of Web 2.0 tools and mobile devices. The findings reflected the wide use of mobile devices, including both smartphones and computing tablets, by medical students and educators for learning, teaching and professional development purposes. Among the 22 types of Web 2.0 tools investigated, fewer than half were frequently used by the participants, which reflects the mismatch between users' desires and their actual practice. Age and occupation appeared to be the influential factors in their adoption. Easy access to information and improved communication were the main purposes. This

  7. Titanium K-Shell X-Ray Production from High Velocity Wire Arrays Implosions on the 20-MA Z Accelerator

    International Nuclear Information System (INIS)

    Apruzese, J.P.; Beg, F.N.; Clark, R.C.; Coverdale, C.A.; Davis, J.; Deeney, C.; Douglas, M.R.; Nash, T.J.; Ruiz-Comacho, J.; Spielman, R.B.; Struve, K.W.; Thornhill, J.W.; Whitney, K.G.

    1999-01-01

    The advent of the 20-MA Z accelerator [R.B. Spielman, C. Deeney, G.A. Chandler, et al., Phys. Plasmas 5, 2105, (1997)] has enabled implosions of large diameter, high-wire-number arrays of titanium to begin testing Z-pinch K-shell scaling theories. The 2-cm long titanium arrays, which were mounted at a 40-mm diameter, produced between 75±15 and 125±20 kJ of K-shell x-rays. Mass scans indicate that, as predicted, the higher velocity implosions in the series produced higher x-ray yields. Spectroscopic analyses indicate that these high velocity implosions achieved peak electron temperatures from 2.7±0.1 to 3.2±0.2 keV and obtained a K-shell emission mass participation of up to 12%

  8. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated, partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario, in order to assess the impact of recriticality on reactor safety, including accident management strategies… The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality, both super-prompt power bursts and quasi steady-state power…, which results in large energy deposition in the fuel during the power burst in some accident scenarios. The highest value, 418 cal g(-1), was obtained with SIMULATE-3K for an Oskarshamn 3 case with a reflooding rate of 2000 kg s(-1). In most cases, however, the predicted energy deposition was smaller, below…

  9. Intrinsic disorder in Viral Proteins Genome-Linked: experimental and predictive analyses

    Directory of Open Access Journals (Sweden)

    Van Dorsselaer Alain

    2009-02-01

    Background: VPgs are viral proteins linked to the 5' end of some viral genomes. Interactions between several VPgs and eukaryotic translation initiation factors eIF4Es are critical for plant infection. However, VPgs are not restricted to phytoviruses, being also involved in genome replication and protein translation of several animal viruses. To date, structural data are still limited to small picornaviral VPgs. Recently, three phytoviral VPgs were shown to be natively unfolded proteins. Results: In this paper, we report the bacterial expression, purification and biochemical characterization of two phytoviral VPgs, namely the VPgs of Rice yellow mottle virus (RYMV, genus Sobemovirus) and Lettuce mosaic virus (LMV, genus Potyvirus). Using far-UV circular dichroism and size exclusion chromatography, we show that RYMV and LMV VPgs are predominantly or partly unstructured in solution, respectively. Using several disorder predictors, we show that both proteins are predicted to possess disordered regions. We next extend these results to 14 VPgs representative of the viral diversity. Disordered regions were predicted in all VPg sequences, whatever the genus and the family. Conclusion: Based on these results, we propose that intrinsic disorder is a common feature of VPgs. The functional role of intrinsic disorder is discussed in light of the biological roles of VPgs.

  10. ConKit: a python interface to contact predictions.

    Science.gov (United States)

    Simkovic, Felix; Thomas, Jens M H; Rigden, Daniel J

    2017-07-15

    Recent advances in protein residue contact prediction algorithms have led to the emergence of many new methods and a variety of file formats. We present ConKit, an open source, modular and extensible Python interface which allows facile conversion between formats and provides an interface to analyses of sequence alignments and sets of contact predictions. ConKit is available via the Python Package Index. The documentation can be found at http://www.conkit.org. ConKit is licensed under the BSD 3-Clause license. hlfsimko@liverpool.ac.uk or drigden@liverpool.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
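    The core task ConKit automates, reading a contact list in one format and re-ranking or re-emitting it in another, can be sketched in plain Python. This is an illustrative stand-in, not ConKit's actual API; the five-column record layout (residue i, residue j, distance bounds, probability) follows the CASP RR convention:

```python
def parse_rr(lines):
    """Parse simplified CASP-RR-style contact records: 'i j dmin dmax prob'.
    Non-record lines (headers, remarks) are skipped."""
    contacts = []
    for line in lines:
        parts = line.split()
        if len(parts) == 5 and parts[0].isdigit() and parts[1].isdigit():
            i, j, _, _, prob = parts
            contacts.append((int(i), int(j), float(prob)))
    # rank by descending predicted probability, as downstream analyses expect
    return sorted(contacts, key=lambda c: -c[2])

def to_tsv(contacts):
    """Emit contacts in a simple tab-separated 'i j prob' format."""
    return "\n".join(f"{i}\t{j}\t{p:.2f}" for i, j, p in contacts)
```

Conversion between the many real formats (PconsC, CCMpred, PSICOV output, etc.) is the same pattern with different parsers and writers, which is exactly the plumbing ConKit packages up.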

  11. Microstructure analyses and thermoelectric properties of Ag1−xPb18Sb1+yTe20

    International Nuclear Information System (INIS)

    Perlt, S.; Höche, Th.; Dadda, J.; Müller, E.; Bauer Pereira, P.; Hermann, R.; Sarahan, M.; Pippel, E.; Brydson, R.

    2012-01-01

    This study reports microstructural investigations of long-term annealed Ag1−xPbmSb1+yTe2+m (m = 18, x = y = 0, hereinafter referred to as AgPb18SbTe20) (Lead–Antimony–Silver–Tellurium, LAST-18) as well as of Ag1−xPb18Sb1+yTe20, i.e. Ag-deficient and Sb-excess LAST-18 (x≠0, y≠0), respectively. Two different length scales are explored. The micrometer scale was evaluated by SEM to analyze the volume fraction and the number of secondary phases as well as the impact of processing parameters on the homogeneity of bulk samples. For AgPb18SbTe20, site-specific FIB liftout of TEM lamellae from thermoelectrically characterized samples was accomplished to investigate the structure on the nanometer scale. High-resolution TEM and energy-filtered TEM were performed to reveal the shape and size distribution of nanoprecipitates, respectively. A hypothesis concerning the structure–property relationship is set out within the frame of a gradient annealing experiment. This study is completed by results dealing with inhomogeneities on the micrometer scale of Ag1−xPb18Sb1+yTe20 and its electronic properties. Highlights: SEM and TEM microstructure investigation of long-term annealed AgPb18SbTe20; SEM and thermoelectric studies on Ag1−xPb18Sb1+yTe20; discussion concerning the structure–property relationship in long-term annealed AgPb18SbTe20; correlation between Ag1−xPb18Sb1+yTe20 microscale structure and electronic properties.

  12. Large-scale replication study reveals a limit on probabilistic prediction in language comprehension.

    Science.gov (United States)

    Nieuwland, Mante S; Politzer-Ahles, Stephen; Heyselaar, Evelien; Segaert, Katrien; Darley, Emily; Kazanina, Nina; Von Grebmer Zu Wolfsthurn, Sarah; Bartolozzi, Federica; Kogan, Vita; Ito, Aine; Mézière, Diane; Barr, Dale J; Rousselet, Guillaume A; Ferguson, Heather J; Busch-Moreno, Simon; Fu, Xiao; Tuomainen, Jyrki; Kulakova, Eugenia; Husband, E Matthew; Donaldson, David I; Kohút, Zdenko; Rueschemeyer, Shirley-Ann; Huettig, Falk

    2018-04-03

    Do people routinely pre-activate the meaning and even the phonological form of upcoming words? The most acclaimed evidence for phonological prediction comes from a 2005 Nature Neuroscience publication by DeLong, Urbach and Kutas, who observed a graded modulation of electrical brain potentials (N400) to nouns and preceding articles by the probability that people use a word to continue the sentence fragment ('cloze'). In our direct replication study spanning 9 laboratories (N = 334), pre-registered replication analyses and exploratory Bayes factor analyses successfully replicated the noun results but, crucially, not the article results. Pre-registered single-trial analyses also yielded a statistically significant effect for the nouns but not the articles. Exploratory Bayesian single-trial analyses showed that the article effect may be non-zero but is likely far smaller than originally reported and too small to observe without very large sample sizes. Our results do not support the view that readers routinely pre-activate the phonological form of predictable words. © 2018, Nieuwland et al.

  13. Safety, efficacy, predictability and stability of laser in situ keratomileusis (LASIK) with a 1000-Hz scanning spot excimer laser.

    Science.gov (United States)

    Khoramnia, Ramin; Salgado, Josefina P; Wuellner, Christian; Donitzky, Christof; Lohmann, Chris P; Winkler von Mohrenfels, Christoph

    2012-09-01

    To evaluate the safety, efficacy, predictability and stability of laser in situ keratomileusis (LASIK) with a 1000-Hz scanning spot excimer laser (Concept System 1000; WaveLight GmbH, Erlangen, Germany). LASIK was performed on twenty eyes with myopia or myopic astigmatism (mean spherical equivalent refraction: -3.97±1.72 dioptres (D); mean cylinder: -0.84±0.77 D) using a microkeratome for flap creation and the Concept System 1000 for photoablation. Patients were examined preoperatively as well as 1, 3 and 6 months after the treatment. Manifest sphere and cylinder, uncorrected (UCDVA) and best corrected (BCDVA) distance visual acuity, corneal topography and pachymetry were analysed. We observed no adverse events that might have been associated with the use of a repetition rate of 1000 Hz. All eyes maintained or had improved BCDVA at 6 months after treatment when compared to preoperative values. Six months after LASIK, UCDVA was 20/20 or better in 85% and 20/25 or better in 100% of the eyes. The spherical equivalent refraction was within ±0.50 D in 95% of the eyes at 6 months after surgery. The refraction stayed stable over time (95% of the eyes changed…). LASIK with the prototype 1000-Hz excimer laser was safe, efficient and predictable. The postoperative refraction was stable over time. There were no specific clinical side-effects that might be associated with the use of such a high repetition rate. © 2011 The Authors. Acta Ophthalmologica © 2011 Acta Ophthalmologica Scandinavica Foundation.

  14. IntaRNA 2.0: enhanced and customizable prediction of RNA–RNA interactions

    Science.gov (United States)

    Mann, Martin; Wright, Patrick R.

    2017-01-01

    The IntaRNA algorithm enables fast and accurate prediction of RNA–RNA hybrids by incorporating seed constraints and interaction site accessibility. Here, we introduce IntaRNAv2, which enables enhanced parameterization as well as fully customizable control over the prediction modes and output formats. Based on up-to-date benchmark data, the enhanced predictive quality is shown and further improvements due to more restrictive seed constraints are highlighted. The extended web interface provides visualizations of the new minimal energy profiles for RNA–RNA interactions. These allow a detailed investigation of interaction alternatives and can reveal potential interaction site multiplicity. IntaRNAv2 is freely available (source and binary), and distributed via the conda package manager. Furthermore, it has been included into the Galaxy workflow framework and its already established web interface enables ad hoc usage. PMID:28472523

  15. LoopIng: a template-based tool for predicting the structure of protein loops.

    KAUST Repository

    Messih, Mario Abdel; Lepore, Rosalba; Tramontano, Anna

    2015-01-01

    …and significant enhancements for long loops (11-20 residues). The quality of the predictions is robust to errors that unavoidably affect the stem regions when these are modeled. The method returns a confidence score for the predicted template loops…

  16. Prothrombin time is predictive of low plasma prothrombin concentration and clinical outcome in patients with trauma hemorrhage: analyses of prospective observational cohort studies.

    Science.gov (United States)

    Balendran, Clare A; Lövgren, Ann; Hansson, Kenny M; Nelander, Karin; Olsson, Marita; Johansson, Karin J; Brohi, Karim; Fries, Dietmar; Berggren, Anders

    2017-03-14

    Fibrinogen and prothrombin have been suggested to become rate limiting in trauma associated coagulopathy. Administration of fibrinogen is now recommended, however, the importance of prothrombin to patient outcome is unknown. We have utilized two trauma patient databases (database 1 n = 358 and database 2 n = 331) to investigate the relationship of plasma prothrombin concentration on clinical outcome and coagulation status. Database 1 has been used to assess the relationship of plasma prothrombin to administered packed red blood cells (PRBC), clinical outcome and coagulation biomarkers (Prothrombin Time (PT), ROTEM EXTEM Coagulation Time (CT) and Maximum Clot Firmness (MCF)). ROC analyses have been performed to investigate the ability of admission coagulation biomarkers to predict low prothrombin concentration (database 1), massive transfusion and 24 h mortality (database 1 and 2). The importance of prothrombin was further investigated in vitro by PT and ROTEM assays in the presence of a prothrombin neutralizing monoclonal antibody and following step-wise dilution. Patients who survived the first 24 h had higher admission prothrombin levels compared to those who died (94 vs. 67 IU/dL). Patients with lower transfusion requirements within the first 24 h (≤10 units of PRBCs) also had higher admission prothrombin levels compared to patients with massive transfusion demands (>10 units of PRBCs) (95 vs. 62 IU/dL). Admission PT, in comparison to admission ROTEM EXTEM CT and MCF, was found to be a better predictor of prothrombin concentration <60 IU/dL (AUC 0.94 in database 1), of massive transfusion (AUC 0.92 and 0.81 in database 1 and 2 respectively) and 24 h mortality (AUC 0.90 and 0.78 in database 1 and 2, respectively). In vitro experiments supported a critical role for prothrombin in coagulation and demonstrated that PT and ROTEM EXTEM CT are sensitive methods to measure low prothrombin concentration. Our analyses suggest that prothrombin concentration…
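    The ROC AUCs reported above (e.g. 0.94 for PT predicting prothrombin <60 IU/dL) have a simple rank interpretation: the AUC is the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal sketch of that computation (the scores in the test are hypothetical, not the study's data):

```python
def roc_auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney relation: fraction of (positive, negative)
    pairs in which the positive case scores higher; ties count half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

For the mortality analysis, the positive group would hold admission PT values of non-survivors and the negative group those of survivors; an AUC near 0.5 means no discrimination, near 1.0 near-perfect discrimination.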

  17. Microbase2.0: A Generic Framework for Computationally Intensive Bioinformatics Workflows in the Cloud

    OpenAIRE

    Flanagan Keith; Nakjang Sirintra; Hallinan Jennifer; Harwood Colin; Hirt Robert P.; Pocock Matthew R.; Wipat Anil

    2012-01-01

    As bioinformatics datasets grow ever larger, and analyses become increasingly complex, there is a need for data handling infrastructures to keep pace with developing technology. One solution is to apply Grid and Cloud technologies to address the computational requirements of analysing high throughput datasets. We present an approach for writing new, or wrapping existing, applications, and a reference implementation of a framework, Microbase2.0, for executing those applications using Grid and Cloud…

  18. Persistence of antibodies 20 y after vaccination with a combined hepatitis A and B vaccine

    Science.gov (United States)

    Van Damme, Pierre; Leroux-Roels, Geert; Suryakiran, P.; Folschweiller, Nicolas; Van Der Meeren, Olivier

    2017-01-01

    Vaccination is the most effective and well-tolerated method of conferring long-term protection against hepatitis A and B viruses (HAV; HBV). Long-term studies are required to characterize the duration of protection and need for boosters. Following primary immunization of 150 and 157 healthy adults with 3 doses of combined hepatitis A/hepatitis B vaccine (HAB; Twinrix™, GSK Vaccines, Belgium) at 0-1-6 months in 2 separate studies, we measured vaccine-induced antibody persistence against HAV and HBV annually for 20 y (Study A: NCT01000324; Study B: NCT01037114). Subjects whose circulating anti-HAV or anti-hepatitis B surface antigen antibody levels fell below the protective thresholds were offered an additional hepatitis A and/or B vaccine dose (Havrix™/Engerix™-B, GSK Vaccines, Belgium). Applying the immunogenicity results from these studies, mathematical modeling predicted long-term persistence. After 20 y, 18 and 25 subjects in studies A and B, respectively, comprised the long-term according-to-protocol cohort for immunogenicity; 100% and 96.0% retained anti-HAV antibodies ≥ 15 mIU/mL, respectively; 94.4% and 92.0% had anti-HBs antibodies ≥ 10 mIU/mL, respectively. Between Years 16–20, 4 subjects received a challenge dose of monovalent hepatitis A vaccine (N = 2) or hepatitis B vaccine (N = 2); all mounted a strong anamnestic response suggestive of immune memory despite low antibody levels. Mathematical modeling predicts that 40 y after vaccination ≥ 97% of vaccinees will maintain anti-HAV ≥ 15 mIU/mL and ≥ 50% of vaccinees will retain anti-HBs ≥ 10 mIU/mL. Immunogenicity data confirm that primary immunization with 3 doses of HAB induces persisting anti-HAV and anti-HBs specific antibodies in most adults for up to 20 y; mathematical modeling predicts even longer-term protection. PMID:28281907

  19. Persistence of antibodies 20 y after vaccination with a combined hepatitis A and B vaccine.

    Science.gov (United States)

    Van Damme, Pierre; Leroux-Roels, Geert; Suryakiran, P; Folschweiller, Nicolas; Van Der Meeren, Olivier

    2017-05-04

    Vaccination is the most effective and well-tolerated method of conferring long-term protection against hepatitis A and B viruses (HAV; HBV). Long-term studies are required to characterize the duration of protection and need for boosters. Following primary immunization of 150 and 157 healthy adults with 3 doses of combined hepatitis A/hepatitis B vaccine (HAB; Twinrix™, GSK Vaccines, Belgium) at 0-1-6 months in 2 separate studies, we measured vaccine-induced antibody persistence against HAV and HBV annually for 20 y (Study A: NCT01000324; Study B: NCT01037114). Subjects whose circulating anti-HAV or anti-hepatitis B surface antigen antibody levels fell below the protective thresholds were offered an additional hepatitis A and/or B vaccine dose (Havrix™/Engerix™-B, GSK Vaccines, Belgium). Applying the immunogenicity results from these studies, mathematical modeling predicted long-term persistence. After 20 y, 18 and 25 subjects in studies A and B, respectively, comprised the long-term according-to-protocol cohort for immunogenicity; 100% and 96.0% retained anti-HAV antibodies ≥ 15 mIU/mL, respectively; 94.4% and 92.0% had anti-HBs antibodies ≥ 10 mIU/mL, respectively. Between Years 16-20, 4 subjects received a challenge dose of monovalent hepatitis A vaccine (N = 2) or hepatitis B vaccine (N = 2); all mounted a strong anamnestic response suggestive of immune memory despite low antibody levels. Mathematical modeling predicts that 40 y after vaccination ≥ 97% of vaccinees will maintain anti-HAV ≥ 15 mIU/mL and ≥ 50% of vaccinees will retain anti-HBs ≥ 10 mIU/mL. Immunogenicity data confirm that primary immunization with 3 doses of HAB induces persisting anti-HAV and anti-HBs specific antibodies in most adults for up to 20 y; mathematical modeling predicts even longer-term protection.

  20. Kaolin Quality Prediction from Samples: A Bayesian Network Approach

    International Nuclear Information System (INIS)

    Rivas, T.; Taboada, J.; Ordonez, C.; Matias, J. M.

    2009-01-01

    We describe the results of an expert system applied to the evaluation of samples of kaolin for industrial use in paper or ceramic manufacture. Different machine learning techniques - classification trees, support vector machines and Bayesian networks - were applied with the aim of evaluating and comparing their interpretability and prediction capacities. The predictive capacity of these models for the samples analyzed was highly satisfactory, both for ceramic quality and paper quality. However, Bayesian networks generally proved to be the most useful technique for our study: this approach combines good predictive capacity with excellent interpretability of the kaolin quality structure, as it graphically represents relationships between variables and facilitates what-if analyses.
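    At a single node, the what-if reasoning a Bayesian network supports reduces to Bayes' rule over the quality classes. A toy sketch (the class labels, priors, and the "low_iron" feature are invented for illustration and are not from the study):

```python
def posterior(prior, likelihood, evidence):
    """Bayes' rule over a discrete class variable: P(c | e) ∝ P(c) * P(e | c)."""
    unnorm = {c: prior[c] * likelihood[c].get(evidence, 0.0) for c in prior}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}

# hypothetical numbers: low iron content is more likely in good kaolin
prior = {"good": 0.6, "poor": 0.4}
likelihood = {"good": {"low_iron": 0.7}, "poor": {"low_iron": 0.2}}
post = posterior(prior, likelihood, "low_iron")
```

A full Bayesian network chains many such conditional tables together, which is what makes the graphical what-if queries the abstract praises possible.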

  1. Quality of courses evaluated by 'predictions' rather than opinions: Fewer respondents needed for similar results.

    Science.gov (United States)

    Cohen-Schotanus, Janke; Schönrock-Adema, Johanna; Schmidt, Henk G

    2010-01-01

    A well-known problem with student surveys is their low response rate. Experiences with predicting electoral outcomes, which required much smaller sample sizes, inspired us to adopt a similar approach to course evaluation. We expected that having respondents estimate the average opinions of their peers would require fewer respondents for comparable outcomes than asking for their own opinions. Two course evaluation studies were performed among successive first-year medical students (N = 380 and 450, respectively). Study 1: Half the cohort gave opinions on nine questions, while the other half predicted the average outcomes. A prize was offered for the three best predictions (motivational remedy). Study 2: Half the cohort gave opinions, a quarter made predictions without a prize and a quarter made predictions with the previous year's results as prior knowledge (cognitive remedy). The numbers of respondents required for stable outcomes were determined following an iterative process. Differences between numbers of respondents required and between average scores were analysed with ANOVA. In both studies, the prediction conditions required significantly fewer respondents (p < 0.001) for comparable outcomes. The informed prediction condition required the fewest respondents (N < 20). Problems with response rates can be reduced by asking respondents to predict evaluation outcomes rather than give opinions.
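    The intuition behind the result is that predictions of the group average vary far less between respondents than individual opinions do, so a target precision is reached with fewer respondents. A sketch from the standard error of the mean (the spreads below are hypothetical, not figures from the study):

```python
def respondents_needed(between_respondent_sd, target_se):
    """Smallest n for which the standard error sd / sqrt(n) is at or below target_se."""
    n = 1
    while between_respondent_sd / n ** 0.5 > target_se:
        n += 1
    return n

# opinions scatter widely; predictions of the average cluster tightly (assumed sds)
opinion_n = respondents_needed(1.5, 0.2)
prediction_n = respondents_needed(0.5, 0.2)
```

Informed predictions (Study 2's cognitive remedy) would shrink the between-respondent spread further still, and with it the required n, consistent with the N < 20 result.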

  2. The IntFOLD server: an integrated web resource for protein fold recognition, 3D model quality assessment, intrinsic disorder prediction, domain prediction and ligand binding site prediction.

    Science.gov (United States)

    Roche, Daniel B; Buenavista, Maria T; Tetchner, Stuart J; McGuffin, Liam J

    2011-07-01

    The IntFOLD server is a novel independent server that integrates several cutting edge methods for the prediction of structure and function from sequence. Our guiding principles behind the server development were as follows: (i) to provide a simple unified resource that makes our prediction software accessible to all and (ii) to produce integrated output for predictions that can be easily interpreted. The output for predictions is presented as a simple table that summarizes all results graphically via plots and annotated 3D models. The raw machine readable data files for each set of predictions are also provided for developers, which comply with the Critical Assessment of Methods for Protein Structure Prediction (CASP) data standards. The server comprises an integrated suite of five novel methods: nFOLD4, for tertiary structure prediction; ModFOLD 3.0, for model quality assessment; DISOclust 2.0, for disorder prediction; DomFOLD 2.0 for domain prediction; and FunFOLD 1.0, for ligand binding site prediction. Predictions from the IntFOLD server were found to be competitive in several categories in the recent CASP9 experiment. The IntFOLD server is available at the following web site: http://www.reading.ac.uk/bioinf/IntFOLD/.

  3. Surface Temperature Prediction of a Bridge for Tactical Decision Aide Modelling

    Science.gov (United States)

    1988-01-01

    Roadway and piling surface temperature predictions (no radiosity incident on lower surface) compared to temperature estimates… Heat gained from water = heat lost by long-wave radiosity radiation, algebraically, with the conduction term expressed in the same manner as for… [Figure 8: Effect of No Radiosity Incident on Lower Surface, surface temperature plotted against local time (5-20 h).]

  4. IntaRNA 2.0: enhanced and customizable prediction of RNA-RNA interactions.

    Science.gov (United States)

    Mann, Martin; Wright, Patrick R; Backofen, Rolf

    2017-07-03

    The IntaRNA algorithm enables fast and accurate prediction of RNA-RNA hybrids by incorporating seed constraints and interaction site accessibility. Here, we introduce IntaRNAv2, which enables enhanced parameterization as well as fully customizable control over the prediction modes and output formats. Based on up-to-date benchmark data, the enhanced predictive quality is shown and further improvements due to more restrictive seed constraints are highlighted. The extended web interface provides visualizations of the new minimal energy profiles for RNA-RNA interactions. These allow a detailed investigation of interaction alternatives and can reveal potential interaction site multiplicity. IntaRNAv2 is freely available (source and binary), and distributed via the conda package manager. Furthermore, it has been included into the Galaxy workflow framework and its already established web interface enables ad hoc usage. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles, a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack, and tests for a thin ductile metal layer bonding two ceramic blocks have also indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses…, while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal.

  6. Hazard Forecasting by MRI: A Prediction Algorithm of the First Kind

    Science.gov (United States)

    Lomnitz, C.

    2003-12-01

    Seismic gaps do not tell us when and where the next earthquake is due. We present new results on limited earthquake hazard prediction at plate boundaries. Our algorithm quantifies earthquake hazard in seismic gaps. The prediction window found for M7 is on the order of 50 km by 20 years (Lomnitz, 1996a). The earth is unstable with respect to small perturbations of the initial conditions. A prediction of the first kind is an estimate of the time evolution of a complex system with fixed boundary conditions in response to changes in the initial state, for example, weather prediction (Edward Lorenz, 1975; Hasselmann, 2002). We use the catalog of large world earthquakes as a proxy for the initial conditions. The MRI algorithm simulates the response of the system to updating the catalog. After a local stress transient dP the entropy decays as (grad dP)^2 due to transient flows directed toward the epicenter. Healing is the thermodynamic process which resets the state of stress. It proceeds as a power law from the rupture boundary inwards, as in a wound. The half-life of a rupture is defined as the healing time which shrinks the size of a scar by half. Healed segments of plate boundary can rupture again. From observations in Chile, Mexico and Japan we find that the half-life of a seismic rupture is about 20 years, in agreement with seismic gap observations. The moment ratio MR is defined as the contrast between the cumulative regional moment release and the local moment deficiency at time t along the plate boundary. The procedure is called MRI. The findings: (1) MRI works; (2) major earthquakes match prominent peaks in the MRI graph; (3) important events (Central Chile 1985; Mexico 1985; Kobe 1995) match MRI peaks which began to emerge 10 to 20 years before the earthquake; (4) the emergence of peaks in MRI depends on earlier ruptures that occurred, not adjacent to but at 10 to 20 fault lengths from the epicentral region, in agreement with triggering effects. The hazard…

  7. Identifying null meta-analyses that are ripe for updating

    Directory of Open Access Journals (Sweden)

    Fang Manchun

    2003-07-01

    Background: As an increasingly large number of meta-analyses are published, quantitative methods are needed to help clinicians and systematic review teams determine when meta-analyses are not up to date. Methods: We propose new methods for determining when non-significant meta-analytic results might be overturned, based on a prediction of the number of participants required in new studies. To guide decision making, we introduce the "new participant ratio", the ratio of the actual number of participants in new studies to the predicted number required to obtain statistical significance. A simulation study was conducted to study the performance of our methods, and a real meta-analysis provides further evidence. Results: In our three simulation configurations, our diagnostic test for determining whether a meta-analysis is out of date had sensitivity of 55%, 62%, and 49%, with corresponding specificity of 85%, 80%, and 90%, respectively. Conclusions: Simulations suggest that our methods are able to detect out-of-date meta-analyses. These quick and approximate methods show promise for use by systematic review teams to help decide whether to commit the considerable resources required to update a meta-analysis. Further investigation and evaluation of the methods is required before they can be recommended for general use.
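    Once the required sample size has been predicted, the new participant ratio is a one-line computation. The sketch below pairs it with a standard two-arm normal-approximation sample-size formula; this is a generic stand-in, not necessarily the authors' exact prediction method:

```python
def predicted_required_n(effect, sd, z_alpha=1.96, z_power=0.84):
    """Approximate total participants for a two-arm comparison to reach
    p < 0.05 with 80% power: per-arm n = 2 * ((z_a + z_b) * sd / effect)^2."""
    per_arm = 2.0 * ((z_alpha + z_power) * sd / effect) ** 2
    return 2.0 * per_arm

def new_participant_ratio(actual_new_participants, predicted_required):
    """Ratio of participants actually accrued in new studies to the predicted
    requirement; values near or above 1 flag a meta-analysis as ripe for updating."""
    return actual_new_participants / predicted_required
```

For a standardized effect of 0.5 this recovers the familiar "about 63 per arm" rule of thumb; a review team would compare accrued new-study participants against that predicted total.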

  8. RSIC after 20 years: a look back and a look ahead

    International Nuclear Information System (INIS)

    Maskewitz, B.F.; Roussin, R.W.; Trubev, D.K.

    1983-01-01

    On the occasion of RSIC's 20th anniversary year, this review includes highlights and lessons learned. In June 1963, the first RSIC Newsletter was published, and information analysis procedures and practices were initiated. Evidence indicates that RSIC served for 20 years as the focal point for the exchange and transfer of radiation transport technology and contributed to the advancement of the state of the art. The original concept is found to be sound: operate an information analysis center by collecting, organizing, evaluating, and analyzing all relevant information and making the information available in a form readily useful to scientists and engineers. Computing technology, a computer-based literature information system, and an advisory service remain important elements of the center. Continuing interaction between the center, developers, and users of information products and services has been a key to RSIC success. A look to the future reflects optimism.

  9. Safety analyses of the nuclear-powered ship Mutsu with RETRAN

    International Nuclear Information System (INIS)

    Naruko, Y.; Ishida, T.; Tanaka, Y.; Futamura, Y.

    1982-01-01

    To provide a quantitative basis for the safety evaluation of the N.S. Mutsu, a number of safety analyses were performed in the course of reexamination. With respect to operational transient analyses, the RETRAN computer code was used to predict plant performance on the basis of postulated transient scenarios. The COBRA-IV computer code was also used to obtain a value of the minimum DNBR for each transient, which is necessary to predict detailed thermal-hydraulic performance in the core region of the reactor. The present paper deals with the following three operational transients, which were calculated as part of the safety analyses: a complete loss of load without reactor scram; an excessive load increase incident, which is followed by a 30 percent stepwise load increase in the steam dump flow; and an accidental depressurization of the primary system, which is followed by a sudden full opening of the pressurizer spray valve. A Mutsu two-loop RETRAN model and simulation results are described. By comparing the results with those of land-based PWRs, the characteristic features of the Mutsu reactor were presented and the safety of the plant under operational transient conditions was confirmed.

  10. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project
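    A sensitivity analysis of the kind the plan recommends can be illustrated, in miniature, by a one-at-a-time sweep: vary each model parameter across its uncertainty range with the others held at nominal values, and rank parameters by the output swing they induce. A toy sketch (the two-parameter model is invented for illustration, not a HEDR code):

```python
def oat_sensitivity(model, ranges):
    """One-at-a-time sensitivity: output swing when each parameter alone
    sweeps its (low, high) range, with the others held at their midpoints."""
    mid = {k: (lo + hi) / 2.0 for k, (lo, hi) in ranges.items()}
    swings = {}
    for k, (lo, hi) in ranges.items():
        outputs = []
        for v in (lo, mid[k], hi):
            params = dict(mid)
            params[k] = v
            outputs.append(model(params))
        swings[k] = max(outputs) - min(outputs)
    return swings
```

Parameters with the largest swings are the ones whose uncertainty most drives dose-prediction uncertainty, and thus the ones worth characterizing first; the hierarchical approach in the plan extends this idea across nested groups of parameters.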

  11. Evaluating the Predictive Value of Growth Prediction Models

    Science.gov (United States)

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  12. SHERPA: A systematic human error reduction and prediction approach

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1986-01-01

    This paper describes a Systematic Human Error Reduction and Prediction Approach (SHERPA) which is intended to provide guidelines for human error reduction and quantification in a wide range of human-machine systems. The approach utilizes as its basis current cognitive models of human performance. The first module in SHERPA performs task and human error analyses, which identify likely error modes, together with guidelines for the reduction of these errors by training, procedures and equipment redesign. The second module uses the SARAH approach to quantify the probability of occurrence of the errors identified earlier, and provides cost-benefit analyses to assist in choosing the appropriate error reduction approaches in the third module

  13. Analyses and characterization of double shell tank

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily to evaluate its suitability for safe processing through the evaporator. Such analyses should provide sufficient information about the waste composition to determine confidently whether constituent concentrations fall within safe operating limits and within the functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  14. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily to evaluate its suitability for safe processing through the evaporator. Such analyses should provide sufficient information about the waste composition to determine confidently whether constituent concentrations fall within safe operating limits and within the functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  15. Collaborative Learning with Web 2.0 Tools: Analysing Malaysian Students' Perceptions and Peer Interaction

    Science.gov (United States)

    Leow, Fui Theng; Neo, Mai

    2015-01-01

    Today, ICT, web resources and multimedia contents have become prevalent in Malaysian university classrooms; hence, the learning approaches need to be redesigned for enabling students to use these technologies in co-constructing new meaning. This study analyses student's perception and their peer interaction in the constructivist-collaborative…

  16. SitesIdentify: a protein functional site prediction tool

    Directory of Open Access Journals (Sweden)

    Doig Andrew J

    2009-11-01

    Full Text Available Abstract Background The rate of protein structures being deposited in the Protein Data Bank surpasses the capacity to experimentally characterise them, and therefore computational methods to analyse these structures have become increasingly important. Identifying the region of the protein most likely to be involved in function is useful in order to gain information about its potential role. There are many available approaches to predict functional sites, but many are not made available via a publicly accessible application. Results Here we present a functional site prediction tool (SitesIdentify), based on combining sequence conservation information with geometry-based cleft identification, that is freely available via a web-server. We have shown that SitesIdentify compares favourably to other functional site prediction tools in a comparison of seven methods on a non-redundant set of 237 enzymes with annotated active sites. Conclusion SitesIdentify is able to produce comparable accuracy in predicting functional sites to its closest available counterpart, but in addition achieves improved accuracy for proteins with few characterised homologues. SitesIdentify is available via a webserver at http://www.manchester.ac.uk/bioinformatics/sitesidentify/

  17. A study on two phase flows of linear compressors for the prediction of refrigerant leakage

    International Nuclear Information System (INIS)

    Hwang, Il Sun; Lee, Young Lim; Oh, Won Sik; Park, Kyeong Bae

    2015-01-01

    Usage of linear compressors is on the rise due to their high efficiency. In this paper, leakage of a linear compressor has been studied through numerical analysis and experiments. First, nitrogen leakage for a stagnant piston with fixed cylinder pressure, as well as for a moving piston with fixed cylinder pressure, was analyzed to verify the validity of the two-phase flow analysis model. Next, refrigerant leakage of a linear compressor in operation was predicted through 3-dimensional unsteady, two-phase flow CFD (computational fluid dynamics). According to the research results, the numerical analyses for the fixed-cylinder-pressure models were in good agreement with the experimental results. The refrigerant leakage of the linear compressor in operation mainly occurred through the oil exit, and the leakage became negligible after about 0.4 s of operation, falling below 2.0×10⁻⁴ kg/s.

  18. The influence of bilirubin, haemolysis and turbidity on 20 analytical tests performed on automatic analysers. Results of an interlaboratory study.

    Science.gov (United States)

    Grafmeyer, D; Bondon, M; Manchon, M; Levillain, P

    1995-01-01

    The director of a laboratory has to be sure to give out reliable results for routine tests on automatic analysers regardless of the clinical context. However, he may find hyperbilirubinaemia in some circumstances, parenteral nutrition causing turbidity in others, and haemolysis occurring if sampling is difficult. For this reason, the Commission for Instrumentation of the Société Française de Biologie Clinique (SFBC) (president Alain Feuillu) decided to look into "visible" interferences--bilirubin, haemolysis and turbidity--and their effect on 20 major tests: 13 substrates/chemistries: albumin, calcium, cholesterol, creatinine, glucose, iron, magnesium, phosphorus, total bilirubin, total proteins, triacylglycerols, uric acid, urea, and 7 enzymatic activities: alkaline phosphatase, alanine aminotransferase, alpha-amylase, aspartate aminotransferase, creatine kinase, gamma-glutamyl transferase and lactate dehydrogenase measured on 15 automatic analysers representative of those found on the French market (Astra 8, AU 510, AU 5010, AU 5000, Chem 1, CX 7, Dax 72, Dimension, Ektachem, Hitachi 717, Hitachi 737, Hitachi 747, Monarch, Open 30, Paramax, Wako 30 R) and to see how much they affect the accuracy of results under routine conditions in the laboratory. The study was carried out following the SFBC protocol for the validation of techniques using spiked plasma pools with bilirubin, ditauro-bilirubin, haemoglobin (from haemolysate) and Intralipid (turbidity). Overall, the following results were obtained: haemolysis affects tests the most often (34.5% of cases); total bilirubin interferes in 21.7% of cases; direct bilirubin and turbidity seem to interfere less at around 17%. The different tests are not affected to the same extent; enzyme activity is hardly affected at all; on the other hand certain major tests are extremely sensitive, increasingly so as we go through the following: creatinine (interference of bilirubin), triacylglycerols (interference of bilirubin and

  19. Reliability and validity of a 20-s alternative to the wingate anaerobic test in team sport male athletes.

    Directory of Open Access Journals (Sweden)

    Ahmed Attia

    Full Text Available The intent of this study was to evaluate the relative and absolute reliability of the 20-s anaerobic test (WAnT20) versus the WAnT30, and to verify how far the various indices of the 30-s Wingate anaerobic test (WAnT30) could be predicted from the WAnT20 data in male athletes. The participants were Exercise Science majors (age: 21.5±1.6 yrs, stature: 1.83±0.08 m, body mass: 81.2±10.9 kg) who participated regularly in team sports. In Phase I, 41 participants performed duplicate WAnT20 and WAnT30 tests to assess reliability. In Phase II, 31 participants performed one trial each of the WAnT20 and WAnT30 to determine the ability of the WAnT20 to predict components of the WAnT30. In Phase III, 31 participants were used to cross-validate the prediction equations developed in Phase II. Respective intra-class correlation coefficients (ICC) for peak power output (PPO; ICC = 0.98 and 0.95) and mean power output (MPO; ICC = 0.98 and 0.90) did not differ significantly between WAnT20 and WAnT30. ICCs for minimal power output (POmin) and fatigue index (FI) were poor for both tests (range 0.53 to 0.76). Standard errors of the means (SEM) for PPO and MPO were less than their smallest worthwhile changes (SWC) in both tests; however, POmin and FI values were "marginal," with SEM values greater than their respective SWCs for both tests. Stepwise regression analysis showed that MPO had the highest coefficient of predictability (R = 0.97), with POmin and FI considerably lower (R = 0.71 and 0.41, respectively). Cross-validation showed insignificant bias, with limits of agreement of 0.99±1.04, 6.5±92.7 W, and 1.6±9.8% between measured and predicted MPO, POmin, and FI, respectively. The WAnT20 offers a reliable and valid test of leg anaerobic power in male athletes and could replace the classic WAnT30.
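
The prediction-and-agreement workflow in this study (fit a prediction equation in one sample, then check bias and limits of agreement) can be sketched as follows. The data here are synthetic, not the study's measurements, and the linear relation between WAnT20 and WAnT30 mean power is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic example: WAnT20 mean power output (W) and the corresponding
# WAnT30 mean power output for 31 hypothetical athletes.
mpo20 = rng.normal(700, 80, size=31)
mpo30 = 0.9 * mpo20 + 20 + rng.normal(0, 15, size=31)

# Fit WAnT30 MPO = a * WAnT20 MPO + b by ordinary least squares.
a, b = np.polyfit(mpo20, mpo30, deg=1)
pred = a * mpo20 + b

# Bland-Altman-style agreement between measured and predicted values.
diff = mpo30 - pred
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)       # 95% limits of agreement
r = np.corrcoef(mpo30, pred)[0, 1]  # coefficient of predictability

print(f"R = {r:.2f}, bias = {bias:.1f} W, LoA = +/-{loa:.1f} W")
```

In the study proper the equation is fitted in Phase II and the bias/limits-of-agreement check is run on the separate Phase III sample; fitting and checking on the same data, as in this sketch, understates the true prediction error.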

  20. Angular analyses of $b \\to s \\mu^+ \\mu^-$ transitions at CMS

    CERN Document Server

    Wang, Dayong

    2018-01-01

    Flavour-changing neutral-current decays are interesting probes in searches for new physics. Angular distributions of $b \\to s \\ell^+ \\ell^-$ transition processes of both $\\mathrm{B}^0 \\to \\mathrm{K}^{*0} \\mu^ +\\mu^-$ and $\\mathrm{B}^{+} \\to \\mathrm{K}^{+} \\mu^+\\mu^-$ are studied using a sample of proton-proton collisions at $\\sqrt{s} = 8~\\mathrm{TeV}$ collected with the CMS detector at the LHC, corresponding to an integrated luminosity of $20.5~\\mathrm{fb}^{-1}$. Angular analyses are performed to determine $P_1$ and $P_5'$ angular parameters for $\\mathrm{B}^0 \\to \\mathrm{K}^{*0} \\mu^ +\\mu^-$ and $A_{FB}$ and $F_{H}$ parameters for $\\mathrm{B}^{+} \\to \\mathrm{K}^{+} \\mu^+\\mu^-$, all as functions of the dimuon invariant mass squared. The $P_5'$ parameter is of particular interest due to recent measurements that indicate a potential discrepancy with the standard model. All the measurements are consistent with the standard model predictions. Efforts with more channels and more coming data will be con...

  1. Predicting future forestland area: a comparison of econometric approaches.

    Science.gov (United States)

    SoEun Ahn; Andrew J. Plantinga; Ralph J. Alig

    2000-01-01

    Predictions of future forestland area are an important component of forest policy analyses. In this article, we test the ability of econometric land use models to accurately forecast forest area. We construct a panel data set for Alabama consisting of county and time-series observations for the period 1964 to 1992. We estimate models using restricted data sets-namely,...

  2. Genetically Predicted Body Mass Index and Breast Cancer Risk: Mendelian Randomization Analyses of Data from 145,000 Women of European Descent.

    Directory of Open Access Journals (Sweden)

    Yan Guo

    2016-08-01

    Full Text Available Observational epidemiological studies have shown that high body mass index (BMI) is associated with a reduced risk of breast cancer in premenopausal women but an increased risk in postmenopausal women. It is unclear whether this association is mediated through shared genetic or environmental factors. We applied Mendelian randomization to evaluate the association between BMI and risk of breast cancer occurrence using data from two large breast cancer consortia. We created a weighted BMI genetic score comprising 84 BMI-associated genetic variants to predict BMI. We evaluated genetically predicted BMI in association with breast cancer risk using individual-level data from the Breast Cancer Association Consortium (BCAC; cases = 46,325, controls = 42,482). We further evaluated the association between genetically predicted BMI and breast cancer risk using summary statistics from 16,003 cases and 41,335 controls from the Discovery, Biology, and Risk of Inherited Variants in Breast Cancer (DRIVE) Project. Because most studies measured BMI after cancer diagnosis, we could not conduct a parallel analysis to adequately evaluate the association of measured BMI with breast cancer risk prospectively. In the BCAC data, genetically predicted BMI was found to be inversely associated with breast cancer risk (odds ratio [OR] = 0.65 per 5 kg/m² increase, 95% confidence interval [CI]: 0.56-0.75, p = 3.32 × 10⁻¹⁰). The associations were similar for both premenopausal (OR = 0.44, 95% CI: 0.31-0.62, p = 9.91 × 10⁻⁸) and postmenopausal breast cancer (OR = 0.57, 95% CI: 0.46-0.71, p = 1.88 × 10⁻⁸). This association was replicated in the data from the DRIVE consortium (OR = 0.72, 95% CI: 0.60-0.84, p = 1.64 × 10⁻⁷). Single marker analyses identified 17 of the 84 BMI-associated single nucleotide polymorphisms (SNPs) in association with breast cancer risk at p < 0.05; for 16 of them, the
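
The weighted genetic score at the heart of this design is a simple weighted sum over risk alleles. The sketch below uses invented genotypes and effect sizes (not BCAC data) purely to show the construction; in the actual analysis the weights are per-allele effects on BMI taken from an external GWAS.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed inputs: genotypes coded 0/1/2 copies of the BMI-raising allele
# for 84 SNPs, and external per-allele effect sizes on BMI.
n_people, n_snps = 5, 84
genotypes = rng.integers(0, 3, size=(n_people, n_snps))
weights = rng.uniform(0.01, 0.10, size=n_snps)  # invented effect sizes

# Weighted score: for each person, sum over SNPs of allele count * weight.
score = genotypes @ weights

# In a Mendelian randomization analysis this score (or the SNPs jointly)
# serves as the instrument for BMI when estimating its effect on risk.
print(score.round(2))
```

Because genotypes are fixed at conception, the score is unconfounded by later environment, which is what licenses the causal reading of the OR estimates in the abstract.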

  3. Change of CD20 Expression in Diffuse Large B-Cell Lymphoma Treated with Rituximab, an Anti-CD20 Monoclonal Antibody: A Study of the Osaka Lymphoma Study Group

    Directory of Open Access Journals (Sweden)

    Naoki Wada

    2009-10-01

    Full Text Available Change of CD20 expression was examined in cases of diffuse large B-cell lymphoma (DLBCL). CD20 expression after treatment with anti-CD20 antibody (rituximab, Rx) for DLBCL was examined by immunohistochemistry (IHC) and flow cytometry (FCM) in 23 cases who underwent serial biopsy. Negativity by IHC and/or FCM was defined as CD20–. Four cases were CD20– at initial biopsy but became CD20+ after chemotherapy with Rx (CH-R) (group A). Recurrent tumors in three group A cases became resistant to CH-R. Initial and recurrent tumors were CD20+ before and after CH-R in 17 cases (group B). Tumors before CH-R were CD20– in two cases (group C); one continued to be CD20– and the other turned CD20+, with survival times after relapse of 8 and 23 months, respectively. Evaluation of CD20 expression with immunohistochemical and flow cytometric methods can be used for the prediction of responsiveness of relapsed DLBCL to CH-R.

  4. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    Science.gov (United States)

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  5. Prediction of infarction development after endovascular stroke therapy with dual-energy computed tomography.

    Science.gov (United States)

    Djurdjevic, Tanja; Rehwald, Rafael; Knoflach, Michael; Matosevic, Benjamin; Kiechl, Stefan; Gizewski, Elke Ruth; Glodny, Bernhard; Grams, Astrid Ellen

    2017-03-01

    After intraarterial recanalisation (IAR), haemorrhage and blood-brain barrier (BBB) disruption can be distinguished using dual-energy computed tomography (DECT). The aim of the present study was to investigate whether future infarction development can be predicted from DECT. DECT scans of 20 patients showing 45 BBB-disrupted areas after IAR were assessed and compared with follow-up examinations. Receiver operating characteristic (ROC) analyses using densities from the iodine map (IM) and virtual non-contrast (VNC) series were performed. Future infarction areas were denser than future non-infarction areas on the IM series (23.44 ± 24.86 vs. 5.77 ± 2.77 HU) and less dense on the VNC series (29.71 ± 3.33 vs. 35.33 ± 3.50 HU). A cut-off of 17.13 HU on the IM series discriminated future infarction areas, and densities on the VNC series allowed prediction of infarction volume. Future infarction development after IAR can be reliably predicted with the IM series. The prediction of haemorrhages and of infarction size is less reliable. • The IM series (DECT) can predict future infarction development after IAR. • Later haemorrhages can be predicted using the IM and the BW series. • The volume of definable hypodense areas in VNC correlates with infarction volume.
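
Selecting a density cut-off from a ROC analysis, as done here for the IM series, typically amounts to maximizing Youden's J = sensitivity + specificity − 1 over candidate thresholds. The sketch below does this on synthetic HU values (the group means and spreads are loosely inspired by the abstract, not the study's data).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic iodine-map densities (HU) for the two outcome groups.
infarct = rng.normal(23.0, 8.0, size=25)     # future infarction areas
no_infarct = rng.normal(6.0, 3.0, size=20)   # future non-infarction areas

values = np.concatenate([infarct, no_infarct])

# Evaluate sensitivity/specificity at every observed value and pick the
# threshold maximizing Youden's J.
best_j, best_cut = -1.0, None
for cut in np.sort(values):
    sens = np.mean(infarct > cut)       # true positives among infarctions
    spec = np.mean(no_infarct <= cut)   # true negatives among the rest
    j = sens + spec - 1
    if j > best_j:
        best_j, best_cut = j, cut

print(f"cut-off ~ {best_cut:.1f} HU, Youden J = {best_j:.2f}")
```

With well-separated groups the optimal cut-off lands between the two means, analogous to the 17.13 HU threshold reported in the abstract.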

  6. Castor-1C spent fuel storage cask decay heat, heat transfer, and shielding analyses

    International Nuclear Information System (INIS)

    Rector, D.R.; McCann, R.A.; Jenquin, U.P.; Heeb, C.M.; Creer, J.M.; Wheeler, C.L.

    1986-12-01

    This report documents the decay heat, heat transfer, and shielding analyses of the Gesellschaft fuer Nuklear Services (GNS) CASTOR-1C cask used in a spent fuel storage demonstration performed at Preussen Elektra's Wurgassen nuclear power plant. The demonstration was performed between March 1982 and January 1984, and resulted in cask and fuel temperature data and cask exterior surface gamma-ray and neutron radiation dose rate measurements. The purpose of the analyses reported here was to evaluate decay heat, heat transfer, and shielding computer codes. The analyses consisted of (1) performing pre-look predictions (predictions performed before the analysts were provided the test data), (2) comparing ORIGEN2 (decay heat), COBRA-SFS and HYDRA (heat transfer), and QAD and DOT (shielding) results to data, and (3) performing post-test analyses if appropriate. Even though two heat transfer codes were used to predict CASTOR-1C cask test data, no attempt was made to compare the two codes. The codes are being evaluated with other test data (single-assembly data and other cask data), and to compare the codes based on one set of data may be premature and lead to erroneous conclusions

  7. Virtual screening of cocrystal formers for CL-20

    Science.gov (United States)

    Zhou, Jun-Hong; Chen, Min-Bo; Chen, Wei-Ming; Shi, Liang-Wei; Zhang, Chao-Yang; Li, Hong-Zhen

    2014-08-01

    According to the structural characteristics of 2,4,6,8,10,12-hexanitrohexaazaisowurtzitane (CL-20) and the kinetic mechanism of cocrystal formation, a method for virtually screening CL-20 cocrystal formers by the criterion of the strongest intermolecular site pairing energy (ISPE) was proposed. In this method, the strongest ISPE is assumed to determine the first step of cocrystal formation. The prediction results of this method for four sets of common drug-molecule cocrystals were compared with those of the total-ISPE method from the reference (Musumeci et al., 2011) and with the experimental results. The method was then applied to virtually screen CL-20 cocrystal formers, and the prediction results were compared with the experimental results.

  8. The Bacillus subtilis Conjugative Plasmid pLS20 Encodes Two Ribbon-Helix-Helix Type Auxiliary Relaxosome Proteins That Are Essential for Conjugation

    Directory of Open Access Journals (Sweden)

    Andrés Miguel-Arribas

    2017-11-01

    Full Text Available Bacterial conjugation is the process by which a conjugative element (CE) is transferred horizontally from a donor to a recipient cell via a connecting pore. One of the first steps in the conjugation process is the formation of a nucleoprotein complex at the origin of transfer (oriT), where one of the components of the nucleoprotein complex, the relaxase, introduces a site- and strand-specific nick to initiate the transfer of a single DNA strand into the recipient cell. In most cases, the nucleoprotein complex involves, besides the relaxase, one or more additional proteins, named auxiliary proteins, which are encoded by the CE and/or the host. The conjugative plasmid pLS20 replicates in the Gram-positive Firmicute bacterium Bacillus subtilis. We have recently identified the relaxase gene and the oriT of pLS20, which are separated by a region of almost 1 kb. Here we show that this region contains two auxiliary genes that we name aux1LS20 and aux2LS20, and which we show are essential for conjugation. Both Aux1LS20 and Aux2LS20 are predicted to contain a Ribbon-Helix-Helix DNA binding motif near their N-terminus. Analyses of the purified proteins show that Aux1LS20 and Aux2LS20 form tetramers and hexamers in solution, respectively, and that they both bind preferentially to oriTLS20, although with different characteristics and specificities. In silico analyses revealed that genes encoding homologs of Aux1LS20 and/or Aux2LS20 are located upstream of almost 400 relaxase genes of the RelLS20 family (MOBL) of relaxases. Thus, Aux1LS20 and Aux2LS20 of pLS20 constitute the founding members of the first two families of auxiliary proteins described for CEs of Gram-positive origin.

  9. The Bacillus subtilis Conjugative Plasmid pLS20 Encodes Two Ribbon-Helix-Helix Type Auxiliary Relaxosome Proteins That Are Essential for Conjugation.

    Science.gov (United States)

    Miguel-Arribas, Andrés; Hao, Jian-An; Luque-Ortega, Juan R; Ramachandran, Gayetri; Val-Calvo, Jorge; Gago-Córdoba, César; González-Álvarez, Daniel; Abia, David; Alfonso, Carlos; Wu, Ling J; Meijer, Wilfried J J

    2017-01-01

    Bacterial conjugation is the process by which a conjugative element (CE) is transferred horizontally from a donor to a recipient cell via a connecting pore. One of the first steps in the conjugation process is the formation of a nucleoprotein complex at the origin of transfer (oriT), where one of the components of the nucleoprotein complex, the relaxase, introduces a site- and strand-specific nick to initiate the transfer of a single DNA strand into the recipient cell. In most cases, the nucleoprotein complex involves, besides the relaxase, one or more additional proteins, named auxiliary proteins, which are encoded by the CE and/or the host. The conjugative plasmid pLS20 replicates in the Gram-positive Firmicute bacterium Bacillus subtilis. We have recently identified the relaxase gene and the oriT of pLS20, which are separated by a region of almost 1 kb. Here we show that this region contains two auxiliary genes that we name aux1LS20 and aux2LS20, and which we show are essential for conjugation. Both Aux1LS20 and Aux2LS20 are predicted to contain a Ribbon-Helix-Helix DNA binding motif near their N-terminus. Analyses of the purified proteins show that Aux1LS20 and Aux2LS20 form tetramers and hexamers in solution, respectively, and that they both bind preferentially to oriTLS20, although with different characteristics and specificities. In silico analyses revealed that genes encoding homologs of Aux1LS20 and/or Aux2LS20 are located upstream of almost 400 relaxase genes of the RelLS20 family (MOBL) of relaxases. Thus, Aux1LS20 and Aux2LS20 of pLS20 constitute the founding members of the first two families of auxiliary proteins described for CEs of Gram-positive origin.

  10. Conceptual design and neutronics analyses of a fusion reactor blanket simulation facility

    International Nuclear Information System (INIS)

    Beller, D.E.

    1986-01-01

    A new conceptual design of a fusion reactor blanket simulation facility was developed. This design follows the principles that have been successfully employed in the Purdue Fast Breeder Blanket Facility (FBBR), because experiments conducted in it have resulted in the discovery of deficiencies in neutronics prediction methods. With this design, discrepancies between calculation and experimental data can be fully attributed to calculation methods because design deficiencies that could affect results are insignificant. Inelastic scattering cross sections are identified as a major source of these discrepancies. The conceptual design of this FBBR analog, the fusion reactor blanket facility (FRBF), is presented. Essential features are a cylindrical geometry and a distributed, cosine-shaped line source of 14-MeV neutrons. This source can be created by sweeping a deuteron beam over an elongated titanium-tritide target. To demonstrate that the design of the FRBF will not contribute significant deviations in experimental results, neutronics analyses were performed: results of comparisons of 2-dimensional to 1-dimensional predictions are reported for two blanket compositions. Expected deviations from 1-D predictions which are due to source anisotropy and blanket asymmetry are minimal. Thus, design of the FRBF allows simple and straightforward interpretation of the experimental results, without a need for coarse 3-D calculations

  11. Analyses of bundle experiment data using MATRA-h

    Energy Technology Data Exchange (ETDEWEB)

    Lim, In Cheol; Chea, Hee Taek [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    When the construction and operation license for HANARO was renewed in 1995, a 25% CHF penalty was imposed, because the validation work related to the CHF design calculation was not sufficient to assure the CHF margin. As part of the work to recover this CHF penalty, MATRA-h was developed by implementing new correlations for heat transfer, CHF prediction, and subcooled void into MATRA-a, which is KAERI's modified version of COBRA-IV-I. Using MATRA-h, subchannel analyses of the bundle experiment data were performed. From the comparison of the code predictions with the experimental results, it was found that the code gives conservative predictions as far as CHF in the bundle geometry is concerned. (author). 12 refs., 25 figs., 16 tabs.

  12. Incidence, Mortality, and Predictive Factors of Hepatocellular Carcinoma in Primary Biliary Cirrhosis

    Directory of Open Access Journals (Sweden)

    Kenichi Hosonuma

    2013-01-01

    Full Text Available Background. The study aims to analyze in detail the incidence and mortality of hepatocellular carcinoma (HCC) in primary biliary cirrhosis (PBC), using the standardized incidence ratio (SIR) and standardized mortality ratio (SMR), because no large case studies have focused on their detailed statistical analysis in Asia. Methods. The study cohorts were consecutively diagnosed at Gunma University and its affiliated hospitals. Age- or sex-specific annual cancer incidence and deaths were obtained from the Japanese Cancer Registry and Death Registry as references for the comparison of the SIR or SMR of HCC. Moreover, univariate and multivariate analyses were performed to clarify predictive factors for the incidence of HCC. Results. Overall, 179 patients were followed up for a median of 97 months. HCC developed in 13 cases. The SIR for HCC was 11.6 (95% confidence interval (CI), 6.2-19.8) and the SMR for HCC was 11.2 (95% CI, 5.4-20.6) in the overall patients. Serum albumin levels were a predictive factor for the incidence of HCC in the overall patients. Conclusions. The incidence and mortality of HCC in PBC patients were significantly higher than those in the Japanese general population. PBC patients with low serum albumin levels are a population at high risk for HCC.
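
An SIR is simply observed cases divided by the cases expected from registry rates, with a Poisson-based confidence interval. The sketch below reproduces the abstract's SIR of 11.6 from its 13 observed cases; the expected count is back-derived for illustration (E = O / SIR), and Byar's approximation is used for the 95% CI.

```python
import math

observed = 13               # HCC cases observed in the PBC cohort
expected = observed / 11.6  # ~1.12 cases expected from registry rates (back-derived)

sir = observed / expected

# Byar's approximation for a 95% confidence interval on O/E.
z = 1.96
lower = observed * (1 - 1/(9*observed) - z/(3*math.sqrt(observed)))**3 / expected
upper = (observed + 1) * (1 - 1/(9*(observed + 1)) + z/(3*math.sqrt(observed + 1)))**3 / expected

print(f"SIR = {sir:.1f} (95% CI {lower:.1f}-{upper:.1f})")
```

With these inputs the approximation gives a CI of roughly 6.2-19.8, matching the interval reported in the abstract; the SMR calculation is identical with deaths in place of incident cases.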

  13. Re-experiencing phenomena following a disaster: The long-term predictive role of intrusion symptoms in the development of post-trauma depression and anxiety.

    Science.gov (United States)

    Lawrence-Wood, Ellie; Van Hooff, Miranda; Baur, Jenelle; McFarlane, Alexander C

    2016-01-15

    Contention in the literature regarding the diagnostic utility of intrusion symptoms highlights that they have high sensitivity but low specificity in predicting PTSD. They are highly prevalent following a range of traumatic events, and across a range of disorders. The prevalence of intrusion symptoms in the absence of PTSD suggests their relevance to the development of other psychopathology. Therefore, the predictive role of intrusion symptoms for other post-trauma psychopathology was examined using data from an epidemiological, longitudinal sample of adults recruited in childhood. From 5 phases of data collection for this sample, these analyses focused on the 20 year and 28 year follow-ups (n=583). Lifetime exposure to trauma was assessed using a modified set of 10 Criterion-A events from the Composite International Diagnostic Interview (CIDI), with PTSD assessed in reference to a self-nominated worst lifetime event, and other DSM-IV disorder also assessed using the CIDI. Results showed that the presence of intrusion symptoms without PTSD at the 20 year follow-up was predictive of increased risk at 28 years for depressive but not anxiety disorders. There was limited psychopathology in the sample, reducing the power to examine many individual disorders. Furthermore, trauma history and psychiatric symptoms were retrospectively reported, introducing the possibility of recall bias. Together the findings suggest that intrusion symptoms may play an aetiological role in the development and/or maintenance of disorders other than PTSD. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia’s marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta, and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to ‘small p and large n’ problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and
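
The basic VI feature-selection step the study builds on (fit an RF, rank predictors by importance, keep the top ones) can be sketched with scikit-learn on synthetic data. The backscatter-like predictors, class rule, and all values below are invented; AVI, KIAVI, and Boruta refine this idea with repeated runs and shadow features.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

# Synthetic stand-in for a seabed-hardness problem: 6 predictors, of which
# only features 0 and 1 actually drive the (binary, for simplicity) class.
n = 300
X = rng.normal(size=(n, 6))
y = (X[:, 0] + 0.8 * X[:, 1] > 0).astype(int)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank predictors by impurity-based variable importance; a VI-style
# selection would drop the lowest-ranked predictors and refit.
order = np.argsort(rf.feature_importances_)[::-1]
print("importance ranking:", order.tolist())
```

On data like this the two informative predictors rise to the top of the ranking, which is the signal the FS methods in the study exploit when pruning the multibeam-derived predictor set.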

  15. Connecting clinical and actuarial prediction with rule-based methods.

    Science.gov (United States)

    Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H

    2015-06-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset. In addition, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, with accuracy comparable to traditional actuarial methods. ((c) 2015 APA, all rights reserved.)
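    The sequential, fast-and-frugal use of such decision rules can be sketched in plain Python. The cues and cut-offs below are invented for illustration; they are not the rules RuleFit derived from the Penninx et al. (2011) data:

```python
# A fast-and-frugal tree evaluates a few cues in a fixed order and exits
# with a decision at the first rule that fires.
def predict_persistent_course(patient):
    # Rule 1 (hypothetical): high baseline severity -> persistent course.
    if patient["severity_score"] > 20:
        return True
    # Rule 2 (hypothetical): long symptom duration plus early onset.
    if patient["symptom_duration_months"] > 12 and patient["age_of_onset"] < 18:
        return True
    # Default exit: predict remission.
    return False

patients = [
    {"severity_score": 25, "symptom_duration_months": 3, "age_of_onset": 30},
    {"severity_score": 10, "symptom_duration_months": 24, "age_of_onset": 15},
    {"severity_score": 10, "symptom_duration_months": 6, "age_of_onset": 30},
]
print([predict_persistent_course(p) for p in patients])  # [True, True, False]
```

Note that the second patient exits at the second rule, so at most four cues are ever consulted, matching the "2 to 4 cues" economy described above.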

  16. Amorphous-to-Cu51Zr14 phase transformation in Cu60Ti20Zr20 alloy

    International Nuclear Information System (INIS)

    Cao, Q P; Zhou, Y H; Horsewell, A; Jiang, J Z

    2003-01-01

    The kinetics of the amorphous-to-Cu51Zr14 phase transformation in an as-cast Cu60Ti20Zr20 rod have been investigated by differential scanning calorimetry. The relative volume fractions of the transformed crystalline phase as a function of annealing time, obtained at 713, 716, 723, 728, and 733 K, have been analysed in detail using 14 nucleation and growth models together with the Johnson-Mehl-Avrami (JMA) model. A time-dependent nucleation process is revealed. A steady-state nucleation rate of the order of 10{sup 22}-10{sup 23} nuclei m{sup -3} s{sup -1} in the temperature range 713-733 K and an activation energy of the order of 550 kJ mol{sup -1} for the phase transformation in the as-cast Cu60Ti20Zr20 rod were detected, for which some possible reasons are suggested.
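    The JMA analysis of transformed fraction versus annealing time can be sketched as a least-squares fit of the Avrami equation x(t) = 1 - exp(-(kt)^n); the data below are synthetic stand-ins, not the actual DSC measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Johnson-Mehl-Avrami (JMA) transformed fraction; k is the rate constant
# and n the Avrami exponent.
def jma(t, k, n):
    return 1.0 - np.exp(-(k * t) ** n)

# Synthetic isothermal data standing in for one annealing temperature
# (the real 713-733 K fractions are not reproduced here).
t = np.linspace(1, 600, 60)                     # annealing time, s
true_k, true_n = 0.01, 2.5
x = jma(t, true_k, true_n) + np.random.default_rng(1).normal(0, 0.01, t.size)

(k_fit, n_fit), _ = curve_fit(jma, t, x, p0=(0.005, 2.0))
print(f"k = {k_fit:.4f} 1/s, Avrami exponent n = {n_fit:.2f}")
```

Repeating such fits at several temperatures and applying an Arrhenius plot to k(T) is the usual route to the activation energy quoted above.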

  17. Prediction of reported consumption of selected fat-containing foods.

    Science.gov (United States)

    Tuorila, H; Pangborn, R M

    1988-10-01

    A total of 100 American females (mean age = 20.8 years) completed a questionnaire, in which their beliefs, evaluations, liking and consumption (frequency, consumption compared to others, intention to consume) of milk, cheese, ice cream, chocolate and "high-fat foods" were measured. For the design and analysis, the basic frame of reference was the Fishbein-Ajzen model of reasoned action, but the final analyses were carried out with stepwise multiple regression analysis. In addition to the components of the Fishbein-Ajzen model, beliefs and evaluations were used as independent variables. On the average, subjects reported liking all the products but not "high-fat foods", and thought that milk and cheese were "good for you" whereas the remaining items were "bad for you". Principal component analysis for beliefs revealed factors related to pleasantness/benefit aspects, to health and weight concern and to the "functionality" of the foods. In stepwise multiple regression analyses, liking was the predominant predictor of reported consumption for all the foods, but various belief factors, particularly those related to concern with weight, also significantly predicted consumption. Social factors played only a minor role. The multiple R's of the predictive functions varied from 0.49 to 0.74. The fact that all four foods studied elicited individual sets of beliefs and belief structures, and that none of them was rated similar to the generic "high-fat foods", emphasizes that consumers attach meaning to integrated food entities rather than to ingredients.

  18. SU-F-P-20: Predicting Waiting Times in Radiation Oncology Using Machine Learning

    International Nuclear Information System (INIS)

    Joseph, A; Herrera, D; Hijal, T; Kildea, J; Hendren, L; Leung, A; Wainberg, J; Sawaf, M; Gorshkov, M; Maglieri, R; Keshavarz, M

    2016-01-01

    Purpose: Waiting times remain one of the most vexing patient satisfaction challenges facing healthcare. Waiting time uncertainty can cause patients, who are already sick or in pain, to worry about when they will receive the care they need. These waiting periods are often difficult for staff to predict and only rough estimates are typically provided based on personal experience. This level of uncertainty leaves most patients unable to plan their calendar, making the waiting experience uncomfortable, even painful. In the present era of electronic health records (EHRs), waiting times need not be so uncertain. Extensive EHRs provide unprecedented amounts of data that can statistically cluster towards representative values when appropriate patient cohorts are selected. Predictive modelling, such as machine learning, is a powerful approach that benefits from large, potentially complex, datasets. The essence of machine learning is to predict future outcomes by learning from previous experience. The application of a machine learning algorithm to waiting time data has the potential to produce personalized waiting time predictions such that the uncertainty may be removed from the patient’s waiting experience. Methods: In radiation oncology, patients typically experience several types of waiting (eg waiting at home for treatment planning, waiting in the waiting room for oncologist appointments and daily waiting in the waiting room for radiotherapy treatments). A daily treatment wait time model is discussed in this report. To develop a prediction model using our large dataset (with more than 100k sample points) a variety of machine learning algorithms from the Python package sklearn were tested. Results: We found that the Random Forest Regressor model provides the best predictions for daily radiotherapy treatment waiting times. 
Using this model, we achieved a median residual (actual value minus predicted value) of 0.25 minutes and a residual standard deviation of 6.5 minutes.
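    A minimal sketch of this kind of waiting-time regression, using scikit-learn's RandomForestRegressor on synthetic stand-in features (the abstract does not describe the real EHR feature set, so the features and waiting-time model below are invented):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical EHR-derived features: appointment hour, queue length,
# treatment machine id.
n = 5000
X = np.column_stack([
    rng.integers(8, 18, n),     # appointment hour
    rng.integers(0, 15, n),     # patients already waiting
    rng.integers(0, 4, n),      # treatment machine
])
y = 2.0 * X[:, 1] + 1.5 * (X[:, 0] - 8) + rng.normal(0, 5, n)  # wait, minutes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Residuals (actual minus predicted), summarized as in the abstract.
residuals = y_te - model.predict(X_te)
print(f"median residual: {np.median(residuals):.2f} min, "
      f"spread (std): {residuals.std():.2f} min")
```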

  19. SU-F-P-20: Predicting Waiting Times in Radiation Oncology Using Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, A; Herrera, D; Hijal, T; Kildea, J [McGill University Health Centre, Montreal, Quebec (Canada); Hendren, L; Leung, A; Wainberg, J; Sawaf, M; Gorshkov, M; Maglieri, R; Keshavarz, M [McGill University, Montreal, Quebec (Canada)

    2016-06-15

    Purpose: Waiting times remain one of the most vexing patient satisfaction challenges facing healthcare. Waiting time uncertainty can cause patients, who are already sick or in pain, to worry about when they will receive the care they need. These waiting periods are often difficult for staff to predict and only rough estimates are typically provided based on personal experience. This level of uncertainty leaves most patients unable to plan their calendar, making the waiting experience uncomfortable, even painful. In the present era of electronic health records (EHRs), waiting times need not be so uncertain. Extensive EHRs provide unprecedented amounts of data that can statistically cluster towards representative values when appropriate patient cohorts are selected. Predictive modelling, such as machine learning, is a powerful approach that benefits from large, potentially complex, datasets. The essence of machine learning is to predict future outcomes by learning from previous experience. The application of a machine learning algorithm to waiting time data has the potential to produce personalized waiting time predictions such that the uncertainty may be removed from the patient’s waiting experience. Methods: In radiation oncology, patients typically experience several types of waiting (eg waiting at home for treatment planning, waiting in the waiting room for oncologist appointments and daily waiting in the waiting room for radiotherapy treatments). A daily treatment wait time model is discussed in this report. To develop a prediction model using our large dataset (with more than 100k sample points) a variety of machine learning algorithms from the Python package sklearn were tested. Results: We found that the Random Forest Regressor model provides the best predictions for daily radiotherapy treatment waiting times. 
Using this model, we achieved a median residual (actual value minus predicted value) of 0.25 minutes and a residual standard deviation of 6.5 minutes.

  20. The Use of Web 2.0 Tools by Students in Learning and Leisure Contexts: A Study in a Portuguese Institution of Higher Education

    Science.gov (United States)

    Costa, Carolina; Alvelos, Helena; Teixeira, Leonor

    2016-01-01

    This study analyses and compares the use of Web 2.0 tools by students in both learning and leisure contexts. Data were collected based on a questionnaire applied to 234 students from the University of Aveiro (Portugal) and the results were analysed by using descriptive analysis, paired samples t-tests, cluster analyses and Kruskal-Wallis tests.…

  1. Analysing the Relevance of Experience Partitions to the Prediction of Players’ Self-Reports of Affect

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez; Yannakakis, Georgios N.

    2011-01-01

    A common practice in modeling affect from physiological signals consists of reducing the signals to a set of statistical features that feed predictors of self-reported emotions. This paper analyses the impact of various time-windows, used for the extraction of physiological features, to the accuracy…

  2. Ultrastructural analyses of blood capillaries of ovary of 20-days albino rats foetus under their irradiation in different periods of embryogenesis

    International Nuclear Information System (INIS)

    Ablekovskaya, O.N.; Amvros'ev, A.P.

    1999-01-01

    The character and direction of structural transformations of the blood capillaries of the microcirculatory channel in 20-day white rat foetuses were examined under normal conditions and after a single external irradiation with a dose of 0.5 Gy on day 10 or 14 of embryogenesis. Electron-microscopical, stereological and statistical analyses were used. The peculiarities of the reactions of hemocapillaries and their cell structure to gamma-ray action in embryogenesis were revealed. An increase in the diameter of the capillaries, an extension of the sectional area of the endotheliocyte cytoplasm, and a decrease in the size of the nuclei of these cells were shown. Polyploid endotheliocytes were found under the experimental conditions. Prenatal acute irradiation at low doses led to a reduction in the number of microvessels and of mitochondria in the cytoplasm of the blood capillary cells in the ovary of the rat foetus. These results revealed that low-dose ionizing radiation changed the morphological expression of important synthetic, transport and energy processes in the capillary cells of the ovary in the fetal period of ontogenesis

  3. Multipinhole collimator with 20 apertures for a brain SPECT application

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Tzu-Cheng; Ellin, Justin R.; Shrestha, Uttam; Seo, Youngho, E-mail: youngho.seo@ucsf.edu [Physics Research Laboratory, Department of Radiology and Biomedical Imaging, University of California, San Francisco, California 94107 (United States); Huang, Qiu [School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai 200030 (China); Gullberg, Grant T. [Department of Radiotracer Development and Imaging Technology, Life Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, California 94702 (United States)

    2014-11-01

    Purpose: Several new technologies for single photon emission computed tomography (SPECT) instrumentation with parallel-hole collimation have been proposed to improve detector sensitivity and signal collection efficiency. Benefits from improved signal efficiency include shorter acquisition times and lower dose requirements. In this paper, the authors show a possibility of over an order of magnitude enhancement in photon detection efficiency (from 7.6 × 10{sup −5} to 1.6 × 10{sup −3}) for dopamine transporter (DaT) imaging of the striatum over the conventional SPECT parallel-hole collimators by use of custom-designed 20 multipinhole (20-MPH) collimators with apertures of 0.75 cm diameter. Methods: Quantifying specific binding ratio (SBR) of {sup 123}I-ioflupane or {sup 123}I-iometopane’s signal at the striatal region is a common brain imaging method to confirm the diagnosis of the Parkinson’s disease. The authors performed imaging of a striatal phantom filled with aqueous solution of I-123 and compared camera recovery ratios of SBR acquired between low-energy high-resolution (LEHR) parallel-hole collimators and 20-MPH collimators. Results: With only two-thirds of total acquisition time (20 min against 30 min), a comparable camera recovery ratio of SBR was achieved using 20-MPH collimators in comparison to that from the LEHR collimator study. Conclusions: Their systematic analyses showed that the 20-MPH collimator could be a promising alternative for the DaT SPECT imaging for brain over the traditional LEHR collimator, which could give both shorter scan time and improved diagnostic accuracy.

  4. A NEW METHOD FOR PREDICTING SURVIVAL AND ESTIMATING UNCERTAINTY IN TRAUMA PATIENTS

    Directory of Open Access Journals (Sweden)

    V. G. Schetinin

    2017-01-01

    The Trauma and Injury Severity Score (TRISS) is the current “gold standard” for screening a patient’s condition for the purpose of predicting survival probability. More than 40 years of TRISS practice have revealed a number of problems, particularly (1) unexplained fluctuation of predicted values caused by aggregation of screening tests, and (2) low accuracy of uncertainty interval estimations. We developed a new method, made available to practitioners as a web calculator, to reduce the negative effect of the factors given above. The method involves the Bayesian methodology of statistical inference which, being computationally expensive, in theory provides the most accurate predictions. We implemented and tested this approach on a data set including 571,148 patients registered in the US National Trauma Data Bank (NTDB) with 1-20 injuries. These patients were distributed over the following categories: (1) 174,647 with 1 injury, (2) 381,137 with 2-10 injuries, and (3) 15,364 with 11-20 injuries. Survival rates in each category were 0.977, 0.953, and 0.831, respectively. The proposed method improved prediction accuracy by 0.04%, 0.36%, and 3.64% (p-value <0.05) in categories 1, 2, and 3, respectively. Hosmer-Lemeshow statistics showed a significant improvement in the calibration of the new model. The uncertainty 2σ intervals were reduced from 0.628 to 0.569 for patients of the second category and from 1.227 to 0.930 for patients of the third category, both with p-value <0.005. The new method shows a statistically significant improvement (p-value <0.05) in the accuracy of predicting survival and estimating the uncertainty intervals. The largest improvement was achieved for patients with 11-20 injuries. The method is available to practitioners as a web calculator at http://www.traumacalc.org.

  5. Prediction, Regression and Critical Realism

    DEFF Research Database (Denmark)

    Næss, Petter

    2004-01-01

    This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social phenomena. This position is fundamentally problematic to public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly seen as necessary in order to identify aggregate level effects of policy measures, but are questioned by many advocates of critical realist ontology. Using research into the relationship between urban structure and travel as an example, the paper discusses relevant research methods and the kinds…

  6. The Predictive Utility of Narcissism among Children and Adolescents: Evidence for a Distinction between Adaptive and Maladaptive Narcissism

    Science.gov (United States)

    Barry, Christopher T.; Frick, Paul J.; Adler, Kristy K.; Grafeman, Sarah J.

    2007-01-01

    We examined the predictive utility of narcissism among a community sample of children and adolescents (N=98) longitudinally. Analyses focused on the differential utility between maladaptive and adaptive narcissism for predicting later delinquency. Maladaptive narcissism significantly predicted self-reported delinquency at one-, two-, and…

  7. Tensor analysing power T{sub 20} in inelastic (d, d`) X scattering at 0{sup 0} on {sup 1}H and {sup 12}C from 4.5 to 9.0 GeV/c

    Energy Technology Data Exchange (ETDEWEB)

    Azhgirej, L S; Chernykh, E V [Laboratory of High Energies, Joint Institute for Nuclear Research, Dubna (Russian Federation); Kobushkin, A P [Institute of Theoretical Physics, Ukrainian Academy of Sciences, Kiev (Ukraine); and others

    1998-12-01

    Tensor analysing power T{sub 20} for the inelastic (d, d`) X reaction at deuteron momenta from 4.2 to 9 GeV/c is presented. It is observed that T{sub 20}, taken as a function of the four-momentum transfer squared t, demonstrates an approximate scaling; its absolute value is small at |t| <{approx_equal} (0.05 - 0.1) GeV{sup 2}/c{sup 2} and has a maximum at -t {approx_equal} 0.3 GeV{sup 2}/c{sup 2}. No significant dependence on the type of the target was observed. 10 refs., 4 figs., 4 tabs.

  8. Extending and Applying Spartan to Perform Temporal Sensitivity Analyses for Predicting Changes in Influential Biological Pathways in Computational Models.

    Science.gov (United States)

    Alden, Kieran; Timmis, Jon; Andrews, Paul S; Veiga-Fernandes, Henrique; Coles, Mark

    2017-01-01

    Through integrating real time imaging, computational modelling, and statistical analysis approaches, previous work has suggested that the induction of and response to cell adhesion factors is the key initiating pathway in early lymphoid tissue development, in contrast to the previously accepted view that the process is triggered by chemokine mediated cell recruitment. These model derived hypotheses were developed using spartan, an open-source sensitivity analysis toolkit designed to establish and understand the relationship between a computational model and the biological system that model captures. Here, we extend the functionality available in spartan to permit the production of statistical analyses that contrast the behavior exhibited by a computational model at various simulated time-points, enabling a temporal analysis that could suggest whether the influence of biological mechanisms changes over time. We exemplify this extended functionality by using the computational model of lymphoid tissue development as a time-lapse tool. By generating results at twelve- hour intervals, we show how the extensions to spartan have been used to suggest that lymphoid tissue development could be biphasic, and predict the time-point when a switch in the influence of biological mechanisms might occur.

  9. Hot Deformation Behavior and a Two-Stage Constitutive Model of 20Mn5 Solid Steel Ingot during Hot Compression

    Directory of Open Access Journals (Sweden)

    Min Liu

    2018-03-01

    20Mn5 steel is widely used in the manufacture of heavy hydro-generator shaft forgings due to its strength, toughness, and wear resistance. However, the hot deformation and recrystallization behaviors of 20Mn5 steel compressed at high temperature had not been studied. In this work, hot compression experiments at temperatures of 850–1200 °C and strain rates of 0.01 s−1–1 s−1 were conducted using a Gleeble-1500D thermo-mechanical simulator. Flow stress-strain curves and the microstructure after hot compression were obtained, and the effects of temperature and strain rate on the microstructure are analyzed. Based on the classical stress-dislocation relationship and the kinetics of dynamic recrystallization, a two-stage constitutive model is developed to predict the flow stress of 20Mn5 steel. Comparisons between experimental and predicted flow stress show that the predicted values are in good agreement with the experimental ones, indicating that the proposed constitutive model is reliable and can be used for numerical simulation of hot forging of 20Mn5 solid steel ingots.
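    Constitutive models of this kind are commonly built around the sinh-Arrhenius (Zener-Hollomon) flow-stress relation; the sketch below uses placeholder material constants, not the fitted 20Mn5 values:

```python
import math

# Sinh-Arrhenius (Zener-Hollomon) flow-stress relation:
#   Z = strain_rate * exp(Q / (R * T))
#   sigma = (1 / alpha) * asinh((Z / A) ** (1 / n))
# All material constants below are hypothetical placeholders.
R = 8.314          # gas constant, J/(mol K)
Q = 300e3          # apparent activation energy, J/mol (hypothetical)
A = 1.0e12         # structure factor, 1/s (hypothetical)
alpha = 0.012      # stress multiplier, 1/MPa (hypothetical)
n = 5.0            # stress exponent (hypothetical)

def flow_stress(strain_rate, T_kelvin):
    """Peak flow stress in MPa for a given strain rate (1/s) and temperature."""
    Z = strain_rate * math.exp(Q / (R * T_kelvin))
    return math.asinh((Z / A) ** (1.0 / n)) / alpha

for T in (1123.0, 1473.0):                 # 850 and 1200 degC
    for rate in (0.01, 1.0):
        print(f"T={T:.0f} K, rate={rate} 1/s -> {flow_stress(rate, T):.1f} MPa")
```

As expected of hot deformation, the sketched stress rises with strain rate and falls with temperature; the paper's two-stage model additionally switches form at the critical strain for dynamic recrystallization.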

  10. CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability

    Science.gov (United States)

    Methods for controlling natural variability, predicting environmental conditions from biological observations, biological trait data, species sensitivity distributions, propensity scores, and references for the Advanced Analyses section of the Data Analysis volume.

  11. Physiological and enzymatic analyses of pineapple subjected to ionizing radiation

    International Nuclear Information System (INIS)

    Silva, Josenilda Maria da; Silva, Juliana Pizarro; Spoto, Marta Helena Fillet

    2007-01-01

    The physiological and enzymatic post-harvest characteristics of the pineapple cultivar Smooth Cayenne were evaluated after the fruits were gamma-irradiated with doses of 100 and 150 Gy and stored for 10, 20 and 30 days at 12 °C (±1 °C) and a relative humidity of 85% (±5%). Physiological and enzymatic analyses were made for each storage period to evaluate the alterations resulting from the application of ionizing radiation. Control specimens showed higher values of soluble pectins, total pectins, reducing sugars, sucrose and total sugars, and lower polyphenoloxidase and polygalacturonase enzyme activities. All the analyses indicated that storage time is a significant influencing factor. The 100 Gy dose and the 20-day storage period gave the best results from the standpoint of maturation and conservation of fruit quality. (author)

  12. Analysing power for neutron-proton scattering at 14.1 MeV

    International Nuclear Information System (INIS)

    Brock, J.E.; Chisholm, A.; Duder, J.C.; Garrett, R.; Poletti, J.L.

    1981-01-01

    The analysing power A{sub y}(theta) for neutron-proton scattering has been measured at 14.1 MeV for c.m. angles between 50{sup 0} and 157{sup 0}. A polarized neutron beam was produced by the reaction {sup 3}H(d,n){sup 4}He at 110 keV, using polarized deuterons from an atomic-beam polarized ion source. Liquid and plastic scintillators were used as proton targets and the scattered particles were detected in an array of plastic scintillators. Use of the associated-alpha technique, multi-parameter recording of events and off-line computer treatment led to very low backgrounds. The results differ significantly from the predictions of the phase-shift analyses of Yale IV, Livermore X and Arndt et al. We find, however, excellent agreement with the predictions of the Paris potential of Lacombe et al. Existing n-p analysing power results up to 30 MeV are surveyed and found to be consistent. An attempt was made to look for an isospin splitting of the triplet P-wave phase shifts. (orig.)

  13. PRmePRed: A protein arginine methylation prediction tool.

    Directory of Open Access Journals (Sweden)

    Pawan Kumar

    Protein methylation is an important post-translational modification (PTM) of proteins. Arginine methylation carries out and regulates several important biological functions, including gene regulation and signal transduction. Experimental identification of arginine methylation sites is a daunting task, as it is costly as well as time- and labour-intensive. Hence, reliable prediction tools play an important role in the rapid screening and identification of possible methylation sites in proteomes. Our preliminary assessment using the available prediction methods on collected data yielded unimpressive results. This motivated us to perform a comprehensive data analysis and an appraisal of features relevant in the context of biological significance, which led to the development of a prediction tool, PRmePRed, with better performance. PRmePRed performs reasonably well, with an accuracy of 84.10%, a sensitivity of 82.38%, a specificity of 83.77%, and a Matthews correlation coefficient of 66.20% in 10-fold cross-validation. PRmePRed is freely available at http://bioinfo.icgeb.res.in/PRmePRed/.
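    The reported cross-validation metrics can all be computed from the four confusion-matrix counts; the counts below are illustrative only, not PRmePRed's actual confusion matrix:

```python
import math

# Accuracy, sensitivity, specificity and Matthews correlation coefficient
# (MCC) from confusion-matrix counts.
def classification_metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return accuracy, sensitivity, specificity, mcc

# Illustrative counts for 1000 hypothetical residues.
acc, sens, spec, mcc = classification_metrics(tp=412, tn=419, fp=81, fn=88)
print(f"acc={acc:.4f} sens={sens:.4f} spec={spec:.4f} MCC={mcc:.4f}")
```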

  14. The prediction of surface temperature in the new seasonal prediction system based on the MPI-ESM coupled climate model

    Science.gov (United States)

    Baehr, J.; Fröhlich, K.; Botzet, M.; Domeisen, D. I. V.; Kornblueh, L.; Notz, D.; Piontek, R.; Pohlmann, H.; Tietsche, S.; Müller, W. A.

    2015-05-01

    A seasonal forecast system is presented, based on the global coupled climate model MPI-ESM as used for CMIP5 simulations. We describe the initialisation of the system and analyse its predictive skill for surface temperature. The presented system is initialised in the atmospheric, oceanic, and sea ice component of the model from reanalysis/observations with full field nudging in all three components. For the initialisation of the ensemble, bred vectors with a vertically varying norm are implemented in the ocean component to generate initial perturbations. In a set of ensemble hindcast simulations, starting each May and November between 1982 and 2010, we analyse the predictive skill. Bias-corrected ensemble forecasts for each start date reproduce the observed surface temperature anomalies at 2-4 months lead time, particularly in the tropics. Niño3.4 sea surface temperature anomalies show a small root-mean-square error and predictive skill up to 6 months. Away from the tropics, predictive skill is mostly limited to the ocean, and to regions which are strongly influenced by ENSO teleconnections. In summary, the presented seasonal prediction system based on a coupled climate model shows predictive skill for surface temperature at seasonal time scales comparable to other seasonal prediction systems using different underlying models and initialisation strategies. As the same model underlying our seasonal prediction system—with a different initialisation—is presently also used for decadal predictions, this is an important step towards seamless seasonal-to-decadal climate predictions.

  15. Comparative biochemical analyses of venous blood and peritoneal fluid from horses with colic using a portable analyser and an in-house analyser.

    Science.gov (United States)

    Saulez, M N; Cebra, C K; Dailey, M

    2005-08-20

    Fifty-six horses with colic were examined over a period of three months. The concentrations of glucose, lactate, sodium, potassium and chloride, and the pH of samples of blood and peritoneal fluid, were determined with a portable clinical analyser and with an in-house analyser and the results were compared. Compared with the in-house analyser, the portable analyser gave higher pH values for blood and peritoneal fluid with greater variability in the alkaline range, and lower pH values in the acidic range, lower concentrations of glucose in the range below 8.3 mmol/l, and lower concentrations of lactate in venous blood in the range below 5 mmol/l and in peritoneal fluid in the range below 2 mmol/l, with less variability. On average, the portable analyser underestimated the concentrations of lactate and glucose in peritoneal fluid in comparison with the in-house analyser. Its measurements of the concentrations of sodium and chloride in peritoneal fluid had a higher bias and were more variable than the measurements in venous blood, and its measurements of potassium in venous blood and peritoneal fluid had a smaller bias and less variability than the measurements made with the in-house analyser.

  16. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part I: Effects of Random Error

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
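    Both accuracy measures follow directly from the 2x2 contingency table of dichotomous forecasts against observations; the counts below are illustrative only, not the values from the ARPS-based study:

```python
# Percent correct (PC) and Hanssen-Kuipers discriminant (HKD) for a
# dichotomous contrail forecast, from the 2x2 contingency counts:
# hits (a), false alarms (b), misses (c), correct negatives (d).
def pc_and_hkd(a, b, c, d):
    pc = (a + d) / (a + b + c + d)
    hkd = a / (a + c) - b / (b + d)   # hit rate minus false-alarm rate
    return pc, hkd

# Illustrative counts for 5000 hypothetical forecast-observation pairs.
pc, hkd = pc_and_hkd(a=900, b=150, c=100, d=3850)
print(f"PC = {pc:.3f}, HKD = {hkd:.3f}")
```

Because HKD penalizes false alarms relative to the base rate, it rewards a lower probability threshold (e.g., the climatological frequency) than PC does, which matches the threshold behaviour described above.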

  17. A Trillion-Dollar Question: What Predicts Student Loan Delinquencies?

    Science.gov (United States)

    Mezza, Alvaro; Sommer, Kamila

    2016-01-01

    The recent significant increase in student loan delinquencies has generated interest in understanding the key factors predicting the non-performance of these loans. However, despite the large size of the student loan market, existing analyses have been limited by lack of data. This paper studies predictors of student loan delinquencies using a…

  18. Predicting epileptic seizures in advance.

    Directory of Open Access Journals (Sweden)

    Negin Moghim

    Epilepsy is the second most common neurological disorder, affecting 0.6-0.8% of the world's population. In this neurological disorder, abnormal activity of the brain causes seizures, the nature of which tends to be sudden. Antiepileptic drugs (AEDs) are used as long-term therapeutic solutions that control the condition. Of those treated with AEDs, 35% become resistant to medication. The unpredictable nature of seizures poses risks for the individual with epilepsy. It is clearly desirable to find more effective ways of preventing seizures for such patients. The automatic detection of oncoming seizures, before their actual onset, can facilitate timely intervention and hence minimize these risks. In addition, advance prediction of seizures can enrich our understanding of the epileptic brain. In this study, drawing on the body of work behind automatic seizure detection and prediction from digitised invasive electroencephalography (EEG) data, a prediction algorithm, ASPPR (Advance Seizure Prediction via Pre-ictal Relabeling), is described. ASPPR facilitates the learning of predictive models targeted at recognizing patterns in EEG activity that lie in a specific time window in advance of a seizure. It then exploits advanced machine learning coupled with the design and selection of appropriate features from EEG signals. Results from evaluating ASPPR independently on 21 different patients suggest that seizures for many patients can be predicted up to 20 minutes in advance of their onset. Compared to benchmark performance represented by a mean S1-Score (the harmonic mean of sensitivity and specificity) of 90.6% for predicting seizure onset between 0 and 5 minutes in advance, ASPPR achieves mean S1-Scores of: 96.30% for prediction between 1 and 6 minutes in advance, 96.13% for prediction between 8 and 13 minutes in advance, 94.5% for prediction between 14 and 19 minutes in advance, and 94.2% for prediction between 20 and 25 minutes in advance.
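    The S1-Score used above is simply the harmonic mean of sensitivity and specificity, which a short helper makes explicit (the input values below are illustrative, not the study's per-patient results):

```python
# S1-Score: harmonic mean of sensitivity and specificity, analogous to
# how the F1-score combines precision and recall.
def s1_score(sensitivity, specificity):
    if sensitivity + specificity == 0:
        return 0.0
    return 2 * sensitivity * specificity / (sensitivity + specificity)

# Illustrative sensitivity/specificity pair.
print(f"S1 = {s1_score(0.97, 0.955):.4f}")
```

Like any harmonic mean, S1 is pulled toward the weaker of the two rates, so a predictor cannot score well by trading specificity for sensitivity alone.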

  19. Cross-species Analyses Unravel the Complexity of H3K27me3 and H4K20me3 in the Context of Neural Stem Progenitor Cells.

    Science.gov (United States)

    Rhodes, Christopher T; Sandstrom, Richard S; Huang, Shu-Wei Angela; Wang, Yufeng; Schotta, Gunnar; Berger, Mitchel S; Lin, Chin-Hsing Annie

    2016-06-01

    Neural stem progenitor cells (NSPCs) in the human subventricular zone (SVZ) potentially contribute to life-long neurogenesis, yet subtypes of glioblastoma multiforme (GBM) contain NSPC signatures that highlight the importance of cell fate regulation. Among numerous regulatory mechanisms, post-translational methylation of histone tails is a crucial regulator of cell fate. The work presented here focuses on the role of two repressive chromatin marks, tri-methylation of histone H3 lysine 27 (H3K27me3) and of histone H4 lysine 20 (H4K20me3), in adult NSPCs within the SVZ. To best model healthy human NSPCs as they exist in vivo for epigenetic profiling of H3K27me3 and H4K20me3, we utilized NSPCs isolated from the adult SVZ of baboon brain (Papio anubis), whose brain structure and genome are similar to those of humans. The putative role of H3K27me3 in normal NSPCs predominantly falls into the regulation of gene expression, cell cycle, and differentiation, whereas H4K20me3 is involved in DNA replication/repair, metabolism, and the cell cycle. Using conditional knock-out mouse models to diminish Ezh2 and Suv4-20h, responsible for H3K27me3 and H4K20me3, respectively, we found that both repressive marks have an irrefutable function in cell cycle regulation in the NSPC population. While both EZH2/H3K27me3 and Suv4-20h/H4K20me3 have implications in cancer, our comparative genomics approach between healthy NSPCs and human GBM specimens revealed that substantial sets of genes enriched with H3K27me3 and H4K20me3 in the NSPCs are altered in human GBM. In sum, our integrated analyses across species highlight important roles of H3K27me3 and H4K20me3 in normal and disease conditions in the context of NSPCs.

  20. Cross-species analyses unravel the complexity of H3K27me3 and H4K20me3 in the context of neural stem progenitor cells

    Directory of Open Access Journals (Sweden)

    Christopher T. Rhodes

    2016-06-01

    Full Text Available Neural stem progenitor cells (NSPCs) in the human subventricular zone (SVZ) potentially contribute to lifelong neurogenesis, yet subtypes of glioblastoma multiforme (GBM) contain NSPC signatures that highlight the importance of cell fate regulation. Among numerous regulatory mechanisms, posttranslational methylation of histone tails is a crucial regulator of cell fate. The work presented here focuses on the role of 2 repressive chromatin marks, trimethylation of histone H3 lysine 27 (H3K27me3) and of histone H4 lysine 20 (H4K20me3), in adult NSPCs within the SVZ. To best model healthy human NSPCs as they exist in vivo for epigenetic profiling of H3K27me3 and H4K20me3, we used NSPCs isolated from the adult SVZ of baboon brain (Papio anubis), whose brain structure and genome are similar to those of humans. The putative role of H3K27me3 in normal NSPCs predominantly falls into the regulation of gene expression, cell cycle, and differentiation, whereas H4K20me3 is involved in DNA replication/repair, metabolism, and the cell cycle. Using conditional knockout mouse models to diminish Ezh2 and Suv4-20h, responsible for H3K27me3 and H4K20me3, respectively, we found that both repressive marks have an irrefutable function in cell cycle regulation in the NSPC population. Although both EZH2/H3K27me3 and Suv4-20h/H4K20me3 have implications in cancer, our comparative genomics approach between healthy NSPCs and human GBM specimens revealed that substantial sets of genes enriched with H3K27me3 and H4K20me3 in the NSPCs are altered in human GBM. In sum, our integrated analyses across species highlight important roles of H3K27me3 and H4K20me3 in normal and disease conditions in the context of NSPCs.

  1. Development of a Thermal Equilibrium Prediction Algorithm

    International Nuclear Information System (INIS)

    Aviles-Ramos, Cuauhtemoc

    2002-01-01

    A thermal equilibrium prediction algorithm is developed and tested using a heat conduction model and data sets from calorimetric measurements. The physical model used in this study is the exact solution of a system of two partial differential equations that govern the heat conduction in the calorimeter. A multi-parameter estimation technique is developed and implemented to estimate the effective volumetric heat generation and thermal diffusivity in the calorimeter measurement chamber, and the effective thermal diffusivity of the heat flux sensor. These effective properties and the exact solution are used to predict the heat flux sensor voltage readings at thermal equilibrium. Thermal equilibrium predictions are carried out considering only 20% of the total measurement time required for thermal equilibrium. A comparison of the predicted and experimental thermal equilibrium voltages shows that the average percentage error from 330 data sets is only 0.1%. The data sets used in this study come from calorimeters of different sizes that use different kinds of heat flux sensors. Furthermore, different nuclear material matrices were assayed in the process of generating these data sets. This study shows that the integration of this algorithm into the calorimeter data acquisition software will result in an 80% reduction of measurement time. This reduction results in a significant cutback in operational costs for the calorimetric assay of nuclear materials. (authors)
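    The paper's physical model is the exact solution of two coupled heat-conduction equations; as a much-simplified illustration of the same extrapolation idea, one can fit a single-exponential approach to equilibrium to only the early portion of a measurement and read off the asymptote (all data and parameter values below are synthetic, not from the study):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def approach(t, v_eq, a, tau):
        # first-order approach to thermal equilibrium
        return v_eq - a * np.exp(-t / tau)

    # synthetic "measurement": equilibrium voltage 5.0 V, reached slowly
    t_full = np.linspace(0.0, 1000.0, 500)
    v_full = approach(t_full, 5.0, 2.0, 300.0)

    # use only the first 20% of the measurement time, as in the abstract
    mask = t_full <= 200.0
    popt, _ = curve_fit(approach, t_full[mask], v_full[mask], p0=(4.0, 1.0, 100.0))
    print(round(popt[0], 3))  # predicted equilibrium voltage, close to 5.0
    ```

    The estimated asymptote `v_eq` plays the role of the predicted equilibrium sensor voltage, obtained long before the system actually settles.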

  2. Validation of ASTEC v2.0 corium jet fragmentation model using FARO experiments

    International Nuclear Information System (INIS)

    Hermsmeyer, S.; Pla, P.; Sangiorgi, M.

    2015-01-01

    Highlights: • Model validation base extended to six FARO experiments. • Focus on the calculation of the fragmented particle diameter. • Capability and limits of the ASTEC fragmentation model. • Sensitivity analysis of model outputs. - Abstract: ASTEC is an integral code for the prediction of Severe Accidents in Nuclear Power Plants. As such, it needs to cover all physical processes that could occur during accident progression, yet keep its models simple enough for the ensemble to stay manageable and produce results within an acceptable time. The present paper is concerned with the validation of the corium jet fragmentation model of ASTEC v2.0 rev3 by means of a selection of six experiments carried out in the FARO facility. The different conditions applied within these six experiments help to analyse the model behaviour in different situations and to expose model limits. In addition to comparing model outputs with experimental measurements, sensitivity analyses are applied to investigate the model. The results of the paper are (i) validation runs, accompanied by an identification of situations where the implemented fragmentation model does not match the experiments well, with a discussion of the results; (ii) particular attention to the models that calculate the diameter of the fragmented particles, including the identification of a fault in one implemented model and a discussion of simplifications and ad hoc modifications to improve the model fit; and (iii) an investigation of the sensitivity of the predictions to inputs and parameters. In this way, the paper offers a thorough investigation of the merits and limitations of the fragmentation model used in ASTEC.
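    Point (iii), the sensitivity of predictions to inputs, can be illustrated with a simple one-at-a-time perturbation scheme; the power-law correlation below is a toy surrogate chosen purely for illustration, not the ASTEC fragmentation model:

    ```python
    def fragment_diameter(weber, density_ratio, velocity):
        # toy power-law surrogate for a fragment-diameter correlation
        return 5e-3 * weber**-0.25 * density_ratio**0.1 / velocity**0.5

    base = dict(weber=1000.0, density_ratio=7.0, velocity=4.0)
    d0 = fragment_diameter(**base)

    # perturb each input by +10% and report the relative change in the output
    for name in base:
        perturbed = dict(base, **{name: base[name] * 1.1})
        rel = (fragment_diameter(**perturbed) - d0) / d0
        print(f"{name}: {rel:+.1%}")
    ```

    Ranking the inputs by the magnitude of the induced output change is exactly the kind of screening such a sensitivity analysis provides before moving to more expensive global methods.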

  3. Parental phonological memory contributes to prediction of outcome of late talkers from 20 months to 4 years: a longitudinal study of precursors of specific language impairment

    Directory of Open Access Journals (Sweden)

    Bishop Dorothy VM

    2012-02-01

    Full Text Available Abstract Background Many children who are late talkers go on to develop normal language, but others go on to have longer-term language difficulties. In this study, we considered which factors were predictive of persistent problems in late talkers. Methods Parental report of expressive vocabulary at 18 months of age was used to select 26 late talkers and 70 average talkers, who were assessed for language and cognitive ability at 20 months of age. Follow-up at 4 years of age was carried out for 24 late and 58 average talkers. A psychometric test battery was used to categorize children in terms of language status (unimpaired or impaired) and nonverbal ability (normal range, or more than 1 SD below average). The vocabulary and non-word repetition skills of the accompanying parent were also assessed. Results Among the late talkers, seven (29%) met our criteria for specific language impairment (SLI) at 4 years of age, and a further two (8%) had low nonverbal ability. In the group of average talkers, eight (14%) met the criteria for SLI at 4 years, and five other children (8%) had low nonverbal ability. Family history of language problems was slightly better than late-talker status as a predictor of SLI. The best predictors of SLI at 20 months of age were the score on the receptive language scale of the Mullen Scales of Early Learning and the parent's performance on a non-word repetition task. Maternal education was not a significant predictor of outcome. Conclusions In this study, around three-quarters of late talkers did not have any language difficulties at 4 years of age, provided there was no family history of language impairment. A family history of language-literacy problems was found to be a significant predictor of persisting problems. Nevertheless, there are children with SLI for whom prediction is difficult because they did not have early language delay.

  4. Percent free prostate-specific antigen is effective to predict prostate biopsy outcome in Chinese men with prostate-specific antigen between 10.1 and 20.0 ng ml−1

    Science.gov (United States)

    Chen, Rui; Zhou, Li-Qun; Cai, Xiao-Bing; Xie, Li-Ping; Huang, Yi-Ran; He, Da-Lin; Gao, Xu; Xu, Chuan-Liang; Ding, Qiang; Wei, Qiang; Yin, Chang-Jun; Ren, Shan-Cheng; Wang, Fu-Bo; Tian, Ye; Sun, Zhong-Quan; Fu, Qiang; Ma, Lu-Lin; Zheng, Jun-Hua; Ye, Zhang-Qun; Ye, Ding-Wei; Xu, Dan-Feng; Hou, Jian-Quan; Xu, Ke-Xin; Yuan, Jian-Lin; Gao, Xin; Liu, Chun-Xiao; Pan, Tie-Jun; Sun, Ying-Hao

    2015-01-01

    Percent free prostate-specific antigen (%fPSA) has been introduced as a tool to avoid unnecessary biopsies in patients with a serum PSA level of 4.0–10.0 ng ml−1; however, it remains controversial whether %fPSA is effective in the PSA range of 10.1–20.0 ng ml−1 in both Chinese and Western populations. In this study, the diagnostic performance of %fPSA and serum PSA in predicting prostate cancer (PCa) and high-grade PCa (HGPCa) was analyzed in a multi-center biopsy cohort of 5915 consecutive Chinese patients who underwent prostate biopsy in 22 hospitals across China from January 1, 2010 to December 31, 2013. The indication for biopsy was PSA >4.0 ng ml−1 and/or suspicious digital rectal examination. Total and free serum PSA determinations were performed by three types of electrochemiluminescence immunoassays with recalibration to the World Health Organization standards. The diagnostic accuracy of PSA, %fPSA and %fPSA in combination with PSA (%fPSA + PSA) was determined by the area under the receiver operating characteristic curve (AUC). %fPSA was more effective than PSA in men aged ≥60 years. The AUC was 0.584 and 0.635 in men aged ≥60 years with a PSA of 4.0–10.0 ng ml−1 and 10.1–20.0 ng ml−1, respectively. The AUC of %fPSA was superior to that of PSA in predicting HGPCa in patients aged ≥60 years in these two PSA ranges. Our results indicate that %fPSA is both statistically effective and clinically applicable for predicting prostate biopsy outcome in Chinese patients aged ≥60 years with a PSA of 4.0–10.0 ng ml−1 or 10.1–20.0 ng ml−1. PMID:25926603

  5. Percent free prostate-specific antigen is effective to predict prostate biopsy outcome in Chinese men with prostate-specific antigen between 10.1 and 20.0 ng ml(-1).

    Science.gov (United States)

    Chen, Rui; Zhou, Li-Qun; Cai, Xiao-Bing; Xie, Li-Ping; Huang, Yi-Ran; He, Da-Lin; Gao, Xu; Xu, Chuan-Liang; Ding, Qiang; Wei, Qiang; Yin, Chang-Jun; Ren, Shan-Cheng; Wang, Fu-Bo; Tian, Ye; Sun, Zhong-Quan; Fu, Qiang; Ma, Lu-Lin; Zheng, Jun-Hua; Ye, Zhang-Qun; Ye, Ding-Wei; Xu, Dan-Feng; Hou, Jian-Quan; Xu, Ke-Xin; Yuan, Jian-Lin; Gao, Xin; Liu, Chun-Xiao; Pan, Tie-Jun; Sun, Ying-Hao

    2015-01-01

    Percent free prostate-specific antigen (%fPSA) has been introduced as a tool to avoid unnecessary biopsies in patients with a serum PSA level of 4.0-10.0 ng ml-1; however, it remains controversial whether %fPSA is effective in the PSA range of 10.1-20.0 ng ml-1 in both Chinese and Western populations. In this study, the diagnostic performance of %fPSA and serum PSA in predicting prostate cancer (PCa) and high-grade PCa (HGPCa) was analyzed in a multi-center biopsy cohort of 5915 consecutive Chinese patients who underwent prostate biopsy in 22 hospitals across China from January 1, 2010 to December 31, 2013. The indication for biopsy was PSA >4.0 ng ml-1 and/or suspicious digital rectal examination. Total and free serum PSA determinations were performed by three types of electrochemiluminescence immunoassays with recalibration to the World Health Organization standards. The diagnostic accuracy of PSA, %fPSA and %fPSA in combination with PSA (%fPSA + PSA) was determined by the area under the receiver operating characteristic curve (AUC). %fPSA was more effective than PSA in men aged ≥60 years. The AUC was 0.584 and 0.635 in men aged ≥60 years with a PSA of 4.0-10.0 ng ml-1 and 10.1-20.0 ng ml-1, respectively. The AUC of %fPSA was superior to that of PSA in predicting HGPCa in patients aged ≥60 years in these two PSA ranges. Our results indicate that %fPSA is both statistically effective and clinically applicable for predicting prostate biopsy outcome in Chinese patients aged ≥60 years with a PSA of 4.0-10.0 ng ml-1 or 10.1-20.0 ng ml-1.
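    The %fPSA statistic and the AUC comparison used in these two records can be sketched as follows; the patient values are invented toy numbers, not data from the study:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    # hypothetical patients in the 10.1-20.0 ng/ml total-PSA range
    total_psa = np.array([12.0, 15.5, 16.0, 18.0, 13.4, 16.8])  # ng/ml
    free_psa  = np.array([ 2.4,  1.2,  2.8,  1.1,  2.0,  1.3])  # ng/ml
    cancer    = np.array([   0,    1,    0,    1,    0,    1])   # biopsy outcome

    pct_fpsa = 100.0 * free_psa / total_psa

    # lower %fPSA indicates higher cancer risk, so score with its negative
    auc_pct_fpsa = roc_auc_score(cancer, -pct_fpsa)
    auc_psa = roc_auc_score(cancer, total_psa)
    print(round(auc_pct_fpsa, 3), round(auc_psa, 3))  # %fPSA separates better here
    ```

    The AUC is the probability that a randomly chosen cancer case is scored higher than a randomly chosen control, which is why it is the natural summary for comparing the two markers.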

  6. Use of protection motivation theory, affect, and barriers to understand and predict adherence to outpatient rehabilitation.

    Science.gov (United States)

    Grindley, Emma J; Zizzi, Samuel J; Nasypany, Alan M

    2008-12-01

    Protection motivation theory (PMT) has been used in more than 20 different health-related fields to study intentions and behavior, albeit primarily outside the area of injury rehabilitation. To examine and predict patient adherence behavior, this study explored the use of PMT as a screening tool in a general sample of people with orthopedic conditions. New patients who were older than 18 years and who were prescribed 4 to 8 weeks of physical therapy treatment (n=229) were administered a screening tool (the Sports Injury Rehabilitation Beliefs Scale, the Positive and Negative Affect Schedule, and a barriers checklist) prior to treatment. Participants' adherence was assessed with several attendance measures and an in-clinic assessment of behavior. Statistical analyses included correlation, chi-square, multiple regression, and discriminant function analyses. A variety of relationships among affect, barriers, and PMT components were evident. In-clinic behavior and attendance were influenced by affect, whereas dropout status was predicted by affect, severity, self-efficacy, and age. The screening tool used in this study may assist in identifying patients who are at risk of poor adherence and provide valuable information to enhance provider-patient relationships and foster patient adherence. However, more research is recommended to further understand the impact of these variables on patient adherence, and the screening tool should be enhanced to increase its predictive ability.

  7. Linear filters as a method of real-time prediction of geomagnetic activity

    International Nuclear Information System (INIS)

    McPherron, R.L.; Baker, D.N.; Bargatze, L.F.

    1985-01-01

    Important factors controlling geomagnetic activity include the solar wind velocity, the strength of the interplanetary magnetic field (IMF), and the field orientation. Because these quantities change so much in transit through the solar wind, real-time monitoring immediately upstream of the earth provides the best input for any technique of real-time prediction. One such technique is linear prediction filtering, which utilizes past histories of the input and output of a linear system to create a time-invariant filter characterizing the system. Problems of nonlinearity or temporal changes in the system can be handled by appropriate choice of input parameters and piecewise approximation over various ranges of the input. We have created prediction filters for all the standard magnetic indices and tested their efficiency. The filters show that the initial response of the magnetosphere to a southward turning of the IMF peaks at 20 minutes and again at 55 minutes. After a northward turning, auroral zone indices and the midlatitude ASYM index return to background within 2 hours, while Dst decays exponentially with a time constant of about 8 hours. This paper describes a simple, real-time system utilizing these filters which could predict a substantial fraction of the variation in magnetic activity indices 20 to 50 minutes in advance.
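    A linear prediction filter of the kind described can be estimated by ordinary least squares from input/output histories; the sketch below recovers a known filter from synthetic data (in the real application the input would be upstream solar-wind measurements and the output a magnetic activity index):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, n_lags = 2000, 5
    x = rng.normal(size=n)                      # input, e.g. a solar-wind coupling parameter
    true_h = np.array([0.0, 0.5, 0.3, 0.1, 0.05])
    y = np.convolve(x, true_h)[:n]              # output, e.g. an activity index

    # design matrix of past input values: column k holds x delayed by k steps
    X = np.column_stack([np.concatenate([np.zeros(k), x[:n - k]]) for k in range(n_lags)])
    h, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares filter estimate
    print(np.round(h, 3))
    ```

    The estimated impulse response `h` is the time-invariant filter: convolving it with new input data yields the predicted index, and its lag structure (here, a peak one step after the input) mirrors the delayed magnetospheric response described above.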

  8. Numerical prediction of turbulent heat transfer augmentation in an annular fuel channel with two-dimensional square ribs

    International Nuclear Information System (INIS)

    Takase, Kazuyuki

    1996-01-01

    The square-ribbed fuel rod for high-temperature gas-cooled reactors was developed to enhance turbulent heat transfer relative to the standard fuel rod. To evaluate the heat transfer performance of the square-ribbed fuel rod, the turbulent heat transfer coefficients in an annular fuel channel with repeated two-dimensional square ribs were analyzed numerically for a fully developed incompressible flow, using the k - ε turbulence model and a two-dimensional axisymmetric coordinate system. Numerical analyses were carried out for Reynolds numbers from 3000 to 20000 and ratios of square-rib pitch to height of 10, 20 and 40. The predicted heat transfer coefficients agreed with the empirical heat transfer correlations obtained from the experimental data within an error of 10% for a square-rib pitch-to-height ratio of 10, 20% for a ratio of 20, and 25% for a ratio of 40. It was concluded from the present study that the heat transfer augmentation by square ribs can be predicted sufficiently well by the present numerical simulations, and that part of its mechanism can be explained by the change in the turbulence kinetic energy distribution along the flow direction. (author)

  9. Novel FAM20A mutation causes autosomal recessive amelogenesis imperfecta.

    Science.gov (United States)

    Volodarsky, Michael; Zilberman, Uri; Birk, Ohad S

    2015-06-01

    To relate the peculiar phenotype of amelogenesis imperfecta in a large Bedouin family to the genotype determined by whole genome linkage analysis. Amelogenesis imperfecta (AI) is a broad group of inherited pathologies affecting enamel formation, characterized by variability in phenotypes, causative mutations and modes of inheritance. Autosomal recessive or compound heterozygous mutations in FAM20A, encoding family with sequence similarity 20, member A, have been shown to cause several AI phenotypes. Five members of a large consanguineous Bedouin family presented with hypoplastic amelogenesis imperfecta with unerupted and resorbed permanent molars. Following Soroka Medical Center IRB approval and informed consent, blood samples were obtained from six affected offspring, five obligatory carriers and two unaffected siblings. Whole genome linkage analysis was performed, followed by Sanger sequencing of FAM20A. The sequencing revealed a novel homozygous deletion mutation in exon 11 (c.1523delC), predicted to introduce a premature stop codon (p.Thr508Lysfs*6). We provide an interesting case of a novel mutation in this rare disorder, in which the affected kindred is unique in the large number of family members sharing a similar phenotype. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Target gene analyses of 39 amelogenesis imperfecta kindreds

    Science.gov (United States)

    Chan, Hui-Chen; Estrella, Ninna M. R. P.; Milkovich, Rachel N.; Kim, Jung-Wook; Simmer, James P.; Hu, Jan C-C.

    2012-01-01

    Previously, mutational analyses identified six disease-causing mutations in 24 amelogenesis imperfecta (AI) kindreds. We have since expanded the number of AI kindreds to 39, and performed mutation analyses covering the coding exons and adjoining intron sequences for the six proven AI candidate genes [amelogenin (AMELX), enamelin (ENAM), family with sequence similarity 83, member H (FAM83H), WD repeat containing domain 72 (WDR72), enamelysin (MMP20), and kallikrein-related peptidase 4 (KLK4)] and for ameloblastin (AMBN) (a suspected candidate gene). All four of the X-linked AI families (100%) had disease-causing mutations in AMELX, suggesting that AMELX is the only gene involved in the aetiology of X-linked AI. Eighteen families showed an autosomal-dominant pattern of inheritance. Disease-causing mutations were identified in 12 (67%): eight in FAM83H, and four in ENAM. No FAM83H coding-region or splice-junction mutations were identified in three probands with autosomal-dominant hypocalcification AI (ADHCAI), suggesting that a second gene may contribute to the aetiology of ADHCAI. Six families showed an autosomal-recessive pattern of inheritance, and disease-causing mutations were identified in three (50%): two in MMP20, and one in WDR72. No disease-causing mutations were found in 11 families with only one affected member. We conclude that mutation analyses of the current candidate genes for AI have about a 50% chance of identifying the disease-causing mutation in a given kindred. PMID:22243262

  11. 10 CFR 436.20 - Net savings.

    Science.gov (United States)

    2010-01-01

    ... ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.20 Net savings. For a retrofit project, net savings may be found by subtracting life cycle costs based on the proposed project from life cycle costs based on not having it. For a...
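    The net-savings rule quoted above reduces to a simple difference of life-cycle costs; a minimal sketch with hypothetical figures:

    ```python
    def net_savings(lcc_without_project: float, lcc_with_project: float) -> float:
        """Net savings for a retrofit project: life cycle cost without the
        project minus life cycle cost with it (per 10 CFR 436.20)."""
        return lcc_without_project - lcc_with_project

    # hypothetical retrofit: life cycle cost falls from $1.2M to $0.95M
    print(net_savings(1_200_000.0, 950_000.0))  # 250000.0
    ```

    A positive result means the retrofit is cost-effective on a life-cycle basis; both inputs are assumed to already be discounted present values.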

  12. Development of a risk index for the prediction of chronic post-surgical pain.

    Science.gov (United States)

    Althaus, A; Hinrichs-Rocker, A; Chapman, R; Arránz Becker, O; Lefering, R; Simanski, C; Weber, F; Moser, K-H; Joppich, R; Trojan, S; Gutzeit, N; Neugebauer, E

    2012-07-01

    The incidence of chronic post-surgical pain (CPSP) after various common operations is 10% to 50%. Identification of patients at risk of developing chronic pain, and the management and prevention of CPSP, remain inadequate. The aim of this study was to develop an easily applicable risk index for the detection of high-risk patients that takes into account the multifactorial aetiology of CPSP. A comprehensive item pool was derived from a systematic literature search. Items that were significant in bivariate analyses were then analysed multivariately, using logistic regression. The items that yielded significant predictors in the multivariate analyses were compiled into an index, and the cut-off score for a high risk of developing CPSP with an optimal trade-off between sensitivity and specificity was identified. The data of 150 patients who underwent different types of surgery were included in the analyses. Six months after surgery, 43.3% of the patients reported CPSP. Five predictors contributed to the multivariate prediction of CPSP: capacity overload, preoperative pain in the operating field, other chronic preoperative pain, post-surgical acute pain and co-morbid stress symptoms. These results suggest that several easily assessable preoperative and perioperative patient characteristics can predict a patient's risk of developing CPSP. The risk index may help caregivers to tailor individual pain management and to assist high-risk patients with pain coping. © 2011 European Federation of International Association for the Study of Pain Chapters.
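    A risk index of this kind is typically built from logistic-regression scores, with the cut-off chosen for the best sensitivity/specificity trade-off. The sketch below uses simulated data and Youden's J as the trade-off criterion (our choice for illustration; the abstract does not specify the criterion used):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(42)
    n = 150
    # five hypothetical binary predictors (standing in for capacity overload,
    # preoperative pain in the operating field, other chronic pain,
    # post-surgical acute pain, and comorbid stress symptoms)
    X = rng.integers(0, 2, size=(n, 5)).astype(float)
    # synthetic outcome loosely driven by the predictors
    logit = -1.5 + X @ np.array([0.9, 0.8, 0.7, 1.0, 0.6])
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    model = LogisticRegression().fit(X, y)
    risk_score = model.predict_proba(X)[:, 1]

    # cut-off with the best sensitivity/specificity trade-off (Youden's J)
    fpr, tpr, thresholds = roc_curve(y, risk_score)
    best = np.argmax(tpr - fpr)
    print("cut-off: %.3f  sens: %.2f  spec: %.2f"
          % (thresholds[best], tpr[best], 1 - fpr[best]))
    ```

    In practice the coefficients would be rounded into integer item weights so that clinicians can sum the index by hand at the bedside.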

  13. Development of a predictive system for SLM product quality

    Science.gov (United States)

    Park, H. S.; Tran, N. H.; Nguyen, D. S.

    2017-08-01

    Recently, layer-by-layer manufacturing, or additive manufacturing (AM), has been used in many application fields. Selective laser melting (SLM) is the most attractive method for building layer by layer from metallic powders. However, industrial application of AM in general, and of SLM in particular, faces barriers because the quality of the manufactured parts is affected by high residual stresses and large deformations. The SLM process is characterized by an intense heat source and fast solidification, which lead to large thermal stresses. The aim of this research is to develop a system for predicting printed part quality during the SLM process by simulation, taking into account the temperature distribution in the workpiece. To implement the system, a model for predicting the temperature distribution was established, and the influences of the process parameters on the temperature distribution were analysed. This thermal model, which relates the printing parameters to the temperature distribution, is used to optimize the printing process parameters. These results are then used to calculate the residual stress and predict the workpiece deformation. The functionality of the proposed predictive system is demonstrated through a case study on aluminium parts manufactured on a MetalSys150 SLM machine.
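    As a schematic stand-in for the kind of thermal model described (the real SLM model is three-dimensional with a moving laser source; the material parameters below are hypothetical), a 1-D explicit finite-difference step for transient heat conduction looks like this:

    ```python
    import numpy as np

    alpha = 5e-5         # thermal diffusivity, m^2/s (hypothetical value)
    dx, dt = 1e-4, 5e-5  # grid spacing (m) and time step (s)
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme stability condition"

    T = np.full(100, 300.0)  # initial temperature field, K
    T[50] = 2000.0           # local "laser" heat spot

    # explicit update: T_i += r * (T_{i+1} - 2 T_i + T_{i-1}); ends held at 300 K
    for _ in range(200):
        T[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])

    print(round(float(T.max()), 1))  # peak temperature diffuses and drops
    ```

    The resulting temperature history is what a residual-stress calculation would consume downstream: steep spatial gradients around the melt spot are precisely what drive the thermal stresses the paper aims to predict.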

  14. Predicting complex acute wound healing in patients from a wound expertise centre registry: a prognostic study

    OpenAIRE

    Ubbink, Dirk T; Lindeboom, Robert; Eskes, Anne M; Brull, Huub; Legemate, Dink A; Vermeulen, Hester

    2015-01-01

    It is important for caregivers and patients to know which wounds are at risk of prolonged wound healing to enable timely communication and treatment. Available prognostic models predict wound healing in chronic ulcers, but not in acute wounds, that is, originating after trauma or surgery. We developed a model to detect which factors can predict (prolonged) healing of complex acute wounds in patients treated in a large wound expertise centre (WEC). Using Cox and linear regression analyses, we ...

  15. A literature review and meta-analyses of cannabis use and suicidality.

    Science.gov (United States)

    Borges, Guilherme; Bagge, Courtney L; Orozco, Ricardo

    2016-05-01

    We lack a review of the epidemiological literature on cannabis use (acute use, and chronic use defined by usual quantity/frequency and heavy use) and suicidality (suicide death, suicide ideation, suicide attempt). The English-language literature on Medline, PsycINFO, Google Scholar, and public-use databases was searched for original articles, critical review reports, and public-use data on cannabis use and suicide for the period from 1990 to February 2015. Odds ratios (OR) from random-effects meta-analyses for any cannabis use and heavy cannabis use were calculated. The acute cannabis-suicidality literature mostly comprises descriptive toxicology reports. In terms of death by suicide, the average positive cannabis rate was 9.50% for studies sampling from all suicides, with higher cannabis detection rates amongst suicide decedents who used non-overdose methods. We found only 4 studies providing estimates for any chronic cannabis use and death by suicide (OR=2.56 (1.25-5.27)). After deleting duplicates, we found 6 studies on any cannabis use and suicide ideation (OR=1.43 (1.13-1.83)), 5 studies on heavy cannabis use and suicide ideation (OR=2.53 (1.00-6.39)), 6 studies on any cannabis use and suicide attempt (OR=2.23 (1.24-4.00)) and 6 studies on heavy cannabis use and suicide attempt (OR=3.20 (1.72-5.94)). We currently lack evidence that acute cannabis use increases imminent risk for suicidality. The evidence tends to support that chronic cannabis use can predict suicidality, but the lack of homogeneity in the measurement of cannabis exposure and, in some instances, the lack of systematic control for known risk factors tempered this finding. Copyright © 2016 Elsevier B.V. All rights reserved.
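    The pooled odds ratios above come from random-effects meta-analysis; a compact DerSimonian-Laird sketch, applied to invented study-level numbers (not the studies reviewed here):

    ```python
    import numpy as np

    def pooled_or_dl(ors, ci_los, ci_his):
        """DerSimonian-Laird random-effects pooled odds ratio from
        study ORs and their 95% confidence intervals."""
        y = np.log(ors)
        # standard errors recovered from the 95% CIs on the log scale
        se = (np.log(ci_his) - np.log(ci_los)) / (2 * 1.96)
        w = 1 / se**2                                  # fixed-effect weights
        y_fe = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fe)**2)                  # Cochran's Q
        df = len(y) - 1
        tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1 / (se**2 + tau2)                      # random-effects weights
        return float(np.exp(np.sum(w_re * y) / np.sum(w_re)))

    # hypothetical study-level ORs (with 95% CIs) for illustration only
    print(round(pooled_or_dl([1.8, 2.5, 3.1], [1.1, 1.3, 1.6], [2.9, 4.8, 6.0]), 2))
    ```

    The between-study variance `tau2` is what distinguishes the random-effects pooling from a fixed-effect analysis: when heterogeneity is high, it widens the weights toward equality across studies.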

  16. Prediction Models for Assessing Lycopene in Open-Field Cultivated Tomatoes by Means of a Portable Reflectance Sensor: Cultivar and Growing-Season Effects.

    Science.gov (United States)

    Ciaccheri, Leonardo; Tuccio, Lorenza; Mencaglia, Andrea A; Sikorska-Zimny, Kalina; Hallmann, Ewelina; Kowalski, Artur; G Mignani, Anna; Kaniszewski, Stanislaw; Agati, Giovanni

    2018-05-09

    Reflectance spectroscopy represents a useful tool for the nondestructive assessment of tomato lycopene, even in the field. For this reason, a compact, low-cost, light-emitting-diode-based sensor has been developed to measure reflectance in the 400-750 nm spectral range. It was calibrated against wet chemistry and evaluated by partial least squares (PLS) regression analyses. The lycopene prediction models were defined for two open-field cultivated red-tomato varieties over a period of two consecutive years: the processing oblong tomatoes of cv. Calista (average weight: 76 g) and the fresh-consumption round tomatoes of cv. Volna (average weight: 130 g). The lycopene prediction models were dependent on both cultivar and season. The lycopene root mean square error of prediction produced by the 2014 single-cultivar calibrations validated on the 2015 samples was large (33 mg kg−1) in the Calista tomatoes and acceptable (9.5 mg kg−1) in the Volna tomatoes. A more general bicultivar and biyear model could still explain almost 80% of the predicted lycopene variance, with a relative error in red tomatoes of less than 20%. In 2016, in-field applications of the multiseasonal prediction models built with the 2014 and 2015 data gave statistically significant estimates of the lycopene in the crop on two sampling dates that were 20 days apart: on August 19 and September 7, 2016, the lycopene was 98.9 ± 9.3 and 92.2 ± 10.8 mg kg−1 FW for cv. Calista and 54.6 ± 13.2 and 60.8 ± 6.8 mg kg−1 FW for cv. Volna. The sensor was also able to monitor the temporal evolution of lycopene accumulation on the very same fruits attached to the plants. These results indicated that a simple, compact reflectance device and PLS analysis could provide adequately precise and robust (through-seasons) models for the nondestructive assessment of lycopene in whole tomatoes. This technique could guarantee tomatoes with the highest nutraceutical value from the production, during storage and
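    PLS calibration of a reflectance sensor against wet chemistry can be sketched as follows, with fully synthetic spectra and lycopene values standing in for the real measurements:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    n_samples, n_bands = 60, 12
    # synthetic reflectance "spectra" (e.g. 12 LED bands across 400-750 nm)
    spectra = rng.normal(size=(n_samples, n_bands))
    # synthetic lycopene content (mg/kg), linearly tied to the bands plus noise
    lycopene = 80 + (spectra @ rng.normal(size=n_bands)) * 10 \
        + rng.normal(scale=2.0, size=n_samples)

    pls = PLSRegression(n_components=4).fit(spectra, lycopene)
    predicted = pls.predict(spectra).ravel()
    rmse = float(np.sqrt(np.mean((predicted - lycopene) ** 2)))
    print(round(rmse, 2))  # in-sample calibration error
    ```

    The cultivar- and season-dependence reported above corresponds to the model coefficients shifting between datasets, which is why the multiseasonal calibration pooled the 2014 and 2015 data before being applied in 2016.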

  17. Factors Predicting Pre-Service Teachers' Adoption of Web 2.0 Technologies

    Science.gov (United States)

    Cheon, Jongpil; Coward, Fanni; Song, Jaeki; Lim, Sunho

    2012-01-01

    Classrooms full of "digital natives" represent the norm in U.S. schools, but like their predecessors, they mostly inhabit spaces characterized by a traditional view of teaching and learning. Understanding the contributors to this mismatch, and especially teachers' role, is critical as Web 2.0 technologies enable greater learner…

  18. A Simple, Reliable Precision Time Analyser

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, B. V.; Nargundkar, V. R.; Subbarao, K.; Kamath, M. S.; Eligar, S. K. [Atomic Energy Establishment Trombay, Bombay (India)

    1966-06-15

    A 30-channel time analyser is described. The time analyser was designed and built for pulsed neutron research but can be applied to other uses. Most of the logic is performed by means of ferrite memory cores and transistor switching circuits. This leads to great versatility, low power consumption, extreme reliability and low cost. The analyser described provides channel widths from 10 µs to 10 ms; arbitrarily wider channels are easily obtainable. It can handle counting rates up to 2000 counts/min in each channel with less than 1% dead-time loss. There is a provision for an initial delay equal to 100 channel widths. An input pulse derandomizer unit using tunnel diodes ensures exactly equal channel widths. A brief description of the principles involved in core switching circuitry is given. The core-transistor transfer loop is compared with the usual core-diode loops and is shown to be more versatile and better adapted to the making of a time analyser. The circuits derived from the basic loop are described. These include the scale of ten, the frequency dividers and the delay generator. The current drivers developed for driving the cores are described. The crystal-controlled clock which controls the width of the time channels and synchronizes the operation of the various circuits is described. The detector pulse derandomizer unit using tunnel diodes is described. The scheme of the time analyser is then described, showing how the various circuits can be integrated to form a versatile time analyser. (author)
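    The analyser's channel logic (equal-width channels opened after an initial delay of 100 channel widths) is easy to mirror in software; a sketch with hypothetical event timestamps:

    ```python
    import numpy as np

    def time_analyse(event_times_us, channel_width_us=10.0, n_channels=30,
                     initial_delay_channels=100):
        """Software sketch of a multichannel time analyser: events are sorted
        into n_channels equal-width bins after an initial delay equal to
        100 channel widths, mirroring the instrument described."""
        delay = initial_delay_channels * channel_width_us
        edges = delay + channel_width_us * np.arange(n_channels + 1)
        counts, _ = np.histogram(event_times_us, bins=edges)
        return counts

    # events at 1005, 1012, 1012.5 and 1299 us; 10 us channels after a 1000 us delay
    counts = time_analyse([1005.0, 1012.0, 1012.5, 1299.0])
    print(counts[0], counts[1], counts[-1])  # 1 2 1
    ```

    The hardware derandomizer's job, buffering near-coincident pulses so none are lost at a channel boundary, has no software analogue here because timestamps are assumed to be already recorded.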

  19. A new approach to analyse longitudinal epidemiological data with an excess of zeros.

    Science.gov (United States)

    Spriensma, Alette S; Hajos, Tibor R S; de Boer, Michiel R; Heymans, Martijn W; Twisk, Jos W R

    2013-02-20

    Within longitudinal epidemiological research, 'count' outcome variables with an excess of zeros frequently occur. Although these outcomes are frequently analysed with a linear mixed model or a Poisson mixed model, a two-part mixed model is better suited to outcome variables with an excess of zeros. The objective of this paper was therefore to introduce the relatively 'new' method of two-part joint regression modelling in longitudinal data analysis for outcome variables with an excess of zeros, and to compare the performance of this method to current approaches. Within an observational longitudinal dataset, we compared three techniques: two 'standard' approaches (a linear mixed model and a Poisson mixed model) and a two-part joint mixed model (a binomial/Poisson mixed distribution model), including random intercepts and random slopes. Model fit indicators and differences between predicted and observed values were used for comparisons. The analyses were performed with Stata using the GLLAMM procedure. Regarding the random intercept models, the two-part joint mixed model (binomial/Poisson) performed best. Adding random slopes for time changed the sign of the regression coefficient for both the Poisson mixed model and the two-part joint mixed model (binomial/Poisson) and resulted in a much better fit. This paper showed that a two-part joint mixed model is a more appropriate method for analysing longitudinal data with an excess of zeros than a linear mixed model or a Poisson mixed model, although in a model with random slopes for time a Poisson mixed model also performed remarkably well.
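The two-part idea in the record above can be sketched on simple cross-sectional data, dropping the random effects and the zero-truncation correction for brevity: a logistic model for zero versus positive, then a Poisson model on the positive counts. The simulated data, the coefficients and the small Newton-Raphson GLM fitter below are illustrative assumptions, not the authors' GLLAMM implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

# Simulate a two-part process: a logistic "zero vs. positive" part and a
# zero-truncated Poisson count part for the positive observations.
p_pos = 1.0 / (1.0 + np.exp(-(-0.5 + 1.0 * x)))   # P(y > 0)
lam = np.exp(0.2 + 0.6 * x)                       # rate of the count part
positive = rng.random(n) < p_pos
y = np.zeros(n)
for i in np.flatnonzero(positive):
    k = 0
    while k == 0:                                 # rejection-sample a zero-truncated Poisson
        k = rng.poisson(lam[i])
    y[i] = k

def newton_glm(X, y, mean_fn, var_fn, iters=25):
    """Newton-Raphson fit of a canonical-link GLM (tiny, illustrative)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = mean_fn(X @ beta)
        w = var_fn(mu)
        beta += np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (y - mu))
    return beta

# Part 1: logistic regression on the zero/positive indicator.
b_zero = newton_glm(X, positive.astype(float),
                    mean_fn=lambda eta: 1.0 / (1.0 + np.exp(-eta)),
                    var_fn=lambda mu: mu * (1.0 - mu))

# Part 2: Poisson regression on the positive counts only (a full two-part
# model would maximise a zero-truncated Poisson likelihood here).
b_count = newton_glm(X[positive], y[positive], mean_fn=np.exp, var_fn=lambda mu: mu)

print("zero part ", b_zero)    # close to the simulated [-0.5, 1.0]
print("count part", b_count)   # truncation biases these away from [0.2, 0.6]
```

Fitting the zero part and the count part jointly, with correlated random effects across the two parts, is exactly what the paper's binomial/Poisson joint mixed model adds on top of this sketch.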

  20. [Methods, challenges and opportunities for big data analyses of microbiome].

    Science.gov (United States)

    Sheng, Hua-Fang; Zhou, Hong-Wei

    2015-07-01

    Microbiome is a novel research field related to a variety of chronic inflammatory diseases. Technically, there are two major approaches to microbiome analysis: metataxonomics, by sequencing the 16S rRNA variable tags, and metagenomics, by shotgun sequencing of the total microbial (mainly bacterial) genome mixture. The 16S rRNA sequencing analysis pipeline includes sequence quality control, diversity analyses, taxonomy and statistics; metagenome analysis further includes gene annotation and functional analyses. As sequencing techniques develop and the cost of sequencing decreases, big data analyses will become the central task. Data standardization, accumulation, modeling and disease prediction are crucial for the future exploitation of these data. Meanwhile, the informational content of these data, and functional verification with culture-dependent and culture-independent experiments, remain the focus of future research. Studies of the human microbiome will bring a better understanding of the relations between the human body and the microbiome, especially in the context of disease diagnosis and therapy, which promise rich research opportunities.

  1. Comparative transcriptome analyses of three medicinal Forsythia species and prediction of candidate genes involved in secondary metabolisms.

    Science.gov (United States)

    Sun, Luchao; Rai, Amit; Rai, Megha; Nakamura, Michimi; Kawano, Noriaki; Yoshimatsu, Kayo; Suzuki, Hideyuki; Kawahara, Nobuo; Saito, Kazuki; Yamazaki, Mami

    2018-05-07

    The three Forsythia species, F. suspensa, F. viridissima and F. koreana, have been used as herbal medicines in China, Japan and Korea for centuries, and they are known to be rich sources of numerous pharmaceutical metabolites: forsythin, forsythoside A, arctigenin, rutin and other phenolic compounds. In this study, de novo transcriptome sequencing and assembly were performed on these species. Using leaf and flower tissues of F. suspensa, F. viridissima and F. koreana, 1.28-2.45 Gbp of Illumina paired-end reads were obtained and assembled into 81,913, 88,491 and 69,458 unigenes, respectively. Classification of the annotated unigenes into gene ontology terms and KEGG pathways was used to compare the transcriptomes of the three Forsythia species. Expression analysis of orthologous genes across all three species showed that expression in leaf tissues was highly correlated. Candidate genes presumably involved in the biosynthetic pathways of lignans and phenylethanoid glycosides were screened as co-expressed genes; they are highly expressed in the leaves of F. viridissima and F. koreana. Furthermore, three unigenes annotated as acyltransferases were predicted to be associated with the biosynthesis of acteoside and forsythoside A, based on their expression patterns and phylogenetic analysis. This study is the first report of comparative transcriptome analyses of the medicinally important genus Forsythia and will serve as an important resource to facilitate further studies on the biosynthesis and regulation of therapeutic compounds in Forsythia species.

  2. Model-free prediction and regression a transformation-based approach to inference

    CERN Document Server

    Politis, Dimitris N

    2015-01-01

    The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, co...
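A toy illustration of the transformation idea described above, under assumptions not taken from the monograph: reduce the data to approximately i.i.d. residuals, then bootstrap those residuals to form a frequentist prediction interval without resorting to normality. The data-generating model and numbers are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: linear signal plus skewed (centred exponential) noise, where a
# normality-based prediction interval would get the asymmetry wrong.
n = 200
x = rng.uniform(0.0, 10.0, n)
y = 2.0 * x + rng.exponential(1.0, n) - 1.0

# Step 1: transform the data to (approximately) i.i.d. quantities --
# here, ordinary regression residuals.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# Step 2: bootstrap the residuals to form a prediction interval at x_new.
x_new = 5.0
point = beta[0] + beta[1] * x_new
draws = point + rng.choice(resid, size=2000, replace=True)
lo, hi = np.percentile(draws, [2.5, 97.5])

print(f"point prediction {point:.2f}, 95% PI [{lo:.2f}, {hi:.2f}]")
```

The resulting interval inherits the skewness of the noise (much longer on the upper side), which a Gaussian interval around the point prediction would miss.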

  3. Development of steady-state model for MSPT and detailed analyses of receiver

    Science.gov (United States)

    Yuasa, Minoru; Sonoda, Masanori; Hino, Koichi

    2016-05-01

    Molten salt parabolic trough system (MSPT) uses molten salt as the heat transfer fluid (HTF) instead of synthetic oil. A demonstration plant of the MSPT was constructed by Chiyoda Corporation and Archimede Solar Energy in Italy in 2013. Chiyoda Corporation developed a steady-state model for predicting the theoretical behavior of the demonstration plant. The model calculates the concentrated solar power and heat loss using ray tracing of the incident solar light and finite element modeling of the thermal energy transferred into the medium. This report describes the verification of the model using test data from the demonstration plant, along with detailed analyses of the relation between flow rate and the temperature difference across the receiver's metal tube, and of the effect of defocus angle on the concentrated power rate, for solar collector assembly (SCA) development. The model is accurate to within 2.0% systematic error and 4.2% random error. The relationships between flow rate and the temperature difference on the metal tube, and the effect of defocus angle on the concentrated power rate, are shown.

  4. A Prediction Model for ROS1-Rearranged Lung Adenocarcinomas based on Histologic Features

    OpenAIRE

    Zhou, Jianya; Zhao, Jing; Zheng, Jing; Kong, Mei; Sun, Ke; Wang, Bo; Chen, Xi; Ding, Wei; Zhou, Jianying

    2016-01-01

    Aims: To identify the clinical and histological characteristics of ROS1-rearranged non-small-cell lung carcinomas (NSCLCs) and build a prediction model to prescreen suitable patients for molecular testing. Methods and Results: We identified 27 cases of ROS1-rearranged lung adenocarcinomas in 1165 patients with NSCLCs confirmed by real-time PCR and FISH and performed univariate and multivariate analyses to identify predictive factors associated with ROS1 rearrangement and finally developed predi...

  5. GIMDA: Graphlet interaction-based MiRNA-disease association prediction.

    Science.gov (United States)

    Chen, Xing; Guan, Na-Na; Li, Jian-Qiang; Yan, Gui-Ying

    2018-03-01

    MicroRNAs (miRNAs) have been confirmed by many experimental studies to be closely related to various human complex diseases. It is necessary and valuable to develop powerful and effective computational models to predict potential associations between miRNAs and diseases. In this work, we presented a prediction model of Graphlet Interaction for MiRNA-Disease Association prediction (GIMDA) by integrating the disease semantic similarity, miRNA functional similarity, Gaussian interaction profile kernel similarity and the experimentally confirmed miRNA-disease associations. The related score of a miRNA to a disease was calculated by measuring the graphlet interactions between two miRNAs or two diseases. The novelty of GIMDA lies in using graphlet interaction to analyse the complex relationships between two nodes in a graph. The AUCs of GIMDA in global and local leave-one-out cross-validation (LOOCV) turned out to be 0.9006 and 0.8455, respectively. The average result of five-fold cross-validation reached 0.8927 ± 0.0012. In case studies of colon neoplasms, kidney neoplasms and prostate neoplasms based on the HMDD V2.0 database, 45, 45 and 41 of the top 50 potential miRNAs predicted by GIMDA were validated by dbDEMC and miR2Disease. Additionally, in the case study of new diseases without any known associated miRNAs and the case study of predicting potential miRNA-disease associations using HMDD V1.0, high percentages of the top 50 miRNAs were also verified by the experimental literature. © 2017 The Authors. Journal of Cellular and Molecular Medicine published by John Wiley & Sons Ltd and Foundation for Cellular and Molecular Medicine.
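One ingredient named above, the Gaussian interaction profile (GIP) kernel similarity, has a standard definition in this literature: K(i, j) = exp(-γ‖IP(i) − IP(j)‖²), with the bandwidth γ normalised by the mean squared profile norm. The sketch below follows that common definition; the tiny association matrix and the bandwidth constant are illustrative assumptions, not GIMDA's actual data or settings.

```python
import numpy as np

def gip_kernel(A, gamma_prime=1.0):
    """Gaussian interaction profile (GIP) kernel similarity between the rows
    (interaction profiles) of a binary association matrix A."""
    norms_sq = (A * A).sum(axis=1)
    gamma = gamma_prime / norms_sq.mean()          # bandwidth normalised by mean profile norm
    sq_dists = norms_sq[:, None] + norms_sq[None, :] - 2.0 * A @ A.T
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))

# Tiny illustrative association matrix: 4 miRNAs x 3 diseases.
A = np.array([[1, 0, 1],
              [1, 0, 1],
              [0, 1, 0],
              [1, 1, 0]], dtype=float)
K = gip_kernel(A)
print(K)   # rows 0 and 1 share a profile, so K[0, 1] = 1
```

Applying the same function to the transposed matrix gives the disease-side GIP similarity, which methods like GIMDA combine with semantic and functional similarities.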

  6. Designing a Prognostic Scoring System for Predicting the Outcomes of Proximal Fifth Metatarsal Fractures at 20 Weeks

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Tahririan

    2015-03-01

    Full Text Available Background: Fractures of the proximal fifth metatarsal bone are among the most common fractures observed in the foot, and their classification and management have been subject to much discussion and disagreement. In this study, we aim to identify and quantify the effect of possible predictors of the outcome of the treatment of proximal fifth metatarsal fractures. Methods: Patients with established proximal fifth metatarsal fractures were enrolled in this prospective cohort, and the outcome of their treatment was assessed using the AOFAS midfoot scale at 6 and 20 weeks. Results: 143 patients were included in the study. Our study showed that displacement, weight and type III fractures were significant independent predictors of poor outcome at 6 weeks, while at 20 weeks gender and diabetes mellitus were also shown to be significant independent predictors in addition to these factors. A scoring system was designed by assigning weights to these factors, and it was shown to be a strong predictor of outcome at 20 weeks. Conclusion: We recommend our scoring system to help surgeons decide whether a patient's prognostic factors are significant enough to opt for a surgical rather than a conservative approach to treatment.

  7. PSPP: a protein structure prediction pipeline for computing clusters.

    Directory of Open Access Journals (Sweden)

    Michael S Lee

    2009-07-01

    Full Text Available Protein structures are critical for understanding the mechanisms of biological systems and, subsequently, for drug and vaccine design. Unfortunately, protein sequence data exceed structural data by a factor of more than 200 to 1. This gap can be partially filled by using computational protein structure prediction. While structure prediction Web servers are a notable option, they often restrict the number of sequence queries and/or provide a limited set of prediction methodologies. Therefore, we present a standalone protein structure prediction software package suitable for high-throughput structural genomic applications that performs all three classes of prediction methodologies: comparative modeling, fold recognition, and ab initio. This software can be deployed on a user's own high-performance computing cluster. The pipeline consists of a Perl core that integrates more than 20 individual software packages and databases, most of which are freely available from other research laboratories. The query protein sequences are first divided into domains either by domain boundary recognition or Bayesian statistics. The structures of the individual domains are then predicted using template-based modeling or ab initio modeling. The predicted models are scored with a statistical potential and an all-atom force field. The top-scoring ab initio models are annotated by structural comparison against the Structural Classification of Proteins (SCOP) fold database. Furthermore, secondary structure, solvent accessibility, transmembrane helices, and structural disorder are predicted. The results are generated in text, tab-delimited, and hypertext markup language (HTML) formats.
So far, the pipeline has been used to study viral and bacterial proteomes.The standalone pipeline that we introduce here, unlike protein structure prediction Web servers, allows users to devote their own computing assets to process a potentially unlimited number of queries as well as perform

  8. A survey on computational intelligence approaches for predictive modeling in prostate cancer

    OpenAIRE

    Cosma, G; Brown, D; Archer, M; Khan, M; Pockley, AG

    2017-01-01

    Predictive modeling in medicine involves the development of computational models which are capable of analysing large amounts of data in order to predict healthcare outcomes for individual patients. Computational intelligence approaches are suitable when the data to be modelled are too complex for conventional statistical techniques to process quickly and efficiently. These advanced approaches are based on mathematical models that have been especially developed for dealing with the uncertainty an...

  9. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

    Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.

    2005-01-01

    Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observation System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between ~30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied. 
The choice of which analysis to use is most critical for detailed transport

  10. A fast and robust iterative algorithm for prediction of RNA pseudoknotted secondary structures

    Science.gov (United States)

    2014-01-01

    Background Improving accuracy and efficiency of computational methods that predict pseudoknotted RNA secondary structures is an ongoing challenge. Existing methods based on free energy minimization tend to be very slow and are limited in the types of pseudoknots that they can predict. Incorporating known structural information can improve prediction accuracy; however, there are not many methods for prediction of pseudoknotted structures that can incorporate structural information as input. There is even less understanding of the relative robustness of these methods with respect to partial information. Results We present a new method, Iterative HFold, for pseudoknotted RNA secondary structure prediction. Iterative HFold takes as input a pseudoknot-free structure, and produces a possibly pseudoknotted structure whose energy is at least as low as that of any (density-2) pseudoknotted structure containing the input structure. Iterative HFold leverages strengths of earlier methods, namely the fast running time of HFold, a method that is based on the hierarchical folding hypothesis, and the energy parameters of HotKnots V2.0. Our experimental evaluation on a large data set shows that Iterative HFold is robust with respect to partial information, with average accuracy on pseudoknotted structures steadily increasing from roughly 54% to 79% as the user provides up to 40% of the input structure. Iterative HFold is much faster than HotKnots V2.0, while having comparable accuracy. Iterative HFold also has significantly better accuracy than IPknot on our HK-PK and IP-pk168 data sets. Conclusions Iterative HFold is a robust method for prediction of pseudoknotted RNA secondary structures, whose accuracy with more than 5% information about true pseudoknot-free structures is better than that of IPknot, and with about 35% information about true pseudoknot-free structures compares well with that of HotKnots V2.0 while being significantly faster. 
Iterative HFold and all data used in

  11. A REDEFINITION OF TELEWORK THROUGH CLOUD COMPUTING - TELEWORK 2.0

    Directory of Open Access Journals (Sweden)

    B. Ghilic-Micu

    2016-11-01

    Full Text Available In this paper we analyse two paradigms, telework and cloud computing, from the perspective of the mutual recursivity between them. The main purpose of this scientific endeavour is to determine the level of support each paradigm provides for the other and the synergic effect generated by their interdependence. We approach functional, juridical and environmental issues. As a result, we aim to highlight how cloud computing solutions may revolutionize everything that telework is, and how telework may be redefined through the transition to a superior level, called Telework 2.0.

  12. Spent-fuel composition: a comparison of predicted and measured data

    International Nuclear Information System (INIS)

    Thomas, C.C. Jr.; Cobb, D.D.; Ostenak, C.A.

    1981-03-01

    The uncertainty in predictions of the nuclear materials content of spent light-water reactor fuel was investigated to obtain guidelines for nondestructive spent-fuel verification and assay. Values predicted by the reactor operator were compared with measured values from fuel reprocessors for six reactors (three PWR and three BWR). The study indicates that total uranium, total plutonium, fissile uranium, fissile plutonium, and total fissile content can be predicted with biases ranging from 1 to 6% and variabilities (1-sigma) ranging from 2 to 7%. The higher values generally are associated with BWRs. Based on the results of this study, nondestructive assay measurements that are accurate and precise to 5 to 10% (1-sigma) or better should be useful for quantitative analyses of typical spent fuel.

  13. Summary of dynamic analyses of the advanced neutron source reactor inner control rods

    International Nuclear Information System (INIS)

    Hendrich, W.R.

    1995-08-01

    A summary of the structural dynamic analyses that were instrumental in providing design guidance to the Advanced Neutron Source (ANS) inner control element system is presented in this report. The structural analyses and the functional constraints that required certain performance parameters were combined to shape and guide the design effort toward a prediction of successful and reliable control and scram operation to be provided by these inner control rods.

  14. Rockslide and Impulse Wave Modelling in the Vajont Reservoir by DEM-CFD Analyses

    Science.gov (United States)

    Zhao, T.; Utili, S.; Crosta, G. B.

    2016-06-01

    This paper investigates the generation of hydrodynamic water waves due to rockslides plunging into a water reservoir. Quasi-3D DEM analyses in plane strain by a coupled DEM-CFD code are adopted to simulate the rockslide from its onset to the impact with the still water and the subsequent generation of the wave. The employed numerical tools and upscaling of hydraulic properties allow predicting a physical response in broad agreement with the observations notwithstanding the assumptions and characteristics of the adopted methods. The results obtained by the DEM-CFD coupled approach are compared to those published in the literature and those presented by Crosta et al. (Landslide spreading, impulse waves and modelling of the Vajont rockslide. Rock mechanics, 2014) in a companion paper obtained through an ALE-FEM method. Analyses performed along two cross sections are representative of the limit conditions of the eastern and western slope sectors. The maximum average rockslide velocity and the water wave velocity reach ca. 22 and 20 m/s, respectively. The maximum computed run-up amounts to ca. 120 and 170 m for the eastern and western lobe cross sections, respectively. These values are reasonably similar to those recorded during the event (i.e. ca. 130 and 190 m, respectively). Therefore, the overall study lays out a possible DEM-CFD framework for the modelling of the generation of the hydrodynamic wave due to the impact of a rapid moving rockslide or rock-debris avalanche.

  15. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.; Mai, Paul Martin; Thingbaijam, Kiran Kumar; Razafindrakoto, H. N. T.; Genton, Marc G.

    2014-01-01

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths, and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.
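A minimal sketch of the loss-differential idea behind the SPCT, on simulated slip fields: compute a point-wise loss of each competing model against the reference, take the difference field, and test whether its mean differs from zero. Everything here (the fields, the squared-error loss, and especially the naive z-statistic, which the published test replaces with a variance estimate accounting for spatial correlation) is an assumption-laden simplification, not the published procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Reference slip model on a 2-D fault grid, plus two competing "inversions":
# model_a scatters around the reference, model_b is systematically biased.
ref = rng.gamma(2.0, 1.0, size=(20, 30))
model_a = ref + rng.normal(0.0, 0.3, ref.shape)
model_b = ref + rng.normal(0.5, 0.3, ref.shape)

def loss_differential(ref, m1, m2, loss=lambda r, m: (r - m) ** 2):
    """Point-wise difference of losses against the reference field."""
    return loss(ref, m1) - loss(ref, m2)

D = loss_differential(ref, model_a, model_b).ravel()
# Naive z-statistic on the mean differential; ignoring spatial correlation in
# D makes this anti-conservative relative to the actual SPCT.
z = D.mean() / (D.std(ddof=1) / np.sqrt(D.size))
print(f"mean loss differential {D.mean():.3f}, z = {z:.1f}")  # negative => model_a closer
```

Swapping in other loss functions (absolute error, correlation-based losses) changes which aspects of the slip distribution drive the ranking, which is the flexibility the SPCT exploits.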

  16. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.

    2014-11-10

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths, and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.

  17. Species classifier choice is a key consideration when analysing low-complexity food microbiome data.

    Science.gov (United States)

    Walsh, Aaron M; Crispie, Fiona; O'Sullivan, Orla; Finnegan, Laura; Claesson, Marcus J; Cotter, Paul D

    2018-03-20

    The use of shotgun metagenomics to analyse low-complexity microbial communities in foods has the potential to be of considerable fundamental and applied value. However, there is currently no consensus with respect to choice of species classification tool, platform, or sequencing depth. Here, we benchmarked the performances of three high-throughput short-read sequencing platforms, the Illumina MiSeq, NextSeq 500, and Ion Proton, for shotgun metagenomics of food microbiota. Briefly, we sequenced six kefir DNA samples and a mock community DNA sample, the latter constructed by evenly mixing genomic DNA from 13 food-related bacterial species. A variety of bioinformatic tools were used to analyse the data generated, and the effects of sequencing depth on these analyses were tested by randomly subsampling reads. Compositional analysis results were consistent between the platforms at divergent sequencing depths. However, we observed pronounced differences in the predictions from species classification tools. Indeed, PERMANOVA indicated that there were no significant differences between the compositional results generated by the different sequencers (p = 0.693, R² = 0.011), but there was a significant difference between the results predicted by the species classifiers (p = 0.01, R² = 0.127). The relative abundances predicted by the classifiers, apart from MetaPhlAn2, were apparently biased by reference genome sizes. Additionally, we observed varying false-positive rates among the classifiers. MetaPhlAn2 had the lowest false-positive rate, whereas SLIMM had the greatest false-positive rate. Strain-level analysis results were also similar across platforms. Each platform correctly identified the strains present in the mock community, but accuracy was improved slightly with greater sequencing depth. Notably, PanPhlAn detected the dominant strains in each kefir sample above 500,000 reads per sample. Again, the outputs from functional profiling analysis using
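The depth-testing step described above, randomly subsampling reads and re-checking the compositional profile, can be sketched as follows. The species labels and 70/20/10 proportions are a made-up mock community for illustration, not the 13-species mock used in the study.

```python
import collections
import random

def subsample(labels, depth, seed=42):
    """Randomly subsample read labels without replacement to a fixed depth."""
    rng = random.Random(seed)
    return rng.sample(labels, depth)

def composition(labels):
    """Relative abundance of each species among the classified reads."""
    counts = collections.Counter(labels)
    total = sum(counts.values())
    return {sp: c / total for sp, c in counts.items()}

# Made-up mock community: three species at 70/20/10.
rng = random.Random(0)
reads = rng.choices(["L.lactis", "L.kefiranofaciens", "S.thermophilus"],
                    weights=[70, 20, 10], k=1_000_000)

full = composition(reads)
shallow = composition(subsample(reads, 50_000))
for sp in full:
    print(f"{sp}: full {full[sp]:.3f} vs subsampled {shallow[sp]:.3f}")
```

For evenly abundant taxa the subsampled composition stays close to the full-depth one, which is consistent with the paper's finding that compositional results were stable across depths while classifier choice mattered far more.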

  18. A predictive coding account of bistable perception - a model-based fMRI study.

    Science.gov (United States)

    Weilnhammer, Veith; Stuke, Heiner; Hesselmann, Guido; Sterzer, Philipp; Schmack, Katharina

    2017-05-01

    In bistable vision, subjective perception wavers between two interpretations of a constant ambiguous stimulus. This dissociation between conscious perception and sensory stimulation has motivated various empirical studies on the neural correlates of bistable perception, but the neurocomputational mechanism behind endogenous perceptual transitions has remained elusive. Here, we recurred to a generic Bayesian framework of predictive coding and devised a model that casts endogenous perceptual transitions as a consequence of prediction errors emerging from residual evidence for the suppressed percept. Data simulations revealed close similarities between the model's predictions and key temporal characteristics of perceptual bistability, indicating that the model was able to reproduce bistable perception. Fitting the predictive coding model to behavioural data from an fMRI-experiment on bistable perception, we found a correlation across participants between the model parameter encoding perceptual stabilization and the behaviourally measured frequency of perceptual transitions, corroborating that the model successfully accounted for participants' perception. Formal model comparison with established models of bistable perception based on mutual inhibition and adaptation, noise or a combination of adaptation and noise was used for the validation of the predictive coding model against the established models. Most importantly, model-based analyses of the fMRI data revealed that prediction error time-courses derived from the predictive coding model correlated with neural signal time-courses in bilateral inferior frontal gyri and anterior insulae. Voxel-wise model selection indicated a superiority of the predictive coding model over conventional analysis approaches in explaining neural activity in these frontal areas, suggesting that frontal cortex encodes prediction errors that mediate endogenous perceptual transitions in bistable perception. 
Taken together, our current work

  19. A predictive coding account of bistable perception - a model-based fMRI study.

    Directory of Open Access Journals (Sweden)

    Veith Weilnhammer

    2017-05-01

    Full Text Available In bistable vision, subjective perception wavers between two interpretations of a constant ambiguous stimulus. This dissociation between conscious perception and sensory stimulation has motivated various empirical studies on the neural correlates of bistable perception, but the neurocomputational mechanism behind endogenous perceptual transitions has remained elusive. Here, we recurred to a generic Bayesian framework of predictive coding and devised a model that casts endogenous perceptual transitions as a consequence of prediction errors emerging from residual evidence for the suppressed percept. Data simulations revealed close similarities between the model's predictions and key temporal characteristics of perceptual bistability, indicating that the model was able to reproduce bistable perception. Fitting the predictive coding model to behavioural data from an fMRI-experiment on bistable perception, we found a correlation across participants between the model parameter encoding perceptual stabilization and the behaviourally measured frequency of perceptual transitions, corroborating that the model successfully accounted for participants' perception. Formal model comparison with established models of bistable perception based on mutual inhibition and adaptation, noise or a combination of adaptation and noise was used for the validation of the predictive coding model against the established models. Most importantly, model-based analyses of the fMRI data revealed that prediction error time-courses derived from the predictive coding model correlated with neural signal time-courses in bilateral inferior frontal gyri and anterior insulae. Voxel-wise model selection indicated a superiority of the predictive coding model over conventional analysis approaches in explaining neural activity in these frontal areas, suggesting that frontal cortex encodes prediction errors that mediate endogenous perceptual transitions in bistable perception. 
Taken together

  20. Analyser-based phase contrast image reconstruction using geometrical optics.

    Science.gov (United States)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution, provided the conditions of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
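
    The symmetric Pearson type VII profile used for the rocking-curve fit can be sketched generically (the parameterisation below — height h, centre c, width w, shape m — is a common textbook form, not necessarily the exact one used in the paper):

```python
def pearson_vii(x, h, c, w, m):
    """Symmetric Pearson type VII profile: interpolates between a
    Lorentzian peak (m = 1) and, as m grows large, a Gaussian-like peak."""
    return h * (1.0 + ((x - c) / w) ** 2) ** (-m)

# Peak height h at the centre c, symmetric about it; for m = 1 the
# half-maximum sits at x = c +/- w (the Lorentzian half-width).
peak = pearson_vii(0.0, 1.0, 0.0, 0.5, 1.0)  # 1.0
half = pearson_vii(0.5, 1.0, 0.0, 0.5, 1.0)  # 0.5
```

    Fitting a measured rocking curve then amounts to choosing h, c, w and m (e.g. by least squares) so this profile matches the data.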

  1. Deep learning predictions of survival based on MRI in amyotrophic lateral sclerosis.

    Science.gov (United States)

    van der Burgh, Hannelore K; Schmidt, Ruben; Westeneng, Henk-Jan; de Reus, Marcel A; van den Berg, Leonard H; van den Heuvel, Martijn P

    2017-01-01

    Amyotrophic lateral sclerosis (ALS) is a progressive neuromuscular disease, with large variation in survival between patients. Currently, it remains rather difficult to predict survival based on clinical parameters alone. Here, we set out to use clinical characteristics in combination with MRI data to predict survival of ALS patients using deep learning, a machine learning technique highly effective in a broad range of big-data analyses. A group of 135 ALS patients was included from whom high-resolution diffusion-weighted and T1-weighted images were acquired at the first visit to the outpatient clinic. Next, each of the patients was monitored carefully and survival time to death was recorded. Patients were labeled as short, medium or long survivors, based on their recorded time to death as measured from the time of disease onset. In the deep learning procedure, the total group of 135 patients was split into a training set for deep learning (n = 83 patients), a validation set (n = 20) and an independent evaluation set (n = 32) to evaluate the performance of the obtained deep learning networks. Deep learning based on clinical characteristics predicted survival category correctly in 68.8% of the cases. Deep learning based on MRI predicted 62.5% correctly using structural connectivity and 62.5% using brain morphology data. Notably, when we combined the three sources of information, deep learning prediction accuracy increased to 84.4%. Taken together, our findings show the added value of MRI with respect to predicting survival in ALS, demonstrating the advantage of deep learning in disease prognostication.

  2. Refining Prediction in Treatment-Resistant Depression: Results of Machine Learning Analyses in the TRD III Sample.

    Science.gov (United States)

    Kautzky, Alexander; Dold, Markus; Bartova, Lucie; Spies, Marie; Vanicek, Thomas; Souery, Daniel; Montgomery, Stuart; Mendlewicz, Julien; Zohar, Joseph; Fabbri, Chiara; Serretti, Alessandro; Lanzenberger, Rupert; Kasper, Siegfried

    The study objective was to generate a prediction model for treatment-resistant depression (TRD) using machine learning featuring a large set of 47 clinical and sociodemographic predictors of treatment outcome. A total of 552 patients diagnosed with major depressive disorder (MDD) according to DSM-IV criteria were enrolled between 2011 and 2016. TRD was defined as failure to reach response to antidepressant treatment, characterized by a Montgomery-Asberg Depression Rating Scale (MADRS) score below 22 after at least 2 antidepressant trials of adequate length and dosage were administered. RandomForest (RF) was used for predicting treatment outcome phenotypes in a 10-fold cross-validation. The full model with 47 predictors yielded an accuracy of 75.0%. When the number of predictors was reduced to 15, accuracies between 67.6% and 71.0% were attained for different test sets. The most informative predictors of treatment outcome were baseline MADRS score for the current episode; impairment of family, social, and work life; the timespan between first and last depressive episode; severity; suicidal risk; age; body mass index; and the number of lifetime depressive episodes as well as lifetime duration of hospitalization. With the application of the machine learning algorithm RF, an efficient prediction model with an accuracy of 75.0% for forecasting treatment outcome could be generated, thus surpassing the predictive capabilities of clinical evaluation. We also supply a simplified algorithm of 15 easily collected clinical and sociodemographic predictors that can be obtained within approximately 10 minutes, which reached an accuracy of 70.6%. Thus, we are confident that our model will be validated within other samples to advance an accurate prediction model fit for clinical usage in TRD. © Copyright 2017 Physicians Postgraduate Press, Inc.
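
    The 10-fold cross-validation protocol behind these accuracy figures can be sketched in plain Python. A trivial majority-class learner stands in for RandomForest, and the toy data are invented, purely to illustrate the fold mechanics:

```python
import random

def ten_fold_cv(xs, ys, fit, predict, k=10, seed=0):
    """Estimate classification accuracy with k-fold cross-validation:
    split cases into k folds, train on k-1 folds, test on the held-out fold."""
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    accs = []
    for fold in folds:
        held_out = set(fold)
        train_x = [xs[i] for i in idx if i not in held_out]
        train_y = [ys[i] for i in idx if i not in held_out]
        model = fit(train_x, train_y)
        hits = sum(predict(model, xs[i]) == ys[i] for i in fold)
        accs.append(hits / len(fold))
    return sum(accs) / len(accs)

# Stand-in "learner": always predict the majority class of the training labels.
def fit_majority(_train_x, train_y):
    return max(set(train_y), key=train_y.count)

def predict_majority(model, _x):
    return model

# Toy data: 100 cases, 75 non-TRD (0) and 25 TRD (1).
xs = [[i] for i in range(100)]
ys = [0] * 75 + [1] * 25
acc = ten_fold_cv(xs, ys, fit_majority, predict_majority)  # 0.75
```

    With equal fold sizes the mean fold accuracy of this baseline equals the majority-class rate (0.75 here), which is the chance-level figure a real classifier has to beat.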

  3. Vaginal birth after caesarean section prediction models: a UK comparative observational study.

    Science.gov (United States)

    Mone, Fionnuala; Harrity, Conor; Mackie, Adam; Segurado, Ricardo; Toner, Brenda; McCormick, Timothy R; Currie, Aoife; McAuliffe, Fionnuala M

    2015-10-01

    Primarily, to assess the performance of three statistical models in predicting successful vaginal birth in patients attempting a trial of labour after one previous lower segment caesarean section (TOLAC). The statistically most reliable models were subsequently subjected to validation testing in a local antenatal population. A retrospective observational study was performed with study data collected from the Northern Ireland Maternity Service Database (NIMATs). The study population included all women that underwent a TOLAC (n=385) from 2010 to 2012 in a regional UK obstetric unit. Area under the curve (AUC) and correlation analyses were performed. Of the three prediction models evaluated, AUC calculations for the Smith et al., Grobman et al. and Troyer and Parisi models were 0.74, 0.72 and 0.65, respectively. Using the Smith et al. model, 52% of women had a low risk of caesarean section (CS) (predicted VBAC >72%) and 20% had a high risk of CS (predicted VBAC <60%), of whom 20% and 63%, respectively, had delivery by CS. The fit between observed and predicted outcome in this study cohort was greatest using the Smith et al. and Grobman et al. models (Chi-square test, p=0.228 and 0.904), validating both within the population. The Smith et al. and Grobman et al. models could potentially be utilized within the UK to provide women with an informed choice when deciding on mode of delivery after a previous CS. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.
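
    The AUC figures used to rank the models can be computed directly from predicted probabilities and observed outcomes via the Mann-Whitney formulation (the numbers below are invented, not study data):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney formulation:
    the probability that a randomly chosen positive case is scored
    above a randomly chosen negative case (ties count one half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted VBAC probabilities and observed outcomes
# (1 = vaginal birth, 0 = caesarean section):
scores = [0.90, 0.80, 0.75, 0.60, 0.55, 0.30]
labels = [1, 1, 0, 1, 0, 0]
result = auc(scores, labels)  # 8/9, approximately 0.89
```

    An AUC of 0.74, as reported for the Smith et al. model, means a randomly chosen woman who delivered vaginally received a higher predicted probability than a randomly chosen woman delivered by CS about 74% of the time.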

  4. Beyond discrimination: A comparison of calibration methods and clinical usefulness of predictive models of readmission risk.

    Science.gov (United States)

    Walsh, Colin G; Sharman, Kavya; Hripcsak, George

    2017-12-01

    Prior to implementing predictive models in novel settings, analyses of calibration and clinical usefulness remain as important as discrimination, but they are not frequently discussed. Calibration is a model's reflection of actual outcome prevalence in its predictions. Clinical usefulness refers to the utilities, costs, and harms of using a predictive model in practice. A decision analytic approach to calibrating and selecting an optimal intervention threshold may help maximize the impact of readmission risk and other preventive interventions. To select a pragmatic means of calibrating predictive models that requires a minimum amount of validation data and that performs well in practice. To evaluate the impact of miscalibration on utility and cost via clinical usefulness analyses. Observational, retrospective cohort study with electronic health record data from 120,000 inpatient admissions at an urban, academic center in Manhattan. The primary outcome was thirty-day readmission for three causes: all-cause, congestive heart failure, and chronic coronary atherosclerotic disease. Predictive modeling was performed via L1-regularized logistic regression. Calibration methods were compared including Platt Scaling, Logistic Calibration, and Prevalence Adjustment. Performance of predictive modeling and calibration was assessed via discrimination (c-statistic), calibration (Spiegelhalter Z-statistic, Root Mean Square Error [RMSE] of binned predictions, Sanders and Murphy Resolutions of the Brier Score, Calibration Slope and Intercept), and clinical usefulness (utility terms represented as costs). The amount of validation data necessary to apply each calibration algorithm was also assessed. C-statistics by diagnosis ranged from 0.7 for all-cause readmission to 0.86 (0.78-0.93) for congestive heart failure. Logistic Calibration and Platt Scaling performed best and this difference required analyzing multiple metrics of calibration simultaneously, in particular Calibration
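
    Of the calibration methods compared above, Platt scaling is the simplest to sketch: fit a two-parameter sigmoid to held-out (score, outcome) pairs. Below is a minimal pure-Python version using batch gradient descent on the log-loss; the paper's implementation details are not specified, and the data are invented:

```python
import math

def platt_scale(scores, labels, lr=0.5, epochs=5000):
    """Platt scaling: fit p(y=1|s) = 1 / (1 + exp(a*s + b)) to held-out
    (score, outcome) pairs by batch gradient descent on the log-loss."""
    a, b = -1.0, 0.0
    n = len(scores)
    for _ in range(epochs):
        grad_a = grad_b = 0.0
        for s, y in zip(scores, labels):
            p = 1.0 / (1.0 + math.exp(a * s + b))
            grad_a += (y - p) * s / n  # dL/da of the mean log-loss L
            grad_b += (y - p) / n      # dL/db
        a -= lr * grad_a
        b -= lr * grad_b
    return lambda s: 1.0 / (1.0 + math.exp(a * s + b))

# Invented example: events occur 3 times in 4 at raw score 0.9,
# but only 1 time in 4 at raw score 0.2.
raw = [0.9] * 4 + [0.2] * 4
obs = [1, 0, 1, 1, 0, 0, 0, 1]
cal = platt_scale(raw, obs)
# cal(0.9) converges to ~0.75 and cal(0.2) to ~0.25: the sigmoid pulls
# the miscalibrated raw scores back toward the observed event rates.
```

    Logistic calibration and prevalence adjustment follow the same pattern with different functional forms; the key design point is that the calibration map is fitted on validation data never seen by the underlying model.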

  5. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Cherry, J.L.

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  6. A simple technique investigating baseline heterogeneity helped to eliminate potential bias in meta-analyses.

    Science.gov (United States)

    Hicks, Amy; Fairhurst, Caroline; Torgerson, David J

    2018-03-01

    To perform a worked example of an approach that can be used to identify and remove potentially biased trials from meta-analyses via the analysis of baseline variables. True randomisation produces treatment groups that differ only by chance; therefore, a meta-analysis of a baseline measurement should produce no overall difference and zero heterogeneity. A meta-analysis from the British Medical Journal, known to contain significant heterogeneity and imbalance in baseline age, was chosen. Meta-analyses of baseline variables were performed and trials systematically removed, starting with those with the largest t-statistic, until the I² measure of heterogeneity became 0%; the outcome meta-analysis was then repeated with only the remaining trials as a sensitivity check. We argue that heterogeneity in a meta-analysis of baseline variables should not exist, and therefore removing trials which contribute to heterogeneity from a meta-analysis will produce a more valid result. In our example, none of the overall outcomes changed when studies contributing to heterogeneity were removed. We recommend routine use of this technique, using age and a second baseline variable predictive of outcome for the particular study chosen, to help eliminate potential bias in meta-analyses. Copyright © 2017 Elsevier Inc. All rights reserved.
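
    The pruning procedure can be sketched as follows. Here I² is computed from Cochran's Q under a fixed-effect, inverse-variance-weighted model, and the trial contributing most to Q is removed at each step — a close stand-in for the paper's largest-t-statistic rule. The numbers are illustrative only:

```python
def i_squared(effects, variances):
    """I-squared heterogeneity from Cochran's Q under a
    fixed-effect, inverse-variance-weighted meta-analysis."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

def prune_to_zero_i2(effects, variances):
    """Drop the trial contributing most to Q until I-squared reaches 0%."""
    keep = list(range(len(effects)))
    while len(keep) > 1 and i_squared([effects[i] for i in keep],
                                      [variances[i] for i in keep]) > 0.0:
        w = {i: 1.0 / variances[i] for i in keep}
        pooled = sum(w[i] * effects[i] for i in keep) / sum(w.values())
        keep.remove(max(keep, key=lambda i: w[i] * (effects[i] - pooled) ** 2))
    return keep

# Invented baseline-age differences (trial minus control) and variances;
# trial 3 is the implausible outlier a baseline meta-analysis should flag.
effects = [0.10, -0.20, 0.05, 2.50, 0.00]
variances = [0.04] * 5
kept = prune_to_zero_i2(effects, variances)  # [0, 1, 2, 4]
```

    Because truly randomised baselines should pool to zero difference with zero heterogeneity, any trial this loop removes is a candidate for closer scrutiny before the outcome meta-analysis is rerun.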

  7. Increased Expression of microRNA-17 Predicts Poor Prognosis in Human Glioma

    Directory of Open Access Journals (Sweden)

    Shengkui Lu

    2012-01-01

    Full Text Available Aim. To investigate the clinical significance of microRNA-17 (miR-17) expression in human gliomas. Methods. Quantitative real-time polymerase chain reaction (qRT-PCR) analysis was used to characterize the expression patterns of miR-17 in 108 glioma and 20 normal brain tissues. The associations of miR-17 expression with clinicopathological factors and prognosis of glioma patients were also statistically analyzed. Results. Compared with normal brain tissues, miR-17 expression was significantly higher in glioma tissues (P<0.001). In addition, the increased expression of miR-17 in glioma was significantly associated with advanced pathological grade (P=0.006) and low Karnofsky performance score (KPS; P=0.01). Moreover, Kaplan-Meier survival and Cox regression analyses showed that miR-17 overexpression (P=0.008) and advanced pathological grade (P=0.02) were independent factors predicting poor prognosis for gliomas. Furthermore, subgroup analyses showed that miR-17 expression was significantly associated with poor overall survival in glioma patients with high pathological grades (for grade III~IV: P<0.001). Conclusions. Our data offer convincing evidence that the increased expression of miR-17 may have potential value for predicting poor prognosis in glioma patients with high pathological grades, indicating that miR-17 may contribute to glioma progression and be a candidate therapeutic target for this disease.
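
    The Kaplan-Meier estimator behind such survival comparisons can be sketched in a few lines (right-censored observations are flagged with event = 0; the follow-up data below are toy values, not study data):

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve: at each death time t, the survival
    estimate S is multiplied by (1 - deaths_at_t / number_at_risk_at_t).
    events[i] is 1 for a death, 0 for a right-censored observation."""
    deaths = Counter(t for t, e in zip(times, events) if e == 1)
    s, curve = 1.0, []
    for t in sorted(deaths):
        at_risk = sum(1 for tt in times if tt >= t)  # still under observation at t
        s *= 1.0 - deaths[t] / at_risk
        curve.append((t, s))
    return curve

# Toy follow-up times in months for 6 patients, 2 of them censored:
times = [1, 2, 2, 3, 4, 5]
events = [1, 1, 0, 1, 0, 1]
curve = kaplan_meier(times, events)
# [(1, 0.833...), (2, 0.666...), (3, 0.444...), (5, 0.0)]
```

    Comparing two such curves (e.g. high vs. low miR-17 expression) with a log-rank test, and adjusting for covariates with Cox regression, is the standard workflow the abstract describes.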

  8. Model-Based Recursive Partitioning for Subgroup Analyses.

    Science.gov (United States)

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-05-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems of subgroup analyses and formulates challenges for the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by predictive factors. The method starts with a model for the overall treatment effect as defined for the primary analysis in the study protocol and uses measures for detecting parameter instabilities in this treatment effect. The procedure produces a segmented model with differential treatment parameters corresponding to each patient subgroup. The subgroups are linked to predictive factors by means of a decision tree. The method is applied to the search for subgroups of patients suffering from amyotrophic lateral sclerosis that differ with respect to their Riluzole treatment effect, the only currently approved drug for this disease.

  9. Active Canada 20/20: A physical activity plan for Canada.

    Science.gov (United States)

    Spence, John C; Faulkner, Guy; Costas Bradstreet, Christa; Duggan, Mary; Tremblay, Mark S

    2016-03-16

    Physical inactivity is a pressing public health concern. In this commentary we argue that Canada's approach to increasing physical activity (PA) has been fragmented and has lacked coordination, funding and a strategic approach. We then describe a potential solution in Active Canada 20/20 (AC 20/20), which provides both a national plan and a commitment to action from non-government and public sectors with a view to engaging corporate Canada and the general public. It outlines a road map for initiating, coordinating and implementing proactive initiatives to address this prominent health risk factor. The identified actions are based on the best available evidence and have been endorsed by the majority of representatives in the relevant sectors. The next crucial steps are to engage all those involved in public health promotion, service provision and advocacy at the municipal, provincial and national levels in order to incorporate AC 20/20 principles into practice and planning and thus increase the PA level of every person in Canada. Further, governments, as well as the private, not-for-profit and philanthropic sectors, should demonstrate leadership and continue their efforts toward providing the substantial and sustained resources needed to recalibrate Canadians' habitual PA patterns; this will ultimately improve the overall health of our citizens.

  10. Accuracy of algorithms to predict accessory pathway location in children with Wolff-Parkinson-White syndrome.

    Science.gov (United States)

    Wren, Christopher; Vogel, Melanie; Lord, Stephen; Abrams, Dominic; Bourke, John; Rees, Philip; Rosenthal, Eric

    2012-02-01

    The aim of this study was to examine the accuracy in predicting pathway location in children with Wolff-Parkinson-White syndrome for each of seven published algorithms. ECGs from 100 consecutive children with Wolff-Parkinson-White syndrome undergoing electrophysiological study were analysed by six investigators using seven published algorithms, six of which had been developed in adult patients. Accuracy and concordance of predictions were adjusted for the number of pathway locations. Accessory pathways were left-sided in 49, septal in 20 and right-sided in 31 children. Overall accuracy of prediction was 30-49% for the exact location and 61-68% including adjacent locations. Concordance between investigators varied between 41% and 86%. No algorithm was better at predicting septal pathways (accuracy 5-35%, improving to 40-78% including adjacent locations), but one was significantly worse. Predictive accuracy was 24-53% for the exact location of right-sided pathways (50-71% including adjacent locations) and 32-55% for the exact location of left-sided pathways (58-73% including adjacent locations). All algorithms were less accurate in our hands than in other authors' own assessment. None performed well in identifying midseptal or right anteroseptal accessory pathway locations.

  11. Prediction of cereal feed value using spectroscopy and chemometrics

    DEFF Research Database (Denmark)

    Jørgensen, Johannes Ravn; Gislum, René

    2009-01-01

    of EDOM, EDOMi, FEso and FEsv. The outcome of a successful NIRS calibration will be a relatively cheap tool to monitor, diversify and evaluate the quality of cereals for animal feed, a possible tool to assess the feed value of new varieties in the variety testing and a useful, cheap and rapid tool...... for cereal breeders. A collection of 1213 grain samples of wheat, triticale, barley and rye, and related chemical reference analyses to describe the feed value have been established. The samples originate from available field trials over a three-year period. The chemical reference analyses are dry matter...... value, the prediction error has to be compared with the error in the chemical analysis. Prediction error by NIRS prediction of feed value is above the error of the chemical measurement. The conclusion is that it is possible to predict the feed value in cereals with NIRS quickly and cheaply...

  12. MCNP benchmark analyses of critical experiments for the Space Nuclear Thermal Propulsion program

    International Nuclear Information System (INIS)

    Selcow, E.C.; Cerbone, R.J.; Ludewig, H.; Mughabghab, S.F.; Schmidt, E.; Todosow, M.; Parma, E.J.; Ball, R.M.; Hoovler, G.S.

    1993-01-01

    Benchmark analyses have been performed of Particle Bed Reactor (PBR) critical experiments (CX) using the MCNP radiation transport code. The experiments have been conducted at the Sandia National Laboratory reactor facility in support of the Space Nuclear Thermal Propulsion (SNTP) program. The test reactor is a nineteen element water moderated and reflected thermal system. A series of integral experiments have been carried out to test the capabilities of the radiation transport codes to predict the performance of PBR systems. MCNP was selected as the preferred radiation analysis tool for the benchmark experiments. Comparison between experimental and calculational results indicate close agreement. This paper describes the analyses of benchmark experiments designed to quantify the accuracy of the MCNP radiation transport code for predicting the performance characteristics of PBR reactors.

  13. MCNP benchmark analyses of critical experiments for the Space Nuclear Thermal Propulsion program

    Science.gov (United States)

    Selcow, Elizabeth C.; Cerbone, Ralph J.; Ludewig, Hans; Mughabghab, Said F.; Schmidt, Eldon; Todosow, Michael; Parma, Edward J.; Ball, Russell M.; Hoovler, Gary S.

    1993-01-01

    Benchmark analyses have been performed of Particle Bed Reactor (PBR) critical experiments (CX) using the MCNP radiation transport code. The experiments have been conducted at the Sandia National Laboratory reactor facility in support of the Space Nuclear Thermal Propulsion (SNTP) program. The test reactor is a nineteen element water moderated and reflected thermal system. A series of integral experiments have been carried out to test the capabilities of the radiation transport codes to predict the performance of PBR systems. MCNP was selected as the preferred radiation analysis tool for the benchmark experiments. Comparison between experimental and calculational results indicate close agreement. This paper describes the analyses of benchmark experiments designed to quantify the accuracy of the MCNP radiation transport code for predicting the performance characteristics of PBR reactors.

  14. Cast iron - a predictable material

    Directory of Open Access Journals (Sweden)

    Jorg C. Sturm

    2011-02-01

    Full Text Available High strength compacted graphite iron (CGI) or alloyed cast iron components are substituting previously used non-ferrous castings in automotive power train applications. The mechanical engineering industry has recognized the value in substituting forged or welded structures with stiff and light-weight cast iron castings. New products such as wind turbines have opened new markets for an entire suite of highly reliable ductile iron cast components. During the last 20 years, casting process simulation has developed from predicting hot spots and solidification to an integral assessment tool for foundries for the entire manufacturing route of castings. The support of the feeding-related layout of the casting is still one of the most important duties for casting process simulation. Depending on the alloy poured, different feeding behaviors and self-feeding capabilities need to be considered to provide a defect-free casting. Therefore, it is not enough to base the prediction of shrinkage defects solely on hot spots derived from temperature fields. To be able to quantitatively predict these defects, solidification simulation had to be combined with density and mass transport calculations, in order to evaluate the impact of the solidification morphology on the feeding behavior as well as to consider alloy-dependent feeding ranges. For cast iron foundries, the use of casting process simulation has become an important instrument to predict the robustness and reliability of their processes, especially since the influence of alloying elements, melting practice and metallurgy need to be considered to quantify the special shrinkage and solidification behavior of cast iron. This allows the prediction of local structures, phases and ultimately the local mechanical properties of cast irons, to assess casting quality in the foundry but also to make use of this quantitative information during design of the casting.
Casting quality issues related to thermally driven

  15. MCNP benchmark analyses of critical experiments for space nuclear thermal propulsion

    International Nuclear Information System (INIS)

    Selcow, E.C.; Cerbone, R.J.; Ludewig, H.

    1993-01-01

    The particle-bed reactor (PBR) system is being developed for use in the Space Nuclear Thermal Propulsion (SNTP) Program. This reactor system is characterized by a highly heterogeneous, compact configuration with many streaming pathways. The neutronics analyses performed for this system must be able to accurately predict reactor criticality, kinetics parameters, material worths at various temperatures, feedback coefficients, and detailed fission power and heating distributions. The latter includes coupled axial, radial, and azimuthal profiles. These responses constitute critical inputs and interfaces with the thermal-hydraulics design and safety analyses of the system.

  16. Motor dual-tasking deficits predict falls in Parkinson's disease: A prospective study.

    Science.gov (United States)

    Heinzel, Sebastian; Maechtel, Mirjam; Hasmann, Sandra E; Hobert, Markus A; Heger, Tanja; Berg, Daniela; Maetzler, Walter

    2016-05-01

    Falls severely affect lives of Parkinson's disease (PD) patients. Cognitive impairment including dual-tasking deficits contribute to fall risk in PD. However, types of dual-tasking deficits preceding falls in PD are still unclear. Walking velocities during box-checking and subtracting serial 7s were assessed twice a year in 40 PD patients over 2.8 ± 1.0 years. Fourteen patients reported a fall within this period (4 excluded fallers already reported falls at baseline). Their dual-task costs (DTC; mean ± standard deviation) 4.2 ± 2.2 months before the first fall were compared with 22 patients never reporting falls. ROC analyses and logistic regressions accounting for DTC, UPDRS-III and disease duration were used for faller classification and prediction. Only walking/box-checking predicted fallers. Fallers showed higher DTC for walking while box-checking, p = 0.029, but not for box-checking while walking, p = 0.178 (combined motor DTC, p = 0.022), than non-fallers. Combined motor DTC classified fallers and non-fallers (area under curve: 0.75; 95% confidence interval, CI: 0.60-0.91) with 71.4% sensitivity (95%CI: 41.9%-91.6%) and 77.3% specificity (54.6%-92.2%), and significantly predicted future fallers (p = 0.023). Here, a 20.4-percentage-point higher combined motor DTC (i.e., the mean difference between fallers and non-fallers) was associated with 2.6 (1.1-6.0) times higher odds of being a future faller. Motor dual-tasking is a potentially valuable predictor of falls in PD, suggesting that avoiding dual task situations as well as specific motor dual-task training might help to prevent falls in PD. These findings and their therapeutic relevance need to be further validated in PD patients without fall history, in early PD stages, and with various motor-motor dual-task challenges. Copyright © 2016 Elsevier Ltd. All rights reserved.
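
    The dual-task cost itself is commonly operationalised as the relative slowing of walking velocity under the concurrent task. A sketch with hypothetical velocities (not the study's data):

```python
def dual_task_cost(single_task, dual_task):
    """Dual-task cost (%): relative slowing of walking velocity under a
    concurrent task, a common operationalisation of the DTC."""
    return (single_task - dual_task) / single_task * 100.0

# Hypothetical walking velocities in m/s:
faller = dual_task_cost(1.20, 0.78)      # 35.0% cost while box-checking
non_faller = dual_task_cost(1.20, 1.02)  # 15.0% cost
# A difference of roughly 20 percentage points, as in this invented pair,
# is the order of effect the study links to higher fall risk.
```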

  17. Predictive characterization of aging and degradation of reactor materials in extreme environments. Final report, December 20, 2013 - September 20, 2017

    Energy Technology Data Exchange (ETDEWEB)

    Qu, Jianmin [Northwestern Univ., Evanston, IL (United States)

    2017-09-20

    Understanding of reactor material behavior in extreme environments is vital not only to the development of new materials for the next generation nuclear reactors, but also to the extension of the operating lifetimes of the current fleet of nuclear reactors. To this end, this project conducted a suite of unique experimental techniques, augmented by a mesoscale computational framework, to understand and predict the long-term effects of irradiation, temperature, and stress on material microstructures and their macroscopic behavior. The experimental techniques and computational tools were demonstrated on two distinctive types of reactor materials, namely, Zr alloys and high-Cr martensitic steels. These materials are chosen as the test beds because they are the archetypes of high-performance reactor materials (cladding, wrappers, ducts, pressure vessel, piping, etc.). To fill the knowledge gaps, and to meet the technology needs, a suite of innovative in situ transmission electron microscopy (TEM) characterization techniques (heating, heavy ion irradiation, He implantation, quantitative small-scale mechanical testing, and various combinations thereof) were developed and used to elucidate and map the fundamental mechanisms of microstructure evolution in both Zr and Cr alloys for a wide range environmental boundary conditions in the thermal-mechanical-irradiation input space. Knowledge gained from the experimental observations of the active mechanisms and the role of local microstructural defects on the response of the material has been incorporated into a mathematically rigorous and comprehensive three-dimensional mesoscale framework capable of accounting for the compositional variation, microstructural evolution and localized deformation (radiation damage) to predict aging and degradation of key reactor materials operating in extreme environments. Predictions from this mesoscale framework were compared with the in situ TEM observations to validate the model.

  18. Predictive characterization of aging and degradation of reactor materials in extreme environments. Final report, December 20, 2013 - September 20, 2017

    International Nuclear Information System (INIS)

    Qu, Jianmin

    2017-01-01

    Understanding of reactor material behavior in extreme environments is vital not only to the development of new materials for the next generation nuclear reactors, but also to the extension of the operating lifetimes of the current fleet of nuclear reactors. To this end, this project conducted a suite of unique experimental techniques, augmented by a mesoscale computational framework, to understand and predict the long-term effects of irradiation, temperature, and stress on material microstructures and their macroscopic behavior. The experimental techniques and computational tools were demonstrated on two distinctive types of reactor materials, namely, Zr alloys and high-Cr martensitic steels. These materials are chosen as the test beds because they are the archetypes of high-performance reactor materials (cladding, wrappers, ducts, pressure vessel, piping, etc.). To fill the knowledge gaps, and to meet the technology needs, a suite of innovative in situ transmission electron microscopy (TEM) characterization techniques (heating, heavy ion irradiation, He implantation, quantitative small-scale mechanical testing, and various combinations thereof) were developed and used to elucidate and map the fundamental mechanisms of microstructure evolution in both Zr and Cr alloys for a wide range environmental boundary conditions in the thermal-mechanical-irradiation input space. Knowledge gained from the experimental observations of the active mechanisms and the role of local microstructural defects on the response of the material has been incorporated into a mathematically rigorous and comprehensive three-dimensional mesoscale framework capable of accounting for the compositional variation, microstructural evolution and localized deformation (radiation damage) to predict aging and degradation of key reactor materials operating in extreme environments. Predictions from this mesoscale framework were compared with the in situ TEM observations to validate the model.

  19. Network-derived inhomogeneity in monthly rainfall analyses over western Tasmania

    International Nuclear Information System (INIS)

    Fawcett, Robert; Trewin, Blair; Barnes-Keoghan, Ian

    2010-01-01

    Monthly rainfall in the wetter western half of Tasmania was relatively poorly observed in the early to middle parts of the 20th century, and this causes a marked inhomogeneity in the operational gridded monthly rainfall analyses generated by the Australian Bureau of Meteorology up until the end of 2009. These monthly rainfall analyses were generated for the period 1900 to 2009 in two forms; a national analysis at 0.25° latitude-longitude resolution, and a southeastern Australia regional analysis at 0.1° resolution. For any given month, they used all the monthly data from the standard Bureau rainfall gauge network available in the Australian Data Archive for Meteorology. Since this network has changed markedly since Federation (1901), there is obvious scope for network-derived inhomogeneities in the analyses. In this study, we show that the topography-resolving techniques of the new Australian Water Availability Project analyses, adopted as the official operational analyses from the start of 2010, substantially diminish those inhomogeneities, while using largely the same observation network. One result is an improved characterisation of recent rainfall declines across Tasmania. The new analyses are available at two resolutions, 0.25° and 0.05°.

  20. Compound-specific isotopic analyses: a novel tool for reconstruction of ancient biogeochemical processes

    Science.gov (United States)

    Hayes, J. M.; Freeman, K. H.; Popp, B. N.; Hoham, C. H.

    1990-01-01

    Patterns of isotopic fractionation in biogeochemical processes are reviewed and it is suggested that isotopic fractionations will be small when substrates are large. If so, isotopic compositions of biomarkers will reflect those of their biosynthetic precursors. This prediction is tested by consideration of results of analyses of geoporphyrins and geolipids from the Greenhorn Formation (Cretaceous, Western Interior Seaway of North America) and the Messel Shale (Eocene, lacustrine, southern Germany). It is shown (i) that isotopic compositions of porphyrins that are related to a common source, but which have been altered structurally, cluster tightly and (ii) that isotopic differences between geolipids and porphyrins related to a common source are equal to those observed in modern biosynthetic products. Both of these observations are consistent with preservation of biologically controlled isotopic compositions during diagenesis. Isotopic compositions of individual compounds can thus be interpreted in terms of biogeochemical processes in ancient depositional environments. In the Cretaceous samples, isotopic compositions of n-alkanes are covariant with those of total organic carbon, while delta values for pristane and phytane are covariant with those of porphyrins. In this unit representing an open marine environment, the preserved acyclic polyisoprenoids apparently derive mainly from primary material, while the extractable n-alkanes derive mainly from lower levels of the food chain. In the Messel Shale, isotopic compositions of individual biomarkers range from −20.9 to −73.4‰ vs PDB. Isotopic compositions of specific compounds can be interpreted in terms of origin from methylotrophic, chemoautotrophic, and chemolithotrophic microorganisms as well as from primary producers that lived in the water column and sediments of this ancient lake.

  1. A systematic review of the quality of conduct and reporting of systematic reviews and meta-analyses in paediatric surgery.

    Directory of Open Access Journals (Sweden)

    Paul Stephen Cullis

    Full Text Available Our objective was to evaluate the quality of conduct and reporting of published systematic reviews and meta-analyses in paediatric surgery. We also aimed to identify characteristics predictive of review quality. Systematic reviews summarise evidence by combining sources, but are potentially prone to bias. To counter this, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was published to aid reporting. Similarly, the Assessing the Methodological Quality of Systematic Reviews (AMSTAR) measurement tool was designed to appraise methodology. The paediatric surgical literature has seen an increasing number of reviews over the past decade, but their quality has not been evaluated. Adhering to PRISMA guidelines, we performed a systematic review with an a priori design to identify systematic reviews and meta-analyses of interventions in paediatric surgery. From 01/2010 to 06/2016, we searched MEDLINE, EMBASE, Cochrane, the Centre for Reviews and Dissemination, Web of Science, Google Scholar, reference lists and journals. Two reviewers independently selected studies and extracted data. We assessed conduct and reporting using AMSTAR and PRISMA. Scores were calculated as the sum of reported items. We also extracted author, journal and article characteristics, and used them in exploratory analysis to determine which variables predict quality. 112 articles fulfilled the eligibility criteria (53 systematic reviews; 59 meta-analyses). Overall, 68% of AMSTAR and 56.8% of PRISMA items were reported adequately. The poorest scores were identified with regard to a priori design, inclusion of structured summaries, searching the grey literature, citing excluded articles and evaluating bias. 13 reviews were pre-registered and 6 were published in PRISMA-endorsing journals. The following predicted quality in univariate analysis: word count, Cochrane review, journal h-index, impact factor, journal endorses PRISMA, PRISMA adherence suggested in author guidance, article mentions PRISMA

  2. MO-G-304-02: Knowledge Based DVH Prediction Using a Geometric Dose Transform

    International Nuclear Information System (INIS)

    Staub, D; Wang, J; Jiang, S

    2015-01-01

    Purpose: To demonstrate a novel method for predicting patient dose-volume histograms (DVHs) using a prior database of optimized radiotherapy treatment plans. Such predicted DVHs could be useful for automating treatment planning. Methods: Our initial demonstration utilized a database of 100 prostate intensity-modulated radiotherapy (IMRT) datasets. Each dataset contained a CT image with contours of the planning target volume (PTV), rectum, and bladder, the parameters of a clinically approved IMRT plan, and a corresponding simulated dose distribution. We applied a novel geometric transformation to remove the influence of the PTV size, shape, and location on the dose distribution. We termed the transformed distribution the geometrically normalized dose distribution (GNDD). This normalization transform was applied to 80 datasets randomly selected from the database, and a population GNDD was computed as the average. Next, the population GNDD was mapped onto each of the remaining 20 patient datasets using the reverse of the geometric normalization transform, and predicted DVHs were calculated from the reverse-transformed dose distributions (GNDD-DVHs). In addition, a state-of-the-art machine-learning-based method from the literature was tested for comparison. Results: DVH prediction accuracy was quantified by calculating the relative root mean squared error (rRMSE) of the predicted DVHs for the 20 test patients against their known DVHs. For bladder, rectum, and PTV, average rRMSEs for the GNDD method were 9.7 ± 4.2%, 13.9 ± 6.0%, and 2.3 ± 0.5%, respectively. Prediction results using GNDD were roughly equivalent to those from the machine-learning method. Conclusion: We developed a new method for predicting DVH curves from a database of prior patient plans. We demonstrated that our simple approach achieves accuracy comparable to that of a more complicated machine-learning-based approach
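
    The rRMSE figure of merit used to score the predicted DVHs can be sketched in a few lines. This is a minimal illustration with hypothetical dose bins and volumes; the abstract does not state the exact normalization, so dividing the RMSE by the mean of the reference curve is an assumption here.

```python
import math

def rrmse(predicted, reference):
    """Relative root-mean-squared error between two DVH curves.

    Both curves are sampled on the same dose grid as fractional volumes
    in [0, 1]. Normalizing by the mean of the reference curve is an
    assumption; the abstract does not give the exact definition.
    """
    if len(predicted) != len(reference):
        raise ValueError("curves must share the same dose grid")
    mse = sum((p - r) ** 2 for p, r in zip(predicted, reference)) / len(reference)
    return math.sqrt(mse) / (sum(reference) / len(reference))

# Hypothetical bladder DVH sampled at 10 dose levels
# (fraction of organ volume receiving at least each dose).
reference = [1.00, 0.95, 0.85, 0.70, 0.55, 0.40, 0.28, 0.18, 0.08, 0.02]
predicted = [1.00, 0.93, 0.88, 0.72, 0.52, 0.42, 0.25, 0.20, 0.10, 0.01]
print(round(rrmse(predicted, reference), 4))
```

    With the known DVH as the reference, a lower rRMSE means the predicted curve tracks it more closely, which is how the per-organ percentages quoted above can be compared across methods.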

  3. Elastic scattering of 7Li projectiles in the energy range of 20 to 34 MeV

    International Nuclear Information System (INIS)

    Khallaf, S.A.E.

    1983-01-01

    As far as it is known, the Watanabe folding model has not previously been used to analyse the elastic scattering of 7Li projectiles. The main purpose of the present work is to calculate the differential cross sections for 7Li elastic scattering from 90Zr, 48,40Ca, 16O and 12C at incident energies of 20 to 34 MeV using the Watanabe folding model, and to study the applicability of this model to 7Li elastic scattering. The potentials of the 7Li ions are obtained from Taylor expansions of the alpha and triton cluster potentials. The resulting differential cross sections are compared with the cross sections predicted using phenomenological 7Li potentials. (orig./WL)

  4. Prediction of the recovery rate after surgery for cervical myelopathy from the view of CT-myelography

    International Nuclear Information System (INIS)

    Koyanagi, Takahiro; Satomi, Kazuhiko; Asazuma, Takahito; Toyama, Yoshiaki; Fujimura, Shoichi; Hirabayashi, Kiyoshi; Hamano, Yasuyuki; Shiraishi, Takeshi.

    1991-01-01

    This study was designed to prepare a formula for predicting postoperative recovery in cervical myelopathy. Preoperative CT-myelography (CT-M) was performed in a total of 103 patients, consisting of 44 with cervical spinal myelopathy (CSM), 39 with ossification of the posterior longitudinal ligament (OPLL), and 20 with cervical disk herniation (CDH). Multivariate analyses were used to obtain correlations between CT-M findings (spinal cord area and the rate of spinal cord flatness) and clinical items (age, disease duration, preoperative JOA score, and postoperative recovery rate). There was a strong positive correlation between spinal cord area and postoperative recovery rate. Because both spinal cord area and disease duration for the CSM and OPLL groups had a strong positive correlation with the recovery rate, they were found to predict postoperative recovery. In the CDH group, there was no predictive index. Spinal cord area was a more useful predictive index than preoperative severity. Disease duration may also serve as an index complementing spinal cord area in the evaluation of postoperative recovery. (N.K.)

  5. Predicting Coronary Artery Aneurysms in Kawasaki Disease at a North American Center: An Assessment of Baseline z Scores.

    Science.gov (United States)

    Son, Mary Beth F; Gauvreau, Kimberlee; Kim, Susan; Tang, Alexander; Dedeoglu, Fatma; Fulton, David R; Lo, Mindy S; Baker, Annette L; Sundel, Robert P; Newburger, Jane W

    2017-05-31

    Accurate risk prediction of coronary artery aneurysms (CAAs) in North American children with Kawasaki disease remains a clinical challenge. We sought to determine the predictive utility of baseline coronary dimensions adjusted for body surface area (z scores) for future CAAs in Kawasaki disease and explored the extent to which addition of established Japanese risk scores to baseline coronary artery z scores improved discrimination for CAA development. We explored the relationships of CAA with baseline z scores; with Kobayashi, Sano, Egami, and Harada risk scores; and with the combination of baseline z scores and risk scores. We defined CAA as a maximum z score (zMax) ≥2.5 of the left anterior descending or right coronary artery at 4 to 8 weeks of illness. Of 261 patients, 77 patients (29%) had a baseline zMax ≥2.0. CAAs occurred in 15 patients (6%). CAAs were strongly associated with baseline zMax ≥2.0 versus <2.0. Baseline zMax ≥2.0 had a C statistic of 0.77, good sensitivity (80%), and excellent negative predictive value (98%). None of the risk scores alone had adequate discrimination. When high-risk status per the Japanese risk scores was added to models containing baseline zMax ≥2.0, none were significantly better than baseline zMax ≥2.0 alone. In a North American center, baseline zMax ≥2.0 in children with Kawasaki disease demonstrated high predictive utility for later development of CAA. Future studies should validate the utility of our findings. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
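
    The reported operating characteristics of the zMax ≥2.0 cutoff follow from a standard 2×2 screening table. A sketch with counts consistent with the abstract (261 patients, 77 screen-positive, 15 CAAs); the split into 12 true positives and 3 false negatives is inferred from the 80% sensitivity and is an assumption.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 screening table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Counts consistent with the abstract: 261 patients, 77 screen-positive
# (baseline zMax >= 2.0), 15 CAAs; 80% sensitivity implies 12 true positives.
m = screening_metrics(tp=12, fp=65, fn=3, tn=181)
print({k: round(v, 3) for k, v in m.items()})
```

    These hypothetical counts reproduce the quoted sensitivity (0.80) and negative predictive value (181/184 ≈ 0.98), illustrating why a negative screen is so reassuring even though the positive predictive value is low.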

  6. A prediction model for assessing residential radon concentration in Switzerland

    International Nuclear Information System (INIS)

    Hauri, Dimitri D.; Huss, Anke; Zimmermann, Frank; Kuehni, Claudia E.; Röösli, Martin

    2012-01-01

    Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the nationwide Swiss radon database collected between 1994 and 2004. Of these, a randomly selected 80% of measurements were used for model development and the remaining 20% for an independent model validation. A multivariable log-linear regression model was fitted and relevant predictors selected according to evidence from the literature, the adjusted R², Akaike's information criterion (AIC), and the Bayesian information criterion (BIC). The prediction model was evaluated by calculating the Spearman rank correlation between measured and predicted values. Additionally, the predicted values were categorised into three categories (<50th, 50th–90th and >90th percentile) and compared with the measured categories using a weighted Kappa statistic. The most relevant predictors for indoor radon levels were tectonic units and year of construction of the building, followed by soil texture, degree of urbanisation, floor of the building where the measurement was taken and housing type (P-values <0.001 for all). Mean predicted radon values (geometric mean) were 66 Bq/m³ (interquartile range 40–111 Bq/m³) in the lowest exposure category, 126 Bq/m³ (69–215 Bq/m³) in the medium category, and 219 Bq/m³ (108–427 Bq/m³) in the highest category. Spearman correlation between predictions and measurements was 0.45 (95%-CI: 0.44; 0.46) for the development dataset and 0.44 (95%-CI: 0.42; 0.46) for the validation dataset. Kappa coefficients were 0.31 for the development and 0.30 for the validation dataset, respectively. The model explained 20% of the overall variability (adjusted R²). In conclusion, this residential radon prediction model, based on a large number of measurements, was demonstrated to be
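
    The Spearman rank correlation used to evaluate the radon model can be computed in plain Python: rank both series, then take the Pearson correlation of the ranks. The measurement pairs below are hypothetical, invented only to exercise the function.

```python
def rank(values):
    """Average ranks, 1-based; tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean rank of the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical measured vs model-predicted radon levels (Bq/m3).
measured  = [35, 60, 80, 120, 95, 210, 400, 55, 150, 70]
predicted = [50, 66, 90, 110, 126, 180, 219, 60, 140, 100]
print(round(spearman(measured, predicted), 2))
```

    Because it uses ranks rather than raw values, this statistic is insensitive to the log-normal skew typical of radon data, which is presumably why it was preferred over a plain Pearson correlation.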

  7. The draft genome of watermelon (Citrullus lanatus) and resequencing of 20 diverse accessions.

    Science.gov (United States)

    Guo, Shaogui; Zhang, Jianguo; Sun, Honghe; Salse, Jerome; Lucas, William J; Zhang, Haiying; Zheng, Yi; Mao, Linyong; Ren, Yi; Wang, Zhiwen; Min, Jiumeng; Guo, Xiaosen; Murat, Florent; Ham, Byung-Kook; Zhang, Zhaoliang; Gao, Shan; Huang, Mingyun; Xu, Yimin; Zhong, Silin; Bombarely, Aureliano; Mueller, Lukas A; Zhao, Hong; He, Hongju; Zhang, Yan; Zhang, Zhonghua; Huang, Sanwen; Tan, Tao; Pang, Erli; Lin, Kui; Hu, Qun; Kuang, Hanhui; Ni, Peixiang; Wang, Bo; Liu, Jingan; Kou, Qinghe; Hou, Wenju; Zou, Xiaohua; Jiang, Jiao; Gong, Guoyi; Klee, Kathrin; Schoof, Heiko; Huang, Ying; Hu, Xuesong; Dong, Shanshan; Liang, Dequan; Wang, Juan; Wu, Kui; Xia, Yang; Zhao, Xiang; Zheng, Zequn; Xing, Miao; Liang, Xinming; Huang, Bangqing; Lv, Tian; Wang, Junyi; Yin, Ye; Yi, Hongping; Li, Ruiqiang; Wu, Mingzhu; Levi, Amnon; Zhang, Xingping; Giovannoni, James J; Wang, Jun; Li, Yunfu; Fei, Zhangjun; Xu, Yong

    2013-01-01

    Watermelon, Citrullus lanatus, is an important cucurbit crop grown throughout the world. Here we report a high-quality draft genome sequence of the east Asian watermelon cultivar 97103 (2n = 2x = 22) containing 23,440 predicted protein-coding genes. Comparative genomics analysis provided an evolutionary scenario for the origin of the 11 watermelon chromosomes derived from a 7-chromosome paleohexaploid eudicot ancestor. Resequencing of 20 watermelon accessions representing three different C. lanatus subspecies produced numerous haplotypes and identified the extent of genetic diversity and population structure of watermelon germplasm. Genomic regions that were preferentially selected during domestication were identified. Many disease-resistance genes were also found to be lost during domestication. In addition, integrative genomic and transcriptomic analyses yielded important insights into aspects of phloem-based vascular signaling in common between watermelon and cucumber and identified genes crucial to valuable fruit-quality traits, including sugar accumulation and citrulline metabolism.

  8. What Factors are Predictive of Patient-reported Outcomes? A Prospective Study of 337 Shoulder Arthroplasties.

    Science.gov (United States)

    Matsen, Frederick A; Russ, Stacy M; Vu, Phuong T; Hsu, Jason E; Lucas, Robert M; Comstock, Bryan A

    2016-11-01

    Although shoulder arthroplasties generally are effective in improving patients' comfort and function, the results are variable for reasons that are not well understood. We posed two questions: (1) What factors are associated with better 2-year outcomes after shoulder arthroplasty? (2) What are the sensitivities, specificities, and positive and negative predictive values of a multivariate predictive model for better outcome? Three hundred thirty-nine patients having a shoulder arthroplasty (hemiarthroplasty, arthroplasty for cuff tear arthropathy, ream and run arthroplasty, total shoulder or reverse total shoulder arthroplasty) between August 24, 2010 and December 31, 2012 consented to participate in this prospective study. Two patients were excluded because they were missing baseline variables. Forty-three patients were missing 2-year data. Univariate and multivariate analyses determined the relationship of baseline patient, shoulder, and surgical characteristics to a "better" outcome, defined as an improvement of at least 30% of the maximal possible improvement in the Simple Shoulder Test. The results were used to develop a predictive model, the accuracy of which was tested using a 10-fold cross-validation. After controlling for potentially relevant confounding variables, the multivariate analysis showed that the factors significantly associated with better outcomes included American Society of Anesthesiologists Class I (odds ratio [OR], 1.94; 95% CI, 1.03-3.65; p = 0.041) and a shoulder problem not related to work (OR, 5.36; 95% CI, 2.15-13.37). The area under the receiver operating characteristic curve generated from the cross-validated enhanced predictive model was 0.79 (generally, values of 0.7 to 0.8 are considered fair and values of 0.8 to 0.9 are considered good). The false-positive fraction and the true-positive fraction depended on the cutoff probability selected (i.e., the probability above which the prediction would be classified as

  9. A new, accurate predictive model for incident hypertension.

    Science.gov (United States)

    Völzke, Henry; Fung, Glenn; Ittermann, Till; Yu, Shipeng; Baumeister, Sebastian E; Dörr, Marcus; Lieb, Wolfgang; Völker, Uwe; Linneberg, Allan; Jørgensen, Torben; Felix, Stephan B; Rettig, Rainer; Rao, Bharat; Kroemer, Heyo K

    2013-11-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures. The primary study population consisted of 1605 normotensive individuals aged 20-79 years with 5-year follow-up from the population-based Study of Health in Pomerania (SHIP). The initial set was randomly split into a training and a testing set. We used a probabilistic graphical model applying a Bayesian network to create a predictive model for incident hypertension and compared the predictive performance with the established Framingham risk score for hypertension. Finally, the model was validated in 2887 participants from INTER99, a Danish community-based intervention study. In the training set of SHIP data, the Bayesian network used a small subset of relevant baseline features including age, mean arterial pressure, rs16998073, serum glucose and urinary albumin concentrations. Furthermore, we detected relevant interactions between age and serum glucose as well as between rs16998073 and urinary albumin concentrations [area under the receiver operating characteristic curve (AUC) = 0.76]. The model was confirmed in the SHIP validation set (AUC 0.78) and externally replicated in INTER99 (AUC 0.77). Compared to the established Framingham risk score for hypertension, the predictive performance of the new model was similar in the SHIP validation set and moderately better in INTER99. Data mining procedures identified a predictive model for incident hypertension, which included innovative and easy-to-measure variables. The findings promise great applicability in screening settings and clinical practice.
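
    The AUC values quoted here (0.76-0.78) have a simple rank interpretation: the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen control, with ties counted as half. A minimal sketch with hypothetical risk scores (not the study's data):

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the rank-sum formulation:
    P(random case scores higher than random control); ties count half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical predicted risks of incident hypertension: cases vs controls.
cases    = [0.82, 0.65, 0.71, 0.55, 0.90]
controls = [0.30, 0.45, 0.62, 0.25, 0.58, 0.40]
print(round(auc(cases, controls), 2))
```

    An AUC of 0.5 would mean the model ranks cases no better than chance; values around 0.77, as reported, mean a case outranks a control roughly three times out of four.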

  10. A rapid method of predicting radiocaesium concentrations in sheep from activity levels in faeces

    International Nuclear Information System (INIS)

    McGee, E.J.; Synnott, H.J.; Colgan, P.A.; Keatinge, M.J.

    1994-01-01

    The use of faecal samples taken from sheep flocks as a means of predicting radiocaesium concentrations in live animals was studied. Radiocaesium levels in 1726 sheep from 29 flocks were measured using in vivo techniques and a single faecal sample taken from each flock was also analysed. A highly significant relationship was found to exist between mean flock activity and activity in the corresponding faecal samples. Least-squares regression yielded a simple model for predicting mean flock radiocaesium concentrations based on activity levels in faecal samples. A similar analysis of flock maxima and activity levels in faeces provides an alternative model for predicting the expected within-flock maximum radiocaesium concentration. (Author)
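
    The prediction model described is an ordinary least-squares line relating faecal activity to mean flock activity. A sketch with invented activity pairs; the published regression coefficients are not given in the abstract, so the numbers below are purely illustrative.

```python
def least_squares(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical flock data: faecal 137Cs activity (Bq/kg) vs mean in vivo
# flock activity (Bq/kg). These are NOT the study's measurements.
faecal = [120, 250, 400, 610, 800, 950]
flock  = [ 60, 130, 190, 310, 420, 470]
a, b = least_squares(faecal, flock)
# Predicted mean flock activity for a new faecal sample of 500 Bq/kg.
predicted = a + b * 500
print(round(b, 3), round(predicted, 1))
```

    In field use, the appeal of such a model is logistical: one faecal sample per flock replaces live monitoring of every animal, with the fitted line translating the sample activity into an expected flock mean.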

  11. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and, if a LOD score is ≤ −2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.
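
    The exclusion criterion can be made concrete: a LOD score is the base-10 logarithm of the likelihood ratio between the hypothesized genetic model and the no-effect model, and LOD ≤ −2.0 excludes an effect at least as large as the one specified. A minimal sketch with hypothetical log-likelihoods (the values are invented, not taken from the study):

```python
import math

def lod_score(loglik_model, loglik_null):
    """LOD = log10 likelihood ratio of the hypothesized genetic model
    versus the null (no-effect) model, from natural-log likelihoods."""
    return (loglik_model - loglik_null) / math.log(10)

def excluded(lod, threshold=-2.0):
    """Exclusion criterion: LOD <= -2.0 rules out an effect at least
    as large as the one specified in the fitted model."""
    return lod <= threshold

# Hypothetical natural-log likelihoods from fitting both models
# to a random population sample.
ll_candidate_effect = -1041.2  # model with the specified genetic effect
ll_no_effect = -1036.0         # null model
lod = lod_score(ll_candidate_effect, ll_no_effect)
print(round(lod, 2), excluded(lod))
```

    A LOD of −2 corresponds to the specified model being 100 times less likely than the null, mirroring the conventional +3 threshold used for declaring linkage in the opposite direction.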

  12. A Landscape-based model for predicting Mycobacterium ulcerans infection (Buruli Ulcer disease) presence in Benin, West Africa.

    Science.gov (United States)

    Wagner, Tyler; Benbow, M Eric; Burns, Meghan; Johnson, R Christian; Merritt, Richard W; Qi, Jiaguo; Small, Pamela L C

    2008-03-01

    Mycobacterium ulcerans infection (Buruli ulcer [BU] disease) is an emerging tropical disease that causes severe morbidity in many communities, especially those in close proximity to aquatic environments. Research and control efforts are severely hampered by the paucity of data regarding the ecology of this disease; for example, the vectors and modes of transmission remain unknown. It is hypothesized that BU presence is associated with altered landscapes that perturb aquatic ecosystems; however, this has yet to be quantified over large spatial scales. We quantified relationships between land use/land cover (LULC) characteristics surrounding individual villages and BU presence in Benin, West Africa. We also examined the effects of other village-level characteristics which we hypothesized to affect BU presence, such as village distance to the nearest river. We found that as the percent urban land use in a 50-km buffer surrounding a village increased, the probability of BU presence decreased. Conversely, as the percent agricultural land use in a 20-km buffer surrounding a village increased, the probability of BU presence increased. The landscape-based models showed predictive ability for BU presence when applied to validation data sets from Benin and Ghana, West Africa. Our analyses suggest that relatively small amounts of urbanization are associated with a decrease in the probability of BU presence, and we hypothesize that this is due to the increased availability of pumped water in urban environments. Our models provide an initial approach to predicting the probability of BU presence over large spatial scales in Benin and Ghana, using readily available land use data.

  13. Characterizing Tumor Heterogeneity With Functional Imaging and Quantifying High-Risk Tumor Volume for Early Prediction of Treatment Outcome: Cervical Cancer as a Model

    Energy Technology Data Exchange (ETDEWEB)

    Mayr, Nina A., E-mail: Nina.Mayr@osumc.edu [Department of Radiation Oncology, Ohio State University, Columbus, OH (United States); Huang Zhibin [Department of Radiation Oncology and Department of Physics, East Carolina University, Greenville, NC (United States); Wang, Jian Z. [Department of Radiation Oncology, Ohio State University, Columbus, OH (United States); Lo, Simon S. [Department of Radiation Oncology, Case Western Reserve University, Cleveland, OH (United States); Fan, Joline M. [Department of Molecular Biology, Stanford University, Stanford, CA (United States); Grecula, John C. [Department of Radiation Oncology, Ohio State University, Columbus, OH (United States); Sammet, Steffen [Department of Radiology, University of Chicago, Chicago, IL (United States); Department of Radiology, Ohio State University, Columbus, OH (United States); Sammet, Christina L. [Department of Radiology, University of Chicago, Chicago, IL (United States); Jia Guang; Zhang Jun; Knopp, Michael V.; Yuh, William T.C. [Department of Radiology, Ohio State University, Columbus, OH (United States)

    2012-07-01

    Purpose: Treatment response in cancer has been monitored by measuring anatomic tumor volume (ATV) at various times without considering the inherent functional tumor heterogeneity known to critically influence ultimate treatment outcome: primary tumor control and survival. This study applied dynamic contrast-enhanced (DCE) functional MRI to characterize tumors' heterogeneous subregions with low DCE values, at risk for treatment failure, and to quantify the functional risk volume (FRV) for personalized early prediction of treatment outcome. Methods and Materials: DCE-MRI was performed in 102 stage IB2-IVA cervical cancer patients to assess tumor perfusion heterogeneity before and during radiation/chemotherapy. FRV represents the total volume of tumor voxels with critically low DCE signal intensity (<2.1 compared with the precontrast image, determined by previous receiver operating characteristic analysis). FRVs were correlated with treatment outcome (follow-up: 0.2-9.4, mean 6.8 years) and compared with ATVs (Mann-Whitney, Kaplan-Meier, and multivariate analyses). Results: Before and during therapy at 2-2.5 and 4-5 weeks of RT, FRVs >20, >13, and >5 cm³, respectively, significantly predicted unfavorable 6-year primary tumor control (p = 0.003, 7.3 × 10⁻⁸, and 2.0 × 10⁻⁸) and disease-specific survival (p = 1.9 × 10⁻⁴, 2.1 × 10⁻⁶, and 2.5 × 10⁻⁷, respectively). The FRVs were superior to the ATVs as early predictors of outcome, and the differentiating power of FRVs increased during treatment. Discussion: Our preliminary results suggest that functional tumor heterogeneity can be characterized by DCE-MRI to quantify FRV for predicting ultimate long-term treatment outcome. FRV is a novel functional imaging heterogeneity parameter, superior to ATV, and can be clinically translated for personalized early outcome prediction before or as early as 2
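
    The FRV definition above reduces to thresholding and counting: a voxel contributes to the FRV when its DCE signal-intensity ratio relative to the precontrast image falls below 2.1. A minimal sketch with hypothetical voxel values and voxel size (the study's actual voxel dimensions are not given in the abstract):

```python
def functional_risk_volume(dce_ratios, voxel_volume_cm3, threshold=2.1):
    """Total volume of tumor voxels whose DCE signal-intensity ratio
    (relative to the precontrast image) falls below the risk threshold."""
    n_low = sum(1 for v in dce_ratios if v < threshold)
    return n_low * voxel_volume_cm3

# Hypothetical DCE ratios for a small tumor region, with 0.01-cm3 voxels.
dce = [3.2, 1.8, 2.0, 2.6, 1.5, 2.2, 1.9, 3.0]
frv = functional_risk_volume(dce, voxel_volume_cm3=0.01)
print(frv)
```

    Comparing the resulting FRV against cutoffs such as >20 cm³ before therapy is then a simple inequality, which is what makes the parameter attractive for early, automated risk stratification.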

  14. Validation of a CFD Analysis Model for Predicting CANDU-6 Moderator Temperature Against SPEL Experiments

    International Nuclear Information System (INIS)

    Churl Yoon; Bo Wook Rhee; Byung-Joo Min

    2002-01-01

    A validation of a 3D CFD model for predicting local subcooling of the moderator in the vicinity of calandria tubes in a CANDU-6 reactor is performed. The small-scale moderator experiments performed at the Sheridan Park Experimental Laboratory (SPEL) in Ontario, Canada [1] are used for the validation. Also a comparison is made between previous CFD analyses based on 2DMOTH and PHOENICS, and the current analysis for the same SPEL experiment. For the current model, a set of grid structures for the same geometry as the experimental test section is generated, and the momentum, heat and continuity equations are solved by CFX-4.3, a CFD code developed by AEA Technology. The matrix of calandria tubes is simplified by the porous-media approach. The standard k-ε turbulence model associated with logarithmic wall treatment and the SIMPLEC algorithm on the body-fitted grid are used. Buoyancy effects are accounted for by the Boussinesq approximation. For the test conditions simulated in this study, the flow pattern identified is buoyancy-dominated flow, generated by the interaction between the dominant buoyancy force due to heating and the inertial momentum forces of the inlet jets. As a result, the current CFD moderator analysis model predicts the moderator temperature reasonably well, and the maximum error against the experimental data is kept below 2.0°C over the whole domain. The simulated velocity field matches the flow visualization of the SPEL experiments quite well. (authors)

  15. Predicted Aerodynamic Characteristics of a NACA 0015 Airfoil Having a 25% Integral-Type Trailing Edge Flap

    Science.gov (United States)

    Hassan, Ahmed

    1999-01-01

    Using the two-dimensional ARC2D Navier-Stokes flow solver, analyses were conducted to predict the sectional aerodynamic characteristics of the flapped NACA-0015 airfoil section. To facilitate the analyses and the generation of the computational grids, the airfoil with the deflected trailing edge flap was treated as a single-element airfoil with no allowance for a gap between the flap's leading edge and the base of the forward portion of the airfoil. Generation of the O-type computational grids was accomplished using the HYGRID hyperbolic grid generation program. Results were obtained for a wide range of Mach numbers, angles of attack and flap deflections. The predicted sectional lift, drag and pitching moment values for the airfoil were then cast in tabular format (C81) to be used in lifting-line helicopter rotor aerodynamic performance calculations. Similar tables were also generated for the flap. Mathematical expressions providing the variation of the sectional lift and pitching moment coefficients for the airfoil and for the flap as a function of flap chord length and flap deflection angle were derived within the context of thin airfoil theory. The airfoil's sectional drag coefficients were derived using the ARC2D drag predictions for equivalent two-dimensional flow conditions.

  16. Impact of modellers' decisions on hydrological a priori predictions

    Science.gov (United States)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

    In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the needed records for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how well they improved their predictions over three steps, with additional information provided before each step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models and their modelling experience differed largely. Hence, they encountered two problems: (i) simulating discharge for an ungauged catchment and (ii) using models that were developed for catchments that are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, or groundwater response and therefore had to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. 
For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of

  17. The predictive value of markers of fibrinolysis and endothelial dysfunction in the post thrombotic syndrome. A systematic review.

    Science.gov (United States)

    Rabinovich, Anat; Cohen, Jacqueline M; Kahn, Susan R

    2014-06-01

    The post thrombotic syndrome (PTS) develops in 20-40% of deep venous thrombosis (DVT) patients. Risk factors for PTS have not been well elucidated; their identification would facilitate individualised risk assessment for PTS. We conducted a systematic review to determine whether biomarkers of fibrinolysis or endothelial dysfunction can predict the risk for PTS among DVT patients. Studies were identified by searching the electronic databases PubMed, EMBASE, Scopus and Web of Science. We included studies that were published between 1990 and 2013, measured biomarker levels in adult DVT patients, and reported rates of PTS development. Fourteen studies were included: 11 investigated the association between D-dimer and PTS; three examined fibrinogen; two measured von Willebrand factor; one measured plasminogen activator inhibitor-1; one assessed ADAMTS-13 (A Disintegrin and Metalloprotease with Thrombospondin type 1 repeats); and one measured factor XIII activity. Studies varied with regard to inclusion criteria, definition of PTS, and the time point and method of biomarker measurement. We were unable to meta-analyse results due to marked clinical heterogeneity. Descriptively, a significant association with PTS was found for D-dimer in four studies and factor XIII in one study. Further prospective research is needed to elucidate whether these markers might be useful to predict PTS development.

  18. TRAC analyses for CCTF and SCTF tests and UPTF design/operation

    International Nuclear Information System (INIS)

    Williams, K.A.

    1983-01-01

    The 2D/3D Program is a multinational (Germany, Japan, and the United States) experimental and analytical nuclear reactor safety research program. The Los Alamos analysis effort is functioning as a vital part of the 2D/3D program. The CCTF and SCTF analyses have demonstrated that TRAC-PF1 can correctly predict multidimensional, nonequilibrium behavior in large-scale facilities prototypical of actual PWRs. Through these and future TRAC analyses the experimental findings can be related from facility to facility, and the results of this research program can be directly related to licensing concerns affecting actual PWRs.

  19. Chemical, laboratory analyses, physical and profile oceanographic data collected aboard the JACK FITZ in the Gulf of Mexico from 2010-06-12 to 2010-06-20 in response to the Deepwater Horizon Oil Spill event (NODC Accession 0069074)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Chemical, laboratory analyses, physical and profile oceanographic data were collected aboard the JACK FITZ in the Gulf of Mexico from 2010-06-12 to 2010-06-20 in...

  20. Connecting clinical and actuarial prediction with rule-based methods

    NARCIS (Netherlands)

    Fokkema, M.; Smits, N.; Kelderman, H.; Penninx, B.W.J.H.

    2015-01-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction

  1. The IEA Annex 20 Two-Dimensional Benchmark Test for CFD Predictions

    DEFF Research Database (Denmark)

    Nielsen, Peter V.; Rong, Li; Cortes, Ines Olmedo

    2010-01-01

    predictions both for isothermal flow and for nonisothermal flow. The benchmark is defined on a web page, which also shows about 50 different benchmark tests with studies of e.g. grid dependence, numerical schemes, different source codes, different turbulence models, RANS or LES, different turbulence levels...... in a supply opening, study of local emission and study of airborne chemical reactions. Therefore the web page is also a collection of information which describes the importance of the different elements of a CFD procedure. The benchmark is originally developed for test of two-dimensional flow, but the paper...

  2. A critical pressure based panel method for prediction of unsteady loading of marine propellers under cavitation

    International Nuclear Information System (INIS)

    Liu, P.; Bose, N.; Colbourne, B.

    2002-01-01

    A simple numerical procedure is established and implemented into a time domain panel method to predict hydrodynamic performance of marine propellers with sheet cavitation. This paper describes the numerical formulations and procedures to construct this integration. Predicted hydrodynamic loads were compared with both a previous numerical model and experimental measurements for a propeller in steady flow. The current method gives a substantial improvement in thrust and torque coefficient prediction over a previous numerical method at low cavitation numbers of less than 2.0, where severe cavitation occurs. Predicted pressure coefficient distributions are also presented. (author)

  3. Complete genome of a European hepatitis C virus subtype 1g isolate: phylogenetic and genetic analyses.

    Science.gov (United States)

    Bracho, Maria A; Saludes, Verónica; Martró, Elisa; Bargalló, Ana; González-Candelas, Fernando; Ausina, Vicent

    2008-06-05

    Hepatitis C virus isolates have been classified into six main genotypes and a variable number of subtypes within each genotype, mainly based on phylogenetic analysis. Analyses of the genetic relationship among genotypes and subtypes are more reliable when complete genome sequences (or at least the full coding region) are used; however, so far only 31 of 80 confirmed or proposed subtypes have at least one complete genome available. Of these, 20 correspond to confirmed subtypes of epidemic interest. We present and analyse the first complete genome sequence of an HCV subtype 1g isolate. Phylogenetic and genetic distance analyses reveal that HCV-1g is the most divergent subtype among the HCV-1 confirmed subtypes. Potential genomic recombination events between genotypes or subtype 1 genomes were ruled out. We demonstrate phylogenetic congruence of previously deposited partial sequences of HCV-1g with respect to our sequence. In light of this, we propose changing the current status of its subtype-specific designation from provisional to confirmed.

  4. A Traffic Prediction Algorithm for Street Lighting Control Efficiency

    Directory of Open Access Journals (Sweden)

    POPA Valentin

    2013-01-01

    This paper presents the development of a traffic prediction algorithm that can be integrated in a street lighting monitoring and control system. The prediction algorithm must enable the reduction of energy costs and improve energy efficiency by decreasing the light intensity depending on the traffic level. The algorithm analyses and processes the information received at the command center based on the traffic level at different moments. The data are collected by means of the Doppler vehicle detection sensors integrated within the system. Two methods are used for the implementation of the algorithm: a neural network and a k-NN (k-Nearest Neighbor) prediction algorithm. For 500 training cycles, the mean square error of the neural network is 9.766, and for 500,000 training cycles the error falls to 0.877. In the case of the k-NN algorithm, the error increases from 8.24 for k=5 to 12.27 for 50 neighbors. In terms of the root-mean-square error, the neural network therefore ensures the highest performance level and can be integrated in a street lighting control system.
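
    The k-NN predictor described above can be sketched in a few lines. The feature encoding (hour of day, weekday) and the traffic levels below are illustrative assumptions, not the paper's Doppler sensor data.

```python
# Minimal k-NN regression sketch for short-term traffic-level prediction.
# Features (hour, weekday) and traffic levels are invented for illustration.

def knn_predict(history, query, k=5):
    """Predict traffic level as the mean of the k nearest past observations."""
    # history: list of ((hour, weekday), traffic_level) pairs
    dist = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    nearest = sorted(history, key=lambda rec: dist(rec[0], query))[:k]
    return sum(level for _, level in nearest) / k

# Synthetic week: level 50 during the 07:00-09:00 rush hour, 10 otherwise.
history = [((h, d), 10 + 40 * (7 <= h <= 9)) for d in range(5) for h in range(24)]
print(knn_predict(history, (8, 2)))  # 50.0: all 5 nearest neighbours are rush-hour
```

    A real deployment would weight neighbours by distance and feed the same features to the neural network for comparison.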

  5. Use of a portable X-ray analyser for manganese and iron assay in minerals

    International Nuclear Information System (INIS)

    Taqueda, M.H.S.; Agudo, E.G.

    1975-01-01

    The use of a portable X-ray fluorescence analyser for manganese and iron assay in minerals is described. The concentration range in the measured samples was 30% to 60% for Mn and 2% to 20% for Fe. The excitation source used was a 3 mCi ¹⁰⁹Cd sealed source. Balanced filters were used for the X-ray analysis. The statistical study of results showed a precision better than 0.5% for Mn, but only 4% for iron. These figures can be improved either by increasing the counting time or by using a ²³⁸Pu source.
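
    The closing remark about improving precision by counting longer follows from Poisson counting statistics: the relative uncertainty of a count N is 1/√N, so precision scales with the square root of counting time. A minimal sketch with illustrative counts (not the paper's data):

```python
# Poisson counting statistics: relative 1-sigma uncertainty of a count N
# is 1/sqrt(N), so quadrupling the counting time halves the relative error.
# The count values are illustrative.

import math

def relative_error(count):
    """Relative 1-sigma uncertainty of a Poisson count."""
    return 1.0 / math.sqrt(count)

n = 10_000                    # counts accumulated in the Fe channel
print(relative_error(n))      # 0.01  -> 1% precision
print(relative_error(4 * n))  # 0.005 -> four times the counts, half the error
```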

  6. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8-bit parallel encoding analogue-to-digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64k count capacity. The prototype unit is in CAMAC format. (orig.)

  7. Prediction of cereal feed value by near infrared spectroscopy

    DEFF Research Database (Denmark)

    Jørgensen, Johannes Ravn

    … NIRS is therefore appropriate as a quick method for the determination of FEsv and FEso, since it is rapid (approximately 1 minute per measurement of a ground sample) and cheap. The aim is to develop a rapid method to analyse grain feed value. This will contribute to highlighting the opportunities … feed, a possible tool to assess the feed value of new varieties in variety testing, and a useful, cheap and rapid tool for cereal breeders. A bank of 1213 grain samples of wheat, triticale, barley and rye, with related chemical reference analyses describing the feed value, has been established … with the error in the chemical analysis. The error of NIRS predictions of feed value has been shown to be above that of the chemical measurement. The conclusion is that it has proved possible to predict the feed value of cereals with NIRS quickly and cheaply, but the prediction error with this method …

  8. HbA1c and the Prediction of Type 2 Diabetes in Children and Adults.

    Science.gov (United States)

    Vijayakumar, Pavithra; Nelson, Robert G; Hanson, Robert L; Knowler, William C; Sinha, Madhumita

    2017-01-01

    Long-term data validating glycated hemoglobin (HbA1c) in assessing the risk of type 2 diabetes in children are limited. HbA1c, fasting plasma glucose (FPG), and 2-h postload plasma glucose (2hPG) concentrations were measured in a longitudinal study of American Indians to determine their utility in predicting incident diabetes, all of which is thought to be type 2 in this population. Incident diabetes (FPG ≥126 mg/dL [7.0 mmol/L], 2hPG ≥200 mg/dL [11.1 mmol/L], HbA1c ≥6.5% [48 mmol/mol], or clinical diagnosis) was determined in 2,095 children without diabetes ages 10-19 years monitored through age 39, and in 2,005 adults ages 20-39 monitored through age 59. Areas under the receiver operating characteristic (ROC) curve for HbA1c, FPG, and 2hPG in predicting diabetes within 10 years were compared. During long-term follow-up of children and adolescents who did not initially have diabetes, the incidence rate of subsequent diabetes was fourfold as high in boys and more than sevenfold as high in girls in those with HbA1c ≥5.7% as in those with HbA1c ≤5.3%; these rate ratios are greater than those experienced by adults in the same HbA1c categories. Analyses of ROCs revealed no significant differences between HbA1c, FPG, and 2hPG in sensitivity and specificity for identifying children and adolescents who later developed diabetes. HbA1c is a useful predictor of diabetes risk in children and can be used to identify prediabetes in children with other type 2 diabetes risk factors, with the same predictive value as FPG and 2hPG. © 2017 by the American Diabetes Association.
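
    The ROC comparison rests on the AUC, which equals the probability that a randomly chosen case scores higher than a randomly chosen non-case (the Mann-Whitney formulation, with ties counted as 1/2). A minimal sketch with invented HbA1c values, not study data:

```python
# AUC as the probability that a case outranks a control, ties counting 1/2.
# The HbA1c percentages below are invented for illustration.

def auc(cases, controls):
    wins = sum((c > x) + 0.5 * (c == x) for c in cases for x in controls)
    return wins / (len(cases) * len(controls))

cases    = [6.1, 5.9, 5.7, 6.4]   # later developed diabetes
controls = [5.2, 5.4, 5.7, 5.0]   # did not
print(auc(cases, controls))       # 0.96875: 15.5 "wins" out of 16 pairs
```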

  9. Analyser-based phase contrast image reconstruction using geometrical optics

    International Nuclear Information System (INIS)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-01-01

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser

  10. Introduction: Analysing Emotion and Theorising Affect

    Directory of Open Access Journals (Sweden)

    Peta Tait

    2016-08-01

    This discussion introduces ideas of emotion and affect for a volume of articles demonstrating the scope of approaches used in their study within the humanities and creative arts. The volume offers multiple perspectives on emotion and affect within 20th-century and 21st-century texts, arts and organisations and their histories. The discussion explains how emotion encompasses the emotions, emotional feeling, sensation and mood and how these can be analysed particularly in relation to literature, art and performance. It briefly summarises concepts of affect theory within recent approaches before introducing the articles.

  11. A new approach to predicting environmental transfer of radionuclides to wildlife: A demonstration for freshwater fish and caesium

    Energy Technology Data Exchange (ETDEWEB)

    Beresford, N.A., E-mail: nab@ceh.ac.uk [NERC Centre for Ecology and Hydrology, Lancaster Environment Centre, Library Av. Bailrigg, Lancaster LA1 4AP (United Kingdom); Yankovich, T.L. [Saskatchewan Research Council, Environment and Forestry, 125, 15 Innovation Blvd., Saskatoon, SK S7N 2X8 (Canada); Wood, M.D. [School of Environment and Life Sciences, Room 323, Peel Building, University of Salford, Manchester, M5 4WT (United Kingdom); Fesenko, S. [International Atomic Energy Agency, 1400 Vienna (Austria); Andersson, P. [Strålsäkerhetsmyndigheten, Swedish Radiation Safety Authority, SE-171 16 Stockholm (Sweden); Muikku, M. [STUK, P.O. Box 14, 00881 Helsinki (Finland); Willey, N.J. [Centre for Research in Biosciences, University of the West of England, Coldharbour Lane, Frenchay, Bristol BS16 1QY (United Kingdom)

    2013-10-01

    The application of the concentration ratio (CR) to predict radionuclide activity concentrations in wildlife from those in soil or water has become the widely accepted approach for environmental assessments. Recently both the ICRP and IAEA have produced compilations of CR values for application in environmental assessment. However, the CR approach has many limitations, most notably, that the transfer of most radionuclides is largely determined by site-specific factors (e.g. water or soil chemistry). Furthermore, there are few, if any, CR values for many radionuclide-organism combinations. In this paper, we propose an alternative approach and, as an example, demonstrate and test this for caesium and freshwater fish. Using a Residual Maximum Likelihood (REML) mixed-model regression we analysed a dataset comprising 597 entries for 53 freshwater fish species from 67 sites. The REML analysis generated a mean value for each species on a common scale after REML adjustment taking account of the effect of the inter-site variation. Using an independent dataset, we subsequently test the hypothesis that the REML model outputs can be used to predict radionuclide, in this case radiocaesium, activity concentrations in unknown species from the results of a species which has been sampled at a specific site. The outputs of the REML analysis accurately predicted ¹³⁷Cs activity concentrations in different species of fish from 27 Finnish lakes; these data had not been used in our initial analyses. We recommend that this alternative approach be further investigated for other radionuclides and ecosystems. - Highlights: • An alternative approach to estimating radionuclide transfer to wildlife is presented. • Analysed a dataset comprising 53 freshwater fish species collected from 67 sites. • Residual Maximum Likelihood mixed model regression is used. • Model output takes account of the effect of inter-site variation. • Successfully predicted ¹³⁷Cs concentrations in
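
    A full REML fit requires a mixed-model solver, but the core idea, species effects estimated on a common scale after removing inter-site variation and then used to predict an unsampled species from a sampled one, can be sketched with a simple additive model on log concentrations. This is a stand-in for REML, not REML itself, and all values below are invented for illustration.

```python
# Sketch of the idea behind the REML approach (NOT REML itself): model
# log10 activity concentration as site effect + species effect, estimate
# species effects from site-centred data, then predict an unsampled species
# at a site from a sampled one. All values are invented.

site_effects    = {"lake1": 2.0, "lake2": 3.0, "lake3": 2.5}  # log10 Cs-137
species_effects = {"perch": 0.4, "pike": 0.7}
obs = {(lake, sp): s + e
       for lake, s in site_effects.items()
       for sp, e in species_effects.items()}

def site_means(obs):
    sums, counts = {}, {}
    for (lake, _), v in obs.items():
        sums[lake] = sums.get(lake, 0.0) + v
        counts[lake] = counts.get(lake, 0) + 1
    return {lake: sums[lake] / counts[lake] for lake in sums}

def estimated_species_effect(obs, species):
    """Mean of the site-centred observations for one species (common scale)."""
    means = site_means(obs)
    centred = [v - means[lake] for (lake, sp), v in obs.items() if sp == species]
    return sum(centred) / len(centred)

# Predict pike at lake1 from the perch measurement there.
pred = (obs[("lake1", "perch")]
        - estimated_species_effect(obs, "perch")
        + estimated_species_effect(obs, "pike"))
print(pred)  # recovers obs[("lake1", "pike")] up to floating-point error
```

    In practice a mixed-model package (e.g. statsmodels MixedLM or R's lme4, fitted with REML) would estimate these effects with proper weighting and uncertainty.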

  12. Predicting violence in veterans with posttraumatic stress disorder

    Directory of Open Access Journals (Sweden)

    Jovanović Aleksandar A.

    2009-01-01

    Background/Aim. Frequent expression of negative affects, hostility and violent behavior in individuals suffering from posttraumatic stress disorder (PTSD) was recognized long ago, and has been retrospectively well documented in war veterans with PTSD, who were shown to have an elevated risk for violent behavior when compared to both veterans without PTSD and other psychiatric patients. The aim of this study was to evaluate the accuracy of clinical prediction of violence in combat veterans suffering from PTSD. Methods. The subjects of this study, 104 male combat veterans with PTSD, were assessed with the Historical, Clinical and Risk Management-20 (HCR-20), a 20-item clinician-rated instrument for assessing the risk for violence, and their acts of violence during a one-year follow-up period were registered based on bimonthly check-up interviews. Results. Our findings showed that the HCR-20, as an actuarial measure, had good internal consistency reliability (α = 0.82), excellent interrater reliability (intraclass correlation, ICC = 0.85), as well as excellent predictive validity for acts of any violence, non-physical violence or physical violence in the follow-up period (AUC = 0.82-0.86). The HCR-20 also had good interrater reliability (Cohen's kappa = 0.74) and acceptable predictive accuracy for each outcome criterion (AUC = 0.73-0.79). Conclusion. The results of this research confirm that the HCR-20 may also be applied in the prediction of violent behavior in the population of patients suffering from PTSD, with reliability and validity comparable with the results of previous studies where this instrument was administered to other populations of psychiatric patients.
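
    Cohen's kappa, used above for interrater reliability, corrects the observed agreement between two raters for the agreement expected by chance. A minimal sketch for binary "violent / not violent" calls; the 2x2 counts are invented, not the study's data:

```python
# Cohen's kappa for two raters' binary risk calls:
# kappa = (observed agreement - chance agreement) / (1 - chance agreement).
# The 2x2 counts are invented for illustration.

def cohens_kappa(a, b, c, d):
    """a = both yes, b = rater1 yes/rater2 no, c = no/yes, d = both no."""
    n = a + b + c + d
    p_obs = (a + d) / n                     # observed agreement
    p_yes = ((a + b) / n) * ((a + c) / n)   # chance both say yes
    p_no  = ((c + d) / n) * ((b + d) / n)   # chance both say no
    p_chance = p_yes + p_no
    return (p_obs - p_chance) / (1 - p_chance)

print(round(cohens_kappa(20, 5, 10, 15), 3))  # 0.4: moderate agreement
```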

  13. Absence of back disorders in adults and work-related predictive factors in a 5-year perspective.

    Science.gov (United States)

    Reigo, T; Tropp, H; Timpka, T

    2001-06-01

    Factors important for avoiding back disorders in different age-groups have seldom been compared and studied over time. We therefore set out to study age-related differences in socio-economic and work-related factors associated with the absence of back disorders in a 5-year comparative cohort study using a mailed questionnaire. Two subgroups (aged 25-34 and 54-59 years) derived from a representative sample of the Swedish population were followed at baseline, 1 year and 5 years. Questions were asked about the duration of back pain episodes, relapses, work changes and work satisfaction. A work adaptability, partnership, growth, affection, resolve (APGAR) score was included in the final questionnaire. Multivariate logistic regression was used to identify factors predicting the absence of back disorders. Absence of physically heavy work predicted an absence of back disorders [odds ratio (OR), 2.86; 95% confidence interval (CI), 1.3-6.3] in the older group. In the younger age-group, the absence of stressful work predicted absence of back disorders (OR, 2.0; 95% CI, 1.1-3.6). Thirty-seven per cent of the younger age-group and 43% of the older age-group did not experience any back pain episodes during the study period. The exploratory work APGAR scores indicated that back disorders were only associated with lower work satisfaction in the older group. The analyses point to the importance of avoiding perceived psychological stress among younger workers and perceived physically heavy work among older workers in order to prevent back disorders. The results suggest a need for workplace programmes tailored to the age of the employees concerned.
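
    Odds ratios like those reported above come from exponentiating logistic-regression coefficients, with the 95% CI obtained as exp(b ± 1.96·SE) on the log-odds scale. The coefficient and standard error below are invented to roughly reproduce the first quoted result, not taken from the paper:

```python
# Odds ratio with 95% CI from a logistic-regression coefficient:
# OR = exp(b), CI = exp(b - 1.96*SE) .. exp(b + 1.96*SE).
# b and SE are invented to approximate OR 2.86 (95% CI 1.3-6.3).

import math

def odds_ratio_ci(b, se, z=1.96):
    return math.exp(b), math.exp(b - z * se), math.exp(b + z * se)

or_, lo, hi = odds_ratio_ci(b=1.051, se=0.403)
print(f"OR {or_:.2f} (95% CI {lo:.1f}-{hi:.1f})")  # OR 2.86 (95% CI 1.3-6.3)
```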

  14. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  15. Approximate techniques for predicting size effects on cleavage fracture toughness (Jc)

    International Nuclear Information System (INIS)

    Kirk, M.T.; Dodds, R.H. Jr.

    1993-07-01

    This investigation examines the ability of an elastic T-stress analysis coupled with a modified boundary layer (MBL) solution to predict stresses ahead of a crack tip in a variety of planar geometries. The approximate stresses are used as input to estimate the effective driving force for cleavage fracture (J0) using the micromechanically based approach introduced by Dodds and Anderson. Finite element analyses for a wide variety of planar cracked geometries are conducted which have elastic biaxiality parameters (β) ranging from -0.99 (very low constraint) to +2.96 (very high constraint). The magnitude and sign of β indicate the rate at which crack-tip constraint changes with increasing applied load. All results pertain to a moderately strain hardening material (strain hardening exponent (η) of 10). These analyses suggest that β is an effective indicator of both the accuracy of T-MBL estimates of J0 and of applicability limits on evolving fracture analysis methodologies (i.e. T-MBL, J-Q, and J/J0). Specifically, when |β| > 0.4 these analyses show that the T-MBL approximation of J0 is accurate to within 20% of a detailed finite-element analysis. As 'structural type' configurations, i.e. shallow cracks in tension, generally have |β| > 0.4, it appears that only an elastic analysis may be needed to determine reasonably accurate J0 values for structural conditions.
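
    The biaxiality parameter used above is commonly defined (after Leevers and Radon) from the elastic T-stress, which is why only an elastic solution is needed to evaluate it; the abstract does not spell the definition out, so this particular form is an assumption here:

```latex
% Elastic biaxiality parameter: T-stress normalised by the stress-intensity factor
\beta = \frac{T\sqrt{\pi a}}{K_I}
% a   : crack length
% K_I : mode-I stress-intensity factor
% T   : elastic T-stress (second, non-singular term of the Williams expansion)
```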

  16. Methodological issues in radiation dose-volume outcome analyses: Summary of a joint AAPM/NIH workshop

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Niemierko, Andrzej; Herbert, Donald; Yan, Di; Jackson, Andrew; Ten Haken, Randall K.; Langer, Mark; Sapareto, Steve

    2002-01-01

    This report represents a summary of presentations at a joint workshop of the National Institutes of Health and the American Association of Physicists in Medicine (AAPM). Current methodological issues in dose-volume modeling are addressed here from several different perspectives. Areas of emphasis include (a) basic modeling issues including the equivalent uniform dose framework and the bootstrap method, (b) issues in the valid use of statistics, including the need for meta-analysis, (c) issues in dealing with organ deformation and its effects on treatment response, (d) evidence for volume effects for rectal complications, (e) the use of volume effect data in liver and lung as a basis for dose escalation studies, and (f) implications of uncertainties in volume effect knowledge on optimized treatment planning. Taken together, these approaches to studying volume effects describe many implications for the development and use of this information in radiation oncology practice. Areas of significant interest for further research include the meta-analysis of clinical data; interinstitutional pooled data analyses of volume effects; analyses of the uncertainties in outcome prediction models; minimal-parameter-number outcome models for ranking treatment plans (e.g., equivalent uniform dose); incorporation of the effect of motion in the outcome prediction; dose-escalation/isorisk protocols based on outcome models; the use of functional imaging to study radio-response; and the need for further small animal tumor control probability/normal tissue complication probability studies.

  17. Size matters. The width and location of a ureteral stone accurately predict the chance of spontaneous passage

    Energy Technology Data Exchange (ETDEWEB)

    Jendeberg, Johan; Geijer, Haakan; Alshamari, Muhammed; Liden, Mats [Oerebro University Hospital, Department of Radiology, Faculty of Medicine and Health, Oerebro (Sweden); Cierzniak, Bartosz [Oerebro University, Department of Surgery, Faculty of Medicine and Health, Oerebro (Sweden)

    2017-11-15

    To determine how to most accurately predict the chance of spontaneous passage of a ureteral stone using information in the diagnostic non-enhanced computed tomography (NECT) and to create predictive models with smaller stone size intervals than previously possible. Retrospectively, 392 consecutive patients with a ureteric stone on NECT were included. Three radiologists independently measured the stone size. Stone location, side, hydronephrosis, CRP, medical expulsion therapy (MET) and all follow-up radiology until stone expulsion or 26 weeks were recorded. Logistic regressions were performed with spontaneous stone passage in 4 weeks and 20 weeks as the dependent variable. The spontaneous passage rate in 20 weeks was 312 out of 392 stones: 98% for 0-2 mm, 98% for 3 mm, 81% for 4 mm, 65% for 5 mm, 33% for 6 mm and 9% for ≥6.5 mm wide stones. The stone size and location predicted spontaneous ureteric stone passage. The side and the grade of hydronephrosis only predicted stone passage in specific subgroups. Spontaneous passage of a ureteral stone can be predicted with high accuracy with the information available in the NECT. We present a prediction method based on stone size and location. (orig.)

  18. Numerical analyses of a water pool under loadings caused by a condensation induced water hammer

    Energy Technology Data Exchange (ETDEWEB)

    Timperi, A.; Paettikangas, T.; Calonius, K.; Tuunanen, J.; Poikolainen, J.; Saarenheimo, A. [VTT Industrial Systems (Finland)

    2004-03-01

    Three-dimensional simulations of a rapidly condensing steam bubble in a water pool have been performed by using the commercial computational fluid dynamics (CFD) code Star-CD. The condensing bubble was modelled by using a mass sink in a single-phase calculation. The pressure load on the wall of the pool was determined and transferred to the structural analysis code ABAQUS. The analyses were done for a test pool at Lappeenranta University of Technology. The structural integrity of the pool during steam experiments was investigated by assuming as a test load the rapid condensation of a steam bubble with a diameter of 20 cm. The mass sink for modelling the collapse of the bubble was determined from the potential theory of incompressible fluid. The rapid condensation of the bubble within 25 ms initiated a strong condensation water hammer. The maximum amplitude of the pressure load on the pool wall was approximately 300 kPa. The loads caused by the high compression waves lasted only about 0.4 ms. The loadings caused by larger bubbles or a more rapid collapse could not be calculated with the present method. (au)

  19. Prediction model to predict critical weight loss in patients with head and neck cancer during (chemo)radiotherapy.

    Science.gov (United States)

    Langius, Jacqueline A E; Twisk, Jos; Kampman, Martine; Doornaert, Patricia; Kramer, Mark H H; Weijs, Peter J M; Leemans, C René

    2016-01-01

    Patients with head and neck cancer (HNC) frequently encounter weight loss with multiple negative outcomes as a consequence. Adequate treatment is best achieved by early identification of patients at risk for critical weight loss. The objective of this study was to detect predictive factors for critical weight loss in patients with HNC receiving (chemo)radiotherapy ((C)RT). In this cohort study, 910 patients with HNC were included receiving RT (±surgery/concurrent chemotherapy) with curative intent. Body weight was measured at the start and end of (C)RT. Logistic regression and classification and regression tree (CART) analyses were used to analyse predictive factors for critical weight loss (defined as >5%) during (C)RT. Possible predictors included gender, age, WHO performance status, tumour location, TNM classification, treatment modality, RT technique (three-dimensional conformal RT (3D-RT) vs intensity-modulated RT (IMRT)), total dose on the primary tumour and RT on the elective or macroscopic lymph nodes. At the end of (C)RT, mean weight loss was 5.1±4.9%. Fifty percent of patients had critical weight loss during (C)RT. The main predictors for critical weight loss during (C)RT by both logistic and CART analyses were RT on the lymph nodes, higher RT dose on the primary tumour, receiving 3D-RT instead of IMRT, and younger age. Critical weight loss during (C)RT was prevalent in half of HNC patients. To predict critical weight loss, a practical prediction tree for adequate nutritional advice was developed, including the risk factors RT to the neck, higher RT dose, 3D-RT, and younger age. Copyright © 2015 Elsevier Ltd. All rights reserved.
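
    A CART model of the kind fitted here amounts to a sequence of binary splits. The toy tree below mirrors the reported risk factors (nodal irradiation, higher dose, 3D-RT, younger age), but the split order and the dose and age cut-offs are hypothetical, not the paper's fitted tree:

```python
# Toy decision tree in the spirit of the CART analysis: flag patients at
# risk of critical weight loss (>5%) during (C)RT. Split order and the
# 60 Gy / 60 yr cut-offs are hypothetical, not the fitted model.

def high_risk(rt_on_lymph_nodes, rt_dose_gy, technique, age):
    if rt_on_lymph_nodes:
        return True                           # strongest reported predictor
    if rt_dose_gy >= 60 and technique == "3D-RT":
        return True                           # high dose without IMRT sparing
    return age < 60 and rt_dose_gy >= 60      # hypothetical age cut-off

print(high_risk(True, 50, "IMRT", 70))    # True: nodal irradiation
print(high_risk(False, 66, "3D-RT", 70))  # True: high dose with 3D-RT
print(high_risk(False, 50, "IMRT", 70))   # False
```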

  20. The development of a model to predict the effects of worker and task factors on foot placements in manual material handling tasks.

    Science.gov (United States)

    Wagner, David W; Reed, Matthew P; Chaffin, Don B

    2010-11-01

    Accurate prediction of foot placements in relation to hand locations during manual materials handling tasks is critical for prospective biomechanical analysis. To address this need, the effects of lifting task conditions and anthropometric variables on foot placements were studied in a laboratory experiment. In total, 20 men and women performed two-handed object transfers that required them to walk to a shelf, lift an object from the shelf at waist height and carry the object to a variety of locations. Five different changes in the direction of progression following the object pickup were used, ranging from 45° to 180° relative to the approach direction. Object weights of 1.0 kg, 4.5 kg and 13.6 kg were used. Whole-body motions were recorded using a 3-D optical retro-reflective marker-based camera system. A new parametric system for describing foot placements, the Quantitative Transition Classification System, was developed to facilitate the parameterisation of foot placement data. Foot placements chosen by the subjects during the transfer tasks appeared to facilitate a change in the whole-body direction of progression, in addition to aiding in performing the lift. Further analysis revealed that five different stepping behaviours accounted for 71% of the stepping patterns observed. More specifically, the most frequently observed behaviour revealed that the orientation of the lead foot during the actual lifting task was primarily affected by the amount of turn angle required after the lift (R² = 0.53). One surprising result was that the object mass (scaled by participant body mass) was not found to significantly affect any of the individual step placement parameters. Regression models were developed to predict the most prevalent step placements and are included in this paper to facilitate more accurate human motion simulations and ergonomics analyses of manual material lifting tasks. STATEMENT OF RELEVANCE: This study proposes a method for parameterising the steps

  1. A semi-supervised learning approach for RNA secondary structure prediction.

    Science.gov (United States)

    Yonemoto, Haruka; Asai, Kiyoshi; Hamada, Michiaki

    2015-08-01

    RNA secondary structure prediction is a key technology in RNA bioinformatics. Most algorithms for RNA secondary structure prediction use probabilistic models, in which the model parameters are trained with reliable RNA secondary structures. Because of the difficulty of determining RNA secondary structures by experimental procedures, such as NMR or X-ray crystal structural analyses, there are still many RNA sequences that could be useful for training whose secondary structures have not been experimentally determined. In this paper, we introduce a novel semi-supervised learning approach for training parameters in a probabilistic model of RNA secondary structures in which we employ not only RNA sequences with annotated secondary structures but also ones with unknown secondary structures. Our model is based on a hybrid of generative (stochastic context-free grammars) and discriminative models (conditional random fields) that has been successfully applied to natural language processing. Computational experiments indicate that the accuracy of secondary structure prediction is improved by incorporating RNA sequences with unknown secondary structures into training. To our knowledge, this is the first study of a semi-supervised learning approach for RNA secondary structure prediction. This technique will be useful when the number of reliable structures is limited. Copyright © 2015 Elsevier Ltd. All rights reserved.
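
The hybrid SCFG/CRF model itself is not reproduced in this record. As background only, the classic Nussinov dynamic program, a common baseline in RNA secondary structure prediction, maximizes the number of nested base pairs; the minimum hairpin loop length of 3 used below is a standard convention, not a detail from this paper:

```python
def nussinov_pairs(seq, min_loop=3):
    """Maximum number of nested base pairs (Nussinov DP baseline)."""
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
             ("G", "U"), ("U", "G")}  # Watson-Crick plus wobble
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                # j left unpaired
            for k in range(i, j - min_loop):   # j paired with k
                if (seq[k], seq[j]) in pairs:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1] if n else 0
```

The double loop fills the table by increasing subsequence length, so every subproblem is ready when needed.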

  2. [Refractory CD20-positive peripheral T-cell lymphoma showing loss of CD20 expression after rituximab therapy and gain of CD20 expression after administration of vorinostat and gemcitabine].

    Science.gov (United States)

    Teshima, Kazuaki; Ohyagi, Hideaki; Kume, Masaaki; Takahashi, Satsuki; Saito, Masahiro; Takahashi, Naoto

    A 79-year-old male patient presented with systemic lymphadenopathy. A lymph node biopsy revealed effacement of the normal nodal architecture with diffuse proliferation of medium-sized atypical lymphoid cells. Southern blot analyses demonstrated rearrangement of the T-cell receptor gene but not the immunoglobulin heavy chain gene. He was diagnosed with CD20-positive peripheral T-cell lymphoma (PTCL), NOS. Although he achieved partial remission after six cycles of R-CHOP, relapse occurred after 2 months. CD20-negative conversion was confirmed in the lymph node, which was positive for CCR4, and the skin at the time of relapse. The patient received the GDP regimen as salvage therapy with the addition of vorinostat for skin involvement; however, he failed to respond, and the disease systemically progressed. Furthermore, he also exhibited progression in the skin after stopping vorinostat due to hematologic toxicity. A lymph node biopsy at progression revealed CD20 re-expression by immunohistochemistry. At progression, the patient received mogamulizumab but failed to respond, and he died owing to disease progression 8 months after relapse. In this case, we demonstrated CD20-negative conversion following rituximab and CD20-positive reversion after using vorinostat and gemcitabine.

  3. Prediction of arterial oxygen partial pressure after changes in FIO₂: validation and clinical application of a novel formula.

    Science.gov (United States)

    Al-Otaibi, H M; Hardman, J G

    2011-11-01

    Existing methods allow prediction of Pa(O₂) during adjustment of Fi(O₂). However, these are cumbersome and lack sufficient accuracy for use in the clinical setting. The present studies aim to extend the validity of a novel formula designed to predict Pa(O₂) during adjustment of Fi(O₂) and to compare it with the current methods. Sixty-seven new data sets were collected from 46 randomly selected, mechanically ventilated patients. Each data set consisted of two subsets (before and 20 min after Fi(O₂) adjustment) and contained ventilator settings, pH, and arterial blood gas values. We compared the accuracy of Pa(O₂) prediction using a new formula (which utilizes only the pre-adjustment Pa(O₂) and pre- and post-adjustment Fi(O₂)) with prediction using assumptions of constant Pa(O₂)/Fi(O₂) or constant Pa(O₂)/PA(O₂). Subsequently, 20 clinicians predicted Pa(O₂) using the new formula and using Nunn's isoshunt diagram. The accuracy of the clinicians' predictions was examined. The 95% limits of agreement (LA(95%)) between predicted and measured Pa(O₂) in the patient group were: new formula 0.11 (2.0) kPa, Pa(O₂)/Fi(O₂) -1.9 (4.4) kPa, and Pa(O₂)/PA(O₂) -1.0 (3.6) kPa. The LA(95%) of clinicians' predictions of Pa(O₂) were 0.56 (3.6) kPa (new formula) and -2.7 (6.4) kPa (isoshunt diagram). The new formula's prediction of changes in Pa(O₂) is acceptably accurate and reliable and better than any other existing method. Its use by clinicians appears to improve accuracy over the most popular existing method. The simplicity of the new method may allow its regular use in the critical care setting.
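
The record does not give the new formula itself; what can be sketched are the two comparator assumptions it was tested against, taking the second to be the arterial-to-alveolar oxygen tension ratio. The sketch below works in kPa and uses the simplified alveolar gas equation with conventional constants (barometric pressure 101.3 kPa, water vapor 6.3 kPa, respiratory quotient 0.8), all of which are assumptions for illustration:

```python
def pao2_constant_ratio(pao2_old_kpa, fio2_old, fio2_new):
    """Baseline 1: assume the Pa(O2)/Fi(O2) ratio is unchanged."""
    return pao2_old_kpa * fio2_new / fio2_old

def alveolar_po2_kpa(fio2, paco2_kpa, pb_kpa=101.3, ph2o_kpa=6.3, rq=0.8):
    """Simplified alveolar gas equation: PA(O2) = FiO2*(Pb - PH2O) - PaCO2/RQ."""
    return fio2 * (pb_kpa - ph2o_kpa) - paco2_kpa / rq

def pao2_constant_a_ratio(pao2_old_kpa, paco2_kpa, fio2_old, fio2_new):
    """Baseline 2: assume the Pa(O2)/PA(O2) (arterial/alveolar) ratio is unchanged."""
    ratio = pao2_old_kpa / alveolar_po2_kpa(fio2_old, paco2_kpa)
    return ratio * alveolar_po2_kpa(fio2_new, paco2_kpa)
```

For example, raising Fi(O₂) from 0.4 to 0.6 with Pa(O₂) = 10 kPa gives 15 kPa under the first assumption and roughly 16 kPa under the second (with Pa(CO₂) = 5 kPa).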

  4. Cost-of-illness studies and cost-effectiveness analyses in anxiety disorders: a systematic review.

    Science.gov (United States)

    Konnopka, Alexander; Leichsenring, Falk; Leibing, Eric; König, Hans-Helmut

    2009-04-01

    To review cost-of-illness studies (COI) and cost-effectiveness analyses (CEA) conducted for anxiety disorders. Based on a database search in Pubmed, PsychINFO and NHS EED, studies were classified according to various criteria. Cost data were inflated and converted to 2005 US-$ purchasing power parities (PPP). We finally identified 20 COI and 11 CEA, of which most concentrated on panic disorder (PD) and generalized anxiety disorder (GAD). Differing inclusion of cost categories limited comparability of COI. PD and GAD tended to show higher direct costs per case, but lower direct cost per inhabitant, than social and specific phobias. Different measures of effectiveness severely limited comparability of CEA. Overall, the CEA analysed 26 therapeutic or interventional strategies, mostly compared to standard treatment, 8 of them resulting in better effectiveness and lower costs than the comparator. Anxiety disorders cause considerable costs. More research on phobias, more standardised inclusion of cost categories in COI and a wider use of comparable effectiveness measures (like QALYs) in CEA are needed.

  5. Ultimate compression after impact load prediction in graphite/epoxy coupons using neural network and multivariate statistical analyses

    Science.gov (United States)

    Gregoire, Alexandre David

    2011-07-01

    The goal of this research was to accurately predict the ultimate compressive load of impact-damaged graphite/epoxy coupons using a Kohonen self-organizing map (SOM) neural network and multivariate statistical regression analysis (MSRA). An optimized use of these data treatment tools allowed the generation of a simple, physically understandable equation that predicts the ultimate failure load of an impact-damaged coupon based uniquely on the acoustic emissions it emits at low proof loads. Acoustic emission (AE) data were collected using two 150 kHz resonant transducers which detected and recorded the AE activity given off during compression to failure of thirty-four impacted 24-ply bidirectional woven cloth laminate graphite/epoxy coupons. The AE quantification parameters duration, energy and amplitude for each AE hit were input to the Kohonen self-organizing map (SOM) neural network to accurately classify the material failure mechanisms present in the low proof load data. The number of failure mechanisms from the first 30% of the loading for twenty-four coupons was used to generate a linear prediction equation which yielded a worst-case ultimate load prediction error of 16.17%, just outside of the +/-15% B-basis allowables, which was the goal for this research. Particular emphasis was placed upon the noise removal process, which was largely responsible for the accuracy of the results.

  6. Self-Rated Activity Levels and Longevity: Evidence from a 20 Year Longitudinal Study

    Science.gov (United States)

    Mullee, Mark A.; Coleman, Peter G.; Briggs, Roger S. J.; Stevenson, James E.; Turnbull, Joanne C.

    2008-01-01

    The study reports on factors predicting the longevity of 328 people over the age of 65 drawn from an English city and followed over 20 years. Both the reported activities score and the individual's comparative evaluation of their own level of activity independently reduced the risk of death, even when health and cognitive status were taken into…

  7. Kernel-based whole-genome prediction of complex traits: a review.

    Science.gov (United States)

    Morota, Gota; Gianola, Daniel

    2014-01-01

    Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics.
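
As an illustration of the kernel regression machinery the review surveys (not code from the review itself), a Gaussian-kernel ridge regression can be fitted on synthetic marker data; the marker coding, bandwidth, penalty and phenotype model below are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic genotypes (100 individuals x 200 markers coded 0/1/2) and a
# phenotype with a non-additive (interaction) term; illustrative only.
X = rng.integers(0, 3, size=(100, 200)).astype(float)
y = X[:, 0] - X[:, 1] + 0.5 * X[:, 2] * X[:, 3] + rng.normal(0, 0.5, 100)

def gaussian_kernel(A, B, bandwidth):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / bandwidth)

# Kernel ridge regression: alpha = (K + lambda*I)^-1 y on a training split
train, test = np.arange(80), np.arange(80, 100)
K = gaussian_kernel(X[train], X[train], bandwidth=200.0)
lam = 1.0  # ridge penalty
alpha = np.linalg.solve(K + lam * np.eye(len(train)), y[train])

# Predict held-out phenotypes and measure predictive correlation
y_hat = gaussian_kernel(X[test], X[train], bandwidth=200.0) @ alpha
corr = np.corrcoef(y_hat, y[test])[0, 1]
```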

  8. Kernel-based whole-genome prediction of complex traits: a review

    Directory of Open Access Journals (Sweden)

    Gota Morota

    2014-10-01

    Full Text Available Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics.

  9. Integration of meanline and one-dimensional methods for prediction of pulsating performance of a turbocharger turbine

    International Nuclear Information System (INIS)

    Chiong, M.S.; Rajoo, S.; Romagnoli, A.; Costall, A.W.; Martinez-Botas, R.F.

    2014-01-01

    Highlights: • Unsteady turbine performance prediction by integrating the 1-D and meanline models. • The optimum discretization length/diameter ratio is identified. • No improvement is gained by increasing the number of rotor entries. • The predicted instantaneous mass flow and output power are analysed in detail. - Abstract: Stringent emission regulations are driving engine manufacturers to increase investment in enabling technologies to achieve better specific fuel consumption, thermal efficiency and, most importantly, carbon reduction. Engine downsizing is seen as a key enabler to successfully achieve all of these requirements. Boosting through turbocharging is widely regarded as one of the most promising technologies for engine downsizing. However, the wide range of engine speeds and loads requires enhanced quality of engine-turbocharger matching, compared to the conventional approach which considers only the full load condition. Thus, development of computational models capable of predicting the unsteady behaviour of a turbocharger turbine is crucial to the overall matching process. A purely one-dimensional (1D) turbine model is capable of good unsteady swallowing capacity predictions; however, it has not been fully exploited to predict instantaneous turbine power. On the contrary, meanline (zero-dimensional) models are regarded as a good tool to determine turbine efficiency in steady state, but they do not include any information about the pressure wave action occurring within the turbine. This paper explores an alternative methodology to predict instantaneous turbine power and swallowing capacity by integrating one-dimensional and meanline models. A single entry mixed-flow turbine is modelled using a 1D gas dynamic code to solve the unsteady flow state in the volute, consequently used as the input for a meanline model to evaluate the instantaneous turbine power. The key to the effectiveness of this methodology lies in the synchronisation of the flow

  10. [Evaluation of thermal comfort in a student population: predictive value of an integrated index (Fanger's predicted mean vote)].

    Science.gov (United States)

    Catenacci, G; Terzi, R; Marcaletti, G; Tringali, S

    1989-01-01

    Practical applications and predictive values of a thermal comfort index (Fanger's PMV) were verified on a sample school population (1236 subjects) by studying the relationships between thermal sensations (subjective analysis), determined by means of an individual questionnaire, and the values of the thermal comfort index (objective analysis) obtained by calculating the PMV index individually in the subjects under study. In homogeneous conditions of metabolic expenditure rate and thermal impedance from clothing, significant differences were found between the two kinds of analyses. At 22 degrees C mean radiant and operative temperature, the PMV values averaged 0 and the percentage of subjects who experienced thermal comfort did not exceed 60%. The high proportion of subjects who were dissatisfied with their environmental thermal conditions confirms the doubts regarding the use of the PMV index as a predictive indicator of thermal comfort, especially considering that the negative answers were neither homogeneous nor attributable to the small thermal fluctuations (less than 0.5 degree C) measured in the classrooms.

  11. Web 2.0 as a megatrend in eGovernment: An empirical analysis of its preconditions and outcomes

    OpenAIRE

    Plomp, M.G.A.; Te Velde, R.A.

    2011-01-01

    In recent years, Web 2.0 and social media generated such enormous hype that today the popularity of these terms already seems to be declining. However, scientists and practitioners in (e)Government are still occupied by important questions, such as: what is Web 2.0, and should we do something with it? In this paper, we give an extensive description of what Web 2.0 entails and to what degree it is really something ‘new’. We then analyse the critical preconditions and main outcomes of workin...

  12. A Particle Swarm Optimization-Based Approach with Local Search for Predicting Protein Folding.

    Science.gov (United States)

    Yang, Cheng-Hong; Lin, Yu-Shiun; Chuang, Li-Yeh; Chang, Hsueh-Wei

    2017-10-01

    The hydrophobic-polar (HP) model is commonly used for predicting protein folding structures and hydrophobic interactions. This study developed a particle swarm optimization (PSO)-based algorithm combined with local search algorithms; specifically, the high exploration PSO (HEPSO) algorithm (which can execute global search processes) was combined with three local search algorithms (hill-climbing algorithm, greedy algorithm, and Tabu table), yielding the proposed HE-L-PSO algorithm. By using 20 known protein structures, we evaluated the performance of the HE-L-PSO algorithm in predicting protein folding in the HP model. The proposed HE-L-PSO algorithm exhibited favorable performance in predicting both short and long amino acid sequences with high reproducibility and stability, compared with seven reported algorithms. The HE-L-PSO algorithm yielded optimal solutions for all predicted protein folding structures. All HE-L-PSO-predicted protein folding structures possessed a hydrophobic core that is similar to normal protein folding.
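
The HE-L-PSO algorithm itself is not specified in this record, but the objective it optimizes in the HP model is simple to state: energy decreases by one for every pair of hydrophobic (H) residues that are lattice neighbours without being consecutive in the chain. A minimal 2-D sketch, with an invented sequence and conformation:

```python
def hp_energy(sequence, coords):
    """Energy of a 2-D HP-lattice conformation: -1 per topological H-H
    contact (adjacent on the lattice but not consecutive in the chain)."""
    assert len(sequence) == len(coords)
    pos = {tuple(c): i for i, c in enumerate(coords)}
    assert len(pos) == len(coords), "conformation must be self-avoiding"
    energy = 0
    for i, (x, y) in enumerate(coords):
        if sequence[i] != "H":
            continue
        # Only look right and up, so each contact is counted exactly once.
        for nx, ny in ((x + 1, y), (x, y + 1)):
            j = pos.get((nx, ny))
            if j is not None and sequence[j] == "H" and abs(i - j) > 1:
                energy -= 1
    return energy
```

A square fold of "HPPH", for instance, brings the two terminal H residues into contact for an energy of -1.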

  13. [A prediction model for internet game addiction in adolescents: using a decision tree analysis].

    Science.gov (United States)

    Kim, Ki Sook; Kim, Kyung Hee

    2010-06-01

    This study was designed to build a theoretical framework to provide practical help in preventing and managing adolescent internet game addiction by developing a prediction model through a comprehensive analysis of related factors. The participants were 1,318 students studying in elementary, middle, and high schools in Seoul and Gyeonggi Province, Korea. Collected data were analyzed using the SPSS program. Decision Tree Analysis using the Clementine program was applied to build an optimal and significant model for predicting internet game addiction from various factors, especially parent-related factors. From the data analyses, the prediction model for factors related to internet game addiction presented 5 pathways. Causative factors included gender, type of school, siblings, economic status, religion, time spent alone, gaming place, payment to Internet café, frequency, duration, parent's ability to use internet, occupation (mother), trust (father), expectations regarding adolescent's study (mother), supervising (both parents), rearing attitude (both parents). The results suggest preventive and managerial nursing programs for specific groups by path. Use of this predictive model can expand the role of school nurses, not only in counseling addicted adolescents but also in developing and carrying out programs with parents and approaching adolescents individually through databases and computer programming.
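
The study used the Clementine decision-tree implementation; purely as a generic illustration of how such trees choose a split (not that program's algorithm), a single Gini-impurity split on one numeric predictor can be found as follows:

```python
def gini(labels):
    """Gini impurity of a set of 0/1 labels (1 = addicted, say)."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(feature, labels):
    """Threshold on one numeric feature minimising weighted Gini impurity.

    Returns (weighted impurity, threshold); used greedily at each tree node.
    """
    order = sorted(zip(feature, labels))
    best = (float("inf"), None)
    for k in range(1, len(order)):
        left = [l for _, l in order[:k]]
        right = [l for _, l in order[k:]]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(order)
        thresh = (order[k - 1][0] + order[k][0]) / 2
        best = min(best, (score, thresh))
    return best
```

For a perfectly separable predictor such as daily gaming duration (hypothetical values), the impurity drops to zero at the separating threshold.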

  14. Do Work Characteristics Predict Health Deterioration Among Employees with Chronic Diseases?

    Science.gov (United States)

    de Wind, Astrid; Boot, Cécile R L; Sewdas, Ranu; Scharn, Micky; van den Heuvel, Swenne G; van der Beek, Allard J

    2017-06-29

    Purpose In our ageing workforce, the increasing numbers of employees with chronic diseases are encouraged to prolong their working lives. It is important to prevent health deterioration in this vulnerable group. This study aims to investigate whether work characteristics predict health deterioration over a 3-year period among employees with (1) chronic diseases, and, more specifically, (2) musculoskeletal and psychological disorders. Methods The study population consisted of 5600 employees aged 45-64 years with a chronic disease, who participated in the Dutch Study on Transitions in Employment, Ability and Motivation (STREAM). Information on work characteristics was derived from the baseline questionnaire. Health deterioration was defined as a decrease in general health (SF-12) between baseline and follow-up (1-3 years). Crude and adjusted logistic regression analyses were performed to investigate prediction of health deterioration by work characteristics. Subgroup analyses were performed for employees with musculoskeletal and psychological disorders. Results At follow-up, 19.2% of the employees reported health deterioration (N = 1075). Higher social support from colleagues or supervisors predicted health deterioration in the crude analyses in the total group, and in the groups with either musculoskeletal or psychological disorders (ORs 1.11-1.42). This effect was no longer found in the adjusted analyses. The other work characteristics did not predict health deterioration in any group. Conclusions This study did not support our hypothesis that work characteristics predict health deterioration among employees with chronic diseases. As our study population had succeeded in continuing employment to age 45 and beyond, it was probably a relatively healthy selection of employees.

  15. Prediction of Groundwater Arsenic Contamination using Geographic Information System and Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Md. Moqbul Hossain

    2013-01-01

    Full Text Available Ground water arsenic contamination is a well-known health and environmental problem in Bangladesh. Sources of this heavy metal are known to be geogenic; however, the processes of its release into groundwater are poorly understood phenomena. In quest of mitigation of the problem it is necessary to predict probable contamination before it causes any damage to human health. Hence our research has been carried out to find the factor relations of arsenic contamination and develop an arsenic contamination prediction model. Researchers have generally agreed that the elevated concentration of arsenic is affected by several factors such as soil reaction (pH), organic matter content, geology, iron content, etc. However, the variability of concentration within short lateral and vertical intervals, and the inter-relationships of variables among themselves, make the statistical analyses highly non-linear and difficult to converge with a meaningful relationship. Artificial Neural Networks (ANN) come in handy for such a black box type problem. This research uses Backpropagation Neural Networks (BPNN) to train and validate the data derived from Geographic Information System (GIS) spatial distribution grids. The neural network architecture with a 6-20-1 pattern was able to predict the arsenic concentration with reasonable accuracy.
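
As an illustration of the 6-20-1 backpropagation architecture mentioned (six inputs, twenty sigmoid hidden units, one output), a minimal NumPy sketch trained on synthetic data; the predictors, target function and hyperparameters are invented, not the study's GIS-derived grids:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for six predictors (e.g. pH, organic matter, iron)
# and a normalised arsenic-concentration target; illustrative only.
X = rng.normal(size=(200, 6))
y = np.tanh(X @ rng.normal(size=6)) * 0.5 + 0.5  # target in (0, 1)
baseline = float(np.mean((y - y.mean()) ** 2))   # variance of the target

# 6-20-1 network: sigmoid hidden layer, linear output, batch backprop.
W1 = rng.normal(0, 0.5, size=(6, 20)); b1 = np.zeros(20)
W2 = rng.normal(0, 0.5, size=(20, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for _ in range(2000):
    H = sigmoid(X @ W1 + b1)            # forward pass: hidden activations
    out = (H @ W2 + b2).ravel()         # linear output unit
    err = out - y
    gW2 = H.T @ err[:, None] / len(X)   # backprop: output-layer gradients
    gb2 = err.mean()
    dH = err[:, None] @ W2.T * H * (1 - H)  # hidden-layer error signal
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

H = sigmoid(X @ W1 + b1)
out = (H @ W2 + b2).ravel()
mse = float(np.mean((out - y) ** 2))    # should beat the constant predictor
```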

  16. Assessment of protein disorder region predictions in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan; Kryshtafovych, Andriy; Moult, John; Tramontano, Anna; Fidelis, Krzysztof

    2013-01-01

    The article presents the assessment of disorder region predictions submitted to CASP10. The evaluation is based on the three measures tested in previous CASPs: (i) balanced accuracy, (ii) the Matthews correlation coefficient for the binary predictions, and (iii) the area under the curve in the receiver operating characteristic (ROC) analysis of predictions using probability annotation. We also performed new analyses such as comparison of the submitted predictions with those obtained with a Naïve disorder prediction method and with predictions from the disorder prediction databases D2P2 and MobiDB. On average, the methods participating in CASP10 demonstrated slightly better performance than those in CASP9.
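
The first two of the three measures, balanced accuracy and the Matthews correlation coefficient, can be computed from a binary confusion matrix as below (labels: 1 = disordered). This is a generic sketch of the standard definitions, not CASP's evaluation code:

```python
import math

def confusion(y_true, y_pred):
    """Counts (tp, tn, fp, fn) for binary 0/1 labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

def balanced_accuracy(y_true, y_pred):
    """Mean of sensitivity and specificity."""
    tp, tn, fp, fn = confusion(y_true, y_pred)
    return 0.5 * (tp / (tp + fn) + tn / (tn + fp))

def mcc(y_true, y_pred):
    """Matthews correlation coefficient; 0 when any margin is empty."""
    tp, tn, fp, fn = confusion(y_true, y_pred)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0
```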

  17. Assessment of protein disorder region predictions in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan

    2013-11-22

    The article presents the assessment of disorder region predictions submitted to CASP10. The evaluation is based on the three measures tested in previous CASPs: (i) balanced accuracy, (ii) the Matthews correlation coefficient for the binary predictions, and (iii) the area under the curve in the receiver operating characteristic (ROC) analysis of predictions using probability annotation. We also performed new analyses such as comparison of the submitted predictions with those obtained with a Naïve disorder prediction method and with predictions from the disorder prediction databases D2P2 and MobiDB. On average, the methods participating in CASP10 demonstrated slightly better performance than those in CASP9.

  18. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    International Nuclear Information System (INIS)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E.; Tills, J.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  19. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E. [Sandia National Labs., Albuquerque, NM (United States); Tills, J. [J. Tills and Associates, Inc., Sandia Park, NM (United States)

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  20. Optimization of C20 isomers structure

    International Nuclear Information System (INIS)

    Ndjaka, J.M.B.; Charlier, J.C.

    2001-07-01

    We have performed geometry optimization of various possible planar and three-dimensional C20 geometries. The planar structures considered include a linear chain, a monoclinic ring and a bicyclic bow tie, while the three-dimensional geometries consisted of a bowl or corannulene structure and a fullerene cage. In agreement with the MP2 calculations of Wang et al, our results predict the corannulene bowl to be the lowest-energy structure. Listed in order of increasing energy from the ground-state geometry, the C20 structures considered are bowl, cage, bow tie, ring and chain. For the ring and bow tie isomers, the shape of the optimized structure deviates from that of the initial configuration, while the shapes of the optimized bowl, cage and chain remain unchanged. (author)

  1. Analyses of non-fatal accidents in an opencast mine by logistic regression model - a case study.

    Science.gov (United States)

    Onder, Seyhan; Mutlu, Mert

    2017-09-01

    Accidents cause major damage for both workers and enterprises in the mining industry. To reduce the number of occupational accidents, these incidents should be properly registered and carefully analysed. This study examines the Aegean Lignite Enterprise (ELI) of Turkish Coal Enterprises (TKI) in Soma between 2006 and 2011; opencast coal mine occupational accident records were used for the statistical analyses. A total of 231 occupational accidents were analysed for this study. The accident records were categorized into seven groups: area, reason, occupation, part of body, age, shift hour and lost days. The SPSS package program was used in this study for logistic regression analyses, which predicted the probability of accidents resulting in more or fewer than 3 lost workdays for non-fatal injuries. Social facilities-area of surface installations, workshops and opencast mining areas are the areas with the highest probability for accidents with greater than 3 lost workdays for non-fatal injuries, while the reasons with the highest probability for these types of accidents are transporting and manual handling. Additionally, the model was tested on accidents reported in 2012 for the ELI in Soma and correctly estimated the probability of accidents with lost workdays in 70% of cases.
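
As a hedged illustration of the kind of model described (binary outcome: more than 3 lost workdays), a minimal logistic regression fitted by gradient ascent on synthetic records; the category coding, effect sizes and sample size below are invented, not ELI data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for the accident records: two illustrative categorical
# predictors (accident area and reason, 7 levels each) and a binary outcome,
# 1 = more than 3 lost workdays.
n = 300
area = rng.integers(0, 7, n)
reason = rng.integers(0, 7, n)
logit = -1.0 + 0.4 * (area == 2) + 0.6 * (reason == 5) + rng.normal(0, 0.1, n)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Design matrix: intercept plus two dummy indicators
X = np.column_stack([np.ones(n), area == 2, reason == 5]).astype(float)

# Gradient ascent on the log-likelihood of the logistic model
beta = np.zeros(3)
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.05 * X.T @ (y - p) / n

pred = (1 / (1 + np.exp(-X @ beta)) > 0.5).astype(float)
accuracy = float((pred == y).mean())
```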

  2. VOC composition of current motor vehicle fuels and vapors, and collinearity analyses for receptor modeling.

    Science.gov (United States)

    Chin, Jo-Yu; Batterman, Stuart A

    2012-03-01

The formulation of motor vehicle fuels can alter the magnitude and composition of evaporative and exhaust emissions occurring throughout the fuel cycle. Information regarding the volatile organic compound (VOC) composition of motor fuels other than gasoline is scarce, especially for bioethanol and biodiesel blends. This study examines the liquid and vapor (headspace) composition of four contemporary and commercially available fuels: gasoline, E85, ultra-low sulfur diesel (ULSD), and B20 (20% soy-biodiesel and 80% ULSD). The composition of gasoline and E85 in both neat fuel and headspace vapor was dominated by aromatics and n-heptane. Despite its low gasoline content, E85 vapor contained higher concentrations of several VOCs than gasoline vapor, likely due to adjustments in its formulation. Temperature changes produced greater changes in the partial pressures of 17 VOCs in E85 than in gasoline, and large shifts in VOC composition. B20 and ULSD were dominated by C9 to C16 n-alkanes and low levels of aromatics, and the two fuels had similar headspace vapor compositions and concentrations. While the headspace composition predicted using vapor-liquid equilibrium theory was closely correlated with measurements, E85 vapor concentrations were underpredicted. Based on variance decomposition analyses, the VOC compositions of gasoline and diesel fuels and their vapors were distinct, but B20 and ULSD fuels and vapors were highly collinear. These results can be used to estimate fuel-related emissions and exposures, particularly in receptor models that apportion emission sources, and the collinearity analysis suggests that gasoline- and diesel-related emissions can be distinguished. Copyright © 2011 Elsevier Ltd. All rights reserved.
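The vapor-liquid equilibrium prediction mentioned above can be illustrated with an ideal-solution (Raoult's law) sketch: the partial pressure of component i in the headspace is p_i = x_i * Psat_i. The component list and vapor-pressure values below are rough illustrative numbers, not measured properties of any real fuel.

```python
# Ideal-solution (Raoult's law) estimate of headspace vapor composition.
# Mole fractions and pure-component vapor pressures (~25 °C) are assumed
# for illustration only.
fuel_liquid = {          # mole fraction in the liquid fuel (assumed)
    "ethanol": 0.80,
    "toluene": 0.10,
    "n-heptane": 0.10,
}
psat_kpa = {             # approximate pure-component vapor pressures, kPa
    "ethanol": 7.9,
    "toluene": 3.8,
    "n-heptane": 6.1,
}

partial = {c: x * psat_kpa[c] for c, x in fuel_liquid.items()}   # p_i = x_i * Psat_i
total = sum(partial.values())
vapor_fraction = {c: p / total for c, p in partial.items()}      # headspace mole fractions
for c, yfrac in vapor_fraction.items():
    print(f"{c}: {yfrac:.2f}")
```

Real fuel headspace deviates from this ideal picture (the study found E85 vapor underpredicted), but the calculation shows why headspace composition tracks liquid composition weighted by volatility.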

  3. Statistical and extra-statistical considerations in differential item functioning analyses

    Directory of Open Access Journals (Sweden)

    G. K. Huysamen

    2004-10-01

This article briefly describes the main procedures for performing differential item functioning (DIF) analyses and points out some of the statistical and extra-statistical implications of these methods. Research findings on the sources of DIF, including those associated with translated tests, are reviewed. As DIF analyses are oblivious of correlations between a test and relevant criteria, the elimination of differentially functioning items does not necessarily improve predictive validity or reduce any predictive bias. The implications of the results of past DIF research for test development in the multilingual and multicultural South African society are considered.

  4. CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability: SSD Plot Diagrams

    Science.gov (United States)

Covers methods for controlling natural variability and predicting environmental conditions from biological observations: biological trait data, species sensitivity distributions, propensity scores, and references for the Advanced Analyses section of Data Analysis.

  5. Reaction of barium atoms with N2O molecular aggregates

    International Nuclear Information System (INIS)

    Visticot, J.P.; Mestdagh, J.M.; Alcaraz, C.; Cuvellier, J.; Berlande, J.

    1988-06-01

The collisions between a barium atom and N2O molecular aggregates are studied for a better understanding of the role of solvation in a chemical reaction. The experiments are carried out in a crossed molecular beam device. The light coming from the collision zone is dispersed and analysed by means of a photon detector. A time-of-flight technique is applied to investigate the polymer concentration of the beam. The results show a nearly negligible chemiluminescent effect in the reaction between barium and N2O polymers. A solvated BaO formation mechanism is proposed to explain the experimental results. [fr]

  6. Asian Summer Monsoon Rainfall associated with ENSO and its Predictability

    Science.gov (United States)

    Shin, C. S.; Huang, B.; Zhu, J.; Marx, L.; Kinter, J. L.; Shukla, J.

    2015-12-01

The leading modes of the Asian summer monsoon (ASM) rainfall variability and their seasonal predictability are investigated using the CFSv2 hindcasts initialized from multiple ocean analyses over the period 1979-2008 and observation-based analyses. It is shown that the two leading empirical orthogonal function (EOF) modes of the observed ASM rainfall anomalies, which together account for about 34% of total variance, largely correspond to the ASM responses to the ENSO influences during the summers of the developing and decaying years of a Pacific anomalous event, respectively. These two ASM modes are therefore designated as the contemporary and delayed ENSO responses, respectively. It is demonstrated that the CFSv2 is capable of predicting these two dominant ASM modes up to a lead of 5 months. More importantly, the predictability of the ASM rainfall is much higher with respect to the delayed ENSO mode than the contemporary one, with the predicted principal component time series of the former maintaining high correlation skill and small ensemble spread at all lead months, whereas the latter shows significant degradation in both measures with lead time. A composite analysis of the ASM rainfall anomalies of all warm ENSO events in this period substantiates the finding that the ASM is more predictable following an ENSO event. The enhanced predictability mainly comes from the evolution of the warm SST anomalies over the Indian Ocean in the spring of the ENSO maturing phase and the persistence of the anomalous high sea surface pressure over the western Pacific in the subsequent summer, which the hindcasts are able to capture reasonably well. The results also show that the ensemble initialization with multiple ocean analyses improves the CFSv2's prediction skill for both ENSO and ASM rainfall. In fact, the skills of the ensemble-mean hindcasts initialized from the four different ocean analyses are always equivalent to the best ones initialized from any individual ocean analysis.
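The EOF decomposition underlying the two leading ASM modes can be sketched with a singular value decomposition of an anomaly matrix. The data below are synthetic, with two planted spatial patterns standing in for the observed rainfall modes.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic rainfall-anomaly field: 30 summers x 500 grid points, built from
# two planted spatial patterns so the leading EOFs are recoverable.
t, s = 30, 500
patterns = rng.normal(size=(2, s))
amplitudes = rng.normal(size=(t, 2))
field = amplitudes @ patterns + 0.5 * rng.normal(size=(t, s))

anom = field - field.mean(axis=0)            # remove the time-mean (climatology)
u, svals, vt = np.linalg.svd(anom, full_matrices=False)
var_explained = svals**2 / np.sum(svals**2)  # fraction of variance per mode
pcs = u * svals                              # principal-component time series
eofs = vt                                    # spatial EOF patterns
print(f"leading two EOFs explain {var_explained[:2].sum():.0%} of variance")
```

The rows of `eofs` are the spatial modes and the columns of `pcs` their time series; predictability of a mode can then be assessed by correlating predicted and observed PC time series, as the study does.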

  7. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Part 1 and 2

    International Nuclear Information System (INIS)

    Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). 
By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate comments.

  8. Secondary recrystallisation in 20 w/o Cr-25 w/o Ni-Nb stabilised stainless steel

    International Nuclear Information System (INIS)

    Healey, T.; Brown, A.F.; Speight, M.V.

    1976-11-01

The fuel cladding material for the Commercial Advanced Gas Reactor is a fine-grain 20 w/o Cr-25 w/o Ni niobium-stabilised stainless steel. The grain structure stability of this alloy has been investigated as a function of carbon content over the temperature range 930-990 °C. It is demonstrated that the primary grain structure is susceptible to abnormal growth due to secondary recrystallisation of the initial fine grain structure after a composition- and temperature-dependent incubation period. The magnitude of the incubation period is analysed on the basis that secondary recrystallisation commences when randomly dispersed niobium carbide particles have coarsened to a critical size. The validity of the analysis is tested by comparing the predictions with experimental observation. The model is subsequently used to evaluate the incubation period for conditions of temperature, composition and microstructure which differ from those defined in the experimental studies. (author)

  9. Predicting Pilot Retention

    Science.gov (United States)

    2012-06-15

Dale W. Stanley III

…over the last 20 years. Airbus predicted that these trends would continue as emerging economies, especially in Asia, were creating a fast growing… US economy, pay differential and hiring by the major airlines contributed most to the decision to separate from the Air Force (Fullerton, 2003: 354).

  10. Prediction and repeatability of milk coagulation properties and curd-firming modeling parameters of ovine milk using Fourier-transform infrared spectroscopy and Bayesian models.

    Science.gov (United States)

    Ferragina, A; Cipolat-Gotet, C; Cecchinato, A; Pazzola, M; Dettori, M L; Vacca, G M; Bittante, G

    2017-05-01

The aim of this study was to apply Bayesian models to the Fourier-transform infrared spectra of individual sheep milk samples to derive calibration equations to predict traditional and modeled milk coagulation properties (MCP), and to assess the repeatability of MCP measures and their predictions. Data consisted of 1,002 individual milk samples collected from Sarda ewes reared in 22 farms in the region of Sardinia (Italy), for which MCP and modeled curd-firming parameters were available. Two milk samples were taken from 87 ewes and analyzed with the aim of estimating repeatability, whereas a single sample was taken from the other 915 ewes; therefore, a total of 1,089 analyses were performed. For each sample, 2 spectra in the infrared region from 5,011 to 925 cm-1 were available and were averaged before data analysis. BayesB models were used to calibrate equations for each of the traits. Prediction accuracy was estimated for each trait and model using 20 replicates of a training-testing validation procedure. The repeatability of MCP measures and their predictions were also compared. The correlations between measured and predicted traits in the external validation were always higher than 0.5 (0.88 for rennet coagulation time). We confirmed that the most important factor determining prediction accuracy is the repeatability of the gold-standard analyses used for building the calibration equations. Repeatability measures of the predicted traits were generally high (≥95%), even for those traits with moderate analytical repeatability. Our results show that Bayesian models applied to Fourier-transform infrared spectra are powerful tools for cheap and rapid prediction of important traits in ovine milk and, compared with other methods, could help in the interpretation of results. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
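The 20-replicate training-testing validation used above can be sketched as follows. Ridge regression stands in for the paper's BayesB model, and the "spectra" and trait values are synthetic.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
# Synthetic stand-in for FTIR spectra (200 samples x 50 wavenumbers) and a
# coagulation trait driven by a few informative bands.
X = rng.normal(size=(200, 50))
beta = rng.normal(size=50) * (rng.uniform(size=50) < 0.2)
y = X @ beta + rng.normal(scale=0.5, size=200)

cors = []
for rep in range(20):                      # 20 training-testing replicates
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=rep)
    pred = Ridge(alpha=1.0).fit(Xtr, ytr).predict(Xte)
    cors.append(np.corrcoef(yte, pred)[0, 1])   # measured-vs-predicted correlation
print(f"mean validation correlation: {np.mean(cors):.2f}")
```

Averaging the measured-predicted correlation over replicates, as the paper does, gives a more stable accuracy estimate than a single random split.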

  11. Complete genome of a European hepatitis C virus subtype 1g isolate: phylogenetic and genetic analyses

    Directory of Open Access Journals (Sweden)

    Bargalló Ana

    2008-06-01

Background: Hepatitis C virus isolates have been classified into six main genotypes and a variable number of subtypes within each genotype, mainly based on phylogenetic analysis. Analyses of the genetic relationships among genotypes and subtypes are more reliable when complete genome sequences (or at least the full coding region) are used; however, so far only 31 of 80 confirmed or proposed subtypes have at least one complete genome available. Of these, 20 correspond to confirmed subtypes of epidemic interest. Results: We present and analyse the first complete genome sequence of an HCV subtype 1g isolate. Phylogenetic and genetic distance analyses reveal that HCV-1g is the most divergent subtype among the confirmed HCV-1 subtypes. Potential genomic recombination events between genotypes or subtype 1 genomes were ruled out. We demonstrate the phylogenetic congruence of previously deposited partial sequences of HCV-1g with respect to our sequence. Conclusion: In light of this, we propose changing the current status of its subtype-specific designation from provisional to confirmed.

  12. Prediction of resource volumes at untested locations using simple local prediction models

    Science.gov (United States)

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2006-01-01

This paper shows how local spatial nonparametric prediction models can be applied to estimate volumes of recoverable gas resources at individual undrilled sites, at multiple sites on a regional scale, and to compute confidence bounds for regional volumes based on the distribution of those estimates. An approach that combines cross-validation, the jackknife, and bootstrap procedures is used to accomplish this task. Simulation experiments show that cross-validation can be applied beneficially to select an appropriate prediction model. The cross-validation procedure worked well for a wide range of different states of nature and levels of information. Jackknife procedures are used to compute individual prediction estimation errors at undrilled locations. The jackknife replicates also are used with a bootstrap resampling procedure to compute confidence bounds for the total volume. The method was applied to data (partitioned into a training set and target set) from the Devonian Antrim Shale continuous-type gas play in the Michigan Basin in Otsego County, Michigan. The analysis showed that the model estimate of total recoverable volumes at prediction sites is within 4 percent of the total observed volume. The model predictions also provide frequency distributions of the cell volumes at the production unit scale. Such distributions are the basis for subsequent economic analyses. © Springer Science+Business Media, LLC 2007.
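The bootstrap step for confidence bounds on the total volume can be sketched as below. The per-site volume estimates are synthetic, and the jackknife machinery that would produce site-level prediction errors is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic per-site volume estimates at 100 undrilled locations (assumed
# to come from a spatial prediction model, as in the paper).
site_estimates = rng.lognormal(mean=1.0, sigma=0.6, size=100)

# Bootstrap the total volume: resample sites with replacement and sum.
totals = np.array([
    rng.choice(site_estimates, size=site_estimates.size, replace=True).sum()
    for _ in range(2000)
])
lo, hi = np.percentile(totals, [2.5, 97.5])
print(f"total: {site_estimates.sum():.0f}, 95% bounds: [{lo:.0f}, {hi:.0f}]")
```

The percentile interval of the resampled totals serves as the regional confidence bound; with real data the resampling would be applied to the jackknife replicates rather than raw estimates.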

  13. Prediction and analysis of near-road concentrations using a reduced-form emission/dispersion model

    Directory of Open Access Journals (Sweden)

    Kononowech Robert

    2010-06-01

Background: Near-road exposures to traffic-related air pollutants have been receiving increased attention due to evidence linking emissions from high-traffic roadways to adverse health outcomes. To date, most epidemiological and risk analyses have utilized simple but crude exposure indicators, most typically proximity measures, such as the distance between freeways and residences, to represent air quality impacts from traffic. This paper derives and analyzes a simplified microscale simulation model designed to predict short-term (hourly) to long-term (annual average) pollutant concentrations near roads. Sensitivity analyses and case studies are used to highlight issues in predicting near-road exposures. Methods: Process-based simulation models using a computationally efficient reduced-form response-surface structure and a minimum number of inputs integrate the major determinants of air pollution exposures: traffic volume and vehicle emissions, meteorology, and receptor location. We identify the most influential variables and then derive a set of multiplicative submodels that match predictions from the "parent" models MOBILE6.2 and CALINE4. The assembled model is applied to two case studies in the Detroit, Michigan area. The first predicts carbon monoxide (CO) concentrations at a monitoring site near a freeway. The second predicts CO and PM2.5 concentrations in a dense receptor grid over a 1 km2 area around the intersection of two major roads. We analyze the spatial and temporal patterns of the pollutant concentration predictions. Results: Predicted CO concentrations showed reasonable agreement with annual average and 24-hour measurements, e.g., 59% of the 24-hr predictions were within a factor of two of observations in the warmer months, when CO emissions are more consistent. The highest concentrations of both CO and PM2.5 were predicted to occur near intersections and downwind of major roads during periods of unfavorable meteorology (e.g., low wind speeds).
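A reduced-form multiplicative model of the kind described can be sketched as below. The functional forms, coefficients, and decay scale are hypothetical illustrations, not the paper's fitted submodels matched to MOBILE6.2 and CALINE4.

```python
import math

def near_road_concentration(traffic_vph, emission_gpkm, wind_ms, distance_m):
    """Illustrative reduced-form multiplicative model (hypothetical terms):
    concentration scales with traffic volume and emission factor, inversely
    with wind speed, and decays with distance from the road edge."""
    q = traffic_vph * emission_gpkm / 3.6e6        # line-source strength, g/m/s
    meteorology = 1.0 / max(wind_ms, 0.5)          # unfavorable at low wind
    decay = math.exp(-distance_m / 150.0)          # assumed 150 m decay scale
    return 1e6 * q * meteorology * decay           # arbitrary units

# Concentration falls off with receptor distance, all else equal.
for d in (10, 50, 200):
    print(d, round(near_road_concentration(2000, 1.0, 2.0, d), 1))
```

Chaining independent multiplicative terms is what makes such a model cheap enough to run over a dense receptor grid and many hours of meteorology.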

  14. Computational prediction of protein-protein interactions in Leishmania predicted proteomes.

    Directory of Open Access Journals (Sweden)

    Antonio M Rezende

The trypanosomatid parasites Leishmania braziliensis, Leishmania major and Leishmania infantum are important human pathogens. Despite years of study and genome availability, no effective vaccine has been developed, and the available chemotherapy is highly toxic. It is therefore clear that only interdisciplinary, integrated studies are likely to succeed in the search for new targets for vaccine and drug development. An essential part of this rationale is the study of protein-protein interaction (PPI) networks, which can provide a better understanding of complex protein interactions in biological systems. We therefore modeled PPIs for these trypanosomatids through computational methods, using sequence comparison against public databases of protein or domain interactions for interaction prediction (interolog mapping), and developed a dedicated combined system score to address the robustness of the predictions. The confidence of the network prediction approach was evaluated using gold-standard positive and negative datasets, and the AUC value obtained was 0.94. As a result, 39,420, 43,531 and 45,235 interactions were predicted for L. braziliensis, L. major and L. infantum, respectively. For each predicted network, the top 20 proteins were ranked by the MCC topological index. In addition, information related to immunological potential, the degree of protein sequence conservation among orthologs, and the degree of identity compared to proteins of potential parasite hosts was integrated. This integration improves the understanding and usefulness of the predicted networks, which can be valuable for selecting new potential biological targets for drug and vaccine development. Network modularity analysis, which is key when one is interested in destabilizing PPIs for drug or vaccine purposes, along with multiple alignments of the predicted PPIs, was performed, revealing patterns associated with protein turnover. In addition, around 50% of the hypothetical proteins present in the networks
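The interolog-mapping idea can be sketched in a few lines: if proteins A' and B' interact in a reference organism, and A and B are their orthologs in the target species, predict that A and B interact. All identifiers below are hypothetical placeholders.

```python
# Minimal interolog-mapping sketch (hypothetical identifiers): known
# interactions in a reference organism are transferred to a target species
# through an ortholog table.
known_interactions = {("hsA", "hsB"), ("hsC", "hsD")}     # reference PPIs
orthologs = {"hsA": "lmA", "hsB": "lmB", "hsC": "lmC"}    # reference -> target

predicted = {
    (orthologs[a], orthologs[b])
    for a, b in known_interactions
    if a in orthologs and b in orthologs      # both partners need an ortholog
}
print(sorted(predicted))
```

Only pairs whose partners both have orthologs transfer; the paper additionally weights each transferred pair with a combined confidence score, which this sketch omits.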

  15. ATOP - The Advanced Taiwan Ocean Prediction System Based on the mpiPOM. Part 1: Model Descriptions, Analyses and Results

    Directory of Open Access Journals (Sweden)

    Leo Oey

    2013-01-01

A data-assimilated Taiwan Ocean Prediction (ATOP) system is being developed at the National Central University, Taiwan. The model simulates sea-surface height, three-dimensional currents, temperature and salinity, and turbulent mixing. The model has options for tracer and particle-tracking algorithms, as well as for wave-induced Stokes drift and wave-enhanced mixing and bottom drag. Two different forecast domains have been tested: a large-grid domain that encompasses the entire North Pacific Ocean at 0.1° × 0.1° horizontal resolution and 41 vertical sigma levels, and a smaller western North Pacific domain which at present has the same horizontal resolution. In both domains, 25-year spin-up runs from 1988 - 2011 were first conducted, forced by six-hourly Cross-Calibrated Multi-Platform (CCMP) and NCEP reanalysis Global Forecast System (GFS) winds. The results are then used as initial conditions to conduct ocean analyses from January 2012 through February 2012, when updated hindcasts and real-time forecasts begin using the GFS winds. This paper describes the ATOP system and compares the forecast results against satellite altimetry data for assessing model skill. The model results are also shown to compare well with observations of (i) the Kuroshio intrusion in the northern South China Sea, and (ii) the subtropical counter current. A review of, and comparison with, other models in the literature on (i) is also given.

  16. A genomic biomarker signature can predict skin sensitizers using a cell-based in vitro alternative to animal tests

    Directory of Open Access Journals (Sweden)

    Albrekt Ann-Sofie

    2011-08-01

Background Allergic contact dermatitis is an inflammatory skin disease that affects a significant proportion of the population. This disease is caused by an adverse immune response towards chemical haptens, and leads to a substantial economic burden for society. Current tests of sensitizing chemicals rely on animal experimentation. New legislation on the registration and use of chemicals within the pharmaceutical and cosmetic industries has stimulated significant research efforts to develop alternative, human cell-based assays for the prediction of sensitization. The aim is to replace animal experiments with in vitro tests displaying a higher predictive power. Results We have developed a novel cell-based assay for the prediction of sensitizing chemicals. By analyzing the transcriptome of the human cell line MUTZ-3 after 24 h of stimulation, using 20 different sensitizing chemicals, 20 non-sensitizing chemicals and vehicle controls, we have identified a biomarker signature of 200 genes with potent discriminatory ability. Using a Support Vector Machine for supervised classification, the prediction performance of the assay revealed an area under the ROC curve of 0.98. In addition, categorizing the chemicals according to the LLNA assay, this gene signature could also predict sensitizing potency. The identified markers are involved in biological pathways with immunologically relevant functions, which can shed light on the process of human sensitization. Conclusions A gene signature predicting sensitization, using a human cell line in vitro, has been identified. This simple and robust cell-based assay has the potential to completely replace or drastically reduce the utilization of test systems based on experimental animals. Being based on human biology, the assay is proposed to be more accurate for predicting sensitization in humans than the traditional animal-based tests.
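The supervised-classification step (SVM on a gene signature, evaluated by ROC AUC) can be sketched on synthetic expression data; the sample counts and effect sizes below are assumptions for illustration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
# Synthetic stand-in: 80 chemical stimulations x 200 signature genes, with a
# mean shift in the first 30 genes for sensitizers.
X = rng.normal(size=(80, 200))
y = np.repeat([0, 1], 40)                  # 0 = non-sensitizer, 1 = sensitizer
X[y == 1, :30] += 1.5

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3,
                                      random_state=0, stratify=y)
clf = SVC(kernel="linear", probability=True, random_state=0).fit(Xtr, ytr)
auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])
print(f"ROC AUC: {auc:.2f}")
```

Ranking held-out samples by the SVM's class-1 probability and computing the area under the ROC curve mirrors how the 0.98 AUC in the study would be obtained.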

  17. A genomic biomarker signature can predict skin sensitizers using a cell-based in vitro alternative to animal tests

    Science.gov (United States)

    2011-01-01

Background Allergic contact dermatitis is an inflammatory skin disease that affects a significant proportion of the population. This disease is caused by an adverse immune response towards chemical haptens, and leads to a substantial economic burden for society. Current tests of sensitizing chemicals rely on animal experimentation. New legislation on the registration and use of chemicals within the pharmaceutical and cosmetic industries has stimulated significant research efforts to develop alternative, human cell-based assays for the prediction of sensitization. The aim is to replace animal experiments with in vitro tests displaying a higher predictive power. Results We have developed a novel cell-based assay for the prediction of sensitizing chemicals. By analyzing the transcriptome of the human cell line MUTZ-3 after 24 h of stimulation, using 20 different sensitizing chemicals, 20 non-sensitizing chemicals and vehicle controls, we have identified a biomarker signature of 200 genes with potent discriminatory ability. Using a Support Vector Machine for supervised classification, the prediction performance of the assay revealed an area under the ROC curve of 0.98. In addition, categorizing the chemicals according to the LLNA assay, this gene signature could also predict sensitizing potency. The identified markers are involved in biological pathways with immunologically relevant functions, which can shed light on the process of human sensitization. Conclusions A gene signature predicting sensitization, using a human cell line in vitro, has been identified. This simple and robust cell-based assay has the potential to completely replace or drastically reduce the utilization of test systems based on experimental animals. Being based on human biology, the assay is proposed to be more accurate for predicting sensitization in humans than the traditional animal-based tests. PMID:21824406

  18. Basis of predictive mycology.

    Science.gov (United States)

    Dantigny, Philippe; Guilmart, Audrey; Bensoussan, Maurice

    2005-04-15

For over 20 years, predictive microbiology has focused on food-pathogenic bacteria. Few studies have concerned modelling fungal development. On one hand, most food mycologists are not familiar with modelling techniques; on the other hand, people involved in modelling are developing tools dedicated to bacteria. Therefore, there is a tendency to extend the use of models that were developed for bacteria to moulds. However, some mould specificities should be taken into account. The use of specific models for predicting the germination and growth of fungi was advocated previously. This paper provides a short review of fungal modelling studies.

  19. Prediction of future labour market outcome in a cohort of long-term sick-listed Danes

    DEFF Research Database (Denmark)

    Pedersen, Jacob; Gerds, Thomas Alexander; Bjørner, Jakob

    2014-01-01

BACKGROUND: Targeted interventions for the long-term sick-listed may prevent permanent exclusion from the labour force. We aimed to develop a prediction method for identifying high-risk groups for continued or recurrent long-term sickness absence, unemployment, or disability among persons on long-term sick leave. … In a large data set, statistical prediction methods were built using logistic regression and a discrete event simulation approach for a one-year prediction horizon. Personalized risk profiles were obtained for five outcomes: employment, unemployment, recurrent sickness absence, continuous long-term sickness absence … of recession (2008-2010). The accuracy of the prediction models was assessed with analyses of Receiver Operating Characteristic (ROC) curves and the Brier score in an independent validation data set. RESULTS: In comparison with a null model which ignored the predictor variables, logistic regression achieved …
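The ROC- and Brier-score validation described above can be sketched as follows, using synthetic predicted one-year risks and a null model that ignores the predictors.

```python
import numpy as np
from sklearn.metrics import brier_score_loss, roc_auc_score

rng = np.random.default_rng(5)
# Hypothetical validation set: predicted one-year risks of an outcome
# (e.g. recurrent long-term sickness absence) vs. observed outcomes.
risk = rng.uniform(size=500)
outcome = (rng.uniform(size=500) < risk).astype(int)   # calibrated by construction

null_risk = np.full(500, outcome.mean())               # null model: base rate only
print(f"Brier (model): {brier_score_loss(outcome, risk):.3f}")
print(f"Brier (null):  {brier_score_loss(outcome, null_risk):.3f}")
print(f"AUC: {roc_auc_score(outcome, risk):.2f}")
```

A lower Brier score than the null model and an AUC above 0.5 are the two criteria by which such a prediction model demonstrates added value over ignoring the predictors.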

  20. Time-series prediction of shellfish farm closure: A comparison of alternatives

    Directory of Open Access Journals (Sweden)

    Ashfaqur Rahman

    2014-08-01

Shellfish farms are closed for harvest when microbial pollutants are present. Such pollutants are typically present in rainfall runoff from various land uses in catchments. Experts currently use a number of observable parameters (river flow, rainfall, salinity) as proxies to determine when to close farms. We have proposed using short-term historical rainfall data as a time-series prediction problem, where we aim to predict the closure of shellfish farms based only on rainfall. Time-series event prediction consists of two steps: (i) feature extraction and (ii) prediction. A number of data mining challenges exist for these scenarios: (i) which feature extraction method best captures the rainfall pattern over successive days that leads to opening or closure of the farms? (ii) the farm closure events occur infrequently, and this leads to a class imbalance problem; what is the best way to deal with this problem? In this paper we have analysed and compared different combinations of balancing methods (under-sampling and over-sampling), feature extraction methods (cluster profile, curve fitting, Fourier transform, piecewise aggregate approximation, and wavelet transform), and learning algorithms (neural network, support vector machine, k-nearest neighbour, decision tree, and Bayesian network) to predict closure events accurately, considering the above data mining challenges. We have identified the best combination of techniques to accurately predict shellfish farm closure from rainfall, given the above data mining challenges.
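One combination from the grid above, random over-sampling of the rare closure class with a k-nearest-neighbour classifier on fixed-length rainfall windows, can be sketched on synthetic data; the window length, imbalance ratio, and closure rule are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
# Synthetic rainfall windows (7 preceding days per event); closures are
# rare (~10%) and follow heavy multi-day rain in this toy construction.
n = 400
windows = rng.gamma(shape=1.0, scale=5.0, size=(n, 7))
closed = (windows.sum(axis=1) > np.percentile(windows.sum(axis=1), 90)).astype(int)

Xtr, Xte, ytr, yte = train_test_split(windows, closed, test_size=0.3,
                                      random_state=0, stratify=closed)
# Naive random over-sampling of the minority (closure) class.
minority = np.flatnonzero(ytr == 1)
boost = rng.choice(minority, size=3 * minority.size, replace=True)
Xbal = np.vstack([Xtr, Xtr[boost]])
ybal = np.concatenate([ytr, ytr[boost]])

clf = KNeighborsClassifier(n_neighbors=5).fit(Xbal, ybal)
recall = (clf.predict(Xte)[yte == 1] == 1).mean()
print(f"closure recall: {recall:.2f}")
```

With imbalanced events, recall on the closure class (rather than overall accuracy) is the metric that matters, which is why the balancing step precedes training.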

  1. The prediction of late-onset preeclampsia: Results from a longitudinal proteomics study.

    Science.gov (United States)

    Erez, Offer; Romero, Roberto; Maymon, Eli; Chaemsaithong, Piya; Done, Bogdan; Pacora, Percy; Panaitescu, Bogdan; Chaiworapongsa, Tinnakorn; Hassan, Sonia S; Tarca, Adi L

    2017-01-01

Late-onset preeclampsia is the most prevalent phenotype of this syndrome; nevertheless, only a few biomarkers for its early diagnosis have been reported. We sought to correct this deficiency using a high-throughput proteomic platform. A case-control longitudinal study was conducted, including 90 patients with normal pregnancies and 76 patients with late-onset preeclampsia (diagnosed at ≥34 weeks of gestation). Maternal plasma samples were collected throughout gestation (normal pregnancy: 2-6 samples per patient, median of 2; late-onset preeclampsia: 2-6, median of 5). The abundance of 1,125 proteins was measured using an aptamer-based proteomics technique. Protein abundance in normal pregnancies was modeled using linear mixed-effects models to estimate mean abundance as a function of gestational age. Data were then expressed as multiples-of-the-mean (MoM) values relative to normal pregnancies. Multi-marker prediction models were built using data from one of five gestational age intervals (8-16, 16.1-22, 22.1-28, 28.1-32, 32.1-36 weeks of gestation). The predictive performance of the best combination of proteins was compared to placental growth factor (PlGF) using bootstrap. 1) At 8-16 weeks of gestation, the best prediction model included only one protein, matrix metalloproteinase 7 (MMP-7), which had a sensitivity of 69% at a false positive rate (FPR) of 20% (AUC = 0.76); 2) at 16.1-22 weeks of gestation, MMP-7 was the single best predictor of late-onset preeclampsia, with a sensitivity of 70% at a FPR of 20% (AUC = 0.82); 3) after 22 weeks of gestation, PlGF was the best predictor of late-onset preeclampsia, identifying 1/3 to 1/2 of the patients destined to develop this syndrome (FPR = 20%); 4) 36 proteins were associated with late-onset preeclampsia in at least one interval of gestation (after adjustment for covariates); 5) several biological processes, such as positive regulation of vascular endothelial growth factor receptor signaling pathway, were perturbed; and 6
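The multiples-of-the-mean (MoM) transformation can be sketched as follows. A simple polynomial fit stands in for the paper's linear mixed-effects model of mean abundance versus gestational age, and all values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic control data: protein abundance rising with gestational age.
ga = rng.uniform(8, 36, size=200)                    # gestational age, weeks
abundance = 2.0 + 0.1 * ga + rng.normal(scale=0.3, size=200)

# Stand-in for the mixed-effects model: a polynomial fit of mean abundance
# as a function of gestational age in normal pregnancies.
coef = np.polyfit(ga, abundance, deg=2)
expected = np.polyval(coef, 30.0)                    # normal-pregnancy mean at 30 weeks

observed = 6.1                                       # hypothetical case sample
mom = observed / expected                            # multiple-of-the-mean
print(f"MoM at 30 weeks: {mom:.2f}")
```

Dividing each observed abundance by the gestational-age-matched normal mean removes the normal gestational trend, so MoM values from different sampling weeks become comparable inputs to a prediction model.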

  2. A Bayesian antedependence model for whole genome prediction.

    Science.gov (United States)

    Yang, Wenzhao; Tempelman, Robert J

    2012-04-01

    Hierarchical mixed effects models have been demonstrated to be powerful for predicting the genomic merit of livestock and plants, on the basis of high-density single-nucleotide polymorphism (SNP) marker panels, and their use is being increasingly advocated for genomic predictions in human health. Two particularly popular approaches, labeled BayesA and BayesB, are based on specifying all SNP-associated effects to be independent of each other. BayesB extends BayesA by allowing a large proportion of SNP markers to be associated with null effects. We further extend these two models to specify SNP effects as being spatially correlated due to the chromosomally proximal effects of causal variants. These two models, which we dub ante-BayesA and ante-BayesB respectively, are based on a first-order nonstationary antedependence specification between SNP effects. In a simulation study involving 20 replicate data sets, each analyzed at six different SNP marker densities with average LD levels ranging from r(2) = 0.15 to 0.31, the antedependence methods had significantly (P < 0.05) higher accuracies than their classical counterparts at the higher LD levels (r(2) ≥ 0.24), with differences exceeding 3%. A cross-validation study was also conducted on the heterogeneous stock mice data resource (http://mus.well.ox.ac.uk/mouse/HS/) using 6-week body weights as the phenotype. The antedependence methods increased cross-validation prediction accuracies by up to 3.6% compared to their classical counterparts (P < 0.05). Analyses of benchmark data sets likewise demonstrated that the antedependence methods were more accurate than their classical counterparts for genomic predictions, even for individuals several generations beyond the training data.
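The first-order antedependence idea can be made concrete: each SNP effect is an innovation plus a coefficient times the effect of the previous (chromosomally adjacent) SNP, so effects are correlated along the genome. The sketch below simulates such effects with a constant coefficient and innovation variance (the paper's specification lets both vary by position, i.e. it is nonstationary); all numbers are illustrative.

```python
# Sketch: simulate first-order antedependent SNP effects,
# g[0] = e[0];  g[j] = t * g[j-1] + e[j]
# with t and the innovation SD held constant here for simplicity.
import random

def simulate_ante1_effects(n_snps, t=0.6, innovation_sd=0.1, seed=1):
    rng = random.Random(seed)
    effects = []
    prev = 0.0
    for _ in range(n_snps):
        prev = t * prev + rng.gauss(0.0, innovation_sd)
        effects.append(prev)
    return effects

def lag_corr(xs, lag):
    """Pearson correlation between the series and itself shifted by `lag`."""
    a, b = xs[:-lag], xs[lag:]
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

effects = simulate_ante1_effects(1000)
# Adjacent effects are strongly correlated; distant effects are nearly independent.
```

This contrasts with BayesA/BayesB, where every SNP effect would be drawn independently and all lag correlations would be near zero.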

  3. Predictive Factors of Clinical Response of Infliximab Therapy in Active Nonradiographic Axial Spondyloarthritis Patients

    Directory of Open Access Journals (Sweden)

    Zhiming Lin

    2015-01-01

    Full Text Available Objectives. To evaluate the efficacy and the predictive factors of clinical response of infliximab in active nonradiographic axial spondyloarthritis patients. Methods. Active nonradiographic patients fulfilling ESSG criteria for SpA but not fulfilling the modified New York criteria were included. All patients received infliximab treatment for 24 weeks. The primary endpoint was ASAS20 response at weeks 12 and 24. The abilities of baseline parameters and of response at week 2 to predict ASAS20 response at weeks 12 and 24 were assessed using ROC curve and logistic regression analysis, respectively. Results. Of the 70 axial SpA patients included, the proportions of patients achieving an ASAS20 response at weeks 2, 6, 12, and 24 were 85.7%, 88.6%, 87.1%, and 84.3%, respectively. Baseline MRI sacroiliitis score (AUC = 0.791; P = 0.005), CRP (AUC = 0.75; P = 0.017), and ASDAS (AUC = 0.778; P = 0.007) significantly predicted ASAS20 response at week 12. However, only ASDAS (AUC = 0.696; P = 0.040) significantly predicted ASAS20 response at week 24. Achievement of an ASAS20 response after the first infliximab infusion was a significant predictor of subsequent ASAS20 response at weeks 12 and 24 (Wald χ² = 6.87, P = 0.009 and Wald χ² = 5.171, P = 0.023). Conclusions. Infliximab shows efficacy in active nonradiographic axial spondyloarthritis patients. The ASDAS score and first-dose response could help predict the clinical efficacy of infliximab therapy in these patients.

  4. Spatial Models for Prediction and Early Warning of Aedes aegypti Proliferation from Data on Climate Change and Variability in Cuba.

    Science.gov (United States)

    Ortiz, Paulo L; Rivero, Alina; Linares, Yzenia; Pérez, Alina; Vázquez, Juan R

    2015-04-01

    Climate variability, the primary expression of climate change, is one of the most important environmental problems affecting human health, particularly vector-borne diseases. Despite research efforts worldwide, there are few studies addressing the use of information on climate variability for prevention and early warning of vector-borne infectious diseases. Our objective was to show the utility of climate information for vector surveillance by developing spatial models using an entomological indicator and information on predicted climate variability in Cuba to provide early warning of increased risk of dengue transmission. An ecological study was carried out using retrospective and prospective analyses of time series combined with spatial statistics. Several entomological and climatic indicators were considered using the complex Bultó indices -1 and -2. Moran's I spatial autocorrelation coefficient, specified for a matrix of neighbors with a radius of 20 km, was used to identify the spatial structure. Spatial structure simulation was based on simultaneous autoregressive and conditional autoregressive models; agreement between predicted and observed values for the number of Aedes aegypti foci was determined by the concordance index Di and skill factor Bi. Spatial and temporal distributions of populations of Aedes aegypti were obtained. Models for describing, simulating and predicting spatial patterns of Aedes aegypti populations associated with climate variability patterns were put forward. The ranges of climate variability affecting Aedes aegypti populations were identified. Forecast maps were generated at the municipal level. Using the Bultó indices of climate variability, it is possible to construct spatial models for predicting increased Aedes aegypti populations in Cuba. At 20 x 20 km resolution, the models are able to provide warning of potential changes in vector populations in rainy and dry seasons and by month, thus demonstrating the usefulness of climate information for
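Moran's I, the spatial-structure statistic named in the record, is I = (n/W) · Σᵢⱼ wᵢⱼ(xᵢ − x̄)(xⱼ − x̄) / Σᵢ(xᵢ − x̄)². The sketch below computes it for a toy one-dimensional adjacency structure (the study used a 20 km neighbor matrix, which we do not reproduce); positive values indicate spatial clustering.

```python
# Sketch: Moran's I spatial autocorrelation from a binary weight matrix.
# `weights` maps (i, j) -> w_ij; here simple 1-D adjacency as a stand-in
# for the study's 20 km neighbor matrix.

def morans_i(values, weights):
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(w * dev[i] * dev[j] for (i, j), w in weights.items())
    den = sum(d * d for d in dev)
    w_sum = sum(weights.values())
    return (n / w_sum) * (num / den)

# Hypothetical foci counts along a transect; neighbors are adjacent cells.
values = [1.0, 2.0, 3.0, 4.0]
weights = {(i, j): 1.0 for i in range(4) for j in range(4) if abs(i - j) == 1}
print(round(morans_i(values, weights), 4))  # 0.3333 -> positive spatial autocorrelation
```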

  5. External validation and clinical utility of a prediction model for 6-month mortality in patients undergoing hemodialysis for end-stage kidney disease.

    Science.gov (United States)

    Forzley, Brian; Er, Lee; Chiu, Helen Hl; Djurdjev, Ognjenka; Martinusen, Dan; Carson, Rachel C; Hargrove, Gaylene; Levin, Adeera; Karim, Mohamud

    2018-02-01

    End-stage kidney disease is associated with poor prognosis. Health care professionals must be prepared to address end-of-life issues and identify those at high risk for dying. A 6-month mortality prediction model for patients on dialysis derived in the United States is used but has not been externally validated. We aimed to assess the external validity and clinical utility in an independent cohort in Canada. We examined the performance of the published 6-month mortality prediction model, using discrimination, calibration, and decision curve analyses. Data were derived from a cohort of 374 prevalent dialysis patients in two regions of British Columbia, Canada, which included serum albumin, age, peripheral vascular disease, dementia, and answers to the "the surprise question" ("Would I be surprised if this patient died within the next year?"). The observed mortality in the validation cohort was 11.5% at 6 months. The prediction model had reasonable discrimination (c-stat = 0.70) but poor calibration (calibration-in-the-large = -0.53 (95% confidence interval: -0.88, -0.18); calibration slope = 0.57 (95% confidence interval: 0.31, 0.83)) in our data. Decision curve analysis showed the model only has added value in guiding clinical decision in a small range of threshold probabilities: 8%-20%. Despite reasonable discrimination, the prediction model has poor calibration in this external study cohort; thus, it may have limited clinical utility in settings outside of where it was derived. Decision curve analysis clarifies limitations in clinical utility not apparent by receiver operating characteristic curve analysis. This study highlights the importance of external validation of prediction models prior to routine use in clinical practice.
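The discrimination measure reported above, the c-statistic, is the probability that a randomly chosen patient who died received a higher predicted risk than a randomly chosen survivor (ties counted one half). A minimal sketch, with an entirely hypothetical cohort:

```python
# Sketch: c-statistic (concordance / ROC area) from predicted risks and outcomes.

def c_statistic(risks, outcomes):
    """risks: predicted probabilities; outcomes: 1 = died, 0 = survived."""
    events = [r for r, y in zip(risks, outcomes) if y == 1]
    nonevents = [r for r, y in zip(risks, outcomes) if y == 0]
    concordant = 0.0
    for e in events:
        for n in nonevents:
            if e > n:
                concordant += 1.0
            elif e == n:
                concordant += 0.5
    return concordant / (len(events) * len(nonevents))

# Hypothetical patients: higher predicted risk loosely tracks death.
risks = [0.05, 0.10, 0.20, 0.30, 0.60, 0.80]
outcomes = [0, 0, 1, 0, 1, 1]
print(round(c_statistic(risks, outcomes), 3))  # 0.889
```

Calibration, where the model failed in this cohort, is a separate question: it compares the level of predicted risk against observed mortality, which is why a model can discriminate acceptably (c-stat = 0.70) yet still be miscalibrated.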

  6. New target prediction and visualization tools incorporating open source molecular fingerprints for TB Mobile 2.0.

    Science.gov (United States)

    Clark, Alex M; Sarker, Malabika; Ekins, Sean

    2014-01-01

    We recently developed a freely available mobile app (TB Mobile) for both iOS and Android platforms that displays Mycobacterium tuberculosis (Mtb) active molecule structures and their targets with links to associated data. The app was developed to make target information available to as large an audience as possible. We now report a major update of the iOS version of the app. This includes enhancements that use an implementation of ECFP_6 fingerprints that we have made open source. Using these fingerprints, the user can propose compounds with possible anti-TB activity, and view the compounds within a cluster landscape. Proposed compounds can also be compared to existing target data, using a naïve Bayesian scoring system to rank probable targets. We have curated an additional 60 new compounds and their targets for Mtb and added these to the original set of 745 compounds. We have also curated 20 further compounds (many without targets in TB Mobile) to evaluate this version of the app with 805 compounds and associated targets. TB Mobile can now manage a small collection of compounds that can be imported from external sources, or exported by various means such as email or app-to-app inter-process communication. This means that TB Mobile can be used as a node within a growing ecosystem of mobile apps for cheminformatics. It can also cluster compounds and use internal algorithms to help identify potential targets based on molecular similarity. TB Mobile represents a valuable dataset, data-visualization aid and target prediction tool.
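Similarity-based clustering and target ranking of the kind described here rest on fingerprint comparison. With ECFP-style fingerprints represented as sets of hashed feature bits, the standard measure is the Tanimoto coefficient; the fingerprints below are invented bit sets, not output of the app's actual ECFP_6 implementation.

```python
# Sketch: Tanimoto similarity |A ∩ B| / |A ∪ B| between fingerprint bit sets.

def tanimoto(fp_a, fp_b):
    if not fp_a and not fp_b:
        return 1.0  # two empty fingerprints: conventionally identical
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter)

# Hypothetical fingerprints (bit indices) for three compounds.
query = {3, 17, 42, 99, 256}
known = {3, 17, 42, 407}
other = {1, 2}
print(tanimoto(query, known))  # 3 shared of 6 distinct bits -> 0.5
print(tanimoto(query, other))  # no shared bits -> 0.0
```

A proposed compound would be compared against every annotated compound this way, and targets of the nearest neighbours weighted into a ranked list.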

  7. Blood glucose, lactate, pyruvate, glycerol, 3-hydroxybutyrate and acetoacetate measurements in man using a centrifugal analyser with a fluorimetric attachment.

    Science.gov (United States)

    Harrison, J; Hodson, A W; Skillen, A W; Stappenbeck, R; Agius, L; Alberti, K G

    1988-03-01

    Methods are described for the analysis of glucose, lactate, pyruvate, alanine, glycerol, 3-hydroxybutyrate and acetoacetate in perchloric acid extracts of human blood, using the Cobas Bio centrifugal analyser fitted with a fluorimetric attachment. Intra-assay and inter-assay coefficients of variation ranged from 1.9 to 7.9% and from 1.0 to 7.2% respectively. Correlation coefficients ranged from 0.96 to 0.99 against established continuous-flow and manual spectrophotometric methods. All seven metabolites can be measured using a single perchloric acid extract of 20 microliter of blood. The versatility of the assays is such that as little as 100 pmol pyruvate, 3-hydroxybutyrate or as much as 15 nmol glucose can be measured in the same 20 microliter extract.

  8. Analysis of MRI by fractals for prediction of sensory attributes: A case study in loin

    DEFF Research Database (Denmark)

    Caballero, Daniel; Antequera, Teresa; Caro, Andrés

    2018-01-01

    This study investigates the use of fractal algorithms to analyse MRI of meat products, specifically loin, in order to determine sensory parameters of loin. For that, the capability of different fractal algorithms was evaluated (Classical Fractal Algorithm, CFA; Fractal Texture Algorithm, FTA...... was analysed. Results on this study firstly demonstrate the capability of fractal algorithms to analyse MRI from meat product. Different combinations of the analysed techniques can be applied for predicting most sensory attributes of loins adequately (R > 0.5). However, the combination of SE, OPFTA and MLR...... offered the most appropriate results. Thus, it could be proposed as an alternative to the traditional food technology methods....

  9. Measurement of the analysing power of elastic proton-proton scattering at 582 MeV

    International Nuclear Information System (INIS)

    Berdoz, A.; Favier, B.; Foroughi, F.; Weddigen, C.

    1984-01-01

    The authors have measured the analysing power of elastic proton-proton scattering at 582 MeV for 14 angles from 20° to 80° CM. The angular range was limited to angles >20° by the energy loss of the recoil protons. The experiment was performed at the PM1 beam line at SIN. A beam intensity of about 10⁸ particles s⁻¹ was used. (Auth.)

  10. Predicting Effects of Water Regime Changes on Waterbirds: Insights from Staging Swans.

    Science.gov (United States)

    Nolet, Bart A; Gyimesi, Abel; van Krimpen, Roderick R D; de Boer, Willem F; Stillman, Richard A

    2016-01-01

    Predicting the environmental impact of a proposed development is notoriously difficult, especially when future conditions fall outside the current range of conditions. Individual-based approaches have been developed and applied to predict the impact of environmental changes on wintering and staging coastal bird populations. How many birds make use of staging sites is mostly determined by food availability and accessibility, which in the case of many waterbirds in turn is affected by water level. Many water systems are regulated and water levels are maintained at target levels, set by management authorities. We used an individual-based modelling framework (MORPH) to analyse how different target water levels affect the number of migratory Bewick's swans Cygnus columbianus bewickii staging at a shallow freshwater lake (Lauwersmeer, the Netherlands) in autumn. As an emerging property of the model, we found strong non-linear responses of swan usage to changes in water level, with a sudden drop in peak numbers as well as bird-days with a 0.20 m rise above the current target water level. Such strong non-linear responses are probably common and should be taken into account in environmental impact assessments.

  11. Predicting Effects of Water Regime Changes on Waterbirds: Insights from Staging Swans.

    Directory of Open Access Journals (Sweden)

    Bart A Nolet

    Full Text Available Predicting the environmental impact of a proposed development is notoriously difficult, especially when future conditions fall outside the current range of conditions. Individual-based approaches have been developed and applied to predict the impact of environmental changes on wintering and staging coastal bird populations. How many birds make use of staging sites is mostly determined by food availability and accessibility, which in the case of many waterbirds in turn is affected by water level. Many water systems are regulated and water levels are maintained at target levels, set by management authorities. We used an individual-based modelling framework (MORPH to analyse how different target water levels affect the number of migratory Bewick's swans Cygnus columbianus bewickii staging at a shallow freshwater lake (Lauwersmeer, the Netherlands in autumn. As an emerging property of the model, we found strong non-linear responses of swan usage to changes in water level, with a sudden drop in peak numbers as well as bird-days with a 0.20 m rise above the current target water level. Such strong non-linear responses are probably common and should be taken into account in environmental impact assessments.

  12. Tributyltin synergizes with 20-hydroxyecdysone to produce endocrine toxicity.

    Science.gov (United States)

    Wang, Ying H; Kwon, Gwijun; Li, Hong; Leblanc, Gerald A

    2011-09-01

    One of the great challenges facing modern toxicology is in predicting the hazard associated with chemical mixtures. The development of effective means of predicting the toxicity of chemical mixtures requires an understanding of how chemicals interact to produce nonadditive outcomes (e.g., synergy). We hypothesized that tributyltin would elicit toxicity in daphnids (Daphnia magna) by exaggerating physiological responses to 20-hydroxyecdysone signaling via synergistic activation of the retinoid X receptor (RXR):ecdysteroid receptor (EcR) complex. Using reporter gene assays, we demonstrated that RXR, alone, is activated by a variety of ligands including tributyltin, whereas RXR:EcR heterodimers were not activated by tributyltin. However, tributyltin, in combination with the daphnid EcR ligand 20-hydroxyecdysone, caused concentration-dependent, synergistic activation of the RXR:EcR reporter. Electrophoretic mobility shift assays revealed that tributyltin did not enhance the activity of 20-hydroxyecdysone by increasing binding of the receptor complex to a DR-4 DNA-binding site. Exposure of daphnids to elevated concentrations of 20-hydroxyecdysone caused premature and incomplete ecdysis resulting in death. Tributyltin exaggerated this effect of exogenous 20-hydroxyecdysone. Further, exposure of daphnids to tributyltin enhanced the inductive effects of 20-hydroxyecdysone on expression of the 20-hydroxyecdysone-inducible gene HR3. Continuous, prolonged exposure of maternal daphnids to concentrations of tributyltin resulted in mortality concurrent with molting. Taken together, these results demonstrate that xenobiotics, such as tributyltin, can interact with RXR to influence gene expression regulated by the heterodimeric partner to RXR. The result of such interactions can be toxicity due to inappropriate or exaggerated hormonal signaling. The application of the in vitro/in vivo approach used in this study is discussed in relation to modeling of nonadditive interactions

  13. Prediction of size-fractionated airborne particle-bound metals using MLR, BP-ANN and SVM analyses.

    Science.gov (United States)

    Leng, Xiang'zi; Wang, Jinhua; Ji, Haibo; Wang, Qin'geng; Li, Huiming; Qian, Xin; Li, Fengying; Yang, Meng

    2017-08-01

    Size-fractionated heavy metal concentrations were observed in airborne particulate matter (PM) samples collected from 2014 to 2015 (spanning all four seasons) from suburban (Xianlin) and industrial (Pukou) areas in Nanjing, a megacity of southeast China. Rapid prediction models of size-fractionated metals were established based on multiple linear regression (MLR), back propagation artificial neural network (BP-ANN) and support vector machine (SVM), using meteorological factors and PM concentrations as input parameters. About 38% and 77% of PM2.5 concentrations in Xianlin and Pukou, respectively, were beyond the Chinese National Ambient Air Quality Standard limit of 75 μg/m³. Nearly all elements had higher concentrations in industrial areas, and in winter among the four seasons. Anthropogenic elements such as Pb, Zn, Cd and Cu showed larger percentages in the fine fraction (ø ≤ 2.5 μm), whereas the crustal elements including Al, Ba, Fe, Ni, Sr and Ti showed larger percentages in the coarse fraction (ø > 2.5 μm). SVM showed a higher training correlation coefficient (R), and lower mean absolute error (MAE) as well as lower root mean square error (RMSE), than MLR and BP-ANN for most metals. All three methods showed better prediction results for Ni, Al, V, Cd and As, whereas results were relatively poor for Cr and Fe. The daily airborne metal concentrations in 2015 were then predicted by the fully trained SVM models, and the results showed that the heaviest pollution of airborne heavy metals occurred in December and January, whereas the lightest pollution occurred in June and July. Copyright © 2017 Elsevier Ltd. All rights reserved.
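The three statistics used to compare MLR, BP-ANN and SVM (Pearson R, MAE, RMSE) are straightforward to compute; the sketch below does so for a hypothetical observed-versus-predicted series (the concentrations are invented, not from the study).

```python
# Sketch: evaluation statistics for an observed vs predicted series.
import math

def evaluate(observed, predicted):
    """Return (Pearson R, mean absolute error, root mean square error)."""
    n = len(observed)
    mo = sum(observed) / n
    mp = sum(predicted) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(observed, predicted))
    vo = sum((o - mo) ** 2 for o in observed)
    vp = sum((p - mp) ** 2 for p in predicted)
    r = cov / math.sqrt(vo * vp)
    mae = sum(abs(o - p) for o, p in zip(observed, predicted)) / n
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return r, mae, rmse

# Hypothetical daily Pb concentrations (ng/m³): observed vs model output.
obs = [12.0, 15.0, 9.0, 20.0, 14.0]
pred = [11.0, 16.0, 10.0, 18.0, 15.0]
r, mae, rmse = evaluate(obs, pred)
```

A "better" model in the study's sense is one with higher R and lower MAE and RMSE on held-out data.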

  14. Hidden Markov Model for quantitative prediction of snowfall

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in Pir-Panjal and Great Himalayan mountain ranges of Indian Himalaya. The model predicts snowfall for two days in advance using daily recorded nine meteorological variables of past 20 winters from 1992–2012. There are six ...
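The record gives no model internals, so the sketch below is a toy two-state HMM (hypothetical states, observations and probabilities, not the nine-variable model of the paper) decoded with the Viterbi algorithm, just to show the machinery a snowfall HMM builds on.

```python
# Sketch: Viterbi decoding of a toy snowfall HMM. Hidden states are snowfall
# categories; observations are discretised meteorological readings.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state path for an observation sequence."""
    v = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        v.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (v[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p) for p in states
            )
            v[t][s] = prob
            back[t][s] = prev
    path = [max(v[-1], key=v[-1].get)]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

states = ("no_snow", "snow")
start = {"no_snow": 0.7, "snow": 0.3}
trans = {"no_snow": {"no_snow": 0.8, "snow": 0.2},
         "snow": {"no_snow": 0.4, "snow": 0.6}}
emit = {"no_snow": {"cold_dry": 0.8, "cold_humid": 0.2},
        "snow": {"cold_dry": 0.2, "cold_humid": 0.8}}
print(viterbi(["cold_dry", "cold_humid", "cold_humid"], states, start, trans, emit))
# ['no_snow', 'snow', 'snow']
```

For quantitative forecasting, the same transition/emission machinery is run forward to score the likelihood of each snowfall category over the next two days.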

  15. Association of Climatic Variability, Vector Population and Malarial Disease in District of Visakhapatnam, India: A Modeling and Prediction Analysis.

    Science.gov (United States)

    Srimath-Tirumula-Peddinti, Ravi Chandra Pavan Kumar; Neelapu, Nageswara Rao Reddy; Sidagam, Naresh

    2015-01-01

    Malarial incidence, severity, dynamics and distribution of malaria are strongly determined by climatic factors, i.e., temperature, precipitation, and relative humidity. The objectives of the current study were to analyse and model the relationships among climate, vector and malaria disease in the district of Visakhapatnam, India to understand the malaria transmission mechanism (MTM). Epidemiological, vector and climate data were analysed for the years 2005 to 2011 in Visakhapatnam to understand the magnitude, trends and seasonal patterns of the malarial disease. The statistical software MINITAB ver. 14 was used for performing correlation, linear and multiple regression analysis. Perennial malaria disease incidence and mosquito population were observed in the district of Visakhapatnam, with seasonal peaks. All the climatic variables have a significant influence on disease incidence as well as on mosquito populations. Correlation coefficient analysis, seasonal index and seasonal analysis demonstrated significant relationships among climatic factors, mosquito population and malaria disease incidence in the district of Visakhapatnam, India. Multiple regression and ARIMA (I) models are the best-suited models for modeling and prediction of disease incidence and mosquito population. Predicted values of average temperature, mosquito population and malarial cases increased year by year. The developed MTM algorithm observed a major MTM cycle following the June to August rains and occurring between June and September, and minor MTM cycles following March to April rains and occurring between March and April in the district of Visakhapatnam. Fluctuations in climatic factors favored an increase in mosquito populations and thereby increased the number of malarial cases. Rainfall, temperatures (20°C to 33°C) and humidity (66% to 81%) maintained a warmer, wetter climate for mosquito growth, parasite development and malaria transmission. Changes in climatic factors influence malaria directly by
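One of the tools the record names, the seasonal index, expresses each month's mean case count relative to the overall monthly mean; values above 1 flag transmission seasons. A minimal sketch with invented monthly counts:

```python
# Sketch: seasonal index of a monthly case series (ratio to the overall mean).

def seasonal_index(monthly_cases):
    """monthly_cases: 12 mean counts, Jan..Dec; returns 12 ratios."""
    overall = sum(monthly_cases) / len(monthly_cases)
    return [m / overall for m in monthly_cases]

# Hypothetical monthly mean malaria cases with a June-September peak.
cases = [40, 35, 55, 60, 50, 90, 120, 130, 100, 60, 45, 35]
idx = seasonal_index(cases)
peak_month = idx.index(max(idx)) + 1  # month number of the strongest season
```

By construction the twelve indices average to 1, so months with an index above 1 mark the transmission cycles the MTM algorithm looks for.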

  16. A comparison of cephalometric analyses for assessing sagittal jaw relationship

    International Nuclear Information System (INIS)

    Erum, G.; Fida, M.

    2008-01-01

    To compare seven methods of cephalometric analysis for assessing sagittal jaw relationship and to determine the level of agreement between them. Seven measures describing anteroposterior jaw relationships (A-B plane, ANB, Wits, AXB, AF-BF, FABA and Beta angle) were recorded on the lateral cephalographs of 85 patients. Correlation analysis, using Cramer's V-test, was performed to determine the possible agreement between pairs of analyses. The mean age of the sample, comprising 35 males and 50 females, was 15 years and 3 months. Statistically significant relationships were found among the seven sagittal parameters, with p-value <0.001. Very strong correlation was found between AXB and AF-BF distance (r=0.924), and weak correlation between ANB and Beta angle (r=0.377). The Wits appraisal showed the greatest coefficient of variability. Despite varying strengths of association, statistically significant correlations were found among the seven methods for assessing sagittal jaw relationship. FABA and A-B plane may be used to predict the skeletal class in addition to the established ANB angle. (author)
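Cramér's V, the association measure used above, is computed from the chi-square statistic of a contingency table: V = sqrt(χ²/(n·(k−1))), with k the smaller table dimension. The sketch below applies it to an invented 3×3 agreement table (skeletal class I/II/III assigned by two analyses), not the study's actual data.

```python
# Sketch: Cramér's V between two categorical classifications.
import math

def cramers_v(table):
    """table: rows x cols of observed counts."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_tot[i] * col_tot[j] / n
            chi2 += (observed - expected) ** 2 / expected
    k = min(len(table), len(table[0])) - 1
    return math.sqrt(chi2 / (n * k))

# Hypothetical agreement table: skeletal class I/II/III by two analyses.
table = [[20, 3, 1],
         [4, 25, 2],
         [1, 2, 27]]
v = cramers_v(table)  # close to 1 when the two analyses assign the same class
```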

  17. Thermal and hydraulic analyses of the System 81 cold traps

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.

    1977-06-15

    Thermal and hydraulic analyses of the System 81 Type I and II cold traps were completed, except for the thermal transient analysis. Results are evaluated, discussed, and reported. Analytical models were developed to determine the physical dimensions of the cold traps and to predict their performance. The FFTF cold trap crystallizer performance was simulated using the thermal model. This simulation shows that the analytical model developed predicts reasonably conservative temperatures. Pressure drop and sodium residence time calculations indicate that the present design will meet the requirements specified in the E-Specification. Steady-state temperature data for the critical regions were generated to assess the magnitude of the thermal stress.

  18. Physiological and enzymatic analyses of pineapple subjected to ionizing radiation; Analises fisiologicas e enzimaticas em abacaxi submetido a tecnologia de radiacao ionizante

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Josenilda Maria da [Centro de Energia Nuclear na Agricultura (CENA), Piracicaba, SP (Brazil)]. E-mail: jmnilda@yahoo.com.br; Silva, Juliana Pizarro; Spoto, Marta Helena Fillet [Escola Superior de Agricultura Luiz de Queiroz (ESALQ/USP), Piracicaba, SP (Brazil)

    2007-07-15

    The physiological and enzymatic post-harvest characteristics of the pineapple cultivar Smooth Cayenne were evaluated after the fruits were gamma-irradiated with doses of 100 and 150 Gy and stored for 10, 20 and 30 days at 12 °C (±1) and a relative humidity of 85% (±5). Physiological and enzymatic analyses were made for each storage period to evaluate the alterations resulting from the application of ionizing radiation. Control specimens showed higher values of soluble pectins, total pectins, reducing sugars, sucrose and total sugars, and lower activities of the polyphenol oxidase and polygalacturonase enzymes. All the analyses indicated that storage time is a significantly influencing factor. The 100 Gy dose and 20-day storage period presented the best results from the standpoint of maturation and conservation of fruit quality. (author)

  19. A Prediction Study of Path Loss Models from 2-73.5 GHz in an Urban-Macro Environment

    DEFF Research Database (Denmark)

    Thomas, Timothy; Rybakowski, Marcin; Sun, Shu

    2016-01-01

    can roughly be broken into two categories, ones that have some anchor in physics, and ones that curve-match only over the data set without any physical anchor. In this paper we use both real-world measurements from 2.0 to 28 GHz and ray-tracing studies from 2.0 to 73.5 GHz, both in an urban-macro...... environment, to assess the prediction performance of the two PL modeling techniques. In other words we look at how the two different PL modeling techniques perform when the PL model is applied to a prediction set which is different in distance, frequency, or environment from a measurement set where...

  20. Static and fatigue experimental tests on a full scale fuselage panel and FEM analyses

    Directory of Open Access Journals (Sweden)

    Raffaele Sepe

    2016-02-01

    Full Text Available A fatigue test on a full-scale panel with complex loading conditions and geometry configuration has been carried out using a triaxial test machine. The demonstrator is made up of two skins which are linked by a transversal butt joint, parallel to the stringer direction. A fatigue load was applied in the direction normal to the longitudinal joint, while a constant load was applied in the longitudinal joint direction. The test panel was instrumented with strain gages, and quasi-static tests were previously conducted to ensure proper load transfer to the panel. In order to support the tests, geometrically nonlinear shell finite element analyses were conducted to predict strain and stress distributions. The demonstrator broke up after about 177,000 cycles. Subsequently, a finite element analysis (FEA) was carried out in order to correlate failure events; due to the biaxial nature of the fatigue loads, the Sines criterion was used. The analysis was performed taking into account the different materials of which the panel is composed. The numerical results show a good correlation with experimental data, successfully predicting the failure locations on the panel.
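The Sines criterion invoked above handles exactly this situation: a cyclic stress in one direction superimposed on a constant stress in another. In one common form, failure is predicted when sqrt(J2) of the stress-amplitude tensor plus a material constant times the mean hydrostatic stress exceeds a fatigue strength parameter. The sketch below assumes a biaxial principal stress state and invented material constants; it is an illustration of the criterion's structure, not the paper's calculation.

```python
# Sketch of one common form of the Sines multiaxial fatigue criterion:
#   sqrt(J2_amplitude) + alpha * sigma_hydrostatic_mean  <=  beta
import math

def sines_left_side(s1_amp, s2_amp, s1_mean, s2_mean, alpha):
    """Biaxial principal stresses in MPa; the third principal stress is taken as zero."""
    sqrt_j2_amp = math.sqrt((s1_amp ** 2 - s1_amp * s2_amp + s2_amp ** 2) / 3.0)
    hydro_mean = (s1_mean + s2_mean) / 3.0
    return sqrt_j2_amp + alpha * hydro_mean

def sines_fails(s1_amp, s2_amp, s1_mean, s2_mean, alpha, beta):
    """True if the criterion predicts fatigue failure."""
    return sines_left_side(s1_amp, s2_amp, s1_mean, s2_mean, alpha) > beta

# Hypothetical panel state: 60 MPa fatigue amplitude normal to the joint with a
# 60 MPa mean, 80 MPa constant along the joint; alpha and beta are invented
# material parameters, purely for illustration.
print(sines_fails(60.0, 0.0, 60.0, 80.0, alpha=0.5, beta=100.0))  # False
```

Note how the constant longitudinal load enters only through the mean hydrostatic term: it does not cycle, yet it still penalises fatigue life.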

  1. Analysis of the uranium price predicted to 24 months, implementing neural networks and the Monte Carlo method like predictive tools

    International Nuclear Information System (INIS)

    Esquivel E, J.; Ramirez S, J. R.; Palacios H, J. C.

    2011-11-01

    The present work presents predicted prices of uranium, obtained using a neural network. Predicting the financial indexes of an energy resource allows budgetary measures to be established, as well as the costs of the resource over the medium term. Uranium is one of the main energy-generating fuels and, as such, its price carries weight in financial analyses; predictive methods are therefore used to obtain an outline of the financial behaviour it will have over a given period. In this study, two methodologies are used for the prediction of the uranium price: the Monte Carlo method and neural networks. These methods allow the prediction of monthly cost indexes for a two-year period, starting from the second bimester of 2011. The prediction uses uranium costs registered from the year 2005. (Author)
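The Monte Carlo side of such a study can be sketched simply: estimate drift and volatility of monthly log-returns from the historical series, simulate many random price paths, and average them month by month. The prices and parameters below are invented; this is not the paper's model or data.

```python
# Sketch: Monte Carlo forecast of monthly prices via a geometric random walk
# whose drift and volatility are estimated from the historical series.
import math
import random
import statistics

def monte_carlo_forecast(history, months=24, n_paths=2000, seed=7):
    """history: past monthly prices; returns the mean simulated price per month."""
    log_returns = [math.log(b / a) for a, b in zip(history, history[1:])]
    mu = statistics.mean(log_returns)
    sigma = statistics.stdev(log_returns)
    rng = random.Random(seed)
    totals = [0.0] * months
    for _ in range(n_paths):
        price = history[-1]
        for m in range(months):
            price *= math.exp(rng.gauss(mu, sigma))
            totals[m] += price
    return [t / n_paths for t in totals]

# Hypothetical uranium spot prices (USD/lb), one per month:
history = [40.0, 42.0, 41.0, 44.0, 46.0, 45.0, 48.0, 50.0]
forecast = monte_carlo_forecast(history)  # 24 monthly mean prices
```

A neural network, the paper's second method, would instead learn the mapping from recent prices to the next price; the Monte Carlo average gives a distribution-based baseline to compare it against.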

  2. Nurses' intention to leave: critically analyse the theory of reasoned action and organizational commitment model.

    Science.gov (United States)

    Liou, Shwu-Ru

    2009-01-01

    To systematically analyse the Organizational Commitment model and Theory of Reasoned Action and determine concepts that can better explain nurses' intention to leave their job. The Organizational Commitment model and Theory of Reasoned Action have been proposed and applied to understand intention to leave and turnover behaviour, which are major contributors to nursing shortage. However, the appropriateness of applying these two models in nursing was not analysed. Three main criteria of a useful model were used for the analysis: consistency in the use of concepts, testability and predictability. Both theories use concepts consistently. Concepts in the Theory of Reasoned Action are defined broadly whereas they are operationally defined in the Organizational Commitment model. Predictability of the Theory of Reasoned Action is questionable whereas the Organizational Commitment model can be applied to predict intention to leave. A model was proposed based on this analysis. Organizational commitment, intention to leave, work experiences, job characteristics and personal characteristics can be concepts for predicting nurses' intention to leave. Nursing managers may consider nurses' personal characteristics and experiences to increase their organizational commitment and enhance their intention to stay. Empirical studies are needed to test and cross-validate the re-synthesized model for nurses' intention to leave their job.

  3. The 20-20-20 Airships NASA Centennial Challenge

    Science.gov (United States)

    Kiessling, Alina; Diaz, Ernesto; Rhodes, Jason; Ortega, Sam; Eberly, Eric

    2015-08-01

    A 2013 Keck Institute for Space Studies (KISS) study examined airships as a possible platform for Earth and space science. Airships (lighter-than-air, powered, maneuverable vehicles) could offer significant gains in observing time, sky and ground coverage, data downlink capability, and continuity of observations over existing suborbital options at competitive prices. The KISS study recommended three courses of action to spur the development and use of airships as a science platform. One of those recommendations was that a prize competition be developed to demonstrate a stratospheric airship. Consequently, we have been developing a NASA Centennial Challenge (www.nasa.gov/challenges) to spur innovation in stratospheric airships as a science platform. We anticipate a multi-million-dollar prize for the first organization to fly a powered airship that remains stationary at 20 km (65,000 ft) altitude for over 20 hours with a 20 kg payload. The design must be scalable to longer flights with more massive payloads. A second prize tier, for a 20 km flight lasting 200 hours with a 200 kg payload, would incentivize a further step toward a scientifically compelling and viable new platform. This technology would also have broad commercial applications including communications, asset tracking, and surveillance. Via the 20-20-20 Centennial Challenge, we are seeking to spur private industry (or non-profit institutions, including universities) to demonstrate the capability for sustained airship flights as astronomy and Earth science platforms.

  4. Quality of meta-analyses in major leading gastroenterology and hepatology journals: A systematic review.

    Science.gov (United States)

    Liu, Pengfei; Qiu, Yuanyu; Qian, Yuting; Chen, Xiao; Wang, Yiran; Cui, Jin; Zhai, Xiao

    2017-01-01

    To appraise the current reporting and methodological quality of meta-analyses in five leading gastroenterology and hepatology journals, and to identify the variables associated with reporting quality. We systematically searched the literature for meta-analyses in Gastroenterology, Gut, Hepatology, Journal of Hepatology (J HEPATOL) and American Journal of Gastroenterology (AM J GASTROENTEROL) from 2006 to 2008 and from 2012 to 2014. Characteristics were extracted based on the PRISMA statement and the AMSTAR tool. Country, number of patients and funding source were also recorded and descriptively reported. A total of 127 meta-analyses were included in this study and were compared among journals, study years and other characteristics. Compliance with the PRISMA statement and the AMSTAR checklist was 20.8 ± 4.2 out of a maximum of 27 and 7.6 ± 2.4 out of a maximum of 11, respectively. Some domains were poorly reported, including describing a protocol and/or registration (item 5, 0.0%) and describing methods and giving results of additional analyses (item 16, 45.7% and item 23, 48.0%) for PRISMA, and duplicating study selection and data extraction (item 2, 53.5%) and providing a list of included and excluded studies (item 5, 14.2%) for AMSTAR. Meta-analyses published in recent years showed a significantly better methodological quality than those published in earlier years. This study shows that the methodological reporting quality of meta-analyses in the major gastroenterology and hepatology journals has improved in recent years after publication of the PRISMA statement, and that it can be further improved. © 2016 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.

  5. A statistical model to predict total column ozone in Peninsular Malaysia

    Science.gov (United States)

    Tan, K. C.; Lim, H. S.; Mat Jafri, M. Z.

    2016-03-01

    This study aims to predict monthly columnar ozone in Peninsular Malaysia based on concentrations of several atmospheric gases. Data pertaining to five atmospheric gases (CO2, O3, CH4, NO2, and H2O vapor) were retrieved by satellite scanning imaging absorption spectrometry for atmospheric chartography from 2003 to 2008 and used to develop a model to predict columnar ozone in Peninsular Malaysia. Analyses of the northeast monsoon (NEM) and southwest monsoon (SWM) seasons were conducted separately. Based on the Pearson correlation matrices, columnar ozone was negatively correlated with H2O vapor but positively correlated with CO2 and NO2 during both the NEM and SWM seasons from 2003 to 2008. This result was expected because NO2 is a precursor of ozone. Therefore, an increase in columnar ozone concentration is associated with an increase in NO2 but a decrease in H2O vapor. In the NEM season, columnar ozone was negatively correlated with H2O (-0.847), positively correlated with NO2 (0.754) and CO2 (0.477), and weakly negatively correlated with CH4 (-0.035). In the SWM season, columnar ozone was highly positively correlated with NO2 (0.855), CO2 (0.572), and CH4 (0.321), and highly negatively correlated with H2O (-0.832). Both multiple regression and principal component analyses were used to predict columnar ozone values in Peninsular Malaysia. We obtained the best-fitting regression equations for the columnar ozone data using four independent variables. Our results show approximately the same R value (≈ 0.83) for both the NEM and SWM seasons.
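The two-step workflow described above (a Pearson correlation matrix, then regression on gas concentrations) can be sketched as follows. The gas series below are synthetic stand-ins, not the satellite data used in the study, and the single-predictor fit only illustrates how an R value is obtained:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def fit_simple_ols(x, y):
    """Least-squares intercept and slope for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

random.seed(1)
no2 = [random.uniform(0.5, 2.0) for _ in range(72)]    # hypothetical monthly NO2 level
h2o = [random.uniform(10.0, 40.0) for _ in range(72)]  # hypothetical H2O vapour level
o3 = [250 + 20 * a - 1.2 * b + random.gauss(0, 3) for a, b in zip(no2, h2o)]

print("r(O3, NO2) =", round(pearson(o3, no2), 3))      # expected positive
print("r(O3, H2O) =", round(pearson(o3, h2o), 3))      # expected negative
a, b = fit_simple_ols(no2, o3)
fitted = [a + b * x for x in no2]
print("R =", round(pearson(o3, fitted), 3))            # goodness of the single-gas fit
```

A full reproduction would regress on all four independent variables at once rather than one gas.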

  6. Ecohydrological drought monitoring and prediction using a land data assimilation system

    Science.gov (United States)

    Sawada, Y.; Koike, T.

    2017-12-01

    Despite the importance of the ecological and agricultural aspects of severe droughts, few drought monitoring and prediction systems can forecast deficits in vegetation growth. To address this issue, we have developed a land data assimilation system (LDAS) which can simultaneously simulate soil moisture and vegetation dynamics. By assimilating satellite-observed passive microwave brightness temperature, which is sensitive to both surface soil moisture and vegetation water content, we can significantly improve the skill of a land surface model in simulating surface soil moisture, root-zone soil moisture, and leaf area index (LAI). We run this LDAS to generate a global ecohydrological land surface reanalysis product. In this presentation, we will demonstrate how useful this new reanalysis product is for monitoring and analyzing historical mega-droughts. In addition, using the analyses of soil moisture and LAI as initial conditions, we can forecast ecological and hydrological conditions in the middle of droughts. We will present our recent effort to develop a near-real-time ecohydrological drought monitoring and prediction system in Africa by combining the LDAS and the atmospheric seasonal prediction.
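The analysis step at the heart of a land data assimilation system can be illustrated with a deliberately minimal scalar Kalman update. A real LDAS assimilates brightness temperature through a radiative transfer operator; this sketch assumes the observation maps directly to soil moisture, and all numbers are invented:

```python
def kalman_update(forecast, obs, var_f, var_o):
    """Scalar analysis step: blend a model forecast and an observation,
    weighting by their error variances (gain K = var_f / (var_f + var_o))."""
    k = var_f / (var_f + var_o)
    analysis = forecast + k * (obs - forecast)
    var_a = (1.0 - k) * var_f
    return analysis, var_a

# Toy cycle: the model drifts dry, observations pull it back.
sm, var = 0.30, 0.02            # soil moisture (m3/m3) and its error variance
for obs in [0.28, 0.27, 0.29]:  # invented observation sequence
    sm_f = sm * 0.95            # hypothetical forecast step: drying trend
    sm, var = kalman_update(sm_f, obs, var + 0.005, 0.01)
    print(round(sm, 3))
```

With equal forecast and observation variances the gain is 0.5, so the analysis lands midway between forecast and observation.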

  7. Genomic prediction of complex human traits: relatedness, trait architecture and predictive meta-models

    Science.gov (United States)

    Spiliopoulou, Athina; Nagy, Reka; Bermingham, Mairead L.; Huffman, Jennifer E.; Hayward, Caroline; Vitart, Veronique; Rudan, Igor; Campbell, Harry; Wright, Alan F.; Wilson, James F.; Pong-Wong, Ricardo; Agakov, Felix; Navarro, Pau; Haley, Chris S.

    2015-01-01

    We explore the prediction of individuals' phenotypes for complex traits using genomic data. We compare several widely used prediction models, including Ridge Regression, LASSO and Elastic Nets estimated from cohort data, and polygenic risk scores constructed using published summary statistics from genome-wide association meta-analyses (GWAMA). We evaluate the interplay between relatedness, trait architecture and optimal marker density, by predicting height, body mass index (BMI) and high-density lipoprotein level (HDL) in two data cohorts, originating from Croatia and Scotland. We empirically demonstrate that dense models are better when all genetic effects are small (height and BMI) and target individuals are related to the training samples, while sparse models predict better in unrelated individuals and when some effects have moderate size (HDL). For HDL sparse models achieved good across-cohort prediction, performing similarly to the GWAMA risk score and to models trained within the same cohort, which indicates that, for predicting traits with moderately sized effects, large sample sizes and familial structure become less important, though still potentially useful. Finally, we propose a novel ensemble of whole-genome predictors with GWAMA risk scores and demonstrate that the resulting meta-model achieves higher prediction accuracy than either model on its own. We conclude that although current genomic predictors are not accurate enough for diagnostic purposes, performance can be improved without requiring access to large-scale individual-level data. Our methodologically simple meta-model is a means of performing predictive meta-analysis for optimizing genomic predictions and can be easily extended to incorporate multiple population-level summary statistics or other domain knowledge. PMID:25918167
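The meta-model idea (combining a whole-genome predictor with a GWAMA risk score) can be sketched as a one-parameter stacking weight chosen on validation data. The predictions below are made-up numbers, and a real stacked regression would fit weights less crudely:

```python
def mse(pred, y):
    """Mean squared error of predictions against observed phenotypes."""
    return sum((p - t) ** 2 for p, t in zip(pred, y)) / len(y)

def stack_weight(pred_a, pred_b, y, steps=101):
    """Pick the convex-combination weight w minimising validation MSE of
    w*pred_a + (1-w)*pred_b (a one-parameter stacked meta-model)."""
    grid = (i / (steps - 1) for i in range(steps))
    return min(grid, key=lambda w: mse([w * a + (1 - w) * b
                                        for a, b in zip(pred_a, pred_b)], y))

# Hypothetical validation-set predictions of a trait:
truth = [1.0, 2.0, 3.0, 4.0, 5.0]
ridge = [1.2, 1.8, 3.3, 3.9, 5.2]   # whole-genome predictor
gwama = [0.7, 2.4, 2.6, 4.5, 4.6]   # summary-statistic risk score
w = stack_weight(ridge, gwama, truth)
meta = [w * a + (1 - w) * b for a, b in zip(ridge, gwama)]
print("w =", w, "meta-model MSE =", round(mse(meta, truth), 4))
```

Because the grid contains w = 0 and w = 1, the stacked model can never do worse on the tuning data than the better of its two inputs.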

  8. Muscle Strength Is a Poor Screening Test for Predicting Lower Extremity Injuries in Professional Male Soccer Players: A 2-Year Prospective Cohort Study.

    Science.gov (United States)

    Bakken, Arnhild; Targett, Stephen; Bere, Tone; Eirale, Cristiano; Farooq, Abdulaziz; Mosler, Andrea B; Tol, Johannes L; Whiteley, Rod; Khan, Karim M; Bahr, Roald

    2018-03-01

    Lower extremity muscle strength tests are commonly used to screen for injury risk in professional soccer. However, there is limited evidence on the ability of such tests in predicting future injuries. To examine the association between hip and thigh muscle strength and the risk of lower extremity injuries in professional male soccer players. Case-control study; Level of evidence, 3. Professional male soccer players from 14 teams in Qatar underwent a comprehensive strength assessment at the beginning of the 2013/2014 and 2014/2015 seasons. Testing consisted of concentric and eccentric quadriceps and hamstring isokinetic peak torques, eccentric hip adduction and abduction forces, and bilateral isometric adductor force (squeeze test at 45°). Time-loss injuries and exposure in training and matches were registered prospectively by club medical staff throughout each season. Univariate and multivariate Cox regression analyses were used to calculate hazard ratios (HRs) with 95% CIs. In total, 369 players completed all strength tests and had registered injury and exposure data. Of these, 206 players (55.8%) suffered 538 lower extremity injuries during the 2 seasons; acute muscle injuries were the most frequent. Of the 20 strength measures examined, greater quadriceps concentric peak torque at 300 deg/s (HR, 1.005 [95% CI, 1.00-1.01]; P = .037) was the only strength measure identified as significantly associated with a risk of lower extremity injuries in multivariate analysis. Greater quadriceps concentric peak torque at 60 deg/s (HR, 1.004 [95% CI, 1.00-1.01]; P = .026) was associated with the risk of overuse injuries, and greater bilateral adductor strength adjusted for body weight (HR, 0.75 [95% CI, 0.57-0.97]; P = .032) was associated with a lower risk for any knee injury. Receiver operating characteristic curve analyses indicated poor predictive ability of the significant strength variables (area under the curve, 0.45-0.56).
There was a weak association with the risk of
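A receiver operating characteristic AUC like those reported (0.45-0.56) can be computed directly as the Mann-Whitney probability of correct ranking. The torque values here are invented, chosen to show why heavily overlapping groups yield an AUC near 0.5:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a randomly chosen injured
    player scores higher than a randomly chosen uninjured one (ties = 0.5)."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Invented peak-torque values (Nm) with heavy overlap between groups:
injured = [110, 95, 130, 102, 121]
uninjured = [108, 99, 125, 118, 104]
print(round(roc_auc(injured, uninjured), 2))  # near 0.5: poor discrimination
```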

  9. How often do sensitivity analyses for economic parameters change cost-utility analysis conclusions?

    Science.gov (United States)

    Schackman, Bruce R; Gold, Heather Taffet; Stone, Patricia W; Neumann, Peter J

    2004-01-01

    There is limited evidence about the extent to which sensitivity analysis has been used in the cost-effectiveness literature. Sensitivity analyses for health-related QOL (HR-QOL), cost and discount rate economic parameters are of particular interest because they measure the effects of methodological and estimation uncertainties. To investigate the use of sensitivity analyses in the pharmaceutical cost-utility literature in order to test whether a change in economic parameters could result in a different conclusion regarding the cost effectiveness of the intervention analysed. Cost-utility analyses of pharmaceuticals identified in a prior comprehensive audit (70 articles) were reviewed and further audited. For each base case for which sensitivity analyses were reported (n = 122), up to two sensitivity analyses for HR-QOL (n = 133), cost (n = 99), and discount rate (n = 128) were examined. Article mentions of thresholds for acceptable cost-utility ratios were recorded (total 36). Cost-utility ratios were denominated in US dollars for the year reported in each of the original articles in order to determine whether a different conclusion would have been indicated at the time the article was published. Quality ratings from the original audit for articles where sensitivity analysis results crossed the cost-utility ratio threshold above the base-case result were compared with those that did not. The most frequently mentioned cost-utility thresholds were $US20,000/QALY, $US50,000/QALY, and $US100,000/QALY. The proportions of sensitivity analyses reporting quantitative results that crossed the threshold above the base-case results (or where the sensitivity analysis result was dominated) were 31% for HR-QOL sensitivity analyses, 20% for cost-sensitivity analyses, and 15% for discount-rate sensitivity analyses. Almost half of the discount-rate sensitivity analyses did not report quantitative results. Articles that reported sensitivity analyses where results crossed the cost
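The audit's central question (does a sensitivity analysis push the cost-utility ratio across an acceptability threshold?) reduces to a simple comparison. The ratios below are hypothetical:

```python
def crossings(base_case, sensitivity_results, threshold):
    """Return sensitivity-analysis ratios ($/QALY) that land on the other
    side of the acceptability threshold from the base-case ratio."""
    if base_case <= threshold:
        return [r for r in sensitivity_results if r > threshold]
    return [r for r in sensitivity_results if r <= threshold]

# Hypothetical base case of $35,000/QALY against the $50,000/QALY threshold:
flagged = crossings(35_000, [42_000, 61_000, 55_000, 48_000], 50_000)
print(flagged)  # analyses that would reverse the cost-effectiveness conclusion
```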

  10. [Construction of a diagnostic prediction model of severe bacterial infection in febrile infants under 3 months old].

    Science.gov (United States)

    Villalobos Pinto, Enrique; Sánchez-Bayle, Marciano

    2017-12-01

    Fever is a common cause of paediatric admissions in emergency departments. An aetiological diagnosis is difficult to obtain in those less than 3 months of age, as they tend to have a higher rate of serious bacterial infection (SBI). The aim of this study is to find a predictive index of SBI in children under 3 months old with fever of unknown origin. A study was conducted on all children under 3 months of age with fever admitted to hospital, with additional tests being performed according to the clinical protocol. The Rochester criteria for identifying febrile infants at low risk for SBI were also analysed. A predictive model for SBI and positive cultures was designed, including the following variables in the maximum model: C-reactive protein (CRP), procalcitonin (PCT), and meeting fewer than four of the Rochester criteria. A total of 702 subjects were included, of which 22.64% had an SBI and 20.65% had positive cultures. Children who had an SBI and a positive culture showed higher values of white cells, total neutrophils, CRP and PCT. Statistical significance was observed for meeting fewer than 4 Rochester criteria and for CRP and PCT levels, both for an SBI (area under the curve [AUC] 0.877) and for positive cultures (AUC 0.888). Using regression analysis, a predictive index was calculated for SBI or a positive culture, with a sensitivity of 87.7 and 91%, a specificity of 70.1 and 87.7%, an LR+ of 2.93 and 3.62, and an LR- of 0.17 and 0.10, respectively. The predictive models are valid and slightly improve the validity of the Rochester criteria for positive culture in children less than 3 months admitted with fever. Copyright © 2016 Asociación Española de Pediatría. Publicado por Elsevier España, S.L.U. All rights reserved.
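The validity figures quoted (sensitivity, specificity, LR+, LR-) all derive from a 2x2 table. The counts below are invented, chosen only to land near the reported values:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and likelihood ratios from 2x2 counts."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {"sensitivity": sens, "specificity": spec,
            "LR+": sens / (1.0 - spec), "LR-": (1.0 - sens) / spec}

# Invented 2x2 counts for an SBI prediction rule:
m = diagnostic_metrics(tp=140, fp=163, fn=20, tn=379)
for name, value in m.items():
    print(name, round(value, 2))
```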

  11. CABS-flex 2.0: a web server for fast simulations of flexibility of protein structures.

    Science.gov (United States)

    Kuriata, Aleksander; Gierut, Aleksandra Maria; Oleniecki, Tymoteusz; Ciemny, Maciej Pawel; Kolinski, Andrzej; Kurcinski, Mateusz; Kmiecik, Sebastian

    2018-05-14

    Classical simulations of protein flexibility remain computationally expensive, especially for large proteins. A few years ago, we developed a fast method for predicting protein structure fluctuations that uses a single protein model as the input. The method has been made available as the CABS-flex web server and applied in numerous studies of protein structure-function relationships. Here, we present a major update of the CABS-flex web server to version 2.0. The new features include: extension of the method to significantly larger and multimeric proteins, customizable distance restraints and simulation parameters, contact maps and a new, enhanced web server interface. CABS-flex 2.0 is freely available at http://biocomp.chem.uw.edu.pl/CABSflex2.

  12. Density functional calculations on the geometric structure and properties of the 3d transition metal atom doped endohedral fullerene M@C20F20 (M = Sc–Ni)

    International Nuclear Information System (INIS)

    Chun-Mei, Tang; Wei-Hua, Zhu; Kai-Ming, Deng

    2010-01-01

    This paper uses the generalised gradient approximation based on density functional theory to analyse the geometric structure and properties of the 3d transition metal atom doped endohedral fullerene M@C20F20 (M = Sc–Ni). The geometric optimization shows that the cage centre is the most stable position for M, forming the structure named M@C20F20-4. The inclusion energy, zero-point energy, and energy gap calculations indicate that Ni@C20F20-4 should be the most thermodynamically and kinetically stable. M@C20F20-4 (M = Sc–Co) possesses high magnetic moments varying from 1 to 6 μB, while Ni@C20F20-4 is nonmagnetic. The Ni–C bond in Ni@C20F20-4 has both covalent and ionic character

  13. Factors predicting radiation pneumonitis in lung cancer patients: a retrospective study

    International Nuclear Information System (INIS)

    Rancati, T.; Ceresoli, G.L.; Gagliardi, G.; Schipani, S.; Cattaneo, G.M.

    2003-01-01

    Purpose: To evaluate clinical and lung dose-volume histogram based factors as predictors of radiation pneumonitis (RP) in lung cancer patients (PTs) treated with thoracic irradiation. Methods and materials: Records of all lung cancer PTs irradiated at our Institution between 1994 and 2000 were retrospectively reviewed. Eighty-four PTs with small or non-small-cell lung cancer, irradiated at >40 Gy, with full 3D dosimetry data and a follow-up time of >6 months from start of treatment, were analysed for RP. Pneumonitis was scored on the basis of SWOG toxicity criteria and was considered a complication when grade≥II. The following clinical parameters were considered: gender, age, surgery, chemotherapy agents, presence of chronic obstructive pulmonary disease (COPD), performance status. Dosimetric factors including prescribed dose (Diso), presence of final conformal boost, mean lung dose (Dmean), % of lung receiving ≥20, 25, 30, 35, 40, and 45 Gy (V20 to V45, respectively), and normal tissue complication probability (NTCP) values were analysed. DVH data and NTCP values were collected for both lungs considered as a paired organ. Median and quartile values were taken as cut-offs for statistical analysis. Factors that influenced RP were assessed by univariate (log-rank) and multivariate analyses (Cox hazard model). Results: There were 14 PTs (16.6%) who had ≥grade II pulmonary toxicity. In the entire population, the univariate analysis revealed that many dosimetric parameters (Diso, V20, V30, V40, V45) were significantly associated with RP. No significant correlation was found between the incidence of RP and Dmean or NTCP values. Multivariate analysis revealed that the use of mitomycin (MMC) (P=0.005) and the presence of COPD (P=0.026) were the most important risk factors for RP. In the group without COPD (55 PTs, seven RP) a few dosimetric factors (Dmean, V20, V45) and NTCP values (all models) were associated with RP in the univariate analysis
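The dose-volume histogram metrics analysed (Dmean, V20 through V45) have simple definitions over the per-voxel lung dose distribution. The ten voxel doses below are invented for illustration:

```python
def v_dose(doses_gy, threshold_gy):
    """Vx metric: percentage of lung voxels receiving at least threshold_gy."""
    return 100.0 * sum(d >= threshold_gy for d in doses_gy) / len(doses_gy)

def mean_dose(doses_gy):
    """Mean lung dose (Dmean) over the voxel dose distribution."""
    return sum(doses_gy) / len(doses_gy)

# Invented per-voxel dose sample for both lungs (Gy):
doses = [2, 5, 8, 12, 18, 21, 24, 28, 33, 41]
print("Dmean =", mean_dose(doses), "Gy")
print("V20 =", v_dose(doses, 20), "%")
print("V30 =", v_dose(doses, 30), "%")
```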

  14. Link Prediction Methods and Their Accuracy for Different Social Networks and Network Metrics

    Directory of Open Access Journals (Sweden)

    Fei Gao

    2015-01-01

    Full Text Available Currently, we are experiencing a rapid growth of the number of social-based online systems. The availability of the vast amounts of data gathered in those systems brings new challenges that we face when trying to analyse it. One of the intensively researched topics is the prediction of social connections between users. Although a lot of effort has been made to develop new prediction approaches, the existing methods have not been comprehensively analysed. In this paper we investigate the correlation between network metrics and the accuracy of different prediction methods. We selected six time-stamped real-world social networks and the ten most widely used link prediction methods. The results of the experiments show that the performance of some methods has a strong correlation with certain network metrics. We managed to distinguish “prediction friendly” networks, for which most of the prediction methods give good performance, as well as “prediction unfriendly” networks, for which most of the methods result in high prediction error. Correlation analysis between network metrics and the prediction accuracy of prediction methods may form the basis of a meta-learning system which, based on network characteristics, would be able to recommend the right prediction method for a given network.
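As a concrete example of the kind of method benchmarked in such studies, a common-neighbours link predictor scores each unlinked pair by its shared neighbourhood. The four-node network below is invented:

```python
def common_neighbours(adj, u, v):
    """Score a candidate link (u, v) by the number of shared neighbours."""
    return len(adj[u] & adj[v])

def rank_candidates(adj):
    """Rank all currently unlinked pairs by common-neighbour score."""
    nodes = sorted(adj)
    pairs = [(u, v) for i, u in enumerate(nodes) for v in nodes[i + 1:]
             if v not in adj[u]]
    return sorted(pairs, key=lambda e: -common_neighbours(adj, *e))

# A tiny undirected network as adjacency sets:
adj = {"a": {"b", "c"}, "b": {"a", "c", "d"}, "c": {"a", "b"}, "d": {"b"}}
print(rank_candidates(adj))  # highest-scoring candidate links first
```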

  15. Gastro-oesophageal reflux disease in 20 dogs (2012 to 2014).

    Science.gov (United States)

    Muenster, M; Hoerauf, A; Vieth, M

    2017-05-01

    To describe the clinical features of canine gastro-oesophageal reflux disease. A search of our medical records produced 20 dogs with clinical signs attributable to oesophageal disease, hyper-regeneratory oesophagopathy and no other oesophageal disorders. The clinical, endoscopic and histological findings of the dogs were analysed. The 3-year incidence of gastro-oesophageal reflux disease was 0·9% of our referral dog population. Main clinical signs were regurgitation, discomfort or pain (each, 20/20 dogs) and ptyalism (18/20 dogs). Oesophagoscopy showed no (5/20 dogs) or minimal (13/20 dogs) mucosal lesions. In oesophageal mucosal biopsy specimens, there were hyperplastic changes of the basal cell layer (13/20 dogs), stromal papillae (14/20 dogs) and entire epithelium (9/20 dogs). Eleven dogs received omeprazole or pantoprazole and regurgitation and ptyalism improved in eight and pain diminished in six of these dogs within three to six weeks. Our findings suggest that canine gastro-oesophageal reflux disease is a more common clinical problem than hitherto suspected. © 2017 British Small Animal Veterinary Association.

  16. Experimental benchmark for piping system dynamic-response analyses

    International Nuclear Information System (INIS)

    1981-01-01

    This paper describes the scope and status of a piping system dynamics test program. A 0.20 m (8 in.) nominal diameter test piping specimen is designed to be representative of main heat transport system piping of LMFBR plants. Particular attention is given to representing piping restraints. Applied loadings consider component-induced vibration as well as seismic excitation. The principal objective of the program is to provide a benchmark for verification of piping design methods by correlation of predicted and measured responses. Pre-test analysis results and correlation methods are discussed

  17. Experimental benchmark for piping system dynamic response analyses

    International Nuclear Information System (INIS)

    Schott, G.A.; Mallett, R.H.

    1981-01-01

    The scope and status of a piping system dynamics test program are described. A 0.20-m nominal diameter test piping specimen is designed to be representative of main heat transport system piping of LMFBR plants. Attention is given to representing piping restraints. Applied loadings consider component-induced vibration as well as seismic excitation. The principal objective of the program is to provide a benchmark for verification of piping design methods by correlation of predicted and measured responses. Pre-test analysis results and correlation methods are discussed. 3 refs

  18. BepiPred-2.0: improving sequence-based B-cell epitope prediction using conformational epitopes

    DEFF Research Database (Denmark)

    Jespersen, Martin Closter; Peters, Bjoern; Nielsen, Morten

    2017-01-01

    Antibodies have become an indispensable tool for many biotechnological and clinical applications. They bind their molecular target (antigen) by recognizing a portion of its structure (epitope) in a highly specific manner. The ability to predict epitopes from antigen sequences alone is a complex t...

  19. The predictive value of bronchial histamine challenge in the diagnosis of bronchial asthma

    DEFF Research Database (Denmark)

    Madsen, F; Holstein-Rathlou, N H; Mosbech, H

    1985-01-01

    as asthmatics (n = 97) or non-asthmatics (n = 54). The diagnostic properties of the challenge were calculated using Bayes' theorem. Considering PC20 values below 4.00 mg/ml as positive, the predictive value of a positive test was about 0.80 and that of a negative test about 0.76. When PC20...
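Post-test predictive values follow from Bayes' theorem given the test's sensitivity, specificity and the prevalence in the cohort (97 asthmatics of 151 referred patients). The sensitivity and specificity plugged in below are assumed values for illustration, not the study's estimates:

```python
def predictive_values(sens, spec, prevalence):
    """Post-test probabilities via Bayes' theorem: positive and negative
    predictive values given test properties and pre-test prevalence."""
    p_pos = sens * prevalence + (1.0 - spec) * (1.0 - prevalence)
    ppv = sens * prevalence / p_pos
    npv = spec * (1.0 - prevalence) / (1.0 - p_pos)
    return ppv, npv

prev = 97 / 151  # cohort prevalence of asthma: 97 of 151 referred patients
# Assumed (hypothetical) test properties for illustration:
ppv, npv = predictive_values(sens=0.85, spec=0.75, prevalence=prev)
print(round(ppv, 2), round(npv, 2))
```

Because predictive values depend on prevalence, the same challenge test would perform differently in a population with fewer asthmatics.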

  20. An Extended Assessment of Fluid Flow Models for the Prediction of Two-Dimensional Steady-State Airfoil Aerodynamics

    Directory of Open Access Journals (Sweden)

    José F. Herbert-Acero

    2015-01-01

    Full Text Available This work presents the analysis, application, and comparison of thirteen fluid flow models in the prediction of two-dimensional airfoil aerodynamics, considering laminar and turbulent subsonic inflow conditions. Diverse sensitivity analyses of different free parameters (e.g., the domain topology and its discretization, the flow model, and the solution method together with its convergence mechanisms) revealed important effects on the simulations’ outcomes. The NACA 4412 airfoil was considered throughout the work and the computational predictions were compared with experiments conducted under a wide range of Reynolds numbers (7e5≤Re≤9e6) and angles-of-attack (-10°≤α≤20°). Improvements both in modeling accuracy and processing time were achieved by considering the RS LP-S and the Transition SST turbulence models, and by considering finite volume-based solution methods with preconditioned systems, respectively. The RS LP-S model provided the best lift force predictions due to the adequate modeling of the micro and macro anisotropic turbulence at the airfoil’s surface and in the nearby flow field, which in turn allowed the adequate prediction of stall conditions. The Transition SST model provided the best drag force predictions due to adequate modeling of the laminar-to-turbulent flow transition and the surface shear stresses. Conclusions, recommendations, and a comprehensive research agenda are presented based on validated computational results.

  1. Predictive validity of the SVR-20 and Static-99 in a Dutch sample of treated sex offenders

    NARCIS (Netherlands)

    de Vogel, V.; de Ruiter, C.; van Beek, D.; Mead, G.

    2004-01-01

    In this retrospective study, the interrater reliability and predictive validity of 2 risk assessment instruments for sexual violence are presented. The SVR-20, an instrument for structured professional judgment, and the Static-99, an actuarial risk assessment instrument, were coded from file
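Interrater reliability for file-coded instruments such as the SVR-20 is typically summarised with a chance-corrected agreement statistic. A minimal Cohen's kappa over invented risk codings (the codings below are not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical codings."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    cats = set(rater_a) | set(rater_b)
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Invented "low"/"mod"/"high" risk codings of ten files by two raters:
a = ["low", "low", "mod", "high", "mod", "low", "high", "mod", "low", "high"]
b = ["low", "mod", "mod", "high", "mod", "low", "high", "low", "low", "high"]
print(round(cohens_kappa(a, b), 2))
```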

  2. Prediction of pork quality parameters by applying fractals and data mining on MRI

    DEFF Research Database (Denmark)

    Caballero, Daniel; Pérez-Palacios, Trinidad; Caro, Andrés

    2017-01-01

    This work firstly investigates the use of MRI, fractal algorithms and data mining techniques to determine pork quality parameters non-destructively. The main objective was to evaluate the capability of fractal algorithms (Classical Fractal algorithm, CFA; Fractal Texture Algorithm, FTA and One...... Point Fractal Texture Algorithm, OPFTA) to analyse MRI in order to predict quality parameters of loin. In addition, the effect of the sequence acquisition of MRI (Gradient echo, GE; Spin echo, SE and Turbo 3D, T3D) and the predictive technique of data mining (Isotonic regression, IR and Multiple linear...... regression, MLR) were analysed. Both fractal algorithms, FTA and OPFTA, are appropriate to analyse MRI of loins. The sequence acquisition, the fractal algorithm and the data mining technique seem to influence the prediction results. For most physico-chemical parameters, prediction equations with moderate...
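Fractal features of an image can be summarised via the box-counting dimension, which underlies classical fractal analysis. This sketch is illustrative only (the point cloud and grid sizes are arbitrary) and is not the authors' FTA/OPFTA algorithms:

```python
import math

def box_count_dimension(points, sizes):
    """Estimate a fractal dimension by counting occupied grid boxes at
    several box sizes and fitting the slope of log(count) vs log(1/size)."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / \
           sum((a - mx) ** 2 for a in xs)

# A densely sampled filled square should give a dimension close to 2:
square = [(i / 100, j / 100) for i in range(100) for j in range(100)]
print(round(box_count_dimension(square, [0.5, 0.25, 0.125]), 2))
```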

  3. Bayesian Calibration, Validation and Uncertainty Quantification for Predictive Modelling of Tumour Growth: A Tutorial.

    Science.gov (United States)

    Collis, Joe; Connor, Anthony J; Paczkowski, Marcin; Kannan, Pavitra; Pitt-Francis, Joe; Byrne, Helen M; Hubbard, Matthew E

    2017-04-01

    In this work, we present a pedagogical tumour growth example, in which we apply calibration and validation techniques to an uncertain, Gompertzian model of tumour spheroid growth. The key contribution of this article is the discussion and application of these methods (that are not commonly employed in the field of cancer modelling) in the context of a simple model, whose deterministic analogue is widely known within the community. In the course of the example, we calibrate the model against experimental data that are subject to measurement errors, and then validate the resulting uncertain model predictions. We then analyse the sensitivity of the model predictions to the underlying measurement model. Finally, we propose an elementary learning approach for tuning a threshold parameter in the validation procedure in order to maximize predictive accuracy of our validated model.
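The calibration step can be illustrated with a Gompertz curve and a brute-force grid posterior over one uncertain parameter under Gaussian measurement error. Parameter values and the noiseless "observations" are synthetic, and the tutorial itself uses more complete machinery:

```python
import math

def gompertz(v0, a, b, t):
    """Gompertz growth curve: V(t) = v0 * exp((a/b) * (1 - exp(-b*t)))."""
    return v0 * math.exp((a / b) * (1.0 - math.exp(-b * t)))

def posterior_over_a(obs, v0, b, sigma, a_grid):
    """Grid posterior for the growth rate a, assuming iid Gaussian
    measurement error and a flat prior over the grid."""
    weights = []
    for a in a_grid:
        log_lik = sum(-(y - gompertz(v0, a, b, t)) ** 2 / (2.0 * sigma ** 2)
                      for t, y in obs)
        weights.append(math.exp(log_lik))
    z = sum(weights)
    return [w / z for w in weights]

# Synthetic spheroid volumes generated with a = 0.5 (arbitrary units):
true_a, v0, b, sigma = 0.5, 1.0, 0.3, 0.05
obs = [(t, gompertz(v0, true_a, b, t)) for t in (1, 2, 4, 8)]
a_grid = [0.3, 0.4, 0.5, 0.6, 0.7]
post = posterior_over_a(obs, v0, b, sigma, a_grid)
print(a_grid[post.index(max(post))])  # posterior mode recovers the true rate
```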

  4. Indian Point 2 steam generator tube rupture analyses

    International Nuclear Information System (INIS)

    Dayan, A.

    1985-01-01

    Analyses were conducted with RETRAN-02 to study the consequences of steam generator tube rupture (SGTR) events. The Indian Point, Unit 2, power plant (IP2, PWR) was modeled as two asymmetric loops, consisting of 27 volumes and 37 junctions. The break section was modeled once, conservatively, as a 150% flow area opening at the wall of the steam generator cold leg plenum, and once as a 200% double-ended tube break. Results revealed a 60% overprediction of break flow rates by the traditional conservative model. Two SGTR transients were studied, one with a low-pressure reactor trip and one with an earlier reactor trip via over-temperature ΔT. The former is more typical of a plant with a low reactor average temperature such as IP2. Transient analyses for a single tube break event over 500 seconds indicated continued primary subcooling and no need for steam line pressure relief. In addition, SGTR transients with reactor trip while the pressurizer still contains water were found to favorably reduce depressurization rates. Comparison of the conservative results with independent LOFTRAN predictions showed good agreement

  5. Proposed Testing to Assess the Accuracy of Glass-To-Metal Seal Stress Analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, Robert S.; Emery, John M; Tandon, Rajan; Antoun, Bonnie R.; Stavig, Mark E.; Newton, Clay S.; Gibson, Cory S; Bencoe, Denise N.

    2014-09-01

    The material characterization tests conducted on 304L VAR stainless steel and Schott 8061 glass have provided higher fidelity data for calibration of material models used in Glass-To-Metal (GTM) seal analyses. Specifically, a Thermo-Multi-Linear Elastic Plastic (thermo-MLEP) material model has been defined for SS304L and the Simplified Potential Energy Clock nonlinear viscoelastic model has been calibrated for the S8061 glass. To assess the accuracy of finite element stress analyses of GTM seals, a suite of tests is proposed to provide data for comparison to model predictions.

  6. Web 2.0 Technologies and Applications in the Best Practice Networks and Communities

    Science.gov (United States)

    Dagiene, Valentina; Kurilovas, Eugenijus

    2010-01-01

    The paper aims to analyse the external expert evaluation results of the eContentplus programme's iCOPER (Interoperable Content for Performance in a Competency-driven Society) project deliverables, especially the quality control and Web 2.0 technologies report. It is a suitability report for better practice concerning the use of Web 2.0

  7. Application of neural networks and its prospect. 4. Prediction of major disruptions in tokamak plasmas, analyses of time series data

    International Nuclear Information System (INIS)

    Yoshino, Ryuji

    2006-01-01

    Disruption prediction for tokamak plasmas has been studied using neural networks. Prediction performance is assessed in terms of the prediction success rate, the false alarm rate, and the warning time obtained prior to disruption. Current-driven disruptions are predicted from time-series data together with estimates of plasma lifetime, disruption risk and plasma stability. Some disruptions, such as those caused by the density limit, impurity influx or error magnetic fields, can be predicted from their premonitory symptoms with a 100% success rate. Pressure-driven disruptions develop only some hundreds of microseconds in advance, so operational limits such as the βN limit of DIII-D and the density limit of ADITYA were investigated. Training on the βN limit under stable discharges decreased the false alarm rate. Pressure-driven disruptions occurring as the plasma pressure increases can be predicted with about 90% success by evaluating plasma stability. (S.Y.)
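The two headline performance numbers (prediction success rate and false alarm rate) can be computed from alarm and disruption times given a required warning window. The times below are invented:

```python
def alarm_metrics(alarm_times, disruption_times, warning_window):
    """Prediction success rate: fraction of disruptions preceded by an alarm
    within warning_window. False alarm rate: fraction of alarms not followed
    by a disruption within the same window. Times in arbitrary units (ms)."""
    hits = sum(any(0 < d - a <= warning_window for a in alarm_times)
               for d in disruption_times)
    false_alarms = sum(not any(0 < d - a <= warning_window
                               for d in disruption_times)
                       for a in alarm_times)
    return hits / len(disruption_times), false_alarms / len(alarm_times)

# Invented alarm and disruption times (ms) with a 50 ms warning requirement:
success, false_rate = alarm_metrics([95, 240, 400], [100, 250], 50)
print("success rate:", success, "false alarm rate:", round(false_rate, 2))
```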

  8. Can adverse maternal and perinatal outcomes be predicted when blood pressure becomes elevated? Secondary analyses from the CHIPS (Control of Hypertension In Pregnancy Study) randomized controlled trial.

    Science.gov (United States)

    Magee, Laura A; von Dadelszen, Peter; Singer, Joel; Lee, Terry; Rey, Evelyne; Ross, Susan; Asztalos, Elizabeth; Murphy, Kellie E; Menzies, Jennifer; Sanchez, Johanna; Gafni, Amiram; Gruslin, Andrée; Helewa, Michael; Hutton, Eileen; Lee, Shoo K; Logan, Alexander G; Ganzevoort, Wessel; Welch, Ross; Thornton, Jim G; Moutquin, Jean Marie

    2016-07-01

    For women with chronic or gestational hypertension in CHIPS (Control of Hypertension In Pregnancy Study, NCT01192412), we aimed to examine whether clinical predictors collected at randomization could predict adverse outcomes. This was a planned, secondary analysis of data from the 987 women in the CHIPS Trial. Logistic regression was used to examine the impact of 19 candidate predictors on the probability of adverse perinatal (pregnancy loss or high level neonatal care for >48 h, or birthweight hypertension, preeclampsia, or delivery at blood pressure within 1 week before randomization. Continuous variables were represented continuously or dichotomized based on the smaller p-value in univariate analyses. An area-under-the-receiver-operating-curve (AUC ROC) of ≥0.70 was taken to reflect a potentially useful model. Point estimates for AUC ROC were hypertension (0.70, 95% CI 0.67-0.74) and delivery at hypertension develop an elevated blood pressure in pregnancy, or formerly normotensive women develop new gestational hypertension, maternal and current pregnancy clinical characteristics cannot predict adverse outcomes in the index pregnancy. © 2016 The Authors. Acta Obstetricia et Gynecologica Scandinavica published by John Wiley & Sons Ltd on behalf of Nordic Federation of Societies of Obstetrics and Gynecology (NFOG).

  9. Contribution of thermo-fluid analyses to the LHC experiments

    CERN Document Server

    Gasser, G

    2003-01-01

    The large amount of electrical and electronic equipment that will be installed in the four LHC experiments will cause significant heat dissipation into the detectors' volumes. This is a major issue for the experimental groups, as temperature stability is often a fundamental requirement for the different sub-detectors to provide good measurement quality. The thermo-fluid analyses carried out in the ST/CV group are a very efficient tool to understand and predict the thermal behaviour of the detectors. These studies are undertaken according to the needs of the experimental groups; they aim at evaluating the thermal stability of a proposed design, or at comparing different technical solutions in order to choose the best one for the final design. The usual approach to these studies is first presented; then some practical examples of thermo-fluid analyses are discussed, focusing on the main results in order to illustrate their contribution.

  10. Development of temporal modelling for forecasting and prediction of malaria infections using time-series and ARIMAX analyses: a case study in endemic districts of Bhutan.

    Science.gov (United States)

    Wangdi, Kinley; Singhasivanon, Pratap; Silawan, Tassanee; Lawpoolsri, Saranath; White, Nicholas J; Kaewkungwal, Jaranit

    2010-09-03

    Malaria remains a public health problem in some districts of Bhutan despite a marked reduction of cases in the last few years. To strengthen the country's prevention and control measures, this study was carried out to develop forecasting and prediction models of malaria incidence in the endemic districts of Bhutan using time-series and ARIMAX analyses. The study was carried out retrospectively using the monthly malaria cases reported by health centres to the Vector-borne Disease Control Programme (VDCP) and meteorological data from the Meteorological Unit, Department of Energy, Ministry of Economic Affairs. Time-series analysis was performed on monthly malaria cases, from 1994 to 2008, in seven malaria-endemic districts. Multiplicative seasonal autoregressive integrated moving average (ARIMA) models were fitted to data from 1994 to 2006 to identify the best model. The best-fit model was selected for each individual district and for the overall endemic area, and the monthly cases from January to December 2009 and 2010 were forecasted. In developing the prediction model, the monthly reported malaria cases and the meteorological factors from 1996 to 2008 of the seven districts were analysed, and ARIMAX modelling was employed to determine predictors of malaria in the subsequent month. The ARIMA (p, d, q)(P, D, Q)s model (p and P representing the autoregressive and seasonal autoregressive terms; d and D the non-seasonal and seasonal differencing; q and Q the moving average and seasonal moving average parameters; and s the length of the seasonal period) for the overall endemic districts was (2,1,1)(0,1,1)12; the data from each district revealed two most common ARIMA models, (2,1,1)(0,1,1)12 and (1,1,1)(0,1,1)12.
The forecasted monthly malaria cases from January to December 2009 and 2010 varied from 15 to 82 cases in 2009
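    The seasonal notation above maps directly onto data transforms: a (2,1,1)(0,1,1)12 model first applies one non-seasonal difference (d = 1) and one seasonal difference (D = 1, s = 12) before fitting its AR and MA terms. A minimal sketch with synthetic monthly counts (assumed data, for illustration only; in practice a library such as statsmodels' SARIMAX would fit the full model):

```python
import math

# One non-seasonal and one seasonal difference, as in the (2,1,1)(0,1,1)12
# models above. The series below is synthetic, for illustration only.
def difference(series, lag):
    return [series[t] - series[t - lag] for t in range(lag, len(series))]

# hypothetical monthly counts: linear trend plus an annual cycle
y = [50 + 0.5 * t + 20 * math.sin(2 * math.pi * t / 12) for t in range(60)]

d1 = difference(y, lag=1)     # d = 1 removes the linear trend
d12 = difference(d1, lag=12)  # D = 1, s = 12 removes the annual cycle
print(max(abs(v) for v in d12))  # ~0: the deterministic structure is gone
```

    After both differences only the stochastic part of a real series would remain, which is what the AR and MA terms then model.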

  11. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. Inelastic analysis presents several problems to the package designer: the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate, and there are currently no acceptance criteria for this type of analysis approved by regulatory agencies. Inelastic acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior, which allows incorporation of a more uniform margin of safety and can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding, to illustrate the differences between the two analysis techniques.

  12. Development of a viscoplastic dynamic fracture mechanics treatment for crack arrest predictions in a PTS event

    International Nuclear Information System (INIS)

    Kanninen, M.F.; Hudak, S.J. Jr.; Reed, K.W.; Dexter, R.J.; Polch, E.Z.; Cardinal, J.W.; Achenbach, J.D.; Popelar, C.H.

    1986-01-01

    The objective of this research is to develop a fundamentally correct methodology for the prediction of crack arrest at the high upper shelf conditions occurring in a postulated pressurized thermal shock (PTS) event. The effort is aimed at the development of a versatile finite-element method for the solution of time-dependent boundary value problems that admit inertia effects, a prescribed spatial temperature distribution, and viscoplastic constitutive and fracture behavior. Supporting this development are (1) material characterization and fracture experimentation, (2) detailed mathematical analyses of the near-tip region, (3) elastodynamic fracture analysis, and (4) elastic-plastic tearing instability analyses. As a first step, dynamic-viscoplastic analyses are currently being made of the wide plate tests being performed by the National Bureau of Standards in a companion HSST program. Some preliminary conclusions drawn from this work and from the supporting research activities are offered in this paper. The outstanding critical issues that subsequent research must focus on are also described

  13. Development of a prognostic model for predicting spontaneous singleton preterm birth.

    Science.gov (United States)

    Schaaf, Jelle M; Ravelli, Anita C J; Mol, Ben Willem J; Abu-Hanna, Ameen

    2012-10-01

    To develop and validate a prognostic model for prediction of spontaneous preterm birth. Prospective cohort study using data of the nationwide perinatal registry in The Netherlands. We studied 1,524,058 singleton pregnancies between 1999 and 2007. We developed a multiple logistic regression model to estimate the risk of spontaneous preterm birth based on maternal and pregnancy characteristics. We used bootstrapping techniques to internally validate our model. Discrimination (AUC), accuracy (Brier score) and calibration (calibration graphs and Hosmer-Lemeshow C-statistic) were used to assess the model's predictive performance. Our primary outcome measure was spontaneous preterm birth at model included 13 variables for predicting preterm birth. The predicted probabilities ranged from 0.01 to 0.71 (IQR 0.02-0.04). The model had an area under the receiver operator characteristic curve (AUC) of 0.63 (95% CI 0.63-0.63), the Brier score was 0.04 (95% CI 0.04-0.04) and the Hosmer Lemeshow C-statistic was significant (pvalues of predicted probability. The positive predictive value was 26% (95% CI 20-33%) for the 0.4 probability cut-off point. The model's discrimination was fair and it had modest calibration. Previous preterm birth, drug abuse and vaginal bleeding in the first half of pregnancy were the most important predictors for spontaneous preterm birth. Although not applicable in clinical practice yet, this model is a next step towards early prediction of spontaneous preterm birth that enables caregivers to start preventive therapy in women at higher risk. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
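    The two headline metrics of this model, discrimination (AUC) and accuracy (Brier score), can be sketched with toy predictions (hypothetical data; a registry-scale analysis would use a statistics package):

```python
# Brier score: mean squared difference between predicted risk and outcome.
def brier(y, p):
    return sum((pi - yi) ** 2 for yi, pi in zip(y, p)) / len(y)

# AUC: probability that a random case is ranked above a random non-case
# (ties count half), computed by exhaustive pairing on this toy set.
def auc(y, p):
    pos = [pi for yi, pi in zip(y, p) if yi == 1]
    neg = [pi for yi, pi in zip(y, p) if yi == 0]
    wins = sum((a > b) + 0.5 * (a == b) for a in pos for b in neg)
    return wins / (len(pos) * len(neg))

y = [0, 0, 1, 0, 1, 1, 0, 1]                   # observed outcomes (toy)
p = [0.1, 0.3, 0.4, 0.2, 0.8, 0.7, 0.4, 0.6]  # predicted risks (toy)
print(auc(y, p), brier(y, p))
```

    An AUC of 0.63 as reported above means cases are ranked above non-cases only 63% of the time, which is why the authors call the discrimination fair rather than good.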

  14. A Comprehensive Quantitative Evaluation of New Sustainable Urbanization Level in 20 Chinese Urban Agglomerations

    Directory of Open Access Journals (Sweden)

    Cong Xu

    2016-01-01

    Full Text Available On 16 March 2014, the State Council of China launched its first urbanization planning initiative, dubbed “National New Urbanization Planning (2014–2020)” (NNUP). NNUP put forward 20 urban agglomerations and a sustainable development approach aiming to transform traditional Chinese urbanization into sustainable new urbanization. This study quantitatively evaluates the level of sustainability of the present new urbanization process in 20 Chinese urban agglomerations and provides suggestions for achieving sustainable new urbanization. A three-level index system based on six fundamental elements of a city and a Full Permutation Polygon Synthetic Indicator evaluation method are adopted. The results show that China is undergoing a new urbanization process with a low level of sustainability, and many problems remain from traditional urbanization processes. Urbanization levels across the 20 urban agglomerations are polarized. Based on their own development patterns, the 20 urban agglomerations can be divided into seven categories, each with its own development characteristics. The analyses also show that waste of water resources, abuse of land resources, and air pollution are three major problems closely linked to traditional Chinese urbanization processes. To achieve sustainable new urbanization in China, four relevant suggestions and comments are provided.

  15. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
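    The superposition defining T can be made concrete with a toy two-node example (hypothetical dynamics, not the paper's Strong Inhibition networks): each class member contributes its deterministic transition map, and the equal-weight average yields a stochastic matrix whose rows are distributions over successor states:

```python
# Toy ensemble of two 2-node boolean networks; T is their equal-weight
# transition-by-transition superposition over the 4 possible states.
states = [(0, 0), (0, 1), (1, 0), (1, 1)]
net_a = lambda s: (s[1], s[0])           # swap dynamics (toy class member)
net_b = lambda s: (s[0] and s[1],) * 2   # AND dynamics (toy class member)

T = {s: {t: 0.0 for t in states} for s in states}
for net in (net_a, net_b):
    for s in states:
        T[s][net(s)] += 0.5              # each member contributes equally

for s in states:                         # every row is a probability
    assert abs(sum(T[s].values()) - 1.0) < 1e-12  # distribution
```

    A state to which every class member maps itself, such as (1,1) here, shows up as a certain self-transition in T, which is how the distribution of point attractors can be read off the matrix.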

  16. Drug-target interaction prediction: A Bayesian ranking approach.

    Science.gov (United States)

    Peska, Ladislav; Buza, Krisztian; Koller, Júlia

    2017-12-01

    In silico prediction of drug-target interactions (DTI) could provide valuable information and speed-up the process of drug repositioning - finding novel usage for existing drugs. In our work, we focus on machine learning algorithms supporting drug-centric repositioning approach, which aims to find novel usage for existing or abandoned drugs. We aim at proposing a per-drug ranking-based method, which reflects the needs of drug-centric repositioning research better than conventional drug-target prediction approaches. We propose Bayesian Ranking Prediction of Drug-Target Interactions (BRDTI). The method is based on Bayesian Personalized Ranking matrix factorization (BPR) which has been shown to be an excellent approach for various preference learning tasks, however, it has not been used for DTI prediction previously. In order to successfully deal with DTI challenges, we extended BPR by proposing: (i) the incorporation of target bias, (ii) a technique to handle new drugs and (iii) content alignment to take structural similarities of drugs and targets into account. Evaluation on five benchmark datasets shows that BRDTI outperforms several state-of-the-art approaches in terms of per-drug nDCG and AUC. BRDTI results w.r.t. nDCG are 0.929, 0.953, 0.948, 0.897 and 0.690 for G-Protein Coupled Receptors (GPCR), Ion Channels (IC), Nuclear Receptors (NR), Enzymes (E) and Kinase (K) datasets respectively. Additionally, BRDTI significantly outperformed other methods (BLM-NII, WNN-GIP, NetLapRLS and CMF) w.r.t. nDCG in 17 out of 20 cases. Furthermore, BRDTI was also shown to be able to predict novel drug-target interactions not contained in the original datasets. The average recall at top-10 predicted targets for each drug was 0.762, 0.560, 1.000 and 0.404 for GPCR, IC, NR, and E datasets respectively. Based on the evaluation, we can conclude that BRDTI is an appropriate choice for researchers looking for an in silico DTI prediction technique to be used in drug
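    The underlying Bayesian Personalized Ranking optimization can be sketched compactly. The toy example below is a sketch under assumed notation, not the authors' BRDTI implementation (latent dimension, learning rate and regularization are arbitrary choices): one drug vector is trained so that a known target scores above an unobserved one:

```python
import math
import random

random.seed(0)
K = 4                                    # latent dimension (assumption)
drug = [random.gauss(0, 0.1) for _ in range(K)]
tpos = [random.gauss(0, 0.1) for _ in range(K)]  # known interaction target
tneg = [random.gauss(0, 0.1) for _ in range(K)]  # unobserved target

def score(d, t):                         # dot-product interaction score
    return sum(di * ti for di, ti in zip(d, t))

lr, reg = 0.05, 0.01
for _ in range(200):                     # SGD ascent on the BPR criterion
    x = score(drug, tpos) - score(drug, tneg)
    g = 1.0 / (1.0 + math.exp(x))        # gradient of ln sigmoid(x) w.r.t. x
    for k in range(K):
        dk = drug[k]
        drug[k] += lr * (g * (tpos[k] - tneg[k]) - reg * drug[k])
        tpos[k] += lr * (g * dk - reg * tpos[k])
        tneg[k] += lr * (-g * dk - reg * tneg[k])

# after training, the observed target ranks above the unobserved one
print(score(drug, tpos) > score(drug, tneg))
```

    BRDTI's extensions (target bias, new-drug handling, content alignment) build on this pairwise ranking core.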

  17. Secondary structural analyses of ITS1 in Paramecium.

    Science.gov (United States)

    Hoshina, Ryo

    2010-01-01

    The nuclear ribosomal RNA gene operon is interrupted by internal transcribed spacer (ITS) 1 and ITS2. Although the secondary structure of ITS2 has been widely investigated, less is known about ITS1 and its structure. In this study, the secondary structure of ITS1 sequences for Paramecium and other ciliates was predicted. Each Paramecium ITS1 forms an open loop with three helices, A through C. Helix B was highly conserved among Paramecium, and similar helices were found in other ciliates. A phylogenetic analysis using the ITS1 sequences showed high resolution, implying that ITS1 is a good tool for species-level analyses.

  18. The prediction of late-onset preeclampsia: Results from a longitudinal proteomics study

    Science.gov (United States)

    Erez, Offer; Romero, Roberto; Maymon, Eli; Chaemsaithong, Piya; Done, Bogdan; Pacora, Percy; Panaitescu, Bogdan; Chaiworapongsa, Tinnakorn; Hassan, Sonia S.

    2017-01-01

    Background Late-onset preeclampsia is the most prevalent phenotype of this syndrome; nevertheless, only a few biomarkers for its early diagnosis have been reported. We sought to correct this deficiency using a high-throughput proteomic platform. Methods A case-control longitudinal study was conducted, including 90 patients with normal pregnancies and 76 patients with late-onset preeclampsia (diagnosed at ≥34 weeks of gestation). Maternal plasma samples were collected throughout gestation (normal pregnancy: 2–6 samples per patient, median of 2; late-onset preeclampsia: 2–6, median of 5). The abundance of 1,125 proteins was measured using an aptamer-based proteomics technique. Protein abundance in normal pregnancies was modeled using linear mixed-effects models to estimate mean abundance as a function of gestational age. Data were then expressed as multiples of the mean (MoM) values in normal pregnancies. Multi-marker prediction models were built using data from one of five gestational age intervals (8–16, 16.1–22, 22.1–28, 28.1–32, 32.1–36 weeks of gestation). The predictive performance of the best combination of proteins was compared to placental growth factor (PlGF) using bootstrap. Results 1) At 8–16 weeks of gestation, the best prediction model included only one protein, matrix metalloproteinase 7 (MMP-7), which had a sensitivity of 69% at a false positive rate (FPR) of 20% (AUC = 0.76); 2) at 16.1–22 weeks of gestation, MMP-7 was the single best predictor of late-onset preeclampsia with a sensitivity of 70% at a FPR of 20% (AUC = 0.82); 3) after 22 weeks of gestation, PlGF was the best predictor of late-onset preeclampsia, identifying 1/3 to 1/2 of the patients destined to develop this syndrome (FPR = 20%); 4) 36 proteins were associated with late-onset preeclampsia in at least one interval of gestation (after adjustment for covariates); 5) several biological processes, such as positive regulation of vascular endothelial growth factor

  20. Predicting invasive fungal pathogens using invasive pest assemblages: testing model predictions in a virtual world.

    Science.gov (United States)

    Paini, Dean R; Bianchi, Felix J J A; Northfield, Tobin D; De Barro, Paul J

    2011-01-01

    Predicting future species invasions presents significant challenges to researchers and government agencies. Simply considering the vast number of potential species that could invade an area can be insurmountable. One method, recently suggested, which can analyse large datasets of invasive species simultaneously is that of a self organising map (SOM), a form of artificial neural network which can rank species by establishment likelihood. We used this method to analyse the worldwide distribution of 486 fungal pathogens and then validated the method by creating a virtual world of invasive species in which to test the SOM. This novel validation method allowed us to test SOM's ability to rank those species that can establish above those that can't. Overall, we found the SOM highly effective, having on average, a 96-98% success rate (depending on the virtual world parameters). We also found that regions with fewer species present (i.e. 1-10 species) were more difficult for the SOM to generate an accurately ranked list, with success rates varying from 100% correct down to 0% correct. However, we were able to combine the numbers of species present in a region with clustering patterns in the SOM, to further refine confidence in lists generated from these sparsely populated regions. We then used the results from the virtual world to determine confidences for lists generated from the fungal pathogen dataset. Specifically, for lists generated for Australia and its states and territories, the reliability scores were between 84-98%. We conclude that a SOM analysis is a reliable method for analysing a large dataset of potential invasive species and could be used by biosecurity agencies around the world resulting in a better overall assessment of invasion risk.
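    The ranking idea behind the SOM analysis can be illustrated with a deliberately tiny sketch (hypothetical data and map size; not the authors' implementation): regions are presence/absence vectors over species, and after training, the weight vector of a region's best-matching unit orders species by estimated establishment likelihood:

```python
import math
import random

random.seed(1)
S = 5                                     # number of species (hypothetical)
regions = [[1, 1, 0, 0, 1], [1, 1, 1, 0, 0],
           [0, 1, 1, 1, 0], [1, 0, 0, 1, 1]]  # presence/absence (toy)
nodes = [[random.random() for _ in range(S)] for _ in range(3)]  # 1-D map

def bmu(v):  # best-matching unit: node closest to the input vector
    return min(range(len(nodes)),
               key=lambda i: sum((nodes[i][k] - v[k]) ** 2 for k in range(S)))

for epoch in range(100):                  # standard SOM training loop
    lr = 0.5 * (1 - epoch / 100)          # decaying learning rate
    for v in regions:
        b = bmu(v)
        for i, node in enumerate(nodes):
            h = math.exp(-abs(i - b))     # neighbourhood decay on the map
            for k in range(S):
                node[k] += lr * h * (v[k] - node[k])

target = regions[0]
weights = nodes[bmu(target)]              # learned species weights
ranked = sorted(range(S), key=lambda k: -weights[k])
print(ranked)  # species indices by estimated establishment likelihood
```

    In the paper's setting the weights of a region's best-matching unit play this role for 486 fungal pathogens, and the virtual-world test checks that truly establishable species rank near the top.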

  1. Fatigue-creep life prediction for a notched specimen of 2 1/4 Cr-1Mo steel at 600 °C

    International Nuclear Information System (INIS)

    Inoue, Tatsuo; Sakane, Masao; Fukuda, Yoshio; Igari, Toshihide; Miyahara, Mitsuo; Okazaki, Masakazu

    1994-01-01

    This paper presents the life prediction of 2 1/4 Cr-1Mo notched specimens subjected to fast-fast, slow-slow and hold-time loadings at 600 °C. The crack initiation lives of notched specimens were estimated based on the local stress-strain calculated by inelastic finite element analyses. For the life prediction, combinations of seven different constitutive models and five fatigue-creep damage laws were used. The applicability of the constitutive models and damage laws is discussed. The constitutive models predict similar stress-strain relations at the notch root, leading to similar predicted lives. The damage model, however, has a much larger influence on the life prediction. ((orig.))

  2. Pre-delivery fibrinogen predicts adverse maternal or neonatal outcomes in patients with placental abruption.

    Science.gov (United States)

    Wang, Liangcheng; Matsunaga, Shigetaka; Mikami, Yukiko; Takai, Yasushi; Terui, Katsuo; Seki, Hiroyuki

    2016-07-01

    Placental abruption is a severe obstetric complication of pregnancy that can cause disseminated intravascular coagulation and progress to massive post-partum hemorrhage. Coagulation disorder due to extreme consumption of fibrinogen is considered the main pathogenesis of disseminated intravascular coagulation in patients with placental abruption. The present study sought to determine whether the pre-delivery fibrinogen level could predict adverse maternal or neonatal outcomes in patients with placental abruption. This retrospective medical chart review was conducted in a center for maternal, fetal, and neonatal medicine in Japan with 61 patients with placental abruption. Fibrinogen levels prior to delivery were collected and evaluated for the prediction of maternal and neonatal outcomes. The main outcome measures for maternal outcomes were disseminated intravascular coagulation and hemorrhage, and the main outcome measures for neonatal outcomes were Apgar score at 5 min, umbilical artery pH, and stillbirth. Receiver operating characteristic curve and multivariate logistic regression analyses indicated that fibrinogen significantly predicted overt disseminated intravascular coagulation and the requirement of ≥6 red blood cell units, ≥10 fresh frozen plasma units, and ≥20 fresh frozen plasma units for transfusion. Moderate hemorrhage occurred in 71.5% of patients with a decrease in fibrinogen levels to 155 mg/dL. Fibrinogen could also predict neonatal outcomes. Umbilical artery pH neonatal outcomes with placental abruption. © 2016 Japan Society of Obstetrics and Gynecology.

  3. A Java Bytecode Metamodel for Composable Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Rensink, Arend; Aksit, Mehmet; Seidl, Martina; Zschaler, Steffen

    Program analyses are an important tool to check whether a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java, and to access the program to be analyzed through libraries for manipulating intermediate code, such

  4. Perceived competence and enjoyment in predicting students' physical activity and cardiorespiratory fitness.

    Science.gov (United States)

    Gao, Zan

    2008-10-01

    This study investigated the predictive strength of perceived competence and enjoyment for students' physical activity and cardiorespiratory fitness in physical education classes. Participants (N = 307; 101 in Grade 6, 96 in Grade 7, 110 in Grade 8; 149 boys, 158 girls) responded to questionnaires assessing perceived competence and enjoyment of physical education; their cardiorespiratory fitness was then assessed with the Progressive Aerobic Cardiovascular Endurance Run (PACER) test. Physical activity in one class was estimated via pedometers. Regression analyses showed that enjoyment (R2 = 16.5%) and perceived competence (R2 = 4.2%) together accounted for a significant but modest 20.7% of the variance in physical activity, and perceived competence was the only significant contributor to cardiorespiratory fitness performance (R2 = 19.3%), leaving roughly 80% of the variance unaccounted for. Some educational implications and areas for research are mentioned.

  5. LiDAR-based Prediction of Arthropod Abundance at the Southern Slopes of Mt. Kilimanjaro

    Science.gov (United States)

    Ziegler, Alice

    2017-04-01

    LiDAR (Light Detection And Ranging) is a remote sensing technology that offers high-resolution three-dimensional information about the covered area. In this work, these three-dimensional datasets were used to derive structural parameters of the vegetation in order to predict the abundance of eight different arthropod assemblages with several models. For the model training of each arthropod assemblage, different versions (extent, filters) of the LiDAR datasets were provided and evaluated, and the importance of each LiDAR-derived structural parameter for each model was calculated. The best input dataset and structural parameters were then used to predict the abundance of the arthropod assemblages. Analysis of the prediction results across seven different land-use types and the eight arthropod assemblages revealed that LiDAR-based predictions were in general most feasible for "Orthoptera" (average R2 (coefficient of determination) over all land uses: 0.14), even though the predictions for the other arthropod assemblages reached values of the same magnitude. The land-use type "disturbed forest" showed the best results (average R2 over all assemblages: 0.20), whereas "home garden" was the least predictable (average R2 over all assemblages: 0.04). Differentiated by arthropod-land-use pairs, the results showed distinct differences and the R2 values diverged clearly. When model settings were optimized for only one arthropod taxon, R2 could reach values up to 0.55 ("Orthoptera" in "disturbed forest"). The analysis of the importance of each structural parameter revealed that about one third of the 18 parameters used were always among the most important for the prediction of all assemblages. This strong ranking of parameters implies that further research needs to focus on the selection of predictor variables.
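    The reported metric, the coefficient of determination R², can be computed directly from observed and predicted abundances; a minimal sketch with toy numbers (hypothetical data):

```python
# R² = 1 - SS_res / SS_tot: the fraction of variance in the observations
# explained by the model's predictions.
def r_squared(obs, pred):
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

obs = [3, 7, 5, 9, 6]    # toy arthropod abundances
pred = [4, 6, 5, 8, 7]   # toy model output
print(r_squared(obs, pred))
```

    An average R² of 0.14, as for "Orthoptera" above, thus means the LiDAR-derived predictors explain about 14% of the abundance variance.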

  6. Web 2.0, Library 2.0, and Librarian 2.0:Preparing for the 2.0 World

    Science.gov (United States)

    Abram, S.

    2007-10-01

    There is a global conversation going on right now about the next generation of the web. It's happening under the name of Web 2.0. It's the McLuhanesque hot web where true human interaction takes precedence over merely `cool' information delivery and e-mail. It's about putting information into the real context of our users' lives, research, work and play. Concurrently, a group of information professionals are having a conversation about the vision for what Library 2.0 will look like in this Web 2.0 ecosystem. Some are even going so far as to talk about Web 3.0! Web 2.0 is coming fast and it's BIG! What are the skills and competencies that Librarian 2.0 will need? Come and hear an overview of Web 2.0 and a draft vision for Library 2.0 and an opinion about what adaptations we'll need to make to thrive in this future scenario. Let's talk about the Librarian 2.0 in our users' future!

  7. University Presentation to Potential Students Using Web 2.0 Environments

    Directory of Open Access Journals (Sweden)

    Andrius Eidimtas

    2013-02-01

    Full Text Available Choosing what to study is, for school graduates, a compound and multi-stage process (Chapman, 1981; Hossler et al., 1999; Brennan, 2001; Shankle, 2009). In the information retrieval stage, future students have to gather and assimilate actual information and form a list of possible higher education institutions. Nowadays modern internet technologies enable universities to create conditions for attractive and interactive information retrieval. User-friendliness and accessibility of Web 2.0-based environments attract more young people to search for information on the web. Western universities noticed the great potential of Web 2.0 for information dissemination back in 2007, while Lithuanian universities began using Web 2.0 to assemble virtual communities only in 2010 (Valinevičienė, 2010). Purpose—to disclose possibilities to present universities to school graduates in Web 2.0 environments. Design/methodology/approach—a case study using methods of scientific literature analysis, observation and quantitative content analysis. Findings—referring to the information retrieval types and the particularities of information retrieval by school graduates disclosed in the analysis of scientific literature, it was identified that 76 per cent of Lithuanian universities apply at least one website created on the basis of Web 2.0 technology for their official presentation. The variety of Web 2.0 tools used ranges only from 1 to 6, while scientific literature describes more possibilities to apply Web 2.0 environments. Research limitations/implications—the empirical part of the case study is contextualized for Lithuania; however, the theoretical construct of possibilities to present universities in Web 2.0 environments can be used for the analysis of the presentation of foreign universities in Web 2.0 environments. Practical implications—the work can become the recommendation to develop possibilities for Lithuanian

  8. University Presentation to Potential Students Using Web 2.0 Environments

    Directory of Open Access Journals (Sweden)

    Andrius Eidimtas

    2012-12-01

Full Text Available Choosing what to study is a compound, multi-stage process for school graduates (Chapman, 1981; Hossler et al., 1999; Brennan, 2001; Shankle, 2009). In the information retrieval stage, future students have to gather and assimilate up-to-date information and form a list of possible higher education institutions. Modern internet technologies enable universities to create conditions for attractive and interactive information retrieval. The user-friendliness and accessibility of Web 2.0-based environments attract more young people to search for information on the web. Western universities noticed the great potential of Web 2.0 for information dissemination back in 2007; Lithuanian universities, by contrast, began using Web 2.0 to assemble virtual communities only in 2010 (Valinevičienė, 2010). Purpose—to disclose possibilities for presenting universities to school graduates in Web 2.0 environments. Design/methodology/approach—a case study using methods of scientific literature analysis, observation and quantitative content analysis. Findings—drawing on the information retrieval types and the particularities of information retrieval by school graduates disclosed in the analysis of scientific literature, it was identified that 76 per cent of Lithuanian universities use at least one website created on the basis of Web 2.0 technology for their official presentation. The variety of Web 2.0 tools in use ranges only from 1 to 6, while the scientific literature identifies more possibilities for applying Web 2.0 environments. Research limitations/implications—the empirical part of the case study is contextualized for Lithuania; however, the theoretical construct of possibilities for presenting universities in Web 2.0 environments can be used to analyse the presentation of foreign universities in Web 2.0 environments. Practical implications—the work can become the recommendation to develop possibilities for Lithuanian

  9. Prediction of lung density changes after radiotherapy by cone beam computed tomography response markers and pre-treatment factors for non-small cell lung cancer patients.

    Science.gov (United States)

    Bernchou, Uffe; Hansen, Olfred; Schytte, Tine; Bertelsen, Anders; Hope, Andrew; Moseley, Douglas; Brink, Carsten

    2015-10-01

This study investigates the ability of pre-treatment factors and response markers extracted from standard cone-beam computed tomography (CBCT) images to predict the lung density changes induced by radiotherapy for non-small cell lung cancer (NSCLC) patients. Density changes in follow-up computed tomography scans were evaluated for 135 NSCLC patients treated with radiotherapy. Early response markers were obtained by analysing changes in lung density in CBCT images acquired during the treatment course. The ability of pre-treatment factors and CBCT markers to predict lung density changes induced by radiotherapy was investigated. Age and CBCT markers extracted at the 10th, 20th, and 30th treatment fractions significantly predicted lung density changes in a multivariable analysis, and a set of response models based on these parameters was established. The correlation coefficients for the models were 0.35, 0.35, and 0.39 when based on the markers obtained at the 10th, 20th, and 30th fraction, respectively. The study indicates that younger patients without lung tissue reactions early in their treatment course may have minimal radiation-induced lung density increase at follow-up. Further investigations are needed to examine the ability of the models to identify patients with low risk of symptomatic toxicity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Simultaneous discovery, estimation and prediction analysis of complex traits using a bayesian mixture model.

    Directory of Open Access Journals (Sweden)

    Gerhard Moser

    2015-04-01

Full Text Available Gene discovery, estimation of heritability captured by SNP arrays, inference on genetic architecture and prediction analyses of complex traits are usually performed using different statistical models and methods, leading to inefficiency and loss of power. Here we use a Bayesian mixture model that simultaneously allows variant discovery, estimation of genetic variance explained by all variants and prediction of unobserved phenotypes in new samples. We apply the method to simulated data on quantitative traits and Wellcome Trust Case Control Consortium (WTCCC) data on disease and show that it provides accurate estimates of SNP-based heritability, produces unbiased estimators of risk in new samples, and that it can estimate genetic architecture by partitioning variation across hundreds to thousands of SNPs. We estimated that, depending on the trait, 2,633 to 9,411 SNPs explain all of the SNP-based heritability in the WTCCC diseases. The majority of those SNPs (>96%) had small effects, confirming a substantial polygenic component to common diseases. The proportion of the SNP-based variance explained by large effects (each SNP explaining 1% of the variance) varied markedly between diseases, ranging from almost zero for bipolar disorder to 72% for type 1 diabetes. Prediction analyses demonstrate that for diseases with major loci, such as type 1 diabetes and rheumatoid arthritis, Bayesian methods outperform profile scoring or mixed model approaches.
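The variance partitioning described in this abstract can be sketched numerically. The mixture proportions and per-SNP effect variances below are illustrative assumptions only, not the authors' estimates or their fitting algorithm:

```python
# Sketch of how a mixture prior over SNP effects partitions the
# SNP-based genetic variance. The component parameters are invented
# for illustration; they are NOT the paper's fitted values.

def variance_partition(n_snps, components):
    """components: list of (mixing_proportion, per_snp_effect_variance).
    Returns total genetic variance and each component's share of it."""
    contrib = [p * n_snps * v for p, v in components]
    total = sum(contrib)
    return total, [c / total for c in contrib]

# Hypothetical 4-component mixture: a null class plus small, medium
# and large effect classes, in the spirit of spike-and-slab priors.
components = [
    (0.990, 0.0),    # null SNPs contribute nothing
    (0.008, 1e-4),   # many small effects
    (0.0015, 1e-3),  # a few medium effects
    (0.0005, 1e-2),  # rare large effects
]
total, shares = variance_partition(100_000, components)
```

With these made-up components the rare large-effect class dominates the genetic variance, mirroring the paper's finding that the share of variance attributable to large effects varies strongly by trait.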

  11. A Scintillation Camera and 1600-Channel Analyser for the Diagnosis of Metastatic Cancer of the Liver

    Energy Technology Data Exchange (ETDEWEB)

    Jasinski, W. K.; Malinowska, J.; Mackiewicz, H.; Siwicki, H.; Tolwinski, J. [Institute of Oncology, Warsaw (Poland)

    1969-05-15

A series of 59 consecutive cases of histologically proven breast cancer were admitted for radiotherapy. The group was mostly composed of stage III/TNM. The patients were studied with a scintillation camera and a 1600-channel analyser after i.v. injection of 2 mCi of colloidal ⁹⁹ᵐTc. Up to 10⁴ counts were accumulated in a single element of the analyser matrix and were then transferred to computer-compatible punched tape. The data could be presented immediately in a number of different ways: (a) as conventional scintigraphic pictures on the camera's oscilloscope; (b) as nine pictures from the analyser's oscilloscope with 10, 20, 30 ... 90% of the counts erased; and (c) as one picture with the four selected levels of erase considered best for a given case. In addition, progressive increase of exposure and defocusing of each level was possible. Examination with colloidal ⁹⁹ᵐTc was found to be a safe method for the detection of malignant secondaries in the liver because of the low irradiation dose to the patient. The high radioactivity accumulated in the liver means a short examination time and the elimination of statistical fluctuations. Furthermore, the stored information enables the time-consuming work to be done in the patient's absence. (author)

  12. Multivariate differential analyses of adolescents' experiences of aggression in families

    Directory of Open Access Journals (Sweden)

    Chris Myburgh

    2011-01-01

Full Text Available Aggression is part of South African society and has implications for the mental health of persons living in South Africa. If parents are aggressive, adolescents are also likely to be aggressive, and that will impact negatively on their mental health. In this article the nature and extent of adolescents' experiences of aggression and aggressive behaviour in the family are investigated. A deductive, explorative, quantitative approach was followed. Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers), Cronbach's alpha, various consecutive first- and second-order factor analyses, correlations, multiple regression, MANOVA, ANOVA and Scheffé/Dunnett tests were used. It was found that aggression correlated negatively with the independent variables, and that the correlations between adolescents and their parents were significant. Regression analyses indicated that different predictors predicted aggression. Furthermore, the differences in experienced levels of aggression between adolescents and their parents were small. Implications for education are given.

  13. On the applicability of probabilistic analyses to assess the structural reliability of materials and components for solid-oxide fuel cells

    Energy Technology Data Exchange (ETDEWEB)

Lara-Curzio, Edgar [ORNL]; Radovic, Miladin [Texas A&M University]; Luttrell, Claire R [ORNL]

    2016-01-01

The applicability of probabilistic analyses to assess the structural reliability of materials and components for solid-oxide fuel cells (SOFC) is investigated by measuring the failure rate of Ni-YSZ when subjected to a temperature gradient and comparing it with that predicted using the Ceramics Analysis and Reliability Evaluation of Structures (CARES) code. A temperature gradient was chosen to induce stresses because temperature gradients resulting from gas flow patterns generate stresses during SOFC operation that are likely to control the structural reliability of cell components. The magnitude of the predicted failure rate was found to be comparable to that determined experimentally, which suggests that such probabilistic analyses are appropriate for predicting the structural reliability of materials and components for SOFCs. Considerations for performing more comprehensive studies are discussed.
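CARES-type reliability analyses rest on Weibull statistics for brittle fracture. A minimal sketch of the underlying two-parameter Weibull failure probability follows; the strength parameters are assumed values for illustration, since the abstract gives no Ni-YSZ figures:

```python
import math

def weibull_failure_probability(stress, sigma_0, m):
    """Two-parameter Weibull probability of failure for a uniformly
    stressed element: Pf = 1 - exp(-(sigma/sigma_0)**m).
    sigma_0 (characteristic strength) and m (Weibull modulus) are
    assumed example values, not measured Ni-YSZ properties."""
    return 1.0 - math.exp(-((stress / sigma_0) ** m))

# At the characteristic strength, Pf = 1 - 1/e, about 63.2%,
# regardless of the Weibull modulus.
pf = weibull_failure_probability(stress=120.0, sigma_0=120.0, m=10.0)
```

A full CARES analysis integrates this expression over the stress field of the component, which is how a temperature-gradient-induced stress distribution translates into a predicted failure rate.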

  14. Short communication: Prediction of retention pay-off using a machine learning algorithm.

    Science.gov (United States)

    Shahinfar, Saleh; Kalantari, Afshin S; Cabrera, Victor; Weigel, Kent

    2014-05-01

Replacement decisions have a major effect on dairy farm profitability. Dynamic programming (DP) has been widely studied to find the optimal replacement policies in dairy cattle. However, DP models are computationally intensive and might not be practical for daily decision making. Hence, the ability of machine learning, applied to a pre-run DP model, to provide fast and accurate predictions of nonlinear and intercorrelated variables makes it an ideal methodology. Milk class (1 to 5), lactation number (1 to 9), month in milk (1 to 20), and month of pregnancy (0 to 9) were used to describe all cows in a herd in a DP model. Twenty-seven scenarios based on all combinations of 3 levels (base, 20% above, and 20% below) of milk production, milk price, and replacement cost were solved with the DP model, resulting in a data set of 122,716 records, each with a calculated retention pay-off (RPO). Then, a machine learning model tree algorithm was used to mimic the RPO evaluated with DP. The correlation coefficient was used to assess the concordance of RPO evaluated by DP and RPO predicted by the model tree. The obtained correlation coefficient was 0.991, with a corresponding value of 0.11 for relative absolute error. At least 100 instances were required per model constraint, resulting in 204 total equations (models). When these models were used for binary classification of positive and negative RPO, error rates were 1% false negatives and 9% false positives. Applying this trained model from simulated data for prediction of RPO for 102 actual replacement records from the University of Wisconsin-Madison dairy herd resulted in a 0.994 correlation with a 0.10 relative absolute error rate. Overall results showed that the model tree has potential to be used in conjunction with DP to assist farmers in their replacement decisions. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
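The DP state space and scenario grid described above are straightforward to reproduce; the variable names here are ours, not the paper's implementation:

```python
from itertools import product

# Enumerate the cow states and economic scenarios described in the
# abstract. Names are illustrative, not from the paper's DP code.
milk_classes = range(1, 6)      # milk class 1-5
lactations = range(1, 10)       # lactation number 1-9
months_in_milk = range(1, 21)   # month in milk 1-20
months_pregnant = range(0, 10)  # month of pregnancy 0-9

cow_states = list(product(milk_classes, lactations,
                          months_in_milk, months_pregnant))

# 3 levels (base, +20%, -20%) for each of milk production, milk price
# and replacement cost give 3**3 = 27 scenarios.
levels = (1.0, 1.2, 0.8)
scenarios = list(product(levels, repeat=3))
```

The full grid gives 9,000 cow states and 27 scenarios; their product (243,000) exceeds the 122,716 RPO records reported, presumably because biologically infeasible state combinations are excluded from the DP model.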

  15. Development and internal validation of a side-specific, multiparametric magnetic resonance imaging-based nomogram for the prediction of extracapsular extension of prostate cancer.

    Science.gov (United States)

    Martini, Alberto; Gupta, Akriti; Lewis, Sara C; Cumarasamy, Shivaram; Haines, Kenneth G; Briganti, Alberto; Montorsi, Francesco; Tewari, Ashutosh K

    2018-04-19

To develop a nomogram for predicting side-specific extracapsular extension (ECE) for planning nerve-sparing radical prostatectomy, we retrospectively analysed data from 561 patients who underwent robot-assisted radical prostatectomy between February 2014 and October 2015. To develop a side-specific predictive model, we considered the prostatic lobes separately. Four variables were included: prostate-specific antigen; highest ipsilateral biopsy Gleason grade; highest ipsilateral percentage core involvement; and ECE on multiparametric magnetic resonance imaging (mpMRI). A multivariable logistic regression analysis was fitted to predict side-specific ECE. A nomogram was built based on the coefficients of the logit function. Internal validation was performed using 'leave-one-out' cross-validation. Calibration was graphically investigated. Decision curve analysis was used to evaluate the net clinical benefit. The study population consisted of 829 side-specific cases, after excluding negative biopsy observations (n = 293). ECE was reported on mpMRI and final pathology in 115 (14%) and 142 (17.1%) cases, respectively. Among these, mpMRI correctly predicted ECE in 57 (40.1%) cases. All variables in the model except highest percentage core involvement were predictors of ECE (all P ≤ 0.006). All variables were considered for inclusion in the nomogram. After internal validation, the area under the curve was 82.11%. The model demonstrated excellent calibration and improved clinical risk prediction, especially when compared with relying on mpMRI prediction of ECE alone. When the nomogram-derived probability was applied retrospectively with a 20% threshold for performing nerve-sparing, nine out of 14 positive surgical margins (PSMs) at the site of ECE fell above the threshold. We developed an easy-to-use model for the prediction of side-specific ECE, and hope it serves as a tool for planning nerve-sparing radical prostatectomy and in the reduction of PSM in
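A side-specific nomogram of this kind reduces to a logistic model over the four predictors. The coefficients below are hypothetical placeholders for illustration only, not the published model:

```python
import math

def ece_probability(psa, gleason_grade, pct_core, mri_ece, coef):
    """Logistic model of the same general form as the nomogram above.
    All values in `coef` are ILLUSTRATIVE placeholders, not the
    study's fitted coefficients."""
    z = (coef["intercept"]
         + coef["psa"] * psa
         + coef["gleason"] * gleason_grade
         + coef["pct_core"] * pct_core
         + coef["mri_ece"] * mri_ece)
    return 1.0 / (1.0 + math.exp(-z))

coef = {"intercept": -4.0, "psa": 0.08, "gleason": 0.45,
        "pct_core": 0.01, "mri_ece": 1.3}  # hypothetical values

p = ece_probability(psa=8.0, gleason_grade=3, pct_core=40.0,
                    mri_ece=1, coef=coef)
# Apply the 20% risk threshold the study used for nerve-sparing decisions.
nerve_sparing_contraindicated = p > 0.20
```

The nomogram itself is a graphical rendering of exactly this logit: each predictor's point scale is its coefficient times its value, and the total maps to a probability through the logistic function.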

  16. A simple lead dust fall method predicts children's blood lead level: New evidence from Australia.

    Science.gov (United States)

    Gulson, Brian; Taylor, Alan

    2017-11-01

We have measured dust fall accumulation in petri dishes (PDD) collected 6-monthly from inside residences in the Sydney urban area, New South Wales, Australia, as part of a 5-year longitudinal study to determine environmental associations, including soil, with blood lead (PbB) levels. The Pb loading in the dishes (n = 706) had a geometric mean (GM) of 24 µg/m²/30 d and a median of 22 µg/m²/30 d, with a range from 0.2 to 11,390 µg/m²/30 d. The observed geometric mean PbB was 2.4 µg/dL at ages 2-3 years. Regression analyses showed a statistically significant relationship between predicted PbB and PDD. The predicted PbB values from dust in our study are consistent with similar analyses from the US in which floor dust was collected by wipes. Predicted PbB values from PDD indicate that an increase in PDD of about 100 µg/m²/30 d would increase PbB by about 1.5 µg/dL, or a doubling of PbB at the low levels currently observed in many countries. Predicted PbB values from soil indicate that a change from 0 to 1000 mg Pb/kg results in an increase of 1.7 µg/dL in PbB, consistent with earlier investigations. Blood Pb levels can be predicted from dust fall accumulation (and soil) in cases where blood sampling is not always possible, especially in young children. Petri dish loading data could provide an alternative or complementary "action level" at about 100 µg Pb/m²/30 d, similar to the suggested level of about 110 µg Pb/m² for surface wipes, for use in monitoring activities such as housing rehabilitation, demolition or soil resuspension. Copyright © 2017 Elsevier Inc. All rights reserved.
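The dose-response figures quoted above imply a simple back-of-envelope predictor. Linearity is our assumption here; the published regressions need not be linear:

```python
def predicted_pbb_increase(pdd_loading, soil_pb):
    """Blood-lead increments implied by the abstract's figures:
    ~1.5 ug/dL per 100 ug/m^2/30 d of petri-dish dust loading and
    ~1.7 ug/dL per 1000 mg/kg of soil Pb. Treating both as linear
    and additive is an illustrative assumption."""
    dust_increment = 1.5 * (pdd_loading / 100.0)
    soil_increment = 1.7 * (soil_pb / 1000.0)
    return dust_increment + soil_increment

# The suggested "action level" of ~100 ug/m^2/30 d alone corresponds
# to roughly a 1.5 ug/dL rise in blood lead.
rise = predicted_pbb_increase(pdd_loading=100.0, soil_pb=0.0)
```

At the low PbB levels now observed (GM 2.4 µg/dL), a 1.5 µg/dL increment is why the authors describe the action-level loading as roughly a doubling of blood lead.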

  17. PBFA Z: A 20-MA z-pinch driver for plasma radiation sources

    International Nuclear Information System (INIS)

    Spielman, R.B.; Breeze, S.F.; Deeney, C.

    1996-01-01

Sandia National Laboratories is completing a major modification to the PBFA-II facility. PBFA Z will be a z-pinch driver capable of delivering up to 20 MA to a z-pinch load. It optimizes the electrical coupling to the implosion energy of z pinches at implosion velocities of ∼ 40 cm/μs. Design constraints resulted in an accelerator with a 0.12-Ω impedance, a 10.25-nH inductance, and a 120-ns pulse width. The design required new water transmission lines, insulator stack, and vacuum power feeds. Current is delivered to the z-pinch load through four self-magnetically-insulated vacuum transmission lines and a double post-hole convolute. A variety of design codes are used to model the power flow. These predict a peak current of 20 MA to a z-pinch load having a 2-cm length, a 2-cm radius, and a 15-mg mass, coupling 1.5 MJ into kinetic energy. We present 2-D Rad-Hydro calculations showing MJ x-ray outputs from tungsten wire-array z pinches
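The quoted implosion figures can be checked against the kinetic energy formula KE = ½mv²:

```python
# Order-of-magnitude check of the implosion figures quoted above:
# a 15-mg load imploding at ~40 cm/us (i.e. 4.0e5 m/s).
mass_kg = 15e-6                  # 15 mg
velocity_m_s = 40e-2 / 1e-6      # 40 cm/us -> 4.0e5 m/s

kinetic_energy_j = 0.5 * mass_kg * velocity_m_s ** 2  # ~1.2 MJ
```

This gives about 1.2 MJ, consistent with the quoted 1.5 MJ coupling given that ∼40 cm/μs is only an approximate implosion velocity.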

  18. Assessment of the influence of different sample processing and cold storage duration on plant free proline content analyses.

    Science.gov (United States)

    Teklić, Tihana; Spoljarević, Marija; Stanisavljević, Aleksandar; Lisjak, Miroslav; Vinković, Tomislav; Parađiković, Nada; Andrić, Luka; Hancock, John T

    2010-01-01

A widely accepted method for the analysis of free proline content in plant tissues is based on the use of 3% sulfosalicylic acid as an extractant, followed by spectrophotometric quantification of a proline-ninhydrin complex in toluene. However, sample preparation and storage may influence the proline actually measured, which can give misleading or hard-to-compare data. To evaluate free proline levels, fresh and frozen strawberry (Fragaria × ananassa Duch.) leaf and soybean [Glycine max (L.) Merr.] hypocotyl tissues were used. These were ground with or without liquid nitrogen and proline was extracted with sulfosalicylic acid. A particular focus was the influence of the duration of cold storage of plant samples (1, 4 and 12 weeks at -20°C) on the tissue proline levels measured. The free proline content analyses, carried out in leaves of Fragaria × ananassa Duch. as well as in hypocotyls of Glycine max (L.) Merr., showed a significant influence of the sample preparation method and the cold storage period. Long-term storage of up to 12 weeks at -20°C led to a significant increase in the measured proline in all samples analysed. The observed changes in proline content in plant tissue samples stored at -20°C indicate that proline content is likely to be over-estimated if the analyses are delayed. Plant sample processing and cold storage duration thus have an important influence on the results of proline analyses, and it is therefore recommended that samples be ground fresh and analysed immediately. Copyright © 2010 John Wiley & Sons, Ltd.

  19. Predictable return distributions

    DEFF Research Database (Denmark)

    Pedersen, Thomas Quistgaard

    trace out the entire distribution. A univariate quantile regression model is used to examine stock and bond return distributions individually, while a multivariate model is used to capture their joint distribution. An empirical analysis on US data shows that certain parts of the return distributions......-of-sample analyses show that the relative accuracy of the state variables in predicting future returns varies across the distribution. A portfolio study shows that an investor with power utility can obtain economic gains by applying the empirical return distribution in portfolio decisions instead of imposing...

  20. A Bayesian network model for predicting type 2 diabetes risk based on electronic health records

    Science.gov (United States)

    Xie, Jiang; Liu, Yan; Zeng, Xu; Zhang, Wu; Mei, Zhen

    2017-07-01

An extensive, in-depth study of diabetes risk factors (DBRF) is of crucial importance to prevent (or reduce) the chance of suffering from type 2 diabetes (T2D). The accumulation of electronic health records (EHRs) makes it possible to build nonlinear relationships between risk factors and diabetes. However, current DBRF research mainly focuses on qualitative analyses, and inconsistency among physical examination items means that risk factors are easily lost, which motivates the study of novel machine learning approaches for risk model development. In this paper, we use Bayesian networks (BNs) to analyze the relationship between physical examination information and T2D, and to quantify the link between risk factors and T2D. Furthermore, building on the quantitative analyses of DBRF, we adopt EHRs and propose a machine learning approach based on BNs to predict the risk of T2D. The experiments demonstrate that our approach can lead to better predictive performance than the classical risk model.
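The kind of quantitative link a Bayesian network encodes can be illustrated with a single risk-factor node and Bayes' rule. The probabilities are invented for the example; a real BN over EHR data would have many nodes with conditional probability tables learned from records:

```python
# Minimal one-factor illustration of BN-style risk quantification.
# All probabilities below are invented for the example.
p_t2d = 0.10                   # prior P(T2D)
p_factor_given_t2d = 0.60      # P(risk factor present | T2D)
p_factor_given_healthy = 0.20  # P(risk factor present | no T2D)

# Bayes' rule: P(T2D | risk factor present)
evidence = (p_factor_given_t2d * p_t2d
            + p_factor_given_healthy * (1 - p_t2d))
posterior = p_factor_given_t2d * p_t2d / evidence
```

Here observing the factor raises the risk from 10% to 25%; a full BN chains many such updates while exploiting conditional independences between examination items.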

  1. PBFA Z: A 20-MA Z-pinch driver for plasma radiation sources

    International Nuclear Information System (INIS)

    Spielman, R.B.; Breeze, S.F.; Deeney, C.

    1996-01-01

    Sandia National Laboratories is completing a major modification to the PBFA-II facility. PBFA Z will be capable of delivering up to 20 MA to a z-pinch load. It optimizes the electrical coupling to the implosion energy of z pinches at implosion velocities of ∼ 40 cm/μs. Design constraints resulted in an accelerator with a 0.12-Ω impedance, a 10.25-nH inductance, and a 120-ns pulse width. The design required new water transmission lines, insulator stack, and vacuum power feeds. Current is delivered to the z-pinch load through four self-magnetically-insulated vacuum transmission lines and a double post-hole convolute. A variety of design codes are used to model the power flow. These predict a peak current of 20 MA to a z-pinch load having a 2-cm length, a 2-cm radius, and a 15-mg mass, coupling 1.5 MJ into kinetic energy. Calculations are presented showing MJ x-ray outputs from tungsten wire-array z pinches. (author). 4 figs., 14 refs

  2. PBFA Z: A 20-MA Z-pinch driver for plasma radiation sources

    Energy Technology Data Exchange (ETDEWEB)

Spielman, R B; Breeze, S F; Deeney, C [Sandia Labs., Albuquerque, NM (United States)]; and others

    1997-12-31

Sandia National Laboratories is completing a major modification to the PBFA-II facility. PBFA Z will be capable of delivering up to 20 MA to a z-pinch load. It optimizes the electrical coupling to the implosion energy of z pinches at implosion velocities of ∼ 40 cm/μs. Design constraints resulted in an accelerator with a 0.12-Ω impedance, a 10.25-nH inductance, and a 120-ns pulse width. The design required new water transmission lines, insulator stack, and vacuum power feeds. Current is delivered to the z-pinch load through four self-magnetically-insulated vacuum transmission lines and a double post-hole convolute. A variety of design codes are used to model the power flow. These predict a peak current of 20 MA to a z-pinch load having a 2-cm length, a 2-cm radius, and a 15-mg mass, coupling 1.5 MJ into kinetic energy. Calculations are presented showing MJ x-ray outputs from tungsten wire-array z pinches. (author). 4 figs., 14 refs.

  3. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated

  4. Datamonkey 2.0: a modern web application for characterizing selective and other evolutionary processes.

    Science.gov (United States)

    Weaver, Steven; Shank, Stephen D; Spielman, Stephanie J; Li, Michael; Muse, Spencer V; Kosakovsky Pond, Sergei L

    2018-01-02

Inference of how evolutionary forces have shaped extant genetic diversity is a cornerstone of modern comparative sequence analysis. Advances in sequence generation and increased statistical sophistication of relevant methods now allow researchers to extract ever more evolutionary signal from the data, albeit at an increased computational cost. Here, we announce the release of Datamonkey 2.0, a completely re-engineered version of the Datamonkey web-server for analyzing evolutionary signatures in sequence data. For this endeavor, we leveraged recent developments in open-source libraries that facilitate interactive, robust, and scalable web application development. Datamonkey 2.0 provides a carefully curated collection of methods for interrogating coding-sequence alignments for imprints of natural selection, packaged as a responsive (i.e. can be viewed on tablet and mobile devices), fully interactive, and API-enabled web application. To complement Datamonkey 2.0, we additionally release HyPhy Vision, an accompanying JavaScript application for visualizing analysis results. HyPhy Vision can also be used separately from Datamonkey 2.0 to visualize locally-executed HyPhy analyses. Together, Datamonkey 2.0 and HyPhy Vision showcase how scientific software development can benefit from general-purpose open-source frameworks. Datamonkey 2.0 is freely and publicly available at http://www.datamonkey.org, and the underlying codebase is available from https://github.com/veg/datamonkey-js. © The Author 2018. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. A Quantum Annealing Computer Team Addresses Climate Change Predictability

    Science.gov (United States)

Halem, M. (Principal Investigator); LeMoigne, J.; Dorband, J.; Lomonaco, S.; Yesha, Ya.; Simpson, D.; Clune, T.; Pelissier, C.; Nearing, G.; Gentine, P.; et al.

    2016-01-01

The near confluence of the successful launch of the Orbiting Carbon Observatory-2 on July 2, 2014 and the acceptance on August 20, 2015 by Google, NASA Ames Research Center and USRA of a 1152-qubit D-Wave 2X Quantum Annealing Computer (QAC) offered an exceptional opportunity to explore the potential of this technology to address the scientific prediction of global annual carbon uptake by land surface processes. At UMBC, we have collected and processed 20 months of global Level 2 Lite CO2 data as well as fluorescence data. In addition, we have collected ARM data at 2 sites in the US and Ameriflux data at more than 20 stations. J. Dorband has developed and implemented a multi-hidden-layer Boltzmann Machine (BM) algorithm on the QAC. Employing the BM, we are calculating CO2 fluxes by training collocated OCO-2 Level 2 CO2 data with ARM ground station tower data to infer measured CO2 flux data. We generate CO2 fluxes with a regression analysis using these BM-derived weights on the Level 2 CO2 data for three Ameriflux sites distinct from the ARM stations. P. Gentine has negotiated access to K34 Ameriflux data in the Amazon and is applying a neural net to infer the CO2 fluxes. N. Talik validated the accuracy of the BM performance on the QAC against a restricted BM implementation on the IBM Softlayer Cloud with Nvidia co-processors utilizing the same data sets. G. Nearing and K. Harrison have extended the GSFC LIS model with the NCAR Noah photosynthetic parameterization and have run a 10-year global prediction of the net ecosystem exchange. C. Pellisier is preparing a BM implementation of Kalman filter data assimilation of CO2 fluxes. At UMBC, R. Prouty is conducting OSSE experiments with the LIS-Noah model on the IBM iDataPlex to simulate the impact of CO2 fluxes to improve the prediction of global annual carbon uptake. J. LeMoigne and D. Simpson have developed a neural net image registration system that will be used for MODIS ENVI and will be

  6. Prediction of Mortality after Emergent Transjugular Intrahepatic Portosystemic Shunt Placement: Use of APACHE II, Child-Pugh and MELD Scores in Asian Patients with Refractory Variceal Hemorrhage

    Energy Technology Data Exchange (ETDEWEB)

Tzeng, Wen Sheng; Wu, Reng Hong; Lin, Ching Yih; Chen, Jyh Jou; Sheu, Ming Juen; Koay, Lok Beng; Lee, Chuan [Chi-Mei Foundation Medical Center, Tainan (China)]

    2009-10-15

    This study was designed to determine if existing methods of grading liver function that have been developed in non-Asian patients with cirrhosis can be used to predict mortality in Asian patients treated for refractory variceal hemorrhage by the use of the transjugular intrahepatic portosystemic shunt (TIPS) procedure. Data for 107 consecutive patients who underwent an emergency TIPS procedure were retrospectively analyzed. Acute physiology and chronic health evaluation (APACHE II), Child-Pugh and model for end-stage liver disease (MELD) scores were calculated. Survival analyses were performed to evaluate the ability of the various models to predict 30-day, 60-day and 360-day mortality. The ability of stratified APACHE II, Child-Pugh, and MELD scores to predict survival was assessed by the use of Kaplan-Meier analysis with the log-rank test. No patient died during the TIPS procedure, but 82 patients died during the follow-up period. Thirty patients died within 30 days after the TIPS procedure; 37 patients died within 60 days and 53 patients died within 360 days. Univariate analysis indicated that hepatorenal syndrome, use of inotropic agents and mechanical ventilation were associated with elevated 30-day mortality (p < 0.05). Multivariate analysis showed that a Child-Pugh score > 11 or an MELD score > 20 predicted increased risk of death at 30, 60 and 360 days (p < 0.05). APACHE II scores could only predict mortality at 360 days (p < 0.05). A Child-Pugh score > 11 or an MELD score > 20 are predictive of mortality in Asian patients with refractory variceal hemorrhage treated with the TIPS procedure. An APACHE II score is not predictive of early mortality in this patient population.
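The reported thresholds amount to a simple screening rule; the sketch below restates the finding and is not a validated clinical tool:

```python
def high_mortality_risk(child_pugh, meld):
    """Flags the thresholds reported above: a Child-Pugh score > 11
    or a MELD score > 20 predicted increased 30-, 60- and 360-day
    mortality after emergent TIPS. Illustrative restatement only."""
    return child_pugh > 11 or meld > 20

flagged = high_mortality_risk(child_pugh=12, meld=18)  # Child-Pugh > 11
```

Note that by the study's multivariate results, APACHE II scores would not enter such an early-mortality rule, since they only predicted mortality at 360 days.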

  7. Web 2.0 in healthcare: state-of-the-art in the German health insurance landscape.

    Science.gov (United States)

    Kuehne, Mirko; Blinn, Nadine; Rosenkranz, Christoph; Nuettgens, Markus

    2011-01-01

The Internet is increasingly used as a source of information and knowledge. Even in the field of healthcare, information is widely available, and patients and their relatives increasingly use the Internet to search for healthcare information and applications. "Health 2.0" - the increasing use of Web 2.0 technologies and tools in electronic healthcare - promises new ways of interaction, communication, and participation in healthcare. To explore how Web 2.0 applications are adopted and implemented by health information providers in general, we analysed the websites of all German health insurance companies with regard to the Web 2.0 applications they provide. As health insurers play a highly relevant role in the German healthcare system, we conducted an exploratory survey to answer questions about the adoption and implementation of Web 2.0 technologies. Hence, the websites of all 198 private and public health insurers were analysed. The results show widespread diffusion of Web 2.0 applications but also large differences in implementation across insurers. Our findings therefore provide a foundation for further research on the aspects that drive adoption.

  8. Competition and stability analyses among emissions, energy, and economy: Application for Mexico

    International Nuclear Information System (INIS)

    Pao, Hsiao-Tien; Fu, Hsin-Chia

    2015-01-01

    In view of the limited natural resources on Earth, the linkage among environment, energy, and economy (3Es) has become an important perspective for sustainable development. This paper proposes the Lotka–Volterra model for SUstainable Development (LV-SUD) to analyse the interspecific interactions, the equilibria and their stabilities among emissions, different types of energy consumption (renewable, nuclear, and fossil fuel), and real GDP, the main factors in 3Es issues. Modelling these interactions provides a useful multivariate framework for predicting outcomes. Interaction among the 3Es, namely competition, symbiosis, or predation, plays an important role in policy development to achieve a balanced use of energy resources and to strengthen the green economy. Applying LV-SUD to Mexico, an emerging-market country, the analysis shows that there is mutualism between fossil fuel consumption and GDP; prey-predator relationships in which fossil fuel and GDP enhance the growth of emissions, but emissions inhibit the growth of the others; and commensalisms in which GDP benefits from nuclear power and renewable power benefits from fossil fuel. It is suggested that national energy policies should remain committed to decoupling non-clean energy from GDP, to actively developing clean energy and thereby to properly reducing fossil fuel consumption and emissions without harming economic growth. - Highlights: • LV-SUD is used to analyse the competition between environment-energy-economy (3Es). • The competitions between renewable, nuclear, and fossil energy are analysed. • Competition between 3Es plays an important role in policy development. • LV-SUD provides a useful multivariate framework for predicting outcomes. • An application to emerging-market countries such as Mexico is presented
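    The interaction types named above (mutualism, competition, prey-predator) can be sketched with a toy two-species Lotka-Volterra system of the kind LV-SUD generalises. The coefficients below are illustrative only, not the paper's fitted values; the signs of the cross-terms encode the relationship.

```python
# Toy two-species Lotka-Volterra interaction system, integrated by forward
# Euler. Coefficients are illustrative, NOT the LV-SUD fitted values:
# c1, c2 > 0 gives mutualism; both negative gives competition; mixed signs
# give a prey-predator relationship.

def lv_step(x, y, dt, a1, b1, c1, a2, b2, c2):
    """One forward-Euler step of dx/dt = x*(a1 + b1*x + c1*y),
    dy/dt = y*(a2 + b2*y + c2*x)."""
    dx = x * (a1 + b1 * x + c1 * y)
    dy = y * (a2 + b2 * y + c2 * x)
    return x + dt * dx, y + dt * dy

def equilibrium(x0, y0, steps=5000, dt=0.01, **coef):
    """Integrate long enough to settle and return the final state."""
    x, y = x0, y0
    for _ in range(steps):
        x, y = lv_step(x, y, dt, **coef)
    return x, y

# Mutualism (c1, c2 > 0): both populations settle above their
# stand-alone carrying capacity of 1.0.
x_eq, y_eq = equilibrium(0.5, 0.5, a1=1.0, b1=-1.0, c1=0.3,
                         a2=1.0, b2=-1.0, c2=0.2)
```

    With these signs the fixed point solves x = 1 + 0.3y and y = 1 + 0.2x, so both equilibria exceed 1.0, the numerical signature of mutualism.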

  9. Prediction of Mortality after Emergent Transjugular Intrahepatic Portosystemic Shunt Placement: Use of APACHE II, Child-Pugh and MELD Scores in Asian Patients with Refractory Variceal Hemorrhage

    International Nuclear Information System (INIS)

    Tzeng, Wen Sheng; Wu, Reng Hong; Lin, Ching Yih; Chen, Jyh Jou; Sheu, Ming Juen; Koay, Lok Beng; Lee, Chuan

    2009-01-01

    This study was designed to determine if existing methods of grading liver function that have been developed in non-Asian patients with cirrhosis can be used to predict mortality in Asian patients treated for refractory variceal hemorrhage by the use of the transjugular intrahepatic portosystemic shunt (TIPS) procedure. Data for 107 consecutive patients who underwent an emergency TIPS procedure were retrospectively analyzed. Acute physiology and chronic health evaluation (APACHE II), Child-Pugh and model for end-stage liver disease (MELD) scores were calculated. Survival analyses were performed to evaluate the ability of the various models to predict 30-day, 60-day and 360-day mortality. The ability of stratified APACHE II, Child-Pugh, and MELD scores to predict survival was assessed by the use of Kaplan-Meier analysis with the log-rank test. No patient died during the TIPS procedure, but 82 patients died during the follow-up period. Thirty patients died within 30 days after the TIPS procedure; 37 patients died within 60 days and 53 patients died within 360 days. Univariate analysis indicated that hepatorenal syndrome, use of inotropic agents and mechanical ventilation were associated with elevated 30-day mortality (p < 0.05). Multivariate analysis showed that a Child-Pugh score > 11 or an MELD score > 20 predicted increased risk of death at 30, 60 and 360 days (p < 0.05). APACHE II scores could only predict mortality at 360 days (p < 0.05). A Child-Pugh score > 11 or an MELD score > 20 are predictive of mortality in Asian patients with refractory variceal hemorrhage treated with the TIPS procedure. An APACHE II score is not predictive of early mortality in this patient population
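    The reported cut-offs reduce to a simple decision rule; a minimal sketch, in which only the thresholds (Child-Pugh > 11, MELD > 20) come from the study and the function name and labels are our own illustrative choices:

```python
# Minimal sketch of the risk stratification reported above. Only the
# thresholds come from the abstract; the function name and the returned
# labels are hypothetical.

def tips_risk_group(child_pugh: int, meld: float) -> str:
    """Flag elevated risk of death at 30, 60 and 360 days post-TIPS
    when Child-Pugh > 11 or MELD > 20 (strict inequalities)."""
    if child_pugh > 11 or meld > 20:
        return "elevated"
    return "standard"
```

    For example, a patient with Child-Pugh 12 and MELD 15 falls in the elevated-risk group on the Child-Pugh criterion alone.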

  10. NCBI nr-aa BLAST: CBRC-CFAM-20-0004 [SEVENS

    Lifescience Database Archive (English)

    Full Text Available CBRC-CFAM-20-0004 ref|NP_001008277.1| rhodopsin [Canis lupus familiaris] ref|XP_855...608.1| PREDICTED: rhodopsin [Canis familiaris] emb|CAA70209.1| unnamed protein product [Canis familiaris] NP_001008277.1 0.0 97% ...

  11. Plant corrosion: prediction of materials performance

    International Nuclear Information System (INIS)

    Strutt, J.E.; Nicholls, J.R.

    1987-01-01

    Seventeen papers have been compiled forming a book on computer-based approaches to corrosion prediction in a wide range of industrial sectors, including the chemical, petrochemical and power generation industries. Two papers have been selected and indexed separately. The first describes a system operating within BNFL's Reprocessing Division to predict materials performance in corrosive conditions to aid future plant design. The second describes the truncation of the distribution function of pit depths during high temperature oxidation of a 20Cr austenitic steel in the fuel cladding in AGR systems. (U.K.)

  12. Thermal analyses for a nuclear-waste repository in tuff using USW-G1 borehole data

    International Nuclear Information System (INIS)

    Johnson, R.L.

    1982-10-01

    Thermal calculations using properties of tuffs obtained from the USW-G1 borehole, located near the SW margin of the Nevada Test Site (NTS), have been completed for a nuclear waste repository sited in welded tuff below the water table. The analyses considered two wasteforms, high-level waste and spent fuel, emplaced at two different gross thermal loadings, 50 and 75 kW/acre (20.24 and 30.36 kW/ha). Calculations were made assuming that no boiling of the groundwater occurs; i.e., that the hydrostatic head potential was reestablished soon after waste emplacement. 23 figures, 2 tables

  13. Predicting violence and recidivism in a large sample of males on probation or parole.

    Science.gov (United States)

    Prell, Lettie; Vitacco, Michael J; Zavodny, Denis

    This study evaluated the utility of items and scales from the Iowa Violence and Victimization Instrument in a sample of 1,961 males from the state of Iowa who were on probation or released from prison to parole supervision. This is the first study to examine the potential of the Iowa Violence and Victimization Instrument to predict criminal offenses. The males were followed for 30 months immediately following their admission to probation or parole. AUC analyses indicated fair to good predictive power for the Iowa Violence and Victimization Instrument for charges of violence and victimization, but chance predictive power for drug offenses. Notably, both scales of the instrument performed equally well at the 30-month follow-up. Items on the Iowa Violence and Victimization Instrument not only predicted violence but are also straightforward to score. Violence management strategies are discussed as they relate to the current findings, including the potential to expand the measure to other jurisdictions and populations. Copyright © 2016 Elsevier Ltd. All rights reserved.
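    The AUC used above is the Mann-Whitney concordance probability: the chance that a randomly chosen reoffender scores higher on the instrument than a randomly chosen non-reoffender. A pure-Python sketch with made-up scores (the instrument's real items and weights are not reproduced here):

```python
# AUC as a concordance probability over all positive/negative score pairs.
# Scores below are invented for illustration only.

def auc(pos_scores, neg_scores):
    """Probability that a positive case outranks a negative one;
    ties count one half. 0.5 is chance, 1.0 is perfect ranking."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

reoffended = [7, 9, 6, 8]      # hypothetical instrument scores
no_reoffense = [3, 5, 6, 2]
```

    Here auc(reoffended, no_reoffense) is 15.5/16 ≈ 0.97; identical score distributions give exactly 0.5, matching the "chance predictive power" reported for drug offenses.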

  14. Independent Predictive Factors of Hospitalization in a North-West Burn Center of Iran; an Epidemiologic Study

    Directory of Open Access Journals (Sweden)

    Samad Shams Vahdati

    2015-01-01

    Full Text Available Introduction: A high-grade burn is one of the most devastating injuries, with serious medical, social, economic, and psychological effects. These injuries are the most common cause of accidental death after traffic injuries in both developed and developing countries. This study therefore aimed to determine the demographic characteristics of patients with burn injury admitted to the emergency department and to identify predictive factors of hospitalization. Methods: This cross-sectional descriptive study was conducted from 20 March to 20 September 2011 in the emergency department of Sina Hospital, Tabriz, Iran. Patient information including demographic characteristics, cause of burn, place of accident, anatomical areas burned, grade and percentage of burn, and disposition was gathered and analyzed using SPSS version 18.0 statistical software. Stepwise multivariate regression analysis was used to identify independent predictive factors of hospitalization in burned patients. Results: One hundred and sixty patients were enrolled (54.4% female). The average age was 20.47±13.5 years. The prevalence of burns was significantly higher in patients under 20 years of age (p<0.001). The lower limb (37.5%), head and neck (21.25%) and upper limb (17.5%) were the three most frequent sites of burn. The most common cause of burns was scalding by boiling water (34.4%). Home-related burns were significantly more frequent than burns in other places (p<0.001). The most frequent extent of burn was <5% (46.25%). Finally, 50 (31.25%) cases were hospitalized. Univariate analysis demonstrated that age under 20 years (p=0.02), female gender (p=0.02), burn site (p=0.002), cause (p=0.005), place (p<0.001), grade (p<0.001), and percentage (p<0.001) were related to patient disposition. Stepwise multiple logistic regression showed female gender (OR=3.52; 95% CI: 1.57-7.88; p=0.002), work-related burning (OR=1.78; 95% CI: 1.26-2.52; p=0.001), and burning over 5 percent (OR=2.15; 95% CI: 1.35-3.41; p=0.001) as

  15. Experimental study of residual stresses in laser clad AISI P20 tool steel on pre-hardened wrought P20 substrate

    International Nuclear Information System (INIS)

    Chen, J.-Y.; Conlon, K.; Xue, L.; Rogge, R.

    2010-01-01

    Research highlights: → Laser cladding of P20 tool steel. → Residual stress analysis of laser clad P20 tool steel. → Microstructure of laser clad P20 tool steel. → Tooling repair using laser cladding. → Stress-relieving treatment of laser clad P20 tool steel. - Abstract: Laser cladding deposits a desired material onto the surface of a base material (or substrate) with a relatively low heat input to form a metallurgically sound and dense clad. This process has been successfully applied to repairing damaged high-value tooling to reduce its through-life cost. However, laser cladding, which needs to melt a small amount of the substrate along with the cladding material, inevitably introduces residual stresses in both clad and substrate. Tensile residual stresses in the clad could adversely affect the mechanical performance of the component being repaired. This paper presents an experimental study of process-induced residual stresses in AISI P20 tool steel laser clad onto pre-hardened wrought P20 base material, and their correlation with microstructure, using hole-drilling and neutron diffraction methods. Combined with X-ray diffraction and scanning electron microscopy analyses, the roles of solid-state phase transformations in the clad and the heat-affected zone (HAZ) of the substrate, during cladding and post-cladding heat treatments, in the development and controllability of residual stresses in the P20 clad have been investigated; the results could contribute to more effective repair of damaged plastic injection molds made of P20 tool steel.

  16. Stock price prediction using geometric Brownian motion

    Science.gov (United States)

    Farida Agustini, W.; Restu Affianti, Ika; Putri, Endah RM

    2018-03-01

    Geometric Brownian motion is a mathematical model for predicting the future price of a stock. Before the prediction step, the expected-price formulation is derived and the 95% confidence level is set. In stock price prediction using the geometric Brownian motion model, the algorithm starts by calculating the returns, followed by estimating the volatility and drift, obtaining the stock price forecast, calculating the forecast MAPE, calculating the expected stock price and calculating the 95% confidence interval. Based on the research, the output analysis shows that the geometric Brownian motion model is a prediction technique with a high rate of accuracy, demonstrated by a forecast MAPE value ≤ 20%.
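    The workflow above can be sketched in a few lines: estimate drift and volatility from historical log-returns, project the expected price under GBM, and attach a 95% band (z = 1.96). This is a generic GBM sketch under those standard formulas, not the paper's exact code; prices and horizon are illustrative.

```python
import math
import statistics

# GBM forecasting sketch: for log-return mean m and volatility s per step,
# E[S_t] = S0 * exp((m + s^2/2) * t), with a 95% band on the log-price.

def gbm_forecast(prices, horizon):
    """Return (expected_price, (lower, upper)) `horizon` steps ahead."""
    log_rets = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    m = statistics.mean(log_rets)     # mean log-return per step
    s = statistics.stdev(log_rets)    # volatility per step
    s0 = prices[-1]
    expected = s0 * math.exp((m + 0.5 * s * s) * horizon)
    half = 1.96 * s * math.sqrt(horizon)
    band = (s0 * math.exp(m * horizon - half),
            s0 * math.exp(m * horizon + half))
    return expected, band

def mape(actual, forecast):
    """Mean absolute percentage error; <= 20% counts as high accuracy
    on the scale used in the study."""
    return 100.0 * statistics.mean(
        abs((a - f) / a) for a, f in zip(actual, forecast))
```

    For a deterministic price series the volatility estimate collapses to zero and the band closes onto the expected price, which is a quick sanity check on the formulas.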

  17. Predicting the water-drop energy required to breakdown dry soil aggregates

    International Nuclear Information System (INIS)

    Mbagwu, J.S.C.; Bazzoffi, P.

    1995-04-01

    The raindrop energy required to break down dry soil aggregates is an index of structural stability that has been found very useful in modelling soil erosion processes and in evaluating the suitability of tillage implements for different soils. The aim of this research was to develop and validate a model for predicting the specific water-drop energy required to break down aggregates (D) as influenced by soil properties. Air-dry aggregates (2-4 mm in diameter), collected from 15 surface (0-20 cm) soils in north-central Italy, were used for this study. The actual and natural-log-transformed D values were regressed on the soil properties. Clay content, wilting-point moisture content (WP) and percent water-stable aggregates (WSA) > 2.0 mm were good predictors of D. Empirical models developed from either clay content or WP predicted D in 70% of the test soils, whereas the model developed from WSA > 2.0 mm predicted D in 90% of the test soils. The correlation coefficients (r) between measured and predicted D were 0.961, 0.963 and 0.997, respectively, for the models developed from clay, WP and WSA > 2.0 mm. The validity of these models needs to be tested on other soils with a wider variation in properties than those used to develop the models. (author). 42 refs, 5 tabs
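    The single-predictor empirical models above are ordinary least-squares regressions of D on one aggregate property, judged by the correlation coefficient r. A generic sketch of that fit (the data in the test are illustrative, not the 15 Italian soils):

```python
import math

# Ordinary least squares of y (e.g. drop energy D) on one predictor x
# (e.g. clay content), returning slope, intercept and Pearson r.

def fit_line(x, y):
    """Least-squares line y ≈ slope*x + intercept, plus Pearson r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / math.sqrt(sxx * syy)
    return slope, intercept, r
```

    An r near 1 (as with the 0.997 reported for the WSA > 2.0 mm model) indicates the fitted line captures nearly all the variation in measured D.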

  18. A comparison of the lactate Pro, Accusport, Analox GM7 and Kodak Ektachem lactate analysers in normal, hot and humid conditions.

    Science.gov (United States)

    Mc Naughton, L R; Thompson, D; Philips, G; Backx, K; Crickmore, L

    2002-02-01

    This study aimed to compare the performance of a new portable lactate analyser against other standard laboratory methods in three conditions: normal (20 +/- 1.3 degrees C; 40 +/- 5% RH), hot (40 +/- 2.5 degrees C; 40 +/- 5% RH), and humid (20 +/- 1.1 degrees C; 82 +/- 6% RH). Seven healthy males ([mean +/- SE]: age, 26.3 +/- 1.3 yr; height, 177.7 +/- 1.6 cm; weight, 77.4 +/- 0.9 kg; VO2max, 56.1 +/- 1.9 ml x kg(-1) x min(-1)) undertook a maximal cycle ergometry test to exhaustion in each of the three conditions. Blood was taken every 3 min at the end of each stage and was analysed using the Lactate Pro LT-1710, the Accusport, the Analox GM7 and the Kodak Ektachem systems. The MANOVA (Analyser Type x Condition x Workload) indicated no interaction effect (F(42,660) = 0.45, p > 0.99, power = 0.53). The data across all workloads indicated that the machines measured significantly differently from each other (F(4,743) = 14.652, p < 0.0001, power = 1.00). The data were moderately to highly correlated. We conclude that the Lactate Pro is a simple and effective device for measuring blood lactate in a field or laboratory setting. However, we would caution against using this machine to compare data from other machines.

  19. Nonlinear piping damping and response predictions

    International Nuclear Information System (INIS)

    Severud, L.K.; Weiner, E.O.; Lindquist, M.R.; Anderson, M.J.; Wagner, S.E.

    1986-10-01

    The high-level dynamic testing of four prototypic piping systems, used to provide benchmarks for analytical prediction comparisons, is overviewed. The pipes tested ranged from one to six inches in diameter and consisted of carbon steel or stainless steel. Failure of the tested systems included progressive gross deformation or some combination of ratchetting and fatigue. Pretest failure predictions and post-test comparisons using simplified elastic and elasto-plastic methods are presented. Detailed nonlinear inelastic analyses are also shown, along with a typical ratchet-fatigue failure calculation. A simplified method for calculating modal equivalent viscous damping for snubbers and plastic hinges is also described. Conclusions are made regarding the applicability of the various analytical failure prediction methods and recommendations are made for future analytic and test efforts

  20. Evaluation of the Thermochemical Code - CHEETAH 2.0 for Modelling Explosives Performance

    National Research Council Canada - National Science Library

    Lu, Jing

    2001-01-01

    The Lawrence Livermore National Laboratory CHEETAH 2.0 program has been used to analyse a number of conventional ideal explosive ingredients, ideal explosive compositions, non-ideal explosive compositions, and new and proposed explosives...

  1. Coupled 230Th/234U-ESR analyses for corals: A new method to assess sealevel change

    International Nuclear Information System (INIS)

    Blackwell, Bonnie A.B.; Teng, Steve J.T.; Lundberg, Joyce A.; Blickstein, Joel I.B.; Skinner, Anne R.

    2007-01-01

    Although coupled 230Th/234U-ESR analyses have become routine for dating teeth, they have never been used for corals. While the ESR age depends on, and requires assumptions about, the time-averaged cosmic dose rate, D̄cos(t), 230Th/234U dates do not. Since the D̄cos(t) received by corals depends on the attenuation by any intervening material, the D̄cos(t) response reflects changing water depths and sediment cover. By coupling the two methods, one can determine the age and a unique D̄cos,coupled(t) simultaneously. From a coral's water depth and sedimentary history as predicted by a given sealevel curve, one can predict D̄cos,sealevel(t). If D̄cos,coupled(t) agrees well with D̄cos,sealevel(t), this provides independent validation for the curve used to build D̄cos,sealevel(t). For six corals dated at 7-128 ka from Florida Platform reef crests, the sealevel curve of Waelbroeck et al. [2002. Sea-level and deep water temperature changes derived from benthonic foraminifera isotopic records. Quat. Sci. Rev. 21, 295-305] predicted their D̄cos,coupled(t) values as well as, or better than, the SPECMAP sealevel curve. Where a whole reef can be sampled over a transect, a precise test for sealevel curves could be developed

  2. An index predictive of cognitive outcome in retired professional American Football players with a history of sports concussion.

    Science.gov (United States)

    Wright, Mathew J; Woo, Ellen; Birath, J Brandon; Siders, Craig A; Kelly, Daniel F; Wang, Christina; Swerdloff, Ronald; Romero, Elizabeth; Kernan, Claudia; Cantu, Robert C; Guskiewicz, Kevin

    2016-01-01

    Various concussion characteristics and personal factors are associated with cognitive recovery in athletes. We developed an index based on concussion frequency, severity, and timeframe, as well as cognitive reserve (CR), and we assessed its predictive power regarding cognitive ability in retired professional football players. Data from 40 retired professional American football players were used in the current study. On average, participants had been retired from football for 20 years. Current neuropsychological performances, indicators of CR, concussion history, and play data were used to create an index for predicting cognitive outcome. The sample displayed a range of concussions, concussion severities, seasons played, CR, and cognitive ability. Many of the participants demonstrated cognitive deficits. The index strongly predicted global cognitive ability (R(2) = .31). The index also predicted the number of areas of neuropsychological deficit, which varied as a function of the deficit classification system used (Heaton: R(2) = .15; Wechsler: R(2) = .28). The current study demonstrated that a unique combination of CR, sports concussion, and game-related data can predict cognitive outcomes in participants who had been retired from professional American football for an average of 20 years. Such indices may prove to be useful for clinical decision making and research.

  3. HLA region excluded by linkage analyses of early onset periodontitis

    Energy Technology Data Exchange (ETDEWEB)

    Sun, C.; Wang, S.; Lopez, N.

    1994-09-01

    Previous studies suggested that HLA genes may influence susceptibility to early-onset periodontitis (EOP). Segregation analyses indicate that EOP may be due to a single major gene. We conducted linkage analyses to assess possible HLA effects on EOP. Fifty families with two or more close relatives affected by EOP were ascertained in Virginia and Chile. A microsatellite polymorphism within the HLA region (at the tumor necrosis factor beta locus) was typed using PCR. Linkage analyses used a dominant model most strongly supported by previous studies. Assuming locus homogeneity, our results exclude a susceptibility gene within 10 cM on either side of our marker locus. This encompasses all of the HLA region. Analyses assuming alternative models gave qualitatively similar results. Allowing for locus heterogeneity, our data still provide no support for HLA-region involvement. However, our data do not statistically exclude (LOD < -2.0) hypotheses of disease-locus heterogeneity, including models where up to half of our families could contain an EOP disease gene located in the HLA region. This is due to the limited power of even our relatively large collection of families and the inherent difficulties of mapping genes for disorders that have complex and heterogeneous etiologies. Additional statistical analyses, recruitment of families, and typing of flanking DNA markers are planned to more conclusively address these issues with respect to the HLA region and other candidate locations in the human genome. Additional results for markers covering most of the human genome will also be presented.

  4. Monitoring and prediction of natural disasters

    International Nuclear Information System (INIS)

    Kondratyev, K. Ya; Krapivin, V. F.

    2004-01-01

    The problems of predicting natural disasters, and of synthesizing environmental monitoring systems to collect, store, and process the relevant information, are analysed. A three-level methodology is proposed for making decisions concerning natural disaster dynamics. The methodology is based on the assessment of environmental indicators and the use of numerical models of the environment

  5. Thalamic functional connectivity predicts seizure laterality in individual TLE patients: Application of a biomarker development strategy

    Directory of Open Access Journals (Sweden)

    Daniel S. Barron

    2015-01-01

    No significant differences in functional connection strength between patient and control groups were observed with Mann-Whitney tests (corrected for multiple comparisons). Notwithstanding the lack of group differences, individual patient difference scores (from the control mean connection strength) successfully predicted the seizure onset zone, as shown in ROC curves: discriminant analysis (two-dimensional) predicted the seizure onset zone with 85% sensitivity and 91% specificity; logistic regression (four-dimensional) achieved 86% sensitivity and 100% specificity. The strongest markers in both analyses were left thalamo-hippocampal and right thalamo-entorhinal cortex functional connection strength. Thus, this study shows that thalamic functional connections are sensitive and specific markers of seizure onset laterality in individual temporal lobe epilepsy patients. This study also advances an overall strategy for the programmatic development of neuroimaging biomarkers in clinical and genetic populations: a disease model informed by coordinate-based meta-analysis was used to anatomically constrain individual patient analyses.

  6. Prediction of Summer Extreme Precipitation over the Middle and Lower Reaches of the Yangtze River Basin

    Science.gov (United States)

    Liu, L.; Ning, L.; Liu, J.; Yan, M.; Sun, W.

    2017-12-01

    Abstract: Summer extreme precipitation (SEP) often causes severe landslides, debris flows and floods over the middle and lower reaches of the Yangtze River Basin (MLYRB), so skillful prediction of SEP is critical to future climate adaptation and mitigation. In this work, the characteristic region over the MLYRB (27°N-32°N, 108°E-118°E) is defined by the spatial mode of the rotated empirical orthogonal functions (REOF) of SEP over China from 1961 to 2014. A physics-based empirical model (PEM) for SEP prediction is built with two preceding predictors that have significant physical influences on SEP over the MLYRB. The first predictor is the spring sea surface temperature (SST) over the northern Indian Ocean (20°S-20°N, 50°E-95°E), and the second is the spring sea level pressure (SLP) over the Aleutian Islands (50°N-70°N, 160°E-160°W). Analyses of the physical mechanism show that when the spring SST over the northern Indian Ocean is higher, the South Asian High (SAH) extends to the east and the western Pacific subtropical high (WPSH) extends to the west; the generated secondary circulation therefore induces anomalous upward motion and more water vapor transport to the MLYRB, resulting in more SEP. Meanwhile, when the spring SLP over the Aleutian Islands is lower, the WPSH also extends to the west, which leads to a negative omega anomaly centered on the MLYRB and more water vapor transport to the MLYRB, resulting in more SEP. The regression model is built using data from a training period from 1961 to 1999, with a correlation coefficient skill of 0.57 (p<0.01) for prediction of SEP in 1961-1999. The independent forecast of the PEM shows that it is skillful in SEP prediction, with a correlation coefficient between observed and model-simulated SEP over the validation period 2000-2014 of 0.51 (p<0.05). This finding shows that the preceding spring SST and SLP can provide useful information for prediction of SEP, and the methodology

  7. Predictive Manufacturing: A Classification Strategy to Predict Product Failures

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Kulahci, Murat

    2018-01-01

    manufacturing analytics model that employs a big data approach to predicting product failures; third, we illustrate the issue of high dimensionality, along with statistically redundant information; and, finally, our proposed method will be compared against the well-known classification methods (SVM, K-nearest neighbor, artificial neural networks). The results from real data show that our predictive manufacturing analytics approach, using genetic algorithms and Voronoi tessellations, is capable of predicting product failure with reasonable accuracy. The potential application of this method contributes ... to accurately predicting product failures, which would enable manufacturers to reduce production costs without compromising product quality. ...

  8. Do needs for security and certainty predict cultural and economic conservatism? A cross-national analysis

    NARCIS (Netherlands)

    Malka, A.; Soto, C.J.; Inzlicht, M.; Lelkes, Y.

    2014-01-01

    We examine whether individual differences in needs for security and certainty predict conservative (vs. liberal) position on both cultural and economic political issues and whether these effects are conditional on nation-level characteristics and individual-level political engagement. Analyses with

  9. Flow-dependent empirical singular vector with an ensemble Kalman filter data assimilation for El Nino prediction

    Energy Technology Data Exchange (ETDEWEB)

    Ham, Yoo-Geun [NASA/GSFC Code 610.1, Global Modeling and Assimilation Office, Greenbelt, MD (United States); Universities Space Research Association, Goddard Earth Sciences Technology and Research Studies and Investigations, Baltimore, MD (United States); Rienecker, Michele M. [NASA/GSFC Code 610.1, Global Modeling and Assimilation Office, Greenbelt, MD (United States)

    2012-10-15

    In this study, a new approach for extracting flow-dependent empirical singular vectors (FESVs) for seasonal prediction using ensemble perturbations obtained from an ensemble Kalman filter (EnKF) assimilation is presented. Due to the short interval between analyses, EnKF perturbations primarily contain instabilities related to fast weather variability. To isolate slower, coupled instabilities that would be more suitable for seasonal prediction, an empirical linear operator for seasonal time-scales (i.e. several months) is formulated using a causality hypothesis; then, the most unstable mode from the linear operator is extracted for seasonal time-scales. It is shown that the flow-dependent operator represents nonlinear integration results better than a conventional empirical linear operator static in time. Through 20 years of retrospective seasonal predictions, it is shown that the skill of forecasting equatorial SST anomalies using the FESV is systematically improved over that using Conventional ESV (CESV). For example, the correlation skill of the NINO3 SST index using FESV is higher, by about 0.1, than that of CESV at 8-month leads. In addition, the forecast skill improvement is significant over the locations where the correlation skill of conventional methods is relatively low, indicating that the FESV is effective where the initial uncertainty is large. (orig.)

  10. Combining a weed traits database with a population dynamics model predicts shifts in weed communities.

    Science.gov (United States)

    Storkey, J; Holst, N; Bøjer, O Q; Bigongiali, F; Bocci, G; Colbach, N; Dorner, Z; Riemens, M M; Sartorato, I; Sønderskov, M; Verschwele, A

    2015-04-01

    A functional approach to predicting shifts in weed floras in response to management or environmental change requires combining data on weed traits with analytical frameworks that capture the filtering effect of selection pressures on those traits. A weed traits database (WTDB) was designed, populated and analysed, initially using data for 19 common European weeds, to begin to consolidate trait data in a single repository. The initial choice of traits was driven by the requirements of empirical models of weed population dynamics, to identify correlations between traits and model parameters. These relationships were used to build a generic model, operating at the level of functional traits, to simulate the impact of increasing herbicide and fertiliser use on virtual weeds along gradients of seed weight and maximum height. The model generated 'fitness contours' (defined as population growth rates) within this trait space under different scenarios, onto which two sets of weed species, defined as common or declining in the UK, were mapped. The effect of increasing inputs on the weed flora was successfully simulated; 77% of common species were predicted to have stable or increasing populations under high fertiliser and herbicide use, in contrast with only 29% of the species that have declined. Future development of the WTDB will aim to increase the number of species covered, incorporate a wider range of traits and analyse intraspecific variability under contrasting management and environments.
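    The 'fitness contour' idea above can be illustrated with a toy growth-rate function over the seed-weight x height trait space. The functional form and every coefficient below are invented for illustration (herbicide survival improving with seed weight, fertiliser favouring taller plants); this is not the WTDB model.

```python
# Toy trait-based fitness surface. All coefficients are hypothetical and
# chosen only to reproduce the qualitative pattern described above:
# under high inputs, large-seeded tall species keep growth rates > 1
# (stable/increasing) while small-seeded short species fall below 1.

def growth_rate(seed_mg, height_cm, herbicide=0.0, fertiliser=0.0):
    """Annual population growth rate; > 1 means an increasing population."""
    base = 1.2
    # Herbicide mortality is softened for heavier seeds (more reserves).
    survival = 1.0 - herbicide * (1.0 - 0.5 * min(seed_mg / 10.0, 1.0))
    # Fertiliser intensifies competition for light, favouring tall plants.
    competition = 1.0 + fertiliser * (height_cm / 200.0 - 0.5)
    return base * survival * competition
```

    With high inputs (herbicide = 0.5, fertiliser = 0.8), a tall large-seeded virtual weed (8 mg, 180 cm) keeps a growth rate above 1 while a short small-seeded one (1 mg, 40 cm) declines, mirroring the 77% vs. 29% contrast reported above.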

  11. On-line prediction of BWR transients in support of plant operation and safety analyses

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Lekach, S.V.; Mallen, A.N.

    1983-01-01

    A combination of advanced modeling techniques and modern, special-purpose peripheral minicomputer technology is presented which affords realistic predictions of plant transient and severe off-normal events in LWR power plants through on-line simulations at a speed ten times greater than actual process speeds. Results are shown for a BWR plant simulation. The mathematical models account for nonequilibrium, nonhomogeneous two-phase flow effects in the coolant, for acoustical effects in the steam line and for the dynamics of the recirculation loop and feed-water train. Point kinetics incorporate reactivity feedback for void fraction, for fuel temperature, and for coolant temperature. Control systems and trip logic are simulated for the nuclear steam supply system

  12. ATWS analyses for Krsko Full Scope Simulator verification

    Energy Technology Data Exchange (ETDEWEB)

    Cerne, G; Tiselj, I; Parzer, I [Reactor Engineering Div., Inst. Jozef Stefan, Ljubljana (Slovenia)

    2000-07-01

    The purpose of this analysis was to simulate Anticipated Transient without Scram transient for Krsko NPP. The results of these calculations were used for verification of reactor coolant system thermal-hydraulic response predicted by Krsko Full Scope Simulator. For the thermal-hydraulic analyses the RELAP5/MOD2 code and the input card deck for NPP Krsko was used. The analyses for ATWS were performed to assess the influence and benefit of ATWS Mitigation System Actuation Circuitry (AMSAC). In the presented paper the most severe ATWS scenarios have been analyzed, starting with the loss of Main Feedwater at both steam generators. Thus, gradual loss of secondary heat sink occurred. On top of that, control rods were not supposed to scram, leaving the chain reaction to be controlled only by inherent physical properties of the fuel and moderator and eventual actions of the BOP system. The primary system response has been studied regarding the AMSAC availability. (author)

  13. K-West and K-East basin thermal analyses for dry conditions

    International Nuclear Information System (INIS)

    Beaver, T.R.; Cramer, E.R.; Hinman, C.A.

    1994-01-01

    Detailed three-dimensional thermal analyses of the 100 K East and 100 K West basins were conducted to determine the peak fuel temperature for intact fuel in the event of a complete loss of water from the basins. Thermal models for the building, for an array of fuel encapsulation canisters on the basin floor, and for the fuel within a single canister are described, along with conservative predictions of the maximum expected temperatures for the loss-of-water event.
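The essence of such a loss-of-water calculation can be shown with a lumped-capacitance heat-up sketch: decay heat warms a canister while it loses heat to the building air, approaching a steady-state temperature. Every value below (decay power, heat capacity, loss coefficient) is invented for illustration and bears no relation to the actual K Basin analyses, which used detailed 3-D models.

```python
# Back-of-the-envelope lumped heat-up after loss of basin water.
# All parameters are hypothetical placeholders.

Q_DECAY = 40.0      # decay heat per canister (W), hypothetical
M_CP = 5.0e4        # lumped heat capacity of fuel + canister (J/K)
H_A = 0.9           # effective loss coefficient to ambient (W/K)
T_AMBIENT = 30.0    # building air temperature (C)

t, dt = 0.0, 60.0   # march in 1-minute steps
temp = T_AMBIENT
# Integrate until net heat input is nearly balanced by losses
while Q_DECAY - H_A * (temp - T_AMBIENT) > 0.5:
    temp += dt * (Q_DECAY - H_A * (temp - T_AMBIENT)) / M_CP
    t += dt

print(f"near-equilibrium temperature: {temp:.1f} C after {t/3600:.1f} h")
```

The analytic steady state is T_ambient + Q/hA (here about 74 C); the real analyses resolve spatial gradients through the canister array and building that a lumped model cannot capture.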

  14. SN transport analyses of critical reactor experiments for the SNTP program

    International Nuclear Information System (INIS)

    Mays, C.W.

    1993-01-01

    The capability of SN transport methodology to accurately predict the neutronics behavior of a compact, light water-moderated reactor is examined. This includes examining the effects of cross-section modeling and the choice of spatial and angular representation. The isothermal temperature coefficient in the range of 293 K to 355 K is analyzed, as well as the radial fission density profile across the central fuel element. Measured data from a series of critical experiments are used for these analyses.

  15. Understanding predictability and exploration in human mobility

    DEFF Research Database (Denmark)

    Cuttone, Andrea; Jørgensen, Sune Lehmann; González, Marta C.

    2018-01-01

    Predictive models for human mobility have important applications in many fields, including traffic control, ubiquitous computing, and contextual advertisement. The predictive performance of models in the literature varies quite broadly, from over 90% to under 40%. In this work we study which underlying … strong influence on the accuracy of prediction. Finally we reveal that the exploration of new locations is an important factor in human mobility, and we measure that on average 20-25% of transitions are to new places, and approx. 70% of locations are visited only once. We discuss how these mechanisms … are important factors limiting our ability to predict human mobility.
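The two exploration statistics reported in the abstract are simple to compute from a sequence of visited locations: the share of transitions that lead to a never-before-seen place, and the share of locations visited exactly once. The toy trajectory below is invented for illustration, so its numbers will not match the paper's 20-25% and 70% figures.

```python
# Exploration statistics for a location trajectory.
from collections import Counter

def exploration_stats(trajectory):
    """Return (share of transitions to new places,
               share of locations visited exactly once)."""
    seen = {trajectory[0]}
    new_transitions = 0
    for loc in trajectory[1:]:
        if loc not in seen:        # transition into a never-seen location
            new_transitions += 1
            seen.add(loc)
    counts = Counter(trajectory)
    once = sum(1 for n in counts.values() if n == 1)
    return new_transitions / (len(trajectory) - 1), once / len(counts)

# Invented example trajectory
trajectory = ["home", "work", "cafe", "work", "home", "gym", "home", "park"]
new_share, once_share = exploration_stats(trajectory)
print(f"transitions to new places: {new_share:.0%}")
print(f"locations visited once:    {once_share:.0%}")
```

On real mobility traces these quantities bound achievable prediction accuracy: a transition to a genuinely new place cannot be predicted by any history-based model.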

  16. Can the theory of planned behaviour predict the physical activity behaviour of individuals?

    Science.gov (United States)

    Hobbs, Nicola; Dixon, Diane; Johnston, Marie; Howie, Kate

    2013-01-01

    The theory of planned behaviour (TPB) can identify cognitions that predict differences in behaviour between individuals. However, it is not clear whether the TPB can predict the behaviour of an individual person. This study employs a series of n-of-1 studies and time series analyses to examine the ability of the TPB to predict physical activity (PA) behaviours of six individuals. Six n-of-1 studies were conducted, in which TPB cognitions and up to three PA behaviours (walking, gym workout and a personally defined PA) were measured twice daily for six weeks. Walking was measured by pedometer step count, gym attendance by self-report with objective validation of gym entry and the personally defined PA behaviour by self-report. Intra-individual variability in TPB cognitions and PA behaviour was observed in all participants. The TPB showed variable predictive utility within individuals and across behaviours. The TPB predicted at least one PA behaviour for five participants but had no predictive utility for one participant. Thus, n-of-1 designs and time series analyses can be used to test theory in an individual.

  17. Development of a regional ensemble prediction method for probabilistic weather prediction

    International Nuclear Information System (INIS)

    Nohara, Daisuke; Tamura, Hidetoshi; Hirakuchi, Hiromaru

    2015-01-01

    A regional ensemble prediction method has been developed to provide probabilistic weather prediction using a numerical weather prediction model. To obtain perturbations consistent with the synoptic weather pattern, both initial and lateral boundary perturbations were given by differences between the control and an ensemble member of the Japan Meteorological Agency (JMA)'s operational one-week ensemble forecast. The method provides multiple ensemble members with a horizontal resolution of 15 km for 48 hours, based on a downscaling of the JMA's operational global forecast accompanied by the perturbations. The ensemble prediction was examined for the heavy snowfall event in the Kanto area on January 14, 2013. The results showed that the predictions represent different features of the high-resolution spatiotemporal distribution of precipitation, affected by the intensity and location of the extra-tropical cyclone in each ensemble member. Although the ensemble prediction has model biases in the means and variances of some variables, such as wind speed and solar radiation, it has the potential to add probabilistic information to a deterministic prediction. (author)
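The perturbation recipe described here reduces to simple field arithmetic: each regional member's initial (or boundary) state is the analysis plus the difference between a global ensemble member and the global control. The sketch below flattens the fields to plain lists of grid-point values; the variable names and numbers are illustrative only.

```python
# Member initial condition = analysis + (global member - global control),
# applied element-wise over grid points. Values are invented.

def perturbed_state(analysis, global_member, global_control):
    """Add the member-minus-control perturbation to the regional analysis."""
    return [a + (m - c)
            for a, m, c in zip(analysis, global_member, global_control)]

analysis       = [285.0, 286.2, 287.1]   # regional analysis (e.g. T in K)
global_control = [284.8, 286.0, 287.4]   # global control at same points
global_member  = [285.3, 285.6, 287.9]   # one global ensemble member

member_ic = perturbed_state(analysis, global_member, global_control)
print(member_ic)
```

Because the perturbation is taken from a synoptic-scale ensemble, it inherits structures consistent with the large-scale weather pattern rather than being uncorrelated noise, which is the stated motivation for this choice.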

  18. LoopIng: a template-based tool for predicting the structure of protein loops.

    KAUST Repository

    Messih, Mario Abdel

    2015-08-06

    Predicting the structure of protein loops is very challenging, mainly because they are not necessarily subject to strong evolutionary pressure. This implies that, unlike the rest of the protein, standard homology modeling techniques are not very effective in modeling their structure. However, loops are often involved in protein function, hence inferring their structure is important for predicting protein structure as well as function. We describe a method, LoopIng, based on the Random Forest automated learning technique, which, given a target loop, selects a structural template for it from a database of loop candidates. Compared to the most recently available methods, LoopIng is able to achieve similar accuracy for short loops (4-10 residues) and significant enhancements for long loops (11-20 residues). The quality of the predictions is robust to errors that unavoidably affect the stem regions when these are modeled. The method returns a confidence score for the predicted template loops and has the advantage of being very fast (on average: 1 min/loop). Availability: www.biocomputing.it/looping. Contact: anna.tramontano@uniroma1.it. Supplementary data are available at Bioinformatics online.
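The template-selection step has a simple schematic shape: score every candidate loop in a database against the target's features and return the best-scoring one. LoopIng does the scoring with a trained Random Forest; the sketch below substitutes a plain feature-distance score so it stays self-contained, and all identifiers and feature values are invented.

```python
# Schematic template selection for a target loop. A Euclidean feature
# distance stands in for LoopIng's Random Forest score; lower is better.
import math

def score(target_features, template_features):
    """Stand-in scoring function (real method: Random Forest output)."""
    return math.dist(target_features, template_features)

def select_template(target, candidates):
    """Return (best template id, its score) from a candidate database."""
    best_id, best_feats = min(candidates.items(),
                              key=lambda kv: score(target, kv[1]))
    return best_id, score(target, best_feats)

# Invented features: (loop length, stem-stem distance in A, net charge)
target = (12, 9.8, -1)
candidates = {
    "1abcA_loop3": (12, 10.1, -1),
    "2xyzB_loop7": (8, 6.4, 0),
    "3pqrC_loop2": (13, 9.9, -2),
}
tid, s = select_template(target, candidates)
print(tid, round(s, 2))
```

A learned scorer improves on raw distance by weighting features according to how well they predict structural similarity, and its output doubles as the confidence score the abstract mentions.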

  19. A quick reality check for microRNA target prediction.

    Science.gov (United States)

    Kast, Juergen

    2011-04-01

    The regulation of protein abundance by microRNA (miRNA)-mediated repression of mRNA translation is a rapidly growing area of interest in biochemical research. In animal cells, the miRNA seed sequence does not perfectly match that of the mRNA it targets, resulting in a large number of possible miRNA targets and varied extents of repression. Several software tools are available for the prediction of miRNA targets, yet the overlap between them is limited. Jovanovic et al. have developed and applied a targeted, quantitative approach to validate predicted miRNA target proteins. Using a proteome database, they have set up and tested selected reaction monitoring assays for approximately 20% of more than 800 predicted let-7 targets, as well as control genes in Caenorhabditis elegans. Their results demonstrate that such assays can be developed quickly and with relative ease, and applied in a high-throughput setup to verify known and identify novel miRNA targets. They also show, however, that the choice of the biological system and material has a noticeable influence on the frequency, extent and direction of the observed changes. Nonetheless, selected reaction monitoring assays, such as those developed by Jovanovic et al., represent an attractive new tool in the study of miRNA function at the organism level.

  20. A case against bio-markers as they are currently used in radioecological risk analyses: a problem of linkage

    International Nuclear Information System (INIS)

    Hinton, T.G.; Brechignac, F.

    2005-01-01

    Bio-markers are successfully used in human risk analyses as early indicators of contaminant exposure and predictors of deleterious effects. This has boosted the search for bio-markers in determining ecological risks to non-human biota, and particularly for assessments related to radioactive contaminants. There are difficulties, however, that prevent an easy transfer of the bio-marker concept from humans to non-human biota, as there are significant differences in endpoints of concern, units of observation and dose response relationships between human and ecological risk analyses. The use of bio-markers in ecological risk analyses currently lacks a linkage between molecular-level effects and quantifiable impacts observed in individuals and populations. This is important because ecological risk analyses generally target the population level of biological organisation. We highlight various examples that demonstrate the difficulties of linking individual responses to population-level impacts, such as indirect effects and compensatory interactions. Eco-toxicologists cope with such difficulties through the use of uncertainty or extrapolation factors. Extrapolation factors (EF) typically range from 1 to 1000 when linking effects observed in individuals to those predicted to occur in populations. We question what magnitude of EF will be required when going from a molecular level effect, measured by a bio-marker, all the way up to the population level of biological organisation. Particularly, we stress that a successful application of bio-markers to radioecological risk assessment can only be achieved once the connection has been made between changes in individual resource allocation-based life histories and population dynamics. This clearly emphasises the need to quantify the propagation of molecular and cellular level effects to higher levels of biological organisation, especially in the long-term via several generations of exposure. 
Finally, we identify pertinent research