WorldWideScience

Sample records for analyses predict a20

  1. Analysing News for Stock Market Prediction

    Science.gov (United States)

    Ramalingam, V. V.; Pandian, A.; Dwivedi, shivam; Bhatt, Jigar P.

    2018-04-01

    The stock market is the aggregation of all buyers and sellers of stocks, which represent ownership claims on businesses. To invest in these stocks with confidence, sound knowledge of the stocks themselves and of their present and future pricing is essential. Large amounts of data, such as news items and public opinion in general, are collected and parsed to obtain this information about fluctuations in the stock market. Recently, many methods, particularly methods for large unstructured data, have been used to predict stock market values. We introduce another method, focused on deriving the best statistical learning model for predicting future values. The data set used is a very large collection of unstructured data gathered from an online social platform known as Quindl. The data from this platform are then linked to a CSV file and cleaned to obtain the information needed for stock market prediction. The method applies natural language processing (NLP) to the data to make it easier for the system to interpret, and then identifies the correlations between this data and stock market fluctuations. The model is implemented in Python throughout the project for flexibility and convenience.
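    The record above names the pipeline (news text, a CSV link, NLP features, a statistical learning model, Python) but gives no implementation detail. Below is a minimal, hypothetical sketch of one way such a text-to-price model could look; the file names, column names, and the choice of TF-IDF features with ridge regression are illustrative assumptions, not the authors' method.

    ```python
    # Hypothetical sketch: predict the next day's closing price from daily news
    # text with TF-IDF features and ridge regression. File/column names are assumed.
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    news = pd.read_csv("news.csv")        # assumed columns: date, headline
    prices = pd.read_csv("prices.csv")    # assumed columns: date, close
    data = news.merge(prices, on="date").sort_values("date")
    data["next_close"] = data["close"].shift(-1)
    data = data.dropna()

    X = TfidfVectorizer(max_features=5000, stop_words="english").fit_transform(data["headline"])
    y = data["next_close"].values

    # Keep chronological order when splitting so the test set is strictly "future" data
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)
    model = Ridge(alpha=1.0).fit(X_tr, y_tr)
    print("out-of-sample R^2:", r2_score(y_te, model.predict(X_te)))
    ```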

  2. Neonatal Sleep-Wake Analyses Predict 18-month Neurodevelopmental Outcomes.

    Science.gov (United States)

    Shellhaas, Renée A; Burns, Joseph W; Hassan, Fauziya; Carlson, Martha D; Barks, John D E; Chervin, Ronald D

    2017-11-01

    The neurological examination of critically ill neonates is largely limited to reflexive behavior. The exam often ignores sleep-wake physiology that may reflect brain integrity and influence long-term outcomes. We assessed whether polysomnography and concurrent cerebral near-infrared spectroscopy (NIRS) might improve prediction of 18-month neurodevelopmental outcomes. Term newborns with suspected seizures underwent standardized neurologic examinations to generate Thompson scores and had 12-hour bedside polysomnography with concurrent cerebral NIRS. For each infant, the distribution of sleep-wake stages and electroencephalogram delta power were computed. NIRS-derived fractional tissue oxygen extraction (FTOE) was calculated across sleep-wake stages. At age 18-22 months, surviving participants were evaluated with the Bayley Scales of Infant Development, 3rd edition (Bayley-III). Twenty-nine participants completed the Bayley-III. Increased newborn time in quiet sleep predicted worse 18-month cognitive and motor scores (robust regression models, adjusted r2 = 0.22, p = .007, and 0.27, .004, respectively). Decreased 0.5-2 Hz electroencephalogram (EEG) power during quiet sleep predicted worse 18-month language and motor scores (adjusted r2 = 0.25, p = .0005, and 0.33, .001, respectively). Predictive values remained significant after adjustment for neonatal Thompson scores or exposure to phenobarbital. Similarly, an attenuated difference in FTOE between neonatal wakefulness and quiet sleep predicted worse 18-month cognitive, language, and motor scores in adjusted analyses. Disturbed neonatal sleep, as quantified by increased time in quiet sleep, lower EEG delta power during that stage, and muted differences in FTOE between quiet sleep and wakefulness, may improve prediction of adverse long-term outcomes for newborns with neurological dysfunction. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved.
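    For readers unfamiliar with the analysis named above, the following is a hedged sketch of a robust regression of a Bayley-III motor score on neonatal time in quiet sleep, adjusted for the Thompson score. The data file, column names, and the Huber M-estimator are assumptions for illustration; the published models may differ.

    ```python
    # Sketch only: robust regression of an outcome score on a neonatal sleep measure,
    # adjusted for a clinical exam score. Variable names and data file are hypothetical.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("neonatal_sleep.csv")   # assumed columns: quiet_sleep_pct, thompson, bayley_motor
    X = sm.add_constant(df[["quiet_sleep_pct", "thompson"]])
    robust_fit = sm.RLM(df["bayley_motor"], X, M=sm.robust.norms.HuberT()).fit()
    print(robust_fit.summary())
    ```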

  3. Climate Prediction Center (CPC) US daily temperature analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The U.S. daily temperature analyses are maps depicting various temperature quantities utilizing daily maximum and minimum temperature data across the US. Maps are...

  4. Analysing Twitter and web queries for flu trend prediction.

    Science.gov (United States)

    Santos, José Carlos; Matos, Sérgio

    2014-05-07

    Social media platforms encourage people to share diverse aspects of their daily life. Among these, shared health-related information might be used to infer health status and incidence rates for specific conditions or symptoms. In this work, we present an infodemiology study that evaluates the use of Twitter messages and search engine query logs to estimate and predict the incidence rate of influenza-like illness in Portugal. Based on a manually classified dataset of 2704 tweets from Portugal, we selected a set of 650 textual features to train a Naïve Bayes classifier to identify tweets mentioning flu or flu-like illness or symptoms. We obtained a precision of 0.78 and an F-measure of 0.83, based on cross-validation over the complete annotated set. Furthermore, we trained a multiple linear regression model to estimate the health-monitoring data from the Influenzanet project, using as predictors the relative frequencies obtained from the tweet classification results and from query logs, and achieved a correlation ratio of 0.89. Previous studies exploring user-generated content have mostly focused on the English language. Our results further validate those studies and show that by changing the initial steps of data preprocessing and feature extraction and selection, the proposed approaches can be adapted to other languages. Additionally, we investigated whether the predictive model created can be applied to data from the subsequent flu season. In this case, although the prediction result was good, an initial phase to adapt the regression model could be necessary to achieve more robust results.
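    The two modelling steps described above (a Naïve Bayes tweet classifier followed by a linear regression against surveillance data) can be sketched as follows. The file names, column names, and bag-of-words features are assumptions; the original study used Portuguese-specific preprocessing and 650 selected textual features.

    ```python
    # Illustrative two-stage pipeline: classify flu-related tweets, then regress
    # weekly ILI incidence on the weekly frequency of flu tweets. Data files are assumed.
    import pandas as pd
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.model_selection import cross_val_score
    from sklearn.linear_model import LinearRegression

    tweets = pd.read_csv("tweets_pt.csv")     # assumed columns: text, flu_label (0/1), week
    X = CountVectorizer(max_features=650).fit_transform(tweets["text"])
    clf = MultinomialNB()
    print("cross-validated F1:", cross_val_score(clf, X, tweets["flu_label"], cv=10, scoring="f1").mean())

    # Stage 2: weekly relative frequency of flu tweets as predictor of ILI incidence
    clf.fit(X, tweets["flu_label"])
    tweets["pred"] = clf.predict(X)
    weekly = tweets.groupby("week")["pred"].mean().to_frame("tweet_freq")
    ili = pd.read_csv("ili_weekly.csv", index_col="week")   # assumed column: incidence
    joined = weekly.join(ili).dropna()
    reg = LinearRegression().fit(joined[["tweet_freq"]], joined["incidence"])
    print("correlation (R):", reg.score(joined[["tweet_freq"]], joined["incidence"]) ** 0.5)
    ```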

  5. Comparative and Predictive Multimedia Assessments Using Monte Carlo Uncertainty Analyses

    Science.gov (United States)

    Whelan, G.

    2002-05-01

    Multiple-pathway frameworks (sometimes referred to as multimedia models) provide a platform for combining medium-specific environmental models and databases, such that they can be utilized in a more holistic assessment of contaminant fate and transport in the environment. These frameworks provide a relatively seamless transfer of information from one model to the next and from databases to models. Within these frameworks, multiple models are linked, resulting in models that consume information from upstream models and produce information to be consumed by downstream models. The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) is an example, which allows users to link their models to other models and databases. FRAMES is an icon-driven, site-layout platform and an open-architecture, object-oriented system that interacts with environmental databases; helps the user construct a real-world-based Conceptual Site Model; allows the user to choose the most appropriate models to solve simulation requirements; solves the standard risk paradigm of release, transport and fate, and exposure/risk assessment for people and ecology; and provides graphical packages for analyzing results. FRAMES is specifically designed to allow users to link their own models into a system that contains models developed by others. This paper will present the use of FRAMES to evaluate potential human health exposures using real site data and realistic assumptions from sources, through the vadose and saturated zones, to exposure and risk assessment at three real-world sites, using the Multimedia Environmental Pollutant Assessment System (MEPAS), a multimedia model contained within FRAMES. These real-world examples use predictive and comparative approaches coupled with a Monte Carlo analysis. A predictive analysis is where models are calibrated to monitored site data, prior to the assessment, and a comparative analysis is where models are not calibrated but

  6. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.; Mai, Paul Martin; Thingbaijam, Kiran Kumar; Razafindrakoto, H. N. T.; Genton, Marc G.

    2014-01-01

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.
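    To make the test's ingredients concrete, the toy example below builds two synthetic candidate slip models around a synthetic reference, computes a pointwise squared-error loss field for each, and tests whether the mean loss differs. This is only a simplified illustration: the actual SPCT evaluates the loss differential field while accounting for its spatial correlation, which the naive paired test here ignores.

    ```python
    # Toy illustration of SPCT ingredients: loss fields of two slip models against a
    # reference, and a naive test on the loss differential (spatial correlation ignored).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    reference = rng.gamma(2.0, 1.0, size=(50, 100))          # synthetic "true" slip (m)
    model_a = reference + rng.normal(0.0, 0.3, reference.shape)
    model_b = reference + rng.normal(0.0, 0.6, reference.shape)

    loss_a = (model_a - reference) ** 2                       # squared-error loss field
    loss_b = (model_b - reference) ** 2
    diff = (loss_a - loss_b).ravel()                          # loss differential field

    t_stat, p_value = stats.ttest_1samp(diff, 0.0)
    print(f"mean loss A={loss_a.mean():.3f}, B={loss_b.mean():.3f}, p={p_value:.3g}")
    ```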

  7. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.

    2014-11-10

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.

  8. A Wear Rule and Cutter Life Prediction Model of a 20-in. TBM Cutter for Granite: A Case Study of a Water Conveyance Tunnel in China

    Science.gov (United States)

    Liu, Quansheng; Liu, Jianping; Pan, Yucong; Zhang, Xiaoping; Peng, Xingxin; Gong, Qiuming; Du, Lijie

    2017-05-01

    Disc cutter wear is one of the comprehensive results of the rock-machine interaction in tunnel boring machine (TBM) tunneling. The replacement of a disc cutter is a time-consuming and costly activity that can significantly reduce the TBM utilization (U) and advance rate (AR), and it has a major effect on the total time and cost of TBM tunneling projects. Therefore, the importance of predicting the cutter life accurately can never be overemphasized. Most cutter wear prediction models are only suitable for 17-in. or smaller disc cutters. However, the use of large-diameter disc cutters has been an irresistible trend for large-section hard rock TBMs. This study attempts to reveal the genuine wear rule of a 20-in. disc cutter and develop a new empirical model for predicting the cutter life in granite, based on field data collected from a water conveyance tunnel constructed by the TBM tunneling method in China. The field data, including the actual cutter wear and the geological parameters along the studied tunnel, were compiled in a special database that was subjected to statistical analysis to reveal the genuine wear rule of a 20-in. disc cutter and develop reasonable correlations between some common intact rock parameters and the disc cutter life. These equations were developed based on data from massive to very massive granite with a UCS range of 40-100 MPa, and they can be applied for the assessment of the cutter life of a 20-in. disc cutter in similar hard rock projects with similar rock strengths and rock abrasivities.
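    As a rough illustration of how such an empirical cutter-life correlation can be derived from field data, the sketch below fits a power-law relation between UCS and cutter life. The numbers are synthetic placeholders and the functional form is an assumption; the paper's actual equations and units are not reproduced here.

    ```python
    # Hedged sketch: fit a power-law cutter-life correlation, life = a * UCS^b,
    # to synthetic field observations. Units and values are illustrative only.
    import numpy as np
    from scipy.optimize import curve_fit

    ucs = np.array([45.0, 55.0, 62.0, 70.0, 78.0, 85.0, 92.0, 98.0])                 # MPa
    cutter_life = np.array([820.0, 700.0, 640.0, 560.0, 500.0, 450.0, 410.0, 380.0])  # arbitrary life units

    def power_law(x, a, b):
        return a * x ** b

    (a, b), _ = curve_fit(power_law, ucs, cutter_life, p0=(1e4, -1.0))
    print(f"cutter_life ~ {a:.0f} * UCS^{b:.2f}")
    print("predicted life at UCS = 80 MPa:", power_law(80.0, a, b))
    ```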

  9. On the use of uncertainty analyses to test hypotheses regarding deterministic model predictions of environmental processes

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bittner, E.A.; Essington, E.H.

    1995-01-01

    This paper illustrates the use of Monte Carlo parameter uncertainty and sensitivity analyses to test hypotheses regarding predictions of deterministic models of environmental transport, dose, risk and other phenomena. The methodology is illustrated by testing whether 238Pu is transferred more readily than 239+240Pu from the gastrointestinal (GI) tract of cattle to their tissues (muscle, liver and blood). This illustration is based on a study wherein beef cattle grazed for up to 1064 days on a fenced plutonium (Pu)-contaminated arid site in Area 13 near the Nevada Test Site in the United States. Periodically, cattle were sacrificed and their tissues analyzed for Pu and other radionuclides. Conditional sensitivity analyses of the model predictions were also conducted. These analyses indicated that Pu cattle tissue concentrations had the largest impact of any model parameter on the pdf of predicted Pu fractional transfers. Issues that arise in conducting uncertainty and sensitivity analyses of deterministic models are discussed. (author)
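    The hypothesis-testing idea can be illustrated with a toy Monte Carlo propagation: sample uncertain parameters, run them through a deterministic transfer model, and compare the resulting distributions of predicted fractional transfer for the two isotopes. The model form and the parameter distributions below are invented for illustration and bear no relation to the study's actual values.

    ```python
    # Toy Monte Carlo parameter-uncertainty analysis for a deterministic transfer model.
    # All distributions and the model itself are hypothetical.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    def fractional_transfer(intake_fraction, gut_uptake, retention):
        # hypothetical deterministic model: fraction of ingested Pu reaching tissue
        return intake_fraction * gut_uptake * retention

    def sample(median_uptake):
        return fractional_transfer(
            intake_fraction=rng.uniform(0.5, 1.0, n),
            gut_uptake=rng.lognormal(np.log(median_uptake), 0.5, n),
            retention=rng.beta(2, 5, n),
        )

    ft_238 = sample(1e-4)   # assumed (not measured) higher GI uptake for 238Pu
    ft_239 = sample(5e-5)
    # Fraction of trials in which predicted 238Pu transfer exceeds 239+240Pu transfer
    print("P(238Pu transfer > 239+240Pu transfer) =", (ft_238 > ft_239).mean())
    ```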

  10. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma.

    Science.gov (United States)

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-06-01

    The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and the 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI. MR histogram analyses, in particular the 1st percentile for PVP images, held promise for prediction of MVI of HCC.
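    The histogram workflow itself is straightforward to reproduce in outline: extract percentile features from the voxel values inside a lesion region of interest and score a single feature with ROC analysis. The sketch below uses synthetic ADC values and labels purely to show the mechanics; it is not the study's data or code.

    ```python
    # Sketch of histogram feature extraction from an ADC ROI plus ROC scoring.
    # Lesion values and MVI labels below are synthetic.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)

    def histogram_features(adc_roi):
        p1, p10, p50, p90, p99 = np.percentile(adc_roi, [1, 10, 50, 90, 99])
        return {"mean": adc_roi.mean(), "variance": adc_roi.var(),
                "p1": p1, "p10": p10, "p50": p50, "p90": p90, "p99": p99}

    # Synthetic lesions: MVI-positive lesions drawn with slightly lower ADC values
    lesions = [rng.normal(1.05e-3, 2e-4, 500) for _ in range(25)] + \
              [rng.normal(1.20e-3, 2e-4, 500) for _ in range(26)]
    labels = np.array([1] * 25 + [0] * 26)      # 1 = MVI present

    p1_values = np.array([histogram_features(roi)["p1"] for roi in lesions])
    # Lower ADC percentiles go with MVI here, so negate the feature for the AUC
    print("AUC, 1st percentile of ADC:", roc_auc_score(labels, -p1_values))
    ```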

  11. Quantitative Prediction of Coalbed Gas Content Based on Seismic Multiple-Attribute Analyses

    Directory of Open Access Journals (Sweden)

    Renfang Pan

    2015-09-01

    Full Text Available Accurate prediction of gas planar distribution is crucial to selection and development of new CBM exploration areas. Based on seismic attributes, well logging and testing data, we found that seismic absorption attenuation, after eliminating the effects of burial depth, shows an evident correlation with CBM gas content; positive structure curvature has a negative correlation with gas content; and density has a negative correlation with gas content. It is feasible to use the hydrocarbon index (P*G) and pseudo-Poisson ratio attributes for detection of gas enrichment zones. Based on seismic multiple-attribute analyses, a multiple linear regression equation was established between the seismic attributes and gas content at the drilling wells. Application of this equation to the seismic attributes at locations other than the drilling wells yielded a quantitative prediction of planar gas distribution. Prediction calculations were performed for two different models, one using pre-stack inversion and the other disregarding pre-stack inversion. A comparison of the results indicates that both models predicted a similar trend for gas content distribution, except that the model using pre-stack inversion yielded a prediction result with considerably higher precision than the other model.
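    A minimal sketch of the multi-attribute regression step follows (the attributes match those named in the record above); the well values are placeholders, and the real workflow would use many more calibration wells and gridded attribute volumes.

    ```python
    # Sketch: fit gas content at calibration wells from seismic attributes, then apply
    # the fitted equation away from the wells. All numbers are placeholders.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Rows = wells; columns = absorption attenuation, structure curvature, density
    well_attributes = np.array([
        [0.82, -0.10, 1.42],
        [0.65,  0.05, 1.50],
        [0.91, -0.20, 1.38],
        [0.55,  0.12, 1.55],
        [0.74, -0.02, 1.46],
    ])
    gas_content = np.array([18.5, 12.0, 21.3, 9.8, 15.2])    # illustrative m3/t

    reg = LinearRegression().fit(well_attributes, gas_content)
    print("coefficients:", reg.coef_, "intercept:", reg.intercept_)

    # Quantitative planar prediction: apply the equation to gridded attributes (n_cells x 3)
    grid_attributes = np.array([[0.70, 0.00, 1.48], [0.88, -0.15, 1.40]])
    print("predicted gas content:", reg.predict(grid_attributes))
    ```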

  12. Transport and stability analyses supporting disruption prediction in high beta KSTAR plasmas

    Science.gov (United States)

    Ahn, J.-H.; Sabbagh, S. A.; Park, Y. S.; Berkery, J. W.; Jiang, Y.; Riquezes, J.; Lee, H. H.; Terzolo, L.; Scott, S. D.; Wang, Z.; Glasser, A. H.

    2017-10-01

    KSTAR plasmas have reached high stability parameters in dedicated experiments, with normalized beta βN exceeding 4.3 at relatively low plasma internal inductance li (βN/li>6). Transport and stability analyses have begun on these plasmas to best understand a disruption-free path toward the design target of βN = 5 while aiming to maximize the non-inductive fraction of these plasmas. Initial analysis using the TRANSP code indicates that the non-inductive current fraction in these plasmas has exceeded 50 percent. The advent of KSTAR kinetic equilibrium reconstructions now allows more accurate computation of the MHD stability of these plasmas. Attention is placed on code validation of mode stability using the PEST-3 and resistive DCON codes. Initial evaluation of these analyses for disruption prediction is made using the disruption event characterization and forecasting (DECAF) code. The present global mode kinetic stability model in DECAF developed for low aspect ratio plasmas is evaluated to determine modifications required for successful disruption prediction of KSTAR plasmas. Work supported by U.S. DoE under contract DE-SC0016614.

  13. Functional enrichment analyses and construction of functional similarity networks with high confidence function prediction by PFP

    Directory of Open Access Journals (Sweden)

    Kihara Daisuke

    2010-05-01

    Full Text Available Abstract Background A new paradigm of biological investigation takes advantage of technologies that produce large high-throughput datasets, including genome sequences, interactions of proteins, and gene expression. The ability of biologists to analyze and interpret such data relies on functional annotation of the included proteins, but even in highly characterized organisms many proteins can lack the functional evidence necessary to infer their biological relevance. Results Here we have applied high confidence function predictions from our automated prediction system, PFP, to three genome sequences, Escherichia coli, Saccharomyces cerevisiae, and Plasmodium falciparum (malaria). The number of annotated genes is increased by PFP to over 90% for all of the genomes. Using the large coverage of the function annotation, we introduced functional similarity networks which represent the functional space of the proteomes. Four different functional similarity networks are constructed for each proteome, one each by considering similarity in a single Gene Ontology (GO) category, i.e. Biological Process, Cellular Component, and Molecular Function, and another one by considering overall similarity with the funSim score. The functional similarity networks are shown to have higher modularity than the protein-protein interaction network. Moreover, the funSim score network is distinct from the single GO-score networks by showing a higher clustering degree exponent value and thus has a higher tendency to be hierarchical. In addition, examining function assignments to the protein-protein interaction network and local regions of genomes has identified numerous cases where subnetworks or local regions have functionally coherent proteins. These results will help interpret interactions of proteins and gene order in a genome. Several examples of both analyses are highlighted. Conclusion The analyses demonstrate that applying high confidence predictions from PFP

  14. Predictive analyses of flow-induced vibration and fretting wear in steam generator tubes

    International Nuclear Information System (INIS)

    Axisa, F.

    1989-01-01

    Maintaining the service life of PWR steam generators under highly reliable conditions requires a complex design to prevent various damaging processes, including those related to flow-induced vibration. Predictive analyses have to rely on numerical tools to compute the vibratory response of multi-supported tubes, in association with experimental data and semi-empirical relationships for quantifying flow-induced excitation mechanisms and tube damaging processes. In the presence of loose supports, tube dynamics becomes highly nonlinear in nature. To deal with such problems, CEA and FRAMATOME developed a computer program called GERBOISE. This paper provides a short description of an experimental program currently in progress at CEN Saclay to validate the numerical methods implemented in GERBOISE. According to the results obtained so far, reasonable agreement is found between experiment and numerical simulation, especially where averaged quantities are concerned.

  15. Interactions between risk factors in the prediction of onset of eating disorders: Exploratory hypothesis generating analyses.

    Science.gov (United States)

    Stice, Eric; Desjardins, Christopher D

    2018-06-01

    Because no study has tested for interactions between risk factors in the prediction of future onset of each eating disorder, this exploratory study addressed this lacuna to generate hypotheses to be tested in future confirmatory studies. Data from three prevention trials that targeted young women at high risk for eating disorders due to body dissatisfaction (N = 1271; M age 18.5, SD 4.2) and collected diagnostic interview data over 3-year follow-up were combined to permit sufficient power to predict onset of anorexia nervosa (AN), bulimia nervosa (BN), binge eating disorder (BED), and purging disorder (PD) using classification tree analyses, an analytic technique uniquely suited to detecting interactions. Low BMI was the most potent predictor of AN onset, and body dissatisfaction amplified this relation. Overeating was the most potent predictor of BN onset, and positive expectancies for thinness and body dissatisfaction amplified this relation. Body dissatisfaction was the most potent predictor of BED onset, and overeating, low dieting, and thin-ideal internalization amplified this relation. Dieting was the most potent predictor of PD onset, and negative affect and positive expectancies for thinness amplified this relation. Results provided evidence of amplifying interactions between risk factors suggestive of cumulative risk processes that were distinct for each disorder; future confirmatory studies should test the interactive hypotheses generated by these analyses. If hypotheses are confirmed, results may allow interventionists to target ultra high-risk subpopulations with more intensive prevention programs that are uniquely tailored for each eating disorder, potentially improving the yield of prevention efforts. Copyright © 2018 Elsevier Ltd. All rights reserved.
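    The classification-tree idea is easy to sketch: each split selects the most predictive risk factor, and splits nested beneath it expose candidate amplifying interactions of the kind reported above. The file and variable names below are hypothetical; the original analyses were run on the pooled prevention-trial data.

    ```python
    # Illustrative classification tree for onset prediction; nested splits expose
    # candidate risk-factor interactions. Data file and column names are assumed.
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier, export_text

    df = pd.read_csv("risk_factors.csv")    # assumed columns listed in `features`, plus an_onset (0/1)
    features = ["bmi", "body_dissatisfaction", "overeating",
                "dieting", "thin_ideal", "negative_affect"]

    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
    tree.fit(df[features], df["an_onset"])

    # A first split on low BMI followed by a split on body dissatisfaction would mirror
    # the amplifying interaction described for anorexia nervosa onset.
    print(export_text(tree, feature_names=features))
    ```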

  16. Predictive Engineering Tools for Injection-Molded Long-Carbon-Thermoplastic Composites: Weight and Cost Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fifield, Leonard S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gandhi, Umesh N. [Toyota Research Inst. North America, Ann Arbor, MI (United States); Mori, Steven [MAGNA Exteriors and Interiors Corporation, Aurora, ON (Canada); Wollan, Eric J. [PlastiComp, Inc., Winona, MN (United States)

    2016-08-01

    This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, PNNL, in collaboration with Toyota and Magna, developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and of similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimate of the weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost, including the material cost, for making the equivalent part in steel was determined and compared to the costs for making the LCF/PA66 part to determine the cost per “saved” pound.

  17. Structural Dynamic Analyses And Test Predictions For Spacecraft Structures With Non-Linearities

    Science.gov (United States)

    Vergniaud, Jean-Baptiste; Soula, Laurent; Newerla, Alfred

    2012-07-01

    The overall objective of the mechanical development and verification process is to ensure that the spacecraft structure is able to sustain the mechanical environments encountered during launch. In general the spacecraft structures are assumed a priori to behave linearly, i.e. the responses to a static load or dynamic excitation, respectively, will increase or decrease proportionally to the amplitude of the load or excitation induced. However, past experience has shown that various non-linearities might exist in spacecraft structures and the consequences of their dynamic effects can significantly affect the development and verification process. Current processes are mainly adapted to linear spacecraft structure behaviour. No clear rules exist for dealing with major structural non-linearities. They are handled outside the process by individual analysis and margin policy, and by analyses after tests to justify the CLA coverage. Non-linearities can primarily affect the current spacecraft development and verification process in two respects. Prediction of flight loads by launcher/satellite coupled loads analyses (CLA): only linear satellite models are delivered for performing CLA, and no well-established rules exist for how to properly linearize a model when non-linearities are present. The potential impact of the linearization on the results of the CLA has not yet been properly analyzed, so it is difficult to ensure that CLA results will cover actual flight levels. Management of satellite verification tests: the CLA results generated with a linear satellite FEM are assumed to be flight representative. If internal non-linearities are present in the tested satellite, it might be difficult to determine which input level must be applied to cover satellite internal loads. The non-linear behaviour can also disturb the shaker control, putting the satellite at risk by potentially imposing levels that are too high. This paper presents the results of a test campaign performed in

  18. Monitoring and predicting crop growth and analysing agricultural ecosystems by remote sensing

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Akiyama

    1996-05-01

    Full Text Available LANDSAT/TM data, which are characterized by high spectral/spatial resolutions, are able to contribute to practical agricultural management. In the first part of the paper, the authors review some recent applications of satellite remote sensing in agriculture. Techniques for crop discrimination and mapping have made such rapid progress that we can classify crop types with more than 80% accuracy. The estimation of crop biomass using satellite data, including leaf area, dry and fresh weights, and the prediction of grain yield, has been attempted using various spectral vegetation indices. Plant stresses caused by nutrient deficiency and water deficit have also been analysed successfully. Such information may be useful for farm management. In the latter half of the paper, we introduce the Arctic Science Project, which was carried out under the Science and Technology Agency of Japan in collaboration with Finnish scientists. In this project, monitoring of the boreal forest was carried out using LANDSAT data. Changes in the phenology of subarctic ground vegetation, based on spectral properties, were measured by a boom-mounted, four-band spectroradiometer. The turning-point dates of the seasonal near-infrared (NIR) and red (R) reflectance factors might indicate the end of growth and the beginning of autumnal tints, respectively.
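    As a concrete example of the spectral vegetation indices mentioned above, the widely used NDVI is computed per pixel from near-infrared and red reflectance; the small arrays below stand in for LANDSAT/TM bands.

    ```python
    # NDVI = (NIR - R) / (NIR + R), computed per pixel. Band arrays are placeholders.
    import numpy as np

    def ndvi(nir, red, eps=1e-6):
        """Normalized Difference Vegetation Index from NIR and red reflectance."""
        nir = nir.astype(float)
        red = red.astype(float)
        return (nir - red) / (nir + red + eps)

    nir_band = np.array([[0.45, 0.50], [0.30, 0.60]])
    red_band = np.array([[0.10, 0.12], [0.20, 0.08]])
    print(ndvi(nir_band, red_band))
    ```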

  19. Predicting behavior change from persuasive messages using neural representational similarity and social network analyses.

    Science.gov (United States)

    Pegors, Teresa K; Tompson, Steven; O'Donnell, Matthew Brook; Falk, Emily B

    2017-08-15

    Neural activity in medial prefrontal cortex (MPFC), identified as engaging in self-related processing, predicts later health behavior change. However, it is unknown to what extent individual differences in neural representation of content and lived experience influence this brain-behavior relationship. We examined whether the strength of content-specific representations during persuasive messaging relates to later behavior change, and whether these relationships change as a function of individuals' social network composition. In our study, smokers viewed anti-smoking messages while undergoing fMRI and we measured changes in their smoking behavior one month later. Using representational similarity analyses, we found that the degree to which message content (i.e. health, social, or valence information) was represented in a self-related processing MPFC region was associated with later smoking behavior, with increased representations of negatively valenced (risk) information corresponding to greater message-consistent behavior change. Furthermore, the relationship between representations and behavior change depended on social network composition: smokers who had proportionally fewer smokers in their network showed increases in smoking behavior when social or health content was strongly represented in MPFC, whereas message-consistent behavior (i.e., less smoking) was more likely for those with proportionally more smokers in their social network who represented social or health consequences more strongly. These results highlight the dynamic relationship between representations in MPFC and key outcomes such as health behavior change; a complete understanding of the role of MPFC in motivation and action should take into account individual differences in neural representation of stimulus attributes and social context variables such as social network composition. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Predictive Genomic Analyses Inform the Basis for Vitamin Metabolism and Provisioning in Bacteria-Arthropod Endosymbioses.

    Science.gov (United States)

    Serbus, Laura R; Rodriguez, Brian Garcia; Sharmin, Zinat; Momtaz, A J M Zehadee; Christensen, Steen

    2017-06-07

    The requirement of vitamins for core metabolic processes creates a unique set of pressures for arthropods subsisting on nutrient-limited diets. While endosymbiotic bacteria carried by arthropods have been widely implicated in vitamin provisioning, the underlying molecular mechanisms are not well understood. To address this issue, standardized predictive assessment of vitamin metabolism was performed in 50 endosymbionts of insects and arachnids. The results predicted that arthropod endosymbionts overall have little capacity for complete de novo biosynthesis of conventional or active vitamin forms. Partial biosynthesis pathways were commonly predicted, suggesting a substantial role in vitamin provisioning. Neither taxonomic relationships between host and symbiont, nor the mode of host-symbiont interaction were clear predictors of endosymbiont vitamin pathway capacity. Endosymbiont genome size and the synthetic capacity of nonsymbiont taxonomic relatives were more reliable predictors. We developed a new software application that also predicted that last-step conversion of intermediates into active vitamin forms may contribute further to vitamin biosynthesis by endosymbionts. Most instances of predicted vitamin conversion were paralleled by predictions of vitamin use. This is consistent with achievement of provisioning in some cases through upregulation of pathways that were retained for endosymbiont benefit. The predicted absence of other enzyme classes further suggests a baseline of vitamin requirement by the majority of endosymbionts, as well as some instances of putative mutualism. Adaptation of this workflow to analysis of other organisms and metabolic pathways will provide new routes for considering the molecular basis for symbiosis on a comprehensive scale. Copyright © 2017 Serbus et al.

  1. Predictive Genomic Analyses Inform the Basis for Vitamin Metabolism and Provisioning in Bacteria-Arthropod Endosymbioses

    Directory of Open Access Journals (Sweden)

    Laura R. Serbus

    2017-06-01

    Full Text Available The requirement of vitamins for core metabolic processes creates a unique set of pressures for arthropods subsisting on nutrient-limited diets. While endosymbiotic bacteria carried by arthropods have been widely implicated in vitamin provisioning, the underlying molecular mechanisms are not well understood. To address this issue, standardized predictive assessment of vitamin metabolism was performed in 50 endosymbionts of insects and arachnids. The results predicted that arthropod endosymbionts overall have little capacity for complete de novo biosynthesis of conventional or active vitamin forms. Partial biosynthesis pathways were commonly predicted, suggesting a substantial role in vitamin provisioning. Neither taxonomic relationships between host and symbiont, nor the mode of host-symbiont interaction were clear predictors of endosymbiont vitamin pathway capacity. Endosymbiont genome size and the synthetic capacity of nonsymbiont taxonomic relatives were more reliable predictors. We developed a new software application that also predicted that last-step conversion of intermediates into active vitamin forms may contribute further to vitamin biosynthesis by endosymbionts. Most instances of predicted vitamin conversion were paralleled by predictions of vitamin use. This is consistent with achievement of provisioning in some cases through upregulation of pathways that were retained for endosymbiont benefit. The predicted absence of other enzyme classes further suggests a baseline of vitamin requirement by the majority of endosymbionts, as well as some instances of putative mutualism. Adaptation of this workflow to analysis of other organisms and metabolic pathways will provide new routes for considering the molecular basis for symbiosis on a comprehensive scale.

  2. Influence of the Human Skin Tumor Type in Photodynamic Therapy Analysed by a Predictive Model

    Directory of Open Access Journals (Sweden)

    I. Salas-García

    2012-01-01

    Full Text Available Photodynamic Therapy (PDT) modeling allows the prediction of the treatment results depending on the lesion properties, the photosensitizer distribution, or the optical source characteristics. We employ a predictive PDT model and apply it to different skin tumors. It takes into account the optical radiation distribution, a nonhomogeneous topical photosensitizer spatial-temporal distribution, and the time-dependent photochemical interaction. The predicted singlet oxygen molecular concentrations with varying optical irradiance are compared and could be directly related to the necrosis area. The results show a strong dependence on the particular lesion. This suggests the need to design optimal PDT treatment protocols adapted to the specific patient and lesion.

  3. Sequence analyses and 3D structure prediction of two Type III ...

    African Journals Online (AJOL)

    Internet

    2012-04-17

    Apr 17, 2012 ... analyses were performed using the sequence data of growth hormone gene (gh) ... used as a phylogenetic marker for different taxonomic ... structural changes have been observed in some parts of ... of spatial restraints.

  4. Analysing the Relevance of Experience Partitions to the Prediction of Players’ Self-Reports of Affect

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez; Yannakakis, Georgios N.

    2011-01-01

    A common practice in modeling affect from physiological signals consists of reducing the signals to a set of statistical features that feed predictors of self-reported emotions. This paper analyses the impact of various time-windows, used for the extraction of physiological features, on the accuracy of such predictors.

  5. Analyses and predictions of the thermodynamic properties and phase diagrams of silicate systems

    Energy Technology Data Exchange (ETDEWEB)

    Blander, M. (Argonne National Lab., IL (United States)); Pelton, A.; Eriksson, G. (Ecole Polytechnique, Montreal, PQ (Canada). Dept. of Metallurgy and Materials Engineering)

    1992-01-01

    Molten silicates are ordered solutions which can not be well represented by the usual polynomial representation of deviations from ideal solution behavior (i.e. excess free energies of mixing). An adaptation of quasichemical theory which is capable of describing the properties of ordered solutions represents the measured properties of binary silicates over broad ranges of composition and temperature. For simple silicates such as the MgO-FeO-SiO{sub 2} ternary system, in which silica is the only acid component, a combining rule generally leads to good predictions of ternary solutions from those of the binaries. In basic solutions, these predictions are consistent with those of the conformal ionic solution theory. Our results indicate that our approach could provide a potentially powerful tool for representing and predicting the properties of multicomponent molten silicates.

  6. Analyses and predictions of the thermodynamic properties and phase diagrams of silicate systems

    Energy Technology Data Exchange (ETDEWEB)

    Blander, M. [Argonne National Lab., IL (United States); Pelton, A.; Eriksson, G. [Ecole Polytechnique, Montreal, PQ (Canada). Dept. of Metallurgy and Materials Engineering

    1992-07-01

    Molten silicates are ordered solutions which can not be well represented by the usual polynomial representation of deviations from ideal solution behavior (i.e. excess free energies of mixing). An adaptation of quasichemical theory which is capable of describing the properties of ordered solutions represents the measured properties of binary silicates over broad ranges of composition and temperature. For simple silicates such as the MgO-FeO-SiO{sub 2} ternary system, in which silica is the only acid component, a combining rule generally leads to good predictions of ternary solutions from those of the binaries. In basic solutions, these predictions are consistent with those of the conformal ionic solution theory. Our results indicate that our approach could provide a potentially powerful tool for representing and predicting the properties of multicomponent molten silicates.

  7. Combining Results from Distinct MicroRNA Target Prediction Tools Enhances the Performance of Analyses

    Directory of Open Access Journals (Sweden)

    Arthur C. Oliveira

    2017-05-01

    Full Text Available Target prediction is generally the first step toward recognition of bona fide microRNA (miRNA)-target interactions in living cells. Several target prediction tools are now available, which use distinct criteria and stringency to provide the best set of candidate targets for a single miRNA or a subset of miRNAs. However, there are many false-negative predictions, and consensus about the optimum strategy to select and use the output information provided by the target prediction tools is lacking. We compared the performance of four tools cited in the literature: TargetScan (TS), miRanda-mirSVR (MR), Pita, and RNA22 (R22), and we determined the most effective approach for analyzing target prediction data (individual, union, or intersection). For this purpose, we calculated the sensitivity, specificity, precision, and correlation of these approaches using 10 miRNAs (miR-1-3p, miR-17-5p, miR-21-5p, miR-24-3p, miR-29a-3p, miR-34a-5p, miR-124-3p, miR-125b-5p, miR-145-5p, and miR-155-5p) and 1,400 genes (700 validated and 700 non-validated as targets of these miRNAs). The four tools provided a subset of high-quality predictions and returned few false-positive predictions; however, they could not identify several known true targets. We demonstrate that the union of TS/MR and of TS/MR/R22 enhanced the quality of in silico prediction analysis of miRNA targets. We conclude that the union rather than the intersection of the aforementioned tools is the best strategy for maximizing performance while minimizing the loss of time and resources in subsequent in vivo and in vitro experiments for functional validation of miRNA-target interactions.
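    The union/intersection comparison reduces to simple set operations plus the usual confusion-matrix metrics. The sketch below uses invented gene identifiers only to show the mechanics of scoring a union versus an intersection against a validated-target list.

    ```python
    # Sketch: combine two tools' target predictions by union or intersection and
    # score precision/sensitivity against validated targets. Identifiers are invented.
    validated = {"GENE1", "GENE2", "GENE3", "GENE4"}
    non_targets = {"GENE5", "GENE6", "GENE7", "GENE8"}

    predictions = {
        "TS": {"GENE1", "GENE2", "GENE5"},
        "MR": {"GENE2", "GENE3", "GENE6"},
    }

    def score(pred_set):
        tp = len(pred_set & validated)
        fp = len(pred_set & non_targets)
        fn = len(validated - pred_set)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
        return precision, sensitivity

    print("union TS|MR:       ", score(predictions["TS"] | predictions["MR"]))
    print("intersection TS&MR:", score(predictions["TS"] & predictions["MR"]))
    ```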

  8. Simulation, prediction, and genetic analyses of daily methane emissions in dairy cattle.

    Science.gov (United States)

    Yin, T; Pinent, T; Brügemann, K; Simianer, H; König, S

    2015-08-01

    This study presents an approach combining phenotypes from novel traits, deterministic equations from cattle nutrition, and stochastic simulation techniques from animal breeding to generate test-day methane emissions (MEm) of dairy cows. Data included test-day production traits (milk yield, fat percentage, protein percentage, milk urea nitrogen), conformation traits (wither height, hip width, body condition score), female fertility traits (days open, calving interval, stillbirth), and health traits (clinical mastitis) from 961 first lactation Brown Swiss cows kept on 41 low-input farms in Switzerland. Test-day MEm were predicted based on the traits from the current data set and 2 deterministic prediction equations, resulting in the traits labeled MEm1 and MEm2. Stochastic simulations were used to assign individual concentrate intake in dependency of farm-type specifications (requirement when calculating MEm2). Genetic parameters for MEm1 and MEm2 were estimated using random regression models. Predicted MEm had moderate heritabilities over lactation and ranged from 0.15 to 0.37, with highest heritabilities around DIM 100. Genetic correlations between MEm1 and MEm2 ranged between 0.91 and 0.94. Antagonistic genetic correlations in the range from 0.70 to 0.92 were found for the associations between MEm2 and milk yield. Genetic correlations between MEm with days open and with calving interval increased from 0.10 at the beginning to 0.90 at the end of lactation. Genetic relationships between MEm2 and stillbirth were negative (0 to -0.24) from the beginning to the peak phase of lactation. Positive genetic relationships in the range from 0.02 to 0.49 were found between MEm2 with clinical mastitis. Interpretation of genetic (co)variance components should also consider the limitations when using data generated by prediction equations. Prediction functions only describe that part of MEm which is dependent on the factors and effects included in the function. With high

  9. Prediction of Seismic Slope Displacements by Dynamic Stick-Slip Analyses

    International Nuclear Information System (INIS)

    Ausilio, Ernesto; Costanzo, Antonio; Silvestri, Francesco; Tropeano, Giuseppe

    2008-01-01

    A good working balance between simplicity and reliability in assessing seismic slope stability is represented by displacement-based methods, in which the effects of deformability and ductility can be either decoupled or coupled in the dynamic analyses. In this paper, a 1D lumped-mass 'stick-slip' model is developed, accounting for soil heterogeneity and non-linear behaviour, with a base sliding mechanism at a potential rupture surface. The results of the preliminary calibration show a good agreement with frequency-domain site response analysis in no-slip conditions. The comparison with rigid sliding block analyses and with the decoupled approach proves that the stick-slip procedure can become increasingly unconservative for soft soils and deep sliding depths.
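    For orientation, the rigid sliding-block calculation that the stick-slip model is compared against can be sketched in a few lines: sliding starts when the input acceleration exceeds the yield acceleration and stops when the relative velocity returns to zero. This is a deliberately simplified, one-directional Newmark-type scheme with explicit Euler integration, not the paper's coupled stick-slip procedure.

    ```python
    # Simplified one-directional rigid sliding-block (Newmark-type) integration.
    # The ground-motion record below is synthetic.
    import numpy as np

    def sliding_block_displacement(acc, dt, a_yield):
        """acc: ground acceleration history (m/s^2); a_yield: yield acceleration (m/s^2)."""
        v_rel, d_rel = 0.0, 0.0
        for a in acc:
            if v_rel > 0.0 or a > a_yield:
                v_rel = max(v_rel + (a - a_yield) * dt, 0.0)   # sliding stops at zero relative velocity
                d_rel += v_rel * dt
        return d_rel

    dt = 0.005
    t = np.arange(0.0, 10.0, dt)
    acc = 2.5 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.2 * t)   # synthetic decaying sine pulse
    print("permanent displacement (m):", sliding_block_displacement(acc, dt, a_yield=0.8))
    ```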

  10. Potential of MR histogram analyses for prediction of response to chemotherapy in patients with colorectal hepatic metastases.

    Science.gov (United States)

    Liang, He-Yue; Huang, Ya-Qin; Yang, Zhao-Xia; Ying-Ding; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-07-01

    To determine if magnetic resonance imaging (MRI) histogram analyses can help predict response to chemotherapy in patients with colorectal hepatic metastases, using response evaluation criteria in solid tumours (RECIST 1.1) as the reference standard. Standard MRI including diffusion-weighted imaging (b = 0, 500 s/mm²) was performed before chemotherapy in 53 patients with colorectal hepatic metastases. Histogram analyses were performed for apparent diffusion coefficient (ADC) maps and for arterial and portal venous phase images; thereafter, the mean, percentiles (1st, 10th, 50th, 90th, 99th), skewness, kurtosis, and variance were generated. Quantitative histogram parameters were compared between responders (partial and complete response, n=15) and non-responders (progressive and stable disease, n=38). Receiver operating characteristic (ROC) analyses were then performed for the significant parameters. The mean and the 1st, 10th, 50th, 90th, and 99th percentiles of the ADC maps were significantly lower in the responding group than in the non-responding group (p=0.000-0.002), with areas under the ROC curve (AUCs) of 0.76-0.82. The histogram parameters of the arterial and portal venous phases showed no significant difference (p>0.05) between the two groups. Histogram-derived parameters for ADC maps seem to be a promising tool for predicting response to chemotherapy in patients with colorectal hepatic metastases. • ADC histogram analyses can potentially predict chemotherapy response in colorectal liver metastases. • Lower histogram-derived parameters (mean, percentiles) for ADC tend to indicate good response. • MR enhancement histogram analyses are not reliable for predicting response.

  11. When Bitcoin encounters information in an online forum: Using text mining to analyse user opinions and predict value fluctuation.

    Directory of Open Access Journals (Sweden)

    Young Bin Kim

    Full Text Available Bitcoin is an online currency that is used worldwide to make online payments. It has consequently become an investment vehicle in itself and is traded in a way similar to other open currencies. The ability to predict the price fluctuation of Bitcoin would therefore facilitate future investment and payment decisions. In order to predict the price fluctuation of Bitcoin, we analyse the comments posted in the Bitcoin online forum. Unlike most research on Bitcoin-related online forums, which is limited to simple sentiment analysis and does not pay sufficient attention to note-worthy user comments, our approach involved extracting keywords from Bitcoin-related user comments posted on the online forum with the aim of analytically predicting the price and extent of transaction fluctuation of the currency. The effectiveness of the proposed method is validated based on Bitcoin online forum data ranging over a period of 2.8 years from December 2013 to September 2016.

  12. When Bitcoin encounters information in an online forum: Using text mining to analyse user opinions and predict value fluctuation.

    Science.gov (United States)

    Kim, Young Bin; Lee, Jurim; Park, Nuri; Choo, Jaegul; Kim, Jong-Hyun; Kim, Chang Hun

    2017-01-01

    Bitcoin is an online currency that is used worldwide to make online payments. It has consequently become an investment vehicle in itself and is traded in a way similar to other open currencies. The ability to predict the price fluctuation of Bitcoin would therefore facilitate future investment and payment decisions. In order to predict the price fluctuation of Bitcoin, we analyse the comments posted in the Bitcoin online forum. Unlike most research on Bitcoin-related online forums, which is limited to simple sentiment analysis and does not pay sufficient attention to note-worthy user comments, our approach involved extracting keywords from Bitcoin-related user comments posted on the online forum with the aim of analytically predicting the price and extent of transaction fluctuation of the currency. The effectiveness of the proposed method is validated based on Bitcoin online forum data ranging over a period of 2.8 years from December 2013 to September 2016.

  13. Intrinsic disorder in Viral Proteins Genome-Linked: experimental and predictive analyses

    Directory of Open Access Journals (Sweden)

    Van Dorsselaer Alain

    2009-02-01

    Full Text Available Abstract Background VPgs are viral proteins linked to the 5' end of some viral genomes. Interactions between several VPgs and eukaryotic translation initiation factors eIF4Es are critical for plant infection. However, VPgs are not restricted to phytoviruses, being also involved in genome replication and protein translation of several animal viruses. To date, structural data are still limited to small picornaviral VPgs. Recently, three phytoviral VPgs were shown to be natively unfolded proteins. Results In this paper, we report the bacterial expression, purification and biochemical characterization of two phytoviral VPgs, namely the VPgs of Rice yellow mottle virus (RYMV, genus Sobemovirus) and Lettuce mosaic virus (LMV, genus Potyvirus). Using far-UV circular dichroism and size exclusion chromatography, we show that RYMV and LMV VPgs are predominantly or partly unstructured in solution, respectively. Using several disorder predictors, we show that both proteins are predicted to possess disordered regions. We next extend these results to 14 VPgs representative of the viral diversity. Disordered regions were predicted in all VPg sequences whatever the genus and the family. Conclusion Based on these results, we propose that intrinsic disorder is a common feature of VPgs. The functional role of intrinsic disorder is discussed in light of the biological roles of VPgs.

  14. On-line prediction of BWR transients in support of plant operation and safety analyses

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Lekach, S.V.; Mallen, A.N.

    1983-01-01

    A combination of advanced modeling techniques and modern, special-purpose peripheral minicomputer technology is presented which affords realistic predictions of plant transient and severe off-normal events in LWR power plants through on-line simulations at a speed ten times greater than actual process speeds. Results are shown for a BWR plant simulation. The mathematical models account for nonequilibrium, nonhomogeneous two-phase flow effects in the coolant, for acoustical effects in the steam line and for the dynamics of the recirculation loop and feed-water train. Point kinetics incorporate reactivity feedback for void fraction, for fuel temperature, and for coolant temperature. Control systems and trip logic are simulated for the nuclear steam supply system
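    The point-kinetics core of such a simulator can be illustrated with a one-delayed-group model and a constant reactivity step; the feedback terms (void, fuel and coolant temperature) and the thermal-hydraulic coupling described above are omitted, and the parameter values are generic textbook numbers rather than plant data.

    ```python
    # One-delayed-group point-kinetics sketch with a constant reactivity insertion.
    # Feedback and balance-of-plant models are intentionally omitted.
    import numpy as np
    from scipy.integrate import solve_ivp

    beta, Lam, lam = 0.0065, 1e-4, 0.08    # delayed fraction, generation time (s), precursor decay (1/s)
    rho = 0.001                            # constant reactivity step (dimensionless)

    def point_kinetics(t, y):
        n, c = y                           # relative neutron density, precursor concentration
        dn = (rho - beta) / Lam * n + lam * c
        dc = beta / Lam * n - lam * c
        return [dn, dc]

    y0 = [1.0, beta / (Lam * lam)]         # precursor level in equilibrium with n = 1
    sol = solve_ivp(point_kinetics, (0.0, 10.0), y0, method="LSODA", max_step=0.01)
    print("relative power after 10 s:", sol.y[0, -1])
    ```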

  15. Predicting Geomorphic and Hydrologic Risks after Wildfire Using Harmonic and Stochastic Analyses

    Science.gov (United States)

    Mikesell, J.; Kinoshita, A. M.; Florsheim, J. L.; Chin, A.; Nourbakhshbeidokhti, S.

    2017-12-01

    Wildfire is a landscape-scale disturbance that often alters hydrological processes and sediment flux during subsequent storms. Vegetation loss from wildfire induces changes to sediment supply, such as channel erosion and sedimentation, and to streamflow magnitude, including flooding. These changes enhance downstream hazards, threatening human populations and physical aquatic habitat over various time scales. Using Williams Canyon, a basin burned by the Waldo Canyon Fire (2012), as a case study, we utilize deterministic and statistical modeling methods (Fourier series and first-order Markov chain) to assess pre- and post-fire geomorphic and hydrologic characteristics, including precipitation, enhanced vegetation index (EVI, a satellite-based proxy of vegetation biomass), streamflow, and sediment flux. Local precipitation, terrestrial Light Detection and Ranging (LiDAR) scanning, and satellite-based products are used for these time series analyses. We present a framework to assess variability of periodic and nonperiodic climatic and multivariate trends to inform development of a post-wildfire risk assessment methodology. To establish the extent to which a wildfire affects hydrologic and geomorphic patterns, a Fourier series was used to fit pre- and post-fire geomorphic and hydrologic characteristics to yearly temporal cycles and subcycles of 6, 4, 3, and 2.4 months. These cycles were analyzed using least-squares estimates of the harmonic coefficients, or amplitudes, of each sub-cycle's contribution to the overall behavior of the Fourier series. The stochastic variances of these characteristics were analyzed by composing first-order Markov models and probabilistic analysis through direct likelihood estimates. Preliminary results highlight an increased dependence of monthly post-fire hydrologic characteristics on 12- and 6-month temporal cycles. This statistical and probabilistic analysis provides a basis to determine the impact of wildfires on the temporal dependence of
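    The two analysis ingredients named above, a least-squares harmonic fit and a first-order Markov chain, can be sketched on a synthetic monthly series as follows; the series, the two-state discretization, and the chosen harmonics (12 and 6 months) are illustrative assumptions.

    ```python
    # Sketch: least-squares fit of 12- and 6-month harmonics, then a first-order Markov
    # transition matrix on a two-state (below/above median) version of the series.
    import numpy as np

    rng = np.random.default_rng(3)
    months = np.arange(48)
    series = 10 + 3 * np.sin(2 * np.pi * months / 12) \
               + 1.5 * np.cos(2 * np.pi * months / 6) \
               + rng.normal(0.0, 0.5, months.size)

    # Harmonic regression design matrix: constant plus 12- and 6-month sine/cosine terms
    X = np.column_stack([np.ones(months.size),
                         np.sin(2 * np.pi * months / 12), np.cos(2 * np.pi * months / 12),
                         np.sin(2 * np.pi * months / 6),  np.cos(2 * np.pi * months / 6)])
    coeffs, *_ = np.linalg.lstsq(X, series, rcond=None)
    print("amplitude of 12-month cycle:", np.hypot(coeffs[1], coeffs[2]))
    print("amplitude of 6-month cycle: ", np.hypot(coeffs[3], coeffs[4]))

    # First-order Markov chain on the discretized series
    states = (series > np.median(series)).astype(int)
    counts = np.zeros((2, 2))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    print("transition matrix:\n", counts / counts.sum(axis=1, keepdims=True))
    ```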

  16. Condition-based maintenance of a 20 kV PE/XLPE-insulated cable network using the IRC analysis. Modern diagnostics reduces the failure rate

    Energy Technology Data Exchange (ETDEWEB)

    Hoff, G.; Kranz, H.G. [BUGH Wuppertal (Germany). Labs. fuer Hochspannungstechnik; Beigert, M.; Petzold, F. [Seba Dynatronic Mess- und Ortungstechnik GmbH, Baunach (Germany); Kneissl, C. [Bereich Konzeption und Messtechnik, Lech Elektrizitaetswerke AG, Augsburg (Germany)

    2001-10-22

    For preventive maintenance of a polymer-insulated cable network, a non-destructive assessment of the condition of the buried PE/XLPE cables is needed. This contribution presents a condition-based maintenance concept based on the IRC analysis. With this concept, a major German utility was able to reduce the number of failures in part of its 20 kV cable network, and the general trend of increasing faults was reversed. (orig.) [Translated from the German original] Preventive maintenance in polymer-insulated cable networks requires a non-destructive condition assessment of installed PE/XLPE-insulated cables. The authors describe a condition-based maintenance concept based on the IRC analysis, with which the previously steadily rising failure rate in part of a utility's 20 kV cable network has been drastically reduced. (orig.)

  17. Circulating biomarkers for predicting cardiovascular disease risk; a systematic review and comprehensive overview of meta-analyses.

    Directory of Open Access Journals (Sweden)

    Thijs C van Holten

    BACKGROUND: Cardiovascular disease is one of the major causes of death worldwide. Assessing the risk for cardiovascular disease is an important aspect of clinical decision making and setting a therapeutic strategy, and the use of serological biomarkers may improve this. Despite an overwhelming number of studies and meta-analyses on biomarkers and cardiovascular disease, there are no comprehensive studies comparing the relevance of each biomarker. We performed a systematic review of meta-analyses on levels of serological biomarkers for atherothrombosis to compare the relevance of the most commonly studied biomarkers. METHODS AND FINDINGS: Medline and Embase were screened using search terms related to "arterial ischemic events" and "meta-analyses". The meta-analyses were sorted into patient groups without pre-existing cardiovascular disease, groups with cardiovascular disease, and heterogeneous groups covering general populations, mixed groups with and without cardiovascular disease, or miscellaneous populations. These were subsequently sorted by endpoint (cardiovascular disease or stroke) and summarized in tables. We identified 85 relevant full-text articles, with 214 meta-analyses. Markers for primary cardiovascular events include, from the highest to the lowest reported association: C-reactive protein, fibrinogen, cholesterol, apolipoprotein B, the apolipoprotein A/apolipoprotein B ratio, high density lipoprotein, and vitamin D. Markers for secondary cardiovascular events include, from the highest to the lowest reported association: cardiac troponins I and T, C-reactive protein, serum creatinine, and cystatin C. For primary stroke, fibrinogen and serum uric acid are strong risk markers. A limitation is that there is no established search strategy for prognostic studies or meta-analyses. CONCLUSIONS: For primary cardiovascular events, markers with strong predictive potential are mainly associated with lipids. For secondary cardiovascular events, markers are more associated with ischemia. Fibrinogen is a

  18. Extending and Applying Spartan to Perform Temporal Sensitivity Analyses for Predicting Changes in Influential Biological Pathways in Computational Models.

    Science.gov (United States)

    Alden, Kieran; Timmis, Jon; Andrews, Paul S; Veiga-Fernandes, Henrique; Coles, Mark

    2017-01-01

    Through integrating real time imaging, computational modelling, and statistical analysis approaches, previous work has suggested that the induction of and response to cell adhesion factors is the key initiating pathway in early lymphoid tissue development, in contrast to the previously accepted view that the process is triggered by chemokine mediated cell recruitment. These model derived hypotheses were developed using spartan, an open-source sensitivity analysis toolkit designed to establish and understand the relationship between a computational model and the biological system that the model captures. Here, we extend the functionality available in spartan to permit the production of statistical analyses that contrast the behavior exhibited by a computational model at various simulated time-points, enabling a temporal analysis that could suggest whether the influence of biological mechanisms changes over time. We exemplify this extended functionality by using the computational model of lymphoid tissue development as a time-lapse tool. By generating results at twelve-hour intervals, we show how the extensions to spartan have been used to suggest that lymphoid tissue development could be biphasic, and predict the time-point when a switch in the influence of biological mechanisms might occur.
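
    The temporal analysis described above can be illustrated with a small sketch that computes the Vargha-Delaney A statistic (an effect-size measure used by spartan) between baseline and perturbed simulation outputs at successive simulated time points. The distributions and the 12-hour interval below are hypothetical stand-ins, not outputs of the lymphoid tissue model.

```python
import numpy as np

def vargha_delaney_a(x, y):
    """A-test effect size: probability that a random draw from x exceeds one from y
    (ties counted as 0.5). A = 0.5 means no difference; |A - 0.5| > 0.21 is commonly
    treated as a large effect."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    greater = (x[:, None] > y[None, :]).sum()
    ties = (x[:, None] == y[None, :]).sum()
    return (greater + 0.5 * ties) / (x.size * y.size)

# Hypothetical simulation outputs saved every 12 h: baseline vs. a perturbed mechanism.
rng = np.random.default_rng(1)
timepoints = [12, 24, 36, 48, 60, 72]
for h in timepoints:
    baseline = rng.normal(10, 2, 300)
    perturbed = rng.normal(10 + 0.03 * h, 2, 300)   # influence grows over time
    a = vargha_delaney_a(perturbed, baseline)
    print(f"t = {h:3d} h: A = {a:.2f}")
```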

  19. ATOP - The Advanced Taiwan Ocean Prediction System Based on the mpiPOM. Part 1: Model Descriptions, Analyses and Results

    Directory of Open Access Journals (Sweden)

    Leo Oey

    2013-01-01

    A data-assimilated Taiwan Ocean Prediction (ATOP) system is being developed at the National Central University, Taiwan. The model simulates sea-surface height, three-dimensional currents, temperature and salinity, and turbulent mixing. The model has options for tracer and particle-tracking algorithms, as well as for wave-induced Stokes drift and wave-enhanced mixing and bottom drag. Two different forecast domains have been tested: a large-grid domain that encompasses the entire North Pacific Ocean at 0.1° × 0.1° horizontal resolution and 41 vertical sigma levels, and a smaller western North Pacific domain which at present also has the same horizontal resolution. In both domains, 25-year spin-up runs from 1988-2011 were first conducted, forced by six-hourly Cross-Calibrated Multi-Platform (CCMP) and NCEP reanalysis Global Forecast System (GFS) winds. The results are then used as initial conditions to conduct ocean analyses from January 2012 through February 2012, when updated hindcasts and real-time forecasts begin using the GFS winds. This paper describes the ATOP system and compares the forecast results against satellite altimetry data for assessing model skill. The model results are also shown to compare well with observations of (i) the Kuroshio intrusion in the northern South China Sea and (ii) the subtropical countercurrent. A review and comparison with other models in the literature regarding (i) are also given.

  20. Comparative transcriptome analyses of three medicinal Forsythia species and prediction of candidate genes involved in secondary metabolisms.

    Science.gov (United States)

    Sun, Luchao; Rai, Amit; Rai, Megha; Nakamura, Michimi; Kawano, Noriaki; Yoshimatsu, Kayo; Suzuki, Hideyuki; Kawahara, Nobuo; Saito, Kazuki; Yamazaki, Mami

    2018-05-07

    The three Forsythia species, F. suspensa, F. viridissima and F. koreana, have been used as herbal medicines in China, Japan and Korea for centuries, and they are known to be rich sources of numerous pharmaceutical metabolites: forsythin, forsythoside A, arctigenin, rutin and other phenolic compounds. In this study, de novo transcriptome sequencing and assembly were performed on these species. Using leaf and flower tissues of F. suspensa, F. viridissima and F. koreana, 1.28-2.45 Gbp of Illumina paired-end reads were obtained and assembled into 81,913, 88,491 and 69,458 unigenes, respectively. Classification of the annotated unigenes into gene ontology terms and KEGG pathways was used to compare the transcriptomes of the three Forsythia species. Expression analysis of orthologous genes across all three species showed that expression in leaf tissues was highly correlated. Candidate genes presumably involved in the biosynthetic pathways of lignans and phenylethanoid glycosides were screened as co-expressed genes; they are highly expressed in the leaves of F. viridissima and F. koreana. Furthermore, three unigenes annotated as acyltransferases were predicted to be associated with the biosynthesis of acteoside and forsythoside A from their expression patterns and phylogenetic analysis. This study is the first report of comparative transcriptome analyses of the medicinally important genus Forsythia and will serve as an important resource to facilitate further studies on the biosynthesis and regulation of therapeutic compounds in Forsythia species.

  1. Application of neural networks and its prospect. 4. Prediction of major disruptions in tokamak plasmas, analyses of time series data

    International Nuclear Information System (INIS)

    Yoshino, Ryuji

    2006-01-01

    Prediction of major disruptions in tokamak plasmas using neural networks is reviewed. Prediction performance is assessed in terms of the prediction success rate, the false alarm rate, and the warning time prior to disruption. Current-driven disruptions are predicted from time-series data together with estimates of plasma lifetime, disruption risk, and plasma stability. Disruptions caused by the density limit, impurity influx, or error magnetic fields can be predicted with a 100% success rate from their precursors. Pressure-driven disruptions develop only a few hundred microseconds in advance, so operational limits such as the beta-N limit in DIII-D and the density limit in ADITYA were investigated instead. Training on the beta-N limit under stable discharges reduced the false alarm rate, and pressure-driven disruptions occurring as the plasma pressure rises can be predicted with about 90% success by evaluating plasma stability. (S.Y.)
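
    A minimal sketch of the kind of classifier-based disruption prediction discussed above, assuming hypothetical precursor features and labels; it trains a small feed-forward network and reports the two figures of merit named in the abstract (prediction success rate and false alarm rate). It is not the network or data set used in the cited work.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Hypothetical precursor features sampled from the plasma time series
# (e.g., density relative to the limit, locked-mode amplitude, beta_N margin).
rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 2000) > 1.0).astype(int)  # 1 = disruption

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("prediction success rate:", tp / (tp + fn))   # fraction of disruptions caught
print("false alarm rate:      ", fp / (fp + tn))    # stable shots flagged as disruptive
```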

  2. Analyses of Potential Predictive Markers and Response to Targeted Therapy in Patients with Advanced Clear-cell Renal Cell Carcinoma

    Directory of Open Access Journals (Sweden)

    Yan Song

    2015-01-01

    Background: Vascular endothelial growth factor-targeted agents are standard treatments in advanced clear-cell renal cell carcinoma (ccRCC), but biomarkers of activity are lacking. The aim of this study was to investigate the association of Von Hippel-Lindau (VHL) gene status and vascular endothelial growth factor receptor (VEGFR) or stem cell factor receptor (KIT) expression with the characteristics and clinical outcome of advanced ccRCC. Methods: A total of 59 patients who received targeted treatment with sunitinib or pazopanib at the Cancer Hospital and Institute, Chinese Academy of Medical Sciences, between January 2010 and November 2012 were evaluated. Paraffin-embedded tumor samples were collected, and VHL gene status and expression of VEGFR and KIT were determined by VHL sequence analysis and immunohistochemistry. Clinical-pathological features were collected, and efficacy measures such as response rate, median progression-free survival (PFS) and overall survival (OS) were calculated and compared by expression status. The Chi-square test, the Kaplan-Meier method, and the log-rank test were used for statistical analyses. Results: Of 59 patients, objective responses were observed in 28 patients (47.5%). The median PFS was 13.8 months and median OS was 39.9 months. There was an improved PFS in patients with the following clinical features: male gender, number of metastatic sites 2 or less, VEGFR-2 positive or KIT positive. Eleven patients (18.6%) had evidence of VHL mutation, with an objective response rate of 45.5%, which did not differ from that of patients without VHL mutation (47.9%). VHL mutation status did not correlate with either overall response rate (P = 0.938) or PFS (P = 0.277). The PFS was 17.6 months and 22.2 months in VEGFR-2-positive patients and KIT-positive patients, respectively, significantly longer than that of VEGFR-2- or KIT-negative patients (P = 0.026 and P = 0.043, respectively). Conclusion
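
    The survival comparison described above (Kaplan-Meier estimation with a log-rank test) can be sketched as follows with the lifelines package; the PFS durations, event indicators and group sizes are fabricated for illustration and do not reproduce the study data.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical PFS data (months) for VEGFR-2-positive vs. -negative patients.
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "pfs_months": np.concatenate([rng.exponential(18, 35), rng.exponential(10, 24)]),
    "progressed": 1,                                   # 1 = event observed, 0 = censored
    "vegfr2_pos": [1] * 35 + [0] * 24,
})

kmf = KaplanMeierFitter()
for label, grp in df.groupby("vegfr2_pos"):
    kmf.fit(grp["pfs_months"], event_observed=grp["progressed"], label=f"VEGFR-2={label}")
    print(label, "median PFS:", kmf.median_survival_time_)

res = logrank_test(
    df.loc[df.vegfr2_pos == 1, "pfs_months"], df.loc[df.vegfr2_pos == 0, "pfs_months"],
    event_observed_A=df.loc[df.vegfr2_pos == 1, "progressed"],
    event_observed_B=df.loc[df.vegfr2_pos == 0, "progressed"],
)
print("log-rank p-value:", res.p_value)
```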

  3. Comparing direct image and wavelet transform-based approaches to analysing remote sensing imagery for predicting wildlife distribution

    NARCIS (Netherlands)

    Murwira, A.; Skidmore, A.K.

    2010-01-01

    In this study we tested the ability to predict the probability of elephant (Loxodonta africana) presence in an agricultural landscape of Zimbabwe based on three methods of measuring the spatial heterogeneity in vegetation cover, where vegetation cover was measured using the Landsat Thematic Mapper

  4. Ultimate compression after impact load prediction in graphite/epoxy coupons using neural network and multivariate statistical analyses

    Science.gov (United States)

    Gregoire, Alexandre David

    2011-07-01

    The goal of this research was to accurately predict the ultimate compressive load of impact damaged graphite/epoxy coupons using a Kohonen self-organizing map (SOM) neural network and multivariate statistical regression analysis (MSRA). An optimized use of these data treatment tools allowed the generation of a simple, physically understandable equation that predicts the ultimate failure load of an impacted damaged coupon based uniquely on the acoustic emissions it emits at low proof loads. Acoustic emission (AE) data were collected using two 150 kHz resonant transducers which detected and recorded the AE activity given off during compression to failure of thirty-four impacted 24-ply bidirectional woven cloth laminate graphite/epoxy coupons. The AE quantification parameters duration, energy and amplitude for each AE hit were input to the Kohonen self-organizing map (SOM) neural network to accurately classify the material failure mechanisms present in the low proof load data. The number of failure mechanisms from the first 30% of the loading for twenty-four coupons were used to generate a linear prediction equation which yielded a worst case ultimate load prediction error of 16.17%, just outside of the +/-15% B-basis allowables, which was the goal for this research. Particular emphasis was placed upon the noise removal process which was largely responsible for the accuracy of the results.

  5. Refining Prediction in Treatment-Resistant Depression: Results of Machine Learning Analyses in the TRD III Sample.

    Science.gov (United States)

    Kautzky, Alexander; Dold, Markus; Bartova, Lucie; Spies, Marie; Vanicek, Thomas; Souery, Daniel; Montgomery, Stuart; Mendlewicz, Julien; Zohar, Joseph; Fabbri, Chiara; Serretti, Alessandro; Lanzenberger, Rupert; Kasper, Siegfried

    The study objective was to generate a prediction model for treatment-resistant depression (TRD) using machine learning featuring a large set of 47 clinical and sociodemographic predictors of treatment outcome. A total of 552 patients diagnosed with major depressive disorder (MDD) according to DSM-IV criteria were enrolled between 2011 and 2016. TRD was defined as failure to reach response to antidepressant treatment, characterized by a Montgomery-Asberg Depression Rating Scale (MADRS) score below 22 after at least 2 antidepressant trials of adequate length and dosage were administered. RandomForest (RF) was used for predicting treatment outcome phenotypes in a 10-fold cross-validation. The full model with 47 predictors yielded an accuracy of 75.0%. When the number of predictors was reduced to 15, accuracies between 67.6% and 71.0% were attained for different test sets. The most informative predictors of treatment outcome were baseline MADRS score for the current episode; impairment of family, social, and work life; the timespan between first and last depressive episode; severity; suicidal risk; age; body mass index; and the number of lifetime depressive episodes as well as lifetime duration of hospitalization. With the application of the machine learning algorithm RF, an efficient prediction model with an accuracy of 75.0% for forecasting treatment outcome could be generated, thus surpassing the predictive capabilities of clinical evaluation. We also supply a simplified algorithm of 15 easily collected clinical and sociodemographic predictors that can be obtained within approximately 10 minutes, which reached an accuracy of 70.6%. Thus, we are confident that our model will be validated within other samples to advance an accurate prediction model fit for clinical usage in TRD. © Copyright 2017 Physicians Postgraduate Press, Inc.
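
    A minimal sketch of the RandomForest workflow described above (10-fold cross-validated accuracy on 47 predictors, followed by selection of the 15 most informative variables), using synthetic data in place of the TRD III sample.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical data: 552 patients x 47 clinical/sociodemographic predictors,
# binary outcome 1 = treatment-resistant depression, 0 = responder.
rng = np.random.default_rng(4)
X = rng.normal(size=(552, 47))
y = rng.integers(0, 2, size=552)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
acc = cross_val_score(rf, X, y, cv=10, scoring="accuracy")
print(f"10-fold CV accuracy: {acc.mean():.3f} +/- {acc.std():.3f}")

# Rank predictors by importance to build a reduced (e.g., 15-variable) model.
rf.fit(X, y)
top15 = np.argsort(rf.feature_importances_)[::-1][:15]
print("indices of the 15 most informative predictors:", top15)
```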

  6. Visual Versus Fully Automated Analyses of 18F-FDG and Amyloid PET for Prediction of Dementia Due to Alzheimer Disease in Mild Cognitive Impairment.

    Science.gov (United States)

    Grimmer, Timo; Wutz, Carolin; Alexopoulos, Panagiotis; Drzezga, Alexander; Förster, Stefan; Förstl, Hans; Goldhardt, Oliver; Ortner, Marion; Sorg, Christian; Kurz, Alexander

    2016-02-01

    Biomarkers of Alzheimer disease (AD) can be imaged in vivo and can be used for diagnostic and prognostic purposes in people with cognitive decline and dementia. Indicators of amyloid deposition such as (11)C-Pittsburgh compound B ((11)C-PiB) PET are primarily used to identify or rule out brain diseases that are associated with amyloid pathology but have also been deployed to forecast the clinical course. Indicators of neuronal metabolism including (18)F-FDG PET demonstrate the localization and severity of neuronal dysfunction and are valuable for differential diagnosis and for predicting the progression from mild cognitive impairment (MCI) to dementia. It is a matter of debate whether to analyze these images visually or using automated techniques. Therefore, we compared the usefulness of both imaging methods and both analyzing strategies to predict dementia due to AD. In MCI participants, a baseline examination, including clinical and imaging assessments, and a clinical follow-up examination after a planned interval of 24 mo were performed. Of 28 MCI patients, 9 developed dementia due to AD, 2 developed frontotemporal dementia, and 1 developed moderate dementia of unknown etiology. The positive and negative predictive values and the accuracy of visual and fully automated analyses of (11)C-PiB for the prediction of progression to dementia due to AD were 0.50, 1.00, and 0.68, respectively, for the visual and 0.53, 1.00, and 0.71, respectively, for the automated analyses. Positive predictive value, negative predictive value, and accuracy of fully automated analyses of (18)F-FDG PET were 0.37, 0.78, and 0.50, respectively. Results of visual analyses were highly variable between raters but were superior to automated analyses. Both (18)F-FDG and (11)C-PiB imaging appear to be of limited use for predicting the progression from MCI to dementia due to AD in short-term follow-up, irrespective of the strategy of analysis. On the other hand, amyloid PET is extremely useful to
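
    The reported performance measures are simple functions of a 2x2 table of predicted versus observed progression. The sketch below uses one table that is consistent with the visual (11)C-PiB figures quoted above (PPV 0.50, NPV 1.00, accuracy 0.68 in 28 patients); the counts are an illustrative reconstruction, not the study's actual table.

```python
# One 2x2 table consistent with the reported visual (11)C-PiB figures; illustrative only.
tp, fp, fn, tn = 9, 9, 0, 10

ppv = tp / (tp + fp)                        # positive predictive value
npv = tn / (tn + fn)                        # negative predictive value
accuracy = (tp + tn) / (tp + fp + fn + tn)
print(f"PPV={ppv:.2f}  NPV={npv:.2f}  accuracy={accuracy:.2f}")
```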

  7. Incorporating Variational Local Analysis and Prediction System (vLAPS) Analyses with Nudging Data Assimilation: Methodology and Initial Results

    Science.gov (United States)

    2017-09-01

    in the hybrid scheme. They conclude that in the Lorenz model they investigated, the hybrid scheme cannot result in errors that are simultaneously ...centered on the analysis time. Note that the spatial and temporal refinement of the analyses are taking place simultaneously (i.e., the first analysis...of a strong capping inversion and then a deep elevated mixed layer. At 1800 UTC (Fig. 5b), daytime heating along with the formation of a convective

  8. Prediction of size-fractionated airborne particle-bound metals using MLR, BP-ANN and SVM analyses.

    Science.gov (United States)

    Leng, Xiang'zi; Wang, Jinhua; Ji, Haibo; Wang, Qin'geng; Li, Huiming; Qian, Xin; Li, Fengying; Yang, Meng

    2017-08-01

    Size-fractionated heavy metal concentrations were observed in airborne particulate matter (PM) samples collected from 2014 to 2015 (spanning all four seasons) from suburban (Xianlin) and industrial (Pukou) areas in Nanjing, a megacity of southeast China. Rapid prediction models of size-fractionated metals were established based on multiple linear regression (MLR), back-propagation artificial neural network (BP-ANN) and support vector machine (SVM) approaches, using meteorological factors and PM concentrations as input parameters. About 38% and 77% of PM2.5 concentrations in Xianlin and Pukou, respectively, were beyond the Chinese National Ambient Air Quality Standard limit of 75 μg/m³. Nearly all elements had higher concentrations in industrial areas, and in winter among the four seasons. Anthropogenic elements such as Pb, Zn, Cd and Cu showed larger percentages in the fine fraction (ø ≤ 2.5 μm), whereas the crustal elements including Al, Ba, Fe, Ni, Sr and Ti showed larger percentages in the coarse fraction (ø > 2.5 μm). SVM showed a higher training correlation coefficient (R) and lower mean absolute error (MAE) and root mean square error (RMSE) than MLR and BP-ANN for most metals. All three methods showed better prediction results for Ni, Al, V, Cd and As, whereas results for Cr and Fe were relatively poor. The daily airborne metal concentrations in 2015 were then predicted by the fully trained SVM models; the results showed that the heaviest pollution of airborne heavy metals occurred in December and January, whereas the lightest occurred in June and July. Copyright © 2017 Elsevier Ltd. All rights reserved.
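
    A minimal sketch of an SVM-based prediction model of the kind described above, with standardized meteorological and PM inputs and the three reported error metrics (R, MAE, RMSE); the input variables and data are hypothetical placeholders rather than the Nanjing measurements.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error

# Hypothetical inputs: PM concentration plus meteorological factors
# (temperature, relative humidity, wind speed); target: one size-fractionated metal (e.g., Pb).
rng = np.random.default_rng(5)
X = rng.normal(size=(400, 4))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, 400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
svm = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1)).fit(X_tr, y_tr)

pred = svm.predict(X_te)
print("R   :", np.corrcoef(y_te, pred)[0, 1])
print("MAE :", mean_absolute_error(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
```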

  9. Can the lifetime of the superheater tubes be predicted according to the fuel analyses? Assessment from field and laboratory data

    Energy Technology Data Exchange (ETDEWEB)

    Salmenoja, K. [Kvaerner Pulping Oy, Tampere (Finland)

    1998-12-31

    The lifetime of superheaters in different power boilers is still more or less a mystery. This is especially true when firing biomass-based fuels (biofuels), such as bark, forest residues, and straw. Due to the inhomogeneous nature of biofuels, the lifetime of the superheaters may vary from case to case. Sometimes the lifetime is significantly shorter than originally expected; sometimes no corrosion is observed even in the hottest tubes. This is one of the main reasons why boiler operators often demand better predictability of the corrosion resistance of the materials to avoid unscheduled shutdowns. (orig.) 9 refs.

  10. Can the lifetime of the superheater tubes be predicted according to the fuel analyses? Assessment from field and laboratory data

    Energy Technology Data Exchange (ETDEWEB)

    Salmenoja, K [Kvaerner Pulping Oy, Tampere (Finland)

    1999-12-31

    The lifetime of superheaters in different power boilers is still more or less a mystery. This is especially true when firing biomass-based fuels (biofuels), such as bark, forest residues, and straw. Due to the inhomogeneous nature of biofuels, the lifetime of the superheaters may vary from case to case. Sometimes the lifetime is significantly shorter than originally expected; sometimes no corrosion is observed even in the hottest tubes. This is one of the main reasons why boiler operators often demand better predictability of the corrosion resistance of the materials to avoid unscheduled shutdowns. (orig.) 9 refs.

  11. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created using meteorological data from hourly analyses of the Advanced Regional Prediction System (ARPS) and the Rapid Update Cycle (RUC), together with GOES water vapor channel measurements and surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
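
    A hedged sketch of a logistic contrail-occurrence model in the spirit of the approach above: synthetic temperature, ice-relative humidity and vertical-velocity predictors are fit with logistic regression, and percent correct is evaluated at the two probability thresholds discussed in the companion Part I study (0.5 and the climatological frequency). The predictor set and data are assumptions, not the SURFACE or OUTBREAK training data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical predictors extracted from weather analyses at flight level:
# temperature (K), relative humidity w.r.t. ice (%), and vertical velocity (Pa/s).
rng = np.random.default_rng(6)
n = 3000
T = rng.normal(225, 8, n)
RHi = rng.uniform(20, 130, n)
w = rng.normal(0, 0.2, n)
# Synthetic "truth": persistent contrails favoured by cold, ice-supersaturated air.
p = 1 / (1 + np.exp(-(0.08 * (RHi - 100) - 0.15 * (T - 225))))
y = rng.random(n) < p

X = np.column_stack([T, RHi, w])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

prob = model.predict_proba(X_te)[:, 1]
for threshold in (0.5, y_tr.mean()):          # 0.5 vs. climatological frequency
    pc = ((prob >= threshold) == y_te).mean()
    print(f"threshold {threshold:.2f}: percent correct = {100 * pc:.1f}%")
```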

  12. Reduced CBF recovery detected by longitudinal 3D-SSP SPECT analyses predicts outcome of postoperative patients after subarachnoid haemorrhage.

    Science.gov (United States)

    Mutoh, Tatsushi; Totsune, Tomoko; Takenaka, Shunsuke; Tatewaki, Yasuko; Nakagawa, Manabu; Suarez, Jose I; Taki, Yasuyuki; Ishikawa, Tatsuya

    2018-02-01

    The aim of this study was to evaluate the impact of cerebral blood flow (CBF) recovery, obtained from brain single-photon emission computed tomography (SPECT) images, on postoperative outcome after aneurysmal subarachnoid haemorrhage (SAH). Twenty-nine patients who had undergone surgical clipping for ruptured anterior communicating artery aneurysms were analyzed prospectively. Routine measurements of CBF were performed using technetium-99m hexamethylpropyleneamine oxime SPECT on days 4 and 14 after SAH. Regional voxel data analyzed by three-dimensional stereotactic surface projection (3D-SSP) were compared between patients and an age-matched normal database (NDB). In the 3D-SSP analysis of all patients, cortical hypoperfusion around the surgical site in bilateral frontal lobes was evident on day 4 (P SSP SPECT image analyses can be a potential predictor of poor prognosis in postoperative patients after SAH. © 2017 John Wiley & Sons Australia, Ltd.

  13. Evaluation of multivariate statistical analyses for monitoring and prediction of processes in a seawater reverse osmosis desalination plant

    International Nuclear Information System (INIS)

    Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo

    2015-01-01

    Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore-controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultrafiltration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably, with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters (feed flow rate, PCF/SF-UF filtrate flow rate, temperature of feed water, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure) had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.
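
    The PLS step described above can be sketched as a two-output partial least squares regression with the seven listed inputs, reporting an RMSE per output; the synthetic data and the choice of three latent components are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Hypothetical plant data: 7 inputs (feed flow, filtrate flow, feed temperature,
# feed turbidity, pH, RO flow, pressure) and 2 outputs (permeate flow rate, concentration).
rng = np.random.default_rng(7)
X = rng.normal(size=(500, 7))
Y = np.column_stack([X @ rng.normal(size=7) + rng.normal(0, 1, 500),
                     X @ rng.normal(size=7) + rng.normal(0, 1, 500)])

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, Y_tr)

pred = pls.predict(X_te)
for i, name in enumerate(["permeate flow rate", "permeate concentration"]):
    rmse = mean_squared_error(Y_te[:, i], pred[:, i]) ** 0.5
    print(f"RMSE for {name}: {rmse:.2f}")

# The regression coefficients indicate which of the 7 inputs drive each output.
print(np.round(pls.coef_, 2))
```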

  14. Evaluation of multivariate statistical analyses for monitoring and prediction of processes in a seawater reverse osmosis desalination plant

    Energy Technology Data Exchange (ETDEWEB)

    Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo [Kyung Hee University, Yongin (Korea, Republic of)

    2015-08-15

    Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore-controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultrafiltration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably, with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters (feed flow rate, PCF/SF-UF filtrate flow rate, temperature of feed water, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure) had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.

  15. Quasi-laminar stability and sensitivity analyses for turbulent flows: Prediction of low-frequency unsteadiness and passive control

    Science.gov (United States)

    Mettot, Clément; Sipp, Denis; Bézard, Hervé

    2014-04-01

    This article presents a quasi-laminar stability approach to identify the dominant low frequencies in high-Reynolds-number flows and to design passive control means to shift these frequencies. The approach is based on a global linear stability analysis of mean-flows, which correspond to the time-average of the unsteady flows. Contrary to the previous work by Meliga et al. ["Sensitivity of 2-D turbulent flow past a D-shaped cylinder using global stability," Phys. Fluids 24, 061701 (2012)], we use the linearized Navier-Stokes equations based solely on the molecular viscosity (leaving aside any turbulence model and any eddy viscosity) to extract the least stable direct and adjoint global modes of the flow. Then, we compute the frequency sensitivity maps of these modes, so as to predict beforehand where a small control cylinder optimally shifts the frequency of the flow. In the case of the D-shaped cylinder studied by Parezanović and Cadot [J. Fluid Mech. 693, 115 (2012)], we show that the present approach captures the frequency of the flow well and recovers accurately the frequency control maps obtained experimentally. The results are close to those already obtained by Meliga et al., who used a more complex approach in which turbulence models played a central role. The present approach is simpler and may be applied to a broader range of flows since it is tractable as soon as mean-flows — which can be obtained either numerically from simulations (Direct Numerical Simulation (DNS), Large Eddy Simulation (LES), unsteady Reynolds-Averaged-Navier-Stokes (RANS), steady RANS) or from experimental measurements (Particle Image Velocimetry - PIV) — are available. We also discuss how the influence of the control cylinder on the mean-flow may be more accurately predicted by determining an eddy-viscosity from numerical simulations or experimental measurements. From a technical point of view, we finally show how an existing compressible numerical simulation code may be used in

  16. The GENOTEND chip: a new tool to analyse gene expression in muscles of beef cattle for beef quality prediction.

    Science.gov (United States)

    Hocquette, Jean-Francois; Bernard-Capel, Carine; Vidal, Veronique; Jesson, Beline; Levéziel, Hubert; Renand, Gilles; Cassar-Malek, Isabelle

    2012-08-15

    bulls slaughtered in year 2, and in the 21 Charolais steers slaughtered in year 1, but not in the group of 19 steers slaughtered in year 2 which differ from the reference group by two factors (gender and year). When the first three groups of animals were analysed together, this subset of genes explained a 4-fold higher proportion of the variability in tenderness than muscle biochemical traits. This study underlined the relevance of the GENOTEND chip to identify markers of beef quality, mainly by confirming previous results and by detecting other genes of the heat shock family as potential markers of beef quality. However, it was not always possible to extrapolate the relevance of these markers to all animal groups which differ by several factors (such as gender or environmental conditions of production) from the initial population of reference in which these markers were identified.

  17. The GENOTEND chip: a new tool to analyse gene expression in muscles of beef cattle for beef quality prediction

    Directory of Open Access Journals (Sweden)

    Hocquette Jean-Francois

    2012-08-01

    validated in the groups of 30 Charolais young bulls slaughtered in year 2, and in the 21 Charolais steers slaughtered in year 1, but not in the group of 19 steers slaughtered in year 2, which differs from the reference group by two factors (gender and year). When the first three groups of animals were analysed together, this subset of genes explained a 4-fold higher proportion of the variability in tenderness than muscle biochemical traits. Conclusion: This study underlined the relevance of the GENOTEND chip to identify markers of beef quality, mainly by confirming previous results and by detecting other genes of the heat shock family as potential markers of beef quality. However, it was not always possible to extrapolate the relevance of these markers to all animal groups which differ by several factors (such as gender or environmental conditions of production) from the initial population of reference in which these markers were identified.

  18. Factors predicting the development of pressure ulcers in an at-risk population who receive standardized preventive care: secondary analyses of a multicentre randomised controlled trial.

    Science.gov (United States)

    Demarre, Liesbet; Verhaeghe, Sofie; Van Hecke, Ann; Clays, Els; Grypdonck, Maria; Beeckman, Dimitri

    2015-02-01

    To identify predictive factors associated with the development of pressure ulcers in patients at risk who receive standardized preventive care. Numerous studies have examined factors that predict risk for pressure ulcer development. Only a few studies identified risk factors associated with pressure ulcer development in hospitalized patients receiving standardized preventive care. Secondary analyses of data collected in a multicentre randomized controlled trial. The sample consisted of 610 consecutive patients at risk for pressure ulcer development (Braden Score Pressure ulcers in category II-IV were significantly associated with non-blanchable erythema, urogenital disorders and higher body temperature. Predictive factors significantly associated with superficial pressure ulcers were admission to an internal medicine ward, incontinence-associated dermatitis, non-blanchable erythema and a lower Braden score. Superficial sacral pressure ulcers were significantly associated with incontinence-associated dermatitis. Despite the standardized preventive measures they received, hospitalized patients with non-blanchable erythema, urogenital disorders and a higher body temperature were at increased risk for developing pressure ulcers. Improved identification of at-risk patients can be achieved by taking into account specific predictive factors. Even if preventive measures are in place, continuous assessment and tailoring of interventions is necessary in all patients at risk. Daily skin observation can be used to continuously monitor the effectiveness of the intervention. © 2014 John Wiley & Sons Ltd.

  19. Taxometric analyses and predictive accuracy of callous-unemotional traits regarding quality of life and behavior problems in non-conduct disorder diagnoses.

    Science.gov (United States)

    Herpers, Pierre C M; Klip, Helen; Rommelse, Nanda N J; Taylor, Mark J; Greven, Corina U; Buitelaar, Jan K

    2017-07-01

    Callous-unemotional (CU) traits have mainly been studied in relation to conduct disorder (CD), but can also occur in other disorder groups. However, it is unclear whether there is a clinically relevant cut-off value of levels of CU traits in predicting reduced quality of life (QoL) and clinical symptoms, and whether CU traits better fit a categorical (taxonic) or dimensional model. Parents of 979 youths referred to a child and adolescent psychiatric clinic rated their child's CU traits on the Inventory of Callous-Unemotional traits (ICU), QoL on the Kidscreen-27, and clinical symptoms on the Child Behavior Checklist. Experienced clinicians conferred DSM-IV-TR diagnoses of ADHD, ASD, anxiety/mood disorders and DBD-NOS/ODD. The ICU was also used to score the DSM-5 specifier 'with limited prosocial emotions' (LPE) of Conduct Disorder. Receiver operating characteristic (ROC) analyses revealed that the predictive accuracy of the ICU and LPE regarding QoL and clinical symptoms was poor to fair, and similar across diagnoses. A clinical cut-off point could not be defined. Taxometric analyses suggested that callous-unemotional traits on the ICU best reflect a dimension rather than taxon. More research is needed on the impact of CU traits on the functional adaptation, course, and response to treatment of non-CD conditions. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  20. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part I: Effects of Random Error

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
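
    The two accuracy measures named above are computed from a 2x2 forecast-observation contingency table. A minimal sketch with made-up counts:

```python
# Illustrative 2x2 contingency table of forecast vs. observed persistent contrails
# (a = hits, b = false alarms, c = misses, d = correct negatives); numbers are made up.
a, b, c, d = 420, 90, 70, 4420
n = a + b + c + d

percent_correct = 100 * (a + d) / n
# Hanssen-Kuipers discriminant = hit rate - false alarm rate.
hkd = a / (a + c) - b / (b + d)

print(f"PC  = {percent_correct:.1f}%")
print(f"HKD = {hkd:.3f}")
```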

  1. Can adverse maternal and perinatal outcomes be predicted when blood pressure becomes elevated? Secondary analyses from the CHIPS (Control of Hypertension In Pregnancy Study) randomized controlled trial.

    Science.gov (United States)

    Magee, Laura A; von Dadelszen, Peter; Singer, Joel; Lee, Terry; Rey, Evelyne; Ross, Susan; Asztalos, Elizabeth; Murphy, Kellie E; Menzies, Jennifer; Sanchez, Johanna; Gafni, Amiram; Gruslin, Andrée; Helewa, Michael; Hutton, Eileen; Lee, Shoo K; Logan, Alexander G; Ganzevoort, Wessel; Welch, Ross; Thornton, Jim G; Moutquin, Jean Marie

    2016-07-01

    For women with chronic or gestational hypertension in CHIPS (Control of Hypertension In Pregnancy Study, NCT01192412), we aimed to examine whether clinical predictors collected at randomization could predict adverse outcomes. This was a planned, secondary analysis of data from the 987 women in the CHIPS Trial. Logistic regression was used to examine the impact of 19 candidate predictors on the probability of adverse perinatal (pregnancy loss or high level neonatal care for >48 h, or birthweight hypertension, preeclampsia, or delivery at blood pressure within 1 week before randomization. Continuous variables were represented continuously or dichotomized based on the smaller p-value in univariate analyses. An area-under-the-receiver-operating-curve (AUC ROC) of ≥0.70 was taken to reflect a potentially useful model. Point estimates for AUC ROC were hypertension (0.70, 95% CI 0.67-0.74) and delivery at hypertension develop an elevated blood pressure in pregnancy, or formerly normotensive women develop new gestational hypertension, maternal and current pregnancy clinical characteristics cannot predict adverse outcomes in the index pregnancy. © 2016 The Authors. Acta Obstetricia et Gynecologica Scandinavica published by John Wiley & Sons Ltd on behalf of Nordic Federation of Societies of Obstetrics and Gynecology (NFOG).

  2. Computational fluid dynamics analyses of lateral heat conduction, coolant azimuthal mixing and heat transfer predictions in a BR2 fuel assembly geometry

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Dionne, B.

    2011-01-01

    To support the analyses related to the conversion of the BR2 core from highly-enriched (HEU) to low-enriched (LEU) fuel, the thermal-hydraulics codes PLTEMP and RELAP-3D are used to evaluate the safety margins during steady-state operation (PLTEMP), as well as after a loss-of-flow, loss-of-pressure, or loss-of-coolant event (RELAP). In the 1-D PLTEMP and RELAP simulations, conduction in the azimuthal and axial directions is not accounted for. The very good thermal conductivity of the cladding and the fuel meat, together with significant temperature gradients in the lateral (axial and azimuthal) directions, could lead to a heat flux distribution that is significantly different from the power distribution. To evaluate the significance of the lateral heat conduction, 3-D computational fluid dynamics (CFD) simulations, using the CFD code STAR-CD, were performed. Safety margin calculations are typically performed for a hot stripe, i.e., an azimuthal region of the fuel plates/coolant channel containing the power peak. In a RELAP model, for example, a channel between two plates could be divided into a number of RELAP channels (stripes) in the azimuthal direction. In a PLTEMP model, the effect of azimuthal power peaking could be taken into account by using engineering factors. However, if the thermal mixing in the azimuthal direction of a coolant channel is significant, a striping approach could be overly conservative by not taking this mixing into account. STAR-CD simulations were also performed to study the thermal mixing in the coolant. Section II of this document presents the results of the analyses of the lateral heat conduction and azimuthal thermal mixing in a coolant channel. Finally, PLTEMP and RELAP simulations rely on the use of correlations to determine heat transfer coefficients. Previous analyses showed that the Dittus-Boelter correlation gives significantly more conservative (lower) predictions than the correlations of Sieder-Tate and Petukhov. STAR-CD 3-D

  3. Prothrombin time is predictive of low plasma prothrombin concentration and clinical outcome in patients with trauma hemorrhage: analyses of prospective observational cohort studies.

    Science.gov (United States)

    Balendran, Clare A; Lövgren, Ann; Hansson, Kenny M; Nelander, Karin; Olsson, Marita; Johansson, Karin J; Brohi, Karim; Fries, Dietmar; Berggren, Anders

    2017-03-14

    Fibrinogen and prothrombin have been suggested to become rate-limiting in trauma-associated coagulopathy. Administration of fibrinogen is now recommended; however, the importance of prothrombin to patient outcome is unknown. We have utilized two trauma patient databases (database 1, n = 358, and database 2, n = 331) to investigate the relationship of plasma prothrombin concentration to clinical outcome and coagulation status. Database 1 was used to assess the relationship of plasma prothrombin to administered packed red blood cells (PRBC), clinical outcome, and coagulation biomarkers (prothrombin time (PT), ROTEM EXTEM coagulation time (CT) and maximum clot firmness (MCF)). ROC analyses were performed to investigate the ability of admission coagulation biomarkers to predict low prothrombin concentration (database 1), massive transfusion, and 24 h mortality (databases 1 and 2). The importance of prothrombin was further investigated in vitro by PT and ROTEM assays in the presence of a prothrombin-neutralizing monoclonal antibody and following step-wise dilution. Patients who survived the first 24 h had higher admission prothrombin levels than those who died (94 vs. 67 IU/dL). Patients with lower transfusion requirements within the first 24 h (≤10 units of PRBCs) also had higher admission prothrombin levels than patients with massive transfusion demands (>10 units of PRBCs) (95 vs. 62 IU/dL). Admission PT, in comparison to admission ROTEM EXTEM CT and MCF, was found to be a better predictor of prothrombin concentration <60 IU/dL (AUC 0.94 in database 1), of massive transfusion (AUC 0.92 and 0.81 in databases 1 and 2, respectively) and of 24 h mortality (AUC 0.90 and 0.78 in databases 1 and 2, respectively). In vitro experiments supported a critical role for prothrombin in coagulation and demonstrated that PT and ROTEM EXTEM CT are sensitive methods to measure low prothrombin concentration. Our analyses suggest that prothrombin concentration
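
    The ROC analyses described above reduce to computing an area under the curve for a continuous admission biomarker against a binary outcome. The sketch below does this for a hypothetical prothrombin time distribution (values and group sizes are invented, not taken from the trauma databases) and picks a candidate cut-off by Youden's J.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical admission data: prothrombin time (s) and whether the patient
# went on to require massive transfusion (>10 units PRBC in 24 h).
rng = np.random.default_rng(8)
pt_seconds = np.concatenate([rng.normal(13, 1.5, 300),    # no massive transfusion
                             rng.normal(18, 3.0, 58)])    # massive transfusion
massive_tx = np.concatenate([np.zeros(300), np.ones(58)])

auc = roc_auc_score(massive_tx, pt_seconds)      # higher PT predicts the event
print(f"AUC for admission PT predicting massive transfusion: {auc:.2f}")

# Youden's J picks a candidate decision threshold from the ROC curve.
fpr, tpr, thresholds = roc_curve(massive_tx, pt_seconds)
best = np.argmax(tpr - fpr)
print(f"suggested PT cut-off: {thresholds[best]:.1f} s")
```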

  4. Genetically Predicted Body Mass Index and Breast Cancer Risk: Mendelian Randomization Analyses of Data from 145,000 Women of European Descent.

    Directory of Open Access Journals (Sweden)

    Yan Guo

    2016-08-01

    Observational epidemiological studies have shown that high body mass index (BMI) is associated with a reduced risk of breast cancer in premenopausal women but an increased risk in postmenopausal women. It is unclear whether this association is mediated through shared genetic or environmental factors. We applied Mendelian randomization to evaluate the association between BMI and risk of breast cancer occurrence using data from two large breast cancer consortia. We created a weighted BMI genetic score comprising 84 BMI-associated genetic variants to predict BMI. We evaluated genetically predicted BMI in association with breast cancer risk using individual-level data from the Breast Cancer Association Consortium (BCAC; cases = 46,325, controls = 42,482). We further evaluated the association between genetically predicted BMI and breast cancer risk using summary statistics from 16,003 cases and 41,335 controls from the Discovery, Biology, and Risk of Inherited Variants in Breast Cancer (DRIVE) Project. Because most studies measured BMI after cancer diagnosis, we could not conduct a parallel analysis to adequately evaluate the association of measured BMI with breast cancer risk prospectively. In the BCAC data, genetically predicted BMI was found to be inversely associated with breast cancer risk (odds ratio [OR] = 0.65 per 5 kg/m² increase, 95% confidence interval [CI]: 0.56-0.75, p = 3.32 × 10⁻¹⁰). The associations were similar for both premenopausal (OR = 0.44, 95% CI: 0.31-0.62, p = 9.91 × 10⁻⁸) and postmenopausal breast cancer (OR = 0.57, 95% CI: 0.46-0.71, p = 1.88 × 10⁻⁸). This association was replicated in the data from the DRIVE consortium (OR = 0.72, 95% CI: 0.60-0.84, p = 1.64 × 10⁻⁷). Single-marker analyses identified 17 of the 84 BMI-associated single nucleotide polymorphisms (SNPs) in association with breast cancer risk at p < 0.05; for 16 of them, the
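
    A minimal sketch of the weighted genetic score approach described above: per-SNP dosages are combined with published effect-size weights into a score that serves as the genetically predicted exposure, which is then related to case-control status by logistic regression. The dosages, weights and outcome here are simulated placeholders, and the sketch omits the summary-statistic methods used for the DRIVE replication.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical genotype dosages (0/1/2) for 84 BMI-associated SNPs, stand-in per-allele
# BMI effect sizes (weights), and case-control status; illustrative only.
rng = np.random.default_rng(9)
n, n_snps = 5000, 84
dosages = rng.binomial(2, 0.3, size=(n, n_snps)).astype(float)
betas = rng.normal(0.03, 0.01, n_snps)           # stand-ins for published BMI weights

# Weighted genetic score used as the instrument for BMI.
score = dosages @ betas

# Logistic regression of breast cancer status on the genetically predicted exposure.
case = rng.binomial(1, 0.4, n)
fit = sm.Logit(case, sm.add_constant(score)).fit(disp=0)
print("OR per unit of genetic score:", np.exp(fit.params[1]))
print("95% CI:", np.exp(fit.conf_int()[1]))
```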

  5. Development of temporal modelling for forecasting and prediction of malaria infections using time-series and ARIMAX analyses: a case study in endemic districts of Bhutan.

    Science.gov (United States)

    Wangdi, Kinley; Singhasivanon, Pratap; Silawan, Tassanee; Lawpoolsri, Saranath; White, Nicholas J; Kaewkungwal, Jaranit

    2010-09-03

    Malaria still remains a public health problem in some districts of Bhutan despite a marked reduction of cases in the last few years. To strengthen the country's prevention and control measures, this study was carried out to develop forecasting and prediction models of malaria incidence in the endemic districts of Bhutan using time series and ARIMAX modelling. The study was carried out retrospectively using the monthly malaria cases reported by health centres to the Vector-borne Disease Control Programme (VDCP) and the meteorological data from the Meteorological Unit, Department of Energy, Ministry of Economic Affairs. Time series analysis was performed on monthly malaria cases, from 1994 to 2008, in seven malaria-endemic districts. Time series models derived from a multiplicative seasonal autoregressive integrated moving average (ARIMA) were deployed to identify the best model using data from 1994 to 2006. The best-fit model was selected for each individual district and for the overall endemic area, and the monthly cases from January to December of 2009 and 2010 were forecast. In developing the prediction model, the monthly reported malaria cases and the meteorological factors from 1996 to 2008 for the seven districts were analysed. ARIMAX modelling was employed to determine predictors of malaria in the subsequent month. It was found that the ARIMA (p, d, q) (P, D, Q)s model (with p and P representing the autoregressive and seasonal autoregressive terms; d and D the non-seasonal and seasonal differencing; q and Q the moving average and seasonal moving average parameters, respectively; and s the length of the seasonal period) for the overall endemic districts was (2,1,1)(0,1,1)12; the modelling data from each district revealed two most common ARIMA models, (2,1,1)(0,1,1)12 and (1,1,1)(0,1,1)12. The forecasted monthly malaria cases from January to December 2009 and 2010 varied from 15 to 82 cases in 2009
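
    A sketch of seasonal ARIMA modelling with an exogenous regressor, in the spirit of the ARIMAX approach above, using statsmodels and the reported overall model order (2,1,1)(0,1,1)12; the monthly case counts and the rainfall covariate are simulated stand-ins for the VDCP and meteorological records.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly malaria counts (1994-2008) with a rainfall covariate;
# replace with the actual case series and meteorological records.
rng = np.random.default_rng(10)
idx = pd.date_range("1994-01", periods=180, freq="MS")
rain = 100 + 80 * np.sin(2 * np.pi * (np.arange(180) % 12) / 12) + rng.normal(0, 10, 180)
cases = np.maximum(0, 30 + 0.2 * rain + rng.normal(0, 8, 180)).round()
y = pd.Series(cases, index=idx)
X = pd.Series(rain, index=idx, name="rainfall")

# ARIMA(2,1,1)(0,1,1)[12] with an exogenous regressor.
model = SARIMAX(y, exog=X, order=(2, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)

# Forecast 24 months ahead, supplying assumed future rainfall values.
future_rain = pd.Series(rain[-24:], index=pd.date_range("2009-01", periods=24, freq="MS"))
forecast = model.get_forecast(steps=24, exog=future_rain)
print(forecast.predicted_mean.head())
```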

  6. The Groningen Longitudinal Glaucoma Study III. The predictive value of frequency-doubling perimetry and GDx nerve fibre analyser test results for the development of glaucomatous visual field loss

    NARCIS (Netherlands)

    Heeg, G. P.; Jansonius, N. M.

    Purpose: To investigate whether frequency-doubling perimetry (FDT) and nerve fibre analyser (GDx) test results are able to predict glaucomatous visual field loss in glaucoma suspect patients. Methods: A large cohort of glaucoma suspect patients (patients with ocular hypertension or a positive family

  7. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior

    OpenAIRE

    Hagger, Martin; Chan, Derwin K. C.; Protogerou, Cleo; Chatzisarantis, Nikos L. D.

    2016-01-01

    Objective Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs fr...

  8. Application of pathways analyses for site performance prediction for the Gas Centrifuge Enrichment Plant and Oak Ridge Central Waste Disposal Facility

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.

    1984-01-01

    The suitability of the Gas Centrifuge Enrichment Plant and the Oak Ridge Central Waste Disposal Facility for shallow-land burial of low-level radioactive waste is evaluated using pathways analyses. The analyses rely on conservative scenarios to describe the generation and migration of contamination and the potential human exposure to the waste. Conceptual and numerical models are developed using data from comprehensive laboratory and field investigations and are used to simulate the long-term transport of contamination to man. Conservatism is built into the analyses when assumptions concerning future events have to be made or when uncertainties concerning site or waste characteristics exist. Maximum potential doses to man are calculated and compared to the appropriate standards. The sites are found to provide adequate buffer to persons outside the DOE reservations. Conclusions concerning site capacity and site acceptability are drawn. In reaching these conclusions, some consideration is given to the uncertainties and conservatisms involved in the analyses. Analytical methods to quantitatively assess the probability of future events to occur and the sensitivity of the results to data uncertainty may prove useful in relaxing some of the conservatism built into the analyses. The applicability of such methods to pathways analyses is briefly discussed. 18 refs., 9 figs

  9. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. In silico and cell-based analyses reveal strong divergence between prediction and observation of T-cell-recognized tumor antigen T-cell epitopes.

    Science.gov (United States)

    Schmidt, Julien; Guillaume, Philippe; Dojcinovic, Danijel; Karbach, Julia; Coukos, George; Luescher, Immanuel

    2017-07-14

    Tumor exomes provide comprehensive information on mutated, overexpressed genes and aberrant splicing, which can be exploited for personalized cancer immunotherapy. Of particular interest are mutated tumor antigen T-cell epitopes, because neoepitope-specific T cells often are tumoricidal. However, identifying tumor-specific T-cell epitopes is a major challenge. A widely used strategy relies on initial prediction of human leukocyte antigen-binding peptides by in silico algorithms, but the predictive power of this approach is unclear. Here, we used the human tumor antigen NY-ESO-1 (ESO) and the human leukocyte antigen variant HLA-A*0201 (A2) as a model and predicted in silico the 41 highest-affinity, A2-binding 8-11-mer peptides and assessed their binding, kinetic complex stability, and immunogenicity in A2-transgenic mice and on peripheral blood mononuclear cells from ESO-vaccinated melanoma patients. We found that 19 of the peptides strongly bound to A2, 10 of which formed stable A2-peptide complexes and induced CD8+ T cells in A2-transgenic mice. However, only 5 of the peptides induced cognate T cells in humans; these peptides exhibited strong binding and complex stability and contained multiple large hydrophobic and aromatic amino acids. These results were not predicted by in silico algorithms and provide new clues to improving T-cell epitope identification. In conclusion, our findings indicate that only a small fraction of in silico-predicted A2-binding ESO peptides are immunogenic in humans, namely those that have high peptide-binding strength and complex stability. This observation highlights the need for improving in silico predictions of peptide immunogenicity. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  11. Molecular and clinical analyses of Greig cephalopolysyndactyly and Pallister-Hall syndromes: Robust phenotype prediction from the type and position of GLI3 mutations

    NARCIS (Netherlands)

    Johnston, Jennifer J.; Olivos-Glander, Isabelle; Killoran, Christina; Elson, Emma; Turner, Joyce T.; Peters, Kathryn F.; Abbott, Margaret H.; Aughton, David J.; Aylsworth, Arthur S.; Bamshad, Michael J.; Booth, Carol; Curry, Cynthia J.; David, Albert; Dinulos, Mary Beth; Flannery, David B.; Fox, Michelle A.; Graham, John M.; Grange, Dorothy K.; Guttmacher, Alan E.; Hannibal, Mark C.; Henn, Wolfram; Hennekam, Raoul C. M.; Holmes, Lewis B.; Hoyme, H. Eugene; Leppig, Kathleen A.; Lin, Angela E.; Macleod, Patrick; Manchester, David K.; Marcelis, Carlo; Mazzanti, Laura; McCann, Emma; McDonald, Marie T.; Mendelsohn, Nancy J.; Moeschler, John B.; Moghaddam, Billur; Neri, Giovanni; Newbury-Ecob, Ruth; Pagon, Roberta A.; Phillips, John A.; Sadler, Laurie S.; Stoler, Joan M.; Tilstra, David; Walsh Vockley, Catherine M.; Zackai, Elaine H.; Zadeh, Touran M.; Brueton, Louise; Black, Graeme Charles M.; Biesecker, Leslie G.

    2005-01-01

    Mutations in the GLI3 zinc-finger transcription factor gene cause Greig cephalopolysyndactyly syndrome (GCPS) and Pallister-Hall syndrome (PHS), which are variable but distinct clinical entities. We hypothesized that GLI3 mutations that predict a truncated functional repressor protein cause PHS and

  12. Can adverse maternal and perinatal outcomes be predicted when blood pressure becomes elevated? Secondary analyses from the CHIPS (Control of Hypertension In Pregnancy Study) randomized controlled trial

    NARCIS (Netherlands)

    Magee, Laura A.; von Dadelszen, Peter; Singer, Joel; Lee, Terry; Rey, Evelyne; Ross, Susan; Asztalos, Elizabeth; Murphy, Kellie E.; Menzies, Jennifer; Sanchez, Johanna; Gafni, Amiram; Gruslin, Andrée; Helewa, Michael; Hutton, Eileen; Lee, Shoo K.; Logan, Alexander G.; Ganzevoort, Wessel; Welch, Ross; Thornton, Jim G.; Moutquin, Jean Marie

    2016-01-01

    Introduction. For women with chronic or gestational hypertension in CHIPS (Control of Hypertension In Pregnancy Study, NCT01192412), we aimed to examine whether clinical predictors collected at randomization could predict adverse outcomes. Material and methods. This was a planned, secondary analysis

  13. Taxometric analyses and predictive accuracy of callous-unemotional traits regarding quality of life and behavior problems in non-conduct disorder diagnoses

    NARCIS (Netherlands)

    Herpers, P.C.M.; Klip, H.; Rommelse, N.N.J.; Taylor, M.J.; Greven, C.U.; Buitelaar, J.K.

    2017-01-01

    Callous-unemotional (CU) traits have mainly been studied in relation to conduct disorder (CD), but can also occur in other disorder groups. However, it is unclear whether there is a clinically relevant cut-off value of levels of CU traits in predicting reduced quality of life (QoL) and clinical

  14. Development of time-trend model for analysing and predicting case pattern of dog bite injury induced rabies-like-illness in Liberia, 2014-2017.

    Science.gov (United States)

    Jomah, N D; Ojo, J F; Odigie, E A; Olugasa, B O

    2014-12-01

    The post-civil war records of dog bite injuries (DBI) and rabies-like-illness (RLI) among humans in Liberia are a vital epidemiological resource for developing a predictive model to guide the allocation of resources towards human rabies control. Whereas DBI and RLI are high, they are largely under-reported. The objective of this study was to develop a time model of the case pattern and apply it to derive predictors of the time-trend point distribution of DBI-RLI cases. Retrospective data covering six years of DBI distribution among humans countrywide were converted to quarterly series using the Minimizing Squared First Difference transformation technique. The generated dataset was used to train a time-trend model of the DBI-RLI syndrome in Liberia. An additive deterministic time-trend model was selected owing to its performance compared with a multiplicative model of trend and seasonal movement. Parameters were estimated by the least squares method to predict DBI cases for a prospective 4-year period covering 2014-2017. The two-stage predictive model of the DBI case pattern between 2014 and 2017 was characterised by a uniform upward trend within Liberia's coastal and hinterland counties over the forecast period. This paper describes a translational application of the time-trend distribution pattern of the DBI epidemics, 2008-2013, reported in Liberia, on which a predictive model was developed. A computationally feasible two-stage time-trend permutation approach is proposed to estimate the time-trend parameters and conduct predictive inference on DBI-RLI in Liberia.
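
    As a rough illustration of the additive trend-plus-seasonality idea described above, the sketch below fits an intercept, a linear trend and quarterly seasonal terms by ordinary least squares and extrapolates them over a four-year horizon. The quarterly case counts are hypothetical placeholders, and the simple dummy-variable formulation is an assumption rather than the authors' exact two-stage procedure.

      # Minimal sketch: additive deterministic trend + quarterly seasonality fitted
      # by least squares, in the spirit of the time-trend model described above.
      # The quarterly case counts below are hypothetical placeholders.
      import numpy as np

      cases = np.array([12, 15, 11, 18, 16, 20, 14, 22,
                        19, 24, 17, 26, 23, 28, 21, 30], dtype=float)
      t = np.arange(len(cases))
      quarter = t % 4

      # Design matrix: intercept, linear trend, and three quarterly dummies
      X = np.column_stack([np.ones_like(t), t] +
                          [(quarter == q).astype(float) for q in (1, 2, 3)])
      coef, *_ = np.linalg.lstsq(X, cases, rcond=None)

      # Forecast the next 16 quarters (a 4-year horizon) by extrapolating the model
      t_future = np.arange(len(cases), len(cases) + 16)
      Xf = np.column_stack([np.ones_like(t_future), t_future] +
                           [((t_future % 4) == q).astype(float) for q in (1, 2, 3)])
      forecast = Xf @ coef
      print(forecast.round(1))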

  15. A 20mK temperature sensor

    International Nuclear Information System (INIS)

    Wang, N.; Sadoulet, B.; Shutt, T.

    1987-11-01

    We are developing a 20 mK temperature sensor made of neutron transmutation doped (NTD) germanium for use as a phonon detector in a dark matter search. We find that NTD germanium thermistors around 20 mK have resistances which are a strong function of temperature, and have sufficient sensitivity to eventually reach a baseline rms energy fluctuation of 6 eV at 25 mK. Further work is needed to understand the extreme sensitivity of the thermistors to bias power. 13 refs., 18 figs

  16. Consequences of kriging and land use regression for PM2.5 predictions in epidemiologic analyses: insights into spatial variability using high-resolution satellite data.

    Science.gov (United States)

    Alexeeff, Stacey E; Schwartz, Joel; Kloog, Itai; Chudnovsky, Alexandra; Koutrakis, Petros; Coull, Brent A

    2015-01-01

    Many epidemiological studies use predicted air pollution exposures as surrogates for true air pollution levels. These predicted exposures contain exposure measurement error, yet simulation studies have typically found negligible bias in resulting health effect estimates. However, previous studies typically assumed a statistical spatial model for air pollution exposure, which may be oversimplified. We address this shortcoming by assuming a realistic, complex exposure surface derived from fine-scale (1 km × 1 km) remote-sensing satellite data. Using simulation, we evaluate the accuracy of epidemiological health effect estimates in linear and logistic regression when using spatial air pollution predictions from kriging and land use regression models. We examined chronic (long-term) and acute (short-term) exposure to air pollution. Results varied substantially across different scenarios. Exposure models with low out-of-sample R^2 yielded severe biases in the health effect estimates of some models, ranging from 60% upward bias to 70% downward bias. One land use regression exposure model with >0.9 out-of-sample R^2 yielded upward biases up to 13% for acute health effect estimates. Almost all models drastically underestimated the SEs. Land use regression models performed better in chronic effect simulations. These results can help researchers when interpreting health effect estimates in these types of studies.
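
    A toy simulation can make the exposure-measurement-error mechanism concrete: a health effect is estimated once with the true exposure and once with an imperfect predicted exposure. The distributions, the error structure and the effect size below are assumptions for illustration only and do not reproduce the study's satellite-derived exposure surface or simulation design.

      # Toy sketch of exposure measurement error: health effects are estimated with
      # a noisy, smoothed exposure surrogate instead of the true exposure. This is
      # only an illustration of the bias mechanism, not the study's actual design.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 5000
      true_pm25 = rng.gamma(shape=4.0, scale=2.5, size=n)        # "true" exposure
      beta = 0.10                                                # true health effect
      outcome = 1.0 + beta * true_pm25 + rng.normal(0, 1.0, n)   # continuous outcome

      # Surrogate exposure from a prediction model: attenuated towards the mean
      # plus noise, mimicking a low out-of-sample R^2 exposure model.
      predicted = 0.6 * true_pm25 + 0.4 * true_pm25.mean() + rng.normal(0, 2.0, n)

      def ols_slope(x, y):
          X = np.column_stack([np.ones_like(x), x])
          return np.linalg.lstsq(X, y, rcond=None)[0][1]

      print("effect using true exposure:     ", round(ols_slope(true_pm25, outcome), 3))
      print("effect using predicted exposure:", round(ols_slope(predicted, outcome), 3))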

  17. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon) berries at ripening initiation

    Science.gov (United States)

    Lücker, Joost; Laszczak, Mario; Smith, Derek; Lund, Steven T

    2009-01-01

    Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening initiation and may be further
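
    The core step of building a predicted tryptic peptide database can be sketched in a few lines: protein sequences are cleaved in silico with the common trypsin rule (after K or R, unless followed by P) and filtered to an MS-friendly length range. The rule, the length cut-offs and the example sequence are generic assumptions, not the authors' EST clustering and trimming pipeline.

      # Minimal sketch of building a predicted tryptic peptide list from protein
      # sequences, using the common trypsin rule (cleave C-terminal to K or R,
      # except when followed by P). This is not the authors' full EST pipeline.
      import re

      def tryptic_peptides(protein, min_len=6, max_len=40):
          """Fully cleaved tryptic peptides within a typical MS-detectable length."""
          peptides = re.split(r'(?<=[KR])(?!P)', protein)
          return [p for p in peptides if min_len <= len(p) <= max_len]

      # Hypothetical translated sequence (placeholder, not grape EST data)
      seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
      db = {f"pep{i}": p for i, p in enumerate(tryptic_peptides(seq))}
      for name, pep in db.items():
          print(name, pep)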

  18. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon berries at ripening initiation

    Directory of Open Access Journals (Sweden)

    Smith Derek

    2009-01-01

    Full Text Available Abstract Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison' in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening

  19. Effects of pharmacists' interventions on appropriateness of prescribing and evaluation of the instruments' (MAI, STOPP and STARTs') ability to predict hospitalization--analyses from a randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Ulrika Gillespie

    Full Text Available Appropriateness of prescribing can be assessed by various measures and screening instruments. The aims of this study were to investigate the effects of pharmacists' interventions on appropriateness of prescribing in elderly patients, and to explore the relationship between these results and hospital care utilization during a 12-month follow-up period. The study population from a previous randomized controlled study, in which the effects of a comprehensive pharmacist intervention on re-hospitalization was investigated, was used. The criteria from the instruments MAI, STOPP and START were applied retrospectively to the 368 study patients (intervention group (I) n = 182, control group (C) n = 186). The assessments were done on admission and at discharge to detect differences over time and between the groups. Hospital care consumption was recorded and the association between scores for appropriateness and hospitalization was analysed. The number of Potentially Inappropriate Medicines (PIMs) per patient as identified by STOPP was reduced for I but not for C (1.42 to 0.93 vs. 1.46 to 1.66, respectively, p<0.01). The number of Potential Prescription Omissions (PPOs) per patient as identified by START was reduced for I but not for C (0.36 to 0.09 vs. 0.42 to 0.45, respectively, p<0.001). The summated score for MAI was reduced for I but not for C (8.5 to 5.0 and 8.7 to 10.0, respectively, p<0.001). There was a positive association between scores for MAI and STOPP and drug-related readmissions (RR 8-9% and 30-34%, respectively). No association was detected between the scores of the tools and total re-visits to hospital. The interventions significantly improved the appropriateness of prescribing for patients in the intervention group as evaluated by the instruments MAI, STOPP and START. High scores in MAI and STOPP were associated with a higher number of drug-related readmissions.

  20. Analyses and predictions of muscular fatigue for pulling tasks

    Institute of Scientific and Technical Information of China (English)

    唐范; 李开伟; 易灿南; 彭露

    2017-01-01

    Objective: The first purpose of this study was to compare the muscular fatigue levels caused by pulling tasks under different workload levels via analyses of muscular strength decrease, endurance time, and subjective ratings. The second purpose was to establish mathematical models to quantify the degree of muscular fatigue caused by pulling tasks, with the ultimate goal of providing a basis for the job design of pulling tasks so as to reduce the risk of musculoskeletal disorders among workers. Methods: A simulated hand pallet truck pulling experiment was designed; pulling strength before and after the task, endurance time under sustained exertion, and subjective ratings of whole-body fatigue were measured under two workload levels and analysed. Results: The experimental data showed that the pulling tasks produced muscular fatigue. Gender and workload significantly affected endurance time and the rate of decline in pulling strength; gender also significantly affected the subjective fatigue ratings, and body mass index was a significant factor for endurance time. Conclusion: Muscular fatigue in pulling tasks leads to a significant decline in pulling strength; gender is an important factor in muscular fatigue, with female participants fatiguing more readily than male participants; the fatigue rates k calculated from the prediction function model were 0.071 for male and 0.099 for female participants.
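
    For illustration only, the reported fatigue rates k can be plugged into an assumed exponential strength-decay curve; the abstract does not state the functional form of the prediction model, so the model shape, the initial strength value and the time axis below are hypothetical.

      # Illustrative sketch only: the abstract reports a fatigue rate k but not the
      # model form, so an exponential strength-decay curve is assumed here purely
      # for illustration; MVC0 and the time axis are hypothetical.
      import numpy as np

      def remaining_strength(t_min, mvc0, k):
          """Assumed model: pulling strength decays exponentially with time."""
          return mvc0 * np.exp(-k * t_min)

      t = np.arange(0, 11)           # minutes of sustained pulling (hypothetical)
      for k, label in [(0.071, "male"), (0.099, "female")]:
          s = remaining_strength(t, mvc0=100.0, k=k)   # strength as % of initial
          print(label, np.round(s[[0, 5, 10]], 1))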

  1. Area under the curve predictions of dalbavancin, a new lipoglycopeptide agent, using the end of intravenous infusion concentration data point by regression analyses such as linear, log-linear and power models.

    Science.gov (United States)

    Bhamidipati, Ravi Kanth; Syed, Muzeeb; Mullangi, Ramesh; Srinivas, Nuggehally

    2018-02-01

    1. Dalbavancin, a lipoglycopeptide, is approved for treating gram-positive bacterial infections. The area under the plasma concentration versus time curve (AUCinf) of dalbavancin is a key parameter, and the AUCinf/MIC ratio is a critical pharmacodynamic marker. 2. Using the end of intravenous infusion concentration (i.e. Cmax), the Cmax versus AUCinf relationship for dalbavancin was established by regression analyses (i.e. linear, log-log, log-linear and power models) using 21 pairs of subject data. 3. Predictions of AUCinf were performed by applying the regression equations to published Cmax data. The quotient of observed/predicted values rendered the fold difference. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) were used in the assessment. 4. MAE and RMSE values for the various models were comparable. Cmax versus AUCinf exhibited excellent correlation (r > 0.9488). The internal data evaluation showed narrow confinement (0.84-1.14-fold difference), and the models predicted AUCinf with a RMSE of 3.02-27.46%, with the fold difference largely contained within 0.64-1.48. 5. Regardless of the regression model, a single time point strategy using Cmax (i.e. end of the 30-min infusion) is amenable as a prospective tool for predicting the AUCinf of dalbavancin in patients.
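
    The single-point prediction idea can be sketched as follows: fit linear and log-log (power) relationships between Cmax and AUCinf, then predict AUCinf from a new Cmax value. The synthetic Cmax/AUC pairs and coefficients are placeholders, not the 21 subject pairs or the fitted equations from the paper.

      # Minimal sketch of the single-point (Cmax) to AUCinf regression idea using
      # linear and log-log (power) fits. The Cmax/AUC pairs below are synthetic
      # placeholders, not the subject data used in the paper.
      import numpy as np

      rng = np.random.default_rng(1)
      cmax = rng.uniform(150, 450, size=21)                  # mg/L, hypothetical
      auc = 25.0 * cmax**0.95 * rng.lognormal(0, 0.05, 21)   # mg*h/L, hypothetical

      # Linear: AUC = a + b*Cmax
      b_lin = np.polyfit(cmax, auc, 1)

      # Log-log fit, equivalent to a power model AUC = exp(a) * Cmax**b
      b_log = np.polyfit(np.log(cmax), np.log(auc), 1)

      def predict_auc(c):
          return {"linear": np.polyval(b_lin, c),
                  "power": np.exp(np.polyval(b_log, np.log(c)))}

      pred = predict_auc(300.0)
      print({k: round(v, 1) for k, v in pred.items()})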

  2. The transcription factor DREAM represses A20 and mediates inflammation

    OpenAIRE

    Tiruppathi, Chinnaswamy; Soni, Dheeraj; Wang, Dong-Mei; Xue, Jiaping; Singh, Vandana; Thippegowda, Prabhakar B.; Cheppudira, Bopaiah P.; Mishra, Rakesh K.; DebRoy, Auditi; Qian, Zhijian; Bachmaier, Kurt; Zhao, Youyang; Christman, John W.; Vogel, Stephen M.; Ma, Averil

    2014-01-01

    Here we show that the transcription-repressor DREAM binds to the A20 promoter to repress the expression of A20, the deubiquitinase suppressing inflammatory NF-κB signaling. DREAM-deficient (Dream−/− ) mice displayed persistent and unchecked A20 expression in response to endotoxin. DREAM functioned by transcriptionally repressing A20 through binding to downstream regulatory elements (DREs). In contrast, USF1 binding to the DRE-associated E-box domain activated A20 expression in response to inf...

  3. Environmental Volunteering and Health Outcomes over a 20-Year Period

    Science.gov (United States)

    Pillemer, Karl; Fuller-Rowell, Thomas E.; Reid, M. C.; Wells, Nancy M.

    2010-01-01

    Purpose: This study tested the hypothesis that volunteering in environmental organizations in midlife is associated with greater physical activity and improved mental and physical health over a 20-year period.  Design and Methods: The study used data from two waves (1974 and 1994) of the Alameda County Study, a longitudinal study of health and mortality that has followed a cohort of 6,928 adults since 1965. Using logistic and multiple regression models, we examined the prospective association between environmental and other volunteerism and three outcomes (physical activity, self-reported health, and depression), with 1974 volunteerism predicting 1994 outcomes, controlling for a number of relevant covariates.  Results: Midlife environmental volunteering was significantly associated with physical activity, self-reported health, and depressive symptoms.  Implications: This population-based study offers the first epidemiological evidence for a significant positive relationship between environmental volunteering and health and well-being outcomes. Further research, including intervention studies, is needed to confirm and shed additional light on these initial findings. PMID:20172902

  4. Juvenile nasopharyngeal angiofibroma in a 20 year old Nigerian male

    African Journals Online (AJOL)

    This paper presents the misdiagnosis of a 20-year-old male with juvenile nasopharyngeal angiofibroma (JNA). Methods: The case record of a 20-year-old male who presented with recurrent spontaneous profuse epistaxis, progressive nasal obstruction, hyponasality and conductive hearing loss with a mass in the post-nasal space ...

  5. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
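
    A toy version of the transition-matrix superposition T can be written down for a three-node system: each candidate Boolean network is converted to a deterministic state-transition matrix and T is their average. The candidate update rules below are arbitrary examples, not members of the Strong Inhibition class used for the yeast cell cycle model.

      # Toy illustration of a network-class transition-matrix superposition T:
      # several candidate Boolean networks on 3 nodes are each converted to a
      # deterministic state-transition matrix, and T is their average.
      import itertools
      import numpy as np

      states = list(itertools.product([0, 1], repeat=3))      # 8 global states
      index = {s: i for i, s in enumerate(states)}

      def transition_matrix(update):
          """Deterministic synchronous dynamics -> 8x8 0/1 transition matrix."""
          M = np.zeros((8, 8))
          for s in states:
              M[index[s], index[update(s)]] = 1.0
          return M

      # A few hypothetical candidate networks (stand-ins for class members)
      candidates = [
          lambda s: (s[1], s[2], s[0]),                        # rotation
          lambda s: (s[1] & s[2], s[0], 1 - s[2]),             # AND / copy / NOT
          lambda s: (s[0] | s[1], s[1] & ~s[2] & 1, s[0]),     # OR / inhibition
      ]

      T = sum(transition_matrix(u) for u in candidates) / len(candidates)
      print(T.round(2))
      print("point attractors per candidate:",
            [int(np.trace(transition_matrix(u))) for u in candidates])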

  6. Changes in environmental tobacco smoke (ETS) exposure over a 20-year period: cross-sectional and longitudinal analyses

    Science.gov (United States)

    Jefferis, Barbara J; Thomson, Andrew G; Lennon, Lucy T; Feyerabend, Colin; Doig, Mira; McMeekin, Laura; Wannamethee, S Goya; Cook, Derek G; Whincup, Peter H

    2009-01-01

    Aims To examine long-term changes in environmental tobacco smoke (ETS) exposure in British men between 1978 and 2000, using serum cotinine. Design Prospective cohort: British Regional Heart Study. Setting General practices in 24 towns in England, Wales and Scotland. Participants Non-smoking men: 2125 studied at baseline [questionnaire (Q1): 1978–80, aged 40–59 years], 3046 studied 20 years later (Q20: 1998–2000, aged 60–79 years) and 1208 studied at both times. Non-smokers were men reporting no current smoking with cotinine < 15 ng/ml at Q1 and/or Q20. Measurements Serum cotinine to assess ETS exposure. Findings In cross-sectional analysis, geometric mean cotinine level declined from 1.36 ng/ml [95% confidence interval (CI): 1.31, 1.42] at Q1 to 0.19 ng/ml (95% CI: 0.18, 0.19) at Q20. The prevalence of cotinine levels ≤ 0.7 ng/ml [associated with low coronary heart disease (CHD) risk] rose from 27.1% at Q1 to 83.3% at Q20. Manual social class and northern region of residence were associated with higher mean cotinine levels both at Q1 and Q20; older age was associated with lower cotinine level at Q20 only. Among 1208 persistent non-smokers, cotinine fell by 1.47 ng/ml (95% CI: 1.37, 1.57), 86% decline. Absolute falls in cotinine were greater in manual occupational groups, in the Midlands and Scotland compared to southern England, although percentage decline was very similar across groups. Conclusions A marked decline in ETS exposure occurred in Britain between 1978 and 2000, which is likely to have reduced ETS-related disease risks appreciably before the introduction of legislation banning smoking in public places. PMID:19207361

  7. Cruise: Climate Variability and Predictability (CLIVAR) A22, A20 (AT20, EM122)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The hydrographic surveys will consist of approximately 180 full water column CTD/LADCP casts along the trackline. Each cast will acquire up to 36 water samples on...

  8. Conceptual Nuclear Design of a 20 MW Multipurpose Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Chul Gyo; Kim, Hak Sung; Park, Cheol [KAERI, Daejeon (Korea, Republic of); Nghiem, Huynh Ton; Vinh, Le Vinh; Dang, Vo Doan Hai [Dalat Nuclear Research Reactor, Hanoi (Viet Nam)

    2007-08-15

    A conceptual nuclear design of a 20 MW multi-purpose research reactor for Vietnam has been jointly done by the KAERI and the DNRI (VAEC). The AHR reference core in this report is a light water cooled, heavy water reflected, open-tank-in-pool type multipurpose research reactor with 20 MW. Rod-type fuel of dispersed U3Si2-Al with a density of 4.0 gU/cc is used. The core consists of fourteen 36-element assemblies, four 18-element assemblies and has three in-core irradiation sites. The reflector tank filled with heavy water surrounds the core and provides rooms for various irradiation holes. Major analyses have been done for the relevant nuclear design parameters such as the neutron flux and power distributions, reactivity coefficients, control rod worths, etc. For the analysis, the MCNP, MVP, and HELIOS codes were used by KAERI and DNRI (VAEC). The results by MCNP (KAERI) and MVP (DNRI) showed good agreement and can be summarized as follows. For a clean, unperturbed core condition such that the fuels are all fresh and there are no irradiation holes in the reflector region, the fast neutron flux (En ≥ 1.0 MeV) reaches 1.47×10^14 n/cm^2·s and the maximum thermal neutron flux (En ≤ 0.625 eV) reaches 4.43×10^14 n/cm^2·s in the core region. In the reflector region, the thermal neutron peak occurs about 28 cm from the core center and the maximum thermal neutron flux is estimated to be 4.09×10^14 n/cm^2·s. For the analysis of the equilibrium cycle core, the irradiation facilities in the reflector region were considered. The cycle length was estimated as 38 days with a refueling scheme of replacing three 36-element fuel assemblies or replacing two 36-element and one 18-element fuel assemblies. The excess reactivity at a BOC was 103.4 mk, and 24.6 mk at a minimum was reserved at an EOC. The assembly average discharge burnup was 54.6% of initial U-235 loading. For the proposed fuel management

  9. Cumulative risk, cumulative outcome: a 20-year longitudinal study.

    Directory of Open Access Journals (Sweden)

    Leslie Atkinson

    Full Text Available Cumulative risk (CR models provide some of the most robust findings in the developmental literature, predicting numerous and varied outcomes. Typically, however, these outcomes are predicted one at a time, across different samples, using concurrent designs, longitudinal designs of short duration, or retrospective designs. We predicted that a single CR index, applied within a single sample, would prospectively predict diverse outcomes, i.e., depression, intelligence, school dropout, arrest, smoking, and physical disease from childhood to adulthood. Further, we predicted that number of risk factors would predict number of adverse outcomes (cumulative outcome; CO. We also predicted that early CR (assessed at age 5/6 explains variance in CO above and beyond that explained by subsequent risk (assessed at ages 12/13 and 19/20. The sample consisted of 284 individuals, 48% of whom were diagnosed with a speech/language disorder. Cumulative risk, assessed at 5/6-, 12/13-, and 19/20-years-old, predicted aforementioned outcomes at age 25/26 in every instance. Furthermore, number of risk factors was positively associated with number of negative outcomes. Finally, early risk accounted for variance beyond that explained by later risk in the prediction of CO. We discuss these findings in terms of five criteria posed by these data, positing a "mediated net of adversity" model, suggesting that CR may increase some central integrative factor, simultaneously augmenting risk across cognitive, quality of life, psychiatric and physical health outcomes.

  10. Anti-inflammatory and anti-osteoclastogenic effects of zinc finger protein A20 overexpression in human periodontal ligament cells.

    Science.gov (United States)

    Hong, J-Y; Bae, W-J; Yi, J-K; Kim, G-T; Kim, E-C

    2016-08-01

    Although overexpression of the nuclear factor κB inhibitory and ubiquitin-editing enzyme A20 is thought to be involved in the pathogenesis of inflammatory diseases, its function in periodontal disease remains unknown. The aims of the present study were to evaluate A20 expression in patients with periodontitis and to study the effects of A20 overexpression, using a recombinant adenovirus encoding A20 (Ad-A20), on the inflammatory response and on osteoclastic differentiation in lipopolysaccharide (LPS)- and nicotine-stimulated human periodontal ligament cells (hPDLCs). The concentration of prostaglandin E2 was measured by radioimmunoassay. Reverse transcription-polymerase chain reactions and western blot analyses were used to measure mRNA and protein levels, respectively. Osteoclastic differentiation was assessed in mouse bone marrow-derived macrophages using conditioned medium from LPS- and nicotine-treated hPDLCs. A20 was upregulated in the gingival tissues and neutrophils from patients with periodontitis and in LPS- and nicotine-exposed hPDLCs. Pretreatment with A20 overexpression by Ad-A20 markedly attenuated LPS- and nicotine-induced production of prostaglandin E2 , as well as expression of cyclooxygenase-2 and proinflammatory cytokines. Moreover, A20 overexpression inhibited the number and size of tartrate-resistant acid phosphatase-stained osteoclasts, and downregulated osteoclast-specific gene expression. LPS- and nicotine-induced p38 phosphorylation and nuclear factor κB activation were blocked by Ad-A20. Ad-A20 inhibited the effects of nicotine and LPS on the activation of pan-protein kinase C, Akt, GSK-3β and protein kinase Cα. This study is the first to demonstrate that A20 overexpression has anti-inflammatory effects and blocks osteoclastic differentiation in a nicotine- and LPS-stimulated hPDLC model. Thus, A20 overexpression may be a potential therapeutic target in inflammatory bone loss diseases, such as periodontal disease. © 2015 John Wiley

  11. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, ... with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality, both super-prompt power bursts and quasi steady-state power ..., which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g(-1), was obtained with SIMULATE-3K for an Oskarshamn 3 case with a reflooding rate of 2000 kg s(-1). In most cases, however, the predicted energy deposition was smaller, below ...

  12. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  13. NOAA's National Snow Analyses

    Science.gov (United States)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based, snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principle advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the available information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational, products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS. The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
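
    The Newtonian nudging step mentioned above can be illustrated with a one-dimensional toy: a modelled snow water equivalent trajectory is relaxed towards sparse observations with a fixed gain. The forcing, gain and observation values are hypothetical, and the single-variable model is a stand-in for the full energy-and-mass-balance snow model.

      # Toy sketch of Newtonian nudging: a modelled snow water equivalent (SWE)
      # trajectory is relaxed towards point observations with a fixed gain. The
      # numbers and the simple accumulation/melt model are hypothetical, not the
      # NOHRSC energy-and-mass-balance model.
      import numpy as np

      hours = 48
      swe = np.zeros(hours)                 # mm of SWE
      swe[0] = 50.0
      forcing = np.where(np.arange(hours) < 24, 0.5, -0.8)   # accumulate, then melt

      obs = {12: 58.0, 30: 41.0}            # sparse, asynoptic observations (mm)
      gain = 0.3                            # nudging strength per assimilation step

      for t in range(1, hours):
          swe[t] = max(swe[t - 1] + forcing[t], 0.0)          # model step
          if t in obs:                                        # nudge towards obs
              swe[t] += gain * (obs[t] - swe[t])

      print(swe[[11, 12, 13, 29, 30, 31]].round(1))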

  14. Antiferromagnetism in a 20% Ho-80% Tb alloy single crystal

    DEFF Research Database (Denmark)

    Lebech, Bente

    1968-01-01

    20% Ho-80% Tb exhibits two magnetic phases, similar to those of Tb. The spiral turn angle varies from 31.1° to 21.4°. A minimum effective spin for the occurrence of stable simple ferromagnetic structure at low temperatures is predicted....

  15. Individual retweet behavior prediction algorithm in social media based on network topology analyses

    Institute of Scientific and Technical Information of China (English)

    方冰; 缪文渊

    2016-01-01

    To investigate who will repost tweets, and building on the literature about whether a tweet will be reposted or not, this paper proposed a logistic regression algorithm based on analyses of social network topological structure, user behavior and social influences between users, and tested it on a real data set. The experimental results demonstrate that, compared with an alternative algorithm that does not consider social network topological structure, the proposed prediction algorithm performs much better. This work lays an important foundation for the precise prediction of information propagation paths in social media.
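
    A minimal sketch of the kind of classifier described above is shown below: a logistic regression is trained on a few topology and behaviour features to score candidate retweeters. The feature names, the synthetic data and the use of scikit-learn are illustrative assumptions, not the paper's feature set or implementation.

      # Minimal sketch of a logistic-regression retweet predictor built from
      # network-topology and behaviour features. Feature names and data are
      # hypothetical placeholders, not the paper's feature set or data set.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(42)
      n = 1000
      X = np.column_stack([
          rng.integers(0, 500, n),      # follower count of the candidate user
          rng.integers(0, 50, n),       # common neighbours with the author
          rng.random(n),                # historical retweet rate of the user
          rng.random(n),                # tie strength (interaction frequency)
      ])
      logit = -3.0 + 0.002 * X[:, 0] + 0.05 * X[:, 1] + 2.0 * X[:, 2] + 1.5 * X[:, 3]
      y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      model = LogisticRegression(max_iter=1000).fit(X, y)
      print("coefficients:", model.coef_.round(3))
      print("P(retweet) for one candidate:", model.predict_proba(X[:1])[0, 1].round(3))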

  16. Pathological complete response after neoadjuvant chemotherapy is an independent predictive factor irrespective of simplified breast cancer intrinsic subtypes: a landmark and two-step approach analyses from the EORTC 10994/BIG 1-00 phase III trial.

    Science.gov (United States)

    Bonnefoi, H; Litière, S; Piccart, M; MacGrogan, G; Fumoleau, P; Brain, E; Petit, T; Rouanet, P; Jassem, J; Moldovan, C; Bodmer, A; Zaman, K; Cufer, T; Campone, M; Luporsi, E; Malmström, P; Werutsky, G; Bogaerts, J; Bergh, J; Cameron, D A

    2014-06-01

    Pathological complete response (pCR) following chemotherapy is strongly associated with both breast cancer subtype and long-term survival. Within a phase III neoadjuvant chemotherapy trial, we sought to determine whether the prognostic implications of pCR, TP53 status and treatment arm (taxane versus non-taxane) differed between intrinsic subtypes. Patients were randomized to receive either six cycles of anthracycline-based chemotherapy or three cycles of docetaxel then three cycles of epirubicin/docetaxel (T-ET). pCR was defined as no evidence of residual invasive cancer (or very few scattered tumour cells) in primary tumour and lymph nodes. We used a simplified intrinsic subtypes classification, as suggested by the 2011 St Gallen consensus. Interactions between pCR, TP53 status, treatment arm and intrinsic subtype on event-free survival (EFS), distant metastasis-free survival (DMFS) and overall survival (OS) were studied using landmark and two-step approach multivariate analyses. Sufficient data for pCR analyses were available in 1212 (65%) of 1856 patients randomized. pCR occurred in 222 of 1212 (18%) patients: 37 of 496 (7.5%) luminal A, 22 of 147 (15%) luminal B/HER2 negative, 51 of 230 (22%) luminal B/HER2 positive, 43 of 118 (36%) HER2 positive/non-luminal, 69 of 221 (31%) triple negative (TN). The prognostic effect of pCR on EFS did not differ between subtypes and was an independent predictor for better EFS [hazard ratio (HR) = 0.40, P ...] analysis. EORTC 10994/BIG 1-00 Trial registration number NCT00017095. © The Author 2014. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  17. Why looking at the whole hippocampus is not enough – a critical role for anteroposterior axis, subfield and activation analyses to enhance predictive value of hippocampal changes for Alzheimer’s disease diagnosis.

    Directory of Open Access Journals (Sweden)

    Aleksandra eMaruszak

    2014-03-01

    Full Text Available The hippocampus is one of the earliest affected brain regions in Alzheimer's disease (AD), and its dysfunction is believed to underlie the core feature of the disease: memory impairment. Given that hippocampal volume is one of the best AD biomarkers, our review focuses on distinct subfields within the hippocampus, pinpointing regions that might enhance the predictive value of current diagnostic methods. Our review presents how changes in hippocampal volume, shape, symmetry and activation are reflected by cognitive impairment and how they are linked with neurogenesis alterations. Moreover, we revisit the functional differentiation along the anteroposterior longitudinal axis of the hippocampus and discuss its relevance for AD diagnosis. Finally, we indicate that apart from hippocampal subfield volumetry, the characteristic pattern of hippocampal hyperactivation associated with seizures and neurogenesis changes is another promising candidate for an early AD biomarker that could also become a target for early interventions.

  18. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

  19. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    ... importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach. ... Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  20. Description of a 20 Kilohertz power distribution system

    Science.gov (United States)

    Hansen, I. G.

    1986-01-01

    A single phase, 440 VRMS, 20 kHz power distribution system with a regulated sinusoidal wave form is discussed. A single phase power system minimizes the wiring, sensing, and control complexities required in a multi-sourced, redundantly distributed power system. The single phase addresses only the distribution link; techniques for accommodating multiphase, lower frequency inputs and outputs are described. While the 440 V operating potential was initially selected for aircraft operating below 50,000 ft, this potential also appears suitable for space power systems. This voltage choice recognizes a reasonable upper limit for semiconductor ratings, yet allows direct synthesis of 220 V, three-phase power. A 20 kHz operating frequency was selected to be above the range of audibility, minimize the weight of reactive components, yet allow the construction of single power stages of 25 to 30 kW. The regulated sinusoidal distribution system has several advantages. With a regulated voltage, most ac/dc conversions involve rather simple transformer rectifier applications. A sinusoidal distribution system, when used in conjunction with zero crossing switching, represents a minimal source of EMI. The present state of 20 kHz power technology includes computer controls of voltage and/or frequency, low inductance cable, current limiting circuit protection, bi-directional power flow, and motor/generator operation using standard induction machines. A status update and description of each of these items and their significance is presented.

  1. Childhood neoplasms presenting at autopsy: A 20-year experience.

    Science.gov (United States)

    Bryant, Victoria A; Booth, John; Palm, Liina; Ashworth, Michael; Jacques, Thomas S; Sebire, Neil J

    2017-09-01

    The aims of the review are to establish the number of undiagnosed neoplasms presenting at autopsy in a single centre and to determine the incidence and most common causes of sudden unexpected death due to neoplasia in infancy and childhood (SUDNIC). Retrospective observational study of paediatric autopsies performed on behalf of Her Majesty's Coroner over a 20-year period (1996-2015; n = 2,432). Neoplasms first diagnosed at autopsy were identified from an established database and cases meeting the criteria for sudden unexpected death were further categorised. Thirteen previously undiagnosed neoplasms were identified, including five haematological malignancies, two medulloblastomas, two neuroblastomas, two cardiac tumours and two malignancies of renal origin. Eight cases met the criteria for SUDNIC (0.33% of autopsies), the commonest group of which were haematological malignancies (n = 3). Neoplasms presenting as unexpected death in infancy and childhood and diagnosed at autopsy are rare. The findings suggest that haematological malignancies are the commonest cause of SUDNIC and highlight the importance of specialist autopsy in cases of sudden unexpected death. © 2017 Wiley Periodicals, Inc.

  2. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks of fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large storage capacity and throughput. The pontoon is assumed to be located in a

  3. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.

  4. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format. (orig.)
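
    The histogramming principle used by this instrument can be mimicked in software: each 8-bit ADC sample selects one of 256 channels whose count is incremented. The simulated pulse-height distribution below is purely illustrative of that binning step, not of the hardware described in the paper.

      # Software toy of the histogramming principle: 8-bit ADC samples are binned
      # into 256 channels, as the MCA's histogramming RAM does in hardware. The
      # simulated pulse-height spectrum below is purely illustrative.
      import numpy as np

      rng = np.random.default_rng(7)
      samples = np.clip(rng.normal(loc=128, scale=12, size=100_000), 0, 255).astype(np.uint8)

      spectrum = np.bincount(samples, minlength=256)   # 256-channel histogram
      peak = int(spectrum.argmax())
      print("peak channel:", peak, "counts:", int(spectrum[peak]))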

  5. Hydrophysical conditions and periphyton in natural rivers. Analysis and predictive modelling of periphyton by changed regulations; Hydrofysiske forhold og begroing i naturlige elver. Analyse og prediktiv modellering av begroing ved reguleringsendringer

    Energy Technology Data Exchange (ETDEWEB)

    Stokseth, S

    1994-10-01

    The objective of this thesis has been to examine the interaction between hydrodynamical and physical factors and the temporal and spatial dynamics of periphyton in natural steep rivers. The study strategy has been to work with quantitative system variables to be able to evaluate the potential usability of a predictive model for periphyton changes as a response to river regulations. The thesis comprises a theoretical and an empirical study. The theoretical study is aimed at presenting a conceptual model of the relevant factors based on an analysis of published studies. Effort has been made to evaluate and present the background material in a structured way. To concurrently handle the spatial and temporal dynamics of periphyton, a new method for data collection has been developed. A procedure for quantifying the photo registrations has been developed. The simple hydrodynamical parameters were estimated from a set of standard formulas whereas the complex parameters were estimated from a three dimensional simulation model called SSIIM. The main conclusion from the analysis is that flood events are the major controlling factors with respect to periphyton biomass and that water temperature is of major importance for the periphyton resistance. Low temperature clearly increases the periphyton erosion resistance. Thus, to model or control the temporal dynamics of river periphyton, the water temperature and the frequency and size of floods should be regarded as the most significant controlling factors. The data in this study have been collected from a river with a stable water quality and frequent floods. 109 refs., 41 figs., 34 tabs.

  6. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented on. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures, the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  7. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project `Feasibility of electricity production from biomass by pressurized gasification systems` within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different type of biomass feed stocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feed stocks. The analyses of 15 Scandinavian and European biomass feed stock included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent as straw in gasification. Any direct relation between the ash fusion behavior (determined according to the standard method) and, for instance, the alkali metal content was not found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process development Unit) rig. (orig.) (10 refs.)

  8. A 20-year simulated climatology of global dust aerosol deposition.

    Science.gov (United States)

    Zheng, Yu; Zhao, Tianliang; Che, Huizheng; Liu, Yu; Han, Yongxiang; Liu, Chong; Xiong, Jie; Liu, Jianhui; Zhou, Yike

    2016-07-01

    Based on a 20-year (1991-2010) simulation of dust aerosol deposition with the global climate model CAM5.1 (Community Atmosphere Model, version 5.1), the spatial and temporal variations of dust aerosol deposition were analyzed using climate statistical methods. The results indicated that the annual amount of global dust aerosol deposition was approximately 1161±31Mt, with a decreasing trend, and its interannual variation range of 2.70% over 1991-2010. The 20-year average ratio of global dust dry to wet depositions was 1.12, with interannual variation of 2.24%, showing the quantity of dry deposition of dust aerosol was greater than dust wet deposition. High dry deposition was centered over continental deserts and surrounding regions, while wet deposition was a dominant deposition process over the North Atlantic, North Pacific and northern Indian Ocean. Furthermore, both dry and wet deposition presented a zonal distribution. To examine the regional changes of dust aerosol deposition on land and sea areas, we chose the North Atlantic, Eurasia, northern Indian Ocean, North Pacific and Australia to analyze the interannual and seasonal variations of dust deposition and dry-to-wet deposition ratio. The deposition amounts of each region showed interannual fluctuations with the largest variation range at around 26.96% in the northern Indian Ocean area, followed by the North Pacific (16.47%), Australia (9.76%), North Atlantic (9.43%) and Eurasia (6.03%). The northern Indian Ocean also had the greatest amplitude of interannual variation in dry-to-wet deposition ratio, at 22.41%, followed by the North Atlantic (9.69%), Australia (6.82%), North Pacific (6.31%) and Eurasia (4.36%). Dust aerosol presented a seasonal cycle, with typically strong deposition in spring and summer and weak deposition in autumn and winter. The dust deposition over the northern Indian Ocean exhibited the greatest seasonal change range at about 118.00%, while the North Atlantic showed the lowest seasonal

  9. AMS analyses at ANSTO

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division

    1998-03-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with 14C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  10. AMS analyses at ANSTO

    International Nuclear Information System (INIS)

    Lawson, E.M.

    1998-01-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with 14C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  11. Analyses of MHD instabilities

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki

    1985-01-01

    In this article, analyses of the MHD instabilities that govern the global behavior of a fusion plasma are described from the viewpoint of numerical computation. First, we describe the high-accuracy calculation of the MHD equilibrium and then the analysis of the linear MHD instability. The former is the basis of the stability analysis, and the latter is closely related to the limiting beta value, which is a very important theoretical issue in tokamak research. To attain a stable tokamak plasma with good confinement properties it is necessary to control or suppress disruptive instabilities. We next describe the nonlinear MHD instabilities, which relate to the disruption phenomena. Lastly, we describe vectorization of the MHD codes. The MHD codes for fusion plasma analyses are relatively simple though very time-consuming, and the parts that need a lot of CPU time are concentrated in a small portion of the code; moreover, the codes are usually used by their developers themselves, which makes it comparatively easy to attain a high performance ratio on the vector processor. (author)

  12. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  13. Thermal analyses. Information on the expected baking process; Thermische analyses. Informatie over een te verwachten bakgedrag

    Energy Technology Data Exchange (ETDEWEB)

    Van Wijck, H. [Stichting Technisch Centrum voor de Keramische Industrie TCKI, Velp (Netherlands)

    2009-09-01

    The design process and the drying process for architectural ceramics and pottery partly determine the characteristics of the final product, but the largest changes occur during the baking process. An overview is provided of the different thermal analyses and of how the information from these analyses can predict the baking behaviour to be expected in practice. (mk)

  14. Micromechanical Analyses of Sturzstroms

    Science.gov (United States)

    Imre, Bernd; Laue, Jan; Springman, Sarah M.

    2010-05-01

    Sturzstroms are very fast landslides of very large initial volume. As type features they display extreme run-out, paired with intensive fragmentation of the involved blocks of rock within a collisional flow. The inherent danger to the growing communities in alpine valleys below future potential sites of sturzstroms must be examined, and predictions of endangered zones can then feed into the planning processes in these areas. This calls for the ability to make Type A predictions, according to Lambe (1973), which are made before an event. But Type A predictions are only possible if sufficient understanding of the mechanisms involved in a process is available. The motivation of the doctoral thesis research project presented here is therefore to reveal the mechanics of sturzstroms in more detail in order to contribute to the development of a Type A run-out prediction model. It is obvious that a sturzstrom represents a highly dynamic collisional granular regime. Thus particles not only collide but will eventually crush each other. Erismann and Abele (2001) describe this process as dynamic disintegration, where kinetic energy is the main driver for fragmenting the rock mass. In this case an approach combining the type features of long run-out and fragmentation within a single hypothesis is represented by the dynamic fragmentation-spreading model (Davies and McSaveney, 2009; McSaveney and Davies, 2009). Unfortunately, sturzstroms, and fragmentation within sturzstroms, cannot be observed directly in a real event because of their long recurrence time and the obvious difficulties in placing measuring devices within such a rock flow. Therefore, rigorous modelling is required, in particular of the transition from static to dynamic behaviour, to achieve better knowledge of the mechanics of sturzstroms and to provide empirical evidence to confirm the dynamic fragmentation-spreading model. Within this study fragmentation and its effects on the mobility of sturzstroms

  15. A simple beam analyser

    International Nuclear Information System (INIS)

    Lemarchand, G.

    1977-01-01

    (e,e'p) experiments allow the missing-energy distribution to be measured, as well as the momentum distribution of the extracted proton in the nucleus versus the missing energy. Such experiments are presently conducted on SACLAY's A.L.S. 300 Linac. Electrons and protons are analysed by two spectrometers and detected in their respective focal planes. Counting rates are usually low and include time coincidences and accidentals. The signal-to-noise ratio depends on the physics of the experiment and the resolution of the coincidence; it is therefore mandatory to obtain a beam current distribution that is as flat as possible. New technologies have made it possible to monitor the behavior of the beam pulse in real time and to determine, with respect to a numerical basis, when the duty cycle can be considered good

  16. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
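
    The report names SOBI (Second Order Blind Identification) but does not reproduce the algorithm. Full SOBI approximately joint-diagonalizes a whole set of time-lagged covariance matrices; the hedged sketch below shows only the simpler single-lag variant (often called AMUSE), which conveys the core idea of whitening followed by diagonalizing one lagged covariance. The toy signals, mixing matrix and lag are illustrative assumptions, not the project's EEG data.

```python
import numpy as np

def amuse(x: np.ndarray, lag: int = 1) -> np.ndarray:
    """Single-lag second-order blind separation (AMUSE), a simplified
    relative of SOBI. x has shape (channels, samples); returns the
    estimated sources with the same shape."""
    x = x - x.mean(axis=1, keepdims=True)

    # 1. Whiten: decorrelate channels and scale to unit variance.
    c0 = np.cov(x)
    d, e = np.linalg.eigh(c0)
    whitener = e @ np.diag(1.0 / np.sqrt(d)) @ e.T
    z = whitener @ x

    # 2. Symmetrized time-lagged covariance of the whitened data.
    c_lag = z[:, :-lag] @ z[:, lag:].T / (z.shape[1] - lag)
    c_lag = (c_lag + c_lag.T) / 2.0

    # 3. Its eigenvectors give the rotation that separates the sources
    #    (full SOBI would jointly diagonalize many such matrices).
    _, v = np.linalg.eigh(c_lag)
    return v.T @ z

# Toy demo: two mixed periodic "sources" observed on two channels.
t = np.linspace(0, 10, 2000)
sources = np.vstack([np.sin(2 * np.pi * 1.0 * t),
                     np.sign(np.sin(2 * np.pi * 0.3 * t))])
mixed = np.array([[0.7, 0.3], [0.4, 0.6]]) @ sources
recovered = amuse(mixed, lag=5)
```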

  17. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of the multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate the power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  18. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. The most common tool to answer this question, analysis of log files, faces the problem that the amount of logged data may be overwhelming. This problem gets even worse in the case of insider attacks, where the attacker's actions usually will be logged as permissible, standard actions - if they are logged at all. Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  19. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades an increasing number of probabilistic seismic risk assessments have been performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could be used also for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure over an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation β_E = ∫ [dβ(x)/dx] P(f|x) dx, where β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration) and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined probabilistically and its capacity and the associated uncertainties are assessed. Finally the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adopted for ordinary structures. The key problems are the seismic hazard definition and the fragility analyses. The fragility could be derived either based on scaling procedures or on the basis of generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
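
    The integral quoted above can be evaluated numerically once a hazard curve β(x) and a fragility curve P(f|x) are available. The sketch below illustrates this with an assumed power-law hazard curve and an assumed lognormal fragility (median capacity 0.8 g, logarithmic standard deviation 0.4); these inputs are placeholders, not values from the paper.

```python
import numpy as np
from scipy.stats import lognorm

# Hazard curve: annual frequency of exceeding PGA level x (in g); assumed power law.
x = np.linspace(0.05, 2.0, 400)
beta = 1e-3 * (x / 0.1) ** -2.0

# Fragility curve: P(failure | PGA = x); assumed lognormal, median 0.8 g,
# logarithmic standard deviation 0.4.
p_fail = lognorm.cdf(x, s=0.4, scale=0.8)

# Annual frequency of failure: integrate the fragility against the (negative)
# slope of the hazard curve, following the relation quoted in the abstract.
dbeta_dx = np.gradient(beta, x)
annual_failure_freq = np.trapz(p_fail * (-dbeta_dx), x)

lifetime_years = 50
p_fail_lifetime = 1.0 - np.exp(-annual_failure_freq * lifetime_years)
print(f"annual failure frequency ~ {annual_failure_freq:.2e} per year")
print(f"50-year failure probability ~ {p_fail_lifetime:.2%}")
```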

  20. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    ...or dead ends when he/she visits the site. Studies of the design and analysis of the visual and aesthetic aspects of the planning and use of websites have, however, only to a limited extent been the subject of reflective treatment. That is the background for this chapter, which opens with a review of aesthetic... The website is increasingly the preferred medium for information searching, company presentation, e-commerce, entertainment, teaching and social contact. In step with this growing diversity of communication activities on the web, more focus has been placed on optimising the design and... planning of the functional and content-related aspects of websites. There is a large number of theory and method books specialising in the technical issues of interaction and navigation, as well as the linguistic content of websites. The Danish HCI (Human Computer Interaction...

  1. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

    It is well understood that, due to the wide-band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. It is a difficult and laborious process to find out the exact shape of the channels at the boundaries. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode ray oscilloscope display. This has been accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing the ADC data in alternate memory locations of a multichannel pulse height analyser. Alternate channels are needed because of the sharing at the boundaries of channels. In the flat region of the profile, alternate memory locations are channels with zero counts and channels with the full-scale counts. The shape of this is a direct display of the channel boundaries. (orig.)

  2. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B{sub 4}C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  3. Severe accident recriticality analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. E-mail: wiktor.frid@ski.se; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H

    2001-11-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s{sup -1} injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g{sup -1}, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s{sup -1}. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. The highest calculated

  4. Severe accident recriticality analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s -1 injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g -1 , was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s -1 . In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. The highest calculated quasi steady

  5. Severe Accident Recriticality Analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Puska, E.K.; Nilsson, Lars; Sjoevall, H.

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B 4 C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  6. Comparative analyses of genetic risk prediction methods reveal ...

    Indian Academy of Sciences (India)

    2015-03-12

    (Search-result snippet only: the indexed fragment contains author affiliations, including the National Institute of Biomedical Genomics, Netaji Subhas Sanatorium, and the Institute of Post Graduate Medical Education and Research, J. C. Bose Road, together with part of a cited reference; no abstract is recoverable.)

  7. Comparison of analyses to predict ruminal fibre degradability and ...

    African Journals Online (AJOL)

    The objective of this study was to compare the ruminal degradability of neutral detergent fibre (NDF) and indigestible NDF (INDF) between silages (n = 24) that originated from three different temperate grass species, i.e. Dactylis glomerata L., Festuca arundinacea L. and hybrid, Felina – Lolium multiflorum L. × Festuca ...

  8. Comparative analyses of genetic risk prediction methods reveal ...

    Indian Academy of Sciences (India)

    (Fragment of a results table listing per-SNP prediction accuracy values for rs1227756, rs1501299 and rs16944; the column headers are not recoverable.)

  9. Analysis of K-net and Kik-net data: implications for ground motion prediction - acceleration time histories, response spectra and nonlinear site response; Analyse des donnees accelerometriques de K-net et Kik-net: implications pour la prediction du mouvement sismique - accelerogrammes et spectres de reponse - et la prise en compte des effets de site non-lineaire

    Energy Technology Data Exchange (ETDEWEB)

    Pousse, G

    2005-10-15

    This thesis intends to characterize ground motion during earthquakes. The work is based on two Japanese networks. It deals with databases of shallow events (depth less than 25 km) with magnitudes between 4.0 and 7.3. The analysis of K-net data allows a spectral ground-motion prediction equation to be computed and the shape of the Eurocode 8 design spectra to be reviewed. We show the larger amplification at short periods for the Japanese data and bring to light the soil amplification that takes place at long periods. In addition, we develop a new empirical model for simulating synthetic stochastic non-stationary acceleration time histories. By specifying magnitude, distance and site effect, this model can produce the many time histories that a seismic event is liable to produce at the place of interest. Furthermore, the study of near-field borehole records from Kik-net allows the validity domain of the predictive equations to be explored and shows what happens when ground-motion predictions are extrapolated. Finally, we show that nonlinearity reduces the dispersion of ground motion at the surface. (author)
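
    The abstract does not give the functional form fitted to the K-net data, so the sketch below only illustrates the generic shape such spectral ground-motion prediction equations usually take (a magnitude term, a geometric-spreading term in distance, and a site term); all coefficient values are illustrative placeholders, not the thesis's regression results.

```python
import numpy as np

def gmpe_log10_sa(magnitude, distance_km, soil=0,
                  a=-1.0, b=0.5, c=-1.3, h=10.0, s=0.2):
    """Generic spectral ground-motion prediction equation of the common form
    log10(SA) = a + b*M + c*log10(sqrt(R^2 + h^2)) + s*soil.
    All coefficients are illustrative placeholders, not values fitted
    to the K-net data."""
    r = np.sqrt(np.asarray(distance_km, dtype=float) ** 2 + h ** 2)
    return a + b * magnitude + c * np.log10(r) + s * soil

# Median spectral acceleration predicted for an M 6.5 event recorded
# 30 km away on a soil site (units depend on the assumed coefficients).
print(10 ** gmpe_log10_sa(6.5, 30.0, soil=1))
```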

  10. 17 CFR 240.13a-20 - Plain English presentation of specified information.

    Science.gov (United States)

    2010-04-01

    17 CFR 240.13a-20 (Commodity and Securities Exchanges, revised as of 2010-04-01): Plain English presentation of specified information, under the General Rules and Regulations of the Securities Exchange Act of 1934, Other Reports.

  11. Incidence of hepatitis C infection among prisoners by routine laboratory values during a 20-year period.

    Directory of Open Access Journals (Sweden)

    Andrés Marco

    To estimate the incidence of hepatitis C virus (HCV) infection and its predictive factors through repeated routine laboratory analyses. An observational cohort study was carried out in Quatre Camins Prison, Barcelona. The study included subjects with an initial negative HCV result and routine laboratory analyses containing HCV serology from 1992 to 2011. The incidence of infection was calculated for the study population and for sub-groups per 100 person-years of follow-up (100 py). The predictive factors were determined through Kaplan-Meier curves and a Cox regression. Hazard ratios (HR) and 95% confidence intervals (CI) were calculated. A total of 2,377 prisoners were included, with a median follow-up time of 1,540.9 days per patient. Among the total population, 117 HCV seroconversions were detected (incidence of 1.17/100 py). The incidence was higher between 1992 and 1995 (2.57/100 py), among cases with HIV co-infection (8.34/100 py) and among intravenous drug users (IDU) without methadone treatment (MT) during follow-up (6.66/100 py). The incidence rate of HCV seroconversion among cases with a history of IDU and current MT was 1.35/100 py, which is close to that of the total study population. The following variables had a positive predictive value for HCV infection: IDU (p<0.001; HR = 7.30; CI: 4.83-11.04), Spanish ethnicity (p = 0.009; HR = 2.03; CI: 1.93-3.44) and HIV infection (p = 0.015; HR = 1.97; CI: 1.14-3.39). The incidence of HCV infection among prisoners was higher during the first part of the study and among IDU during the entire study period. Preventive programs should be directed toward this sub-group of the prison population.
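
    As a quick illustration of the rate calculation used here, an incidence per 100 person-years is the number of events divided by the accumulated follow-up time; the figures plugged in below are back-of-the-envelope approximations from the abstract (treating the median follow-up as if it were the mean), not the study's actual person-time.

```python
def incidence_per_100py(events: int, person_days: float) -> float:
    """Incidence rate per 100 person-years of follow-up."""
    person_years = person_days / 365.25
    return 100.0 * events / person_years

# Rough check: 117 seroconversions over about 2,377 prisoners times
# 1,540.9 median follow-up days lands near the reported 1.17/100 py.
print(round(incidence_per_100py(117, 2377 * 1540.9), 2))
```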

  12. Semen analysis and prediction of natural conception

    NARCIS (Netherlands)

    Leushuis, Esther; van der Steeg, Jan Willem; Steures, Pieternel; Repping, Sjoerd; Bossuyt, Patrick M. M.; Mol, Ben Willem J.; Hompes, Peter G. A.; van der Veen, Fulco

    2014-01-01

    Do two semen analyses predict natural conception better than a single semen analysis and will adding the results of repeated semen analyses to a prediction model for natural pregnancy improve predictions? A second semen analysis does not add helpful information for predicting natural conception

  13. A20 restricts wnt signaling in intestinal epithelial cells and suppresses colon carcinogenesis.

    Directory of Open Access Journals (Sweden)

    Ling Shao

    Colon carcinogenesis consists of a multistep process during which a series of genetic and epigenetic adaptations occur that lead to malignant transformation. Here, we have studied the role of A20 (also known as TNFAIP3), a ubiquitin-editing enzyme that restricts NFκB and cell death signaling, in intestinal homeostasis and tumorigenesis. We have found that A20 expression is consistently lower in human colonic adenomas than in normal colonic tissues. To further investigate A20's potential roles in regulating colon carcinogenesis, we have generated mice lacking A20 specifically in intestinal epithelial cells and interbred these with mice harboring a mutation in the adenomatous polyposis coli gene (APC(min)). While A20(FL/FL) villin-Cre mice exhibit uninflamed intestines without polyps, A20(FL/FL) villin-Cre APC(min/+) mice contain far greater numbers and larger colonic polyps than control APC(min) mice. We find that A20 binds to the β-catenin destruction complex and restricts canonical wnt signaling by supporting ubiquitination and degradation of β-catenin in intestinal epithelial cells. Moreover, acute deletion of A20 from intestinal epithelial cells in vivo leads to enhanced expression of the β-catenin dependent genes cyclinD1 and c-myc, known promoters of colon cancer. Taken together, these findings demonstrate new roles for A20 in restricting β-catenin signaling and preventing colon tumorigenesis.

  14. WALS Prediction

    NARCIS (Netherlands)

    Magnus, J.R.; Wang, W.; Zhang, Xinyu

    2012-01-01

    Abstract: Prediction under model uncertainty is an important and difficult issue. Traditional prediction methods (such as pretesting) are based on model selection followed by prediction in the selected model, but the reported prediction and the reported prediction variance ignore the uncertainty

  15. Neuroprotective Efficacy of an Aminopropyl Carbazole Derivative P7C3-A20 in Ischemic Stroke.

    Science.gov (United States)

    Wang, Shu-Na; Xu, Tian-Ying; Wang, Xia; Guan, Yun-Feng; Zhang, Sai-Long; Wang, Pei; Miao, Chao-Yu

    2016-09-01

    NAMPT is a novel therapeutic target of ischemic stroke. The aim of this study was to investigate the effect of a potential NAMPT activator, P7C3-A20, an aminopropyl carbazole derivative, on ischemic stroke. In the in vitro study, the neuroprotective effect of P7C3-A20 was investigated by co-incubation with primary neurons subjected to oxygen-glucose deprivation (OGD) or oxygen-glucose deprivation/reperfusion (OGD/R) injury. In the in vivo experiment, P7C3-A20 was administered to middle cerebral artery occlusion (MCAO) rats and the infarct volume was examined. Lastly, brain tissue nicotinamide adenine dinucleotide (NAD) levels were measured in P7C3-A20-treated normal or MCAO mice. Cell viability, morphology, and Tuj-1 staining confirmed the neuroprotective effect of P7C3-A20 in the OGD or OGD/R model. P7C3-A20 administration significantly reduced cerebral infarction in MCAO rats. Moreover, brain NAD levels were elevated both in normal and MCAO mice after P7C3-A20 treatment. P7C3-A20 has a neuroprotective effect in cerebral ischemia. The study contributes to the development of NAMPT activators against ischemic stroke and expands the horizon of the neuroprotective effect of aminopropyl carbazole chemicals. © 2016 John Wiley & Sons Ltd.

  16. Study of proton and two-proton emission from light neutron-deficient nuclei around A=20

    International Nuclear Information System (INIS)

    Zerguerras, T.

    2001-09-01

    Proton and two-proton emission from light neutron-deficient nuclei around A=20 has been studied. A radioactive beam of 18 Ne, 17 F and 20 Mg, produced at the Grand Accelerateur National d'Ions Lourds by fragmentation of a 24 Mg primary beam at 95 MeV/A, bombarded a 9 Be target to form unbound states. The proton(s) and nuclei from the decay were detected respectively in the MUST array and the SPEG spectrometer. From energy and angle measurements, the invariant mass of the decaying nucleus could be reconstructed. Double coincidence events between a proton and 17 F, 16 O, 15 O, 14 O and 18 Ne were registered to obtain excitation energy spectra of 18 Ne, 17 F, 16 F, 15 F and 19 Na. Generally, the measured masses are in agreement with previous experiments. In the case of 18 Ne, the excitation energy and angular distributions agree well with the predictions of a break-up model calculation. From 17 Ne-proton coincidences, a first experimental measurement of the ground state mass excess of 18 Na has been obtained, yielding 24.19(0.15) MeV. Two-proton emission from 17 Ne and 18 Ne excited states and the 19 Mg ground state was studied through triple coincidences between two protons and 15 O, 16 O and 17 Ne, respectively. In the first case, the proton-proton relative angle distribution in the center of mass has been compared with model calculations. Sequential emission from excited states of 17 Ne, above the proton emission threshold, through 16 F is dominant, but a 2 He decay channel could not be excluded. No 2 He emission from the 1.288 MeV 17 Ne state, or from the 6.15 MeV 18 Ne state, has been observed. Only one coincidence event between 17 Ne and two protons was registered, the value of the one-neutron stripping reaction cross section of 20 Mg being much lower than predicted. (author)
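
    The invariant-mass method mentioned above combines the measured energies and angles of the decay products into the mass of the decaying state. The sketch below shows the basic kinematics; the particle list, the approximate 17 F rest mass and the example energies and angles are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def invariant_mass(particles):
    """Invariant mass (MeV/c^2) of a set of detected particles, each given
    as (kinetic_energy_MeV, rest_mass_MeV, theta_rad, phi_rad) in the lab."""
    e_tot = 0.0
    p_tot = np.zeros(3)
    for kinetic, mass, theta, phi in particles:
        energy = kinetic + mass
        momentum = np.sqrt(energy**2 - mass**2)
        direction = np.array([np.sin(theta) * np.cos(phi),
                              np.sin(theta) * np.sin(phi),
                              np.cos(theta)])
        e_tot += energy
        p_tot += momentum * direction
    return np.sqrt(e_tot**2 - p_tot @ p_tot)

# Toy example: a 40 MeV proton at 5 degrees plus a 17F fragment
# (rest mass ~15832 MeV/c^2, approximate) carrying 3000 MeV of kinetic
# energy along the beam axis.
m_proton, m_17f = 938.272, 15832.0
m_inv = invariant_mass([(40.0, m_proton, np.radians(5.0), 0.0),
                        (3000.0, m_17f, 0.0, 0.0)])
excitation_energy = m_inv - (m_proton + m_17f)  # energy above the p + 17F threshold
print(m_inv, excitation_energy)
```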

  17. Analyses and characterization of double shell tank

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  18. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  19. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles, a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack, and tests for a thin ductile metal layer bonding two ceramic blocks have also indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size effects, and recent analyses…, while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal…

  20. A20 modulates lipid metabolism and energy production to promote liver regeneration.

    Directory of Open Access Journals (Sweden)

    Scott M Damrauer

    2011-03-01

    Liver regeneration is clinically of major importance in the setting of liver injury, resection or transplantation. We have demonstrated that the NF-κB inhibitory protein A20 significantly improves recovery of liver function and mass following extended liver resection (LR) in mice. In this study, we explored the Systems Biology modulated by A20 following extended LR in mice. We performed transcriptional profiling using Affymetrix-Mouse 430.2 arrays on liver mRNA retrieved from recombinant adenovirus A20 (rAd.A20) and rAd.βgalactosidase treated livers, before and 24 hours after 78% LR. A20 overexpression impacted 1595 genes that were enriched for biological processes related to inflammatory and immune responses, cellular proliferation, energy production, oxidoreductase activity, and lipid and fatty acid metabolism. These pathways were modulated by A20 in a manner that favored decreased inflammation, heightened proliferation, and optimized metabolic control and energy production. Promoter analysis identified several transcriptional factors that implemented the effects of A20, including NF-κB, CEBPA, OCT-1, OCT-4 and EGR1. Interactive scale-free network analysis captured the key genes that delivered the specific functions of A20. Most of these genes were affected at basal level and after resection. We validated a number of A20's target genes by real-time PCR, including p21, the mitochondrial solute carriers SLC25a10 and SLC25a13, and the fatty acid metabolism regulator, peroxisome proliferator activated receptor alpha. This resulted in greater energy production in A20-expressing livers following LR, as demonstrated by increased enzymatic activity of cytochrome c oxidase, or mitochondrial complex IV. This Systems Biology-based analysis unravels novel mechanisms supporting the pro-regenerative function of A20 in the liver, by optimizing energy production through improved lipid/fatty acid metabolism, and down-regulated inflammation. These findings

  1. Climate prediction and predictability

    Science.gov (United States)

    Allen, Myles

    2010-05-01

    Climate prediction is generally accepted to be one of the grand challenges of the Geophysical Sciences. What is less widely acknowledged is that fundamental issues have yet to be resolved concerning the nature of the challenge, even after decades of research in this area. How do we verify or falsify a probabilistic forecast of a singular event such as anthropogenic warming over the 21st century? How do we determine the information content of a climate forecast? What does it mean for a modelling system to be "good enough" to forecast a particular variable? How will we know when models and forecasting systems are "good enough" to provide detailed forecasts of weather at specific locations or, for example, the risks associated with global geo-engineering schemes. This talk will provide an overview of these questions in the light of recent developments in multi-decade climate forecasting, drawing on concepts from information theory, machine learning and statistics. I will draw extensively but not exclusively from the experience of the climateprediction.net project, running multiple versions of climate models on personal computers.

  2. Ecosystem Development after Mangrove Wetland Creation: Plant-Soil Change across a 20-year Chronosequence

    Science.gov (United States)

    Mangrove wetland restoration and creation efforts are increasingly proposed as mechanisms to compensate for mangrove wetland loss. However, ecosystem development and functional equivalence in restored and created mangrove wetlands is poorly understood. We compared a 20-yr chrono...

  3. The structure of late-life depressive symptoms across a 20-year span: a taxometric investigation.

    Science.gov (United States)

    Holland, Jason M; Schutte, Kathleen K; Brennan, Penny L; Moos, Rudolf H

    2010-03-01

    Past studies of the underlying structure of depressive symptoms have yielded mixed results, with some studies supporting a continuous conceptualization and others supporting a categorical one. However, no study has examined this research question with an exclusively older adult sample, despite the potential uniqueness of late-life depressive symptoms. In the present study, the underlying structure of late-life depressive symptoms was examined among a sample of 1,289 individuals across 3 waves of data collection spanning 20 years. The authors employed a taxometric methodology using indicators of depression derived from the Research Diagnostic Criteria (R. L. Spitzer, J. Endicott, & E. Robins, 1978). Maximum eigenvalue analyses and inchworm consistency tests generally supported a categorical conceptualization and identified a group that was primarily characterized by thoughts about death and suicide. However, compared to a categorical depression variable, depressive symptoms treated continuously were generally better predictors of relevant criterion variables. These findings suggest that thoughts of death and suicide may characterize a specific type of late-life depression, yet a continuous conceptualization still typically maximizes the predictive utility of late-life depressive symptoms.

  4. A20 Functional Domains Regulate Subcellular Localization and NF-Kappa B Activation

    Science.gov (United States)

    2013-08-15

    (Disjoint excerpts from the report:) ...endothelial cells, or if the effects of A20 are limited strictly to the process of apoptosis. The Ferran group began by taking bovine aortic endothelial... ...protein (64, 67, 81, 118, 122). Interestingly, A20 is also involved in the regulation of intracellular parasite infection (109, 123). Given the... ...was supplemented with a combination of heat-inactivated 10% fetal bovine serum and 1% penicillin G/streptomycin sulfate/gentamycin sulfate

  5. Análise da estabilidade e previsibilidade da qualidade fisiológica de sementes de soja produzidas em Cristalina, Goiás = Stability and predictability analyses of the physiological quality of soybean seeds produced in Cristalina, Goiás (Brazil)

    Directory of Open Access Journals (Sweden)

    Éder Matsuo

    2008-04-01

    emergence speed and stability analyses were tested through the methods proposed by Lin and Binns (1988) and Annicchiarico (1992). The germination percentage averages, the emergence of plants and the emergence speed index were compared through Tukey's test at 5% probability. In the evaluation of the seeds' physiological quality, genotype 7B1454170 was identified as the best, and genotype 9B1459189 as the worst. The genotypes Emgopa 313, 7B1454170, 11B145341 and DM339 were classified as offering high stability in physiological quality, while genotypes 3B1346193 and 9B1459189 offered low predictability. The estimation methods used were efficient and mutually coherent, and allowed the identification, among the evaluated genotypes, of the ones that offered greater stability and predictability.

  6. CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability

    Science.gov (United States)

    Methods for controlling natural variability, predicting environmental conditions from biological observations method, biological trait data, species sensitivity distributions, propensity scores, Advanced Analyses of Data Analysis references.

  7. CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability: SSD Plot Diagrams

    Science.gov (United States)

    Methods for controlling natural variability, predicting environmental conditions from biological observations method, biological trait data, species sensitivity distributions, propensity scores, Advanced Analyses of Data Analysis references.

  8. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for successful performance of further steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses. It is well known that both wrong sampling and sample treatment cannot be corrected anymore. These, in the past frequently neglected facts, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics.

  9. A20 (Tnfaip3) deficiency in myeloid cells protects against influenza A virus infection.

    Directory of Open Access Journals (Sweden)

    Jonathan Maelfait

    The innate immune response provides the first line of defense against viruses and other pathogens by responding to specific microbial molecules. Influenza A virus (IAV) produces double-stranded RNA as an intermediate during the replication life cycle, which activates the intracellular pathogen recognition receptor RIG-I and induces the production of proinflammatory cytokines and antiviral interferon. Understanding the mechanisms that regulate innate immune responses to IAV and other viruses is of key importance to develop novel therapeutic strategies. Here we used myeloid cell specific A20 knockout mice to examine the role of the ubiquitin-editing protein A20 in the response of myeloid cells to IAV infection. A20 deficient macrophages were hyperresponsive to double-stranded RNA and IAV infection, as illustrated by enhanced NF-κB and IRF3 activation, concomitant with increased production of proinflammatory cytokines, chemokines and type I interferon. In vivo this was associated with an increased number of alveolar macrophages and neutrophils in the lungs of IAV infected mice. Surprisingly, myeloid cell specific A20 knockout mice are protected against lethal IAV infection. These results challenge the general belief that an excessive host proinflammatory response is associated with IAV-induced lethality, and suggest that under certain conditions inhibition of A20 might be of interest in the management of IAV infections.

  10. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical aim in order to understand cultural, sociological, design-related, business-related and many other aspects. One sub-area within this is the systemic analysis and description of products and systems. The present compendium...

  11. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

    Encodings, or the proof of their absence, are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exist many different criteria, and different variants of criteria, for reasoning in different settings. This leads to incomparable results. Moreover, it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit that setting. We show how to formally reason about and compare encodability criteria by mapping them onto requirements on a relation between source and target terms that is induced by the encoding function. In particular we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation) that is induced by different variants of operational correspondence. This way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes.

  12. Analysing Children's Drawings: Applied Imagination

    Science.gov (United States)

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  13. Impact analyses after pipe rupture

    International Nuclear Information System (INIS)

    Chun, R.C.; Chuang, T.Y.

    1983-01-01

    Two of the French pipe whip experiments are reproduced with the computer code WIPS. The WIPS results are in good agreement with the experimental data and the French computer code TEDEL. This justifies the use of its pipe element in conjunction with its U-bar element in a simplified method of impact analyses

  14. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (≈10³) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal

  15. Analyser of sweeping electron beam

    International Nuclear Information System (INIS)

    Strasser, A.

    1993-01-01

    The electron beam analyser has an array of conductors that can be positioned in the field of the sweeping beam, an electronic signal treatment system for the analysis of the signals generated in the conductors by the incident electrons and a display for the different characteristics of the electron beam

  16. Preliminary design for a 20 TeV Collider in a deep tunnel at Fermilab

    International Nuclear Information System (INIS)

    1985-01-01

    The Reference Design Study for a 20 TeV Collider demonstrated the technical and cost feasibility of a 20 TeV superconducting collider facility. Based on magnets of 3 T, 5 T, or 6.5 T, the Main Ring of the Collider would have a circumference of 164 km, 113 km, or 90 km, respectively. There would be six collision regions, of which four would be developed initially. The 5 T and 6.5 T rings would have twelve major refrigeration stations, while the 3 T design would have 24 major refrigeration stations
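
    As a rough consistency check (not part of the record), the quoted circumferences can be compared with the minimum bending circumference 2πρ implied by each dipole field, using the standard relation p = 0.2998 B ρ for singly charged particles; the design circumferences are larger because the ring also contains straight sections and the dipoles occupy only part of the lattice.

```python
import math

def bending_radius_m(p_gev_per_c: float, b_tesla: float) -> float:
    """Magnetic bending radius rho = p / (0.2998 * B) for a singly charged
    particle, with momentum in GeV/c and field in tesla (rho in metres)."""
    return p_gev_per_c / (0.2998 * b_tesla)

p = 20_000.0  # 20 TeV/c protons
for b_field, quoted_km in [(3.0, 164), (5.0, 113), (6.5, 90)]:
    arc_km = 2.0 * math.pi * bending_radius_m(p, b_field) / 1000.0
    # The design circumference exceeds 2*pi*rho because the ring also contains
    # straight sections and the dipoles fill only part of the circumference.
    print(f"B = {b_field} T: minimum arc = {arc_km:6.1f} km, design = {quoted_km} km")
```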

  17. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    Workload is the most important indicator for managers responsible for industrial technological processes, whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a mostly manual assembly technology for a roller-bearing assembly process, carried out in a large company with integrated bearing manufacturing processes. In these analyses the delay (work) sampling technique has been used to identify and divide all bearing assemblers' activities, in order to determine how much of the 480-minute working day workers devote to each activity. The study shows some ways to increase process productivity without supplementary investment and also indicates that process automation could be the way to reach maximum productivity.
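
    The delay (work) sampling technique infers how the 480-minute shift is divided between activities from the fraction of random observations in which each activity is seen. The sketch below shows that calculation with a normal-approximation confidence interval; the observation counts are hypothetical, not the company's data.

```python
import math

def work_sampling(observations: dict, shift_minutes: float = 480.0) -> dict:
    """Estimate minutes per activity over a shift, with a 95% half-width
    (normal approximation), from work-sampling observation counts."""
    n = sum(observations.values())
    estimates = {}
    for activity, count in observations.items():
        p = count / n
        half_width = 1.96 * math.sqrt(p * (1.0 - p) / n)
        estimates[activity] = (p * shift_minutes, half_width * shift_minutes)
    return estimates

# Hypothetical observation counts for one bearing assembler during a study period.
observations = {"assembling": 310, "handling parts": 95, "waiting": 45, "other": 50}
for activity, (minutes, margin) in work_sampling(observations).items():
    print(f"{activity:15s} about {minutes:5.1f} min (+/- {margin:.1f} min)")
```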

  18. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences… yielded major progress with regard to both the phylogenetic positions of extinct species, as well as resolving population genetics questions in both extinct and extant species.

  19. Recriticality analyses for CAPRA cores

    International Nuclear Information System (INIS)

    Maschek, W.; Thiem, D.

    1995-01-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  20. Recriticality analyses for CAPRA cores

    Energy Technology Data Exchange (ETDEWEB)

    Maschek, W.; Thiem, D.

    1995-08-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  1. Technical center for transportation analyses

    International Nuclear Information System (INIS)

    Foley, J.T.

    1978-01-01

    A description is presented of an information search/retrieval/research activity of Sandia Laboratories which provides technical environmental information which may be used in transportation risk analyses, environmental impact statements, development of design and test criteria for packaging of energy materials, and transportation mode research studies. General activities described are: (1) history of center development; (2) environmental information storage/retrieval system; (3) information searches; (4) data needs identification; and (5) field data acquisition system and applications

  2. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  3. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of a small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some good characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence of the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
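
    The abstract does not describe the internals of the OSE, so the sketch below only illustrates the general response-surface idea it builds on: fit an inexpensive surrogate (here an ordinary quadratic polynomial in two scaled inputs) to a limited number of code runs and then query the surrogate instead of the code. The stand-in "expensive_code" function and all numbers are assumptions for illustration, not the RELAP5 calculations.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for the expensive system code (e.g. one thermal-hydraulic run);
# in practice the true relationship is unknown and only sampled at a few points.
def expensive_code(x1, x2):
    return 1000.0 + 120.0 * x1 - 60.0 * x2 + 15.0 * x1 * x2 + rng.normal(0.0, 5.0)

# 59 "code runs" with two scaled input parameters in [-1, 1]
inputs = rng.uniform(-1.0, 1.0, size=(59, 2))
outputs = np.array([expensive_code(a, b) for a, b in inputs])

# Quadratic response surface: basis [1, x1, x2, x1^2, x2^2, x1*x2]
design = np.column_stack([np.ones(len(inputs)), inputs[:, 0], inputs[:, 1],
                          inputs[:, 0] ** 2, inputs[:, 1] ** 2,
                          inputs[:, 0] * inputs[:, 1]])
coefficients, *_ = np.linalg.lstsq(design, outputs, rcond=None)

def surrogate(x1, x2):
    """Cheap replacement for the code, usable for large statistical samples."""
    return np.array([1.0, x1, x2, x1 ** 2, x2 ** 2, x1 * x2]) @ coefficients

print(surrogate(0.3, -0.5))
```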

  4. Comprehensive immunoproteogenomic analyses of malignant pleural mesothelioma.

    Science.gov (United States)

    Lee, Hyun-Sung; Jang, Hee-Jin; Choi, Jong Min; Zhang, Jun; de Rosen, Veronica Lenge; Wheeler, Thomas M; Lee, Ju-Seog; Tu, Thuydung; Jindra, Peter T; Kerman, Ronald H; Jung, Sung Yun; Kheradmand, Farrah; Sugarbaker, David J; Burt, Bryan M

    2018-04-05

    We generated a comprehensive atlas of the immunologic cellular networks within human malignant pleural mesothelioma (MPM) using mass cytometry. Data-driven analyses of these high-resolution single-cell data identified 2 distinct immunologic subtypes of MPM with vastly different cellular composition, activation states, and immunologic function; mass spectrometry demonstrated differential abundance of MHC-I and -II neopeptides directly identified between these subtypes. The clinical relevance of this immunologic subtyping was investigated with a discriminatory molecular signature derived through comparison of the proteomes and transcriptomes of these 2 immunologic MPM subtypes. This molecular signature, representative of a favorable intratumoral cell network, was independently associated with improved survival in MPM and predicted response to immune checkpoint inhibitors in patients with MPM and melanoma. These data additionally suggest a potentially novel mechanism of response to checkpoint blockade: requirement for high measured abundance of neopeptides in the presence of high expression of MHC proteins specific for these neopeptides.

  5. The Parent-Child Home Program in Western Manitoba: A 20-Year Evaluation

    Science.gov (United States)

    Gfellner, Barbara M.; McLaren, Lorraine; Metcalfe, Arron

    2008-01-01

    This article is a 20-year evaluation of the Parent-Child Home Program (PCHP) of Child and Family Services in Western Manitoba. Following Levenstein's (1979, 1988) approach, home visitors model parent-child interchanges using books and toys to enhance children's cognitive development through appropriate parenting behaviors. The evaluation provides…

  6. In Patience and Hope: A 20-Year Narrative Study of a Family, School, and Community Partnership

    Science.gov (United States)

    Higgins, Ann; Deegan, James G.

    2009-01-01

    This case study describes a 20-year journey of educational transformation from 1985 to 2005 in a bellwether, or highly developed, instance of one school, family, and community partnership--the Kileely Community Project--situated in a large social housing project in Limerick City in the Midwestern region of the Republic of Ireland. The study is a…

  7. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  8. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Cherry, J.L.

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.
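
    As a minimal sketch of how such bounding estimates can be post-processed, the snippet below interpolates a hypothetical peak-strain-versus-pressure curve from finite element runs against three assumed failure-strain limits to obtain lower-bound, best-estimate and upper-bound pressure capacities; none of the numbers are taken from the report.

        # Sketch of the capacity post-processing step only; the failure-strain limits
        # and the pressure/strain curve below are hypothetical, not report values.
        import numpy as np

        pressure = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.2])               # MPa, hypothetical
        peak_strain = np.array([0.0, 0.001, 0.003, 0.008, 0.02, 0.05, 0.12])   # from FE runs (toy)

        failure_limits = {"lower bound": 0.01, "best estimate": 0.05, "upper bound": 0.10}

        for label, eps_fail in failure_limits.items():
            # Pressure capacity = pressure at which the peak strain reaches the limit
            capacity = np.interp(eps_fail, peak_strain, pressure)
            print(f"{label}: capacity ~ {capacity:.2f} MPa")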

  9. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there currently is no acceptance criterion for this type of analysis that is approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress in the analyses, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding to illustrate the differences in the two analysis techniques.
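
    A minimal sketch of the difference between the two approaches, assuming a simple bilinear elastic-plastic material law with placeholder properties: beyond yield, the elastic analysis keeps extrapolating stress linearly, while the inelastic model follows the hardening branch.

        # Minimal sketch assuming a bilinear elastic-plastic material law;
        # the property values are placeholders, not from any packaging design.
        E = 200e3        # Young's modulus [MPa]
        sigma_y = 250.0  # yield stress [MPa]
        H = 2e3          # hardening modulus [MPa]

        def stress_elastic(strain):
            return E * strain

        def stress_inelastic(strain):
            eps_y = sigma_y / E
            if strain <= eps_y:
                return E * strain
            return sigma_y + H * (strain - eps_y)

        for eps in (0.0005, 0.002, 0.01):
            print(f"strain={eps:.4f}  elastic={stress_elastic(eps):7.1f} MPa"
                  f"  inelastic={stress_inelastic(eps):7.1f} MPa")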

  10. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    2002-01-01

    In 2001 the activity in the field of safety analyses was focused on verification of the safety analyses reports for NPP V-2 Bohunice and NPP Mochovce concerning the new profiled fuel and probabilistic safety assessment study for NPP Mochovce. The calculation safety analyses were performed and expert reviews for the internal UJD needs were elaborated. An important part of work was performed also in solving of scientific and technical tasks appointed within bilateral projects of co-operation between UJD and its international partnership organisations as well as within international projects ordered and financed by the European Commission. All these activities served as an independent support for UJD in its deterministic and probabilistic safety assessment of nuclear installations. A special attention was paid to a review of probabilistic safety assessment study of level 1 for NPP Mochovce. The probabilistic safety analysis of NPP related to the full power operation was elaborated in the study and a contribution of the technical and operational improvements to the risk decreasing was quantified. A core damage frequency of the reactor was calculated and the dominant initiating events and accident sequences with the major contribution to the risk were determined. The target of the review was to determine the acceptance of the sources of input information, assumptions, models, data, analyses and obtained results, so that the probabilistic model could give a real picture of the NPP. The review of the study was performed in co-operation of UJD with the IAEA (IPSART mission) as well as with other external organisations, which were not involved in the elaboration of the reviewed document and probabilistic model of NPP. The review was made in accordance with the IAEA guidelines and methodical documents of UJD and US NRC. In the field of calculation safety analyses the UJD activity was focused on the analysis of an operational event, analyses of the selected accident scenarios

  11. Analysing the Wrongness of Killing

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2014-01-01

    This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here are satisfactory; but the different...

  12. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay still may be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (this enhances the accuracy of values)? For some carbohydrates, we

  13. Predictive medicine

    NARCIS (Netherlands)

    Boenink, Marianne; ten Have, Henk

    2015-01-01

    In the last part of the twentieth century, predictive medicine has gained currency as an important ideal in biomedical research and health care. Research in the genetic and molecular basis of disease suggested that the insights gained might be used to develop tests that predict the future health

  14. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  15. CFD analyses in regulatory practice

    International Nuclear Information System (INIS)

    Bloemeling, F.; Pandazis, P.; Schaffrath, A.

    2012-01-01

    Numerical software is used in nuclear regulatory procedures for many problems in the fields of neutron physics, structural mechanics, thermal hydraulics etc. Among other things, the software is employed in dimensioning and designing systems and components and in simulating transients and accidents. In nuclear technology, analyses of this kind must meet strict requirements. Computational Fluid Dynamics (CFD) codes were developed for computing multidimensional flow processes of the type occurring in reactor cooling systems or in containments. Extensive experience has been accumulated by now in selected single-phase flow phenomena. At the present time, there is a need for development and validation with respect to the simulation of multi-phase and multi-component flows. As insufficient input by the user can lead to faulty results, the validity of the results and an assessment of uncertainties are guaranteed only through consistent application of so-called Best Practice Guidelines. The authors present the possibilities now available to CFD analyses in nuclear regulatory practice. This includes a discussion of the fundamental requirements to be met by numerical software, especially the demands upon computational analysis made by nuclear rules and regulations. In conclusion, 2 examples are presented of applications of CFD analysis to nuclear problems: Determining deboration in the condenser reflux mode of operation, and protection of the reactor pressure vessel (RPV) against brittle failure. (orig.)

  16. Mirror energy difference and the structure of loosely bound proton-rich nuclei around A =20

    Science.gov (United States)

    Yuan, Cenxi; Qi, Chong; Xu, Furong; Suzuki, Toshio; Otsuka, Takaharu

    2014-04-01

    The properties of loosely bound proton-rich nuclei around A =20 are investigated within the framework of the nuclear shell model. In these nuclei, the strength of the effective interactions involving the loosely bound proton s1/2 orbit is significantly reduced in comparison with that in their mirror nuclei. We evaluate the reduction of the effective interaction by calculating the monopole-based-universal interaction (VMU) in the Woods-Saxon basis. The shell-model Hamiltonian in the sd shell, such as USD, can thus be modified to reproduce the binding energies and energy levels of the weakly bound proton-rich nuclei around A =20. The effect of the reduction of the effective interaction on the structure and decay properties of these nuclei is also discussed.
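
    As a rough illustration of the single-particle picture behind this argument, the sketch below evaluates a generic Woods-Saxon well, with and without the Coulomb term of a uniformly charged sphere, using typical textbook parameters rather than the values fitted in the paper; the shallower net well felt by a loosely bound proton is what pushes its wave function outward and reduces the effective interaction.

        # Generic Woods-Saxon (+ uniform-sphere Coulomb) single-particle potential;
        # the parameters are typical textbook values, not the ones used in the paper.
        import numpy as np

        V0, r0, a = 53.0, 1.25, 0.65      # MeV, fm, fm  (typical values)
        A_core, Z_core = 19, 9            # hypothetical core seen by a loosely bound proton
        R = r0 * A_core ** (1.0 / 3.0)
        e2 = 1.44                         # MeV*fm

        def woods_saxon(r):
            """Central Woods-Saxon well."""
            return -V0 / (1.0 + np.exp((r - R) / a))

        def coulomb(r):
            """Coulomb potential of a uniformly charged sphere of radius R."""
            if r < R:
                return Z_core * e2 / (2.0 * R) * (3.0 - (r / R) ** 2)
            return Z_core * e2 / r

        for r in (1.0, 2.0, 3.0, 4.0, 6.0, 9.0):
            print(f"r={r:4.1f} fm   neutron V={woods_saxon(r):7.2f} MeV"
                  f"   proton V={woods_saxon(r) + coulomb(r):7.2f} MeV")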

  17. Paraurethral Leiomyoma in a 20 Year-old Woman: A Case Report

    Directory of Open Access Journals (Sweden)

    Emily Adams-Piper

    2016-01-01

    We present the case of a 20-year-old woman with a vulvar mass, found to be a paraurethral leiomyoma. She subsequently underwent supermedial-approach paraurethral mass excision, distal urethral reconstruction and cystourethroscopy. Paraurethral leiomyomas make up approximately five percent of urethral tumors. This case depicts the presentation and treatment of a paraurethral leiomyoma in one of the youngest women reported in the literature.

  18. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)
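
    A hedged sketch of the kind of screening check implied by the sigma criterion, here taken as the expansion ratio of the unburned to the burned mixture compared against a critical value; both the critical value and the mixture densities below are illustrative assumptions, not EPR analysis data.

        # Hedged sketch of an expansion-ratio (sigma) screening check; the critical
        # value and the mixture densities are illustrative assumptions only.
        def expansion_ratio(rho_unburned, rho_burned):
            """sigma = unburned-mixture density / burned-mixture density (isobaric combustion)."""
            return rho_unburned / rho_burned

        sigma = expansion_ratio(rho_unburned=0.90, rho_burned=0.26)   # kg/m^3, hypothetical state
        sigma_critical = 3.75                                         # assumed critical value

        if sigma > sigma_critical:
            print(f"sigma = {sigma:.2f} > {sigma_critical}: flame acceleration cannot be excluded")
        else:
            print(f"sigma = {sigma:.2f} <= {sigma_critical}: flame acceleration unlikely")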

  19. The hemispherical deflector analyser revisited

    Energy Technology Data Exchange (ETDEWEB)

    Benis, E.P. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece)], E-mail: benis@iesl.forth.gr; Zouros, T.J.M. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece); Department of Physics, University of Crete, P.O. Box 2208, 71003 Heraklion, Crete (Greece)

    2008-04-15

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R0 and the nominal value of the potential V(R0) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD.

  20. The hemispherical deflector analyser revisited

    International Nuclear Information System (INIS)

    Benis, E.P.; Zouros, T.J.M.

    2008-01-01

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R0 and the nominal value of the potential V(R0) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD.
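
    For orientation, the sketch below evaluates two commonly quoted ideal-field HDA relations, the hemisphere voltage difference for a given pass energy and the first-order dispersion D = 2R0, ignoring fringing fields and the biased paracentric entry studied in these papers; all numerical values are illustrative.

        # Ideal-field HDA relations only: central-radius entry, no fringing fields,
        # no biased paracentric entry. All numerical values are illustrative.
        R1, R2 = 0.072, 0.128          # inner/outer hemisphere radii [m], hypothetical
        R0 = 0.5 * (R1 + R2)           # conventional central entry radius
        E0 = 100.0                     # pass energy [eV]
        w = 1.0e-3                     # entrance slit width [m]

        # Hemisphere voltage difference transmitting charge e at pass energy E0 along R0
        delta_V = E0 * (R2 / R1 - R1 / R2)     # volts, since E0 is given in eV

        # First-order energy dispersion of the conventional HDA: dr = D * dE/E0, D = 2*R0
        D = 2.0 * R0
        slit_resolution = w / D                # slit-width term only; angular aberrations neglected

        print(f"V(R1) - V(R2) = {delta_V:.1f} V")
        print(f"dispersion D = {D * 1e3:.0f} mm, slit-limited dE/E0 = {slit_resolution:.4f}")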

  1. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial standards used for implementing the service-oriented applications. By doing so, we will be able to not only reason about applications at different levels of abstractions, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols.

  2. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    This paper draws a parallel between measuring financial performance in 2 variants: the first one using data offered by accounting, which lays emphasis on maximizing profit, and the second one which aims to create value. The traditional approach to performance is based on some indicators from accounting data: ROI, ROE, EPS. The traditional management, based on analysing the data from accounting, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value based performance tries to avoid the errors due to accounting data, by using other specific indicators: EVA, MVA, TSR, CVA. The main objective is shifted from maximizing the income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.

  3. NKT sublineage specification and survival requires the ubiquitin-modifying enzyme TNFAIP3/A20.

    Science.gov (United States)

    Drennan, Michael B; Govindarajan, Srinath; Verheugen, Eveline; Coquet, Jonathan M; Staal, Jens; McGuire, Conor; Taghon, Tom; Leclercq, Georges; Beyaert, Rudi; van Loo, Geert; Lambrecht, Bart N; Elewaut, Dirk

    2016-09-19

    Natural killer T (NKT) cells are innate lymphocytes that differentiate into NKT1, NKT2, and NKT17 sublineages during development. However, the signaling events that control NKT sublineage specification and differentiation remain poorly understood. Here, we demonstrate that the ubiquitin-modifying enzyme TNFAIP3/A20, an upstream regulator of T cell receptor (TCR) signaling in T cells, is an essential cell-intrinsic regulator of NKT differentiation. A20 is differentially expressed during NKT cell development, regulates NKT cell maturation, and specifically controls the differentiation and survival of NKT1 and NKT2, but not NKT17, sublineages. Remaining A20-deficient NKT1 and NKT2 thymocytes are hyperactivated in vivo and secrete elevated levels of Th1 and Th2 cytokines after TCR ligation in vitro. Defective NKT development was restored by compound deficiency of MALT1, a key downstream component of TCR signaling in T cells. These findings therefore show that negative regulation of TCR signaling during NKT development controls the differentiation and survival of NKT1 and NKT2 cells. © 2016 Drennan et al.

  4. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  5. Digital image analyser for autoradiography

    International Nuclear Information System (INIS)

    Muth, R.A.; Plotnick, J.

    1985-01-01

    The most critical parameter in quantitative autoradiography for assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of optical density of the images. Existing high precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems and are slow. More moderately priced and reliable video camera based systems are available, but their outputs generally do not have the uniformity and stability necessary for high resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single and multiple tracer autoradiography which the authors refer to as a memory-mapped charge-coupled device scanner (MM-CCD). The input is from a linear array of CCD's which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels and the data is stored in buffer video memory in less than two seconds. Images can then be transferred to RAM memory by direct memory-mapping for further processing. Arterial blood curve data and optical density-calibrated standards data can be entered and the optical density images can be converted automatically to tracer concentration or functional images. In double tracer studies, images produced from both exposures can be stored and processed in RAM to yield "pure" individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region of interest analysis.
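
    The conversion step described above can be sketched as follows: optical densities of co-exposed calibration standards (hypothetical values) define a curve onto which every pixel of the digitized image is interpolated to yield a concentration image.

        # Sketch of the calibration step: optical densities of co-exposed standards
        # (hypothetical values) map each pixel of a digitized image to concentration.
        import numpy as np

        od_std = np.array([0.05, 0.15, 0.35, 0.60, 0.90, 1.20])      # optical density of standards
        conc_std = np.array([0.0, 20.0, 60.0, 120.0, 200.0, 300.0])  # known activity [nCi/g], toy

        # A digitized image: 512 x 512 pixels, 256 gray levels, toy gray-level -> OD mapping
        rng = np.random.default_rng(0)
        gray = rng.integers(0, 256, size=(512, 512))
        od_image = gray / 255.0 * 1.2

        # Piecewise-linear interpolation of every pixel onto the standards curve
        conc_image = np.interp(od_image, od_std, conc_std)
        print("mean apparent concentration:", conc_image.mean())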

  6. BN-600 hybrid core benchmark analyses

    International Nuclear Information System (INIS)

    Kim, Y.I.; Stanculescu, A.; Finck, P.; Hill, R.N.; Grimm, K.N.

    2003-01-01

    Benchmark analyses for the hybrid BN-600 reactor that contains three uranium enrichment zones and one plutonium zone in the core, have been performed within the frame of an IAEA sponsored Coordinated Research Project. The results for several relevant reactivity parameters obtained by the participants with their own state-of-the-art basic data and codes, were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The comparison of the diffusion and transport results obtained for the homogeneous representation generally shows good agreement for most parameters between the RZ and HEX-Z models. The burnup effect and the heterogeneity effect on most reactivity parameters also show good agreement for the HEX-Z diffusion and transport theory results. A large difference noticed for the sodium and steel density coefficients is mainly due to differences in the spatial coefficient predictions for non fuelled regions. The burnup reactivity loss was evaluated to be 0.025 (4.3 $) within ∼ 5.0% standard deviation. The heterogeneity effect on most reactivity coefficients was estimated to be small. The heterogeneity treatment reduced the control rod worth by 2.3%. The heterogeneity effect on the k-eff and control rod worth appeared to differ strongly depending on the heterogeneity treatment method. A substantial spread noticed for several reactivity coefficients did not give a significant impact on the transient behavior prediction. This result is attributable to compensating effects between several reactivity effects and the specific design of the partially MOX fuelled hybrid core. (author)
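
    As a quick check of the reactivity units quoted above, the conversion from Δk/k to dollars divides by the effective delayed-neutron fraction; the β_eff used below is inferred from the two numbers in the abstract rather than taken from the benchmark report.

        # Quick check of the reactivity unit conversion quoted above; beta_eff is
        # inferred from the two numbers in the abstract, not taken from the benchmark.
        burnup_loss_dk_k = 0.025      # reactivity loss in delta-k/k
        beta_eff = 0.0058             # assumed effective delayed-neutron fraction

        dollars = burnup_loss_dk_k / beta_eff
        print(f"burnup reactivity loss ~ {dollars:.1f} $")   # ~4.3 $, consistent with the text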

  7. Development of a 20kA current feedthrough using YBCO bulk conductors

    International Nuclear Information System (INIS)

    Maehata, Keisuke; Ishibashi, Kenji; Shintomi, Takakazu; Iwamoto, Akifumi; Maekawa, Ryuji; Mito, Toshiyuki

    2004-01-01

    In the phase II experiment of the Large Helical Device (LHD) of the National Institute for Fusion Science (NIFS), it is planned to operate the helical coils at 1.8 K by employing pressurized superfluid cooling to raise the magnetic field to 4 T with 17.3 kA. It is therefore important to develop a 20 kA-class current feedthrough into the 1.8 K region, which must combine a high current capacity with low heat leakage in a maximum magnetic leakage field of 1 T. Rectangle-shaped YBCO bulk conductors measuring 20 mm wide, 140 mm long and 10 mm thick were manufactured from square-pillar-shaped YBCO bulk materials for a 20 kA current. To check the quality of the bulk conductors, internal defects or cracks were detected by carrying out a precise survey of trapped magnetic flux. An assembled 20 kA current feedthrough was mounted in the λ-plate of a pressurized superfluid cooling cryostat. Experiments of current feeding into the 1.8 K region were carried out by operating the 20 kA current feedthrough. In the experiments, the transport current was kept at 20 kA for longer than 1,200 s. During the 20 kA operation, the current transport section of the YBCO bulk conductors remained in the superconducting state and the voltage drop between the YBCO bulk conductors and the copper electrode was observed to be constant. The contact resistance and the Joule heat generation in the joint region between the YBCO bulk conductors and the copper electrode were obtained as 1.45 nΩ and 0.72 W, respectively, in the 20 kA operation. We have demonstrated the feasibility of using a 20 kA current feedthrough for the phase II experiment of the LHD. (author)

  8. Towards a 20th Century History of Relationships between Theatre and Neuroscience

    Directory of Open Access Journals (Sweden)

    Gabriele Sofia

    2014-05-01

    This article considers some preliminary reflections in view of a 20th century theatre-and-neuroscience history. Up to now, the history of the 20th century theatre has been too fragmentary and irregular, missing out on the subterranean links which, either directly or indirectly, bound different experiences. The article aims to put in evidence the recurrent problems of these encounters. The hypothesis of the essay concerns the possibility of gathering and grouping a great part of the relationships between theatre and neuroscience around four trajectories: the physiology of action, the physiology of emotions, ethology, and studies on the spectator’s perception.

  9. Proliferative myositis of the latissimus dorsi presenting in a 20-year-old male athlete

    LENUS (Irish Health Repository)

    Mc Hugh, N

    2017-08-01

    We describe the case of a 20-year-old rower presenting with an uncommon condition of Proliferative Myositis (PM) affecting the Latissimus Dorsi (LD). PM is a rare, benign tumour infrequently developing in the upper back. Its rapid growth and firm consistency may mistake it for sarcoma at presentation. Therefore, careful multidisciplinary work-up is crucial, and should involve appropriate radiological and histopathological investigations. Here, we propose the aetiology of LD PM to be persistent myotrauma induced by repetitive rowing motions. Symptoms and rate of progression ultimately determine the management which includes surveillance and\\/or conservative resection. There have been no documented cases of recurrence or malignant transformation.

  10. Prediction Markets

    DEFF Research Database (Denmark)

    Horn, Christian Franz; Ivens, Bjørn Sven; Ohneberg, Michael

    2014-01-01

    In recent years, Prediction Markets gained growing interest as a forecasting tool among researchers as well as practitioners, which resulted in an increasing number of publications. In order to track the latest development of research, comprising the extent and focus of research, this article provides a comprehensive review and classification of the literature related to the topic of Prediction Markets. Overall, 316 relevant articles, published in the timeframe from 2007 through 2013, were identified and assigned to a herein presented classification scheme, differentiating between descriptive works, articles of theoretical nature, application-oriented studies and articles dealing with the topic of law and policy. The analysis of the research results reveals that more than half of the literature pool deals with the application and actual function tests of Prediction Markets. The results...

  11. Social Media Analyses for Social Measurement

    Science.gov (United States)

    Schober, Michael F.; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G.

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or “found” social media content. But just how trustworthy such measurement can be—say, to replace official statistics—is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys. PMID:27257310

  12. Social Media Analyses for Social Measurement.

    Science.gov (United States)

    Schober, Michael F; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or "found" social media content. But just how trustworthy such measurement can be-say, to replace official statistics-is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys.

  13. [TRENDS OF PERMANENT PACEMAKER IMPLANTATION IN A SINGLE CENTER OVER A 20-YEAR PERIOD].

    Science.gov (United States)

    Antonelli, Dante; Ilan, Limor Bushar; Freedberg, Nahum A; Feldman, Alexander; Turgeman, Yoav

    2015-05-01

    To review the changes in permanent pacemaker implantation indications, pacing modes and patients' demographics over a 20-year period. We retrospectively retrieved data on patients who underwent first implantation of the pacemaker between 1-1-1991 and 31-12-2010. One thousand and nine (1,009) patients underwent a first pacemaker implantation during that period; 535 were men (53%), their mean age was 74.6±19.5 years; the highest rate of implanted pacemaker was in patients ranging in age from 70-79 years, however there was an increasing number of patients aged over 80 years. The median survival time after initial pacemaker implantation was 8 years. Syncope was the most common symptom (62.5%) and atrioventricular block was the most common electrocardiographic indication (56.4%) leading to pacemaker implantation. There was increased utilization of dual chamber and rate responsive pacemakers over the years. There was no difference regarding mode selection between genders. Pacemaker implantation rates have increased over a 20-year period. Dual chamber replaced most of the single ventricular chamber pacemaker and rate responsive pacemakers became the norm. The data of a small volume center are similar to those reported in pacemaker surveys of high volume pacemaker implantation centers. They confirm adherence to the published guidelines for pacing.

  14. Hydrodynamic phonon drift and second sound in a (20,20) single-wall carbon nanotube

    International Nuclear Information System (INIS)

    Lee, Sangyeop; Lindsay, Lucas

    2017-01-01

    Here, two hydrodynamic features of phonon transport, phonon drift and second sound, in a (20,20) single wall carbon nanotube (SWCNT) are discussed using lattice dynamics calculations employing an optimized Tersoff potential for atomic interactions. We formally derive a formula for the contribution of drift motion of phonons to total heat flux at steady state. It is found that the drift motion of phonons carries more than 70% and 90% of the heat at 300 K and 100 K, respectively, indicating that phonon flow can be reasonably approximated as hydrodynamic if the SWCNT is long enough to avoid ballistic phonon transport. The dispersion relation of second sound is derived from the Peierls-Boltzmann transport equation with Callaway's scattering model and quantifies the speed of second sound and its relaxation. The speed of second sound is around 4000 m/s in a (20,20) SWCNT and the second sound can propagate more than 10 m in an isotopically pure (20,20) SWCNT for frequencies around 1 GHz at 100 K.

  15. Superconductor design and loss analysis for a 20 MJ induction heating coil

    International Nuclear Information System (INIS)

    Walker, M.S.; Declercq, J.G.; Zeitlin, B.A.

    1980-01-01

    The design of a 50 kA conductor for use in a 20 MJ Induction Heating Coil is described. The conductor is a wide flat cable of 36 subcables, each of which contains six NbTi strands around a stainless steel core strand. The 2.04 mm (0.080'') diameter monolithic strands allow bubble clearing for cryostable operation at a pool boiling heat transfer from the unoccluded strand surface of 0.26 W/cm². A thin, tough polyester amide-imide (Westinghouse Omega) insulation provides a rugged coating that will resist flaking and chipping during the cabling and compaction operations and provide (1) a reliable adherent surface for enhanced heat transfer, and (2) a low voltage standoff preventing interstrand coupling losses. The strands are uniquely configured using CuNi elements to provide low ac losses with NbTi filaments in an all-copper matrix. AC losses are expected to be approximately 0.3% of 20 MJ for a -7.5 T to 7.5 T one-second 1/2-cosinusoidal bipolar operation in a 20 MJ coil. They will be approximately 0.1% of 100 MJ for 1.8 second -8 T and +8 T ramped operation in a 100 MJ coil. The design is firmly based on the results of tests performed on prototype strands and subcables.
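
    A rough sketch of the pool-boiling budget implied by that heat-flux figure: the removable power per unit strand length is the allowed flux times the wetted perimeter, where the wetted (unoccluded) fraction assumed below is purely illustrative.

        # Rough pool-boiling budget per unit strand length; the wetted-perimeter
        # fraction is an assumption made purely for illustration.
        import math

        q_max = 0.26           # W/cm^2, allowable pool-boiling heat flux (from the text)
        d = 0.204              # strand diameter [cm] (2.04 mm)
        wetted_fraction = 0.5  # assumed unoccluded fraction of the strand perimeter

        p_per_cm = q_max * math.pi * d * wetted_fraction
        print(f"removable heat ~ {p_per_cm:.3f} W per cm of strand")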

  16. Thermal and hydraulic analyses of the System 81 cold traps

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.

    1977-06-15

    Thermal and hydraulic analyses of the System 81 Type I and II cold traps were completed except for thermal transients analysis. Results are evaluated, discussed, and reported. Analytical models were developed to determine the physical dimensions of the cold traps and to predict the performance. The FFTF cold trap crystallizer performances were simulated using the thermal model. This simulation shows that the analytical model developed predicts reasonably conservative temperatures. Pressure drop and sodium residence time calculations indicate that the present design will meet the requirements specified in the E-Specification. Steady state temperature data for the critical regions were generated to assess the magnitude of the thermal stress.

  17. Predicting unpredictability

    Science.gov (United States)

    Davis, Steven J.

    2018-04-01

    Analysts and markets have struggled to predict a number of phenomena, such as the rise of natural gas, in US energy markets over the past decade or so. Research shows the challenge may grow because the industry — and consequently the market — is becoming increasingly volatile.

  18. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  19. The effects of HIV/AIDS on rural communities in East Africa: a 20-year perspective.

    Science.gov (United States)

    Seeley, Janet; Dercon, Stefan; Barnett, Tony

    2010-03-01

    Much of the research on implications of the HIV epidemic for individual households and broader rural economies in the 1980s and early 1990s predicted progressive declines in agricultural production, with dire consequences for rural livelihoods. Restudies in Tanzania and Uganda show that from 1986 to the present, HIV and AIDS have sometimes thrown households into disarray and poverty, but more often have reduced development. The progressive and systematic decline predicted in earlier work has not come to pass. However, poverty remains, as does endemic HIV disease.

  20. Study of proton and 2 protons emission from light neutron deficient nuclei around A=20; Etude de l'emission proton et de deux protons dans les noyaux legers deficients en neutrons de la region A=20

    Energy Technology Data Exchange (ETDEWEB)

    Zerguerras, T

    2001-09-01

    Proton and two-proton emission from light neutron-deficient nuclei around A=20 have been studied. A radioactive beam of 18Ne, 17F and 20Mg, produced at the Grand Accelerateur National d'Ions Lourds by fragmentation of a 24Mg primary beam at 95 MeV/A, bombarded a 9Be target to form unbound states. Protons and nuclei from the decay were detected in the MUST array and the SPEG spectrometer, respectively. From energy and angle measurements, the invariant mass of the decaying nucleus could be reconstructed. Double coincidence events between a proton and 17F, 16O, 15O, 14O and 18Ne were registered to obtain excitation energy spectra of 18Ne, 17F, 16F, 15F and 19Na. Generally, the masses measured are in agreement with previous experiments. In the case of 18Ne, excitation energy and angular distributions agree well with the predictions of a break-up model calculation. From 17Ne proton coincidences, a first experimental measurement of the ground state mass excess of 18Na has been obtained and yields 24.19(0.15) MeV. Two-proton emission from 17Ne and 18Ne excited states and the 19Mg ground state was studied through triple coincidences between two protons and 15O, 16O and 17Ne, respectively. In the first case, the proton-proton relative angle distribution in the center of mass has been compared with model calculations. Sequential emission from excited states of 17Ne, above the proton emission threshold, through 16F is dominant but a 2He decay channel could not be excluded. No 2He emission from the 1.288 MeV 17Ne state, or from the 6.15 MeV 18Ne state, has been observed. Only one coincidence event between 17Ne and two protons was registered, the value of the one-neutron stripping reaction cross section of 20Mg being much lower than predicted. (author)

  1. Study of proton and 2 protons emission from light neutron deficient nuclei around A=20; Etude de l'emission proton et de deux protons dans les noyaux legers deficients en neutrons de la region A=20

    Energy Technology Data Exchange (ETDEWEB)

    Zerguerras, T

    2001-09-01

    Proton and two-proton emission from light neutron-deficient nuclei around A=20 have been studied. A radioactive beam of 18Ne, 17F and 20Mg, produced at the Grand Accelerateur National d'Ions Lourds by fragmentation of a 24Mg primary beam at 95 MeV/A, bombarded a 9Be target to form unbound states. Protons and nuclei from the decay were detected in the MUST array and the SPEG spectrometer, respectively. From energy and angle measurements, the invariant mass of the decaying nucleus could be reconstructed. Double coincidence events between a proton and 17F, 16O, 15O, 14O and 18Ne were registered to obtain excitation energy spectra of 18Ne, 17F, 16F, 15F and 19Na. Generally, the masses measured are in agreement with previous experiments. In the case of 18Ne, excitation energy and angular distributions agree well with the predictions of a break-up model calculation. From 17Ne proton coincidences, a first experimental measurement of the ground state mass excess of 18Na has been obtained and yields 24.19(0.15) MeV. Two-proton emission from 17Ne and 18Ne excited states and the 19Mg ground state was studied through triple coincidences between two protons and 15O, 16O and 17Ne, respectively. In the first case, the proton-proton relative angle distribution in the center of mass has been compared with model calculations. Sequential emission from excited states of 17Ne, above the proton emission threshold, through 16F is dominant but a 2He decay channel could not be excluded. No 2He emission from the 1.288 MeV 17Ne state, or from the 6.15 MeV 18Ne state, has been observed. Only one coincidence event between 17Ne and two protons was registered, the value of the one-neutron stripping reaction cross section of 20Mg being much lower than predicted. (author)
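
    The invariant-mass reconstruction mentioned above can be sketched as follows; the masses, kinetic energies and angles are placeholders, not values from the experiment.

        # Minimal sketch of an invariant-mass reconstruction from one proton and one
        # heavy fragment; masses, kinetic energies and angles are placeholders.
        import numpy as np

        def four_vector(mass, kinetic_energy, theta, phi):
            """Relativistic four-momentum (E, px, py, pz) in MeV, angles in rad."""
            E = mass + kinetic_energy
            p = np.sqrt(E**2 - mass**2)
            return np.array([E,
                             p * np.sin(theta) * np.cos(phi),
                             p * np.sin(theta) * np.sin(phi),
                             p * np.cos(theta)])

        m_p, m_frag = 938.272, 15830.5                 # proton and (hypothetical) fragment mass [MeV]
        p1 = four_vector(m_p, 25.0, 0.06, 0.0)         # proton measured in MUST (toy values)
        p2 = four_vector(m_frag, 900.0, 0.01, np.pi)   # fragment measured in SPEG (toy values)

        tot = p1 + p2
        m_inv = np.sqrt(tot[0]**2 - np.sum(tot[1:]**2))
        decay_energy = m_inv - (m_p + m_frag)          # energy above the emission threshold
        print(f"invariant mass = {m_inv:.2f} MeV, decay energy = {decay_energy:.3f} MeV")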

  2. Analyses of hypothetical FCI's in a fast reactor

    International Nuclear Information System (INIS)

    Padilla, A. Jr.; Martin, F.J.; Niccoli, L.G.

    1981-01-01

    Parametric analyses using the SIMMER code were performed to evaluate the potential for a severe recriticality from a pressure-driven recompaction caused by an energetic FCI during the transition phase of a hypothetical accident in a fast reactor. For realistic and reasonable estimates for the assumed accident conditions, a severe recriticality was not predicted. The conditions under which a severe recriticality would be obtained or averted were identified. 10 figures, 2 tables

  3. Decreasing Sports Activity with Increasing Age? Findings from a 20-Year Longitudinal and Cohort Sequence Analysis

    Science.gov (United States)

    Breuer, Christoph; Wicker, Pamela

    2009-01-01

    According to cross-sectional studies in sport science literature, decreasing sports activity with increasing age is generally assumed. In this paper, the validity of this assumption is checked by applying more effective methods of analysis, such as longitudinal and cohort sequence analyses. With the help of 20 years' worth of data records from the…

  4. Factitious aortic dissection leading to thoracotomy in a 20-year-old man.

    Science.gov (United States)

    Chambers, Elise; Yager, Joel; Apfeldorf, William; Camps-Romero, Eduardo

    2007-01-01

    A 20-year-old man presented to an emergency department with dramatic, sudden-onset, tearing chest pain. He also claimed to have been previously diagnosed with Ehlers-Danlos syndrome and a Type I aortic dissection (intimal tear of ascending aorta), rapidly increasing his treating physician's suspicion of an emergent aortic dissection. The patient was quickly transferred to a large university hospital, where he underwent a median sternotomy and thoracotomy, with no aortic pathology found on operation and biopsy. After the patient's postoperative recovery, he was treated at a mental health facility, where he remained ambivalent about his psychiatric condition and did not respond well to treatment. This case report describes a unique case of factitious disorder that led to a serious operative intervention and subsequent psychiatric care and assesses factors that might have contributed to his hospital course.

  5. Authorship and characteristics of articles in pharmacy journals: changes over a 20-year interval.

    Science.gov (United States)

    Dotson, Bryan; McManus, Kevin P; Zhao, Jing J; Whittaker, Peter

    2011-03-01

    To our knowledge, no studies have evaluated authorship patterns and characteristics of articles in pharmacy journals. To investigate changes over a 20-year period in authorship and characteristics of articles in pharmacy journals. All articles published in the American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, and Pharmacotherapy in 1989, 1999, and 2009 were reviewed. Data collected for each article included article type, number of authors, number of physician authors, whether any author was affiliated with a pharmaceutical company, and source of funding. The number of articles included was 574 in 1989, 659 in 1999, and 589 in 2009. The mean number of authors per article increased from 2.5 in 1989 to 2.8 in 1999 and 3.6 in 2009, and the percentage of articles with more than 6 authors increased from 2% in 1989 to 3% in 1999 and 9% in 2009.

  6. Interaction cross-sections and matter radii of A = 20 isobars

    International Nuclear Information System (INIS)

    Chulkov, L.; Bochkarev, O.; Geissel, H.; Golovkov, M.; Janas, Z.; Keller, H.; Kobayashi, T.; Muenzenberg, G.; Nickel, F.; Ogloblin, A.; Patra, S.; Piechaczek, A.; Roeckl, E.; Schwab, W.; Suemmerer, K.; Suzuki, T.; Tanihata, I.; Yoshida, K.

    1995-11-01

    High-energy interaction cross-sections of A=20 nuclei (20N, 20O, 20F, 20Ne, 20Na, 20Mg) on carbon were measured with accuracies of ∼1%. The nuclear matter rms radii derived from the measured cross-sections show an irregular dependence on isospin projection. The largest difference in radii, which amounts to approximately 0.2 fm, has been obtained for the mirror nuclei 20O and 20Mg. The influence of nuclear deformation and binding energy on the radii is discussed. By evaluating the difference in rms radii of neutron and proton distributions, evidence has been found for the existence of a proton skin for 20Mg and of a neutron skin for 20N. (orig.)

  7. Self-Rated Activity Levels and Longevity: Evidence from a 20 Year Longitudinal Study

    Science.gov (United States)

    Mullee, Mark A.; Coleman, Peter G.; Briggs, Roger S. J.; Stevenson, James E.; Turnbull, Joanne C.

    2008-01-01

    The study reports on factors predicting the longevity of 328 people over the age of 65 drawn from an English city and followed over 20 years. Both the reported activities score and the individual's comparative evaluation of their own level of activity independently reduced the risk of death, even when health and cognitive status were taken into…

  8. Unification predictions

    International Nuclear Information System (INIS)

    Ghilencea, D.; Ross, G.G.; Lanzagorta, M.

    1997-07-01

    The unification of gauge couplings suggests that there is an underlying (supersymmetric) unification of the strong, electromagnetic and weak interactions. The prediction of the unification scale may be the first quantitative indication that this unification may extend to unification with gravity. We make a precise determination of these predictions for a class of models which extend the multiplet structure of the Minimal Supersymmetric Standard Model to include the heavy states expected in many Grand Unified and/or superstring theories. We show that there is a strong cancellation between the 2-loop and threshold effects. As a result the net effect is smaller than previously thought, giving a small increase in both the unification scale and the value of the strong coupling at low energies. (author). 15 refs, 5 figs
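
    As a rough illustration of how such a scale prediction arises, the sketch below runs the three gauge couplings at one loop with the MSSM beta coefficients from approximate electroweak-scale inputs; it reproduces the familiar scale of order 10^16 GeV but ignores the two-loop and threshold effects that are the actual subject of the paper.

        # One-loop running of the MSSM gauge couplings from approximate electroweak-
        # scale inputs; two-loop and threshold effects (the paper's subject) are ignored.
        import numpy as np

        M_Z = 91.19                                   # GeV
        alpha_em, sin2_thw, alpha_s = 1 / 127.9, 0.2312, 0.118

        # GUT-normalised inverse couplings at M_Z
        inv_a1 = (3.0 / 5.0) * (1.0 - sin2_thw) / alpha_em
        inv_a2 = sin2_thw / alpha_em
        inv_a3 = 1.0 / alpha_s

        b1, b2, b3 = 33.0 / 5.0, 1.0, -3.0            # one-loop MSSM beta coefficients

        def run(inv_a0, b, mu):
            """alpha_i^-1(mu) = alpha_i^-1(M_Z) - b_i/(2*pi) * ln(mu/M_Z)."""
            return inv_a0 - b / (2.0 * np.pi) * np.log(mu / M_Z)

        # Scale at which alpha_1 and alpha_2 meet
        M_GUT = M_Z * np.exp(2.0 * np.pi * (inv_a1 - inv_a2) / (b1 - b2))
        print(f"alpha1 = alpha2 at ~ {M_GUT:.2e} GeV")
        print(f"there: alpha1^-1 = {run(inv_a1, b1, M_GUT):.1f}, alpha3^-1 = {run(inv_a3, b3, M_GUT):.1f}")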

  9. Evaluating the Predictive Value of Growth Prediction Models

    Science.gov (United States)

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  10. Predictable return distributions

    DEFF Research Database (Denmark)

    Pedersen, Thomas Quistgaard

    … trace out the entire distribution. A univariate quantile regression model is used to examine stock and bond return distributions individually, while a multivariate model is used to capture their joint distribution. An empirical analysis on US data shows that certain parts of the return distributions… Out-of-sample analyses show that the relative accuracy of the state variables in predicting future returns varies across the distribution. A portfolio study shows that an investor with power utility can obtain economic gains by applying the empirical return distribution in portfolio decisions instead of imposing…
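
    A minimal sketch of the kind of univariate quantile regression the record describes, using statsmodels on synthetic data; the variable names (excess_return, dividend_yield) are placeholders, not the study's actual state variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for a return series and one predictive state variable.
rng = np.random.default_rng(0)
n = 500
data = pd.DataFrame({"dividend_yield": rng.normal(size=n)})
data["excess_return"] = 0.2 * data["dividend_yield"] + rng.standard_t(df=5, size=n)

# Fit the 10th, 50th and 90th percentiles separately: the slope may differ
# across quantiles, which is how predictability can vary over the return
# distribution rather than only at the mean.
for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("excess_return ~ dividend_yield", data).fit(q=q)
    print(f"q={q:.1f}: slope = {fit.params['dividend_yield']:+.3f}")
```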

  11. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    To enable a computer to process information in human languages, ... an unsupervised morphological analyser (UMA) would learn how to analyse a language just by looking ... result for English, but they did remarkably worse for Finnish and Turkish.

  12. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how CPM networks can be used for analysing complex problems in engineering projects.
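
    A minimal sketch of the critical-path (CPM) forward pass the record refers to, on a small hypothetical activity network; activity names and durations are invented for illustration.

```python
# Forward pass of the Critical Path Method on a toy activity-on-node network.
# Each activity maps to (duration, list of predecessors); names are hypothetical.
activities = {
    "spec":   (3, []),
    "design": (5, ["spec"]),
    "build":  (8, ["design"]),
    "test":   (4, ["build"]),
    "docs":   (2, ["design"]),
    "ship":   (1, ["test", "docs"]),
}

earliest_finish: dict[str, int] = {}

def finish(name: str) -> int:
    """Earliest finish time = duration + max earliest finish of predecessors."""
    if name not in earliest_finish:
        duration, preds = activities[name]
        earliest_finish[name] = duration + max((finish(p) for p in preds), default=0)
    return earliest_finish[name]

project_duration = max(finish(a) for a in activities)
print("earliest finish times:", earliest_finish)
print("minimum project duration:", project_duration)
```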

  13. The Cardiff dental study: a 20-year critical evaluation of the psychological health gain from orthodontic treatment.

    Science.gov (United States)

    Kenealy, Pamela M; Kingdon, Anne; Richmond, Stephen; Shaw, William C

    2007-02-01

    Despite the widespread belief that orthodontics improves psychological well-being and self-esteem, there is little objective evidence to support this (Kenealy et al., 1989a; Shaw, O'Brien, Richmond, & Brook, 1991). A 20-year follow-up study compared the dental and psychosocial status of individuals who received, or did not receive, orthodontics as teenagers. A prospective longitudinal cohort design with four studies of the effect of orthodontic treatment. Secondary analysis of outcome data incorporated orthodontic need at baseline and treatment received in a 2 × 2 factorial design. A multidisciplinary research programme studied a cohort of 1,018 11-12-year-old participants in 1981. Extensive assessment of dental health and psychosocial well-being was conducted; facial and dental photographs and plaster casts of dentition were obtained and rated for attractiveness and pre-treatment need. No recommendations about orthodontic treatment were made, and an observational approach was adopted. At the third follow-up, 337 participants (then 30-31 years old) were re-examined in 2001. Participants with a prior need for orthodontic treatment as children who obtained treatment demonstrated better tooth alignment and satisfaction. However, when self-esteem at baseline was controlled for, orthodontics had little positive impact on psychological health and quality of life in adulthood. Lack of orthodontic treatment where there was a prior need did not lead to psychological difficulties in later life. Dental status alone was a weak predictor of self-esteem at outcome, explaining 8% of the variance. Self-esteem in adulthood was more strongly predicted (65% of the variance) by psychological variables at outcome: perception of quality of life, life satisfaction, self-efficacy, depression, social anxiety, emotional health, and by self-perception of attractiveness. Longitudinal analysis revealed that the observed effect of orthodontic treatment on self-esteem at outcome was accounted for by self-esteem at

  14. Analyses of bundle experiment data using MATRA-h

    Energy Technology Data Exchange (ETDEWEB)

    Lim, In Cheol; Chea, Hee Taek [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    When the construction and operation license for HANARO was renewed in 1995, a 25% CHF penalty was imposed. The reason for this was that the validation work related to the CHF design calculation was not sufficient to assure the CHF margin. As part of the work to recover this CHF penalty, MATRA-h was developed by implementing new correlations for heat transfer, CHF prediction and subcooled void into MATRA-a, which is the modified version of COBRA-IV-I developed by KAERI. Using MATRA-h, subchannel analyses of the bundle experiment data were performed. From the comparison of the code predictions with the experimental results, it was found that the code gives conservative predictions as far as CHF in the bundle geometry is concerned. (author). 12 refs., 25 figs., 16 tabs.
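
    A hedged sketch of the kind of conservatism check described (comparing predicted with measured CHF): the ratio of predicted to measured CHF should stay at or below 1 for a conservative code. The numbers are invented placeholders, not MATRA-h output.

```python
import numpy as np

# Hypothetical predicted vs. measured CHF values (kW/m^2) for a few bundle tests.
predicted = np.array([1850.0, 2010.0, 1770.0, 1930.0])
measured  = np.array([2050.0, 2150.0, 1990.0, 2080.0])

# P/M ratio: values <= 1 mean the code under-predicts CHF, i.e. is conservative.
ratio = predicted / measured
print("P/M ratios:", np.round(ratio, 3))
print(f"mean = {ratio.mean():.3f}, max = {ratio.max():.3f}")
print("conservative for all tests" if (ratio <= 1.0).all() else "non-conservative cases present")
```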

  15. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  16. The ABC (Analysing Biomolecular Contacts) database

    Directory of Open Access Journals (Sweden)

    Walter Peter

    2007-03-01

    As protein-protein interactions are one of the basic mechanisms in most cellular processes, it is desirable to understand the molecular details of protein-protein contacts and ultimately be able to predict which proteins interact. Interface areas on a protein surface that are involved in protein interactions exhibit certain characteristics. Therefore, several attempts were made to distinguish protein interactions from each other and to categorize them. One way of classifying them is the division into transient and permanent interactions. Previously, two of the authors analysed several properties of transient complexes, such as the amino acid and secondary structure element composition and pairing preferences. Certainly, interfaces can be characterized by many more possible attributes, and this is a subject of intense ongoing research. Although several freely available online databases exist that illuminate various aspects of protein-protein interactions, we decided to construct a new database collecting all desired interface features and allowing for facile selection of subsets of complexes. As the database server we used MySQL, and the program logic was written in JAVA. Furthermore, several class extensions and tools were included, such as JMOL to visualize the interfaces and JfreeChart for the representation of diagrams and statistics. The contact data is automatically generated from standard PDB files by a tcl/tk-script running through the molecular visualization package VMD. Currently the database contains 536 interfaces extracted from 479 PDB files and it can be queried by various types of parameters. Here, we describe the database design and demonstrate its usefulness with a number of selected features.
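
    A small sketch of the kind of query such an interface database supports, using an in-memory SQLite stand-in rather than the actual MySQL schema; the table and column names are hypothetical, chosen only to show how subsets of complexes might be selected.

```python
import sqlite3

# In-memory stand-in for an interface-feature table (schema is hypothetical).
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE interface (
        pdb_id TEXT, chain_a TEXT, chain_b TEXT,
        contact_type TEXT,        -- 'transient' or 'permanent'
        buried_area REAL          -- interface area in A^2
    )
""")
con.executemany(
    "INSERT INTO interface VALUES (?, ?, ?, ?, ?)",
    [("1ABC", "A", "B", "transient", 820.0),
     ("2XYZ", "A", "C", "permanent", 1650.0),
     ("3QRS", "B", "D", "transient", 1210.0)],
)

# Select a subset of complexes, e.g. transient interfaces above a size cut-off.
rows = con.execute(
    "SELECT pdb_id, buried_area FROM interface "
    "WHERE contact_type = 'transient' AND buried_area > 1000 "
    "ORDER BY buried_area DESC"
).fetchall()
print(rows)
```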

  17. Preschool Personality Antecedents of Narcissism in Adolescence and Emergent Adulthood: A 20-Year Longitudinal Study

    OpenAIRE

    Carlson, Kevin S.; Gjerde, Per F.

    2009-01-01

    This prospective study examined relations between preschool personality attributes and narcissism during adolescence and emerging adulthood. We created five a priori preschool scales anticipated to foretell future narcissism. Independent assessors evaluated the participants' personality at ages 14, 18, and 23. Based upon these evaluations, we generated observer-based narcissism scales for each of these three ages. All preschool scales predicted subsequent narcissism, except Interpersonal Anta...

  18. Prediction, Regression and Critical Realism

    DEFF Research Database (Denmark)

    Næss, Petter

    2004-01-01

    This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social phenomena. This position is fundamentally problematic to public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly seen as necessary in order to identify aggregate level effects of policy measures, but are questioned by many advocates of critical realist ontology. Using research into the relationship between urban structure and travel as an example, the paper discusses relevant research methods and the kinds…

  19. Acute mastoiditis in a Norwegian population: a 20 year retrospective study.

    Science.gov (United States)

    Vassbotn, Flemming S; Klausen, Olav G; Lind, Ola; Moller, Per

    2002-02-25

    We have retrospectively examined the nature of acute mastoiditis (AM) in western Norway during a 20-year period (1980-2000). Sixty-one cases of AM were identified in 57 patients with a mean age of 3.6 years. We found no significant change in the incidence of AM during the last 20 years. Seven patients were treated solely with intravenous antibiotics and myringotomies. Fifty patients also underwent cortical mastoidectomy, four cases with bilateral surgery. Antibiotic treatment was given to 31 of the patients before admission to hospital, and this group had a significantly longer duration of symptoms (12.4 days) compared to untreated patients (7.3 days). Streptococcus pneumoniae was the most common organism recovered from patient cultures. Surgery correlated with retroauricular fluctuation or, in children, with the presence of at least two of the three clinical signs: protrusion of the ear, retroauricular oedema and swelling of the ear canal. Our data show that clinical examination reveals only 50% of the cases with surgically proven retroauricular subperiosteal abscess. We therefore recommend a CT scan of patients treated conservatively.

  20. Test results of a 20 kA high temperature superconductor current lead using REBCO tapes

    Science.gov (United States)

    Heller, R.; Fietz, W. H.; Gröner, F.; Heiduk, M.; Hollik, M.; Lange, C.; Lietzow, R.

    2018-05-01

    The Karlsruhe Institute of Technology has developed a 20 kA high temperature superconductor (HTS) current lead (CL) using the second generation material REBCO, as industry worldwide concentrates on the production of this material. The aim was to demonstrate the possibility of replacing the Bi-2223/AgAu tapes with REBCO tapes, while for easy comparison of results, all other components are copies of the 20 kA HTS CL manufactured for the satellite tokamak JT-60SA. After the manufacture of all CL components including the newly developed REBCO module, the assembly of the CL was carried out at KIT and an experiment was performed in the CuLTKa test facility, where the REBCO CL was installed and connected to a JT-60SA CL via a superconducting bus bar. The experiment covers steady-state operation up to 20 kA, pulsed operation, measurement of the heat load at the 4.5 K end, loss-of-flow-accident simulations, and quench performance studies. Here the results of these tests are reported and directly compared to those of the JT-60SA CL.

  1. The Incidence of Primary Systemic Vasculitis in Jerusalem: A 20-year Hospital-based Retrospective Study.

    Science.gov (United States)

    Nesher, Gideon; Ben-Chetrit, Eli; Mazal, Bracha; Breuer, Gabriel S

    2016-06-01

    The incidence of primary systemic vasculitides varies among different geographic regions and ethnic origins. The aim of this study was to examine the incidence rates of vasculitides in the Jerusalem Jewish population, and to examine possible trends in incidence rates over a 20-year period. The clinical databases of inpatients at the 2 medical centers in Jerusalem were searched for patients with vasculitis diagnosed between 1990 and 2009. Individual records were then reviewed by one of the authors. The significance of trends in incidence rates throughout the study period was evaluated by the Pearson correlation coefficient. The average annual incidence rate of polyarteritis nodosa was 3.6/million adults (95% CI 1.6-4.7). Incidence rates did not change significantly during this period (r = 0.39, p = 0.088). The incidence of granulomatosis with polyangiitis (GPA) was 4.1 (2.2-5.9) for the whole period, during which it increased significantly (r = 0.53, p…). … The incidence rates of vasculitides in Jerusalem are in the lower range of global incidence rates. While GPA and MPA incidence are increasing, GCA incidence is decreasing.
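
    A minimal sketch of the trend test described (Pearson correlation of annual incidence against calendar year), using invented annual rates rather than the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical annual incidence rates (per million adults) over a 20-year window.
years = np.arange(1990, 2010)
rates = 3.0 + 0.08 * (years - 1990) + np.random.default_rng(1).normal(0, 0.4, years.size)

# Pearson correlation of rate with year: r > 0 with small p suggests a rising trend.
r, p = stats.pearsonr(years, rates)
print(f"r = {r:.2f}, p = {p:.3f}")
```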

  2. Azadioxatriangulenium: exploring the effect of a 20 ns fluorescence lifetime in fluorescence anisotropy measurements

    Science.gov (United States)

    Bogh, Sidsel A.; Bora, Ilkay; Rosenberg, Martin; Thyrhaug, Erling; Laursen, Bo W.; Just Sørensen, Thomas

    2015-12-01

    Azadioxatriangulenium (ADOTA) has been shown to be highly emissive despite a moderate molar absorption coefficient of the primary electronic transition. As a result, the fluorescence lifetime is ~20 ns, longer than that of all commonly used red fluorescent organic probes. The electronic transitions in ADOTA are highly polarised (r₀ = 0.38), which in combination with the long fluorescence lifetime extends the size range of biomolecular weights that can be detected in fluorescence polarisation-based experiments. Here, the rotational dynamics of bovine serum albumin (BSA) are monitored with three different ADOTA derivatives, differing only in the constitution of the reactive linker. A detailed study of the degree of labelling, the steady-state anisotropy, and the time-resolved anisotropy of the three different ADOTA-BSA conjugates is reported. The fluorescence quantum yields (ϕfl) of the free dyes in PBS solution are determined to be ~55%, which is reduced to ~20% in the ADOTA-BSA conjugates. Despite the reduction in ϕfl, a ~20 ns intensity-averaged lifetime is maintained, allowing the rotational dynamics of BSA to be monitored for up to 100 ns. Thus, ADOTA can be used in fluorescence polarisation assays to fill the gap between commonly used organic dyes and the long luminescence lifetime transition metal complexes. This allows for efficient steady-state fluorescence polarisation assays for detecting binding of analytes with molecular weights of up to 100 kDa.
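
    The link between a ~20 ns lifetime and the detectable molecular-weight range can be illustrated with the Perrin equation, r = r0 / (1 + τ/θ), where the rotational correlation time θ grows roughly linearly with molecular weight. The sketch below uses textbook-style approximations (specific volume plus hydration of about 1 cm³/g, water at 20 °C); it is not the analysis performed in the record.

```python
import math

R_GAS = 8.314          # J mol^-1 K^-1
T = 293.15             # K (20 degC)
ETA = 1.0e-3           # Pa s, viscosity of water at ~20 degC
V_SPEC = 1.0e-3        # m^3/kg, approx. specific volume + hydration of a protein

def correlation_time_ns(mw_kda: float) -> float:
    """Rough rotational correlation time theta = eta*M*(v+h)/(R*T), in ns."""
    m_kg_per_mol = mw_kda * 1.0    # 1 kDa corresponds to 1 kg/mol
    return ETA * m_kg_per_mol * V_SPEC / (R_GAS * T) * 1e9

def steady_state_anisotropy(r0: float, tau_ns: float, mw_kda: float) -> float:
    """Perrin equation: r = r0 / (1 + tau/theta)."""
    return r0 / (1.0 + tau_ns / correlation_time_ns(mw_kda))

for mw in (10, 66, 100):   # kDa; 66 kDa is roughly BSA
    r20 = steady_state_anisotropy(0.38, 20.0, mw)   # ~20 ns ADOTA-like lifetime
    r4 = steady_state_anisotropy(0.38, 4.0, mw)     # ~4 ns conventional dye
    print(f"{mw:>3} kDa: r(20 ns) = {r20:.3f}, r(4 ns) = {r4:.3f}")
```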

  3. Natural course of posttraumatic stress disorder: a 20-month prospective study of Turkish earthquake survivors.

    Science.gov (United States)

    Karamustafalioglu, Oguz K; Zohar, Joseph; Güveli, Mustafa; Gal, Gilad; Bakim, Bahadir; Fostick, Leah; Karamustafalioglu, Nesrin; Sasson, Yehuda

    2006-06-01

    A 20-month prospective follow-up of survivors of the severe earthquake in Turkey in 1999 examined the natural course of posttraumatic stress disorder (PTSD) and the contribution of different symptom clusters to the emergence of PTSD. Subjects were randomly sampled in a suburb of Istanbul that was severely affected by the earthquake. A total of 464 adults were assessed with a self-report instrument for PTSD symptoms on 3 consecutive surveys that were administered 1 to 3, 6 to 10, and 18 to 20 months following the earthquake. The prevalence of PTSD was 30.2% on the first survey and decreased to 26.9% and 10.6% on the second and third surveys, respectively. Female subjects showed initially higher (34.8%) PTSD rates compared with male subjects (19.1%). However, gender differences disappeared by the time of the third survey due to high spontaneous remission rates in female subjects. Low levels of chronic and delayed-onset PTSD were observed. A major contribution of the avoidance symptoms to PTSD diagnosis was identified by statistical analysis. Initial PTSD following an earthquake may be as prevalent as in other natural disasters, but high rates of spontaneous remission lead to low prevalence 1.5 years following the earthquake. Initial avoidance characteristics play a major role in the emergence of PTSD.

  4. Older adults' alcohol consumption and late-life drinking problems: a 20-year perspective.

    Science.gov (United States)

    Moos, Rudolf H; Schutte, Kathleen K; Brennan, Penny L; Moos, Bernice S

    2009-08-01

    The aim of this study was to identify changes in patterns of alcohol consumption over a 20-year interval among older women and men, and to examine the associations between guideline-defined excessive drinking and late-life drinking problems. DESIGN, PARTICIPANTS AND MEASURES: A community sample of 719 adults between 55 and 65 years of age who consumed alcohol at or prior to baseline participated in a survey of alcohol consumption and drinking problems and was followed 10 years and 20 years later. The likelihood of excessive drinking declined over the 20-year interval as adults matured into their 70s and 80s. However, at ages 75-85, 27.1% of women and 48.6% of men consumed more than two drinks per day or seven drinks per week. At comparable guideline levels of alcohol consumption, older men were more likely to have drinking problems than were older women. Consumption of more than two drinks per day or seven drinks per week was identified as a potential conservative guideline for identifying excessive drinking associated with an elevated likelihood of drinking problems. A substantial percentage of older adults who consume alcohol engage in guideline-defined excessive drinking and incur drinking problems. The finding that older men may be more likely than older women to experience problems when they drink beyond guideline levels suggests that alcohol guidelines for men should not be set higher than those for women.

  5. Older Adults’ Alcohol Consumption and Late-Life Drinking Problems: A 20-Year Perspective

    Science.gov (United States)

    Moos, Rudolf H.; Schutte, Kathleen K.; Brennan, Penny L.; Moos, Bernice S.

    2009-01-01

    Aims The aim was to identify changes in patterns of alcohol consumption over a 20-year interval among older women and men, and to examine the associations between guideline-defined excessive drinking and late-life drinking problems. Design, Participants, and Measures A community sample of 719 adults between 55 and 65 years of age who consumed alcohol at or prior to baseline participated in a survey of alcohol consumption and drinking problems and was followed 10 years and 20 years later. Findings The likelihood of excessive drinking declined over the 20-year interval as adults matured into their 70s and 80s. However, at ages 75–85, 27% of women and 49% of men consumed more than 2 drinks per day or 7 drinks per week. At comparable guideline levels of alcohol consumption, older men were more likely to have drinking problems than were older women. Consumption of more than 2 drinks per day or 7 drinks per week was identified as a potential conservative guideline for identifying excessive drinking associated with an elevated likelihood of drinking problems. Conclusions A substantial percentage of older adults who consume alcohol engage in guideline-defined excessive drinking and incur drinking problems. The finding that older men may be more likely than older women to experience problems when they drink beyond guideline levels suggests that alcohol guidelines for men should not be set higher than those for women. PMID:19438836

  6. A 20-Year High-Resolution Wave Resource Assessment of Japan with Wave-Current Interactions

    Science.gov (United States)

    Webb, A.; Waseda, T.; Kiyomatsu, K.

    2016-02-01

    Energy harvested from surface ocean waves and tidal currents has the potential to be a significant source of green energy, particularly for countries with extensive coastlines such as Japan. As part of a larger marine renewable energy project*, The University of Tokyo (in cooperation with JAMSTEC) has conducted a state-of-the-art wave resource assessment (with uncertainty estimates) to assist with wave generator site identification and construction in Japan. This assessment will be publicly available and is based on a large-scale NOAA WAVEWATCH III (version 4.18) simulation using NCEP and JAMSTEC forcings. It includes several key components to improve model skill: a 20-year simulation to reduce aleatory uncertainty, a four-nested-layer approach to resolve a 1 km shoreline, and finite-depth and current effects included in all wave power density calculations. This latter component is particularly important for regions near strong currents such as the Kuroshio. Here, we will analyze the different wave power density equations, discuss the model setup, and present results from the 20-year assessment (with a focus on the role of wave-current interactions). Time permitting, a comparison will also be made with simulations using JMA MSM 5 km winds. *New Energy and Industrial Technology Development Organization (NEDO): "Research on the Framework and Infrastructure of Marine Renewable Energy; an Energy Potential Assessment"
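
    For orientation, the deep-water approximation of wave power density is often quoted as P = ρ g² Hs² Te / (64π) per metre of wave crest. The record's assessment uses full finite-depth, current-corrected calculations, so the sketch below (with made-up sea-state numbers) is only the zeroth-order version of those equations.

```python
import math

RHO = 1025.0   # kg/m^3, seawater density
G = 9.81       # m/s^2

def deep_water_power_kw_per_m(hs: float, te: float) -> float:
    """Deep-water wave power density P = rho*g^2*Hs^2*Te/(64*pi), in kW per metre of crest."""
    return RHO * G**2 * hs**2 * te / (64.0 * math.pi) / 1000.0

# Illustrative sea states (significant wave height Hs in m, energy period Te in s).
for hs, te in [(1.0, 6.0), (2.5, 8.0), (4.0, 10.0)]:
    print(f"Hs={hs:.1f} m, Te={te:>4.1f} s -> P ~ {deep_water_power_kw_per_m(hs, te):6.1f} kW/m")
```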

  7. Plasma characteristics in the discharge region of a 20 A emission current hollow cathode

    Science.gov (United States)

    Mingming, SUN; Tianping, ZHANG; Xiaodong, WEN; Weilong, GUO; Jiayao, SONG

    2018-02-01

    Numerical calculation and fluid simulation methods were used to obtain the plasma characteristics in the discharge region of the LIPS-300 ion thruster’s 20 A emission current hollow cathode and to verify the structural design of the emitter. The results of the two methods indicated that the highest plasma density and electron temperature, which improved significantly in the orifice region, were located in the discharge region of the hollow cathode. The magnitude of the plasma density was about 10²¹ m⁻³ in the emitter and orifice regions, as obtained by numerical calculations, but decreased exponentially in the plume region with the distance from the orifice exit. Meanwhile, compared to the emitter region, the electron temperature and current improved by about 36% in the orifice region. The hollow cathode performance test results were in good agreement with the numerical calculation results, which proved that the structural design of the emitter and the orifice met the requirements of a 20 A emission current. The numerical calculation method can be used to estimate plasma characteristics in the preliminary design stage of hollow cathodes.

  8. Conceptual design of a 20 Tesla pulsed solenoid for a laser solenoid fusion reactor

    International Nuclear Information System (INIS)

    Nolan, J.J.; Averill, R.J.

    1977-01-01

    Design considerations are described for a strip-wound solenoid which is pulsed to 20 tesla while immersed in a 20 tesla bias field so as to achieve, within the bore of the pulsed solenoid, a net field sequence starting at 20 tesla and going first down to zero, then up to 40 tesla, and finally back to 20 tesla in a period of about 5 × 10⁻³ seconds. The important parameters of the solenoid, e.g., aperture, build, turns, stored and dissipated energy, field intensity and powering circuit, are given. A numerical example for a specific design is presented. Mechanical stresses in the solenoid and the subsequent choice of materials for coil construction are discussed. Although several possible design difficulties are not discussed in this preliminary report of a conceptual magnet design, such as uniformity of field, long-term stability of insulation under neutron bombardment and choice of structural materials of appropriate tensile strength and elasticity to withstand the magnetic forces developed, these questions are addressed in detail in the complete design report and in part in reference one. Furthermore, the authors feel that the problems encountered in this conceptual design are surmountable and are not a hindrance to the construction of such a magnet system.

  9. Autophagy contributes to apoptosis in A20 and EL4 lymphoma cells treated with fluvastatin.

    Science.gov (United States)

    Qi, Xu-Feng; Kim, Dong-Heui; Lee, Kyu-Jae; Kim, Cheol-Su; Song, Soon-Bong; Cai, Dong-Qing; Kim, Soo-Ki

    2013-11-08

    Convincing evidence indicates that statins stimulate apoptotic cell death in several types of proliferating tumor cells in a cholesterol-lowering-independent manner. However, the relationship between apoptosis and autophagy in lymphoma cells exposed to statins remains unclear. The objective of this study was to elucidate the potential involvement of autophagy in fluvastatin-induced cell death of lymphoma cells. We found that fluvastatin treatment enhanced the activation of pro-apoptotic members such as caspase-3 and Bax, but suppressed the activation of anti-apoptotic molecule Bcl-2 in lymphoma cells including A20 and EL4 cells. The process was accompanied by increases in numbers of annexin V alone or annexin V/PI double positive cells. Furthermore, both autophagosomes and increases in levels of LC3-II were also observed in fluvastatin-treated lymphoma cells. However, apoptosis in fluvastatin-treated lymphoma cells could be blocked by the addition of 3-methyladenine (3-MA), the specific inhibitor of autophagy. Fluvastatin-induced activation of caspase-3, DNA fragmentation, and activation of LC3-II were blocked by metabolic products of the HMG-CoA reductase reaction, such as mevalonate, farnesyl pyrophosphate (FPP) and geranylgeranyl pyrophosphate (GGPP). These results suggest that autophagy contributes to fluvastatin-induced apoptosis in lymphoma cells, and that these regulating processes require inhibition of metabolic products of the HMG-CoA reductase reaction including mevalonate, FPP and GGPP.

  10. [Methods, challenges and opportunities for big data analyses of microbiome].

    Science.gov (United States)

    Sheng, Hua-Fang; Zhou, Hong-Wei

    2015-07-01

    The microbiome is a novel research field related to a variety of chronic inflammatory diseases. Technically, there are two major approaches to microbiome analysis: metataxonomics, by sequencing 16S rRNA variable regions, and metagenomics, by shotgun sequencing of the total microbial (mainly bacterial) genome mixture. The 16S rRNA sequencing analysis pipeline includes sequence quality control, diversity analyses, taxonomy and statistics; metagenome analyses further include gene annotation and functional analyses. With the development of sequencing techniques, the cost of sequencing will decrease, and big data analyses will become the central task. Data standardization, accumulation, modeling and disease prediction are crucial for the future exploitation of these data. Meanwhile, the informational content of these data, and functional verification with culture-dependent and culture-independent experiments, remain the focus of future research. Studies of the human microbiome will bring a better understanding of the relations between the human body and the microbiome, especially in the context of disease diagnosis and therapy, which promises rich research opportunities.
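
    One of the simplest steps in the 16S diversity analysis mentioned above is an alpha-diversity index computed from an OTU count table. The sketch below computes Shannon diversity for made-up samples, purely to illustrate the kind of per-sample statistic such pipelines produce.

```python
import numpy as np

def shannon_diversity(counts: np.ndarray) -> float:
    """Shannon index H = -sum(p_i * ln p_i) over non-zero OTU proportions."""
    counts = counts[counts > 0]
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

# Hypothetical OTU counts for two samples (rows = OTUs).
sample_a = np.array([120, 80, 40, 10, 5, 1])
sample_b = np.array([250, 3, 2, 1, 0, 0])

print(f"sample A: H = {shannon_diversity(sample_a):.2f}")
print(f"sample B: H = {shannon_diversity(sample_b):.2f}")
```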

  11. Can NGOs Make a Difference? Revisiting and Reframing a 20-year Debate

    DEFF Research Database (Denmark)

    Opoku-Mensah, Paul Yaw

    2007-01-01

    The article seeks to connect the vibrant debates in the Nordic region on NGOs and the aid system with the international comparative debates on NGOs and development alternatives. It argues for a reformulation of the international debate on NGOs and development alternatives to address the foundational questions related to the formative role and structural impact of the international aid system on NGOs and their roles. This reformulation moves the discussions further and enables analyses that provide understanding of the actual and potential role of NGOs to transform development processes.

  12. Validation of a 20-year forecast of US childhood lead poisoning: Updated prospects for 2010

    International Nuclear Information System (INIS)

    Jacobs, David E.; Nevin, Rick

    2006-01-01

    We forecast childhood lead poisoning and residential lead paint hazard prevalence for 1990-2010, based on a previously unvalidated model that combines national blood lead data with three different housing data sets. The housing data sets, which describe trends in housing demolition, rehabilitation, window replacement, and lead paint, are the American Housing Survey, the Residential Energy Consumption Survey, and the National Lead Paint Survey. Blood lead data are principally from the National Health and Nutrition Examination Survey. New data now make it possible to validate the midpoint of the forecast time period. For the year 2000, the model predicted 23.3 million pre-1960 housing units with lead paint hazards, compared to an empirical HUD estimate of 20.6 million units. Further, the model predicted 498,000 children with elevated blood lead levels (EBL) in 2000, compared to a CDC empirical estimate of 434,000. The model predictions were well within 95% confidence intervals of empirical estimates for both residential lead paint hazard and blood lead outcome measures. The model shows that window replacement explains a large part of the dramatic reduction in lead poisoning that occurred from 1990 to 2000. Here, the construction of the model is described and updated through 2010 using new data. Further declines in childhood lead poisoning are achievable, but the goal of eliminating children's blood lead levels ≥10 μg/dL by 2010 is unlikely to be achieved without additional action. A window replacement policy will yield multiple benefits of lead poisoning prevention, increased home energy efficiency, decreased power plant emissions, improved housing affordability, and other previously unrecognized benefits. Finally, combining housing and health data could be applied to forecasting other housing-related diseases and injuries

  13. Predicting supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Heinemeyer, S. [Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Weiglein, G. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2010-07-15

    We review the result of SUSY parameter fits based on frequentist analyses of experimental constraints from electroweak precision data, (g-2)_μ, B physics and cosmological data. We investigate the parameters of the constrained MSSM (CMSSM) with universal soft supersymmetry-breaking mass parameters, and a model with common non-universal Higgs mass parameters in the superpotential (NUHM1). Shown are the results for the SUSY and Higgs spectrum of the models. Many sparticle masses are highly correlated in both the CMSSM and NUHM1, and parts of the regions preferred at the 68% C.L. are accessible to early LHC running. The best-fit points could be tested even with 1 fb⁻¹ at √s = 7 TeV. (orig.)

  14. Socioeconomic factors and mortality in emergency general surgery: trends over a 20-year period.

    Science.gov (United States)

    Armenia, Sarah J; Pentakota, Sri Ram; Merchant, Aziz M

    2017-05-15

    Socioeconomic factors such as race, insurance, and income quartiles have been identified as independent risk factors in emergency general surgery (EGS), but this impact has not been studied over time. We sought to identify trends in disparities in EGS-related operative mortality over a 20-y period. The National Inpatient Sample was used to identify patient encounters coded for EGS in 1993, 2003, and 2013. Logistic regression models were used to examine the adjusted relationship between race, primary payer status, and median income quartiles and in-hospital mortality after adjusting for patients' age, gender, Elixhauser comorbidity score, and hospital region, size, and location-cum-teaching status. We identified 391,040 patient encounters. In 1993, Black race was associated with higher odds of in-hospital mortality (odds ratio [95% confidence interval]: 1.35 [1.20-1.53]) than White race, although this difference dissipated in subsequent years. Medicare, Medicaid, and underinsured patients had higher odds of mortality than those with private insurance for the entire 20-y period; only the disparity in the underinsured decreased over time (1993, 1.63 [1.35-1.98]; 2013, 1.41 [1.20-1.67]). In 2003 (1.23 [1.10-1.38]) and 2013 (1.23 [1.11-1.37]), patients from the lowest income quartile were more likely to die after EGS than patients from the highest income quartile. Socioeconomic disparities in EGS-related operative mortality followed inconsistent trends. Over time, while gaps in in-hospital mortality among Blacks and Whites have narrowed, disparities among patients belonging to the lowest income quartile have worsened. Medicare and Medicaid beneficiaries continued to experience higher odds of in-hospital mortality relative to those with private insurance. Copyright © 2017 Elsevier Inc. All rights reserved.
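
    A hedged sketch of how adjusted odds ratios like those in the record are typically obtained from a logistic regression, using statsmodels on synthetic data; the predictor names are placeholders, not the National Inpatient Sample variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for an in-hospital mortality dataset.
rng = np.random.default_rng(42)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(60, 15, n),
    "lowest_income_quartile": rng.integers(0, 2, n),
})
logit_p = -4.0 + 0.03 * df["age"] + 0.25 * df["lowest_income_quartile"]
df["died"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

# Adjusted odds ratios: exponentiated logistic-regression coefficients.
fit = smf.logit("died ~ age + lowest_income_quartile", df).fit(disp=False)
odds_ratios = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.DataFrame({"OR": odds_ratios, "2.5%": ci[0], "97.5%": ci[1]}).round(2))
```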

  15. Ecosystem development after mangrove wetland creation: plant-soil change across a 20-year chronosequence

    Science.gov (United States)

    Osland, Michael J.; Spivak, Amanda C.; Nestlerode, Janet A.; Lessmann, Jeannine M.; Almario, Alejandro E.; Heitmuller, Paul T.; Russell, Marc J.; Krauss, Ken W.; Alvarez, Federico; Dantin, Darrin D.; Harvey, James E.; From, Andrew S.; Cormier, Nicole; Stagg, Camille L.

    2012-01-01

    Mangrove wetland restoration and creation efforts are increasingly proposed as mechanisms to compensate for mangrove wetland losses. However, ecosystem development and functional equivalence in restored and created mangrove wetlands are poorly understood. We compared a 20-year chronosequence of created tidal wetland sites in Tampa Bay, Florida (USA) to natural reference mangrove wetlands. Across the chronosequence, our sites represent the succession from salt marsh to mangrove forest communities. Our results identify important soil and plant structural differences between the created and natural reference wetland sites; however, they also depict a positive developmental trajectory for the created wetland sites that reflects tightly coupled plant-soil development. Because upland soils and/or dredge spoils were used to create the new mangrove habitats, the soils at younger created sites and at lower depths (10-30 cm) had higher bulk densities, higher sand content, lower soil organic matter (SOM), lower total carbon (TC), and lower total nitrogen (TN) than did natural reference wetland soils. However, in the upper soil layer (0-10 cm), SOM, TC, and TN increased with created wetland site age simultaneously with mangrove forest growth. The rate of created wetland soil C accumulation was comparable to literature values for natural mangrove wetlands. Notably, the time to equivalence for the upper soil layer of created mangrove wetlands appears to be faster than for many other wetland ecosystem types. Collectively, our findings characterize the rate and trajectory of above- and below-ground changes associated with ecosystem development in created mangrove wetlands; this is valuable information for environmental managers planning to sustain existing mangrove wetlands or mitigate for mangrove wetland losses.

  16. Treating a 20 mm Hg gradient alleviates myocardial hypertrophy in experimental aortic coarctation.

    Science.gov (United States)

    Wendell, David C; Friehs, Ingeborg; Samyn, Margaret M; Harmann, Leanne M; LaDisa, John F

    2017-10-01

    Children with coarctation of the aorta (CoA) can have a hyperdynamic and remodeled left ventricle (LV) from increased afterload. Literature from an experimental model suggests the putative 20 mm Hg blood pressure gradient (BPG) treatment guideline frequently implemented in CoA studies may permit irreversible vascular changes. LV remodeling from pressure overload has been studied, but data are limited following correction and using a clinically representative BPG. Rabbits underwent CoA at 10 weeks to induce a 20 mm Hg BPG using permanent or dissolvable suture thereby replicating untreated and corrected CoA, respectively. Cardiac function was evaluated at 32 weeks by magnetic resonance imaging using a spoiled cine GRE sequence (TR/TE/FA 8/2.9/20), 14 × 14-cm FOV, and 3-mm slice thickness. Images (20 frames/cycle) were acquired in 6-8 short axis views from the apex to the mitral valve annulus. LV volume, ejection fraction (EF), and mass were quantified. LV mass was elevated for CoA (5.2 ± 0.55 g) versus control (3.6 ± 0.16 g) and corrected (4.0 ± 0.44 g) rabbits, resulting in increased LV mass/volume ratio for CoA rabbits. A trend toward increased EF and stroke volume was observed but did not reach significance. Elevated EF by volumetric analysis in CoA rabbits was supported by concomitant increases in total aortic flow by phase-contrast magnetic resonance imaging. The indices quantified trended toward a persistent hyperdynamic LV despite correction, but differences were not statistically significant versus control rabbits. These findings suggest the current putative 20 mm Hg BPG for treatment may be reasonable from the LV's perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Phosphorus retention in a 20-year-old septic system filter bed.

    Science.gov (United States)

    Robertson, W D

    2012-01-01

    Septic systems in lakeshore environments often occur where thin soils overlie bedrock and, consequently, filter beds may be constructed of imported filter sand. The objective of this study was to assess the mobility of wastewater phosphorus (P) in such a potentially vulnerable setting by examining a 20-yr-old domestic septic system located near Parry Sound, ON, Canada, where the filter bed is constructed of imported noncalcareous sand. The groundwater plume is acidic (pH 6.0) and has a zone of elevated PO₄-P (up to 3.1 ± 1.7 mg L⁻¹) below the tile lines but no elevated PO₄-P is present beyond 5 m from the tile lines. Elevated concentrations of desorbable P (up to 137 mg kg⁻¹) and acid-extractable P (up to 3210 mg kg⁻¹) occur in the filter sand within 1 m below four of seven tile lines present and the total mass of excess acid-extractable P (39 kg) is similar to the estimated total lifetime P loading to the system (33 kg). Microprobe images reveal abundant Fe and Al-rich authigenic mineral coatings on the sand grains that are increasingly P rich (up to 10% w/w P) near the tile lines. Additionally, 6 yr of monitoring data show that groundwater PO₄ concentrations are not increasing. This indicates that mineral precipitation, not adsorption, dominates P immobilization at this site. This example of robust long-term P retention opens up the possibility of improving P removal in on-site treatment systems by prescribing specific sand types for filter bed construction. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  18. Towards a 20 kA high temperature superconductor current lead module using REBCO tapes

    Science.gov (United States)

    Heller, R.; Bagrets, N.; Fietz, W. H.; Gröner, F.; Kienzler, A.; Lange, C.; Wolf, M. J.

    2018-01-01

    Most of the large fusion devices presently under construction or in operation consisting of superconducting magnets like EAST, Wendelstein 7-X (W7-X), JT-60SA, and ITER, use high temperature superconductor (HTS) current leads (CL) to reduce the cryogenic load and operational cost. In all cases, the 1st generation HTS material Bi-2223 is used which is embedded in a low-conductivity matrix of AgAu. In the meantime, industry worldwide concentrates on the production of the 2nd generation HTS REBCO material because of the better field performance in particular at higher temperature. As the new material can only be produced in a multilayer thin-film structure rather than as a multi-filamentary tape, the technology developed for Bi-2223-based current leads cannot be transferred directly to REBCO. Therefore, several laboratories are presently investigating the design of high current HTS current leads made of REBCO. Karlsruhe Institute of Technology is developing a 20 kA HTS current lead using brass-stabilized REBCO tapes—as a further development to the Bi-2223 design used in the JT-60SA current leads. The same copper heat exchanger module as in the 20 kA JT-60SA current lead will be used for simplicity, which will allow a comparison of the newly developed REBCO CL with the earlier produced and investigated CL for JT-60SA. The present paper discusses the design and accompanying test of single tape and stack REBCO mock-ups. Finally, the fabrication of the HTS module using REBCO stacks is described.

  19. Prenatal Diagnosis of Transposition of the Great Arteries over a 20-Year Period: Improved but Imperfect

    Science.gov (United States)

    Escobar-Diaz, Maria C; Freud, Lindsay R; Bueno, Alejandra; Brown, David W; Friedman, Kevin; Schidlow, David; Emani, Sitaram; del Nido, Pedro; Tworetzky, Wayne

    2015-01-01

    Objective To evaluate temporal trends in prenatal diagnosis of transposition of the great arteries with intact ventricular septum (TGA/IVS) and its impact on neonatal morbidity and mortality. Methods Newborns with TGA/IVS referred for surgical management to our center over a 20-year period (1992 – 2011) were included. The study time was divided into 5 four-year periods, and the primary outcome was rate of prenatal diagnosis. Secondary outcomes included neonatal pre-operative status and perioperative survival. Results Of the 340 patients, 81 (24%) had a prenatal diagnosis. Prenatal diagnosis increased over the study period from 6% to 41% (p<0.001). Prenatally diagnosed patients underwent a balloon atrial septostomy (BAS) earlier than postnatally diagnosed patients (0 vs. 1 day, p<0.001) and fewer required mechanical ventilation (56% vs. 69%, p=0.03). There were no statistically significant differences in pre-operative acidosis (16% vs. 26%, p=0.1) and need for preoperative ECMO (2% vs. 3%, p=1.0). There was also no significant mortality difference (1 pre-operative and no post-operative deaths among prenatally diagnosed patients, as compared to 4 pre-operative and 6 post-operative deaths among postnatally diagnosed patients). Conclusion The prenatal detection rate of TGA/IVS has improved but still remains below 50%, suggesting the need for strategies to increase detection rates. The mortality rate was not statistically different between pre- and postnatally diagnosed patients; however, there were significant pre-operative differences with regard to earlier BAS and less mechanical ventilation. Ongoing study is required to elucidate whether prenatal diagnosis confers long-term benefit. PMID:25484180

  20. Family cohesion and posttraumatic intrusion and avoidance among war veterans: a 20-year longitudinal study.

    Science.gov (United States)

    Zerach, Gadi; Solomon, Zahava; Horesh, Danny; Ein-Dor, Tsachi

    2013-02-01

    The bi-directional relationships between combat-induced posttraumatic symptoms and family relations are yet to be understood. The present study assesses the longitudinal interrelationship of posttraumatic intrusion and avoidance and family cohesion among 208 Israeli combat veterans from the 1982 Lebanon War. Two groups of veterans were assessed with self-report questionnaires 1, 3 and 20 years after the war: a combat stress reaction (CSR) group and a matched non-CSR control group. Latent Trajectories Modeling showed that veterans of the CSR group reported higher intrusion and avoidance than non-CSR veterans at all three points of time. With time, there was a decline in these symptoms in both groups, but the decline was more salient among the CSR group. The latter also reported lower levels of family cohesion. Furthermore, an incline in family cohesion levels was found in both groups over the years. Most importantly, Autoregressive Cross-Lagged Modeling among CSR and non-CSR veterans revealed that CSR veterans' posttraumatic symptoms in 1983 predicted lower family cohesion in 1985, and lower family cohesion, in turn, predicted posttraumatic symptoms in 2002. The findings suggest that psychological breakdown on the battlefield is a marker for future family cohesion difficulties. Our results lend further support for the bi-directional mutual effects of posttraumatic symptoms and family cohesion over time.

  1. Regulation of the human SLC25A20 expression by peroxisome proliferator-activated receptor alpha in human hepatoblastoma cells

    Energy Technology Data Exchange (ETDEWEB)

    Tachibana, Keisuke, E-mail: nya@phs.osaka-u.ac.jp [Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Takeuchi, Kentaro; Inada, Hirohiko [Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Yamasaki, Daisuke [Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); The Center for Advanced Medical Engineering and Informatics, Osaka University, 2-2 Yamadaoka, Suita, Osaka 565-0871 (Japan); Ishimoto, Kenji [Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); Graduate School of Medicine, Osaka University, 2-2 Yamadaoka, Suita, Osaka 565-0871 (Japan); Tanaka, Toshiya; Hamakubo, Takao; Sakai, Juro; Kodama, Tatsuhiko [Laboratory for System Biology and Medicine, Research Center for Advanced Science and Technology, University of Tokyo, 4-6-1 Komaba, Meguro, Tokyo 153-8904 (Japan); Doi, Takefumi [Graduate School of Pharmaceutical Sciences, Osaka University, 1-6 Yamadaoka, Suita, Osaka 565-0871 (Japan); The Center for Advanced Medical Engineering and Informatics, Osaka University, 2-2 Yamadaoka, Suita, Osaka 565-0871 (Japan); Graduate School of Medicine, Osaka University, 2-2 Yamadaoka, Suita, Osaka 565-0871 (Japan)

    2009-11-20

    Solute carrier family 25, member 20 (SLC25A20) is a key molecule that transfers acylcarnitine esters in exchange for free carnitine across the mitochondrial membrane in mitochondrial β-oxidation. The peroxisome proliferator-activated receptor alpha (PPARα) is a ligand-activated transcription factor that plays an important role in the regulation of β-oxidation. We previously established a tetracycline-regulated human cell line that can be induced to express PPARα and found that PPARα induces SLC25A20 expression. In this study, we analyzed the promoter region of the human slc25a20 gene and showed that PPARα regulates the expression of human SLC25A20 via the peroxisome proliferator responsive element.

  2. Regulation of the human SLC25A20 expression by peroxisome proliferator-activated receptor alpha in human hepatoblastoma cells

    International Nuclear Information System (INIS)

    Tachibana, Keisuke; Takeuchi, Kentaro; Inada, Hirohiko; Yamasaki, Daisuke; Ishimoto, Kenji; Tanaka, Toshiya; Hamakubo, Takao; Sakai, Juro; Kodama, Tatsuhiko; Doi, Takefumi

    2009-01-01

    Solute carrier family 25, member 20 (SLC25A20) is a key molecule that transfers acylcarnitine esters in exchange for free carnitine across the mitochondrial membrane in mitochondrial β-oxidation. The peroxisome proliferator-activated receptor alpha (PPARα) is a ligand-activated transcription factor that plays an important role in the regulation of β-oxidation. We previously established a tetracycline-regulated human cell line that can be induced to express PPARα and found that PPARα induces SLC25A20 expression. In this study, we analyzed the promoter region of the human slc25a20 gene and showed that PPARα regulates the expression of human SLC25A20 via the peroxisome proliferator responsive element.
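
    The peroxisome proliferator response element mentioned in these records is commonly described as a DR1 motif, i.e. a direct repeat of an AGGTCA-like hexamer spaced by one nucleotide. The sketch below scans a made-up promoter fragment for a loose version of that consensus; it illustrates the idea only and is not the analysis performed on the slc25a20 promoter.

```python
import re

# Loose DR1-style pattern: two AGGTCA-like half-sites separated by one base.
# The character classes relax each position slightly; this is illustrative only.
DR1 = re.compile(r"[AG]GG[GT][CT][AC].[AG]GG[GT][CT][AC]")

# Hypothetical promoter fragment (not the real slc25a20 sequence).
promoter = "GCGCTAGGTCAAAGGTCATTCCGGATCCTTAGGGCAAGTTCA"

for match in DR1.finditer(promoter):
    print(f"candidate PPRE at {match.start()}-{match.end()}: {match.group()}")
```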

  3. Fracture analyses of WWER reactor pressure vessels

    International Nuclear Information System (INIS)

    Sievers, J.; Liu, X.

    1997-01-01

    In this paper, the methodology of fracture assessment based on finite element (FE) calculations is first described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab

  4. Fracture analyses of WWER reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, J; Liu, X [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    1997-09-01

    In this paper, the methodology of fracture assessment based on finite element (FE) calculations is first described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab.
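
    The simplified methods that the FE-based approach is compared against typically reduce to a stress-intensity estimate of the form K_I = Y·σ·√(πa), checked against an assumed fracture toughness. The sketch below works that formula with invented numbers; it stands in for the simplified screening step, not for the finite-element analyses of the records.

```python
import math

def stress_intensity_mpa_sqrt_m(stress_mpa: float, crack_depth_m: float, y: float = 1.1) -> float:
    """K_I = Y * sigma * sqrt(pi * a), in MPa*sqrt(m)."""
    return y * stress_mpa * math.sqrt(math.pi * crack_depth_m)

# Illustrative postulated crack depths (m) under a hypothetical 180 MPa hoop stress.
sigma = 180.0
k_ic = 50.0   # assumed fracture toughness in MPa*sqrt(m), illustrative only
for a in (0.005, 0.015, 0.030):
    k_i = stress_intensity_mpa_sqrt_m(sigma, a)
    status = "acceptable" if k_i < k_ic else "exceeds assumed toughness"
    print(f"a = {a*1000:4.0f} mm: K_I = {k_i:5.1f} MPa*sqrt(m) -> {status}")
```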

  5. Frequency and changes in trends of leading risk factors of coronary heart disease in women in the city of Novi Sad during a 20-year period

    Directory of Open Access Journals (Sweden)

    Rakić Dušica

    2012-01-01

    Background/Aim. From 1984 to 2004 the city of Novi Sad participated, through its Health Center “Novi Sad”, in the international Multinational MONItoring of Trends and Determinants in CArdiovascular Disease (MONICA) project, as one of the 38 research centers in 21 countries around the world. The aim of this study was to determine the frequency of, and changes in trends of, the leading risk factors for coronary heart disease (CHD) and to analyze the trend of coronary events in women in Novi Sad during a 20-year period. Methods. In 2004, the fourth survey within the MONICA project was conducted in the city of Novi Sad. The representative sample included 1,041 women between the ages of 25 and 74. The prevalence of risk factors for CHD such as smoking, high blood pressure, elevated blood cholesterol, elevated blood glucose and obesity was determined. Also, indicators of risk factors and rates of coronary events in women were compared with the results from the MONICA project obtained in the previous three screens, as well as with the results from other research centres. The χ²-test, linear trend and correlation coefficient were used in the statistical analysis of the results obtained. Results. During the 20-year period covered by the study, the prevalence of the leading risk factors for the development of CHD in the surveyed women increased significantly, in positive correlation with the values of the linear trend. Also, the increases in morbidity and mortality rates of coronary events were in positive correlation. A decrease was only recorded in the period 1985-1989 (the implementation of the intervention programme). Conclusion. Given the increase in the prevalence of leading risk factors for CHD and the significant increase in the rates of coronary events, we can conclude that the health status of women in Novi Sad deteriorated over the 20-year period.

  6. Involvement of Ubiquitin-Editing Protein A20 in Modulating Inflammation in Rat Cochlea Associated with Silver Nanoparticle-Induced CD68 Upregulation and TLR4 Activation

    Science.gov (United States)

    Feng, Hao; Pyykkö, Ilmari; Zou, Jing

    2016-05-01

    Silver nanoparticles (AgNPs) were shown to temporarily impair the biological barriers in the skin of the external ear canal, mucosa of the middle ear, and inner ear, causing partially reversible hearing loss after delivery into the middle ear. The current study aimed to elucidate the molecular mechanism, emphasizing the TLR signaling pathways in association with the potential recruitment of macrophages in the cochlea and the modulation of inflammation by ubiquitin-editing protein A20. Molecules potentially involved in these signaling pathways were thoroughly analysed using immunohistochemistry in the rat cochlea exposed to AgNPs at various concentrations through intratympanic injection. The results showed that 0.4 % AgNPs but not 0.02 % AgNPs upregulated the expressions of CD68, TLR4, MCP1, A20, and RNF11 in the strial basal cells, spiral ligament fibrocytes, and non-sensory supporting cells of Corti's organ. 0.4 % AgNPs had no effect on CD44, TLR2, MCP2, Rac1, myosin light chain, VCAM1, Erk1/2, JNK, p38, IL-1β, TNF-α, TNFR1, TNFR2, IL-10, or TGF-β. This study suggested that AgNPs might confer macrophage-like functions on the strial basal cells and spiral ligament fibrocytes and enhance the immune activities of non-sensory supporting cells of Corti's organ through the upregulation of CD68, which might be involved in TLR4 activation. A20 and RNF11 played roles in maintaining cochlear homeostasis via negative regulation of the expressions of inflammatory cytokines.

  7. Identifying null meta-analyses that are ripe for updating

    Directory of Open Access Journals (Sweden)

    Fang Manchun

    2003-07-01

    Abstract Background As an increasingly large number of meta-analyses are published, quantitative methods are needed to help clinicians and systematic review teams determine when meta-analyses are not up to date. Methods We propose new methods for determining when non-significant meta-analytic results might be overturned, based on a prediction of the number of participants required in new studies. To guide decision making, we introduce the "new participant ratio", the ratio of the actual number of participants in new studies to the predicted number required to obtain statistical significance. A simulation study was conducted to study the performance of our methods and a real meta-analysis provides further evidence. Results In our three simulation configurations, our diagnostic test for determining whether a meta-analysis is out of date had sensitivity of 55%, 62%, and 49% with corresponding specificity of 85%, 80%, and 90% respectively. Conclusions Simulations suggest that our methods are able to detect out-of-date meta-analyses. These quick and approximate methods show promise for use by systematic review teams to help decide whether to commit the considerable resources required to update a meta-analysis. Further investigation and evaluation of the methods is required before they can be recommended for general use.
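
    A hedged sketch of the "new participant ratio" idea under the simplest fixed-effect assumptions: if the pooled effect stays where it is, the extra inverse-variance weight needed to reach z = 1.96 can be translated into an approximate number of new participants (using var(SMD) of roughly 4/n per study). This is a simplified reading of the paper's proposal, not its actual formulas.

```python
import math

Z_CRIT = 1.96

def predicted_new_participants(pooled_effect: float, pooled_se: float) -> float:
    """Participants needed in new studies for significance, assuming the effect is unchanged."""
    w_old = 1.0 / pooled_se**2                    # current total inverse-variance weight
    w_needed = (Z_CRIT / pooled_effect) ** 2      # total weight required for |z| >= 1.96
    w_new = max(w_needed - w_old, 0.0)
    return 4.0 * w_new                            # var(SMD) ~ 4/n  =>  n ~ 4 * weight

# Hypothetical null meta-analysis: SMD = 0.15, SE = 0.10  (z = 1.5, not significant).
needed = predicted_new_participants(0.15, 0.10)
actual_new = 900.0                                # hypothetical participants in newly published trials
print(f"predicted participants required: {needed:.0f}")
print(f"new participant ratio: {actual_new / needed:.2f}  (>1 suggests the meta-analysis is ripe for updating)")
```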

  8. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  9. Altools: a user friendly NGS data analyser.

    Science.gov (United States)

    Camiolo, Salvatore; Sablok, Gaurav; Porceddu, Andrea

    2016-02-17

    Genotyping by re-sequencing has become a standard approach to estimate single nucleotide polymorphism (SNP) diversity, haplotype structure and biodiversity, and has been defined as an efficient approach to address geographical population genomics of several model species. To access core SNPs and insertion/deletion polymorphisms (indels), and to infer the phyletic patterns of speciation, most such approaches map short reads to the reference genome. Variant calling is important to establish patterns of genome-wide association studies (GWAS) for quantitative trait loci (QTLs), and to determine the population and haplotype structure based on SNPs, thus allowing content-dependent trait and evolutionary analysis. Several tools have been developed to investigate such polymorphisms as well as more complex genomic rearrangements such as copy number variations, presence/absence variations and large deletions. The programs available for this purpose have different strengths (e.g. accuracy, sensitivity and specificity) and weaknesses (e.g. low computation speed, complex installation procedure and absence of a user-friendly interface). Here we introduce Altools, a software package that is easy to install and use, which allows the precise detection of polymorphisms and structural variations. Altools uses the BWA/SAMtools/VarScan pipeline to call SNPs and indels, and the dnaCopy algorithm to achieve genome segmentation according to local coverage differences in order to identify copy number variations. It also uses insert size information from the alignment of paired-end reads and detects potential large deletions. A double mapping approach (BWA/BLASTn) identifies precise breakpoints while ensuring rapid processing. Finally, Altools implements several processes that yield deeper insight into the genes affected by the detected polymorphisms. Altools was used to analyse both simulated and real next-generation sequencing (NGS) data and performed satisfactorily in terms of
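
    Downstream of a variant-calling pipeline like the one the record names, results typically arrive as VCF-style records. The sketch below classifies a few made-up lines into SNPs and indels, just to illustrate the kind of post-processing such a tool performs; it is not Altools code.

```python
# Classify VCF-style variant records into SNPs and indels (toy example, not Altools).
vcf_lines = [
    "chr1\t10177\t.\tA\tAC\t.\tPASS\t.",    # insertion
    "chr1\t10352\t.\tT\tA\t.\tPASS\t.",     # SNP
    "chr2\t20443\t.\tGTT\tG\t.\tPASS\t.",   # deletion
]

counts = {"snp": 0, "indel": 0}
for line in vcf_lines:
    chrom, pos, _id, ref, alt, *_ = line.split("\t")
    kind = "snp" if len(ref) == 1 and len(alt) == 1 else "indel"
    counts[kind] += 1
    print(f"{chrom}:{pos} {ref}>{alt} -> {kind}")

print("totals:", counts)
```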

  10. Ginkgo Biloba Extract and Long-Term Cognitive Decline: A 20-Year Follow-Up Population-Based Study

    Science.gov (United States)

    Amieva, Hélène; Meillon, Céline; Helmer, Catherine; Barberger-Gateau, Pascale; Dartigues, Jean François

    2013-01-01

    Background: Numerous studies have looked at the potential benefits of various nootropic drugs such as Ginkgo biloba extract (EGb761®; Tanakan®) and piracetam (Nootropyl®) on age-related cognitive decline, often leading to inconclusive results due to small sample sizes or insufficient follow-up duration. The present study assesses the association between intake of EGb761® and cognitive function of elderly adults over a 20-year period. Methods and Findings: The data were gathered from the prospective community-based cohort study ‘Paquid’. Within the study sample of 3612 non-demented participants aged 65 and over at baseline, three groups were compared: 589 subjects reporting use of EGb761® at at least one of the ten assessment visits, 149 subjects reporting use of piracetam at one of the assessment visits and 2874 subjects not reporting use of either EGb761® or piracetam. Decline on MMSE, verbal fluency and visual memory over the 20-year follow-up was analysed with a multivariate mixed linear effects model. A significant difference in MMSE decline over the 20-year follow-up was observed in the EGb761® and piracetam treatment groups compared to the ‘neither treatment’ group. These effects were in opposite directions: the EGb761® group declined less rapidly than the ‘neither treatment’ group, whereas the piracetam group declined more rapidly (β = −0.6). Regarding verbal fluency and visual memory, no difference was observed between the EGb761® group and the ‘neither treatment’ group (respectively, β = 0.21 and β = −0.03), whereas the piracetam group declined more rapidly (respectively, β = −1.40 and β = −0.44). When comparing the EGb761® and piracetam groups directly, a different decline was observed for the three tests (respectively β = −1.07, β = −1.61 and β = −0.41). Conclusion: Cognitive decline in a non-demented elderly population was lower in subjects who reported using EGb761® than in those who did not.

  11. Ginkgo biloba extract and long-term cognitive decline: a 20-year follow-up population-based study.

    Directory of Open Access Journals (Sweden)

    Hélène Amieva

    Full Text Available Numerous studies have looked at the potential benefits of various nootropic drugs such as Ginkgo biloba extract (EGb761®; Tanakan®) and piracetam (Nootropyl®) on age-related cognitive decline often leading to inconclusive results due to small sample sizes or insufficient follow-up duration. The present study assesses the association between intake of EGb761® and cognitive function of elderly adults over a 20-year period. The data were gathered from the prospective community-based cohort study 'Paquid'. Within the study sample of 3612 non-demented participants aged 65 and over at baseline, three groups were compared: 589 subjects reporting use of EGb761® at at least one of the ten assessment visits, 149 subjects reporting use of piracetam at one of the assessment visits and 2874 subjects not reporting use of either EGb761® or piracetam. Decline on MMSE, verbal fluency and visual memory over the 20-year follow-up was analysed with a multivariate mixed linear effects model. A significant difference in MMSE decline over the 20-year follow-up was observed in the EGb761® and piracetam treatment groups compared to the 'neither treatment' group. These effects were in opposite directions: the EGb761® group declined less rapidly than the 'neither treatment' group, whereas the piracetam group declined more rapidly (β = -0.6). Regarding verbal fluency and visual memory, no difference was observed between the EGb761® group and the 'neither treatment' group (respectively, β = 0.21 and β = -0.03), whereas the piracetam group declined more rapidly (respectively, β = -1.40 and β = -0.44). When comparing the EGb761® and piracetam groups directly, a different decline was observed for the three tests (respectively β = -1.07, β = -1.61 and β = -0.41). Cognitive decline in a non-demented elderly population was lower in subjects who reported using EGb761® than in those who did not. This effect may be a specific medication
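
    As a concrete illustration of the modelling approach used in both versions of this study, the sketch below fits a mixed linear effects model of cognitive decline with random intercepts and slopes per participant. The data file and column names (mmse, years, group, subject_id, age, sex) are hypothetical stand-ins, not the Paquid variables.

```python
# Hedged sketch of a mixed linear effects model for cognitive decline (hypothetical columns).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("paquid_like_data.csv")   # assumed long format: one row per visit

# Fixed effects: time, treatment group (EGb761 / piracetam / neither) and their
# interaction (the group-specific slope captures differential decline), plus
# baseline covariates. Random effects: intercept and slope for each subject.
model = smf.mixedlm(
    "mmse ~ years * group + age + sex",
    data=df,
    groups=df["subject_id"],
    re_formula="~years",
)
result = model.fit(reml=True)
print(result.summary())   # the years:group terms estimate group differences in decline
```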

  12. OsDOG, a gibberellin-induced A20/AN1 zinc-finger protein, negatively regulates gibberellin-mediated cell elongation in rice.

    Science.gov (United States)

    Liu, Yaju; Xu, Yunyuan; Xiao, Jun; Ma, Qibin; Li, Dan; Xue, Zhen; Chong, Kang

    2011-07-01

    The A20/AN1 zinc-finger proteins (ZFPs) play pivotal roles in animal immune responses and plant stress responses. From previous gibberellin (GA) microarray data and A20/AN1 ZFP family member association, we chose Oryza sativa dwarf rice with overexpression of gibberellin-induced gene (OsDOG) to examine its function in the GA pathway. OsDOG was induced by gibberellic acid (GA(3)) and repressed by the GA-synthesis inhibitor paclobutrazol. Different transgenic lines with constitutive expression of OsDOG showed dwarf phenotypes due to deficiency of cell elongation. Additional GA(1) and real-time PCR quantitative assay analyses confirmed that the decrease of GA(1) in the overexpression lines resulted from reduced expression of GA3ox2 and enhanced expression of GA2ox1 and GA2ox3. Adding exogenous GA rescued the constitutive expression phenotypes of the transgenic lines. OsDOG has a novel function in regulating GA homeostasis and in negative maintenance of plant cell elongation in rice. Copyright © 2011 Elsevier GmbH. All rights reserved.

  13. Vitamin E γ-Tocotrienol Inhibits Cytokine-Stimulated NF-κB Activation by Induction of Anti-Inflammatory A20 via Stress Adaptive Response Due to Modulation of Sphingolipids.

    Science.gov (United States)

    Wang, Yun; Park, Na-Young; Jang, Yumi; Ma, Averil; Jiang, Qing

    2015-07-01

    NF-κB plays a central role in pathogenesis of inflammation and cancer. Many phytochemicals, including γ-tocotrienol (γTE), a natural form of vitamin E, have been shown to inhibit NF-κB activation, but the underlying mechanism has not been identified. In this study, we show that γTE inhibited cytokine-triggered activation of NF-κB and its upstream regulator TGF-β-activated kinase-1 in murine RAW 264.7 macrophages and primary bone marrow-derived macrophages. In these cells, γTE induced upregulation of A20, an inhibitor of NF-κB. Knockout of A20 partially diminished γTE's anti-NF-κB effect, but γTE increased another NF-κB inhibitor, Cezanne, in A20(-/-) cells. In search of the reason for A20 upregulation, we found that γTE treatment increased phosphorylation of translation initiation factor 2, IκBα, and JNK, indicating induction of endoplasmic reticulum stress. Liquid chromatography-tandem mass spectrometry analyses revealed that γTE modulated sphingolipids, including enhancement of intracellular dihydroceramides, sphingoid bases in de novo synthesis of the sphingolipid pathway. Chemical inhibition of de novo sphingolipid synthesis partially reversed γTE's induction of A20 and the anti-NF-κB effect. The importance of dihydroceramide increase is further supported by the observation that C8-dihydroceramide mimicked γTE in upregulating A20, enhancing endoplasmic reticulum stress, and attenuating TNF-triggered NF-κB activation. Our study identifies a novel anti-NF-κB mechanism where A20 is induced by stress-induced adaptive response as a result of modulation of sphingolipids, and it demonstrates an immunomodulatory role of dihydroceramides. Copyright © 2015 by The American Association of Immunologists, Inc.

  14. Publication of noninferiority clinical trials: changes over a 20-year interval.

    Science.gov (United States)

    Suda, Katie J; Hurley, Anne M; McKibbin, Trevor; Motl Moroney, Susannah E

    2011-09-01

    The primary objective was to evaluate the change in publication rate of noninferiority trials over a 20-year interval (1989-2009). Secondary objectives were to analyze the frequency of noninferiority trials by therapeutic category, the frequency of noninferiority trial publication by journal, the impact factors of the publishing journals, any potential special advantages of the study drug over the control, the funding sources of the trials, pharmaceutical industry affiliation of the authors, and the use of ghostwriters in the creation of manuscripts. Retrospective literature review of 583 articles. PubMed (January 1989-December 2009) and EMBASE (first quarter 1989-fourth quarter 2009) databases. A total of 583 articles reporting the results of randomized controlled clinical trials with a noninferiority study design that evaluated drug therapies, published in English between 1989 and 2009, were included in the analysis. A consistent increase was noted in their yearly publication rates, with no trials published in 1989 versus 133 in 2009. One hundred twenty-six articles (21.6%) were in the therapeutic category of infectious diseases, followed by 78 (13.4%) in cardiology. Among the journals identified, The New England Journal of Medicine had the highest publication rate of trials with a noninferiority design, with 29 (5.0%) of the identified trials published in this journal. The median impact factor of the journals publishing noninferiority trials was 4.807 (interquartile range 3.064-7.5). The most common advantage of the study drug over the control was reduced duration of treatment or reduced pill burden (80 studies [22.9%]). A total of 425 trials (72.9%) listed the pharmaceutical industry as the only funding source. Among 369 trials with authors employed by the pharmaceutical industry, 101 (17.3%) disclosed an acknowledgment to an individual, other than those listed as authors, who contributed to writing the manuscript and who was affiliated with a medical information

  15. A20 is critical for the induction of Pam3CSK4-tolerance in monocytic THP-1 cells.

    Directory of Open Access Journals (Sweden)

    Jinyue Hu

    Full Text Available A20 functions to terminate Toll-like receptor (TLR)-induced immune responses, and plays important roles in the induction of lipopolysaccharide (LPS)-tolerance. However, the molecular mechanism for Pam3CSK4-tolerance is uncertain. Here we report that TLR1/2 ligand Pam3CSK4 induced tolerance in monocytic THP-1 cells. The pre-treatment of THP-1 cells with Pam3CSK4 down-regulated the induction of pro-inflammatory cytokines induced by Pam3CSK4 re-stimulation. Pam3CSK4 pre-treatment also down-regulated the signaling transduction of JNK, p38 and NF-κB induced by Pam3CSK4 re-stimulation. The activation of TLR1/2 induced a rapid and robust up-regulation of A20, suggesting that A20 may contribute to the induction of Pam3CSK4-tolerance. This hypothesis was proved by the observation that the over-expression of A20 by gene transfer down-regulated Pam3CSK4-induced inflammatory responses, and the down-regulation of A20 by RNA interference inhibited the induction of tolerance. Moreover, LPS induced a significant up-regulation of A20, which contributed to the induction of cross-tolerance between LPS and Pam3CSK4. A20 was also induced by the treatment of THP-1 cells with TNF-α and IL-1β. The pre-treatment with TNF-α and IL-1β partly down-regulated Pam3CSK4-induced activation of MAPKs. Furthermore, pharmacologic inhibition of GSK3 signaling down-regulated Pam3CSK4-induced A20 expression, up-regulated Pam3CSK4-induced inflammatory responses, and partly reversed Pam3CSK4 pre-treatment-induced tolerance, suggesting that GSK3 is involved in TLR1/2-induced tolerance by up-regulation of A20 expression. Taken together, these results indicated that A20 is a critical regulator for TLR1/2-induced pro-inflammatory responses.

  16. An MDE Approach for Modular Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Aksit, Mehmet; Rensink, Arend

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java, and access the program to be analyzed through libraries that offer an API for reading, writing

  17. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  18. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development and in the care process itself (see Fig.

  19. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    This presentation shows figures concerned with analyses of inelastic effects in pipework as follows: comparison of experimental results with calculated results from simplified analyses for free end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of straight pipe under combined bending and torsion; results of fatigue tests of pipe bends

  20. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

    Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.; hide

    2005-01-01

    Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observation System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between ∼30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied. The choice of which analysis to use is most critical for detailed transport

  1. Predictable Medea

    Directory of Open Access Journals (Sweden)

    Elisabetta Bertolino

    2010-01-01

    Full Text Available By focusing on the tragedy of the 'unpredictable' infanticide perpetrated by Medea, the paper speculates on the possibility of a non-violent ontological subjectivity for women victims of gendered violence and whether it is possible to respond to violent actions in non-violent ways; it argues that Medea did not act in an unpredictable way, rather through the very predictable subject of resentment and violence. 'Medea' represents the story of all of us who require justice as retribution against any wrong. The presupposition is that the empowered female subjectivity of women’s rights contains the same desire of mastering others of the masculine current legal and philosophical subject. The subject of women’s rights is grounded on the emotions of resentment and retribution and refuses the categories of the private by appropriating those of the righteous, masculine and public subject. The essay opposes the essentialised stereotypes of the feminine and the maternal with an ontological approach of people as singular, corporeal, vulnerable and dependent. There is therefore an emphasis on the excluded categories of the private. Forgiveness is taken into account as a category of the private and a possibility of responding to violence with newness. A violent act is seen in relations to the community of human beings rather than through an isolated setting as in the case of the individual of human rights. In this context, forgiveness allows to risk again and being with. The result is also a rethinking of feminist actions, feminine subjectivity and of the maternal. Overall the paper opens up the Arendtian category of action and forgiveness and the Cavarerian unique and corporeal ontology of the selfhood beyond gendered stereotypes.

  2. Contribution of thermo-fluid analyses to the LHC experiments

    CERN Document Server

    Gasser, G

    2003-01-01

    The large amount of electrical and electronic equipment that will be installed in the four LHC experiments will cause significant heat dissipation into the detectors’ volumes. This is a major issue for the experimental groups, as temperature stability is often a fundamental requirement for the different sub-detectors to be able to provide a good measurement quality. The thermo-fluid analyses that are carried out in the ST/CV group are a very efficient tool to understand and predict the thermal behaviour of the detectors. These studies are undertaken according to the needs of the experimental groups; they aim to evaluate the thermal stability of a proposed design, or to compare different technical solutions in order to choose the best one for the final design. The usual approach to carrying out these studies is presented first, and then some practical examples of thermo-fluid analyses are described, focusing on the main results in order to illustrate their contribution.

  3. ATWS analyses for Krsko Full Scope Simulator verification

    Energy Technology Data Exchange (ETDEWEB)

    Cerne, G; Tiselj, I; Parzer, I [Reactor Engineering Div., Inst. Jozef Stefan, Ljubljana (Slovenia)

    2000-07-01

    The purpose of this analysis was to simulate an Anticipated Transient Without Scram (ATWS) for the Krsko NPP. The results of these calculations were used for verification of the reactor coolant system thermal-hydraulic response predicted by the Krsko Full Scope Simulator. For the thermal-hydraulic analyses, the RELAP5/MOD2 code and the input card deck for NPP Krsko were used. The analyses for ATWS were performed to assess the influence and benefit of the ATWS Mitigation System Actuation Circuitry (AMSAC). In this paper, the most severe ATWS scenarios are analyzed, starting with the loss of Main Feedwater at both steam generators and the resulting gradual loss of the secondary heat sink. In addition, the control rods were assumed not to scram, leaving the chain reaction to be controlled only by the inherent physical properties of the fuel and moderator and by possible actions of the BOP system. The primary system response has been studied with respect to AMSAC availability. (author)

  4. Publication Trends in Acupuncture Research: A 20-Year Bibliometric Analysis Based on PubMed.

    Science.gov (United States)

    Ma, Yan; Dong, Ming; Zhou, Kehua; Mita, Carol; Liu, Jianping; Wayne, Peter M

    2016-01-01

    Acupuncture has become popular and widely practiced in many countries around the world. Despite the large amount of acupuncture-related literature that has been published, broader trends in the prevalence and scope of acupuncture research remain underexplored. The current study quantitatively analyzes trends in acupuncture research publications in the past 20 years. A bibliometric approach was used to search PubMed for all acupuncture-related research articles including clinical and animal studies. Inclusion criteria were articles published between 1995 and 2014 with sufficient information for bibliometric analyses. Rates and patterns of acupuncture publication within the 20-year observational period were estimated, and compared with broader publication rates in biomedicine. Identified eligible publications were further analyzed with respect to study type/design, clinical condition addressed, country of origin, and journal impact factor. A total of 13,320 acupuncture-related publications were identified using our search strategy and eligibility criteria. Regression analyses indicated an exponential growth in publications over the past two decades, with a mean annual growth rate of 10.7%. This compares to a mean annual growth rate of 4.5% in biomedicine. A striking trend was an observed increase in the proportion of randomized clinical trials (RCTs), from 7.4% in 1995 to 20.3% in 2014, exceeding the 4.5% proportional growth of RCTs in biomedicine. Over the 20 year period, pain was consistently the most common focus of acupuncture research (37.9% of publications). Other top rankings with respect to medical focus were arthritis, neoplasms/cancer, pregnancy or labor, mood disorders, stroke, nausea/vomiting, sleep, and paralysis/palsy. Acupuncture research was conducted in 60 countries, with the top 3 contributors being China (47.4%), United States (17.5%), and United Kingdom (8.2%). Retrieved articles were published mostly in complementary and alternative medicine (CAM) journals.
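
    The reported mean annual growth rate can be reproduced, in principle, with a simple log-linear regression of yearly publication counts on year, as sketched below. The counts here are synthetic placeholders generated with roughly 10% growth; the real analysis would use the actual yearly PubMed tallies.

```python
# Log-linear growth-rate sketch with synthetic counts (placeholders, not the study data).
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1995, 2015)
counts = np.round(200 * 1.10 ** (years - 1995) * rng.uniform(0.9, 1.1, years.size))

# Fit log(count) = a + b*year; the mean annual growth rate is exp(b) - 1.
slope, intercept = np.polyfit(years, np.log(counts), 1)
print(f"estimated mean annual growth rate: {np.exp(slope) - 1:.1%}")
```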

  5. Secondary structural analyses of ITS1 in Paramecium.

    Science.gov (United States)

    Hoshina, Ryo

    2010-01-01

    The nuclear ribosomal RNA gene operon is interrupted by internal transcribed spacer (ITS) 1 and ITS2. Although the secondary structure of ITS2 has been widely investigated, less is known about ITS1 and its structure. In this study, the secondary structure of ITS1 sequences for Paramecium and other ciliates was predicted. Each Paramecium ITS1 forms an open loop with three helices, A through C. Helix B was highly conserved among Paramecium, and similar helices were found in other ciliates. A phylogenetic analysis using the ITS1 sequences showed high resolution, implying that ITS1 is a good tool for species-level analyses.
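
    A minimal example of the kind of secondary-structure prediction used here is shown below, assuming the ViennaRNA Python bindings are installed. The sequence is a short made-up placeholder, not a real Paramecium ITS1.

```python
# Secondary-structure prediction sketch with ViennaRNA (placeholder sequence).
import RNA  # ViennaRNA Python bindings

its1 = "GGAUACGUAGCUAGCUAGGCUAUCGAUCGGCUAGCUAGCAUCGAUCC"  # made-up ITS1-like fragment
structure, mfe = RNA.fold(its1)        # minimum-free-energy structure
print(structure)                       # dot-bracket string: helices show up as (((...)))
print(f"MFE: {mfe:.2f} kcal/mol")
```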

  6. Acute abdomen in pregnancy requiring surgical management: a 20-case series.

    Science.gov (United States)

    Unal, Aysun; Sayharman, Sema Etiz; Ozel, Leyla; Unal, Ethem; Aka, Nurettin; Titiz, Izzet; Kose, Gultekin

    2011-11-01

    The obstetrician often has a difficult task in diagnosing and managing the acute abdomen in pregnancy. A reluctance to operate during pregnancy adds unnecessary delay, which may increase morbidity for both mother and fetus. In this study, we present our experience with pregnant patients with acute abdomen. Pregnant patients with acute abdomen requiring surgical exploration were enrolled from 2007 to 2010. Demographics, gestational age, symptoms, fetal loss, preterm delivery, imaging studies, operative results, postoperative complications and histopathologic evaluations were recorded. Ultrasound (US) and magnetic resonance (MR) imaging studies were evaluated. Data analyses were performed with Microsoft Excel and statistical evaluations were done using Student's t-test. There were 20 patients with a mean age of 32 years. The rate of emergency surgery was significantly higher in the second trimester; the two most common causes of acute abdomen accounted for 30% and 15% of cases, respectively. All patients tolerated surgery well, and postoperative complications included wound infection (10%), preterm labor (5%), and prolonged paralytic ileus (5%). One patient died from advanced gastric carcinoma, and the only fetal death occurred in this case. Prompt diagnosis and appropriate therapy are crucial in pregnant patients with acute abdomen. The use of US may be limited and CT is not desirable due to fetal irradiation; MR has thus become increasingly popular in the evaluation of such patients. Adhesive small bowel obstruction should be kept in mind as an important etiology. Copyright © 2011. Published by Elsevier Ireland Ltd.

  7. Cancer Incidence and Mortality in a Cohort of US Blood Donors: A 20-Year Study

    International Nuclear Information System (INIS)

    Vahidnia, F.; Busch, M. P.; Custer, B.; Hirschler, N. V.; Chinn, A.; Agapova, M.; Busch, M. P.; Custer, B.

    2013-01-01

    Blood donors are considered one of the healthiest populations. This study describes the epidemiology of cancer in a cohort of blood donors up to 20 years after blood donation. Records from donors who participated in the Retroviral Epidemiology Donor Study (REDS, 1991-2002) at Blood Centers of the Pacific (BCP), San Francisco, were linked to the California Cancer Registry (CCR, 1991-2010). Standardized incidence ratios (SIR) were estimated using standard US 2000 population, and survival analysis used to compare all-cause mortality among donors and a random sample of nondonors with cancer from CCR. Of 55,158 eligible allogeneic blood donors followed-up for 863,902 person-years, 4,236 (7.7%) primary malignant cancers were diagnosed. SIR in donors was 1.59 (95% CI = 1.54,1.64). Donors had significantly lower mortality (adjusted HR = 0.70, 95% CI = 0.66-0.74) compared with nondonor cancer patients, except for respiratory system cancers (adjusted HR = 0.93, 95% CI = 0.82-1.05). Elevated cancer incidence among blood donors may reflect higher diagnosis rates due to health seeking behavior and cancer screening in donors. A “healthy donor effect” on mortality following cancer diagnosis was demonstrated. This population-based database and sample repository of blood donors with long-term monitoring of cancer incidence provides the opportunity for future analyses of genetic and other biomarkers of cancer.

  8. Outcome of radioiodine-131 therapy in hyperfunctioning thyroid nodules: a 20 years' retrospective study.

    Science.gov (United States)

    Ceccarelli, Claudia; Bencivelli, Walter; Vitti, Paolo; Grasso, Lucia; Pinchera, Aldo

    2005-03-01

    To investigate the risk of hypothyroidism after radioiodine (131I) treatment for hyperfunctioning thyroid nodules. Retrospective analysis of patients treated with 131I for hyperfunctioning thyroid nodules and followed up for a maximum of 20 years. A total of 346 patients treated with 131I in the years 1975-95, for a single hyperfunctioning nodule. Hypothyroidism was defined as TSH levels > 3.7 mU/l. Kaplan-Meier survival analysis was used to analyse permanence of euthyroidism after 131I. A stepwise Cox proportional hazard model was used to identify factors influencing the progression to hypothyroidism. The cumulative incidence of hypothyroidism was 7.6% at 1 year, 28% at 5 years, 46% at 10 years and 60% at 20 years. Age and 131I uptake significantly influenced progression to hypothyroidism, whereas thyroid and nodule size, thyroid status at diagnosis and degree of extranodular thyroid parenchymal suppression had no influence. In hyperthyroid patients with partial parenchymal suppression, however, previous MMI treatment was the most important prognostic factor. Twenty years after treatment, about 60% of patients treated for a hyperfunctioning nodule are hypothyroid. Factors increasing the risk of hypothyroidism are age, 131I uptake and MMI pretreatment. The prognostic value of this last factor, however, depends on the degree of suppression of the extranodular thyroid parenchyma at the scan.
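
    The Kaplan-Meier analysis described above can be sketched as follows with the lifelines package; the file and column names (years_followup, became_hypothyroid) are hypothetical, not the study's variables.

```python
# Kaplan-Meier sketch of time to hypothyroidism after 131I (hypothetical columns).
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("thyroid_followup.csv")   # assumed: one row per treated patient

kmf = KaplanMeierFitter()
kmf.fit(durations=df["years_followup"],              # time to hypothyroidism or censoring
        event_observed=df["became_hypothyroid"],     # 1 = became hypothyroid during follow-up
        label="euthyroid")
# Cumulative incidence of hypothyroidism at selected follow-up times.
print(1 - kmf.survival_function_at_times([1, 5, 10, 20]))
```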

  9. Cancer Incidence and Mortality in a Cohort of US Blood Donors: A 20-Year Study

    Science.gov (United States)

    Hirschler, Nora V.; Chinn, Artina; Busch, Michael P.; Custer, Brian

    2013-01-01

    Blood donors are considered one of the healthiest populations. This study describes the epidemiology of cancer in a cohort of blood donors up to 20 years after blood donation. Records from donors who participated in the Retroviral Epidemiology Donor Study (REDS, 1991–2002) at Blood Centers of the Pacific (BCP), San Francisco, were linked to the California Cancer Registry (CCR, 1991–2010). Standardized incidence ratios (SIR) were estimated using standard US 2000 population, and survival analysis used to compare all-cause mortality among donors and a random sample of nondonors with cancer from CCR. Of 55,158 eligible allogeneic blood donors followed-up for 863,902 person-years, 4,236 (7.7%) primary malignant cancers were diagnosed. SIR in donors was 1.59 (95% CI = 1.54,1.64). Donors had significantly lower mortality (adjusted HR = 0.70, 95% CI = 0.66–0.74) compared with nondonor cancer patients, except for respiratory system cancers (adjusted HR = 0.93, 95% CI = 0.82–1.05). Elevated cancer incidence among blood donors may reflect higher diagnosis rates due to health seeking behavior and cancer screening in donors. A “healthy donor effect” on mortality following cancer diagnosis was demonstrated. This population-based database and sample repository of blood donors with long-term monitoring of cancer incidence provides the opportunity for future analyses of genetic and other biomarkers of cancer. PMID:24489545

  10. Cancer Incidence and Mortality in a Cohort of US Blood Donors: A 20-Year Study

    Directory of Open Access Journals (Sweden)

    Farnaz Vahidnia

    2013-01-01

    Full Text Available Blood donors are considered one of the healthiest populations. This study describes the epidemiology of cancer in a cohort of blood donors up to 20 years after blood donation. Records from donors who participated in the Retroviral Epidemiology Donor Study (REDS, 1991–2002) at Blood Centers of the Pacific (BCP), San Francisco, were linked to the California Cancer Registry (CCR, 1991–2010). Standardized incidence ratios (SIR) were estimated using standard US 2000 population, and survival analysis used to compare all-cause mortality among donors and a random sample of nondonors with cancer from CCR. Of 55,158 eligible allogeneic blood donors followed-up for 863,902 person-years, 4,236 (7.7%) primary malignant cancers were diagnosed. SIR in donors was 1.59 (95% CI = 1.54,1.64). Donors had significantly lower mortality (adjusted HR = 0.70, 95% CI = 0.66–0.74) compared with nondonor cancer patients, except for respiratory system cancers (adjusted HR = 0.93, 95% CI = 0.82–1.05). Elevated cancer incidence among blood donors may reflect higher diagnosis rates due to health seeking behavior and cancer screening in donors. A “healthy donor effect” on mortality following cancer diagnosis was demonstrated. This population-based database and sample repository of blood donors with long-term monitoring of cancer incidence provides the opportunity for future analyses of genetic and other biomarkers of cancer.
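
    Since all three versions of this record hinge on a standardized incidence ratio (SIR), a minimal sketch of the computation is given below: observed cancers divided by the number expected from age- and sex-specific rates in a standard population. The input files and column names are assumptions for illustration.

```python
# SIR sketch: observed / expected cases from standard-population rates (assumed inputs).
import pandas as pd

person_years = pd.read_csv("cohort_person_years.csv")     # columns: age_group, sex, pyears
std_rates = pd.read_csv("standard_population_rates.csv")  # columns: age_group, sex, rate

observed = 4236   # primary malignant cancers observed in the donor cohort

merged = person_years.merge(std_rates, on=["age_group", "sex"])
expected = (merged["pyears"] * merged["rate"]).sum()      # sum of stratum-specific expectations

print(f"SIR = {observed / expected:.2f}")
```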

  11. TALC, a new deployable concept for a 20 m far-infrared space telescope

    International Nuclear Information System (INIS)

    Durand, Gilles; Sauvage, Marc; Rodriguez, Louis; Ronayette, Samuel; Reveret, Vincent; Aussel, Herve; Pantin, Eric; Berthe, Michel; Martignac, Jerome; Motte, Frederique; Talvard, Michel; Minier, Vincent; Scola, Loris; Carty, Michael

    2014-01-01

    TALC, Thin Aperture Light Collector, is a 20 m space observatory project exploring some unconventional optical solutions (between the single dish and the interferometer) allowing the resolving power of a classical 27 m telescope. With TALC, the principle is to remove the central part of the primary mirror dish, cut the remaining ring into 24 sectors and store them on top of one another. The aim of this far infrared telescope is to explore the 600 μm to 100 μm region. With this approach we have shown that we can store a ring-telescope of outer diameter 20 m and ring thickness of 3 m inside the fairing of Ariane 5 or Ariane 6. The general structure is that of a bicycle wheel, in which the inner sides of the segments are in compression against each other and play the role of a rim. The segments are linked to each other using a pantograph scissor system that lets the segments extend from a pile of dishes to a parabolic ring while keeping high stiffness at all times during the deployment. The inner corners of the segments are linked to a central axis using spokes as in a bicycle wheel. The secondary mirror and the instrument box are built as a solid unit fixed at the extremity of the main axis. The tensegrity analysis of this structure shows a very high stiffness to mass ratio, resulting in a 3 Hz eigenfrequency. The segments will consist of two composite skins and a honeycomb CFRP structure built by a replica process. Solid segments will be compared to deformable segments using the controlled shear of the rear surface. The adjustment of the length of the spokes and of the relative position of the sides of neighbouring segments allows control of the phasing of the entire primary mirror. The telescope is cooled by natural radiation. It is protected from sun radiation by a large inflatable solar screen, loosely linked to the telescope. The orientation is performed by inertia wheels. This telescope carries a wide field bolometer camera using a cryo-cooler at 0.3 K as one of the main instruments. This

  12. TALC: a new deployable concept for a 20m far-infrared space telescope

    Science.gov (United States)

    Durand, Gilles; Sauvage, Marc; Bonnet, Aymeric; Rodriguez, Louis; Ronayette, Samuel; Chanial, Pierre; Scola, Loris; Révéret, Vincent; Aussel, Hervé; Carty, Michael; Durand, Matthis; Durand, Lancelot; Tremblin, Pascal; Pantin, Eric; Berthe, Michel; Martignac, Jérôme; Motte, Frédérique; Talvard, Michel; Minier, Vincent; Bultel, Pascal

    2014-08-01

    TALC, Thin Aperture Light Collector, is a 20 m space observatory project exploring some unconventional optical solutions (between the single dish and the interferometer) allowing the resolving power of a classical 27 m telescope. With TALC, the principle is to remove the central part of the primary mirror dish, cut the remaining ring into 24 sectors and store them on top of one another. The aim of this far infrared telescope is to explore the 600 μm to 100 μm region. With this approach we have shown that we can store a ring-telescope of outer diameter 20 m and ring thickness of 3 m inside the fairing of Ariane 5 or Ariane 6. The general structure is that of a bicycle wheel, in which the inner sides of the segments are in compression against each other and play the role of a rim. The segments are linked to each other using a pantograph scissor system that lets the segments extend from a pile of dishes to a parabolic ring while keeping high stiffness at all times during the deployment. The inner corners of the segments are linked to a central axis using spokes as in a bicycle wheel. The secondary mirror and the instrument box are built as a solid unit fixed at the extremity of the main axis. The tensegrity analysis of this structure shows a very high stiffness to mass ratio, resulting in a 3 Hz eigenfrequency. The segments will consist of two composite skins and a honeycomb CFRP structure built by a replica process. Solid segments will be compared to deformable segments using the controlled shear of the rear surface. The adjustment of the length of the spokes and of the relative position of the sides of neighbouring segments allows control of the phasing of the entire primary mirror. The telescope is cooled by natural radiation. It is protected from sun radiation by a large inflatable solar screen, loosely linked to the telescope. The orientation is performed by inertia wheels. This telescope carries a wide field bolometer camera using a cryocooler at 0.3 K as one of the main instruments. This

  13. A 20-Year Overview of Quercus robur L. Mortality and Crown Conditions in Slovenia

    Directory of Open Access Journals (Sweden)

    Matjaž Čater

    2015-02-01

    Full Text Available Pedunculate oak (Quercus robur L.) forests in Slovenia are experiencing widespread mortality. Changes in lowlands are reflected in the decline of entire forest complexes, high mortality, uneven stand structure and associated forest regeneration problems. Prediction of the response of present trees in disturbed forest ecosystems may significantly contribute to better guideline policies for silvicultural and forest management practice in the changing environment, in both stressed and stable forest ecosystems. Data from annual crown condition surveys for the 1995–2014 period from four permanent plots have been compared with parameters from hemispherical photo analysis and hydrometeorological data. Good agreement has been confirmed between crown defoliation and total openness; all parameters from the hemispherical photo analysis, which were corrected for winter period values, also indicated a better agreement. Mortality rate and crown defoliation correlated well with the extreme drought events in 2003 and 2013. The pattern of agreement among the compared parameters differed among the plots Krakovski gozd, Dobrava and the other plots. Mortality is influenced much more by average air temperatures than by precipitation and groundwater table oscillations.

  14. PBFA Z: A 20-MA z-pinch driver for plasma radiation sources

    International Nuclear Information System (INIS)

    Spielman, R.B.; Breeze, S.F.; Deeney, C.

    1996-01-01

    Sandia National Laboratories is completing a major modification to the PBFA-II facility. PBFA Z will be a z-pinch driver capable of delivering up to 20 MA to a z-pinch load. It optimizes the electrical coupling to the implosion energy of z pinches at implosion velocities of ∼ 40 cm/μs. Design constraints resulted in an accelerator with a 0.12-Ω impedance, a 10.25-nH inductance, and a 120-ns pulse width. The design required new water transmission lines, insulator stack, and vacuum power feeds. Current is delivered to the z-pinch load through four self-magnetically-insulated vacuum transmission lines and a double post-hole convolute. A variety of design codes are used to model the power flow. These predict a peak current of 20 MA to a z-pinch load having a 2-cm length, a 2-cm radius, and a 15-mg mass, coupling 1.5 MJ into kinetic energy. We present 2-D Rad-Hydro calculations showing MJ x-ray outputs from tungsten wire-array z pinches
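
    The quoted kinetic-energy coupling is easy to sanity-check with a back-of-the-envelope estimate, assuming the whole 15-mg load reaches the stated implosion velocity:

```latex
E_k = \tfrac{1}{2} m v^2
    = \tfrac{1}{2}\,(15\ \mathrm{mg})\,(40\ \mathrm{cm/\mu s})^2
    = \tfrac{1}{2}\,(1.5\times10^{-5}\ \mathrm{kg})\,(4.0\times10^{5}\ \mathrm{m/s})^2
    \approx 1.2\ \mathrm{MJ}
```

    which is of the same order as the 1.5 MJ quoted for the 20-MA drive, the difference plausibly reflecting the detailed current and mass-velocity profiles.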

  15. PBFA Z: A 20-MA Z-pinch driver for plasma radiation sources

    International Nuclear Information System (INIS)

    Spielman, R.B.; Breeze, S.F.; Deeney, C.

    1996-01-01

    Sandia National Laboratories is completing a major modification to the PBFA-II facility. PBFA Z will be capable of delivering up to 20 MA to a z-pinch load. It optimizes the electrical coupling to the implosion energy of z pinches at implosion velocities of ∼ 40 cm/μs. Design constraints resulted in an accelerator with a 0.12-Ω impedance, a 10.25-nH inductance, and a 120-ns pulse width. The design required new water transmission lines, insulator stack, and vacuum power feeds. Current is delivered to the z-pinch load through four self-magnetically-insulated vacuum transmission lines and a double post-hole convolute. A variety of design codes are used to model the power flow. These predict a peak current of 20 MA to a z-pinch load having a 2-cm length, a 2-cm radius, and a 15-mg mass, coupling 1.5 MJ into kinetic energy. Calculations are presented showing MJ x-ray outputs from tungsten wire-array z pinches. (author). 4 figs., 14 refs

  16. Continuous Improvement of a Groundwater Model over a 20-Year Period: Lessons Learned.

    Science.gov (United States)

    Andersen, Peter F; Ross, James L; Fenske, Jon P

    2018-04-17

    Groundwater models developed for specific sites generally become obsolete within a few years due to changes in: (1) modeling technology; (2) site/project personnel; (3) project funding; and (4) modeling objectives. Consequently, new models are sometimes developed for the same sites using the latest technology and data, but without potential knowledge gained from the prior models. When it occurs, this practice is particularly problematic because, although technology, data, and observed conditions change, development of the new numerical model may not consider the conceptual model's underpinnings. As a contrary situation, we present the unique case of a numerical flow and trichloroethylene (TCE) transport model that was first developed in 1993 and since revised and updated annually by the same personnel. The updates are prompted by an increase in the amount of data, exposure to a wider range of hydrologic conditions over increasingly longer timeframes, technological advances, evolving modeling objectives, and revised modeling methodologies. The history of updates shows smooth, incremental changes in the conceptual model and modeled aquifer parameters that result from both increase and decrease in complexity. Myriad modeling objectives have included demonstrating the ineffectiveness of a groundwater extraction/injection system, evaluating potential TCE degradation, locating new monitoring points, and predicting likelihood of exceedance of groundwater standards. The application emphasizes an original tenet of successful groundwater modeling: iterative adjustment of the conceptual model based on observations of actual vs. model response. © 2018, National Ground Water Association.

  17. PBFA Z: A 20-MA Z-pinch driver for plasma radiation sources

    Energy Technology Data Exchange (ETDEWEB)

    Spielman, R B; Breeze, S F; Deeney, C [Sandia Labs., Albuquerque, NM (United States); and others

    1997-12-31

    Sandia National Laboratories is completing a major modification to the PBFA-II facility. PBFA Z will be capable of delivering up to 20 MA to a z-pinch load. It optimizes the electrical coupling to the implosion energy of z pinches at implosion velocities of ∼ 40 cm/μs. Design constraints resulted in an accelerator with a 0.12-Ω impedance, a 10.25-nH inductance, and a 120-ns pulse width. The design required new water transmission lines, insulator stack, and vacuum power feeds. Current is delivered to the z-pinch load through four self-magnetically-insulated vacuum transmission lines and a double post-hole convolute. A variety of design codes are used to model the power flow. These predict a peak current of 20 MA to a z-pinch load having a 2-cm length, a 2-cm radius, and a 15-mg mass, coupling 1.5 MJ into kinetic energy. Calculations are presented showing MJ x-ray outputs from tungsten wire-array z pinches. (author). 4 figs., 14 refs.

  18. Identification of a novel A20-binding inhibitor of nuclear factor-kappa B activation termed ABIN-2.

    Science.gov (United States)

    Van Huffel, S; Delaei, F; Heyninck, K; De Valck, D; Beyaert, R

    2001-08-10

    The nuclear factor kappaB (NF-kappaB) plays a central role in the regulation of genes implicated in immune responses, inflammatory processes, and apoptotic cell death. The zinc finger protein A20 is a cellular inhibitor of NF-kappaB activation by various stimuli and plays a critical role in terminating NF-kappaB responses. The underlying mechanism for NF-kappaB inhibition by A20 is still unknown. A20 has been shown to interact with several proteins including tumor necrosis factor (TNF) receptor-associated factors 2 and 6, as well as the inhibitory protein of kappaB kinase (IKK) gamma protein. Here we report the cloning and characterization of ABIN-2, a previously unknown protein that binds to the COOH-terminal zinc finger domain of A20. NF-kappaB activation induced by TNF and interleukin-1 is inhibited by overexpression of ABIN-2. The latter also inhibits NF-kappaB activation induced by overexpression of receptor-interacting protein or TNF receptor-associated factor 2. In contrast, NF-kappaB activation by overexpression of IKKbeta or direct activators of the IKK complex, such as Tax, cannot be inhibited by ABIN-2. These results indicate that ABIN-2 interferes with NF-kappaB activation upstream of the IKK complex and that it might contribute to the NF-kappaB-inhibitory function of A20.

  19. Specific recognition of linear polyubiquitin by A20 zinc finger 7 is involved in NF-κB regulation

    Science.gov (United States)

    Tokunaga, Fuminori; Nishimasu, Hiroshi; Ishitani, Ryuichiro; Goto, Eiji; Noguchi, Takuya; Mio, Kazuhiro; Kamei, Kiyoko; Ma, Averil; Iwai, Kazuhiro; Nureki, Osamu

    2012-01-01

    LUBAC (linear ubiquitin chain assembly complex) activates the canonical NF-κB pathway through linear polyubiquitination of NEMO (NF-κB essential modulator, also known as IKKγ) and RIP1. However, the regulatory mechanism of LUBAC-mediated NF-κB activation remains elusive. Here, we show that A20 suppresses LUBAC-mediated NF-κB activation by binding linear polyubiquitin via the C-terminal seventh zinc finger (ZF7), whereas CYLD suppresses it through deubiquitinase (DUB) activity. We determined the crystal structures of A20 ZF7 in complex with linear diubiquitin at 1.70–1.98 Å resolutions. The crystal structures revealed that A20 ZF7 simultaneously recognizes the Met1-linked proximal and distal ubiquitins, and that genetic mutations associated with B cell lymphomas map to the ubiquitin-binding sites. Our functional analysis indicated that the binding of A20 ZF7 to linear polyubiquitin contributes to the recruitment of A20 into a TNF receptor (TNFR) signalling complex containing LUBAC and IκB kinase (IKK), which results in NF-κB suppression. These findings provide new insight into the regulation of immune and inflammatory responses. PMID:23032187

  20. Neurological abnormalities predict disability

    DEFF Research Database (Denmark)

    Poggesi, Anna; Gouw, Alida; van der Flier, Wiesje

    2014-01-01

    To investigate the role of neurological abnormalities and magnetic resonance imaging (MRI) lesions in predicting global functional decline in a cohort of initially independent-living elderly subjects. The Leukoaraiosis And DISability (LADIS) Study, involving 11 European centres, was primarily aimed at evaluating age-related white matter changes (ARWMC) as an independent predictor of the transition to disability (according to the Instrumental Activities of Daily Living scale) or death in independent elderly subjects who were followed up for 3 years. At baseline, a standardized neurological examination ... 0 years, 45 % males), 327 (51.7 %) presented at the initial visit with ≥1 neurological abnormality and 242 (38 %) reached the main study outcome. Cox regression analyses, adjusting for MRI features and other determinants of functional decline, showed that the baseline presence of any neurological...
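
    The Cox analysis reported above can be sketched as follows with the lifelines package; the file and column names are hypothetical stand-ins for the LADIS variables, and the covariate list is only indicative of "adjusting for MRI features and other determinants".

```python
# Cox regression sketch: baseline neurological abnormality vs. disability or death
# (hypothetical, numerically coded columns; illustrative covariates only).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ladis_like_followup.csv")   # assumed: one row per participant

cols = ["years_followup", "disabled_or_died",
        "any_neuro_abnormality", "arwmc_grade", "age", "sex"]
cph = CoxPHFitter()
cph.fit(df[cols], duration_col="years_followup", event_col="disabled_or_died")
cph.print_summary()   # hazard ratio for any_neuro_abnormality, adjusted for the other columns
```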

  1. Level II Ergonomic Analyses, Dover AFB, DE

    Science.gov (United States)

    1999-02-01

    Report IERA-RS-BR-TR-1999-0002, United States Air Force: Level II Ergonomic Analyses, Dover AFB, DE, by Andrew Marcotte and Marilyn Joyce (The Joyce...). Only report front matter is recoverable from the scanned text: the report documentation page and the beginning of the table of contents (1.0 Introduction; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the ...).

  2. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  3. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural......-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...

  4. Insights into a 20-ha multi-contaminated brownfield megasite: An environmental forensics approach

    Energy Technology Data Exchange (ETDEWEB)

    Gallego, J.R., E-mail: jgallego@uniovi.es; Rodríguez-Valdés, E.; Esquinas, N.; Fernández-Braña, A.; Afif, E.

    2016-09-01

    Here we addressed the contamination of soils in an abandoned brownfield located in an industrial area. Detailed soil and waste characterisation guided by historical information about the site revealed pyrite ashes (a residue derived from the roasting of pyrite ores) as the main environmental risk. In fact, the disposal of pyrite ashes and the mixing of these ashes with soils have affected a large area of the site, thereby causing heavy metal(loid) pollution (As and Pb levels reaching several thousands of ppm). A full characterisation of the pyrite ashes was thus performed. In this regard, we determined the bioavailable metal species present and their implications, grain-size distribution, mineralogy, and Pb isotopic signature in order to obtain an accurate conceptual model of the site. We also detected significant concentrations of pyrogenic benzo(a)pyrene and other PAHs, and studied the relation of these compounds with the pyrite ashes. In addition, we examined other waste and spills of minor importance within the study site. The information gathered offered an insight into pollution sources, unravelled evidence from the industrial processes that took place decades ago, and identified the co-occurrence of contaminants by means of multivariate statistics. The environmental forensics study carried out provided greater information than conventional analyses for risk assessment purposes and for the selection of clean-up strategies adapted to future land use. - Highlights: • Complex legacy of contamination afflicts 20-ha brownfield • As and Pb highest soil pollutants • Forensic study reveals main waste and spills. • Comprehensive study of pyrite ashes (multi-point source of pollution) • Co-occurrence of PAH also linked to pyrite ashes.

  5. Long-term bladder and bowel management after spinal cord injury: a 20-year longitudinal study.

    Science.gov (United States)

    Savic, Gordana; Frankel, Hans L; Jamous, Mohamed Ali; Soni, Bakulesh M; Charlifue, Susan

    2018-02-16

    Prospective observational study. The aim of this study was to analyse changes in bladder and bowel management methods in persons with long-standing spinal cord injury (SCI). Two spinal centres in the UK. Data were collected through interviews and examinations between 1990 and 2010 in a sample of persons injured more than 20 years prior to 1990. For the 85 participants who completed the 2010 follow-up, the mean age was 67.7 years and the mean duration of injury was 46.3 years, 80% were male, 37.7% had tetraplegia AIS grade A, B, or C, 44.7% paraplegia AIS A, B, or C, and 17.6% an AIS D grade regardless of level. In all, 50.6% reported having changed their bladder method, 63.1% their bowel method, and 40.5% both methods since they enrolled in the study. The reasons for change were a combination of medical and practical factors. In men, condom drainage remained the most frequent bladder method, and in women, suprapubic catheter replaced straining/expressing as the most frequent method. The use of condom drainage and straining/expressing bladder methods decreased, whereas the use of suprapubic and intermittent catheters increased. Manual evacuation remained the most frequent bowel management method. The percentage of participants on spontaneous/voluntary bowel emptying, straining and medications alone decreased, whereas the use of colostomy and transanal irrigation increased over time. More than half the sample, all living with SCI for more than 40 years, required a change in their bladder and bowel management methods, for either medical or practical reasons. Regular follow-ups ensure adequate change of method if/when needed.

  6. Insights into a 20-ha multi-contaminated brownfield megasite: An environmental forensics approach

    International Nuclear Information System (INIS)

    Gallego, J.R.; Rodríguez-Valdés, E.; Esquinas, N.; Fernández-Braña, A.; Afif, E.

    2016-01-01

    Here we addressed the contamination of soils in an abandoned brownfield located in an industrial area. Detailed soil and waste characterisation guided by historical information about the site revealed pyrite ashes (a residue derived from the roasting of pyrite ores) as the main environmental risk. In fact, the disposal of pyrite ashes and the mixing of these ashes with soils have affected a large area of the site, thereby causing heavy metal(loid) pollution (As and Pb levels reaching several thousands of ppm). A full characterisation of the pyrite ashes was thus performed. In this regard, we determined the bioavailable metal species present and their implications, grain-size distribution, mineralogy, and Pb isotopic signature in order to obtain an accurate conceptual model of the site. We also detected significant concentrations of pyrogenic benzo(a)pyrene and other PAHs, and studied the relation of these compounds with the pyrite ashes. In addition, we examined other waste and spills of minor importance within the study site. The information gathered offered an insight into pollution sources, unravelled evidence from the industrial processes that took place decades ago, and identified the co-occurrence of contaminants by means of multivariate statistics. The environmental forensics study carried out provided greater information than conventional analyses for risk assessment purposes and for the selection of clean-up strategies adapted to future land use. - Highlights: • Complex legacy of contamination afflicts 20-ha brownfield • As and Pb highest soil pollutants • Forensic study reveals main waste and spills. • Comprehensive study of pyrite ashes (multi-point source of pollution) • Co-occurrence of PAH also linked to pyrite ashes

  7. MicroRNA-125b-5p suppresses Brucella abortus intracellular survival via control of A20 expression.

    Science.gov (United States)

    Liu, Ning; Wang, Lin; Sun, Changjiang; Yang, Li; Sun, Wanchun; Peng, Qisheng

    2016-07-29

    Brucella may establish chronic infection by regulating the expression of miRNAs. However, the role of miRNAs in modulating the intracellular growth of Brucella remains unclear. In this study, we show that Brucella abortus infection leads to downregulation of miR-125b-5p in macrophages. We establish that miR-125b-5p targets A20, an inhibitor of NF-kB activation. Additionally, expression of miR-125b-5p decreases A20 expression in B. abortus-infected macrophages and leads to NF-kB activation and increased production of TNFα. Furthermore, B. abortus survival is attenuated in the presence of miR-125b-5p. These results uncover a role for miR-125b-5p in the regulation of B. abortus intracellular survival via the control of A20 expression.

  8. Comparison with Russian analyses of meteor impact

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  9. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey, standard...

  10. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  11. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the global budget overall has been analysed. This has been done since there is a close relation between investments & consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us in defending our budget and making better priorities, and we will satisfy the requirements from our external auditors.

  12. CpG oligodeoxynucleotide induces apoptosis and cell cycle arrest in A20 lymphoma cells via TLR9-mediated pathways.

    Science.gov (United States)

    Qi, Xu-Feng; Zheng, Li; Kim, Cheol-Su; Lee, Kyu-Jae; Kim, Dong-Heui; Cai, Dong-Qing; Qin, Jun-Wen; Yu, Yan-Hong; Wu, Zheng; Kim, Soo-Ki

    2013-07-01

    Recent studies have suggested that the anti-cancer activity of CpG-oligodeoxynucleotides (CpG-ODNs) is due to their immunomodulatory effects in the tumor-bearing host. The purpose of this study was to investigate the direct cytotoxic activity of KSK-CpG, a novel CpG-ODN with an alternative CpG motif, against A20 and EL4 lymphoma cells in comparison with the previously used murine CpG motif (1826-CpG). To evaluate the potential cytotoxic effects of KSK-CpG on lymphoma cells, cell viability assay, confocal microscopy, flow cytometry, DNA fragmentation, Western blotting, and reverse transcription-polymerase chain reaction (RT-PCR) analysis were used. We found that KSK-CpG induced direct cytotoxicity in A20 lymphoma cells, but not in EL4 lymphoma cells, at least in part via TLR9-mediated pathways. Apoptotic cell death was demonstrated to play an important role in CpG-ODN-induced cytotoxicity. In addition, both a decrease in mitochondrial membrane potential and G1-phase arrest were involved in KSK-CpG-induced apoptosis in A20 cells. The activities of apoptotic molecules such as caspase-3, PARP, and Bax were increased, whereas the activation of p27(Kip1) and ERK was decreased in KSK-CpG-treated A20 cells. Furthermore, autocrine IFN-γ partially contributed to apoptotic cell death in KSK-CpG-treated A20 cells. Collectively, our findings suggest that KSK-CpG induces apoptotic cell death in A20 lymphoma cells at least in part by inducing G1-phase arrest and autocrine IFN-γ via increasing TLR9 expression, without the need for the immune system of the tumor-bearing host. This new understanding supports the development of TLR9-targeted therapy with CpG-ODNs as direct therapeutic agents for treating B lymphoma. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  14. Safety analyses for reprocessing and waste processing

    International Nuclear Information System (INIS)

    1983-03-01

    Presentation of an incident analysis of the process steps of the RP, simplified safety considerations, and safety analyses of the storage and solidification facilities of the RP. A release tree method is developed and tested. In particular, an incident analysis of the process steps, an evaluation of the SRL study, and safety analyses of the storage and solidification facilities of the RP are performed. (DG)

  15. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed as the probabilities of meltdown, radioactive releases, or harmful effects on the environment. Following the risk policies for chemical installations, as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication 'How to deal with risks', probabilistic risk analyses are required for nuclear power plants.

  16. Has life satisfaction in Norway increased over a 20-year period? Exploring age and gender differences in a prospective longitudinal study, HUNT.

    Science.gov (United States)

    Lysberg, Frode; Gjerstad, Pål; Småstuen, Milada Cvancarova; Innstrand, Siw Tone; Høie, Magnhild Mjåvatn; Arild Espnes, Geir

    2018-02-01

    The aim of the present study was to investigate the change in overall life satisfaction for different age groups and between genders over a 20-year period. Data from 1984 to 2008 were extracted from a large prospective longitudinal health study of Nord-Trøndelag (HUNT), Norway. The study included more than 176,000 participants ranging from 20 to 70+ years of age. Data were analysed using logistic regression and adjusted for gender. The analyses revealed an increase in life satisfaction for all age groups from 1984-1986 (HUNT 1) to 1995-1997 (HUNT 2), with the highest levels being reached in 2006-2008 (HUNT 3). For all age groups, the data showed an increase of about 20% for the period from 1984-1986 (HUNT 1) to 1995-1997 (HUNT 2). From 1995-1997 (HUNT 2) to 2006-2008 (HUNT 3), the increase in overall life satisfaction was 16% for the younger age groups and about 32% for the older age groups (40-69 and 70+ years). Women's scores for overall life satisfaction were higher than men's for nearly all age groups, using HUNT 3 as a reference. These findings suggest an increase in life satisfaction for all age groups from 1984 to 2008, especially for the older age groups (40-69 and 70+ years). The data indicate that women score higher on life satisfaction than men for most age groups.

  17. The descriptive epidemiology of sitting. A 20-country comparison using the International Physical Activity Questionnaire (IPAQ).

    Science.gov (United States)

    Bauman, Adrian; Ainsworth, Barbara E; Sallis, James F; Hagströmer, Maria; Craig, Cora L; Bull, Fiona C; Pratt, Michael; Venugopal, Kamalesh; Chau, Josephine; Sjöström, Michael

    2011-08-01

    Recent epidemiologic evidence points to health risks of prolonged sitting that are independent of physical activity, but few papers have reported the descriptive epidemiology of sitting in population studies with adults. This paper reports the prevalence of "high sitting time" and its correlates in an international study in 20 countries. Representative population samples from 20 countries were collected in 2002-2004, and a question was asked on usual weekday hours spent sitting. This question was part of the International Prevalence Study, using the International Physical Activity Questionnaire (IPAQ). The sitting measure has acceptable reliability and validity. Daily sitting time was compared among countries, and by age group, gender, educational attainment, and physical activity. Data were available for 49,493 adults aged 18-65 years from 20 countries. The median reported sitting time was 300 minutes/day, with an interquartile range of 180-480 minutes. Countries reporting the lowest amount of sitting included Portugal, Brazil, and Colombia (medians ≤180 min/day), whereas adults in Taiwan, Norway, Hong Kong, Saudi Arabia, and Japan reported the highest sitting times (medians ≥360 min/day). In adjusted analyses, adults aged 40-65 years were significantly less likely to be in the highest quintile for sitting than adults aged 18-39 years (AOR=0.796), and those with post-school education had higher sitting times compared with those with high school or less education (OR=1.349). Physical activity showed an inverse relationship, with those reporting low activity on the IPAQ three times more likely to be in the highest-sitting quintile compared to those reporting high physical activity. Median sitting time varied widely across countries. Assessing sitting time is an important new area for preventive medicine, in addition to assessing physical activity and sedentary behaviors. Population surveys that monitor lifestyle behaviors should add measures of sitting time to

  18. Nuclear criticality predictability

    International Nuclear Information System (INIS)

    Briggs, J.B.

    1999-01-01

    As a result of extensive efforts, a large portion of the tedious and redundant research and processing of critical experiment data has been eliminated. The necessary step in criticality safety analyses of validating computer codes with benchmark critical data is greatly streamlined, and valuable criticality safety experimental data are preserved. Criticality safety personnel in 31 different countries are now using the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments'. Much has been accomplished by the work of the ICSBEP. However, evaluation and documentation represent only one element of a successful Nuclear Criticality Safety Predictability Program, and this element only exists as a separate entity because this work was not completed in conjunction with the experimentation process. I believe, however, that the work of the ICSBEP has also served to unify the other elements of nuclear criticality predictability. All elements are interrelated, but for a time it seemed that communication between these elements was not adequate. The ICSBEP has highlighted gaps in data, has retrieved lost data, has helped to identify errors in cross section processing codes, and has helped bring the international criticality safety community together in a common cause as true friends and colleagues. It has been a privilege to associate with those who work so diligently to make the project a success. (J.P.N.)

  19. Plant corrosion: prediction of materials performance

    International Nuclear Information System (INIS)

    Strutt, J.E.; Nicholls, J.R.

    1987-01-01

    Seventeen papers have been compiled forming a book on computer-based approaches to corrosion prediction in a wide range of industrial sectors, including the chemical, petrochemical and power generation industries. Two papers have been selected and indexed separately. The first describes a system operating within BNFL's Reprocessing Division to predict materials performance in corrosive conditions to aid future plant design. The second describes the truncation of the distribution function of pit depths during high temperature oxidation of a 20Cr austenitic steel in the fuel cladding in AGR systems. (U.K.)

  20. SPE dose prediction using locally weighted regression

    International Nuclear Information System (INIS)

    Hines, J. W.; Townsend, L. W.; Nichols, T. F.

    2005-01-01

    When astronauts are outside Earth's protective magnetosphere, they are subject to large radiation doses resulting from solar particle events (SPEs). The total dose received from a major SPE in deep space could cause severe radiation poisoning. The dose is usually received over a 20-40 h time interval, but the event's effects may be mitigated with an early warning system. This paper presents a method to predict the total dose early in the event. It uses a locally weighted regression model, which is easier to train and provides predictions as accurate as the neural network models used previously. (authors)

  1. SPE dose prediction using locally weighted regression

    International Nuclear Information System (INIS)

    Hines, J. W.; Townsend, L. W.; Nichols, T. F.

    2005-01-01

    When astronauts are outside Earth's protective magnetosphere, they are subject to large radiation doses resulting from solar particle events. The total dose received from a major solar particle event in deep space could cause severe radiation poisoning. The dose is usually received over a 20-40 h time interval but the event's effects may be reduced with an early warning system. This paper presents a method to predict the total dose early in the event. It uses a locally weighted regression model, which is easier to train, and provides predictions as accurate as the neural network models that were used previously. (authors)
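
    Both entries above describe the same approach: a locally weighted regression model that maps the dose observed early in a solar particle event to the eventual total dose. As a rough illustration of the technique (not of the authors' actual model), the following Python sketch implements kernel-weighted linear regression on a one-dimensional predictor; the bandwidth, the synthetic early-dose/total-dose pairs and all variable names are assumptions made for the example.

      import numpy as np

      def loess_predict(x_train, y_train, x_query, bandwidth=1.0):
          """Locally weighted linear regression: fit a separate weighted line per query point."""
          X = np.column_stack([np.ones_like(x_train), x_train])   # design matrix [1, x]
          preds = []
          for xq in np.atleast_1d(x_query):
              # Gaussian kernel weights: training points near xq dominate the fit
              w = np.exp(-0.5 * ((x_train - xq) / bandwidth) ** 2)
              W = np.diag(w)
              # Weighted least squares: beta = (X'WX)^-1 X'Wy
              beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_train)
              preds.append(beta[0] + beta[1] * xq)
          return np.array(preds)

      # Hypothetical training events: dose seen in the first hours vs. eventual total dose
      early_dose = np.array([0.5, 1.0, 2.0, 3.5, 5.0, 6.5])
      total_dose = np.array([2.0, 3.8, 7.5, 13.0, 19.0, 24.5])
      print(loess_predict(early_dose, total_dose, x_query=4.0, bandwidth=1.5))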

  2. Nuclear power plant analysers: their approach to analysis and design

    International Nuclear Information System (INIS)

    Ancarani, A.; Zanobetti, D.

    1985-01-01

    'Analysers' as used for nuclear power plant simulators are powerful tools, and their purpose can be variously assigned: it may vary from aiding the design of power plants to assisting operators in emergency situations. A fundamental problem arising from the analysers' concept and use is the definition of the simulation capability. This can be assessed either by comparison with previous operational data that are statistically significant and suitably elaborated, or by comparison with theoretical (computed) values obtained from engineering codes. In both cases, to take advantage of all the possibilities offered by the analysers, it is mandatory that suitable terms of reference be clearly stated and agreed upon. Particular care is devoted to accuracy in the prediction of physical values, both for steady-state and transient situations. For instance, such requirements can be met by specifying the maximum error on parameter values (ordinates), save for very fast transients; the maximum error on the time (abscissa) at which extreme values occur; the maximum error on the extreme values themselves (ordinates); and the maximum error on derivatives (slopes) for rapidly varying transients, save near extreme values. The paper also gives a brief account of present projects and proposals in different countries, as known from various sources, and mentions possible co-ordination at the international level. (author)

  3. Multivariate differential analyses of adolescents' experiences of aggression in families

    Directory of Open Access Journals (Sweden)

    Chris Myburgh

    2011-01-01

    Full Text Available Aggression is part of South African society and has implications for the mental health of persons living in South Africa. If parents are aggressive, adolescents are also likely to be aggressive, and that will impact negatively on their mental health. In this article the nature and extent of adolescents' experiences of aggression and aggressive behaviour in the family are investigated. A deductive explorative quantitative approach was followed. Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers), Cronbach's Alpha, various consecutive first- and second-order factor analyses, correlations, multiple regression, MANOVA, ANOVA and Scheffé/Dunnett tests were used. It was found that aggression correlated negatively with the independent variables, and that the correlations between adolescents and their parents were significant. Regression analyses indicated that different predictors predicted aggression. Furthermore, the differences in experienced levels of aggression between adolescents and their parents were small. Implications for education are given.
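
    Of the statistics listed in this abstract, Cronbach's Alpha is the simplest to show concretely. The Python sketch below computes it from a respondents-by-items score matrix; the 5-item scale and the scores are hypothetical, not data from the 101 families in the study.

      import numpy as np

      def cronbach_alpha(items):
          """items: rows = respondents, columns = questionnaire items."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)       # variance of each item
          total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
          return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

      # Hypothetical 5-item scale answered by 6 respondents
      scores = [[3, 4, 3, 5, 4],
                [2, 2, 3, 2, 3],
                [4, 5, 4, 4, 5],
                [1, 2, 1, 2, 2],
                [3, 3, 4, 3, 3],
                [5, 4, 5, 5, 4]]
      print(round(cronbach_alpha(scores), 3))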

  4. Mass separated neutral particle energy analyser

    International Nuclear Information System (INIS)

    Takeuchi, Hiroshi; Matsuda, Toshiaki; Miura, Yukitoshi; Shiho, Makoto; Maeda, Hikosuke; Hashimoto, Kiyoshi; Hayashi, Kazuo.

    1983-09-01

    A mass separated neutral particle energy analyser which could simultaneously measure hydrogen and deuterium atoms emitted from a tokamak plasma was constructed. The analyser was calibrated for energy and mass separation in the energy range from 0.4 keV to 9 keV. In order to investigate the behavior of deuterons and protons in the JFT-2 tokamak plasma heated with ion cyclotron waves and neutral beam injection, this analyser was installed on the JFT-2 tokamak. It was found that the energy spectrum could be determined with sufficient accuracy. The ion temperature and the ratio of deuteron and proton densities obtained from the energy spectrum were in good agreement with the values deduced from the Doppler broadening of the TiXIV line and the line intensities of Hα and Dα, respectively. (author)

  5. Advanced toroidal facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted using only 1/12 of the vessel geometry to mathematically analyze the entire vessel. This allowed the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  6. Thermal and stress analyses with ANSYS program

    International Nuclear Information System (INIS)

    Kanoo, Iwao; Kawaguchi, Osamu; Asakura, Junichi.

    1975-03-01

    Some analyses of heat conduction and elastic/inelastic stresses, carried out at the Power Reactor and Nuclear Fuel Development Corporation (PNC) in fiscal 1973 using the ANSYS (Engineering Analysis System) program, are summarized. In chapter I, the present state of structural analysis programs available for an FBR (fast breeder reactor) in PNC is explained. Chapter II is a brief description of the current status of ANSYS. Chapter III presents 8 examples of steady-state and transient thermal analyses for fast-reactor plant components, and chapter IV presents 5 examples of inelastic structural analysis. With advances in the finite element method, its applications in design studies should expand progressively in the future. The present report, it is hoped, will serve as a reference for similar analyses and at the same time help in understanding the deformation and strain behavior of structures. (Mori, K.)

  7. Periodic safety analyses; Les essais periodiques

    Energy Technology Data Exchange (ETDEWEB)

    Gouffon, A; Zermizoglou, R

    1990-12-01

    The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a programme of inspection of safe operation during construction, start-up and the service life of the plant, in order to obtain the data needed for estimating the lifetime of structures and components. At the same time, the programme should ensure that the safety margins are appropriate. Periodic safety analyses are an important part of the safety inspection programme. The periodic safety report is a method for testing the whole safety system, or a part of it, against precise criteria. Periodic safety analyses are not meant for the qualification of plant components. Separate analyses are devoted to start-up, qualification of components and materials, and ageing. All these analyses are described in this presentation. The last chapter describes the experience obtained for the PWR-900 and PWR-1300 units from 1986-1989.

  8. A Simple, Reliable Precision Time Analyser

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, B. V.; Nargundkar, V. R.; Subbarao, K.; Kamath, M. S.; Eligar, S. K. [Atomic Energy Establishment Trombay, Bombay (India)

    1966-06-15

    A 30-channel time analyser is described. The time analyser was designed and built for pulsed neutron research but can be applied to other uses. Most of the logic is performed by means of ferrite memory core and transistor switching circuits. This leads to great versatility, low power consumption, extreme reliability and low cost. The analyser described provides channel widths from 10 μs to 10 ms; arbitrarily wider channels are easily obtainable. It can handle counting rates up to 2000 counts/min in each channel with less than 1% dead-time loss. There is a provision for an initial delay equal to 100 channel widths. An input pulse de-randomizer unit using tunnel diodes ensures exactly equal channel widths. A brief description of the principles involved in core switching circuitry is given. The core-transistor transfer loop is compared with the usual core-diode loops and is shown to be more versatile and better adapted to the making of a time analyser. The circuits derived from the basic loop are described. These include the scale of ten, the frequency dividers and the delay generator. The current drivers developed for driving the cores are described. The crystal-controlled clock which controls the width of the time channels and synchronizes the operation of the various circuits is described. The detector pulse de-randomizer unit using tunnel diodes is described. The scheme of the time analyser is then described, showing how the various circuits can be integrated together to form a versatile time analyser. (author)

  9. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
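
    As a rough illustration of the control-chart element described above, the Python sketch below computes a centre line with 2-sigma warning and 3-sigma action limits for a series of bias values and flags points outside the action limits. The data, the limit conventions and the simple 3-sigma rule used here are generic assumptions, not the specific procedures selected in the report.

      import numpy as np

      def control_chart(bias):
          """Centre line plus 2-sigma warning and 3-sigma action limits for bias data."""
          bias = np.asarray(bias, dtype=float)
          centre = bias.mean()
          sigma = bias.std(ddof=1)
          limits = {"centre": centre,
                    "warning": (centre - 2 * sigma, centre + 2 * sigma),
                    "action": (centre - 3 * sigma, centre + 3 * sigma)}
          out_of_control = np.abs(bias - centre) > 3 * sigma   # simple 3-sigma rule
          return limits, out_of_control

      # Hypothetical daily bias values (measured minus declared) for one instrument
      daily_bias = [0.02, -0.01, 0.00, 0.03, -0.02, 0.01, 0.15, 0.00]
      limits, flags = control_chart(daily_bias)
      print(limits)
      print(flags)   # True marks any point outside the 3-sigma action limits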

  10. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...

  11. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.

  12. An Earthquake Prediction System Using The Time Series Analyses of Earthquake Property And Crust Motion

    International Nuclear Information System (INIS)

    Takeda, Fumihide; Takeo, Makoto

    2004-01-01

    We have developed a short-term deterministic earthquake (EQ) forecasting system, similar to those used for typhoons and hurricanes, which has been under test operation at the website http://www.tec21.jp/ since June 2003. We use the focus and crust displacement data recently opened to the public by the Japanese seismograph and global positioning system (GPS) networks, respectively. Our system divides the forecasting area into five regional areas of Japan, each of which is about 5 deg. by 5 deg. We have found that it can forecast the focus, date of occurrence and magnitude (M) of an impending EQ (whose M is larger than about 6), all within narrow limits. We have two examples to describe the system. One is the 2003/09/26 EQ of M 8 in the Hokkaido area, which is analysed in hindsight. Another is a successful rollout of the most recent forecast, on the 2004/05/30 EQ of M 6.7 off the coast of the southern Kanto (Tokyo) area.

  13. Behavioral and Physiological Neural Network Analyses: A Common Pathway toward Pattern Recognition and Prediction

    Science.gov (United States)

    Ninness, Chris; Lauter, Judy L.; Coffee, Michael; Clary, Logan; Kelly, Elizabeth; Rumph, Marilyn; Rumph, Robin; Kyle, Betty; Ninness, Sharon K.

    2012-01-01

    Using 3 diversified datasets, we explored the pattern-recognition ability of the Self-Organizing Map (SOM) artificial neural network as applied to diversified nonlinear data distributions in the areas of behavioral and physiological research. Experiment 1 employed a dataset obtained from the UCI Machine Learning Repository. Data for this study…
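
    For readers unfamiliar with the technique named here, the following Python sketch trains a small Self-Organizing Map from scratch on toy data; the grid size, learning schedule and input data are illustrative assumptions and do not reproduce the study's analyses.

      import numpy as np

      def train_som(data, grid=(5, 5), n_iter=2000, lr0=0.5, sigma0=2.0, seed=0):
          """Train a rectangular SOM; returns a grid of prototype (weight) vectors."""
          rng = np.random.default_rng(seed)
          weights = rng.random((grid[0], grid[1], data.shape[1]))
          coords = np.stack(np.indices(grid), axis=-1).astype(float)  # grid position of each node
          for t in range(n_iter):
              lr = lr0 * np.exp(-t / n_iter)         # decaying learning rate
              sigma = sigma0 * np.exp(-t / n_iter)   # shrinking neighbourhood radius
              x = data[rng.integers(len(data))]
              # Best-matching unit: node whose weight vector is closest to the sample
              dist = np.linalg.norm(weights - x, axis=-1)
              bmu = np.unravel_index(np.argmin(dist), dist.shape)
              # Pull the BMU and its grid neighbours towards the sample
              h = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
              weights += lr * h[..., None] * (x - weights)
          return weights

      data = np.random.default_rng(1).random((200, 3))   # toy 3-D input vectors
      som = train_som(data)
      print(som.shape)   # (5, 5, 3): a 5x5 map of 3-D prototype vectors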

  14. Prediction of heart disease using apache spark analysing decision trees and gradient boosting algorithm

    Science.gov (United States)

    Chugh, Saryu; Arivu Selvan, K.; Nadesh, RK

    2017-11-01

    Numerous harmful factors affect the functioning of the human body, such as hypertension, smoking, obesity and inappropriate medication, and these lead to many different diseases such as diabetes, thyroid disorders, strokes and coronary disease. Adverse environmental conditions are also a contributing cause of coronary disease. The architecture of Apache Spark relies on a processing model that requires the data to be gathered first. Apache Spark should be used to analyse data-intensive applications, and it offers several advantages, being fast because it uses in-memory processing. Apache Spark runs in a distributed environment and splits the data into batches, giving a high throughput rate. The use of data mining techniques in the diagnosis of coronary disease has been examined thoroughly, showing acceptable levels of precision. Decision trees, neural networks and the gradient boosting algorithm are among the Apache Spark techniques that help in processing the information.
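
    As a minimal illustration of the gradient-boosted decision-tree step named in this abstract, the Python sketch below trains a classifier on synthetic risk-factor data using scikit-learn; a Spark-based pipeline would instead use Spark MLlib's gradient-boosted tree classifier, but the underlying technique is the same. All features, labels and parameters here are assumptions made for the example.

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(0)
      n = 500
      # Hypothetical risk factors: age, systolic blood pressure, cholesterol, smoker flag
      X = np.column_stack([rng.integers(30, 80, n),
                           rng.normal(130, 20, n),
                           rng.normal(220, 40, n),
                           rng.integers(0, 2, n)])
      # Synthetic label loosely tied to the risk factors, only to make the demo runnable
      risk = 0.03 * X[:, 0] + 0.02 * X[:, 1] + 0.01 * X[:, 2] + 1.5 * X[:, 3]
      y = (risk + rng.normal(0, 1, n) > np.median(risk)).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = GradientBoostingClassifier(n_estimators=100, max_depth=3)   # boosted decision trees
      model.fit(X_tr, y_tr)
      print("hold-out accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 3))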

  15. Assessment of protein disorder region predictions in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan; Kryshtafovych, Andriy; Moult, John; Tramontano, Anna; Fidelis, Krzysztof

    2013-01-01

    The article presents the assessment of disorder region predictions submitted to CASP10. The evaluation is based on the three measures tested in previous CASPs: (i) balanced accuracy, (ii) the Matthews correlation coefficient for the binary predictions, and (iii) the area under the curve in the receiver operating characteristic (ROC) analysis of predictions using probability annotation. We also performed new analyses such as comparison of the submitted predictions with those obtained with a Naïve disorder prediction method and with predictions from the disorder prediction databases D2P2 and MobiDB. On average, the methods participating in CASP10 demonstrated slightly better performance than those in CASP9.

  16. Assessment of protein disorder region predictions in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan

    2013-11-22

    The article presents the assessment of disorder region predictions submitted to CASP10. The evaluation is based on the three measures tested in previous CASPs: (i) balanced accuracy, (ii) the Matthews correlation coefficient for the binary predictions, and (iii) the area under the curve in the receiver operating characteristic (ROC) analysis of predictions using probability annotation. We also performed new analyses such as comparison of the submitted predictions with those obtained with a Naïve disorder prediction method and with predictions from the disorder prediction databases D2P2 and MobiDB. On average, the methods participating in CASP10 demonstrated slightly better performance than those in CASP9.
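
    The three evaluation measures named in this assessment are standard binary-classification metrics. The Python sketch below computes them with scikit-learn for a hypothetical set of per-residue disorder predictions; the labels, probabilities and 0.5 threshold are illustrative assumptions, not CASP10 data.

      import numpy as np
      from sklearn.metrics import balanced_accuracy_score, matthews_corrcoef, roc_auc_score

      # Hypothetical per-residue data: 1 = disordered residue, 0 = ordered residue
      y_true = np.array([0, 0, 1, 1, 0, 1, 0, 0, 1, 0])
      p_pred = np.array([0.1, 0.3, 0.8, 0.6, 0.2, 0.9, 0.4, 0.1, 0.7, 0.5])  # predicted probabilities
      y_pred = (p_pred >= 0.5).astype(int)                                   # binary calls

      print("balanced accuracy:", balanced_accuracy_score(y_true, y_pred))   # measure (i)
      print("Matthews corr.  :", matthews_corrcoef(y_true, y_pred))          # measure (ii)
      print("ROC AUC         :", roc_auc_score(y_true, p_pred))              # measure (iii)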

  17. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    ... air, soil, groundwater, surface water, plant uptake, and exhumation by burrowing animals. The analyses... processes such as erosion, mass wasting, slope failure, settlement of wastes and backfill, infiltration through covers over disposal areas and adjacent soils, and surface drainage of the disposal site. The...

  18. Analysing Simple Electric Motors in the Classroom

    Science.gov (United States)

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  19. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    Full Text Available In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is probably because the radio medium is difficult to analyse, since it is a medium that is not visualised in the form of images or supported by printed text. The purpose of this article is to describe a new quantitative method for analysing radio that takes particular account of the modality of the radio medium: sound structured as a linear progression in time. The method thus supports the radio medium both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article suggests that the method is well suited to analysing not only radio but also other media platforms and various areas of journalism.

  20. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studied, Open Street Map and The Pirate Bay. Results show that ...

  1. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) examined the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  2. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of the shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty-five wood specimens were prepared from the three ecological zones of north-eastern Nigeria, viz: Sahel savanna, Sudan savanna and Guinea savanna, for the research. The results indicated that the wood ...

  3. UMTS signal measurements with digital spectrum analysers

    International Nuclear Information System (INIS)

    Licitra, G.; Palazzuoli, D.; Ricci, A. S.; Silvi, A. M.

    2004-01-01

    The launch of the Universal Mobile Telecommunications System (UMTS), the most recent mobile telecommunications standard, has imposed the requirement of updating measurement instrumentation and methodologies. In order to define the most reliable measurement procedure, which is aimed at assessing exposure to electromagnetic fields, modern spectrum analysers' features for correct signal characterisation have been reviewed. (authors)

  4. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  5. Micromechanical photothermal analyser of microfluidic samples

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a micromechanical photothermal analyser of microfluidic samples comprising an oblong micro-channel extending longitudinally from a support element, the micro-channel is made from at least two materials with different thermal expansion coefficients, wherein...

  6. Systematic review and meta-analyses

    DEFF Research Database (Denmark)

    Dreier, Julie Werenberg; Andersen, Anne-Marie Nybo; Berg-Beckhoff, Gabriele

    2014-01-01

    1990 were excluded. RESULTS: The available literature supported an increased risk of adverse offspring health in association with fever during pregnancy. The strongest evidence was available for neural tube defects, congenital heart defects, and oral clefts, in which meta-analyses suggested between a 1...

  7. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    What organisational policy on psychosocial workload (PSA) looks like in 2014, and how it relates to other policies and outcome measures, are the central questions of this study. The results of these in-depth analyses can benefit the ongoing campaign 'Check je

  8. Exergoeconomic and environmental analyses of CO

    NARCIS (Netherlands)

    Mosaffa, A. H.; Garousi Farshi, L; Infante Ferreira, C.A.; Rosen, M. A.

    2016-01-01

    Exergoeconomic and environmental analyses are presented for two CO2/NH3 cascade refrigeration systems equipped with (1) two flash tanks and (2) a flash tank along with a flash intercooler with indirect subcooler. A comparative study is performed for the proposed systems, and

  9. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  10. Multivariate differential analyses of adolescents' experiences of ...

    African Journals Online (AJOL)

    Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers) Cronbach Alpha, various consecutive first and second order factor ...

  11. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    Chromosome preparation and karyotype description. The material analysed consists of chromosome preparations of the tayassuid species T. pecari (three individuals) and P. tajacu (four individuals) and was made from short-term lymphocyte cultures of whole blood samples using standard protocols (Chaves et al. 2002).

  12. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.

  13. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  14. Summary of dynamic analyses of the advanced neutron source reactor inner control rods

    International Nuclear Information System (INIS)

    Hendrich, W.R.

    1995-08-01

    A summary of the structural dynamic analyses that were instrumental in providing design guidance to the Advanced Neutron Source (ANS) inner control element system is presented in this report. The structural analyses and the functional constraints that required certain performance parameters were combined to shape and guide the design effort toward a prediction of successful and reliable control and scram operation to be provided by these inner control rods.

  15. Disruption prediction at JET

    International Nuclear Information System (INIS)

    Milani, F.

    1998-12-01

    The sudden loss of plasma magnetic confinement, known as a disruption, is one of the major issues in a nuclear fusion machine such as JET (Joint European Torus). Disruptions pose very serious problems to the safety of the machine. The energy stored in the plasma is released to the machine structure in a few milliseconds, resulting in forces that at JET reach several meganewtons. The problem is even more severe in a nuclear fusion power station, where the forces are of the order of one hundred meganewtons. The events that occur during a disruption are still not well understood, even if some mechanisms that can lead to a disruption have been identified and can be used to predict them. Unfortunately, it is always a combination of these events that generates a disruption, and therefore it is not possible to use simple algorithms to predict it. This thesis analyses the possibility of using neural network algorithms to predict plasma disruptions in real time. This involves the determination of plasma parameters every few milliseconds. A plasma boundary reconstruction algorithm, XLOC, capable of determining the plasma-wall distance every 2 milliseconds, has been developed in collaboration with Dr. D. O'Brien and Dr. J. Ellis. The XLOC output has been used to develop a multilayer perceptron network to determine plasma parameters such as l_i and q_ψ, with which a machine operational space has been experimentally defined. If the limits of this operational space are breached, the disruption probability increases considerably. Another approach for predicting disruptions is to use neural network classification methods to define the JET operational space. Two methods have been studied. The first method uses a multilayer perceptron network with a softmax activation function for the output layer. This method can be used for classifying the input patterns in various classes. In this case the plasma input patterns have been divided between disrupting and safe patterns, giving the possibility of
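
    As a rough sketch of the classification approach described (not of the thesis' actual network or data), the following Python example trains a small multilayer perceptron to label synthetic plasma-parameter patterns as safe or disrupting; the feature ranges, the toy labelling rule and the network size are assumptions made for the illustration.

      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      n = 400
      li = rng.uniform(0.6, 1.6, n)      # internal inductance l_i (assumed range)
      q_psi = rng.uniform(2.0, 6.0, n)   # edge safety factor q_psi (assumed range)
      dens = rng.uniform(0.2, 1.2, n)    # line-averaged density, arbitrary units
      X = np.column_stack([li, q_psi, dens])
      # Toy labelling rule: low q_psi combined with high l_i counts as "disrupting" (1)
      y = ((q_psi < 3.0) & (li > 1.2)).astype(int)

      clf = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1))
      clf.fit(X, y)
      print(clf.predict([[1.4, 2.5, 0.8], [0.9, 4.5, 0.6]]))   # expect [1, 0] under the toy rule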

  16. Percutaneous nephrolithotomy vs. extracorporeal shockwave lithotripsy for treating a 20-30 mm single renal pelvic stone.

    Science.gov (United States)

    Hassan, Mohammed; El-Nahas, Ahmed R; Sheir, Khaled Z; El-Tabey, Nasr A; El-Assmy, Ahmed M; Elshal, Ahmed M; Shokeir, Ahmed A

    2015-09-01

    To compare the efficacy, safety and cost of extracorporeal shockwave lithotripsy (ESWL) and percutaneous nephrolithotomy (PNL) for treating a 20-30 mm single renal pelvic stone. The computerised records of patients who underwent PNL or ESWL for a 20-30 mm single renal pelvic stone between January 2006 and December 2012 were reviewed retrospectively. The re-treatment rate (75% vs. 5%), the need for secondary procedures (25% vs. 4.7%) and the total number of procedures (three vs. one) were significantly higher in the ESWL group, whereas the success rate was significantly higher in the PNL group (95% vs. 75%) and the cost was higher for PNL (US$ 1120 vs. 490). PNL was more effective than ESWL for treating a single renal pelvic stone of 20-30 mm. However, ESWL was associated with fewer complications and a lower cost.

  17. DCH analyses using the CONTAIN code

    International Nuclear Information System (INIS)

    Hong, Sung Wan; Kim, Hee Dong

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by the NRC at SNL. Even though the calculations were performed for an ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in PWR containments, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort was intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach; 2. Availability and effectiveness of ice in the ice condenser; 3. Loads modeling uncertainties related to co-ejected RPV water; 4. Other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author)

  18. DCH analyses using the CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by the NRC at SNL. Even though the calculations were performed for an ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in PWR containments, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort was intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach; 2. Availability and effectiveness of ice in the ice condenser; 3. Loads modeling uncertainties related to co-ejected RPV water; 4. Other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author).

  19. Soil analyses by ICP-MS (Review)

    International Nuclear Information System (INIS)

    Yamasaki, Shin-ichi

    2000-01-01

    Soil analyses by inductively coupled plasma mass spectrometry (ICP-MS) are reviewed. The first half of the paper is devoted to the development of techniques applicable to soil analyses, where diverse analytical parameters are carefully evaluated. However, the choice of soil samples is somewhat arbitrary, and only a limited number of samples (mostly reference materials) are examined. In the second half, efforts are mostly concentrated on the introduction of reports, where a large number of samples and/or very precious samples have been analyzed. Although the analytical techniques used in these reports are not necessarily novel, valuable information concerning such topics as background levels of elements in soils, chemical forms of elements in soils and behavior of elements in soil ecosystems and the environment can be obtained. The major topics discussed are total elemental analysis, analysis of radionuclides with long half-lives, speciation, leaching techniques, and isotope ratio measurements. (author)

  20. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given, and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies, which is often neither recognized nor utilized.

  1. Standardized analyses of nuclear shipping containers

    International Nuclear Information System (INIS)

    Parks, C.V.; Hermann, O.W.; Petrie, L.M.; Hoffman, T.J.; Tang, J.S.; Landers, N.F.; Turner, W.D.

    1983-01-01

    This paper describes improved capabilities for analyses of nuclear fuel shipping containers within SCALE -- a modular code system for Standardized Computer Analyses for Licensing Evaluation. Criticality analysis improvements include the new KENO V, a code which contains an enhanced geometry package, and a new control module which uses KENO V and allows a criticality search on optimum pitch (maximum k-effective) to be performed. The SAS2 sequence is a new shielding analysis module which couples fuel burnup, source term generation, and radial cask shielding. The SAS5 shielding sequence allows a multidimensional Monte Carlo analysis of a shipping cask with code-generated biasing of the particle histories. The thermal analysis sequence (HTAS1) provides an easy-to-use tool for evaluating a shipping cask's response to accident conditions. Together, these improvements enhance the capability of the SCALE system to provide the cask designer or evaluator with a computational system that provides the automated procedures and easy-to-understand input that leads to standardization.

  2. Quantitative Analyse und Visualisierung der Herzfunktionen

    Science.gov (United States)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Computer-assisted, image-based analysis of cardiac function is now standard in cardiology. The available products usually require a high degree of user interaction and thus considerable time. This work presents an approach that allows the cardiologist a largely automatic analysis of cardiac function from MRI image data, thereby saving time. All relevant cardio-physiological parameters are calculated and visualised by means of diagrams and graphs. These calculations are evaluated by comparing the derived values with manually measured ones. The resulting mean error of 2.85 mm for wall thickness and 1.61 mm for wall thickening still lies within the range of one pixel of the images used.

  3. Exergetic and thermoeconomic analyses of power plants

    International Nuclear Information System (INIS)

    Kwak, H.-Y.; Kim, D.-J.; Jeon, J.-S.

    2003-01-01

    Exergetic and thermoeconomic analyses were performed for a 500-MW combined cycle plant. In these analyses, mass and energy conservation laws were applied to each component of the system. Quantitative balances of the exergy and exergetic cost for each component, and for the whole system, were carefully considered. The exergoeconomic model, which represents the productive structure of the system considered, was used to visualize the cost formation process and the productive interaction between components. The computer program developed in this study can determine the production costs of power plants, such as gas- and steam-turbine plants and gas-turbine cogeneration plants. The program can also be used to study plant characteristics, namely, thermodynamic performance and sensitivity to changes in process and/or component design variables.
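
    As a minimal illustration of the component-level exergy balance described above, the Python sketch below evaluates the specific flow exergy e = (h - h0) - T0*(s - s0) for hypothetical turbine inlet and outlet states and derives the exergy destruction and exergetic efficiency; the property values, mass flow and shaft power are placeholders, not data from the 500-MW plant.

      T0 = 298.15   # dead-state temperature, K

      def flow_exergy(h, s, h0, s0):
          """Specific flow exergy in kJ/kg (kinetic and potential terms neglected)."""
          return (h - h0) - T0 * (s - s0)

      # Hypothetical steam-turbine inlet/outlet states (h in kJ/kg, s in kJ/kg.K)
      m_dot = 100.0                                    # mass flow rate, kg/s (assumed)
      e_in = flow_exergy(h=3400.0, s=6.6, h0=104.9, s0=0.367)
      e_out = flow_exergy(h=2600.0, s=6.9, h0=104.9, s0=0.367)
      W_dot = 75000.0                                  # shaft power, kW (assumed)

      E_destroyed = m_dot * (e_in - e_out) - W_dot     # exergy balance around the turbine, kW
      print("exergy destruction:", round(E_destroyed, 1), "kW")
      print("exergetic efficiency:", round(W_dot / (m_dot * (e_in - e_out)), 3))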

  4. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly suited to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book defines this method precisely, as well as its fields of application. It describes the most effective methods in terms of product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key work for optimising product design processes within a company. -- Key ideas, by Business Digest

  5. Kinetic stability analyses in a bumpy cylinder

    International Nuclear Information System (INIS)

    Dominguez, R.R.; Berk, H.L.

    1981-01-01

    Recent interest in the ELMO Bumpy Torus (EBT) has prompted a number of stability analyses of both the hot electron rings and the toroidal plasma. Typically these works employ the local approximation, neglecting radial eigenmode structure and ballooning effects to perform the stability analysis. In the present work we develop a fully kinetic formalism for performing nonlocal stability analyses in a bumpy cylinder. We show that the Vlasov-Maxwell integral equations (with one ignorable coordinate) are self-adjoint and hence amenable to analysis using numerical techniques developed for self-adjoint systems of equations. The representation we obtain for the kernel of the Vlasov-Maxwell equations is a differential operator of arbitrarily high order. This form leads to a manifestly self-adjoint system of differential equations for long wavelength modes

  6. Sectorial Group for Incident Analyses (GSAI)

    International Nuclear Information System (INIS)

    Galles, Q.; Gamo, J. M.; Jorda, M.; Sanchez-Garrido, P.; Lopez, F.; Asensio, L.; Reig, J.

    2013-01-01

    In 2008, the UNESA Nuclear Energy Committee (CEN) proposed the creation of a working group formed by experts from all Spanish NPPs with the purpose of jointly analyzing relevant incidents that occurred in each of the plants. This initiative was a response to a historical situation in which the exchange of information on incidents between the Spanish NPPs was below the desired level. In June 2009, UNESA's Guide CEN-29 established the performance criteria for the so-called Sectorial Group for Incident Analyses (GSAI), whose activity would be coordinated by UNESA's Group of Operating Experience, under the Operations Commission (COP). (Author)

  7. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  8. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized for taxonomic resolution, minimal bias in amplification of the target organism group and short sequence length. Using bioinformatic tools, we developed metabarcodes for several groups of organisms: fungi, bryophytes, enchytraeids, beetles and birds. The ability of these metabarcodes to amplify the target groups was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late...

  9. Visuelle Analyse von E-mail-Verkehr

    OpenAIRE

    Mansmann, Florian

    2003-01-01

    This thesis describes methods for the visual geographic analysis of e-mail traffic. Host addresses and IP addresses can be filtered out of an e-mail header. Using a database, these host and IP addresses are assigned geographic coordinates. A visualization presents several thousand e-mail routes in a clear and concise manner. In addition, interactive manipulation options are presented which allow a visual exploration of the data...

  10. BWR core melt progression phenomena: Experimental analyses

    International Nuclear Information System (INIS)

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA ex-reactor fuel-damage test facility. Additional integral data will be obtained from new CORA BWR tests, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of ex-reactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than has previously been available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models with materials interaction, relocation and blockage models are currently being implemented in SCDAP/RELAP5 as an optional structural component

  11. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built exclusively from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT policy measures...... organisations with a digital road map and GPS data can begin to carry out traffic analyses on these data. It is a requirement that suitable IT competencies are present in the organisation....

  12. Neuronal network analyses: premises, promises and uncertainties

    OpenAIRE

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the diffic...

  13. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth

  14. Kinematic gait analyses in healthy Golden Retrievers

    OpenAIRE

    Silva, Gabriela C.A.; Cardoso, Mariana Trés; Gaiad, Thais P.; Brolio, Marina P.; Oliveira, Vanessa C.; Assis Neto, Antonio; Martins, Daniele S.; Ambrósio, Carlos E.

    2014-01-01

    Kinematic analysis concerns the relative movement between rigid bodies and finds application in gait analysis and other body movements; interpretation of these data, when there is a change, determines the choice of treatment to be instituted. The objective of this study was to standardize the gait of healthy Golden Retriever dogs to assist in the diagnosis and treatment of musculoskeletal disorders. We used a kinematic analysis system to analyse the gait of seven Golden Retriever dogs, female,...

  15. Evaluation of periodic safety status analyses

    International Nuclear Information System (INIS)

    Faber, C.; Staub, G.

    1997-01-01

    In order to carry out the evaluation of safety status analyses by the safety assessor within the periodical safety reviews of nuclear power plants, safety goal oriented requirements have been formulated together with complementary evaluation criteria. Their application in an inter-disciplinary cooperation covering the subject areas involved facilitates a complete safety goal oriented assessment of the plant status. The procedure is outlined briefly by an example for the safety goal 'reactivity control' for BWRs. (orig.) [de

  16. Application of RUNTA code in flood analyses

    International Nuclear Information System (INIS)

    Perez Martin, F.; Benitez Fonzalez, F.

    1994-01-01

    Flood probability analyses carried out to date indicate the need to evaluate a large number of flood scenarios. This necessity is due to a variety of reasons, the most important of which include: - Large number of potential flood sources - Wide variety of characteristics of flood sources - Large possibility of flood-affected areas becoming interlinked, depending on the location of the potential flood sources - Diversity of flood flows from one flood source, depending on the size of the rupture and mode of operation - Isolation times applicable - Uncertainties in respect of the structural resistance of doors, penetration seals and floors - Applicable degrees of obstruction of the floor drainage system Consequently, a tool which carries out the large number of calculations usually required in flood analyses, with speed and flexibility, is considered necessary. The RUNTA Code enables the range of possible scenarios to be calculated numerically, in accordance with all those parameters which, as a result of previous flood analyses, it is necessary to take into account in order to cover all the possible floods associated with each flood area

  17. An analyser for power plant operations

    International Nuclear Information System (INIS)

    Rogers, A.E.; Wulff, W.

    1990-01-01

    Safe and reliable operation of power plants is essential. Power plant operators need a forecast of what the plant will do when its current state is disturbed. The in-line plant analyser provides precisely this information at relatively low cost. The plant analyser scheme uses a mathematical model of the dynamic behaviour of the plant to establish a numerical simulation. Over a period of time, the simulation is calibrated with measurements from the particular plant in which it is used. The analyser then provides a reference against which to evaluate the plant's current behaviour. It can be used to alert the operator to any atypical excursions or combinations of readings that indicate malfunction or off-normal conditions that, as the Three Mile Island event suggests, are not easily recognised by operators. In a look-ahead mode, it can forecast the behaviour resulting from an intended change in settings or operating conditions. Then, when such changes are made, the plant's behaviour can be tracked against the forecast in order to assure that the plant is behaving as expected. It can be used to investigate malfunctions that have occurred and test possible adjustments in operating procedures. Finally, it can be used to consider how far from the limits of performance the elements of the plant are operating. Then by adjusting settings, the required power can be generated with as little stress as possible on the equipment. (6 figures) (Author)

  18. Steady State Thermal Analyses of SCEPTOR X-57 Wingtip Propulsion

    Science.gov (United States)

    Schnulo, Sydney L.; Chin, Jeffrey C.; Smith, Andrew D.; Dubois, Arthur

    2017-01-01

    Electric aircraft concepts enable advanced propulsion airframe integration approaches that promise increased efficiency as well as reduced emissions and noise. NASA's fully electric Maxwell X-57, developed under the SCEPTOR program, features distributed propulsion across a high aspect ratio wing. There are 14 propulsors in all: 12 high-lift motors that are only active during takeoff and climb, and 2 larger motors positioned on the wingtips that operate over the entire mission. The power electronics involved in the wingtip propulsion are temperature sensitive and therefore require thermal management. This work focuses on the high and low fidelity heat transfer analysis methods performed to ensure that the wingtip motor inverters do not reach their temperature limits. It also explores different geometry configurations involved in the X-57 development and any thermal concerns. All analyses presented are performed at steady state under stressful operating conditions, so the predicted temperatures represent a conservative, worst-case scenario.

  19. Probabilistic fuel rod analyses using the TRANSURANUS code

    Energy Technology Data Exchange (ETDEWEB)

    Lassmann, K; O'Carroll, C; Laar, J Van De [CEC Joint Research Centre, Karlsruhe (Germany)

    1997-08-01

    After more than 25 years of fuel rod modelling research, the basic concepts are well established and the limitations of the specific approaches are known. However, the widely used mechanistic approach leads in many cases to discrepancies between theoretical predictions and experimental evidence indicating that models are not exact and that some of the physical processes encountered are of stochastic nature. To better understand uncertainties and their consequences, the mechanistic approach must therefore be augmented by statistical analyses. In the present paper the basic probabilistic methods are briefly discussed. Two such probabilistic approaches are included in the fuel rod performance code TRANSURANUS: the Monte Carlo method and the Numerical Noise Analysis. These two techniques are compared and their capabilities are demonstrated. (author). 12 refs, 4 figs, 2 tabs.
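
    A minimal sketch of the Monte Carlo approach mentioned above, assuming a deliberately simplified closed-form surrogate for fuel centreline temperature (the model, parameter values and distributions below are illustrative assumptions, not TRANSURANUS models):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 50_000

      # Hypothetical closed-form surrogate: centreline temperature rise dT = q' / (4*pi*k),
      # added to an uncertain boundary temperature. Values and spreads are invented.
      q_lin = rng.normal(25_000.0, 1_500.0, n)   # linear heat rate [W/m]
      k_fuel = rng.normal(3.0, 0.3, n)           # fuel thermal conductivity [W/m.K]
      t_bound = rng.normal(580.0, 5.0, n)        # boundary temperature [K]

      t_centre = t_bound + q_lin / (4.0 * np.pi * k_fuel)

      print("mean centreline temperature [K]:", round(t_centre.mean(), 1))
      print("95th percentile [K]:", round(np.percentile(t_centre, 95), 1))

    The empirical distribution of the output then supplies the probabilistic statements that a single best-estimate mechanistic calculation cannot.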

  20. Spent fuel shipping costs for transportation logistics analyses

    International Nuclear Information System (INIS)

    Cole, B.M.; Cross, R.E.; Cashwell, J.W.

    1983-05-01

    Logistics analyses supplied to the nuclear waste management programs of the U.S. Department of Energy through the Transportation Technology Center (TTC) at Sandia National Laboratories are used to predict nuclear waste material logistics, transportation packaging demands, shipping and receiving rates and transportation-related costs for alternative strategies. This study is an in-depth analysis of the problems and contingencies associated with the costs of shipping irradiated reactor fuel. These costs are extremely variable, however, and have changed frequently (sometimes monthly) during the past few years due to changes in capital, fuel, and labor costs. All costs and charges reported in this study are based on January 1982 data using existing transport cask systems and should be used as relative indices only. Actual shipping costs would be negotiable for each origin-destination combination

  1. Contract Dynamics : Lessons from Empirical Analyses

    OpenAIRE

    Magali Chaudey

    2010-01-01

    Working paper GATE 2010-35; The recognition that contracts have a time dimension has given rise to a very abundant literature since the end of the 1980s. In such a dynamic context, the contract may take place over several periods and develop repeated interactions. Then, the principal topics of the analysis are commitment, reputation, memory and the renegotiation of the contract. Few papers have tried to apply the predictions of dynamic contract theory to data. The examples of applications int...

  2. Depressive disorder in the last phase of life in patients with cardiovascular disease, cancer, and COPD: data from a 20-year follow-up period in general practice.

    Science.gov (United States)

    Warmenhoven, Franca; Bor, Hans; Lucassen, Peter; Vissers, Kris; van Weel, Chris; Prins, Judith; Schers, Henk

    2013-05-01

    Depression is assumed to be common in chronically ill patients during their last phase of life and is associated with poorer outcomes. The prevalence of depression is widely varying in previous studies due to the use of different terminology, classification, and assessment methods. To explore the reported incidence of depressive disorder, as registered in the last phase of life of patients who died from cardiovascular disease, cancer or COPD, in a sample of primary care patients. A historic cohort study, using a 20-year period registration database of medical records in four Dutch general practices (a dynamic population based on the Continuous Morbidity Registration database). Medical history of the sample cohort was analysed for the diagnosis of a new episode of depressive disorder and descriptive statistics were used. In total 982 patients were included, and 19 patients (1.9%) were diagnosed with a new depressive disorder in the last year of their life. The lifetime prevalence of depressive disorder in this sample was 8.2%. The incidence of depressive disorder in the last phase of life is remarkably low in this study. These data were derived from actual patient care in general practice. Psychiatric diagnoses were made by GPs in the context of both patient needs and delivered care. A broader concept of depression in general practice is recommended to improve the diagnosis and treatment of mood disorders in patients in the last phase of life.

  3. Determinants of Aortic Root Dilatation and Reference Values Among Young Adults Over a 20-Year Period: Coronary Artery Risk Development in Young Adults Study.

    Science.gov (United States)

    Teixido-Tura, Gisela; Almeida, Andre L C; Choi, Eui-Young; Gjesdal, Ola; Jacobs, David R; Dietz, Harry C; Liu, Kiang; Sidney, Stephen; Lewis, Cora E; Garcia-Dorado, David; Evangelista, Artur; Gidding, Samuel; Lima, João A C

    2015-07-01

    Aortic size increases with age, but factors related to such dilatation in a healthy young adult population have not been studied. We aim to evaluate changes in aortic dimensions and their principal correlates among young adults over a 20-year time period. Reference values for aortic dimensions in young adults by echocardiography are also provided. Healthy Coronary Artery Risk Development in Young Adults (CARDIA) study participants aged 23 to 35 years in 1990-1991 (n=3051) were included after excluding 18 individuals with significant valvular dysfunction. Aortic root diameter (ARD) by M-mode echocardiography at year-5 (43.7% men; age, 30.2 ± 3.6 years) and year-25 CARDIA exams was obtained. Univariable and multivariable analyses were performed to assess associations of ARD with clinical data at years-5 and -25. ARD at year-25 was greater in men (33.3 ± 3.7 versus 28.7 ± 3.4 mm; P [...]) [...] young adulthood. Our study also provides reference values for ARD in young adults. © 2015 American Heart Association, Inc.

  4. Predictability of blocking

    International Nuclear Information System (INIS)

    Tosi, E.; Ruti, P.; Tibaldi, S.; D'Andrea, F.

    1994-01-01

    Tibaldi and Molteni (1990, hereafter referred to as TM) had previously investigated operational blocking predictability by the ECMWF model and the possible relationships between model systematic error and blocking in the winter season of the Northern Hemisphere, using seven years of ECMWF operational archives of analyses and day 1 to 10 forecasts. They showed that fewer blocking episodes than in the real atmosphere were generally simulated by the model, and that this deficiency increased with increasing forecast time. As a consequence of this, a major contribution to the systematic error in the winter season was shown to derive from the inability of the model to properly forecast blocking. In this study, the analysis performed in TM for the first seven winter seasons of the ECMWF operational model is extended to the subsequent five winters, during which model development, reflecting both resolution increases and parametrisation modifications, continued unabated. In addition the objective blocking index developed by TM has been applied to the observed data to study the natural low frequency variability of blocking. The ability to simulate blocking of some climate models has also been tested

  5. ALBEDO PATTERN RECOGNITION AND TIME-SERIES ANALYSES IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    S. A. Salleh

    2012-07-01

    Full Text Available Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are very useful, especially in relation to climate condition monitoring. This study was conducted to identify changes in Malaysia's albedo pattern. The pattern recognition and changes will be useful for a variety of environmental and climate monitoring researches such as carbon budgeting and aerosol mapping. Ten years (2000–2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analyses were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years and in the pattern with regard to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified with regard to the maximum and minimum values of the albedo. The rise and fall of the line graph show a similar trend with regard to the daily observations. The difference can be identified in terms of the value or percentage of the rises and falls of albedo. Thus, it can be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with respect to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that there are external factors that affect the albedo values, as the sky conditions and their diffusion plots do not have a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows high
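
    A minimal sketch of the trend-plus-seasonal decomposition idea on an invented monthly albedo series (the numbers below are placeholders, not MODIS MCD43A3 values):

      import numpy as np

      rng = np.random.default_rng(3)
      months = np.arange(120)                                  # 2000-2009, monthly

      # Invented albedo series: weak trend + monsoon-like seasonal cycle + noise.
      albedo = (0.14 - 0.00002 * months
                + 0.01 * np.sin(2 * np.pi * months / 12)
                + rng.normal(0, 0.003, months.size))

      slope, intercept = np.polyfit(months, albedo, 1)         # linear trend component

      # Seasonal component: average of the detrended series by calendar month
      detrended = albedo - (intercept + slope * months)
      seasonal = np.array([detrended[months % 12 == m].mean() for m in range(12)])

      print(f"trend: {slope * 12:+.5f} albedo units per year")
      print("seasonal amplitude:", round(float(np.ptp(seasonal)), 4))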

  6. Predicting responsiveness to intervention in dyslexia using dynamic assessment

    NARCIS (Netherlands)

    Aravena, S.; Tijms, J.; Snellings, P.; van der Molen, M.W.

    In the current study we examined the value of a dynamic test for predicting responsiveness to reading intervention for children diagnosed with dyslexia. The test consisted of a 20-minute training aimed at learning eight basic letter–speech sound correspondences within an artificial orthography,

  7. Indian Point 2 steam generator tube rupture analyses

    International Nuclear Information System (INIS)

    Dayan, A.

    1985-01-01

    Analyses were conducted with RETRAN-02 to study consequences of steam generator tube rupture (SGTR) events. The Indian Point, Unit 2, power plant (IP2, PWR) was modeled as two asymmetric loops, consisting of 27 volumes and 37 junctions. The break section was modeled once, conservatively, as a 150% flow area opening at the wall of the steam generator cold leg plenum, and once as a 200% double-ended tube break. Results revealed 60% overprediction of breakflow rates by the traditional conservative model. Two SGTR transients were studied, one with low-pressure reactor trip and one with an earlier reactor trip via over temperature ΔT. The former is more typical of a plant with a low reactor average temperature such as IP2. Transient analyses for a single tube break event over 500 seconds indicated continued primary subcooling and no need for steam line pressure relief. In addition, SGTR transients with reactor trip while the pressurizer still contains water were found to favorably reduce depressurization rates. Comparison of the conservative results with independent LOFTRAN predictions showed good agreement

  8. Methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Mazumdar, M.; Marshall, J.A.; Chay, S.C.; Gay, R.

    1976-07-01

    In February 1975, Westinghouse Electric Corporation, under contract to Electric Power Research Institute, started a one-year program to develop methodology for statistical evaluation of nuclear-safety-related engineering analyses. The objectives of the program were to develop an understanding of the relative efficiencies of various computational methods which can be used to compute probability distributions of output variables due to input parameter uncertainties in analyses of design basis events for nuclear reactors and to develop methods for obtaining reasonably accurate estimates of these probability distributions at an economically feasible level. A series of tasks was set up to accomplish these objectives. Two of the tasks were to investigate the relative efficiencies and accuracies of various Monte Carlo and analytical techniques for obtaining such estimates for a simple thermal-hydraulic problem whose output variable of interest is given in a closed-form relationship of the input variables and to repeat the above study on a thermal-hydraulic problem in which the relationship between the predicted variable and the inputs is described by a short-running computer program. The purpose of the report presented is to document the results of the investigations completed under these tasks, giving the rationale for choices of techniques and problems, and to present interim conclusions
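
    As a rough illustration of the comparison studied in these tasks, the sketch below contrasts first-order (analytical) error propagation with plain Monte Carlo sampling for a simple closed-form relation; the pressure-drop function and the input uncertainties are invented for the example and are not taken from the report:

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical closed-form output: pressure drop ~ f * (L/D) * rho * v**2 / 2
      def dp(f, rho, v, L=4.0, D=0.01):
          return f * (L / D) * 0.5 * rho * v**2

      # Nominal values and assumed 1-sigma uncertainties of the inputs
      mu = np.array([0.02, 750.0, 5.0])       # f, rho [kg/m3], v [m/s]
      sd = np.array([0.002, 15.0, 0.25])

      # First-order (analytical) propagation: var(y) ~ sum (df/dx_i)^2 * var(x_i)
      eps = 1e-6
      grads = np.array([(dp(*(mu + eps * np.eye(3)[i])) - dp(*mu)) / eps for i in range(3)])
      sd_analytical = np.sqrt(np.sum((grads * sd) ** 2))

      # Monte Carlo propagation with 100 000 samples
      samples = rng.normal(mu, sd, size=(100_000, 3))
      y = dp(samples[:, 0], samples[:, 1], samples[:, 2])

      print("first-order sigma:", round(sd_analytical, 1))
      print("Monte Carlo sigma:", round(y.std(), 1))

    For nearly linear relations the two estimates agree closely; the Monte Carlo estimate additionally yields the full output distribution, at the cost of many more function evaluations.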

  9. Consumer brand choice: individual and group analyses of demand elasticity.

    Science.gov (United States)

    Oliveira-Castro, Jorge M; Foxall, Gordon R; Schrezenmaier, Teresa C

    2006-03-01

    Following the behavior-analytic tradition of analyzing individual behavior, the present research investigated demand elasticity of individual consumers purchasing supermarket products, and compared individual and group analyses of elasticity. Panel data from 80 UK consumers purchasing 9 product categories (i.e., baked beans, biscuits, breakfast cereals, butter, cheese, fruit juice, instant coffee, margarine and tea) during a 16-week period were used. Elasticity coefficients were calculated for individual consumers with data from all or only 1 product category (intra-consumer elasticities), and for each product category using all data points from all consumers (overall product elasticity) or 1 average data point per consumer (interconsumer elasticity). In addition to this, split-sample elasticity coefficients were obtained for each individual with data from all product categories purchased during weeks 1 to 8 and 9 to 16. The results suggest that: 1) demand elasticity coefficients calculated for individual consumers purchasing supermarket food products are compatible with predictions from economic theory and behavioral economics; 2) overall product elasticities, typically employed in marketing and econometric research, include effects of interconsumer and intraconsumer elasticities; 3) when comparing demand elasticities of different product categories, group and individual analyses yield similar trends; and 4) individual differences in demand elasticity are relatively consistent across time, but do not seem to be consistent across products. These results demonstrate the theoretical, methodological, and managerial relevance of investigating the behavior of individual consumers.
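
    The elasticity coefficients referred to above are conventionally obtained as the slope of log quantity regressed on log price; the snippet below shows that calculation on invented purchase records (not the study's UK panel data):

      import numpy as np

      rng = np.random.default_rng(7)

      # Invented purchase records for one consumer and one product category:
      # weekly price paid and quantity bought over a 16-week period.
      true_elasticity = -1.3
      price = rng.uniform(0.8, 1.6, 16)
      quantity = np.exp(0.5 + true_elasticity * np.log(price) + rng.normal(0, 0.1, 16))

      # Demand elasticity = slope of log(quantity) regressed on log(price).
      slope, intercept = np.polyfit(np.log(price), np.log(quantity), 1)
      print(f"estimated intra-consumer elasticity: {slope:.2f}")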

  10. MAAP - modular program for analyses of severe accidents

    International Nuclear Information System (INIS)

    Henry, R.E.; Lutz, R.J.

    1990-01-01

    The MAAP computer code was developed by Westinghouse as a fast, user-friendly, integrated analytical tool for evaluations of the sequences and consequences of severe accidents. The code allows a fully integrated treatment of thermohydraulic behavior and of the fission products in the primary system, the containment, and the ancillary buildings. This ensures interactive inclusion of all thermohydraulic events and of fission product behavior. All important phenomena which may occur in a major accident are contained in the modular code. In addition, many of the important parameters affecting the multitude of different phenomena can be defined by the user. In this way, it is possible to study the accuracy of the predicted course and of the consequences of a series of major accident phenomena. The MAAP code was subjected to extensive benchmarking with respect to the results of the experimental and theoretical programs, the findings obtained in other safety analyses using computers and data from accidents and transients in plants actually in operation. With the expected connection of the validation and test programs, the computer code attains a quality standard meeting the most stringent requirements in safety analyses. The code will be enlarged further in order to expand the number of benchmarks and the resolution of individual comparisons, and to ensure that future MAAP models will be in better agreement with the experiments and experiences of industry. (orig.) [de

  11. Progress Report on Computational Analyses of Water-Based NSTF

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Kraus, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lisowski, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Nunez, D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-01

    CFD analysis has been focused on important component-level phenomena using STARCCM+ to supplement the system analysis of integral system behavior. A notable area of interest was the cavity region. This area is of particular interest for CFD analysis due to the multi-dimensional flow and complex heat transfer (thermal radiation heat transfer and natural convection), which are not simulated directly by RELAP5. CFD simulations allow for the estimation of the boundary heat flux distribution along the riser tubes, which is needed in the RELAP5 simulations. The CFD results can also provide additional data to help establish what level of modeling detail is necessary in RELAP5. It was found that the flow profiles in the cavity region are simpler for the water-based concept than for the air-cooled concept. The local heat flux noticeably increases axially, and is higher in the fins than in the riser tubes. These results were utilized in RELAP5 simulations as boundary conditions, to provide better temperature predictions in the system level analyses. It was also determined that temperatures were higher in the fins than the riser tubes, but within design limits for thermal stresses. Higher temperature predictions were identified in the edge fins, in part due to additional thermal radiation from the side cavity walls.

  12. Model-Based Recursive Partitioning for Subgroup Analyses.

    Science.gov (United States)

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-05-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulated challenges to the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by predictive factors. The method starts with a model for the overall treatment effect as defined for the primary analysis in the study protocol and uses measures for detecting parameter instabilities in this treatment effect. The procedure produces a segmented model with differential treatment parameters corresponding to each patient subgroup. The subgroups are linked to predictive factors by means of a decision tree. The method is applied to the search for subgroups of patients suffering from amyotrophic lateral sclerosis that differ with respect to their Riluzole treatment effect, the only currently approved drug for this disease.
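
    The sketch below is not the parameter-instability test used in model-based recursive partitioning; it is only a much-simplified illustration, on simulated data, of the underlying idea of searching for a covariate cutpoint at which the estimated treatment effect differs most between the two resulting subgroups:

      import numpy as np

      rng = np.random.default_rng(1)

      # Simulated trial: the treatment effect differs by a biomarker threshold.
      n = 600
      biomarker = rng.uniform(0, 1, n)
      treatment = rng.integers(0, 2, n).astype(float)
      effect = np.where(biomarker > 0.5, 2.0, 0.2)
      outcome = 1.0 + effect * treatment + rng.normal(0, 1, n)

      def treatment_coef(y, t):
          # OLS slope of outcome on treatment (with intercept)
          X = np.column_stack([np.ones_like(t), t])
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          return beta[1]

      def best_split(y, t, x):
          best_cut, best_gap = None, 0.0
          for cut in np.quantile(x, np.linspace(0.05, 0.95, 19)):
              left, right = x <= cut, x > cut
              if left.sum() < 30 or right.sum() < 30:
                  continue
              gap = abs(treatment_coef(y[left], t[left]) - treatment_coef(y[right], t[right]))
              if gap > best_gap:
                  best_cut, best_gap = cut, gap
          return best_cut, best_gap

      cut, gap = best_split(outcome, treatment, biomarker)
      print(f"suggested cutpoint ~ {cut:.2f}, treatment-effect difference ~ {gap:.2f}")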

  13. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Full Text Available Abstract Background We present Pegasys – a flexible, modular and customizable software system that facilitates the execution and data integration from heterogeneous biological sequence analysis tools. Results The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serial dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow for results of heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows for new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation is available for download at http://bioinformatics.ubc.ca/pegasys/.
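
    The scheduling idea of running all non-serially-dependent analyses in parallel can be illustrated with a toy dependency graph; the step names below are hypothetical placeholders rather than Pegasys components:

      from concurrent.futures import ThreadPoolExecutor

      # Hypothetical workflow: node -> list of prerequisite nodes (names are placeholders).
      workflow = {
          "mask_repeats": [],
          "gene_prediction": ["mask_repeats"],
          "rna_gene_detection": ["mask_repeats"],
          "multiple_alignment": [],
          "merge_to_gff": ["gene_prediction", "rna_gene_detection", "multiple_alignment"],
      }

      def run_step(name):
          # Stand-in for invoking an external analysis tool.
          return f"{name}.out"

      def execute(dag):
          done, results = set(), {}
          with ThreadPoolExecutor() as pool:
              while len(done) < len(dag):
                  # Every step whose prerequisites are satisfied runs in parallel.
                  ready = [s for s, deps in dag.items()
                           if s not in done and all(d in done for d in deps)]
                  for step, output in zip(ready, pool.map(run_step, ready)):
                      results[step] = output
                      done.add(step)
          return results

      print(execute(workflow))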

  14. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  15. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Full Text Available Abstract Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses, an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  16. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to the uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so called probabilistic structures is presented in order to develop a method of quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed
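
    As a minimal numerical illustration of the maximum-entropy idea mentioned above: with no information beyond the [0, 1] support, the maximum-entropy prior for a demand failure probability is the uniform distribution, i.e. Beta(1, 1), and updating it with (invented) operating experience is a simple conjugate calculation:

      import numpy as np

      # Maximum-entropy prior on [0, 1] with no further constraints: uniform = Beta(1, 1).
      a0, b0 = 1.0, 1.0

      # Hypothetical operating experience: 2 failures observed in 150 demands.
      failures, demands = 2, 150
      a, b = a0 + failures, b0 + (demands - failures)

      posterior = np.random.default_rng(0).beta(a, b, 100_000)
      print("posterior mean failure probability:", round(posterior.mean(), 4))
      print("90% credible interval:", np.round(np.percentile(posterior, [5, 95]), 4))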

  17. Safety analyses for high-temperature reactors

    International Nuclear Information System (INIS)

    Mueller, A.

    1978-01-01

    The safety evaluation of HTRs may be based on the three methods presented here: the licensing procedure, the probabilistic risk analysis, and the damage extent analysis. Thereby all safety aspects of the HTR - from normal operation to extreme (hypothetical) accidents - are covered. The analyses within the licensing procedure of the HTR-1160 have shown that for normal operation and for the design basis accidents the radiation exposures remain clearly below the maximum permissible levels as prescribed by the radiation protection ordinance, so that no real hazard for the population will arise from them. (orig./RW) [de

  18. Introduction: Analysing Emotion and Theorising Affect

    Directory of Open Access Journals (Sweden)

    Peta Tait

    2016-08-01

    Full Text Available This discussion introduces ideas of emotion and affect for a volume of articles demonstrating the scope of approaches used in their study within the humanities and creative arts. The volume offers multiple perspectives on emotion and affect within 20th-century and 21st-century texts, arts and organisations and their histories. The discussion explains how emotion encompasses the emotions, emotional feeling, sensation and mood and how these can be analysed particularly in relation to literature, art and performance. It briefly summarises concepts of affect theory within recent approaches before introducing the articles.

  19. Applications of neural network to numerical analyses

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki; Fukuhara, Makoto; Ma, Xiao-Feng; Liaqat, Ali

    1999-01-01

    Applications of a multi-layer neural network to numerical analyses are described. We are mainly concerned with the computed tomography and the solution of differential equations. In both cases as the objective functions for the training process of the neural network we employed residuals of the integral equation or the differential equations. This is different from the conventional neural network training where sum of the squared errors of the output values is adopted as the objective function. For model problems both the methods gave satisfactory results and the methods are considered promising for some kind of problems. (author)
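
    A minimal sketch of the residual-based training idea for the test equation y' + y = 0 with y(0) = 1, assuming a tiny one-hidden-layer network and a general-purpose optimiser (both illustrative choices, not those of the paper); the objective is the squared residual of the differential equation at collocation points rather than a sum of squared output errors:

      import numpy as np
      from scipy.optimize import minimize

      xs = np.linspace(0.0, 1.0, 25)   # collocation points on [0, 1]
      HIDDEN = 10

      def unpack(theta):
          w1 = theta[:HIDDEN].reshape(HIDDEN, 1)
          b1 = theta[HIDDEN:2 * HIDDEN]
          w2 = theta[2 * HIDDEN:3 * HIDDEN].reshape(1, HIDDEN)
          b2 = theta[3 * HIDDEN]
          return w1, b1, w2, b2

      def net(x, theta):
          w1, b1, w2, b2 = unpack(theta)
          h = np.tanh(w1 @ x[None, :] + b1[:, None])
          return (w2 @ h + b2).ravel()

      def trial(x, theta):
          return 1.0 + x * net(x, theta)   # satisfies y(0) = 1 by construction

      def residual_loss(theta, eps=1e-4):
          y = trial(xs, theta)
          dy = (trial(xs + eps, theta) - trial(xs - eps, theta)) / (2 * eps)
          return np.mean((dy + y) ** 2)    # squared residual of y' + y = 0

      theta0 = 0.1 * np.random.default_rng(0).standard_normal(3 * HIDDEN + 1)
      fit = minimize(residual_loss, theta0, method="BFGS")
      print("max abs error vs exp(-x):", np.abs(trial(xs, fit.x) - np.exp(-xs)).max())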

  20. Komparativ analyse - Scandinavian Airlines & Norwegian Air Shuttle

    OpenAIRE

    Kallesen, Martin Nystrup; Singh, Ravi Pal; Boesen, Nana Wiaberg

    2017-01-01

    The project is based on a consideration of how a company the size of Scandinavian Airlines or Norwegian Air Shuttle uses its finances and how it views its external environment. This has led us to research the relationship between the companies and their finances as well as their external environment, and how they differ in both. To do this we have utilised a myriad of different methods to analyse the companies, including PESTEL, SWOT, TOWS, DCF, risk analysis, sensitivity analysis, Porter's ...

  1. Implementing partnerships in nonreactor facility safety analyses

    International Nuclear Information System (INIS)

    Courtney, J.C.; Perry, W.H.; Phipps, R.D.

    1996-01-01

    Faculty and students from LSU have been participating in nuclear safety analyses and radiation protection projects at ANL-W at INEL since 1973. A mutually beneficial relationship has evolved that has resulted in generation of safety-related studies acceptable to Argonne and DOE, NRC, and state regulatory groups. Most of the safety projects have involved the Hot Fuel Examination Facility or the Fuel Conditioning Facility; both are hot cells that receive spent fuel from EBR-II. A table shows some of the major projects at ANL-W that involved LSU students and faculty

  2. Cost/benefit analyses of environmental impact

    International Nuclear Information System (INIS)

    Goldman, M.I.

    1974-01-01

    Various aspects of cost-benefit analyses are considered. Some topics discussed are: regulations of the National Environmental Policy Act (NEPA); statement of AEC policy and procedures for implementation of NEPA; Calvert Cliffs decision; AEC Regulatory Guide; application of risk-benefit analysis to nuclear power; application of the as low as practicable (ALAP) rule to radiation discharges; thermal discharge restrictions proposed by EPA under the 1972 Amendment to the Water Pollution Control Act; estimates of somatic and genetic insult per unit population exposure; occupational exposure; EPA Point Source Guidelines for Discharges from Steam Electric Power Plants; and costs of closed-cycle cooling using cooling towers. (U.S.)

  3. The phaco machine: analysing new technology.

    Science.gov (United States)

    Fishkind, William J

    2013-01-01

    The phaco machine is frequently overlooked as the crucial surgical instrument it is. Understanding how to set parameters begins with understanding fundamental concepts of machine function. This study analyses the critical concepts of partial occlusion phaco, occlusion phaco and pump technology. In addition, phaco energy categories as well as variations in phaco energy production are explored. Contemporary power modulations and pump controls allow for the enhancement of partial occlusion phacoemulsification. These significant changes in anterior chamber dynamics produce a balanced environment for phaco, fewer complications, and improved patient outcomes.

  4. Nuclear analyses of the Pietroasa gold hoard

    International Nuclear Information System (INIS)

    Cojocaru, V.; Besliu, C.

    1999-01-01

    By means of nuclear analyses the concentrations of Au, Ag, Cu, Ir, Os, Pt, Co and Hg were measured in the 12 artifacts of the gold hoard discovered in 1837 at Pietroasa, Buzau county, Romania. The concentrations of the first four elements were used to compare different stylistic groups assumed by historians. Comparisons with gold nuggets from the old Dacian territory and gold Roman imperial coins were also made. A good agreement was found with the oldest hypothesis, which considers that the hoard represents three styles appropriated mainly by the Goths. (author)

  5. An evaluation of the Olympus "Quickrate" analyser.

    Science.gov (United States)

    Williams, D G; Wood, R J; Worth, H G

    1979-02-01

    The Olympus "Quickrate", a photometer built for both kinetic and end point analysis was evaluated in this laboratory. Aspartate transaminase, lactate dehydrogenase, hydroxybutyrate dehydrogenase, creatine kinase, alkaline phosphatase and gamma glutamyl transpeptidase were measured in the kinetic mode and glucose, urea, total protein, albumin, bilirubin, calcium and iron in the end point mode. Overall, good correlation was observed with routine methodologies and the precision of the methods was acceptable. An electrical evaluation was also performed. In our hands, the instrument proved to be simple to use and gave no trouble. It should prove useful for paediatric and emergency work, and as a back up for other analysers.

  6. Nitrous Oxide Abuse and Vitamin B12 Action in a 20-Year-Old Woman: A Case Report.

    Science.gov (United States)

    Duque, Miriam Andrea; Kresak, Jesse L; Falchook, Adam; Harris, Neil S

    2015-01-01

    Herein, we report a case of a 20-year-old (ethnicity not reported) woman with a history of nitrous oxide abuse and clinical symptoms consistent with spinal cord subacute combined degeneration with associated low serum concentrations of vitamin B12, elevated methylmalonic acid levels, and radiologic evidence of demyelination of the dorsal region of the spinal column. The health of the patient improved dramatically with B12 supplementation. In this case, we discuss the interaction of nitrous oxide with the enzymatic pathways involved in the biochemistry of vitamin B12. Copyright© by the American Society for Clinical Pathology (ASCP).

  7. Esophageal Foreign Body: A Case Report of a Refractory Croup in a 20-Month-Old Boy

    Directory of Open Access Journals (Sweden)

    sevil Nasirmohtaram

    2016-11-01

    Full Text Available Introduction: Foreign body ingestion is common among children and more common in boys and in children under the age of 3. It can present with a wide variety of symptoms like dysphagia and drooling or symptoms related to the upper aerodigestive tract. Case Report: A 20-month-old male presented with refractory croup and poor feeding for 2 weeks. Bronchoscopy and esophagoscopy were performed due to a suspicious history of eating a loquat. The core of the fruit was found in the esophagus. Conclusion: Physicians should be aware of the variability of esophageal foreign body presentations to prevent serious complications due to delay in diagnosis.

  8. A 20-year-old man with large gastric lipoma--imaging, clinical symptoms, pathological findings and surgical treatment.

    Science.gov (United States)

    Kapetanakis, Stylianos; Papathanasiou, Jiannis; Fiska, Aliki; Ververidis, Athanasios; Dimitriou, Thespis; Hristov, Zheliazko; Paskalev, George

    2010-01-01

    A broad search of the available literature yielded no other report of gastric lipoma of that size (13.5 x 6.5 x 4.5 cm) at this early age. The patient (a 20-year-old man with giant lipoma in the anterior gastric wall) presented with haematemesis and melena after excessive alcohol consumption. Gastric resection was performed. At 5-year follow up the patient is healthy and doing well. Epidemiology of gastric lipoma, the differential diagnosis, means of diagnosis and treatment are discussed.

  9. [Parvovirus B19 infection as the cause of hepatitis and neutrophil granulocytosis in a 20-year old woman].

    Science.gov (United States)

    Wiggers, H; Rasmussen, L H; Møller, A

    1995-10-23

    A case of Parvovirus B19 infection (erythema infectiosum) in a 20 year old woman is presented. The patient presented with fever, arthritis in one knee, neutrophil granulocytosis and biochemical evidence of hepatitis. Serological evidence of Parvovirus B19 infection was found as the only explanation of the clinical picture. Hepatitis was due to Parvovirus B19 infection as there was no serological evidence of EBV or CMV reactivation. Neutrophil granulocytosis and thrombocytosis were found and were probably due to an active bone marrow in the recovery phase of bone marrow aplasia.

  10. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today, it is important for software companies to build software systems in a short time interval, to reduce costs and to have a good market position. Therefore, well organized and systematic development approaches are required. Reusing software components, which are well tested, can be a good solution for developing software applications in an effective manner. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to think that software components can be combined without any problems. Software components themselves are well tested, of course, but even when they are composed together problems occur. Most problems are based on interaction and communication. To avoid such errors, a framework has to be developed for analysing software components. That framework determines the compatibility of corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified by using an analyser framework, and describes the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed and results are shown using a test environment.
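
    Python's own ast module offers a loose analogy for the syntax-tree-based checking described above; the sketch below extracts function signatures from two hypothetical component sources and flags obvious call-compatibility problems (an illustration of the idea, not the ASLT framework itself):

      import ast

      # Hypothetical component sources, invented for the example: A provides a
      # function, B calls it.
      provider_src = "def transform(records, batch_size=100):\n    return records\n"
      consumer_src = "def pipeline(data):\n    return transform(data, 50, strict=True)\n"

      def function_defs(source):
          """Map function name -> (positional parameter names, number required)."""
          defs = {}
          for node in ast.walk(ast.parse(source)):
              if isinstance(node, ast.FunctionDef):
                  names = [a.arg for a in node.args.args]
                  defs[node.name] = (names, len(names) - len(node.args.defaults))
          return defs

      def calls(source):
          """Yield (callee, positional-arg count, keyword names) for simple name calls."""
          for node in ast.walk(ast.parse(source)):
              if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                  yield node.func.id, len(node.args), [k.arg for k in node.keywords]

      provided = function_defs(provider_src)
      for name, n_pos, keywords in calls(consumer_src):
          if name not in provided:
              print(f"{name}: no matching component function")
              continue
          params, required = provided[name]
          if n_pos < required or n_pos > len(params):
              print(f"{name}: {n_pos} positional argument(s) do not fit {params}")
          for kw in keywords:
              if kw not in params:
                  print(f"{name}: unexpected keyword '{kw}'")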

  11. Passive safety injection experiments and analyses (PAHKO)

    International Nuclear Information System (INIS)

    Tuunanen, J.

    1998-01-01

    PAHKO project involved experiments on the PACTEL facility and computer simulations of selected experiments. The experiments focused on the performance of Passive Safety Injection Systems (PSIS) of Advanced Light Water Reactors (ALWRs) in Small Break Loss-Of-Coolant Accident (SBLOCA) conditions. The PSIS consisted of a Core Make-up Tank (CMT) and two pipelines (Pressure Balancing Line, PBL, and Injection Line, IL). The examined PSIS worked efficiently in SBLOCAs although the flow through the PSIS stopped temporarily if the break was very small and the hot water filled the CMT. The experiments demonstrated the importance of the flow distributor in the CMT to limit rapid condensation. The project included validation of three thermal-hydraulic computer codes (APROS, CATHARE and RELAP5). The analyses showed the codes are capable to simulate the overall behaviour of the transients. The detailed analyses of the results showed some models in the codes still need improvements. Especially, further development of models for thermal stratification, condensation and natural circulation flow with small driving forces would be necessary for accurate simulation of the PSIS phenomena. (orig.)

  12. Used Fuel Management System Interface Analyses - 13578

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Robert; Busch, Ingrid [Oak Ridge National Laboratory, P.O. Box 2008, Bldg. 5700, MS-6170, Oak Ridge, TN 37831 (United States); Nutt, Mark; Morris, Edgar; Puig, Francesc [Argonne National Laboratory (United States); Carter, Joe; Delley, Alexcia; Rodwell, Phillip [Savannah River National Laboratory (United States); Hardin, Ernest; Kalinina, Elena [Sandia National Laboratories (United States); Clark, Robert [U.S. Department of Energy (United States); Cotton, Thomas [Complex Systems Group (United States)

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  13. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
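
    A minimal sketch of the pinching idea in a purely probabilistic setting (the dike-style freeboard model and all distributions below are invented stand-ins, and the sketch ignores the epistemic, probability-bounds side of the report's method):

      import numpy as np

      rng = np.random.default_rng(0)
      N = 100_000

      # Hypothetical margin model: margin = crest height - surge - wave run-up.
      def sample(crest=None, surge=None, runup=None):
          crest = rng.normal(8.0, 0.2, N) if crest is None else np.full(N, crest)
          surge = rng.gumbel(4.0, 0.6, N) if surge is None else np.full(N, surge)
          runup = rng.normal(1.5, 0.4, N) if runup is None else np.full(N, runup)
          return crest - surge - runup

      base_var = sample().var()
      pinches = {"crest": {"crest": 8.0},
                 "surge": {"surge": 4.0 + 0.5772 * 0.6},   # Gumbel mean = loc + gamma*scale
                 "runup": {"runup": 1.5}}
      for name, kwargs in pinches.items():
          reduction = 1.0 - sample(**kwargs).var() / base_var
          print(f"pinching {name}: output variance reduced by {100 * reduction:.1f}%")

    Inputs whose pinching removes the largest share of output variance are the ones most worth learning more about.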

  14. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there is no work to study these properties of bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing the fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find the fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m), Delicious data sets and (u, v)-flower model. Meanwhile, we observe the shifted power-law or exponential behavior in other several networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that the multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of bipartite network with two types different nodes, we give the different weights for the nodes of different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  15. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

    A scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) from a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at a shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes, and also improve overall ethylene plant operations

  16. Overview of cooperative international piping benchmark analyses

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1982-01-01

    This paper presents an overview of an effort initiated in 1976 by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency (IAEA) to evaluate detailed and simplified inelastic analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IAEA countries descriptions of tests and test results for piping systems or bends (with emphasis on high temperature inelastic tests), to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analyses results. Of the problem descriptions submitted three were selected to be used: a 90 0 -elbow at 600 0 C with an in-plane transverse force; a 90 0 -elbow with an in-plane moment; and a 180 0 -elbow at room temperature with a reversed, cyclic, in-plane transverse force. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this paper. 15 figures

  17. Ethics of cost analyses in medical education.

    Science.gov (United States)

    Walsh, Kieran

    2013-11-01

    Cost analyses in medical education are rarely straightforward, and rarely lead to clear-cut conclusions. Occasionally they do lead to clear conclusions but even when that happens, some stakeholders will ask difficult but valid questions about what to do following cost analyses-specifically about distributive justice in the allocation of resources. At present there are few or no debates about these issues and rationing decisions that are taken in medical education are largely made subconsciously. Distributive justice 'concerns the nature of a socially just allocation of goods in a society'. Inevitably there is a large degree of subjectivity in the judgment as to whether an allocation is seen as socially just or ethical. There are different principles by which we can view distributive justice and which therefore affect the prism of subjectivity through which we see certain problems. For example, we might say that distributive justice at a certain institution or in a certain medical education system operates according to the principle that resources must be divided equally amongst learners. Another system may say that resources should be distributed according to the needs of learners or even of patients. No ethical system or model is inherently right or wrong, they depend on the context in which the educator is working.

  18. Analysing lawyers’ attitude towards knowledge sharing

    Directory of Open Access Journals (Sweden)

    Wole M. Olatokun

    2012-02-01

    Full Text Available Objectives: The study examined and identified the factors that affect lawyers’ attitudes to knowledge sharing, and their knowledge sharing behaviour. Specifically, it investigated the relationship between the salient beliefs affecting the knowledge sharing attitude of lawyers, and applied a modified version of the Theory of Reasoned Action (TRA) in the knowledge sharing context to predict how these factors affect their knowledge sharing behaviour. Method: A field survey of 273 lawyers was carried out, using a questionnaire for data collection. Collected data on all variables were structured into grouped frequency distributions. Principal Component Factor Analysis was applied to reduce the constructs and Simple Regression was applied to test the hypotheses. These were tested at the 0.05 level of significance. Results: Results showed that expected associations and contributions were the major determinants of lawyers’ attitudes towards knowledge sharing. Expected reward was not significantly related to lawyers’ attitudes towards knowledge sharing. A positive attitude towards knowledge sharing was found to lead to a positive intention to share knowledge, although a positive intention to share knowledge did not significantly predict a positive knowledge sharing behaviour. The level of Information Technology (IT) usage was also found to significantly affect the knowledge sharing behaviour of lawyers. Conclusion: It was recommended that law firms in the study area should deploy more IT infrastructure and services that encourage effective knowledge sharing amongst lawyers.
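
    Not part of the record: a minimal sketch of the style of analysis described (principal component extraction of belief items followed by a simple regression of attitude on the extracted component), using invented Likert-style survey data and scikit-learn. The variable names, the single-component choice and the simulated responses are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)

# Invented responses: 273 lawyers x 6 belief items (e.g. expected associations,
# expected contributions, expected rewards), plus an attitude score.
beliefs = rng.integers(1, 6, size=(273, 6)).astype(float)
attitude = 0.6 * beliefs.mean(axis=1) + rng.normal(0, 0.5, 273)

# Reduce the belief items to one principal component (factor extraction) ...
component = PCA(n_components=1).fit_transform(beliefs)

# ... and regress attitude on that component (analogous to a simple regression test).
reg = LinearRegression().fit(component, attitude)
print("R^2 of attitude on the belief component:", reg.score(component, attitude))
```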

  19. Climate Prediction Center - Expert Assessments Index

    Science.gov (United States)

    Monthly regional climate analyses and maps are available from the Climate Prediction Center under Monitoring and Data > Global Climate Data & Maps.

  20. Monitoring and prediction of natural disasters

    International Nuclear Information System (INIS)

    Kondratyev, K. Ya; Krapivin, V. F.

    2004-01-01

    The problems of predicting natural disasters, and of synthesising environmental monitoring systems to collect, store, and process the information relevant to their solution, are analysed. A three-level methodology is proposed for making decisions concerning natural disaster dynamics. The methodology is based on the assessment of environmental indicators and the use of numerical models of the environment.

  1. [ABIN1 is not involved in imatinib upregulating A20 to inhibit the activation of NF-κB pathway in Jurkat T cells].

    Science.gov (United States)

    Chen, Qian; Wang, Senlin; Lin, Chen; Chen, Shaohua; Zhao, Xiaoling; Li, Yangqiu

    2017-05-01

    Objective To investigate the effect of imatinib (IM) on the expressions of A20-binding inhibitor of NF-κB1 (ABIN1) and A20 in Jurkat T cells. Methods Jurkat T cells were treated with 25, 50 and 100 nmol/L IM for 24 hours. The mRNA and protein levels of ABIN1, A20 and NF-κB were detected by real-time quantitative PCR and Western blotting. Results IM significantly inhibited both mRNA and protein levels of ABIN1 and NF-κB, but raised the mRNA and protein levels of A20; while phorbol 12-myristate 13-acetate/ionomycin increased the expression levels of ABIN1 and A20 mRNA and protein. Conclusion IM could upregulate A20 protein to inhibit the activation of NF-κB pathway in Jurkat T cells, which was independent of the ABIN1 protein.

  2. Pathway analyses implicate glial cells in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Laramie E Duncan

    Full Text Available The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing, with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies and findings from large-scale genetic analyses are only beginning to emerge. Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls), and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). The Glia-Oligodendrocyte pathway was associated with schizophrenia, after correction for multiple tests, according to the primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance) and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with the strongest association to the Glia-Astrocyte pathway (p = 0.002). Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or lifestyle). While not the primary purpose of our study
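
    MAGENTA, INRICH, ALIGATOR and Set Screen are specialised tools whose methods are not reproduced here. Purely to illustrate the basic idea of gene-set (pathway) enrichment that underlies such analyses, the sketch below asks whether "significant" genes are over-represented in a hypothetical glial pathway using a hypergeometric test from scipy; all counts are invented.

```python
from scipy.stats import hypergeom

# Invented counts for illustration only.
total_genes = 18000          # genes tested genome-wide
significant_genes = 450      # genes passing some significance threshold
pathway_size = 120           # genes in a hypothetical Glia-Oligodendrocyte set
overlap = 12                 # significant genes that fall inside the pathway

# Probability of seeing at least `overlap` pathway genes among the significant genes by chance.
p_value = hypergeom.sf(overlap - 1, total_genes, pathway_size, significant_genes)
print(f"enrichment p-value: {p_value:.3g}")
```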

  3. Activation and afterheat analyses for the HCPB test blanket

    International Nuclear Information System (INIS)

    Pereslavtsev, P.; Fischer, U.

    2007-01-01

    The Helium-Cooled Pebble Bed (HCPB) blanket is one of two breeder blanket concepts developed in the framework of the European Fusion Technology Programme for performance tests in ITER. The recent development programme focussed on the detailed engineering design of the Test Blanket Module (TBM) and associated systems, including the assessment of safety and licensing related issues, with the objective to prepare for a preliminary Safety Report. To provide a sound data basis for the safety analyses of the HCPB TBM system in ITER, the afterheat and activity inventories were assessed making use of a code system that allows performing 3D activation calculations by linking the Monte Carlo transport code MCNP and the fusion inventory code FISPACT through an appropriate interface. A suitable MCNP model of a 20 degree ITER torus sector with an integrated TBM of the HCPB PI (Plant Integration) type in the horizontal test blanket port was developed and adapted to the requirements for coupled 3D neutron transport and activation calculations. Two different irradiation scenarios were considered in the coupled 3D neutron transport and activation calculations. The first one is representative of the TBM irradiation in ITER, with a total of 9000 neutron pulses over a three (calendar) year period. It was simulated by a continuous irradiation for 3 years minus the last month and a discontinuous irradiation with 250 pulses (420 s pulse length, 1200 s power-off in between) over the last month. The second (conservative) irradiation scenario assumes an extended irradiation time over the full anticipated lifetime of ITER according to the M-DRG-1 irradiation scenario with a total first wall fluence of 0.3 MWa/m². For both irradiation scenarios the radioactivity inventories, the afterheat and the contact gamma dose were calculated as a function of the decay time. Data were processed for the total activity and afterheat of the TBM, its constituting components and materials including their

  4. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  5. DEPUTY: analysing architectural structures and checking style

    International Nuclear Information System (INIS)

    Gorshkov, D.; Kochelev, S.; Kotegov, S.; Pavlov, I.; Pravilnikov, V.; Wellisch, J.P.

    2001-01-01

    The DepUty (dependencies utility) can be classified as a project and process management tool. The main goal of DepUty is to assist, by means of source code analysis and graphical representation using UML, in understanding dependencies of sub-systems and packages in CMS Object Oriented software, to understand architectural structure, and to schedule code release in modularised integration. It also allows a newcomer to more easily understand the global structure of CMS software, and to avoid circular dependencies up-front or re-factor the code in case it was already too close to the edge of non-maintainability. The authors discuss the various views DepUty provides to analyse package dependencies, and illustrate both the metrics and style checking facilities it provides.

  6. Spatial Analyses of Harappan Urban Settlements

    Directory of Open Access Journals (Sweden)

    Hirofumi Teramura

    2006-12-01

    Full Text Available The Harappan Civilization occupies a unique place among the early civilizations of the world with its well planned urban settlements, advanced handicraft and technology, religious and trade activities. Using a Geographical Information System (GIS), this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM) according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and the change in some factors, such as topographic features, river passages or sea level changes, will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.

  7. The plant design analyser and its applications

    International Nuclear Information System (INIS)

    Whitmarsh-Everiss, M.J.

    1992-01-01

    Consideration is given to the history of computational methods for the non-linear dynamic analysis of plant behaviour. This is traced from analogue to hybrid computers. When these were phased out simulation languages were used in the batch mode and the interactive computational capabilities were lost. These have subsequently been recovered using mainframe computing architecture in the context of small models using the Prototype Plant Design Analyser. Given the development of parallel processing architectures, the restriction on model size can be lifted. This capability and the use of advanced Work Stations and graphics software has enabled an advanced interactive design environment to be developed. This system is generic and can be used, with suitable graphics development, to study the dynamics and control behaviour of any plant or system for minimum cost. Examples of past and possible future uses are identified. (author)

  8. Abundance analyses of thirty cool carbon stars

    International Nuclear Information System (INIS)

    Utsumi, Kazuhiko

    1985-01-01

    The results were previously obtained by use of the absolute gf-values and the cosmic abundance as a standard. These gf-values were found to contain large systematic errors, and as a result, the solar photospheric abundances were revised. Our previous results, therefore, must be revised by using new gf-values, and abundance analyses are extended for as many carbon stars as possible. In conclusion, in normal cool carbon stars heavy metals are overabundant by factors of 10-100 and rare-earth elements are overabundant by a factor of about 10, while in J-type cool carbon stars the 12C/13C ratio is smaller, the C2 and CN bands and the Li 6708 line are stronger than in normal cool carbon stars, and the abundances of s-process elements with respect to Fe are nearly normal. (Mori, K.)

  9. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  10. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, it is said that there are two essential uncertainties in the field of reliability and safety analyses: one is a probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the existence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In such a context this collection of works marks a milestone in the debate between probability theory and fuzzy theory. This volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  11. Precise Chemical Analyses of Planetary Surfaces

    Science.gov (United States)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  12. Seismic analyses of structures. 1st draft

    International Nuclear Information System (INIS)

    David, M.

    1995-01-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as a response to seismic input. This analysis was performed with a 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of the dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes with figures of indicated nodes and important nodes of free vibration.

  13. Analysing Terrorism from a Systems Thinking Perspective

    Directory of Open Access Journals (Sweden)

    Lukas Schoenenberger

    2014-02-01

    Full Text Available Given the complexity of terrorism, solutions based on single factors are destined to fail. Systems thinking offers various tools for helping researchers and policy makers comprehend terrorism in its entirety. We have developed a semi-quantitative systems thinking approach for characterising relationships between variables critical to terrorism and their impact on the system as a whole. For a better understanding of the mechanisms underlying terrorism, we present a 16-variable model characterising the critical components of terrorism and perform a series of highly focused analyses. We show how to determine which variables are best suited for government intervention, describing in detail their effects on the key variable—the political influence of a terrorist network. We also offer insights into how to elicit variables that destabilise and ultimately break down these networks. Because we clarify our novel approach with fictional data, the primary importance of this paper lies in the new framework for reasoning that it provides.
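
    The 16-variable model itself is not listed in this record. As an invented, much smaller stand-in, the sketch below shows the kind of semi-quantitative cross-impact bookkeeping such approaches use: summing the strengths of a variable's outgoing and incoming influences to judge whether it behaves as an active lever (a candidate for intervention) or as a driven outcome such as the political influence of a terrorist network. Variable names and matrix entries are illustrative assumptions.

```python
import numpy as np

# Invented influence matrix: entry [i, j] = strength of variable i's effect on j (0-3).
variables = ["funding", "recruitment", "public support", "political influence"]
impact = np.array([
    [0, 3, 1, 2],
    [1, 0, 2, 3],
    [2, 2, 0, 3],
    [1, 1, 2, 0],
])

active_sum = impact.sum(axis=1)    # how strongly each variable drives the system
passive_sum = impact.sum(axis=0)   # how strongly it is driven by the system

for name, a, p in zip(variables, active_sum, passive_sum):
    role = "lever (candidate for intervention)" if a > p else "outcome (monitor)"
    print(f"{name:20s} active={a} passive={p} -> {role}")
```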

  14. Seismic analyses of structures. 1st draft

    Energy Technology Data Exchange (ETDEWEB)

    David, M. [David Consulting, Engineering and Design Office (Czech Republic)]

    1995-07-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as a response to seismic input. This analysis was performed with a 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of the dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes with figures of indicated nodes and important nodes of free vibration.

  15. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet, growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  16. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of the quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with an assessment of the type of core/concrete attack that would occur. The ex-vessel coolability of the debris is evaluated by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event is considered. The headings in the DET, selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel, are also discussed.
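
    The SBWR containment and decomposition event trees themselves are not given in this record. As a toy illustration of how a decomposition event tree is quantified, the sketch below enumerates the branch combinations of a few invented headings, multiplies the conditional branch probabilities along each path, and sums the paths that end in a coolable debris configuration. The headings, probabilities and end-state rule are assumptions, not the plant-specific analysis.

```python
from itertools import product

# Invented DET headings: each maps an outcome to a conditional probability.
headings = {
    "vessel_pressure_low":  {True: 0.7, False: 0.3},
    "debris_spread_wide":   {True: 0.6, False: 0.4},
    "water_present":        {True: 0.8, False: 0.2},
}

def coolable(path):
    # Illustrative end-state rule: debris is coolable if it is spread widely
    # and water is present, regardless of vessel pressure.
    return path["debris_spread_wide"] and path["water_present"]

p_coolable = 0.0
for outcomes in product(*[h.items() for h in headings.values()]):
    path = dict(zip(headings.keys(), [o for o, _ in outcomes]))
    prob = 1.0
    for _, p in outcomes:
        prob *= p
    if coolable(path):
        p_coolable += prob

print(f"P(debris coolable ex-vessel) = {p_coolable:.2f}")  # 0.6 * 0.8 = 0.48 here
```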

  17. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof

    2013-01-01

    System models have recently been introduced to model organisations and evaluate their vulnerability to threats, especially insider threats. For the latter in particular these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outside attackers. Therefore, many attacks are considerably easier to perform for insiders than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex, if not impossible, task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach to make the behaviour a separate component in system models, and explore how to integrate...

  18. ATLAS helicity analyses in beauty hadron decays

    CERN Document Server

    Smizanska, M

    2000-01-01

    The ATLAS detector will allow a precise spatial reconstruction of the kinematics of B hadron decays. In combination with the efficient lepton identification applied already at trigger level, ATLAS is expected to provide large samples of exclusive decay channels cleanly separable from background. These data sets will allow spin-dependent analyses leading to the determination of production and decay parameters, which are not accessible if the helicity amplitudes are not separated. Measurement feasibility studies for the decays B_s^0 → J/ψ φ and Λ_b^0 → Λ J/ψ, presented in this document, show the experimental precisions that can be achieved in determination of B_s^0 and Λ_b^0 characteristics. (19 refs).

  19. Thermal hydraulic reactor safety analyses and experiments

    International Nuclear Information System (INIS)

    Holmstroem, H.; Eerikaeinen, L.; Kervinen, T.; Kilpi, K.; Mattila, L.; Miettinen, J.; Yrjoelae, V.

    1989-04-01

    The report introduces the results of the thermal hydraulic reactor safety research performed in the Nuclear Engineering Laboratory of the Technical Research Centre of Finland (VTT) during the years 1972-1987. Practical applications, i.e. analyses for the safety authorities and power companies, are also presented. The emphasis is on a description of the state-of-the-art know-how. The report describes VTT's most important computer codes, both those of foreign origin and those developed at VTT, and their assessment work, VTT's own experimental research, as well as international experimental projects and other forms of cooperation VTT has participated in. Appendix 8 contains a comprehensive list of the most important publications and technical reports produced; they present the content and results of the research in detail. (orig.)

  20. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

    Full Text Available Modern geomorphological research uses quantitative statistical and cartographic methods to analyse topographic relief features and the connections between them on the basis of good-quality numeric parameters. Topographic features matter both for natural processes and for human activities. Important morphological characteristics include slope angle, hypsometry and exposition. Even small and easily overlooked differences in slope can strongly affect land configuration, hypsometry and exposition. Exposition modifies light and heat and, with them, a set of interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the course of photosynthesis, the yield of agricultural crops, the height of the snow limit, etc. [Project of the Ministry of Science of the Republic of Serbia, no. 176008 and no. III44006]

  1. Attitude stability analyses for small artificial satellites

    International Nuclear Information System (INIS)

    Silva, W R; Zanardi, M C; Formiga, J K S; Cabette, R E S; Stuchi, T J

    2013-01-01

    The objective of this paper is to analyze the stability of the rotational motion of a symmetrical spacecraft in a circular orbit. The equilibrium points and regions of stability are established when components of the gravity gradient torque acting on the spacecraft are included in the equations of rotational motion, which are described by Andoyer's variables. The nonlinear stability of the equilibrium points of the rotational motion is analysed here by the Kovalev-Savchenko theorem. With the application of the Kovalev-Savchenko theorem, it is possible to verify whether they remain stable under the influence of the higher-order terms of the normal Hamiltonian. In this paper, numerical simulations are made for a small hypothetical artificial satellite. Several stable equilibrium points were determined, and regions around these points have been established by variations in the orbital inclination and in the spacecraft principal moment of inertia. The present analysis can directly contribute to the maintenance of the spacecraft's attitude.

  2. Cointegration Approach to Analysing Inflation in Croatia

    Directory of Open Access Journals (Sweden)

    Lena Malešević-Perović

    2009-06-01

    Full Text Available The aim of this paper is to analyse the determinants of inflation in Croatia in the period 1994:6-2006:6. We use a cointegration approach and find that increases in wages positively influence inflation in the long-run. Furthermore, in the period from June 1994 onward, the depreciation of the currency also contributed to inflation. Money does not explain Croatian inflation. This irrelevance of the money supply is consistent with its endogeneity to exchange rate targeting, whereby the money supply is determined by developments in the foreign exchange market. The value of inflation in the previous period is also found to be significant, thus indicating some inflation inertia.
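
    Not from the record: a minimal sketch of an Engle-Granger cointegration test of the kind used in such studies, applied to simulated wage and price series with statsmodels. The series are synthetic and the specification is reduced to two variables; the paper's actual data, exchange-rate term and lag structure are not reproduced.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(7)
n = 145  # roughly the number of monthly observations for 1994:6-2006:6

# Simulated log wages as a random walk, and log prices sharing its stochastic trend.
log_wages = np.cumsum(rng.normal(0.005, 0.01, n))
log_prices = 0.8 * log_wages + rng.normal(0, 0.02, n)

t_stat, p_value, _ = coint(log_prices, log_wages)
print(f"Engle-Granger t-statistic = {t_stat:.2f}, p-value = {p_value:.3f}")
```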

  3. Deterministic analyses of severe accident issues

    International Nuclear Information System (INIS)

    Dua, S.S.; Moody, F.J.; Muralidharan, R.; Claassen, L.B.

    2004-01-01

    Severe accidents in light water reactors involve complex physical phenomena. In the past there has been a heavy reliance on simple assumptions regarding physical phenomena, alongside probability methods, to evaluate the risks associated with severe accidents. Recently GE has developed realistic methodologies that permit deterministic evaluations of severe accident progression and of some of the associated phenomena in the case of Boiling Water Reactors (BWRs). These deterministic analyses indicate that, with appropriate system modifications and operator actions, core damage can be prevented in most cases. Furthermore, in cases where core-melt is postulated, containment failure can either be prevented or significantly delayed to allow sufficient time for recovery actions to mitigate severe accidents.

  4. Risques naturels en montagne et analyse spatiale

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

    Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and return period; and the vulnerability, which represents the set of assets and people that can be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work focuses essentially on taking vulnerability into account in the management of natural hazards. Its assessment necessarily involves some degree of spatial analysis that accounts for human occupation and the different scales of land use. However, the spatial assessment, whether of assets and people or of indirect effects, runs into numerous problems. The extent of land occupation must be estimated. Moreover, data processing implies constant changes of scale to move from point features to surfaces, something that geographic information systems do not handle perfectly. Risk management imposes strong planning constraints; taking vulnerability into account allows a better understanding and management of the spatial constraints that natural hazards imply. Keywords: hazard, spatial analysis, natural hazards, GIS, vulnerability

  5. Isotropy analyses of the Planck convergence map

    Science.gov (United States)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contamination, and compare them with the features expected in the set of simulated convergence maps also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, revealed through a χ2 analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ2 values surround the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.
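
    For illustration only (this is not the Planck analysis pipeline): the sketch below mimics the local-variance comparison described, computing the variance of a data map in sky patches and flagging patches whose variance deviates from an ensemble of simulated maps by more than 2 sigma. Plain random arrays stand in for the masked convergence map and its simulations, and the patching scheme is an invented simplification.

```python
import numpy as np

rng = np.random.default_rng(3)
n_pix, n_patches, n_sims = 12000, 40, 300

data_map = rng.normal(0, 1.0, n_pix)              # stand-in convergence map
sim_maps = rng.normal(0, 1.0, (n_sims, n_pix))    # stand-in simulation ensemble

def patch_variances(m):
    return np.array([p.var() for p in np.array_split(m, n_patches)])

var_data = patch_variances(data_map)
var_sims = np.array([patch_variances(m) for m in sim_maps])

# Per-patch significance of the data variance against the simulation spread.
z = (var_data - var_sims.mean(axis=0)) / var_sims.std(axis=0)
print("patches deviating by more than 2 sigma:", np.where(np.abs(z) > 2)[0])
```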

  6. The radiation analyses of ITER lower ports

    International Nuclear Information System (INIS)

    Petrizzi, L.; Brolatti, G.; Martin, A.; Loughlin, M.; Moro, F.; Villari, R.

    2010-01-01

    The ITER Vacuum Vessel has upper, equatorial, and lower ports used for equipment installation, diagnostics, heating and current drive systems, cryo-vacuum pumping, and access inside the vessel for maintenance. At the level of the divertor, the nine lower ports for remote handling, cryo-vacuum pumping and diagnostics are inclined downwards and toroidally located every 40°. The cryopump port additionally has a branch to accommodate a second cryopump. The ports, as openings in the Vacuum Vessel, permit radiation streaming out of the vessel, which affects the heating in the components in the outer regions of the machine inside and outside the ports. Safety concerns are also raised with respect to the dose after shutdown at the cryostat behind the ports: in such zones the radiation dose level must be kept below the regulatory limit to allow personnel access for maintenance purposes. Neutronic analyses have been required to qualify the ITER project related to the lower ports. A 3-D model was used to take into account full details of the ports and the lower machine surroundings. MCNP version 5 (release 1.40) has been used with the FENDL 2.1 nuclear data library. The ITER 40° model distributed by the ITER Organization was developed in the lower part to include the relevant details. The results of a first analysis, focused on the cryopump system only, were recently published. In this paper more complete data on the cryopump port and analysis for the remote handling port and the diagnostic rack are presented; the results of both analyses give a complete map of the radiation loads in the outer divertor ports. Nuclear heating, dpa, tritium production, and dose rates after shutdown are provided and the implications for the design are discussed.

  7. Analysing lawyers’ attitude towards knowledge sharing

    Directory of Open Access Journals (Sweden)

    Wole M. Olatokun

    2012-09-01

    Method: A field survey of 273 lawyers was carried out, using a questionnaire for data collection. Collected data on all variables were structured into grouped frequency distributions. Principal Component Factor Analysis was applied to reduce the constructs and Simple Regression was applied to test the hypotheses. These were tested at the 0.05 level of significance. Results: Results showed that expected associations and contributions were the major determinants of lawyers’ attitudes towards knowledge sharing. Expected reward was not significantly related to lawyers’ attitudes towards knowledge sharing. A positive attitude towards knowledge sharing was found to lead to a positive intention to share knowledge, although a positive intention to share knowledge did not significantly predict a positive knowledge sharing behaviour. The level of Information Technology (IT) usage was also found to significantly affect the knowledge sharing behaviour of lawyers. Conclusion: It was recommended that law firms in the study area should deploy more IT infrastructure and services that encourage effective knowledge sharing amongst lawyers.

  8. Antarctic observations available for IMS correlative analyses

    International Nuclear Information System (INIS)

    Rycroft, M.J.

    1982-01-01

    A review is provided of the wide-ranging observational programs of 25 stations operating on and around the continent of Antarctica during the International Magnetospheric Study (IMS). Attention is given to observations of geomagnetism, short period fluctuations of the earth's electromagnetic field, observations of the ionosphere and of whistler mode signals, observational programs in ionospheric and magnetospheric physics, upper atmosphere physics observations, details of magnetospheric programs conducted at Kerguelen, H-component magnetograms, magnetic field line oscillations, dynamic spectra of whistlers, and the variation of plasmapause position derived from recorded whistlers. The considered studies suggest that, in principle, if the level of magnetic activity is known, predictions can be made concerning the time at which the trough occurs, and the shape and the movement of the main trough

  9. TCA UO2/MOX core analyses

    International Nuclear Information System (INIS)

    Tahara, Yoshihisa; Noda, Hideyuki

    2000-01-01

    In order to examine the adequacy of nuclear data, the TCA UO2 and MOX core experiments were analyzed with MVP using libraries based on ENDF/B-VI Mod.3 and JENDL-3.2. The ENDF/B-VI data underpredict k-eff values. The replacement of the 238U data with the JENDL-3.2 data and the adjustment of the 235U ν-value raise the k-eff values by 0.3% for UO2 cores, but still underpredict k-eff values. On the other hand, the nuclear data of JENDL-3.2 for H, O, Al and 238U, together with the 235U data of ENDF/B-VI whose ν-value in the thermal energy region is adjusted to the average value of JENDL-3.2, give a good prediction of k-eff. (author)

  10. Analysing CMS transfers using Machine Learning techniques

    CERN Document Server

    Diotalevi, Tommaso

    2016-01-01

    LHC experiments transfer more than 10 PB/week between all grid sites using the FTS transfer service. In particular, CMS manages almost 5 PB/week of FTS transfers with PhEDEx (Physics Experiment Data Export). FTS sends metrics about each transfer (e.g. transfer rate, duration, size) to a central HDFS storage at CERN. The work done during these three months as a Summer Student involved the use of ML techniques, within a CMS framework called DCAFPilot, to process these new data and generate predictions of transfer latencies on all links between Grid sites. This analysis will provide, as a future service, the information necessary to proactively identify and possibly fix latency-affected transfers over the WLCG.
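
    DCAFPilot and the real FTS/PhEDEx data are not shown here. As an illustrative stand-in, the sketch below trains a gradient-boosted regressor on invented transfer metrics (size, link rate, queueing delay) to predict transfer duration with scikit-learn; the feature names and synthetic data are assumptions chosen only to mirror the kind of metrics mentioned in the record.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000

size_gb = rng.lognormal(1.0, 1.0, n)        # invented transfer sizes (GB)
rate_mbps = rng.uniform(50, 2000, n)        # invented link rates (Mbps)
queue_s = rng.exponential(120, n)           # invented queueing delays (s)
duration_s = size_gb * 8000 / rate_mbps + queue_s + rng.normal(0, 30, n)

X = np.column_stack([size_gb, rate_mbps, queue_s])
X_tr, X_te, y_tr, y_te = train_test_split(X, duration_s, random_state=0)

model = GradientBoostingRegressor().fit(X_tr, y_tr)
print("held-out R^2 for predicted transfer duration:", model.score(X_te, y_te))
```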

  11. Micromechanical Failure Analyses for Finite Element Polymer Modeling

    Energy Technology Data Exchange (ETDEWEB)

    CHAMBERS,ROBERT S.; REEDY JR.,EARL DAVID; LO,CHI S.; ADOLF,DOUGLAS B.; GUESS,TOMMY R.

    2000-11-01

    Polymer stresses around sharp corners and in constrained geometries of encapsulated components can generate cracks leading to system failures. Often, analysts use maximum stresses as a qualitative indicator for evaluating the strength of encapsulated component designs. Although this approach has been useful for making relative comparisons screening prospective design changes, it has not been tied quantitatively to failure. Accurate failure models are needed for analyses to predict whether encapsulated components meet life cycle requirements. With Sandia's recently developed nonlinear viscoelastic polymer models, it has been possible to examine more accurately the local stress-strain distributions in zones of likely failure initiation looking for physically based failure mechanisms and continuum metrics that correlate with the cohesive failure event. This study has identified significant differences between rubbery and glassy failure mechanisms that suggest reasonable alternatives for cohesive failure criteria and metrics. Rubbery failure seems best characterized by the mechanisms of finite extensibility and appears to correlate with maximum strain predictions. Glassy failure, however, seems driven by cavitation and correlates with the maximum hydrostatic tension. Using these metrics, two three-point bending geometries were tested and analyzed under variable loading rates, different temperatures and comparable mesh resolution (i.e., accuracy) to make quantitative failure predictions. The resulting predictions and observations agreed well suggesting the need for additional research. In a separate, additional study, the asymptotically singular stress state found at the tip of a rigid, square inclusion embedded within a thin, linear elastic disk was determined for uniform cooling. The singular stress field is characterized by a single stress intensity factor K_a and the applicable K_a calibration relationship has been determined for both fully bonded and

  12. MDOT Pavement Management System : Prediction Models and Feedback System

    Science.gov (United States)

    2000-10-01

    As a primary component of a Pavement Management System (PMS), prediction models are crucial for one or more of the following analyses: : maintenance planning, budgeting, life-cycle analysis, multi-year optimization of maintenance works program, and a...

  13. Predicting outdoor sound

    CERN Document Server

    Attenborough, Keith; Horoshenkov, Kirill

    2014-01-01

    1. Introduction  2. The Propagation of Sound Near Ground Surfaces in a Homogeneous Medium  3. Predicting the Acoustical Properties of Outdoor Ground Surfaces  4. Measurements of the Acoustical Properties of Ground Surfaces and Comparisons with Models  5. Predicting Effects of Source Characteristics on Outdoor Sound  6. Predictions, Approximations and Empirical Results for Ground Effect Excluding Meteorological Effects  7. Influence of Source Motion on Ground Effect and Diffraction  8. Predicting Effects of Mixed Impedance Ground  9. Predicting the Performance of Outdoor Noise Barriers  10. Predicting Effects of Vegetation, Trees and Turbulence  11. Analytical Approximations including Ground Effect, Refraction and Turbulence  12. Prediction Schemes  13. Predicting Sound in an Urban Environment.

  14. Database-Driven Analyses of Astronomical Spectra

    Science.gov (United States)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled though, and thus at infrared wavelengths, one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band. The isotope. Molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic

  15. High performance liquid chromatography in pharmaceutical analyses

    Directory of Open Access Journals (Sweden)

    Branko Nikolin

    2004-05-01

    Full Text Available In the testing of pre-sale procedures, the marketing of drugs and their control over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; however, today it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of changing its polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The greater choice of stationary phases is the next factor that enables the realization of good separation. The separation line connected to specific and sensitive detector systems (spectrofluorimeter, diode detector, electrochemical detector), as well as hyphenated systems such as HPLC-MS and HPLC-NMR, are the basic elements on which such wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results and monitor the progress of the therapy of a disease.1 The measurement presented in Fig. 1 is the chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations before drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or

  16. High perfomance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In the testing of pre-sale procedures, the marketing of drugs and their control over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; however, today it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of changing its polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The greater choice of stationary phases is the next factor that enables the realization of good separation. The separation line connected to specific and sensitive detector systems (spectrofluorimeter, diode detector, electrochemical detector), as well as hyphenated systems such as HPLC-MS and HPLC-NMR, are the basic elements on which such wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results and monitor the progress of the therapy of a disease.1) The measurement presented in Fig. 1 is the chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations before drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  17. Uncertainty Analyses for Back Projection Methods

    Science.gov (United States)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far few comprehensive error analyses for back projection methods have been conducted, although it is evident that high frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths and focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme combining the Direct Solution Method and the Spectral Element Method. We then back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth; e.g., the error in horizontal location could be larger than 20 km for a depth of 40 km. If the array is located around the nodal direction of the direct P-waves, the teleseismic P-waves are dominated by the depth phases. Therefore, back projections are actually imaging the reflection points of depth phases more than the rupture front. Besides depth phases, the strong and long-lasting coda waves due to 3D effects near the trench can lead to additional complexities, which are also tested here. The strength contrast of different frequency contents in the rupture models also produces some variations in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer the earthquake rupture physics.
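
    MUSIC and Compressive Sensing are beyond a short snippet. As a stripped-down illustration of the basic back-projection operation that these synthetic tests probe, the sketch below shifts and stacks array waveforms over a grid of trial source positions and picks the position with the largest stacked energy. The geometry, velocity model and waveforms are invented, and depth phases and 3D structure, the very complications discussed above, are deliberately ignored.

```python
import numpy as np

rng = np.random.default_rng(5)
v, dt = 6.0, 0.05                        # assumed velocity (km/s) and sample step (s)
stations = rng.uniform(-50, 50, (20, 2))
true_src = np.array([12.0, -8.0])

# Synthetic records: a single pulse delayed by the travel time to each station, plus noise.
t = np.arange(0, 40, dt)
def pulse(t0):
    return np.exp(-((t - t0) ** 2) / 0.5)
records = np.array([pulse(np.linalg.norm(s - true_src) / v) for s in stations])
records += rng.normal(0, 0.05, records.shape)

# Back projection: remove the predicted delay for each grid node, stack, and keep the energy.
grid = [(x, y) for x in np.arange(-30, 31, 2.0) for y in np.arange(-30, 31, 2.0)]
def stacked_energy(p):
    shifts = (np.linalg.norm(stations - p, axis=1) / v / dt).astype(int)
    stack = sum(np.roll(r, -s) for r, s in zip(records, shifts))
    return np.max(stack ** 2)

best = max(grid, key=lambda p: stacked_energy(np.array(p)))
print("true source:", true_src, "back-projected source:", best)
```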

  18. Scanning electron microscopy and micro-analyses

    International Nuclear Information System (INIS)

    Brisset, F.; Repoux, L.; Ruste, J.; Grillon, F.; Robaut, F.

    2008-01-01

    Scanning electron microscopy (SEM) and the related micro-analyses are involved in extremely various domains, from the academic environments to the industrial ones. The overall theoretical bases, the main technical characteristics, and some complements of information about practical usage and maintenance are developed in this book. high-vacuum and controlled-vacuum electron microscopes are thoroughly presented, as well as the last generation of EDS (energy dispersive spectrometer) and WDS (wavelength dispersive spectrometer) micro-analysers. Beside these main topics, other analysis or observation techniques are approached, such as EBSD (electron backscattering diffraction), 3-D imaging, FIB (focussed ion beams), Monte-Carlo simulations, in-situ tests etc.. This book, in French language, is the only one which treats of this subject in such an exhaustive way. It represents the actualized and totally updated version of a previous edition of 1979. It gathers the lectures given in 2006 at the summer school of Saint Martin d'Heres (France). Content: 1 - electron-matter interactions; 2 - characteristic X-radiation, Bremsstrahlung; 3 - electron guns in SEM; 4 - elements of electronic optics; 5 - vacuum techniques; 6 - detectors used in SEM; 7 - image formation and optimization in SEM; 7a - SEM practical instructions for use; 8 - controlled pressure microscopy; 8a - applications; 9 - energy selection X-spectrometers (energy dispersive spectrometers - EDS); 9a - EDS analysis; 9b - X-EDS mapping; 10 - technological aspects of WDS; 11 - processing of EDS and WDS spectra; 12 - X-microanalysis quantifying methods; 12a - quantitative WDS microanalysis of very light elements; 13 - statistics: precision and detection limits in microanalysis; 14 - analysis of stratified samples; 15 - crystallography applied to EBSD; 16 - EBSD: history, principle and applications; 16a - EBSD analysis; 17 - Monte Carlo simulation; 18 - insulating samples in SEM and X-ray microanalysis; 18a - insulating

  19. Structural and mutational analyses of cis-acting sequences in the 5'-untranslated region of satellite RNA of bamboo mosaic potexvirus

    International Nuclear Information System (INIS)

    Annamalai, Padmanaban; Hsu, Y.-H.; Liu, Y.-P.; Tsai, C.-H.; Lin, N.-S.

    2003-01-01

    The satellite RNA of Bamboo mosaic virus (satBaMV) contains an open reading frame for a 20-kDa protein that is flanked by a 5'-untranslated region (UTR) of 159 nucleotides (nt) and a 3'-UTR of 129 nt. A secondary structure was predicted for the 5'-UTR of satBaMV RNA, which folds into a large stem-loop (LSL) and a small stem-loop. Enzymatic probing confirmed the existence of the LSL (nt 8-138) in the 5'-UTR. The essential cis-acting sequences in the 5'-UTR required for satBaMV RNA replication were determined by deletion and substitution mutagenesis. Their replication efficiencies were analyzed in Nicotiana benthamiana protoplasts and Chenopodium quinoa plants coinoculated with helper BaMV RNA. All deletion mutants abolished the replication of satBaMV RNA, whereas mutations introduced in most of the loop regions and stems showed either no replication or a decreased replication efficiency. Mutations that affected the positive-strand satBaMV RNA accumulation also affected the accumulation of negative-strand RNA; however, the accumulation of genomic and subgenomic RNAs of BaMV was not affected. Moreover, covariation analyses of natural satBaMV variants provide substantial evidence that the secondary structure in the 5'-UTR of satBaMV is necessary for efficient replication.

  20. Non-Hodgkin's Lymphoma of the Stomach: Treatment Outcomes for 57 Patients Over a 20-Year Period

    Directory of Open Access Journals (Sweden)

    Ching-Liang Ho

    2005-01-01

    Conclusion: Clinical stage is the most important factor predicting the long-term survival of patients with gastric NHL. Surgery may still be necessary in cases of failed gastroscopic diagnosis. In early-stage gastric NHL, non-surgical treatment seems able to achieve the aims of improved long-term survival and, in some instances, cure.

  1. Multichannel amplitude analyser for nuclear spectrometry

    International Nuclear Information System (INIS)

    Jankovic, S.; Milovanovic, B.

    2003-01-01

    A multichannel amplitude analyser with 4096 channels was designed. It is based on a fast 12-bit analog-to-digital converter. The intended purpose of the instrument is recording nuclear spectra by means of scintillation detectors. The computer link is established through an opto-isolated serial connection cable, thus reducing the instrument's sensitivity to disturbances originating from the digital circuitry. The data displayed on the screen are refreshed every 2.5 seconds. Impulse peak detection is implemented through differentiation of the amplified input signal, while synchronization with the data coming from the converter output is established by taking advantage of the internal 'pipeline' structure of the converter itself. The mode of operation of the built-in microcontroller ensures that no impulses are missed, and a simple logic network prevents the initiation of the amplitude-reading sequence for the next impulse if it arrives shortly after the preceding one. The solution proposed here demonstrated good performance at a comparatively low manufacturing cost, and is thus suitable for educational purposes (author)
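
    The peak-detection scheme described above (differentiating the amplified pulse and reading the converter output in step with its internal pipeline) can be mimicked in software. The following Python sketch is a hypothetical illustration only, not the instrument's firmware: it differentiates a sampled pulse train, takes each positive-to-negative crossing of the derivative as a pulse maximum, and histograms the pulse heights into 4096 channels.

      import numpy as np

      def build_spectrum(samples, threshold=50.0, n_channels=4096):
          # Differentiate the signal; a positive-to-negative crossing of the
          # derivative marks a local maximum, i.e. a candidate pulse peak.
          derivative = np.diff(samples)
          spectrum = np.zeros(n_channels, dtype=int)
          for i in range(1, len(derivative)):
              peak_here = derivative[i - 1] > 0 and derivative[i] <= 0
              if peak_here and samples[i] > threshold:
                  channel = min(int(samples[i]), n_channels - 1)
                  spectrum[channel] += 1
          return spectrum

      # Synthetic pulse train: Gaussian pulses of random height on a noisy baseline.
      rng = np.random.default_rng(0)
      signal = rng.normal(0.0, 1.0, 20000)
      for start in range(500, 19500, 400):
          height = rng.uniform(200.0, 3800.0)
          signal[start:start + 50] += height * np.exp(-0.5 * ((np.arange(50) - 25) / 6.0) ** 2)

      print("counts recorded:", build_spectrum(signal).sum())

    A hardware implementation additionally has to cope with pile-up and dead time, which the note above addresses with the microcontroller logic and the blocking network.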

  2. Scleral topography analysed by optical coherence tomography.

    Science.gov (United States)

    Bandlitz, Stefan; Bäumer, Joachim; Conrad, Uwe; Wolffsohn, James

    2017-08-01

    A detailed evaluation of the corneo-scleral profile (CSP) is of particular relevance in soft and scleral lens fitting. The aim of this study was to use optical coherence tomography (OCT) to analyse the profile of the limbal sclera and to evaluate the relationship between central corneal radii, corneal eccentricity and scleral radii. Using OCT (Optos OCT/SLO; Dunfermline, Scotland, UK) the limbal scleral radii (SR) of 30 subjects (11 M, 19 F; mean age 23.8 ± 2.0 SD years) were measured in eight meridians 45° apart. Central corneal radii (CR) and corneal eccentricity (CE) were evaluated using the Oculus Keratograph 4 (Oculus, Wetzlar, Germany). Differences between SR in the meridians and the associations between SR and corneal topography were assessed. Median SR measured along 45° (58.0; interquartile range, 46.8-84.8 mm) was significantly (p […]) […] topography and may provide additional data useful in fitting soft and scleral contact lenses. Copyright © 2017 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  3. Bayesian analyses of seasonal runoff forecasts

    Science.gov (United States)

    Krzysztofowicz, R.; Reese, S.

    1991-12-01

    Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
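
    As a toy illustration of the first product, calibrating a forecast into a posterior distribution can be sketched with a conjugate normal-normal update. The numbers below are invented for illustration and are not taken from the study.

      import math

      # Hypothetical values: climatological prior for seasonal runoff volume and a
      # forecast treated as a noisy measurement of the true volume (units: 10^6 m^3).
      prior_mean, prior_sd = 500.0, 120.0
      forecast, forecast_error_sd = 620.0, 80.0

      # Normal-normal conjugate update: precisions (1/variance) add.
      prior_prec = 1.0 / prior_sd ** 2
      like_prec = 1.0 / forecast_error_sd ** 2
      post_var = 1.0 / (prior_prec + like_prec)
      post_mean = post_var * (prior_prec * prior_mean + like_prec * forecast)
      print(f"posterior mean = {post_mean:.0f}, posterior sd = {math.sqrt(post_var):.0f}")

      # Probability that the seasonal runoff exceeds a decision threshold.
      threshold = 550.0
      z = (threshold - post_mean) / math.sqrt(post_var)
      print(f"P(runoff > {threshold:.0f}) = {0.5 * math.erfc(z / math.sqrt(2.0)):.2f}")

    A real Bayesian Processor of Forecasts estimates the likelihood from historical forecast-observation pairs rather than assuming its parameters, but the filtering idea is the same.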

  4. Analyses of demand response in Denmark

    International Nuclear Information System (INIS)

    Moeller Andersen, F.; Grenaa Jensen, S.; Larsen, Helge V.; Meibom, P.; Ravn, H.; Skytte, K.; Togeby, M.

    2006-10-01

    Due to characteristics of the power system, costs of producing electricity vary considerably over short time intervals. Yet, many consumers do not experience corresponding variations in the price they pay for consuming electricity. The topic of this report is: are consumers willing and able to respond to short-term variations in electricity prices, and if so, what is the social benefit of consumers doing so? Taking Denmark and the Nord Pool market as a case, the report focuses on what is known as short-term consumer flexibility or demand response in the electricity market. With focus on market efficiency, efficient allocation of resources and security of supply, the report describes demand response from a micro-economic perspective and provides empirical observations and case studies. The report aims at evaluating benefits from demand response. However, only elements contributing to an overall value are presented. In addition, the analyses are limited to benefits for society, and costs of obtaining demand response are not considered. (au)

  5. WIND SPEED AND ENERGY POTENTIAL ANALYSES

    Directory of Open Access Journals (Sweden)

    A. TOKGÖZLÜ

    2013-01-01

    This paper provides a case study on the application of wavelet techniques to analyse wind speed and energy (renewable and environmentally friendly energy). Solar and wind are main sources of energy that allow farmers to transfer the kinetic energy captured by a windmill to pumping water, drying crops, heating greenhouses, rural electrification, or cooking. Larger wind turbines (over 1 MW) can pump enough water for small-scale irrigation. This study initiates the data-gathering process for wavelet analyses and examines different scale effects and their role in wind speed and direction variations. The wind data-gathering system is mounted at latitude 37° 50' N, longitude 30° 33' E, at a height of 1200 m above mean sea level on a hill near the Süleyman Demirel University campus. Ten-minute average values of wind speed and direction at two levels (10 m and 30 m above ground level) were recorded by a data logger between July 2001 and February 2002. Wind speed values ranged from 0 m/s to 54 m/s. The annual mean speed is 4.5 m/s at 10 m above ground level. Prevalent wind
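
    A minimal example of the kind of wavelet decomposition described here is sketched below. It assumes the PyWavelets package and uses a synthetic wind-speed series in place of the logger records, so the numbers are purely illustrative.

      import numpy as np
      import pywt  # PyWavelets, assumed to be available

      # Synthetic 10-minute mean wind speeds (m/s) for one month, with a diurnal cycle.
      rng = np.random.default_rng(1)
      hours = np.arange(0.0, 24.0 * 30.0, 1.0 / 6.0)
      wind = 4.5 + 2.0 * np.sin(2.0 * np.pi * hours / 24.0) + rng.gamma(2.0, 0.8, hours.size)

      # Multilevel discrete wavelet decomposition separates the scales of variability.
      coeffs = pywt.wavedec(wind, "db4", level=5)
      approx, details = coeffs[0], coeffs[1:]          # details run from coarsest to finest
      for lvl, d in zip(range(5, 0, -1), details):
          print(f"detail level {lvl}: energy = {np.sum(d ** 2):.1f}")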

  6. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  7. Soil deflation analyses from wind erosion events

    Directory of Open Access Journals (Sweden)

    Lenka Lackóová

    2015-09-01

    There are various methods to assess soil erodibility for wind erosion. This paper focuses on aggregate analysis by a laser particle sizer ANALYSETTE 22 (FRITSCH GmbH), used to determine the size distribution of soil particles detached by wind (deflated particles). Ten soil samples, trapped along the same length of the erosion surface (150-155 m) but at different wind speeds, were analysed. The soil was sampled from a flat, smooth area without vegetation cover or soil crust, not affected by the impact of windbreaks or other barriers, from a depth of at most 2.5 cm. Prior to analysis the samples were prepared according to the relevant specifications. An experiment was also conducted using a device that enables characterisation of the vertical movement of the deflated material. The trapped samples showed no differences in particle size or in the proportions of size fractions at different hourly average wind speeds. It was observed that most of the particles travelling in saltation mode (size 50-500 μm; 58-70% of the samples) moved vertically up to 26 cm above the soil surface. At greater heights, particles moving in suspension mode (floating in the air; size < 100 μm) accounted for up to 90% of the samples. This result suggests that the boundary between the two modes of vertical movement of deflated soil particles lies at about 25 cm above the soil surface.

  8. Genomic analyses of modern dog breeds.

    Science.gov (United States)

    Parker, Heidi G

    2012-02-01

    A rose may be a rose by any other name, but when you call a dog a poodle it becomes a very different animal than if you call it a bulldog. Both the poodle and the bulldog are examples of dog breeds of which there are >400 recognized worldwide. Breed creation has played a significant role in shaping the modern dog from the length of his leg to the cadence of his bark. The selection and line-breeding required to maintain a breed has also reshaped the genome of the dog, resulting in a unique genetic pattern for each breed. The breed-based population structure combined with extensive morphologic variation and shared human environments have made the dog a popular model for mapping both simple and complex traits and diseases. In order to obtain the most benefit from the dog as a genetic system, it is necessary to understand the effect structured breeding has had on the genome of the species. That is best achieved by looking at genomic analyses of the breeds, their histories, and their relationships to each other.

  9. Interim Basis for PCB Sampling and Analyses

    International Nuclear Information System (INIS)

    BANNING, D.L.

    2001-01-01

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substances Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61(c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure that appropriate data will be collected to support management of PCBs, and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G-4, Guidance for the Data Quality Objectives Process (EPA 1994), and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842/Rev.1 A, Vol. IV, Section 4.16 (Banning 1999)

  10. Achieving reasonable conservatism in nuclear safety analyses

    International Nuclear Information System (INIS)

    Jamali, Kamiar

    2015-01-01

    In the absence of methods that explicitly account for uncertainties, seeking reasonable conservatism in nuclear safety analyses can quickly lead to extreme conservatism. The rate of divergence to extreme conservatism is often beyond the expert analysts’ intuitive feeling, but can be demonstrated mathematically. Too much conservatism in addressing the safety of nuclear facilities is not beneficial to society. Using certain properties of lognormal distributions for representation of input parameter uncertainties, example calculations for the risk and consequence of a fictitious facility accident scenario are presented. Results show that there are large differences between the calculated 95th percentiles and the extreme bounding values derived from using all input variables at their upper-bound estimates. Showing the relationship of the mean values to the key parameters of the output distributions, the paper concludes that the mean is the ideal candidate for representation of the value of an uncertain parameter. The mean value is proposed as the metric that is consistent with the concept of reasonable conservatism in nuclear safety analysis, because its value increases towards higher percentiles of the underlying positively skewed distribution with increasing levels of uncertainty. Insensitivity of the results to the actual underlying distributions is briefly demonstrated. - Highlights: • Multiple conservative assumptions can quickly diverge into extreme conservatism. • Mathematics and attractive properties provide basis for wide use of lognormal distribution. • Mean values are ideal candidates for representation of parameter uncertainties. • Mean values are proposed as reasonably conservative estimates of parameter uncertainties
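
    The compounding effect discussed in the abstract is easy to reproduce numerically. The sketch below, with invented parameters, multiplies three independent lognormal inputs and compares the mean and the 95th percentile of the product with the value obtained by setting every input at its own 95th percentile.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 200_000

      # Three hypothetical inputs, each lognormal with median 1 and 95th percentile 3.
      sigma = np.log(3.0) / 1.645
      inputs = rng.lognormal(mean=0.0, sigma=sigma, size=(3, n))
      product = inputs.prod(axis=0)

      print(f"mean of product            : {product.mean():.2f}")
      print(f"95th percentile of product : {np.percentile(product, 95):.2f}")
      print(f"all-upper-bound product    : {3.0 ** 3:.2f}")

    With these invented numbers the bounding product is roughly four times the 95th percentile of the product, illustrating how stacking individually "reasonable" bounds drifts toward extreme conservatism.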

  11. CFD analyses of coolant channel flowfields

    Science.gov (United States)

    Yagley, Jennifer A.; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    The flowfield characteristics in rocket engine coolant channels are analyzed by means of a numerical model. The channels are characterized by large length to diameter ratios, high Reynolds numbers, and asymmetrical heating. At representative flow conditions, the channel length is approximately twice the hydraulic entrance length so that fully developed conditions would be reached for a constant property fluid. For the supercritical hydrogen that is used as the coolant, the strong property variations create significant secondary flows in the cross-plane which have a major influence on the flow and the resulting heat transfer. Comparison of constant and variable property solutions show substantial differences. In addition, the property variations prevent fully developed flow. The density variation accelerates the fluid in the channels increasing the pressure drop without an accompanying increase in heat flux. Analyses of the inlet configuration suggest that side entry from a manifold can affect the development of the velocity profile because of vortices generated as the flow enters the channel. Current work is focused on studying the effects of channel bifurcation on the flow field and the heat transfer characteristics.

  12. Fast and accurate methods for phylogenomic analyses

    Directory of Open Access Journals (Sweden)

    Warnow Tandy

    2011-10-01

    Background: Species phylogenies are not estimated directly, but rather through phylogenetic analyses of different gene datasets. However, true gene trees can differ from the true species tree (and hence from one another) due to biological processes such as horizontal gene transfer, incomplete lineage sorting, and gene duplication and loss, so that no single gene tree is a reliable estimate of the species tree. Several methods have been developed to estimate species trees from estimated gene trees, differing according to the specific algorithmic technique used and the biological model used to explain differences between species and gene trees. Relatively little is known about the relative performance of these methods. Results: We report on a study evaluating several different methods for estimating species trees from sequence datasets, simulating sequence evolution under a complex model including indels (insertions and deletions), substitutions, and incomplete lineage sorting. The most important finding of our study is that some fast and simple methods are nearly as accurate as the most accurate methods, which employ sophisticated statistical methods and are computationally quite intensive. We also observe that methods that explicitly consider errors in the estimated gene trees produce more accurate trees than methods that assume the estimated gene trees are correct. Conclusions: Our study shows that highly accurate estimations of species trees are achievable, even when gene trees differ from each other and from the species tree, and that these estimations can be obtained using fairly simple and computationally tractable methods.

  13. Mediation Analyses in the Real World

    DEFF Research Database (Denmark)

    Lange, Theis; Starkopf, Liis

    2016-01-01

    The paper by Nguyen et al.1 published in this issue of Epidemiology presents a comparison of the recently suggested inverse odds ratio approach for addressing mediation and a more conventional Baron and Kenny-inspired method. Interestingly, the comparison is not done through a discussion of restr… …it simultaneously ensures that the comparison is based on properties which matter in actual applications, and makes the comparison accessible for a broader audience. In a wider context, the choice to stay close to real-life problems mirrors a general trend within the literature on mediation analysis, namely to put… …applications using the inverse odds ratio approach, as it simply has not had enough time to move from theoretical concept to published applied paper, we do expect to be able to judge the willingness of authors and journals to employ the causal inference-based approach to mediation analyses. Our hope…

  14. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
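
    For readers unfamiliar with the overlap metric quoted above, the sketch below shows how a Dice coefficient between two binary label maps can be computed; the two masks here are synthetic stand-ins for the same segmentation produced on two operating systems.

      import numpy as np

      def dice(mask_a, mask_b):
          # Dice coefficient: 2|A ∩ B| / (|A| + |B|), equal to 1 for identical masks.
          a, b = mask_a.astype(bool), mask_b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      rng = np.random.default_rng(7)
      seg_os1 = rng.random((64, 64, 64)) > 0.5
      seg_os2 = seg_os1.copy()
      flip = rng.random(seg_os2.shape) < 0.02       # perturb 2% of voxels
      seg_os2[flip] = ~seg_os2[flip]

      print(f"Dice = {dice(seg_os1, seg_os2):.3f}")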

  15. Activation analyses for different fusion structural alloys

    International Nuclear Information System (INIS)

    Attaya, H.; Smith, D.

    1991-01-01

    The leading candidate structural materials, viz., the vanadium alloys, the nickel- or manganese-stabilized austenitic steels, and the ferritic steels, are analysed in terms of their induced activation in the TPSS fusion power reactor. The TPSS reactor has 1950 MW fusion power and inboard and outboard average neutron wall loadings of 3.75 and 5.35 MW/m², respectively. The results show that, after one year of continuous operation, the vanadium alloys have the least radioactivity at reactor shutdown. The maximum difference between the induced radioactivity in the vanadium alloys and in the other iron-based alloys occurs at about 10 years after reactor shutdown. At this time, the total reactor radioactivity, using the vanadium alloys, is about two orders of magnitude less than the total reactor radioactivity utilizing any other alloy. The difference is even larger in the first wall: the vanadium FW activation is three orders of magnitude less than the FW activation of the other alloys. 2 refs., 7 figs

  16. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report summarizes the results of the project ''Statistical analyses of extreme food habits'', which was commissioned by the National Office for Radiation Protection as a contribution to the amendment of the ''General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposure caused by the emission of radioactive substances from facilities of nuclear technology''. Its aim is to determine whether the calculation of the radiation dose received through food intake by 95% of the population, as planned in a provisional draft, overestimates the true exposure, and if so, by how much. The existence of this overestimation could be demonstrated, but its magnitude could only be estimated roughly. To identify its real extent, it would be necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the consumption amounts of different food groups influence each other and which relationships between these amounts should be taken into account in order to estimate the radiation exposure as precisely as possible. (orig.) [de

  17. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation under routine working conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carryover studies with the reagent and sample pipetters show some cross-contamination in the iron assay.

  18. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    International Nuclear Information System (INIS)

    S. Tsai

    2005-01-01

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2

  19. Genomic analyses of the CAM plant pineapple.

    Science.gov (United States)

    Zhang, Jisen; Liu, Juan; Ming, Ray

    2014-07-01

    The innovation of crassulacean acid metabolism (CAM) photosynthesis in arid and/or low CO2 conditions is a remarkable case of adaptation in flowering plants. As the most important crop that utilizes CAM photosynthesis, pineapple has had genetic and genomic resources developed over many years. Genetic diversity studies using various types of DNA markers led to the reclassification of the two genera Ananas and Pseudananas and nine species into one genus Ananas and two species, A. comosus and A. macrodontes, with five botanical varieties in A. comosus. Five genetic maps have been constructed using F1 or F2 populations, and high-density genetic maps generated by genotype sequencing are essential resources for sequencing and assembling the pineapple genome and for marker-assisted selection. There are abundant expressed sequence tag resources but limited genomic sequences in pineapple. Genes involved in the CAM pathway have been analysed in several CAM plants, but only a few of them are from pineapple. A reference genome of pineapple is being generated and will accelerate genetic and genomic research in this major CAM crop. This reference genome of pineapple provides the foundation for studying the origin and regulatory mechanism of CAM photosynthesis, and the opportunity to evaluate the classification of Ananas species and botanical cultivars. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  20. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) one using the finite element technique to solve the two-dimensional steady-state equations of groundwater flow and pollution transport, and (2) a first-order reliability method code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were combined into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value, and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP can be used as a tool for assessment purposes and risk analyses, for instance to assess the efficiency of a proposed remediation technique or to study the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
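
    The exceedance probabilities that PAGAP computes with a first-order reliability method can also be approximated by brute-force Monte Carlo sampling. The sketch below is not PAGAP: it uses an invented one-dimensional transport expression and invented parameter distributions purely to illustrate the idea of an exceedance probability.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000

      # Hypothetical plume model C(x) = C0 * exp(-lambda * x / v) with uncertain
      # source concentration C0 (mg/L), decay rate lambda (1/d) and velocity v (m/d).
      C0 = rng.lognormal(np.log(80.0), 0.3, n)
      lam = rng.lognormal(np.log(0.01), 0.4, n)
      v = rng.lognormal(np.log(0.5), 0.25, n)
      x, limit = 200.0, 5.0

      conc = C0 * np.exp(-lam * x / v)
      print(f"P(C > {limit} mg/L at x = {x:.0f} m) = {np.mean(conc > limit):.3f}")

    A FORM calculation replaces the sampling with a search for the most probable failure point, which is far cheaper when each model evaluation is a full finite element run.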

  1. System for analysing sickness absenteeism in Poland.

    Science.gov (United States)

    Indulski, J A; Szubert, Z

    1997-01-01

    The National System of Sickness Absenteeism Statistics has been functioning in Poland since 1977 as part of the national health statistics. The system is based on a 15-percent random sample of copies of certificates of temporary incapacity for work issued by all health care units and authorised private medical practitioners. A certificate of temporary incapacity for work is received by every insured employee who is compelled to stop working due to sickness, accident, or the necessity to care for a sick member of his/her family. The certificate is required from the first day of sickness. Analyses of disease- and accident-related sickness absenteeism carried out each year in Poland within the statistical system lead to the following main conclusions: 1. Diseases of the musculoskeletal and peripheral nervous systems, which together account for one third of total sickness absenteeism, are a major health problem of the working population in Poland. During the past five years, incapacity for work caused by these diseases in males increased 2.5 times. 2. Circulatory diseases, in particular arterial hypertension and ischaemic heart disease (41% and 27% of sickness days, respectively), create an essential health problem among males of productive age, especially in the 40 and older age group. Absenteeism due to these diseases has increased in males more than two times.

  2. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Background: Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results: Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification known as ESPERR. Conclusion: Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  3. Thermomagnetic Analyses to Test Concrete Stability

    Science.gov (United States)

    Geiss, C. E.; Gourley, J. R.

    2017-12-01

    Over the past decades, pyrrhotite-containing aggregate has been used in concrete to build basements and foundations in central Connecticut. The sulphur in the pyrrhotite reacts to form several secondary minerals, and the associated changes in volume lead to a loss of structural integrity. As a result, hundreds of homes have been rendered worthless, as remediation costs often exceed the value of the homes, and the value of many other homes constructed during the same period is in question because concrete provenance and potential future structural issues are unknown. While minor abundances of pyrrhotite are difficult to detect or quantify by traditional means, the mineral is easily identified through its magnetic properties. All concrete samples from affected homes show a clear increase in magnetic susceptibility above 220°C, due to the γ-transition of Fe9S10 [1], and a clearly defined Curie temperature near 320°C for Fe7S8. X-ray analyses confirm the presence of pyrrhotite and ettringite in these samples. Synthetic mixtures of commercially available concrete and pyrrhotite show that the method is semiquantitative but needs to be calibrated for specific pyrrhotite mineralogies. 1. Schwarz, E.J., Magnetic properties of pyrrhotite and their use in applied geology and geophysics. Geological Survey of Canada, Ottawa, ON, Canada, 1975.

  4. Validating experimental and theoretical Langmuir probe analyses

    Science.gov (United States)

    Pilling, L. S.; Carnegie, D. A.

    2007-08-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital-motion limited (OML) is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from the radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.

  5. Theoretical and computational analyses of LNG evaporator

    Science.gov (United States)

    Chidambaram, Palani Kumar; Jo, Yang Myung; Kim, Heuy Dong

    2017-04-01

    Theoretical and numerical analyses of the fluid flow and heat transfer inside an LNG evaporator are conducted in this work. Methane is used instead of LNG as the operating fluid, because methane constitutes over 80% of natural gas. The analytical calculations are performed using simple mass and energy balance equations and are used to assess the pressure and temperature variations in the steam tube. Multiphase numerical simulations are performed by solving the governing equations (the basic flow equations of continuity, momentum and energy) in a portion of the evaporator domain consisting of a single steam pipe. The flow equations are solved along with equations of species transport. Multiphase behaviour is modeled using the VOF method. Liquid methane is the primary phase; it vaporizes into the secondary phase, gaseous methane. Steam is another secondary phase, which flows through the heating coils. Turbulence is modeled by a two-equation turbulence model. The theoretical and numerical predictions are seen to match well with each other. Further parametric studies are planned based on the current research.

  6. Does a 20-week aerobic exercise training programme increase our capabilities to buffer real-life stressors? A randomized, controlled trial using ambulatory assessment.

    Science.gov (United States)

    von Haaren, Birte; Ottenbacher, Joerg; Muenz, Julia; Neumann, Rainer; Boes, Klaus; Ebner-Priemer, Ulrich

    2016-02-01

    The cross-stressor adaptation hypothesis suggests that regular exercise leads to adaptations in the stress response systems that reduce physiological responses to psychological stressors. Even though an exercise intervention to buffer the detrimental effects of psychological stressors on health might be of utmost importance, empirical evidence is mixed. This may be explained by the use of cross-sectional designs and non-personally relevant stressors. Using a randomized controlled trial, we hypothesized that a 20-week aerobic exercise training programme reduces physiological stress responses to psychological real-life stressors in sedentary students. Sixty-one students were randomized to either a control group or an exercise training group. The academic examination period (end of the semester) served as a real-life stressor. We used ambulatory assessment methods to assess physiological stress reactivity of the autonomic nervous system (heart rate variability: LF/HF, RMSSD), physical activity and perceived stress during 2 days of everyday life, and multilevel models for data analyses. Aerobic capacity (VO2max) was assessed pre- and post-intervention via cardiopulmonary exercise testing to analyze the effectiveness of the intervention. During real-life stressors, the exercise training group showed significantly reduced LF/HF (β = -0.15, t = -2.59, p = .01) and increased RMSSD (β = 0.15, t = 2.34, p = .02) compared to the control group. Using a randomized controlled trial and a real-life stressor, we could show that exercise appears to be a useful preventive strategy to buffer the effects of stress on the autonomic nervous system, which might otherwise result in detrimental health outcomes.

  7. SRM Internal Flow Tests and Computational Fluid Dynamic Analysis. Volume 2; CFD RSRM Full-Scale Analyses

    Science.gov (United States)

    2001-01-01

    This document presents the full-scale analyses of the CFD RSRM. The RSRM model was developed with a 20-second burn time. The following are presented as part of the full-scale analyses: (1) RSRM embedded inclusion analysis; (2) RSRM igniter nozzle design analysis; (3) Nozzle Joint 4 erosion anomaly; (4) RSRM full motor port slag accumulation analysis; (5) RSRM motor analysis of two-phase flow in the aft segment/submerged nozzle region; (6) completion of the 3-D analysis of the hot air nozzle manifold; (7) Bates motor distributed combustion test case; and (8) three-dimensional polysulfide bump analysis.

  8. Insights from Severe Accident Analyses for Verification of VVER SAMG

    Energy Technology Data Exchange (ETDEWEB)

    Gaikwad, A. J.; Rao, R. S.; Gupta, A.; Obaidurrahaman, K., E-mail: avinashg@aerb.gov.in [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai (India)

    2014-10-15

    Severe accident analyses of the simultaneous rupture of all four steam lines (case a), simultaneous occurrence of a LOCA with SBO (case b) and station blackout (case c) were performed with the computer code ASTEC V2r2 for a typical VVER-1000. The results obtained will be used for verification of severe accident provisions and Severe Accident Management Guidelines (SAMG). Auxiliary feed water and emergency core cooling systems are modelled as boundary conditions. The ICARE module is used to simulate the reactor core, which is divided into five radial regions by grouping similarly powered fuel assemblies together. Initially, the CESAR module computes the thermal hydraulics in the primary and secondary circuits. As soon as core uncovery begins, the ICARE module is actuated based on certain parameters; from then on, ICARE computes the thermal hydraulics in the core, bypass, downcomer and lower plenum, while CESAR handles the remaining components in the primary and secondary loops. The CPA module is used to simulate the containment and to predict the thermal-hydraulic and hydrogen behaviour in the containment. The accident sequences were selected so that they cover low/high pressure and slow/fast core damage progression: slow-progression events at high pressure and fast accident progression at low primary pressure. Analysis was also carried out for the case of SBO with opening of the PORVs when the core exit temperature exceeds a certain value, as part of the SAMG. A time-step sensitivity study was carried out for LOCA with SBO. In general, the trends and magnitudes of the parameters are as expected. The key results of the above analyses are presented in this paper. (author)

  9. Castor-1C spent fuel storage cask decay heat, heat transfer, and shielding analyses

    International Nuclear Information System (INIS)

    Rector, D.R.; McCann, R.A.; Jenquin, U.P.; Heeb, C.M.; Creer, J.M.; Wheeler, C.L.

    1986-12-01

    This report documents the decay heat, heat transfer, and shielding analyses of the Gesellschaft fuer Nuklear Services (GNS) CASTOR-1C cask used in a spent fuel storage demonstration performed at Preussen Elektra's Wurgassen nuclear power plant. The demonstration was performed between March 1982 and January 1984, and resulted in cask and fuel temperature data and cask exterior surface gamma-ray and neutron radiation dose rate measurements. The purpose of the analyses reported here was to evaluate decay heat, heat transfer, and shielding computer codes. The analyses consisted of (1) performing pre-look predictions (predictions performed before the analysts were provided the test data), (2) comparing ORIGEN2 (decay heat), COBRA-SFS and HYDRA (heat transfer), and QAD and DOT (shielding) results to data, and (3) performing post-test analyses if appropriate. Even though two heat transfer codes were used to predict CASTOR-1C cask test data, no attempt was made to compare the two codes. The codes are being evaluated with other test data (single-assembly data and other cask data), and to compare the codes based on one set of data may be premature and lead to erroneous conclusions

  10. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system significantly. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studie...
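
    Of the three schemes listed, the Smith predictor is the easiest to sketch in a few lines. The discrete-time simulation below uses an invented first-order plant with dead time and invented PI gains; it only illustrates the structure (feed back the undelayed model output plus the model-plant mismatch), not any design from the book.

      import numpy as np

      # Plant: y[k+1] = a*y[k] + b*u[k-d]  (first order with d samples of dead time)
      a, b, d = 0.9, 0.1, 10
      Kp, Ki = 2.0, 0.4
      N, setpoint = 200, 1.0

      y = np.zeros(N); y_model = np.zeros(N); y_model_delayed = np.zeros(N)
      u = np.zeros(N); integral = 0.0

      for k in range(N - 1):
          # Smith predictor: undelayed model output + (measured plant - delayed model).
          feedback = y_model[k] + (y[k] - y_model_delayed[k])
          error = setpoint - feedback
          integral += error
          u[k] = Kp * error + Ki * integral

          u_delayed = u[k - d] if k >= d else 0.0
          y[k + 1] = a * y[k] + b * u_delayed                          # real plant
          y_model[k + 1] = a * y_model[k] + b * u[k]                   # model, no delay
          y_model_delayed[k + 1] = a * y_model_delayed[k] + b * u_delayed

      print(f"output after {N} steps: {y[-1]:.3f} (setpoint {setpoint})")

    When the internal model is accurate, the controller effectively acts on a delay-free plant, which is why the scheme tolerates much larger loop gains than a conventional feedback loop with the same dead time.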

  11. Bench top and portable mineral analysers, borehole core analysers and in situ borehole logging

    International Nuclear Information System (INIS)

    Howarth, W.J.; Watt, J.S.

    1982-01-01

    Bench top and portable mineral analysers are usually based on balanced filter techniques using scintillation detectors or on low resolution proportional detectors. The application of radioisotope x-ray techniques to in situ borehole logging is increasing, and is particularly suited for logging for tin and higher atomic number elements

  12. Integrated Field Analyses of Thermal Springs

    Science.gov (United States)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers, through the SURE internship offered by the Southern California Earthquake Center (SCEC), have examined thermal springs in southern Idaho and northern Utah as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (tmCogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed and chilled and delivered to a water lab within 12 hours. Temperatures are continuously monitored with Solinst Levelogger Juniors. Through a partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms, which were analyzed using single-colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. We will soon collect gas samples at sites that show signs of gas, using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw gas samples using the Giggenbach flask, transferring samples to glass blood-sample tubes, replacing the NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor to keep mud from contaminating the equipment. The use of raster

  13. Transient Seepage for Levee Engineering Analyses

    Science.gov (United States)

    Tracy, F. T.

    2017-12-01

    Historically, steady-state seepage analyses have been a key tool for designing levees by practicing engineers. However, with the advances in computer modeling, transient seepage analysis has become a potentially viable tool. A complication is that the levees usually have partially saturated flow, and this is significantly more complicated in transient flow. This poster illustrates four elements of our research in partially saturated flow relating to the use of transient seepage for levee design: (1) a comparison of results from SEEP2D, SEEP/W, and SLIDE for a generic levee cross section common to the southeastern United States; (2) the results of a sensitivity study of varying saturated hydraulic conductivity, the volumetric water content function (as represented by van Genuchten), and volumetric compressibility; (3) a comparison of when soils do and do not exhibit hysteresis, and (4) a description of proper and improper use of transient seepage in levee design. The variables considered for the sensitivity and hysteresis studies are pore pressure beneath the confining layer at the toe, the flow rate through the levee system, and a levee saturation coefficient varying between 0 and 1. Getting results for SEEP2D, SEEP/W, and SLIDE to match proved more difficult than expected. After some effort, the results matched reasonably well. Differences in results were caused by various factors, including bugs, different finite element meshes, different numerical formulations of the system of nonlinear equations to be solved, and differences in convergence criteria. Varying volumetric compressibility affected the above test variables the most. The levee saturation coefficient was most affected by the use of hysteresis. The improper use of pore pressures from a transient finite element seepage solution imported into a slope stability computation was found to be the most grievous mistake in using transient seepage in the design of levees.
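
    The van Genuchten water-content function mentioned in the sensitivity study has a closed form that is easy to evaluate. The sketch below uses illustrative parameter values, not those from the levee cross sections studied.

      import numpy as np

      def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
          # Volumetric water content for pressure head h (negative when unsaturated),
          # after van Genuchten (1980); m = 1 - 1/n.
          m = 1.0 - 1.0 / n
          h = np.asarray(h, dtype=float)
          Se = np.where(h < 0.0, (1.0 + (alpha * np.abs(h)) ** n) ** (-m), 1.0)
          return theta_r + (theta_s - theta_r) * Se

      heads = np.array([0.0, -0.1, -0.5, -1.0, -5.0, -10.0])   # pressure head, m
      theta = van_genuchten_theta(heads, theta_r=0.05, theta_s=0.40, alpha=2.0, n=1.6)
      for h, t in zip(heads, theta):
          print(f"h = {h:6.1f} m  ->  theta = {t:.3f}")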

  14. Summary of the analyses for recovery factors

    Science.gov (United States)

    Verma, Mahendra K.

    2017-07-17

    Introduction: In order to determine the hydrocarbon potential of oil reservoirs within the U.S. sedimentary basins for which the carbon dioxide enhanced oil recovery (CO2-EOR) process has been considered suitable, the CO2 Prophet model was chosen by the U.S. Geological Survey (USGS) to be the primary source for estimating recovery-factor values for individual reservoirs. The choice was made because of the model's reliability and the ease with which it can be used to assess a large number of reservoirs. The other two approaches—the empirical decline curve analysis (DCA) method and a review of published literature on CO2-EOR projects—were deployed to verify the results of the CO2 Prophet model. This chapter discusses the results from CO2 Prophet (chapter B, by Emil D. Attanasi, this report) and compares them with results from decline curve analysis (chapter C, by Hossein Jahediesfanjani) and those reported in the literature for selected reservoirs with adequate data for analyses (chapter D, by Ricardo A. Olea). To estimate the technically recoverable hydrocarbon potential for oil reservoirs where CO2-EOR has been applied, two of the three approaches—CO2 Prophet modeling and DCA—do not include analysis of economic factors, while the third approach—review of published literature—implicitly includes economics. For selected reservoirs, DCA has provided estimates of the technically recoverable hydrocarbon volumes, which, in combination with calculated amounts of original oil in place (OOIP), helped establish incremental CO2-EOR recovery factors for individual reservoirs. The review of published technical papers and reports has provided substantial information on recovery factors for 70 CO2-EOR projects that are either commercially profitable or classified as pilot tests. When comparing the results, it is important to bear in mind the differences and limitations of these three approaches.
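
    The decline curve analysis used as a cross-check is, at its core, an extrapolation of an Arps rate-time relation. The sketch below evaluates a hyperbolic Arps decline for a hypothetical well; the initial rate, decline rate and b-exponent are invented and are not values from the USGS assessment.

      import numpy as np

      def arps_rate(t, qi, Di, b):
          # Arps decline: exponential when b = 0, hyperbolic otherwise.
          t = np.asarray(t, dtype=float)
          if b == 0.0:
              return qi * np.exp(-Di * t)
          return qi / (1.0 + b * Di * t) ** (1.0 / b)

      t_years = np.linspace(0.0, 20.0, 241)
      q = arps_rate(t_years, qi=1000.0, Di=0.30, b=0.5)          # bbl/d, 1/yr

      # Cumulative production by trapezoidal integration (convert years to days).
      dt = np.diff(t_years)
      cum_bbl = np.sum(0.5 * (q[:-1] + q[1:]) * dt) * 365.25
      print(f"rate after 20 yr: {q[-1]:.0f} bbl/d, cumulative: {cum_bbl / 1e6:.2f} MMbbl")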

  15. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters a and b characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and that a new parameter, â, determined by the discharge-normalized power function C = â(Q/Q_GM)^b, where Q_GM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
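
    The discharge-normalized fit advocated above reduces to an ordinary least-squares regression in log-log space. The sketch below generates a synthetic discharge-concentration record and recovers b and â; all values are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(5)

      # Synthetic paired discharge Q (m^3/s) and suspended-sediment concentration C (mg/L).
      Q = rng.lognormal(mean=3.0, sigma=0.8, size=300)
      Q_gm = np.exp(np.mean(np.log(Q)))                 # geometric mean of sampled Q
      C = 120.0 * (Q / Q_gm) ** 1.4 * rng.lognormal(0.0, 0.3, Q.size)

      # Fit C = a_hat * (Q / Q_gm)^b by linear regression of log(C) on log(Q / Q_gm).
      b, log_a_hat = np.polyfit(np.log(Q / Q_gm), np.log(C), 1)
      print(f"fitted b = {b:.2f}")
      print(f"fitted a-hat = {np.exp(log_a_hat):.1f} mg/L (concentration at Q = Q_gm)")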

  16. Analyses of Hypomethylated Oil Palm Gene Space

    Science.gov (United States)

    Jayanthi, Nagappan; Mohd-Amin, Ab Halim; Azizi, Norazah; Chan, Kuang-Lim; Maqbool, Nauman J.; Maclean, Paul; Brauning, Rudi; McCulloch, Alan; Moraga, Roger; Ong-Abdullah, Meilina; Singh, Rajinder

    2014-01-01

    Demand for palm oil has been increasing by an average of ∼8% over the past decade, and palm oil currently accounts for about 59% of the world's vegetable oil market. This drives the need to increase palm oil production. Nevertheless, given the increasing need for sustainable production, it is imperative to increase productivity rather than the area cultivated. Studies on the oil palm genome are essential to help identify genes or markers that are associated with important processes or traits, such as flowering, yield and disease resistance. To achieve this, 294,115 and 150,744 sequences from the hypomethylated or gene-rich regions of the Elaeis guineensis and E. oleifera genomes were sequenced and assembled into contigs. An additional 16,427 shotgun sequences and 176 bacterial artificial chromosomes (BAC) were also generated to check the quality of the libraries constructed. Comparison of these sequences revealed that although the methylation-filtered libraries were sequenced at low coverage, they still tagged at least 66% of the RefSeq-supported genes in the BACs and had a filtration power of at least 2.0. A total of 33,752 microsatellites and 40,820 high-quality single nucleotide polymorphism (SNP) markers were identified. These represent the most comprehensive collection of microsatellites and SNPs to date and would be an important resource for genetic mapping and association studies. The gene models predicted from the assembled contigs were mined for genes of interest, and 242, 65 and 14 oil palm transcription factors, resistance genes and miRNAs were identified, respectively. Examples of the transcription factors tagged include those associated with floral development and tissue culture, such as homeodomain proteins, MADS, Squamosa and Apetala2. The E. guineensis and E. oleifera hypomethylated sequences provide an important resource for understanding the molecular mechanisms associated with important agronomic traits in oil palm. PMID:24497974

  17. Predictive modelling of noise level generated during sawing of rocks

    Indian Academy of Sciences (India)

    This paper presents an experimental and statistical study of the noise level generated during sawing of rocks by circular diamond sawblades. The influence of the operating variables and rock properties on the noise level is investigated and analysed. Statistical analyses are then employed and models are built for the prediction of ...

  18. Connecting clinical and actuarial prediction with rule-based methods

    NARCIS (Netherlands)

    Fokkema, M.; Smits, N.; Kelderman, H.; Penninx, B.W.J.H.

    2015-01-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction

  19. Spatial Ensemble Postprocessing of Precipitation Forecasts Using High Resolution Analyses

    Science.gov (United States)

    Lang, Moritz N.; Schicker, Irene; Kann, Alexander; Wang, Yong

    2017-04-01

    Ensemble prediction systems are designed to account for errors or uncertainties in the initial and boundary conditions, imperfect parameterizations, etc. However, due to sampling errors and underestimation of the model errors, these ensemble forecasts tend to be underdispersive and to lack both reliability and sharpness. To overcome such limitations, statistical postprocessing methods are commonly applied to these forecasts. In this study, a full-distributional spatial postprocessing method is applied to short-range precipitation forecasts over Austria using Standardized Anomaly Model Output Statistics (SAMOS). Following Stauffer et al. (2016), observation and forecast fields are transformed into standardized anomalies by subtracting a site-specific climatological mean and dividing by the climatological standard deviation. Because only a single regression model has to be fitted for the whole domain, the SAMOS framework provides a computationally inexpensive way to create operationally calibrated probabilistic forecasts for any arbitrary location or for all grid points in the domain simultaneously. Taking advantage of the INCA system (Integrated Nowcasting through Comprehensive Analysis), high-resolution analyses are used for the computation of the observed climatology and for model training. The INCA system operationally combines station measurements and remote sensing data into real-time objective analysis fields at 1 km horizontal resolution and 1 h temporal resolution. The precipitation forecast used in this study is obtained from a limited-area model ensemble prediction system also operated by ZAMG. The so-called ALADIN-LAEF provides, by applying a multi-physics approach, a 17-member forecast at a horizontal resolution of 10.9 km and a temporal resolution of 1 hour. The SAMOS approach applied here statistically combines the in-house high-resolution analysis and ensemble prediction system. The station-based validation of 6-hour precipitation sums
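
    The standardized-anomaly step at the heart of SAMOS can be sketched as follows. This toy version fits a single mean regression over all grid points on synthetic data; the operational method fits a full (censored, heteroscedastic) predictive distribution for precipitation, which is not reproduced here.

      import numpy as np

      rng = np.random.default_rng(11)
      n_points, n_days = 500, 400

      # Synthetic analyses ("truth") and ensemble-mean forecasts at many grid points.
      clim_mean = rng.uniform(0.5, 4.0, n_points)[:, None]
      clim_sd = rng.uniform(0.5, 2.0, n_points)[:, None]
      obs = clim_mean + clim_sd * rng.normal(size=(n_points, n_days))
      fcst = obs + rng.normal(0.3, 0.8, size=(n_points, n_days))   # biased, noisy

      # Standardize both fields site by site, then fit ONE regression for the domain.
      obs_anom = (obs - clim_mean) / clim_sd
      fcst_anom = (fcst - clim_mean) / clim_sd
      slope, intercept = np.polyfit(fcst_anom.ravel(), obs_anom.ravel(), 1)

      # Calibrated forecast, transformed back to physical units at every site.
      calibrated = (intercept + slope * fcst_anom) * clim_sd + clim_mean
      print(f"RMSE raw        : {np.sqrt(np.mean((fcst - obs) ** 2)):.2f}")
      print(f"RMSE calibrated : {np.sqrt(np.mean((calibrated - obs) ** 2)):.2f}")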

  20. Predictable or not predictable? The MOV question

    International Nuclear Information System (INIS)

    Thibault, C.L.; Matzkiw, J.N.; Anderson, J.W.; Kessler, D.W.

    1994-01-01

    Over the past 8 years, the nuclear industry has struggled to understand the dynamic phenomena experienced during motor-operated valve (MOV) operation under differing flow conditions. For some valves and designs, their operational functionality has been found to be predictable; for others, unpredictable. Although much has been accomplished over this period of time, especially on modeling valve dynamics, the unpredictability of many valves and designs still exists. A few valve manufacturers are focusing on improving design and fabrication techniques to enhance product reliability and predictability. However, this approach does not address these issues for installed and unpredictable valves. This paper presents some of the more promising techniques that Wyle Laboratories has explored with potential for transforming unpredictable valves into predictable valves and for retrofitting installed MOVs. These techniques include optimized valve tolerancing, surrogate material evaluation, and enhanced surface treatments

  1. A systematic review of the quality and impact of anxiety disorder meta-analyses.

    Science.gov (United States)

    Ipser, Jonathan C; Stein, Dan J

    2009-08-01

    Meta-analyses are seen as representing the pinnacle of a hierarchy of evidence used to inform clinical practice. Therefore, the potential importance of differences in the rigor with which they are conducted and reported warrants consideration. In this review, we use standardized instruments to describe the scientific and reporting quality of meta-analyses of randomized controlled trials of the treatment of anxiety disorders. We also use traditional and novel metrics of article impact to assess the influence of meta-analyses across a range of research fields in the anxiety disorders. Overall, although the meta-analyses that we examined had some flaws, their quality of reporting was generally acceptable. Neither the scientific nor reporting quality of the meta-analyses was predicted by any of the impact metrics. The finding that treatment meta-analyses were cited less frequently than quantitative reviews of studies in current "hot spots" of research (ie, genetics, imaging) points to the multifactorial nature of citation patterns. A list of the meta-analyses included in this review is available on an evidence-based website of anxiety and trauma-related disorders.

  2. DMINDA: an integrated web server for DNA motif identification and analyses.

    Science.gov (United States)

    Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying

    2014-07-01

    DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important to elucidation of the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences along with statistical scores for the predicted motifs derived based on information extracted from a control set, (ii) scanning motif instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
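
    The motif-scanning function described above can be pictured with a generic position weight matrix (PWM) scan. The sketch below is not DMINDA's implementation; the motif, background frequencies and promoter sequence are invented for illustration only.

        # Generic PWM scan (not DMINDA's code): score every window of a promoter
        # sequence against a motif's log-odds matrix and report the best hits.
        import math

        background = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}

        # Hypothetical 4-column motif given as per-position base probabilities.
        pwm = [
            {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
            {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
            {"A": 0.1, "C": 0.7, "G": 0.1, "T": 0.1},
            {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7},
        ]

        def log_odds(window):
            # Sum of per-position log2(motif probability / background probability).
            return sum(math.log2(col[b] / background[b]) for col, b in zip(pwm, window))

        sequence = "TTAGCTAGCTAAGCTTAGCATGCA"
        hits = sorted(
            ((i, sequence[i:i + len(pwm)], log_odds(sequence[i:i + len(pwm)]))
             for i in range(len(sequence) - len(pwm) + 1)),
            key=lambda h: h[2], reverse=True)

        for pos, site, score in hits[:3]:
            print(f"pos {pos:2d}  site {site}  log-odds {score:+.2f}")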

  3. Microstructure Evolution and Flow Stress Model of a 20Mn5 Hollow Steel Ingot during Hot Compression.

    Science.gov (United States)

    Liu, Min; Ma, Qing-Xian; Luo, Jian-Bin

    2018-03-21

    20Mn5 steel is widely used in the manufacture of heavy hydro-generator shafts due to its good strength, toughness and wear resistance. However, the hot deformation and recrystallization behaviors of 20Mn5 steel compressed at high temperature have not previously been studied. In this study, hot compression experiments at temperatures of 850-1200 °C and strain rates of 0.01-1 s−1 are conducted using a Gleeble thermo-mechanical simulation machine, and the flow stress curves and microstructures after hot compression are obtained. The effects of temperature and strain rate on the microstructure are analyzed. Based on the classical stress-dislocation relation and the kinetics of dynamic recrystallization, a two-stage constitutive model is developed to predict the flow stress of 20Mn5 steel. Comparisons show that the predicted flow stress values are in good agreement with the experimental values, which indicates that the proposed constitutive model is reliable and can be used for numerical simulation of hot forging of 20Mn5 hollow steel ingots.
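
    The abstract does not reproduce the fitted equations, but two-stage models of this kind typically combine a stress-dislocation (work hardening plus dynamic recovery) expression with Avrami-type dynamic recrystallization kinetics. The Python sketch below shows that generic structure; every coefficient is a hypothetical placeholder, not the calibrated 20Mn5 constants.

        # Hedged sketch of a generic two-stage flow-stress model of the type the
        # abstract describes.  All coefficients are invented, NOT the fitted
        # 20Mn5 constants.
        import numpy as np

        def flow_stress(strain, sigma_0, sigma_sat, sigma_ss, omega,
                        eps_c, eps_star, k_d=2.0, n_d=1.5):
            # Stage 1: work hardening / dynamic recovery from the
            # stress-dislocation relation (saturates at sigma_sat).
            sigma_wh = np.sqrt(sigma_sat**2 + (sigma_0**2 - sigma_sat**2)
                               * np.exp(-omega * strain))
            # Stage 2: Avrami-type dynamic recrystallization beyond the critical
            # strain eps_c softens the flow stress towards sigma_ss.
            d = np.clip(strain - eps_c, 0.0, None) / eps_star
            x_drx = 1.0 - np.exp(-k_d * d**n_d)
            return sigma_wh - (sigma_sat - sigma_ss) * x_drx

        eps = np.linspace(0.0, 0.8, 9)
        print(np.round(flow_stress(eps, sigma_0=30.0, sigma_sat=120.0,
                                   sigma_ss=95.0, omega=8.0,
                                   eps_c=0.15, eps_star=0.25), 1))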

  4. Vibro-spring particle size distribution analyser

    International Nuclear Information System (INIS)

    Patel, Ketan Shantilal

    2002-01-01

    This thesis describes the design and development of an automated pre-production particle size distribution analyser for particles in the 20 - 2000 μm size range. This work is a follow-up to the vibro-spring particle sizer reported by Shaeri. In its most basic form, the instrument comprises a horizontally held closed coil helical spring that is partly filled with the test powder and sinusoidally vibrated in the transverse direction. Particle size distribution data are obtained by stretching the spring to known lengths and measuring the mass of the powder discharged from the spring's coils. The size of the particles, on the other hand, is determined from the spring 'intercoil' distance. The instrument developed by Shaeri had limited use due to its inability to measure sample mass directly. For the device reported here, modifications are made to the original configuration to establish means of direct sample mass measurement. The feasibility of techniques for measuring the mass of powder retained within the spring is investigated in detail. Initially, the measurement of mass is executed in-situ from the vibration characteristics based on the spring's first harmonic resonant frequency. This method is often erratic and unreliable due to particle-particle and particle-spring wall interactions and spring bending. A much more successful alternative is found in a more complicated arrangement in which the spring forms part of a stiff cantilever system pivoted along its main axis. Here, the sample mass is determined in the 'static mode' by monitoring the cantilever beam's deflection following the deliberate termination of vibration. The system performance has been optimised through variations in the mechanical design of the key components and the operating procedure, as well as by taking into account the effect of changes in the ambient temperature on the system's response. The thesis also describes the design and development of the ancillary mechanisms. These include the pneumatic
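
    The 'static mode' measurement described above reduces to a stiffness calibration: known masses give a force-deflection slope, and an unknown sample mass then follows from the extra static deflection. The sketch below illustrates the arithmetic with invented calibration data; it is not the instrument's actual processing.

        # Illustrative sketch of a static-mode mass measurement: calibrate the
        # pivoted cantilever's effective stiffness with known masses, then infer
        # an unknown sample mass from the extra static deflection.
        g = 9.81  # m/s^2

        # Calibration: deflections (mm) measured for known added masses (g) - invented.
        cal_masses_g   = [0.0, 5.0, 10.0, 20.0]
        cal_deflect_mm = [0.00, 0.42, 0.85, 1.69]

        # Effective stiffness from a least-squares slope through the origin (N/m).
        num = sum(m * 1e-3 * g * d * 1e-3 for m, d in zip(cal_masses_g, cal_deflect_mm))
        den = sum((d * 1e-3) ** 2 for d in cal_deflect_mm)
        k = num / den

        def mass_from_deflection(deflection_mm):
            """Sample mass (g) inferred from the extra static deflection."""
            return k * (deflection_mm * 1e-3) / g * 1e3

        print(f"k = {k:.1f} N/m, 1.2 mm deflection -> {mass_from_deflection(1.2):.1f} g")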

  5. Kuosheng Mark III containment analyses using GOTHIC

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey

    2013-10-15

    Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response analysis in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis calculates the drywell peak pressure and temperature, which occur in the early stage of the LOCAs. The long-term analysis calculates the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain the integrity after

  6. YALINA Booster subcritical assembly modeling and analyses

    International Nuclear Information System (INIS)

    Talamo, A.; Gohar, Y.; Aliberti, G.; Cao, Y.; Zhong, Z.; Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.; Sadovich, S.

    2010-01-01

    Full text: Accurate simulation models of the YALINA Booster assembly of the Joint Institute for Power and Nuclear Research (JIPNR)-Sosny, Belarus have been developed by Argonne National Laboratory (ANL) of the USA. YALINA-Booster has coupled zones operating with fast and thermal neutron spectra, which requires special attention in the modelling process. Three different uranium enrichments of 90%, 36% or 21% were used in the fast zone and 10% uranium enrichment was used in the thermal zone. Two of the most advanced Monte Carlo computer programs have been utilized for the ANL analyses: MCNP of Los Alamos National Laboratory and MONK of British Nuclear Fuels Limited and SERCO Assurance. The geometrical models developed for both computer programs reproduced all the details of the YALINA Booster facility as described in the technical specifications defined in the International Atomic Energy Agency (IAEA) report, without any geometrical approximation or material homogenization. Material impurities and the measured material densities have been used in the models. The obtained results for the neutron multiplication factors calculated in criticality mode (keff) and in source mode (ksrc) with an external neutron source from the two Monte Carlo programs are very similar. Different external neutron sources have been investigated, including californium, deuterium-deuterium (D-D), and deuterium-tritium (D-T) neutron sources. The spatial neutron flux profiles and the neutron spectra in the experimental channels were calculated. In addition, the kinetic parameters were calculated, including the effective delayed neutron fraction, the prompt neutron lifetime, and the neutron generation time. A new calculation methodology has been developed at ANL to simulate the pulsed neutron source experiments. In this methodology, the MCNP code is used to simulate the detector response from a single pulse of the external neutron source and a C code is used to superimpose the pulse until the

  7. Hydrogen mixing analyses for a VVER containment.

    Energy Technology Data Exchange (ETDEWEB)

    Sienicki, J.J.; Kostka, P.; Techy, Z.

    2002-02-25

    Hydrogen combustion may represent a threat to containment integrity in a VVER-440/213 plant owing to the combination of high pressure and high temperature. A study has been carried out using the GASFLOW 2.1 three-dimensional CFD code to evaluate the hydrogen distribution in the containment during a beyond-design-basis accident. The VVER-440/213 containment input model consists of two 3D blocks connected via one-dimensional (1D) ducts. One 3D block contains the reactor building and the accident localization tower with the suppression pools. Another 3D block models the air traps. 1D ducts represent the check valves connecting the accident localization tower with the air traps. The VVER pressure suppression system, called the "bubbler condenser," was modeled as a distributed heat sink with water thermodynamic properties. This model accounts for the energy balance. However, it is not currently possible to model dynamic phenomena associated with the water pools (e.g., vent clearing, level change). The GASFLOW 2.1 calculation gave detailed results for the spatial distribution of thermal-hydraulic parameters and gas concentrations. The range and trend of the parameters are reasonable and valuable. There are particularly interesting circulation patterns around the steam generators, in the bubbler tower and other primary system compartments. In the case of the bubbler tower, concentration and temperature contour plots show an inhomogeneous distribution along the height and width, changing during the accident. Hydrogen concentrations also vary within primary system compartments, displaying lower as well as higher (up to 13-20% and higher) values in some nodes. Prediction of such concentration distributions was not previously possible with lumped parameter codes. GASFLOW 2.1 calculations were compared with CONTAIN 1.2 (lumped parameter code) results. Apart from the qualitatively similar trends, there are, for the time being, quantitative differences between the

  8. Advanced core-analyses for subsurface characterization

    Science.gov (United States)

    Pini, R.

    2017-12-01

    The heterogeneity of geological formations varies over a wide range of length scales and represents a major challenge for predicting the movement of fluids in the subsurface. Although they are inherently limited in the accessible length-scale, laboratory measurements on reservoir core samples still represent the only way to make direct observations on key transport properties. Yet, properties derived on these samples are of limited use and should be regarded as sample-specific (or "pseudos"), if the presence of sub-core scale heterogeneities is not accounted for in data processing and interpretation. The advent of imaging technology has significantly reshaped the landscape of so-called Special Core Analysis (SCAL) by providing unprecedented insight on rock structure and processes down to the scale of a single pore throat (i.e. the scale at which all reservoir processes operate). Accordingly, improved laboratory workflows are needed that make use of such wealth of information by, e.g., referring to the internal structure of the sample and in-situ observations, to obtain accurate parameterisation of both rock- and flow-properties that can be used to populate numerical models. We report here on the development of such workflow for the study of solute mixing and dispersion during single- and multi-phase flows in heterogeneous porous systems through a unique combination of two complementary imaging techniques, namely X-ray Computed Tomography (CT) and Positron Emission Tomography (PET). The experimental protocol is applied to both synthetic and natural porous media, and it integrates (i) macroscopic observations (tracer effluent curves), (ii) sub-core scale parameterisation of rock heterogeneities (e.g., porosity, permeability and capillary pressure), and direct 3D observation of (iii) fluid saturation distribution and (iv) the dynamic spreading of the solute plumes. Suitable mathematical models are applied to reproduce experimental observations, including both 1D and 3D
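
    For the 1D interpretation of tracer effluent curves mentioned above, a common starting point is the advection-dispersion equation, whose approximate solution for a step tracer injection in a semi-infinite column is C/C0 = 0.5 erfc[(x - v t) / (2 sqrt(D_L t))]. The sketch below simply evaluates that expression; the core length, velocity and dispersion coefficient are illustrative values, not results from the study.

        # Sketch of the classical 1D advection-dispersion interpretation of a
        # tracer breakthrough curve (approximate semi-infinite-column solution).
        # Parameter values are illustrative only.
        import numpy as np
        from scipy.special import erfc

        L   = 0.10          # core length (m)
        v   = 1.0e-5        # interstitial velocity (m/s)
        D_L = 2.0e-8        # longitudinal dispersion coefficient (m^2/s)

        t = np.linspace(1.0, 3.0 * L / v, 8)           # sample times (s)
        c_over_c0 = 0.5 * erfc((L - v * t) / (2.0 * np.sqrt(D_L * t)))

        for ti, ci in zip(t, c_over_c0):
            print(f"t = {ti:8.0f} s   C/C0 = {ci:.3f}")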

  9. Sensitivity analyses of the peach bottom turbine trip 2 experiment

    International Nuclear Information System (INIS)

    Bousbia Salah, A.; D'Auria, F.

    2003-01-01

    In light of the sustained development in computer technology, the possibilities for code calculations in predicting more realistic transient scenarios in nuclear power plants have been enlarged substantially. It has therefore become feasible to perform 'best-estimate' simulations through the incorporation of three-dimensional modeling of the reactor core into system codes. This method is particularly suited to complex transients that involve strong feedback effects between thermal-hydraulics and kinetics, as well as to transients involving local asymmetric effects. The Peach Bottom turbine trip test is characterized by a prompt core power excursion followed by a self-limiting power behavior. To emphasize and understand the feedback mechanisms involved during this transient, a series of sensitivity analyses was carried out. This should allow the characterization of discrepancies between measured and calculated trends and an assessment of the impact of the thermal-hydraulic and kinetic response of the models used. On the whole, the data comparison revealed a close dependency of the power excursion on the core feedback mechanisms. Thus, for a better best-estimate simulation of the transient, both the thermal-hydraulic and the kinetic models should be made more accurate. (author)

  10. Nonlinear finite element analyses: advances and challenges in dental applications.

    Science.gov (United States)

    Wakabayashi, N; Ona, M; Suzuki, T; Igarashi, Y

    2008-07-01

    To discuss the development and current status of application of the nonlinear finite element method (FEM) in dentistry. The literature was searched for original research articles with keywords such as nonlinear, finite element analysis, and tooth/dental/implant. References were selected manually or searched from the PUBMED and MEDLINE databases through November 2007. The nonlinear problems analyzed in FEM studies were reviewed and categorized into: (A) nonlinear simulations of the periodontal ligament (PDL), (B) plastic and viscoelastic behaviors of dental materials, (C) contact phenomena in tooth-to-tooth contact, (D) contact phenomena within prosthodontic structures, and (E) interfacial mechanics between the tooth and the restoration. FEM studies in dentistry have recently focused on the simulation of realistic intra-oral conditions such as the nonlinear stress-strain relationship in the periodontal tissues and the contact phenomena in teeth, which could hardly be solved by the linear static model. The definition of the contact area critically affects the reliability of the contact analyses, especially for implant-abutment complexes. To predict the failure risk of a bonded tooth-restoration interface, it is essential to assess the normal and shear stresses relative to the interface. The inclusion of viscoelasticity and plastic deformation in the programs, to account for the time-dependent, thermally sensitive, and largely deformable nature of dental materials, would enhance their application. Further improvement of the nonlinear FEM solutions should be encouraged to widen the range of applications in dental and oral health science.

  11. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Pisula Tomasz

    2015-06-01

    The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the prediction of possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models) describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness over one-year and two-year time horizons. In order to assess their practical applicability, the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties, i.e. the capacity to accurately identify enterprises at risk of bankruptcy and healthy companies, as well as the proper calibration of the models to the data from the training samples.
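
    The abstract does not name the specific non-statistical models, so the sketch below only illustrates the general workflow it describes: fit a classifier to a set of financial indicators, then check its classification properties on a held-out sample. The data, the 28-indicator layout and the neural-network choice are assumptions for illustration, not the authors' models.

        # Generic sketch of the workflow described above: train a classifier on
        # financial indicators and validate its ability to separate at-risk from
        # healthy firms.  Data are synthetic; this is not the authors' model.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import classification_report

        rng = np.random.default_rng(1)
        n = 400
        X = rng.normal(size=(n, 28))                 # 28 synthetic financial indicators
        # Synthetic rule: firms with poor liquidity/profitability proxies go bankrupt.
        y = (X[:, 0] + 0.8 * X[:, 1] - 0.5 * X[:, 2]
             + rng.normal(0, 0.7, n) < -0.5).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0, stratify=y)

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        clf.fit(X_tr, y_tr)                           # one-year-ahead analogue

        print(classification_report(y_te, clf.predict(X_te),
                                    target_names=["healthy", "at risk"]))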

  12. Experimental and numerical analyses of magnesium alloy hot workability

    Directory of Open Access Journals (Sweden)

    F. Abbassi

    2016-12-01

    Due to their hexagonal crystal structure, magnesium alloys have relatively low workability at room temperature. In this study, the hot workability behavior of cast-extruded AZ31B magnesium alloy is studied through hot compression testing, numerical modeling and microstructural analyses. Hot deformation tests are performed at temperatures of 250 °C to 400 °C under strain rates of 0.01 to 1.0 s−1. Transmission electron microscopy is used to reveal the presence of dynamic recrystallization (DRX), dynamic recovery (DRY), cracks and shear bands. To predict plastic instabilities during hot compression tests of AZ31B magnesium alloy, the authors use the Johnson–Cook damage model in a 3D finite element simulation. The optimal hot workability of the magnesium alloy is found at a temperature (T) of 400 °C and a strain rate (ε˙) of 0.01 s−1. Stability is found at the lower strain rate, and instability is found at the higher strain rate.
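
    The Johnson-Cook damage model referenced above expresses the equivalent fracture strain as a function of stress triaxiality, normalized strain rate and homologous temperature, and accumulates damage as the sum of plastic-strain increments over that fracture strain. The sketch below shows that standard form; the constants D1-D5, the reference temperatures and the loading history are hypothetical, not the AZ31B calibration used in the paper.

        # Hedged sketch of the Johnson-Cook damage criterion: fracture strain
        # depends on triaxiality, strain rate and temperature; damage D accumulates
        # as sum(d_eps / eps_f) and failure is predicted when D >= 1.
        # Constants are placeholders, not the paper's AZ31B values.
        import math

        D1, D2, D3, D4, D5 = 0.05, 0.35, -1.5, 0.014, 1.1   # hypothetical constants

        def fracture_strain(triaxiality, strain_rate, T, T_room=25.0, T_melt=630.0,
                            ref_rate=1.0):
            t_hom = (T - T_room) / (T_melt - T_room)          # homologous temperature
            return ((D1 + D2 * math.exp(D3 * triaxiality))
                    * (1.0 + D4 * math.log(max(strain_rate / ref_rate, 1e-12)))
                    * (1.0 + D5 * t_hom))

        def damage(increments):
            """Accumulate D = sum(d_eps / eps_f) over the loading history."""
            return sum(d_eps / fracture_strain(tri, rate, T)
                       for d_eps, tri, rate, T in increments)

        # Toy history at 400 C and 0.01 1/s: (strain increment, triaxiality, rate, T).
        history = [(0.05, 0.33, 0.01, 400.0)] * 10
        print(f"accumulated damage D = {damage(history):.2f}")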

  13. Quantifying Fire Cycle from Dendroecological Records Using Survival Analyses

    Directory of Open Access Journals (Sweden)

    Dominic Cyr

    2016-06-01

    Quantifying fire regimes in the boreal forest ecosystem is crucial for understanding its past and present dynamics, as well as for predicting its future dynamics. Survival analyses have often been used to estimate the fire cycle in eastern Canada because they make it possible to take into account the censored information that is made prevalent by the typically long fire return intervals and the limited scope of the dendroecological methods that are used to quantify them. Here, we assess how the true length of the fire cycle, the short-term temporal variations in fire activity, and the sampling effort affect the accuracy and precision of estimates obtained from two types of parametric survival models, the Weibull and the exponential models, and one non-parametric model obtained with the Cox regression. We then apply those results to a case area located in eastern Canada. Our simulation experiment confirms some documented concerns regarding the detrimental effects of temporal variations in fire activity on parametric estimation of the fire cycle. Cox regression appears to provide the most accurate and robust estimator, being by far the least affected by temporal variations in fire activity. The Cox-based estimate of the fire cycle for the last 300 years in the case study area is 229 years (CI95: 162–407), compared with the likely overestimated 319 years obtained with the commonly used exponential model.
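
    A minimal sketch of the comparison described above, using the Python lifelines package on simulated right-censored time-since-fire data (not the dendroecological record): exponential and Weibull fits give parametric fire-cycle estimates, and a Cox regression handles a covariate. The simulated fire cycle, record lengths and the region covariate are placeholders.

        # Hedged sketch: fit censored fire-interval data with exponential and
        # Weibull models and a Cox regression, as compared in the study.
        import numpy as np
        import pandas as pd
        from lifelines import ExponentialFitter, WeibullFitter, CoxPHFitter

        rng = np.random.default_rng(42)
        n = 300
        true_cycle = 229.0                                  # years, for the simulation
        t_fire = rng.exponential(true_cycle, n)             # true time since last fire
        t_obs_limit = rng.uniform(150, 350, n)              # dendro record length
        durations = np.minimum(t_fire, t_obs_limit)
        observed = (t_fire <= t_obs_limit).astype(int)      # 0 = right-censored

        exp_fit = ExponentialFitter().fit(durations, observed)
        wei_fit = WeibullFitter().fit(durations, observed)
        print(f"exponential fire-cycle estimate: {exp_fit.lambda_:.0f} yr")
        print(f"Weibull scale: {wei_fit.lambda_:.0f} yr, shape: {wei_fit.rho_:.2f}")

        # Cox regression with a covariate (here a made-up region indicator).
        df = pd.DataFrame({"T": durations, "E": observed,
                           "region": rng.integers(0, 2, n)})
        cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        print(cph.summary[["coef", "p"]])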

  14. Visualizing Confidence in Cluster-Based Ensemble Weather Forecast Analyses.

    Science.gov (United States)

    Kumpf, Alexander; Tost, Bianca; Baumgart, Marlene; Riemer, Michael; Westermann, Rudiger; Rautenhaus, Marc

    2018-01-01

    In meteorology, cluster analysis is frequently used to determine representative trends in ensemble weather predictions in a selected spatio-temporal region, e.g., to reduce a set of ensemble members to simplify and improve their analysis. Identified clusters (i.e., groups of similar members), however, can be very sensitive to small changes of the selected region, so that clustering results can be misleading and bias subsequent analyses. In this article, we, a team of visualization scientists and meteorologists, deliver visual analytics solutions to analyze the sensitivity of clustering results with respect to changes of a selected region. We propose an interactive visual interface that enables simultaneous visualization of a) the variation in composition of identified clusters (i.e., their robustness), b) the variability in cluster membership for individual ensemble members, and c) the uncertainty in the spatial locations of identified trends. We demonstrate that our solution shows meteorologists how representative a clustering result is, and with respect to which changes in the selected region it becomes unstable. Furthermore, our solution helps to identify those ensemble members which stably belong to a given cluster and can thus be considered similar. In a real-world application case, we show how our approach is used to analyze the clustering behavior of different regions in a forecast of "Tropical Cyclone Karl", guiding the user towards the cluster robustness information required for subsequent ensemble analysis.
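
    The kind of sensitivity check described above can be illustrated by clustering ensemble members over a selected region and repeating the clustering for a slightly shifted region to see how many members keep their cluster assignment. The sketch below does this with k-means on synthetic fields; it is not the authors' interface or clustering method, and the two-cluster setup and region boundaries are arbitrary choices.

        # Illustrative sketch (synthetic data, not the authors' tool): cluster
        # ensemble members within a selected region, then perturb the region and
        # check how stable the cluster memberships are.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(7)
        n_members, ny, nx = 20, 40, 40
        base = rng.normal(size=(ny, nx))
        # Two synthetic "trends": alternate members get a shifted anomaly pattern.
        members = np.array([base + (1.5 if m % 2 else -1.5) * np.exp(
            -((np.arange(nx) - 20) ** 2)[None, :] / 50.0)
            + rng.normal(0, 0.3, (ny, nx)) for m in range(n_members)])

        def cluster_labels(region):
            y0, y1, x0, x1 = region
            features = members[:, y0:y1, x0:x1].reshape(n_members, -1)
            return KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

        labels_ref = cluster_labels((5, 35, 5, 35))
        labels_shift = cluster_labels((5, 35, 8, 38))       # slightly shifted region

        # Fraction of members keeping their assignment (up to relabeling of 2 clusters).
        agree = np.mean(labels_ref == labels_shift)
        print(f"membership agreement after shifting the region: {max(agree, 1 - agree):.2f}")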

  15. A comparison of cephalometric analyses for assessing sagittal jaw relationship

    International Nuclear Information System (INIS)

    Erum, G.; Fida, M.

    2008-01-01

    To compare the seven methods of cephalometric analysis for assessing sagittal jaw relationship and to determine the level of agreement between them. Seven methods, describing anteroposterior jaw relationships (A-B plane, ANB, Wits, AXB, AF-BF, FABA and Beta angle) were measured on the lateral cephalographs of 85 patients. Correlation analysis, using Cramer's V-test, was performed to determine the possible agreement between the pair of analyses. The mean age of the sample, comprising 35 males and 50 females was 15 years and 3 months. Statistically significant relationships were found among seven sagittal parameters with p-value <0.001. Very strong correlation was found between AXB and AF-BF distance (r=0.924); and weak correlation between ANB and Beta angle (r=0.377). Wits appraisal showed the greatest coefficient of variability. Despite varying strengths of association, statistically significant correlations were found among seven methods for assessing sagittal jaw relationship. FABA and A-B plane may be used to predict the skeletal class in addition to the established ANB angle. (author)

  16. Proposed Testing to Assess the Accuracy of Glass-To-Metal Seal Stress Analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, Robert S.; Emery, John M; Tandon, Rajan; Antoun, Bonnie R.; Stavig, Mark E.; Newton, Clay S.; Gibson, Cory S; Bencoe, Denise N.

    2014-09-01

    The material characterization tests conducted on 304L VAR stainless steel and Schott 8061 glass have provided higher fidelity data for calibration of material models used in Glass-to-Metal (GTM) seal analyses. Specifically, a thermo-multi-linear elastic-plastic (thermo-MLEP) material model has been defined for SS304L and the Simplified Potential Energy Clock nonlinear viscoelastic model has been calibrated for the S8061 glass. To assess the accuracy of finite element stress analyses of GTM seals, a suite of tests is proposed to provide data for comparison to model predictions.

  17. MCNP benchmark analyses of critical experiments for space nuclear thermal propulsion

    International Nuclear Information System (INIS)

    Selcow, E.C.; Cerbone, R.J.; Ludewig, H.

    1993-01-01

    The particle-bed reactor (PBR) system is being developed for use in the Space Nuclear Thermal Propulsion (SNTP) Program. This reactor system is characterized by a highly heterogeneous, compact configuration with many streaming pathways. The neutronics analyses performed for this system must be able to accurately predict reactor criticality, kinetics parameters, material worths at various temperatures, feedback coefficients, and detailed fission power and heating distributions. The latter includes coupled axial, radial, and azimuthal profiles. These responses constitute critical inputs and interfaces with the thermal-hydraulics design and safety analyses of the system

  18. TRAC analyses for CCTF and SCTF tests and UPTF design/operation

    International Nuclear Information System (INIS)

    Williams, K.A.

    1983-01-01

    The 2D/3D Program is a multinational (Germany, Japan, and the United States) experimental and analytical nuclear reactor safety research program. The Los Alamos analysis effort is functioning as a vital part of the 2D/3D program. The CCTF and SCTF analyses have demonstrated that TRAC-PF1 can correctly predict multidimensional, nonequilibrium behavior in large-scale facilities prototypical of actual PWRs. Through these and future TRAC analyses, the experimental findings can be related from facility to facility, and the results of this research program can be directly related to licensing concerns affecting actual PWRs

  19. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are

  20. Reliability and validity of a 20-s alternative to the Wingate anaerobic test in team sport male athletes.

    Directory of Open Access Journals (Sweden)

    Ahmed Attia

    The intent of this study was to evaluate the relative and absolute reliability of the 20-s anaerobic test (WAnT20) versus the WAnT30 and to verify how far the various indices of the 30-s Wingate anaerobic test (WAnT30) could be predicted from the WAnT20 data in male athletes. The participants were Exercise Science majors (age: 21.5±1.6 yrs, stature: 1.83±0.08 m, body mass: 81.2±10.9 kg) who participated regularly in team sports. In Phase I, 41 participants performed duplicate WAnT20 and WAnT30 tests to assess reliability. In Phase II, 31 participants performed one trial each of the WAnT20 and WAnT30 to determine the ability of the WAnT20 to predict components of the WAnT30. In Phase III, 31 participants were used to cross-validate the prediction equations developed in Phase II. Respective intra-class correlation coefficients (ICCs) for peak power output (PPO; ICC = 0.98 and 0.95) and mean power output (MPO; ICC = 0.98 and 0.90) did not differ significantly between the WAnT20 and WAnT30. ICCs for minimal power output (POmin) and fatigue index (FI) were poor for both tests (range 0.53 to 0.76). Standard errors of the means (SEM) for PPO and MPO were less than their smallest worthwhile changes (SWC) in both tests; however, POmin and FI values were "marginal," with SEM values greater than their respective SWCs in both tests. Stepwise regression analysis showed that MPO had the highest coefficient of predictability (R = 0.97), with POmin and FI considerably lower (R = 0.71 and 0.41, respectively). Cross-validation showed insignificant bias, with limits of agreement of 0.99±1.04, 6.5±92.7 W, and 1.6±9.8% between measured and predicted MPO, POmin, and FI, respectively. The WAnT20 offers a reliable and valid test of leg anaerobic power in male athletes and could replace the classic WAnT30.
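
    The cross-validation statistics quoted above (bias with limits of agreement) follow the Bland-Altman approach. The sketch below computes a mean bias and 95% limits of agreement between measured and predicted values; the sample size of 31 mirrors Phase III, but the numbers themselves are synthetic, not the study's data.

        # Sketch of Bland-Altman-style agreement statistics: mean bias and 95%
        # limits of agreement between measured and WAnT20-predicted values.
        import numpy as np

        rng = np.random.default_rng(3)
        measured  = rng.normal(650.0, 80.0, 31)              # e.g. measured MPO (W)
        predicted = measured + rng.normal(5.0, 45.0, 31)     # WAnT20-based prediction

        diff = predicted - measured
        bias = diff.mean()
        loa  = 1.96 * diff.std(ddof=1)                       # half-width of the 95% limits

        print(f"bias = {bias:.1f} W, limits of agreement = {bias - loa:.1f} "
              f"to {bias + loa:.1f} W")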