WorldWideScience

Sample records for re-evaluate fisher-matrix predictions

  1. Fisher Matrix Predictions for Detecting the Cosmological 21-cm ...

    Indian Academy of Sciences (India)

    ...tive designs are being planned for the future low frequency telescope SKA. ... In section 4, we use the Fisher matrix analysis to make predictions for the SNR as ... to sample Fourier modes k of a fixed magnitude k which are oriented at ...

  2. Fisher Matrix Preloaded — FISHER4CAST

    Science.gov (United States)

    Bassett, Bruce A.; Fantaye, Yabebal; Hlozek, Renée; Kotze, Jacques

    The Fisher Matrix is the backbone of modern cosmological forecasting. We describe the Fisher4Cast software: a general-purpose, easy-to-use Fisher Matrix framework. It is open source, rigorously designed and tested, and includes a Graphical User Interface (GUI) with automated LaTeX file creation capability and point-and-click Fisher ellipse generation. Fisher4Cast was designed for ease of extension and, although written in Matlab, is easily portable to open-source alternatives such as Octave and Scilab. Here we use Fisher4Cast to present new 3D and 4D visualizations of the forecasting landscape and to investigate the effects of growth and curvature on future cosmological surveys. Early releases have been available since mid-2008. The current release of the code is Version 2.2, which is described here. For ease of reference a Quick Start guide and the code used to produce the figures in this paper are included, in the hope that it will be useful to the cosmology and wider scientific communities.
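
    For readers unfamiliar with the mechanics this record describes, the sketch below shows the core recipe that forecasting packages such as Fisher4Cast automate: numerically differentiate an observable with respect to the parameters, assemble the Fisher matrix, and read off marginalised errors and ellipse geometry. It is written in Python rather than Fisher4Cast's Matlab, and the toy model, fiducial values and error bars are invented for illustration, not drawn from the paper.

    ```python
    import numpy as np

    # Toy observable mu(z; Om, w) -- an invented stand-in, purely to
    # illustrate the Fisher recipe that forecasting codes automate.
    def model(z, theta):
        om, w = theta
        return 5.0 * np.log10((1 + z) * (1 + om * z) ** 0.5) - 2.5 * w * z / (1 + z)

    z = np.linspace(0.1, 1.5, 30)        # hypothetical survey redshifts
    sigma = 0.1 * np.ones_like(z)        # per-point Gaussian errors (assumed)
    theta_fid = np.array([0.3, -1.0])    # fiducial (Omega_m, w)

    # Numerical derivatives of the observable w.r.t. each parameter
    eps = 1e-5
    derivs = []
    for i in range(len(theta_fid)):
        tp, tm = theta_fid.copy(), theta_fid.copy()
        tp[i] += eps; tm[i] -= eps
        derivs.append((model(z, tp) - model(z, tm)) / (2 * eps))
    derivs = np.array(derivs)            # shape (n_params, n_data)

    # Fisher matrix: F_ij = sum_k dmu_k/dtheta_i * dmu_k/dtheta_j / sigma_k^2
    F = derivs @ np.diag(1.0 / sigma**2) @ derivs.T

    cov = np.linalg.inv(F)               # forecast parameter covariance
    print("1-sigma marginalised errors:", np.sqrt(np.diag(cov)))

    # 1-sigma ellipse axes/orientation (what point-and-click ellipse tools plot)
    evals, evecs = np.linalg.eigh(cov)
    print("ellipse semi-axes:", np.sqrt(evals))
    print("orientation (deg):", np.degrees(np.arctan2(evecs[1, -1], evecs[0, -1])))
    ```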

  3. On the validity of cosmological Fisher matrix forecasts

    International Nuclear Information System (INIS)

    Wolz, Laura; Kilbinger, Martin; Weller, Jochen; Giannantonio, Tommaso

    2012-01-01

    We present a comparison of Fisher matrix forecasts for cosmological probes with Monte Carlo Markov Chain (MCMC) posterior likelihood estimation methods. We analyse the performance of future Dark Energy Task Force (DETF) stage-III and stage-IV dark-energy surveys using supernovae, baryon acoustic oscillations and weak lensing as probes. We concentrate in particular on the dark-energy equation of state parameters w_0 and w_a. For purely geometrical probes, and especially when marginalising over w_a, we find considerable disagreement between the two methods, since in this case the Fisher matrix cannot reproduce the highly non-elliptical shape of the likelihood function. More quantitatively, the Fisher method underestimates the marginalized errors for purely geometrical probes by 30-70%. For cases including structure formation such as weak lensing, we find that the posterior probability contours from the Fisher matrix estimation are in good agreement with the MCMC contours, with the forecast errors changing only at the 5% level. We then explore non-linear transformations resulting in physically-motivated parameters and investigate whether these parameterisations exhibit a Gaussian behaviour. We conclude that for the purely geometrical probes and, more generally, in cases where it is not known whether the likelihood is close to Gaussian, the Fisher matrix is not the appropriate tool to produce reliable forecasts.
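
    The paper's central point, that a Fisher matrix can badly underestimate marginalised errors when the likelihood has a curved, banana-shaped degeneracy, can be reproduced with a toy two-dimensional likelihood. The sketch below compares the Fisher (Hessian-at-peak) error on w_0 with the error from brute-force marginalisation on a grid, standing in for MCMC; the likelihood shape and all numbers are invented, not taken from the paper.

    ```python
    import numpy as np

    # Toy banana-shaped likelihood in (w0, wa), mimicking the curved
    # degeneracies of purely geometrical probes discussed above.
    def loglike(w0, wa):
        u = w0 + 1.0 + 0.5 * wa**2          # curved degeneracy direction
        return -0.5 * (u / 0.05) ** 2 - 0.5 * (wa / 1.0) ** 2

    # Fisher estimate: negative Hessian of loglike at the peak (-1, 0)
    h = 1e-4
    p = np.array([-1.0, 0.0])
    H = np.zeros((2, 2))
    for i in range(2):
        for j in range(2):
            pp = p.copy(); pp[i] += h; pp[j] += h
            pm = p.copy(); pm[i] += h; pm[j] -= h
            mp = p.copy(); mp[i] -= h; mp[j] += h
            mm = p.copy(); mm[i] -= h; mm[j] -= h
            H[i, j] = (loglike(*pp) - loglike(*pm)
                       - loglike(*mp) + loglike(*mm)) / (4 * h**2)
    fisher_sigma = np.sqrt(np.diag(np.linalg.inv(-H)))

    # "MCMC-like" truth: marginalise the full posterior on a grid
    w0g = np.linspace(-10.0, 1.0, 1200)
    wag = np.linspace(-4.0, 4.0, 800)
    W0, WA = np.meshgrid(w0g, wag, indexing="ij")
    post = np.exp(loglike(W0, WA)); post /= post.sum()
    m_w0 = (post * W0).sum()
    true_sigma_w0 = np.sqrt((post * (W0 - m_w0) ** 2).sum())

    print("Fisher sigma(w0):", fisher_sigma[0])
    print("grid   sigma(w0):", true_sigma_w0)   # larger: Fisher underestimates
    ```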

  4. Fisher.py: Fisher Matrix Manipulation and Confidence Contour Plotting

    Science.gov (United States)

    Coe, Dan

    2010-10-01

    Fisher.py allows you to combine constraints from multiple experiments (e.g., weak lensing + supernovae) and add priors (e.g., a flat universe) simply and easily. Calculate parameter uncertainties and plot confidence ellipses. Fisher matrix expectations for several experiments are included, as calculated by the author (time delays) and the Dark Energy Task Force (WL/SN/BAO/CL/CMB), or you can provide your own.
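
    The operations this record describes are simple in Fisher language: independent experiments combine by matrix addition, a Gaussian prior adds 1/sigma^2 to a diagonal entry, and fixing (rather than marginalising) a parameter means deleting its row and column before inverting. A minimal sketch with invented numbers follows; it is not Fisher.py's actual API.

    ```python
    import numpy as np

    # Fisher matrices add for independent experiments; all values invented.
    F_wl = np.array([[40.0, 10.0],
                     [10.0,  8.0]])      # e.g. weak lensing, params (Om, w)
    F_sn = np.array([[15.0, -5.0],
                     [-5.0, 12.0]])      # e.g. supernovae, same parameter order

    F_comb = F_wl + F_sn                 # joint constraint

    # A Gaussian prior sigma(Om) = 0.02 adds 1/sigma^2 to that diagonal entry
    F_comb[0, 0] += 1.0 / 0.02**2

    cov = np.linalg.inv(F_comb)
    print("marginalised errors:", np.sqrt(np.diag(cov)))

    # Fixing (not marginalising) a parameter: delete its row/column first
    F_fixed_w = np.delete(np.delete(F_comb, 1, axis=0), 1, axis=1)
    print("error on Om with w fixed:", 1.0 / np.sqrt(F_fixed_w[0, 0]))
    ```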

  5. A re-evaluation of PETROTOX for predicting acute and chronic toxicity of petroleum substances.

    Science.gov (United States)

    Redman, Aaron D; Parkerton, Thomas F; Leon Paumen, Miriam; Butler, Josh D; Letinski, Daniel J; den Haan, Klaas

    2017-08-01

    The PETROTOX model was developed to perform aquatic hazard assessment of petroleum substances based on substance composition. The model relies on the hydrocarbon block method, which is widely used for conducting petroleum substance risk assessments, providing further justification for evaluating model performance. Previous work described this model and provided a preliminary calibration and validation using acute toxicity data for a limited set of petroleum substances. The objective of the present study was to re-evaluate PETROTOX using expanded data covering both acute and chronic toxicity endpoints on invertebrates, algae, and fish for a wider range of petroleum substances. The results indicated that recalibration of 2 model parameters was required, namely, the algal critical target lipid body burden and the log octanol-water partition coefficient (K_OW) limit, used to account for reduced bioavailability of hydrophobic constituents. Acute predictions from the updated model were compared with observed toxicity data and found to be generally within a factor of 3 for algae and invertebrates, but the model overestimated fish toxicity. Chronic predictions were generally within a factor of 5 of empirical data. Furthermore, PETROTOX predicted acute and chronic hazard classifications that were consistent or conservative in 93% and 84% of comparisons, respectively. The PETROTOX model is considered suitable for the purpose of characterizing petroleum substance hazard in substance classification and risk assessments. Environ Toxicol Chem 2017;36:2245-2252. © 2017 SETAC.

  6. Fisher Matrix Predictions for Detecting the Cosmological 21-cm ...

    Indian Academy of Sciences (India)

    ... (Morales 2005) instead of the frequency channels ν_c. ... a noise contribution of the form δ_{a,b} 2 ν_c B σ² / (N_A − a) (Eq. 7). The factor (N_A − a)^{-1} in the noise contribution accounts for the redundancy in the baseline distribution. The functions Ã(U_a − U′) and Ã*(U_b − ...) ...

  7. Application of the effective Fisher matrix to the frequency domain inspiral waveforms

    International Nuclear Information System (INIS)

    Cho, Hee-Suk; Lee, Chang-Hwan

    2014-01-01

    The Fisher matrix (FM) has been generally used to predict the accuracy of gravitational wave parameter estimation. Although the limitations of the FM are well known, it is still widely used due to its very low computational cost compared to Monte Carlo simulations. Recently, Rodriguez et al (2013 Phys. Rev. D 88 084013) performed Markov chain Monte Carlo (MCMC) simulations using a frequency domain inspiral waveform model (TaylorF2) for nonspinning binary systems with total masses M ≤ 20 M_⊙, and they found systematic differences between the predictions from FM and MCMC for M > 10 M_⊙. On the other hand, an effective Fisher matrix (eFM) was recently introduced by Cho et al (2013 Phys. Rev. D 87 024004). The eFM is a semi-analytic approach to the standard FM, in which the derivative is taken of a quadratic function fitted to the local overlap surface. In this work, we apply the eFM method to the TaylorF2 waveform for nonspinning binary systems with a moderately high signal-to-noise ratio (SNR ∼ 15) and find that the eFM can reproduce the MCMC error bounds in Rodriguez et al well, even for high masses. By comparing the eFM standard deviation directly with the 1-σ confidence interval of the marginalized overlap that approximates the MCMC posterior distribution, we show that the eFM can be acceptable in all mass regions for the estimation of the MCMC error bounds. We also investigate the dependence on the signal strength.
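
    The effective-Fisher idea summarised above (fit a quadratic to the local overlap surface and differentiate the fit rather than the waveform) can be sketched in a few lines. The surface below is a toy quadratic with known curvature, not a TaylorF2 overlap; the grid ranges, sample count and noise level are invented.

    ```python
    import numpy as np

    # Effective-Fisher sketch: sample the log-overlap surface around the peak,
    # fit a general quadratic, and read the Fisher matrix off the fit.
    rng = np.random.default_rng(0)
    H_true = np.array([[200.0, 60.0],
                       [60.0,  50.0]])   # known curvature of the toy surface

    def log_overlap(dx):                  # quadratic peak + small "noise"
        return -0.5 * dx @ H_true @ dx + 1e-4 * rng.standard_normal()

    pts = rng.uniform(-0.05, 0.05, size=(200, 2))   # offsets around the peak
    y = np.array([log_overlap(dx) for dx in pts])

    # Design matrix for a 2-D quadratic: [x^2, xy, y^2, x, y, 1]
    X = np.column_stack([pts[:, 0]**2, pts[:, 0]*pts[:, 1], pts[:, 1]**2,
                         pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    a, b, c = coef[:3]

    # log L ~ -(1/2) dx^T F dx  =>  F = [[-2a, -b], [-b, -2c]]
    F_eff = np.array([[-2*a, -b], [-b, -2*c]])
    print("effective Fisher matrix:\n", F_eff)   # recovers ~H_true
    print("sigma estimates:", np.sqrt(np.diag(np.linalg.inv(F_eff))))
    ```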

  8. Survey Design for Spectral Energy Distribution Fitting: A Fisher Matrix Approach

    International Nuclear Information System (INIS)

    Acquaviva, Viviana; Gawiser, Eric; Bickerton, Steven J.; Grogin, Norman A.; Guo Yicheng; Lee, Seong-Kook

    2012-01-01

    The spectral energy distribution (SED) of a galaxy contains information on the galaxy's physical properties, and multi-wavelength observations are needed in order to measure these properties via SED fitting. In planning these surveys, optimization of the resources is essential. The Fisher Matrix (FM) formalism can be used to quickly determine the best possible experimental setup to achieve the desired constraints on the SED-fitting parameters. However, because it relies on the assumption of a Gaussian likelihood function, it is in general less accurate than other slower techniques that reconstruct the probability distribution function (PDF) from the direct comparison between models and data. We compare the uncertainties on SED-fitting parameters predicted by the FM to the ones obtained using the more thorough PDF-fitting techniques. We use both simulated spectra and real data, and consider a large variety of target galaxies differing in redshift, mass, age, star formation history, dust content, and wavelength coverage. We find that the uncertainties reported by the two methods agree within a factor of two in the vast majority (∼90%) of cases. If the age determination is uncertain, the top-hat prior in age used in PDF fitting to prevent each galaxy from being older than the universe needs to be incorporated in the FM, at least approximately, before the two methods can be properly compared. We conclude that the FM is a useful tool for astronomical survey design.
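
    One simple way to fold a top-hat prior into a Fisher matrix, as the abstract says is needed for the age parameter, is to replace the top-hat by a Gaussian of equal variance (W²/12 for a top-hat of full width W) and add its inverse variance to the corresponding diagonal entry. This particular approximation is our own illustration, not necessarily the authors' procedure, and the Fisher matrix values below are invented.

    ```python
    import numpy as np

    # Approximate a top-hat prior of full width W by a Gaussian with the same
    # variance, W^2 / 12, and add 1/var to the parameter's diagonal entry.
    F = np.array([[30.0, 5.0],            # invented SED-fitting Fisher matrix,
                  [ 5.0, 2.0]])           # params: (log mass, age in Gyr)

    W = 13.8                              # age confined to [0, 13.8] Gyr
    F_prior = F.copy()
    F_prior[1, 1] += 1.0 / (W**2 / 12.0)

    for name, mat in [("no prior", F), ("top-hat-as-Gaussian prior", F_prior)]:
        sig = np.sqrt(np.diag(np.linalg.inv(mat)))
        print(f"{name}: sigma(log M) = {sig[0]:.3f}, sigma(age) = {sig[1]:.3f}")
    ```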

  9. Re-Evaluation of Acid-Base Prediction Rules in Patients with Chronic Respiratory Acidosis

    Directory of Open Access Journals (Sweden)

    Tereza Martinu

    2003-01-01

    RATIONALE: The prediction rules for the evaluation of the acid-base status in patients with chronic respiratory acidosis, derived primarily from an experimental canine model, suggest that complete compensation should not occur. This appears to contradict frequent observations of normal or near-normal pH levels in patients with chronic hypercapnia.

  10. Fisher matrix forecasts for astrophysical tests of the stability of the fine-structure constant

    Directory of Open Access Journals (Sweden)

    C.S. Alves

    2017-07-01

    We use Fisher Matrix analysis techniques to forecast the cosmological impact of astrophysical tests of the stability of the fine-structure constant to be carried out by the forthcoming ESPRESSO spectrograph at the VLT (due for commissioning in late 2017), as well as by the planned high-resolution spectrograph (currently in Phase A) for the European Extremely Large Telescope. Assuming a fiducial model without α variations, we show that ESPRESSO can improve current bounds on the Eötvös parameter, which quantifies Weak Equivalence Principle violations, by up to two orders of magnitude, leading to stronger bounds than those expected from the ongoing tests with the MICROSCOPE satellite, while constraints from the E-ELT should be competitive with those of the proposed STEP satellite. Should an α variation be detected, these measurements will further constrain cosmological parameters, being particularly sensitive to the dynamics of dark energy.

  11. Status report on seismic re-evaluation

    International Nuclear Information System (INIS)

    1998-01-01

    In May 1997, a meeting of the PWG 3 Sub Group on the Seismic Behaviour of Structures agreed on several priority objectives, one of which was the production of a status report on seismic re-evaluation. Seismic re-evaluation is identified as the process of carrying out a re-assessment of the safety of existing nuclear power plants for a specified seismic hazard. This may be necessary when no seismic hazard was considered in the original design of the plant, when the relevant codes and regulations have been revised, when the seismic hazard for the site has been re-assessed, or when there is a need to assess the capacity of the plant for severe accident conditions and behaviour beyond the design basis. Re-evaluation may also be necessary to resolve an issue, or to assess the impact of new findings or knowledge. A questionnaire on the subject was issued to all members of the Seismic Sub Group in the summer of 1997, and responses had been received from most members by the end of 1997. This report is based on the responses to the questionnaire, together with comment and discussion within the group. The questionnaire covered the following main topics of interest in relation to seismic re-evaluation: general and legislative framework, overall approach, input definition and analysis methods, scope of plant and assessment of the as-built situation, assessment criteria, outcome of re-evaluations, and research. The responses have been collated and reviewed with the objective of comparing current practice in the field of seismic re-evaluation in member countries, and a number of important points have been identified in relation to the position of seismic re-evaluation in the nuclear power industry throughout the world. It is evident that seismic re-evaluation is a relatively mature process that has been developing for some time, with most countries adopting similar practices, often based on principles developed in the US nuclear industry.

  12. Re-evaluation of atomic bomb radiation

    International Nuclear Information System (INIS)

    Okajima, Shunzo

    1984-01-01

    The background and current status of the re-evaluation of atomic bomb (A-bomb) radiation doses are presented. Problems in re-evaluating radiation doses are discussed: the spectra of gamma rays and neutrons emitted in the air, the A-bomb structures, and meteorological factors should be taken into account. In Japan, in an attempt to estimate A-bomb radiation doses, radioactive residues contained in roof tiles, bricks, rocks, teeth, and shell buttons from clothing are actually being measured. (Namekawa, K.)

  13. Pancreaticoduodenal injuries: Re-evaluating current management ...

    African Journals Online (AJOL)

    Background. Pancreaticoduodenal injuries are uncommon owing to the protected position of the pancreas and duodenum in the retroperitoneum. Management depends on the extent of injury. This study was undertaken to document outcome of pancreaticoduodenal injuries and to re-evaluate our approach. Patients and ...

  14. Seismic re-evaluation process in Medzamor-2 NPP

    International Nuclear Information System (INIS)

    Zadoyan, P.

    2000-01-01

    Seismic re-evaluation process for Medzamor-2 NPP describes the following topics: program implementation status; re-evaluation program structure; regulatory procedure and review plan; current tasks and practice; and regulatory assessment and research programs

  15. Re-evaluation of lung to thorax transverse area ratio immediately before birth in predicting postnatal short-term outcomes of fetuses with isolated left-sided congenital diaphragmatic hernia: A single center analysis.

    Science.gov (United States)

    Kido, Saki; Hidaka, Nobuhiro; Sato, Yuka; Fujita, Yasuyuki; Miyoshi, Kina; Nagata, Kouji; Taguchi, Tomoaki; Kato, Kiyoko

    2018-05-01

    We aimed to investigate whether the lung-to-thorax transverse area ratio (LTR) immediately before birth is of diagnostic value for the prediction of postnatal short-term outcomes in cases of isolated left-sided congenital diaphragmatic hernia (CDH). We retrospectively reviewed the cases of fetal isolated left-sided CDH managed at our institution between April 2008 and July 2016. We divided the patients into two groups based on LTR immediately before birth, using a cut-off value of 0.08. We compared the proportions of subjects within the two groups who survived until discharge using Fisher's exact test. Further, using Spearman's rank correlation, we assessed whether LTR was correlated with length of stay, duration of mechanical ventilation, and duration of supplemental oxygen. Twenty-nine subjects were included (five with LTR < 0.08 and 24 with LTR ≥ 0.08). The proportion of subjects surviving until discharge was 40% (2/5) for patients with LTR < 0.08, as compared with 96% (23/24) for those with LTR ≥ 0.08. LTR measured immediately before birth was negatively correlated with the postnatal length of stay (Spearman's rank correlation coefficient, rs = -0.486) and the duration of supplemental oxygen (rs = -0.537). Further, the duration of mechanical ventilation was longer in patients with a lower LTR value. LTR immediately before birth is useful for the prediction of postnatal short-term outcomes in fetuses with isolated left-sided CDH. In particular, patients with a prenatal LTR value less than 0.08 are at increased risk of postnatal death. © 2017 Japanese Teratology Society.
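
    The two statistical tools named in this record are both available in SciPy. The 2x2 survival table below is taken from the abstract (2/5 vs 23/24 survivors); the paired LTR/length-of-stay values are invented, since the abstract reports only the resulting coefficient (rs = -0.486), and the p-values are not quoted in the abstract.

    ```python
    import numpy as np
    from scipy import stats

    # 2x2 table from the abstract: survival to discharge by LTR group
    #                 survived  died
    # LTR <  0.08         2       3
    # LTR >= 0.08        23       1
    table = np.array([[2, 3],
                      [23, 1]])
    odds_ratio, p = stats.fisher_exact(table)
    print(f"Fisher's exact test: OR = {odds_ratio:.3f}, p = {p:.4f}")

    # Spearman rank correlation, as used for LTR vs. length of stay
    # (these paired values are invented placeholders)
    ltr = np.array([0.06, 0.07, 0.09, 0.10, 0.12, 0.15])
    los_days = np.array([120, 90, 60, 45, 40, 30])
    rs, p_rs = stats.spearmanr(ltr, los_days)
    print(f"Spearman rs = {rs:.3f}, p = {p_rs:.4f}")
    ```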

  16. Landscape, Process and Power: Re-evaluating Traditional Environmental Knowledge

    Directory of Open Access Journals (Sweden)

    Colleen Marie O'Brien

    2010-09-01

    Review of Landscape, Process and Power: Re-evaluating Traditional Environmental Knowledge. Serena Heckler, ed. 2009. Berghahn Books, New York. Pp. 304, 21 illustrations, bibliography, index. $95.00 (hardback). ISBN 978-1-84545-549-1.

  17. Men in nursing: re-evaluating masculinities, re-evaluating gender.

    Science.gov (United States)

    Brown, Brian

    2009-10-01

    This paper critically interrogates and re-evaluates the notion that it is somehow difficult being a man in nursing and suggests some ways forward which will allow us to gain a more politically astute purchase on gender, nursing and the socio-political context in which the profession operates. Men appear to be well served by a career in nursing. Despite their lesser numbers they are likely to earn more and be promoted into leadership roles more readily. Yet there is a pervasive sense in the literature on men in nursing that they feel unhappy as a minority in a predominantly female occupation and feel a disjuncture between masculine identity and the nursing role. The genealogy of this idea can be traced to a more extensive literature in the 'men's movement', in sex role theory and masculinity studies which has tended to focus on the putative hurts that men suffer as they are socialized into the male role. This is itself informed by experiences and discourses from therapy, and privileges these kinds of experiences over and above more sober consideration of the respective powers of men and women and the sociopolitical context of the profession. This 'poor me' discourse deflects attention away from the business of tackling material inequalities and enables men to encroach further into the agenda of nursing discussions. Instead, a view of men and women in nursing is proposed which is attentive to the historical and political operations of power and which sees subjective experiences as the effects of power rather than as a starting point for analysis. We must place individual experience coherently and exhaustively in the material environment of social space and time. It is in this way that we can genuinely advance the interest of men and women and build an effective profile for the profession as a whole.

  18. Seismic re-evaluation of Heavy Water Plant, Kota

    International Nuclear Information System (INIS)

    Parulekar, Y.M.; Reddy, G.R.; Vaze, K.K.; Kushwaha, H.S.

    2003-10-01

    This report deals with the seismic re-evaluation of the Heavy Water Plant, Kota. The plant handles a considerable amount of H₂S gas, which is very toxic. At the original design stage, the seismic coefficient for zone I as per IS 1893-1966 was zero, so earthquakes and their effects were not considered when designing the heavy water plant structures. However, as per IS 1893 (1984) the seismic coefficient for zone I is 0.01 g. Hence a seismic re-evaluation of the various structures of the heavy water plant was carried out. The structures were analysed for self weight, equipment load and earthquake load; pressure loading was also considered in the case of the H₂S storage tanks, and the effect of soil-structure interaction was included in the analysis. The combined stresses in the structures due to earthquake and dead load were checked against the allowable stresses. (author)

  19. Seismic re-evaluation of French nuclear power plants

    International Nuclear Information System (INIS)

    Andrieu, R.

    1995-01-01

    After a presentation of the seismic inputs which have been taken into account in the design of the French Nuclear Power Plants, the re-assessed values of these inputs are shown. Some considerations about the specificity of the French PWR program with regard to the standardisation of plants are given together with the present objectives of seismic re-evaluations. Finally the main results of the seismic re-analysis being performed for the Phenix Fast Reactor are considered. (author)

  20. The Re-evaluation of ⁸⁴Rb decay data

    Energy Technology Data Exchange (ETDEWEB)

    Xiaolong, Huang; Chunmei, Zhou [Chinese Nuclear Data Center, Beijing (China)]

    1996-06-01

    ⁸⁴Rb is an important radionuclide and its decay data are fundamental data in nuclear applications. The decay data for ⁸⁴Rb were re-evaluated. Recommended values are given for the energies and intensities of γ rays and their internal conversion coefficients, and for the energies and intensities of Auger electrons, conversion electrons and X-rays. The decay scheme is also given, and the balance of radiation intensities and energies was checked. (9 tabs., 2 figs.)

  2. Re-evaluation of Station Blackout in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eunchan; Shin, Taeyoung [Korea Hydro and Nuclear Power Co. Ltd., Daejeon (Korea, Republic of)]

    2014-05-15

    This paper proposes reducing the uncertainty due to the small number of loss-of-offsite-power (LOOP) events, and estimating, from recent operating experience, the probability that operators fail to energize a safety bus from recovered offsite power during a station blackout (SBO). In addition, the CDF is re-evaluated to reflect the enhancement of the Class-1E battery capacity. For newly constructed KHNP plants, the LOOP frequency and the non-recovery probability after a LOOP during an SBO were re-evaluated by integrating the KHNP events into generic data containing broader experience for PSA. For the initiating event frequency, a new LOOP frequency was calculated through a Bayesian update of the KHNP LOOP frequency using NUREG/CR-6890, which reflects recent trends and has a large data size. For the non-recovery probability estimation, domestic data were added to the American experience in NUREG/CR-6890; these data were fitted to a lognormal distribution in order to reduce the uncertainty due to the small size of the KHNP data. Regarding the battery capacity enhancement, the success criteria during an SBO were re-evaluated considering the longer battery duty time, and the CDF was recalculated using the resulting available time for operator action. The recalculated CDF was reduced by approximately 50% compared with the value before the battery improvement. In conclusion, it was quantitatively shown that enlarging the battery capacity to manage SBOs improves plant safety, and methods to reduce the data uncertainty due to the small number of events were selected for evaluating the LOOP frequency and non-recovery probability for future plants. These efforts contribute to obtaining a realistic risk profile and to prioritizing countermeasures and improvements for safety vulnerabilities.
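
    The Bayesian update of a LOOP frequency described above is conventionally done with a conjugate gamma prior on a Poisson event rate. The sketch below shows that mechanics only; the prior parameters and the plant-specific event count are invented placeholders, not KHNP or NUREG/CR-6890 values.

    ```python
    from scipy import stats

    # Conjugate gamma-Poisson update for an initiating-event frequency.
    # All numbers are invented placeholders for illustration.
    alpha0, beta0 = 0.5, 14.0     # generic prior: mean alpha/beta ~ 3.6e-2 /yr
    events, T = 1, 60.0           # plant evidence: 1 LOOP in 60 reactor-years

    alpha1, beta1 = alpha0 + events, beta0 + T
    post = stats.gamma(a=alpha1, scale=1.0 / beta1)

    print(f"posterior mean LOOP frequency: {post.mean():.2e} per reactor-year")
    print(f"90% credible interval: {post.ppf(0.05):.2e} .. {post.ppf(0.95):.2e}")
    ```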

  3. Re-evaluation of azo dyes as food additives

    DEFF Research Database (Denmark)

    Pratt, Iona; Larsen, John Christian; Mortensen, Alicja

    2013-01-01

    Aryl azo compounds are widely used as colorants (azo dyes) in a wide range of products including textiles, leather, paper, cosmetics, pharmaceuticals and food. As part of its systematic re-evaluation of food additives, the European Food Safety Authority (EFSA) has carried out new risk assessments, (i) because azo dyes were among the first additives to be assessed by the Scientific Committee on Food, many years ago, and (ii) because of concern regarding possible health effects of artificial colours arising since the original evaluations. Concerns included behavioural effects in children, allergic reactions, genotoxicity and possible carcinogenicity ...

  4. 42 CFR 405.213 - Re-evaluation of a device categorization.

    Science.gov (United States)

    2010-10-01

    42 CFR Part 405, Decisions That Relate to Health Care Technology, § 405.213 Re-evaluation of a device categorization. (a) ... experimental/investigational (Category A) may request re-evaluation of the categorization decision. (2) A ...

  5. Re-evaluating the Contribution and Legacy of Hedley Bull

    Directory of Open Access Journals (Sweden)

    Emerson Maione Souza

    2008-06-01

    The article aims, in the first instance, to make a detailed analysis of the work of Hedley Bull, approaching the main themes and concepts he developed. Secondly, it aims to re-evaluate the potential of the author's contribution, given the new conditions of the post-Cold War period. With this in mind, the article critically analyses the most recent interpretations of his work, which seek to highlight its critical and normative potential and to dissociate it from the realist tradition in international relations. These two points differentiate the new commentators from older ones and reaffirm the continuing relevance of Hedley Bull's work, the latter being the article's chief conclusion.

  6. Re-evaluation of Baby EBM Shielding Thickness

    International Nuclear Information System (INIS)

    Mohd Rizal Mohd Chulan; Siti Aisah Hashim; Wah, L.K.; Mukhlis Moktar

    2013-01-01

    The minimum energy required for an electron beam (EB) to be used as an irradiation device is 200 keV. Nuclear Malaysia's home-grown EB machine, the Baby EB, can generate up to 140 keV. Therefore, to enable it to be used for applications, internal funding was acquired to increase the energy to up to 300 keV. In doing so, the existing shielding, with a thickness of 0.35 cm for the top frame and 0.7 cm for the middle and bottom frames, needs to be re-evaluated. This is to ensure that the shield can still provide adequate protection from harmful radiation. The re-evaluation is also needed because of the recent change of the clean-area dose limit from 2.5 μSv/hr to 1.0 μSv/hr. The location of the Baby EBM also needs to be re-evaluated if the weight reaches 4500 kg/m² (concentrated load for laboratory areas). From the calculation it was found that the existing shielding is unable to provide the required protection from harmful radiation. The recommended thicknesses for the shielding are 3.26 cm for the top frame, 3.5 cm for the middle frame and 3.78 cm for the bottom frame. The total weight of the Baby EBM therefore becomes more than 3000 kg/m² (3337.38 kg/m²), and this justifies the need for the Baby EBM to be transferred from the first floor (room no. 43008) of block 43 (ALUTRON building) to a more suitable location, preferably on a ground floor that can bear the increased weight. (author)
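
    The kind of estimate behind such recommended thicknesses can be illustrated with simple exponential attenuation: the required thickness grows with the logarithm of the attenuation factor, which is why tightening the limit from 2.5 to 1.0 μSv/hr adds material. The attenuation coefficient and unshielded dose rate below are invented, and a real electron-beam shielding design would also account for buildup factors and the bremsstrahlung spectrum.

    ```python
    import math

    # Quick exponential-attenuation estimate of shielding thickness needed to
    # meet a dose limit. Both inputs are invented placeholders, not Baby EBM
    # design values; real designs also include buildup and the X-ray spectrum.
    mu = 2.0            # assumed effective attenuation coefficient, 1/cm
    d0 = 5000.0         # hypothetical unshielded dose rate at the point, uSv/hr

    def required_thickness(limit_usv_hr: float) -> float:
        """Thickness t such that d0 * exp(-mu * t) <= limit."""
        return math.log(d0 / limit_usv_hr) / mu

    for limit in (2.5, 1.0):
        print(f"limit {limit} uSv/hr -> t = {required_thickness(limit):.2f} cm")
    ```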

  7. Re-evaluation of monitored retrievable storage concepts

    International Nuclear Information System (INIS)

    Fletcher, J.F.; Smith, R.I.

    1989-04-01

    In 1983, as a prelude to the monitored retrievable storage (MRS) facility conceptual design, the Pacific Northwest Laboratory (PNL) conducted an evaluation for the US Department of Energy (DOE) that examined alternative concepts for storing spent LWR fuel and high-level wastes from fuel reprocessing. The evaluation considered nine concepts for dry away-from-reactor storage: concrete storage cask, tunnel drywell, concrete cask-in-trench, open-cycle vault, metal casks (transportable and stationary), closed-cycle vault, field drywell, and tunnel-rack vault. The purpose and scope of the re-evaluation did not require a repetition of the expert-based examinations used earlier. Instead, it was based on a more detailed technical review by a small group, focusing on changes that had occurred since the initial evaluation was made. Two additional storage concepts, the water pool and the horizontal modular storage vault (NUHOMS system), were ranked along with the original nine. The original nine concepts and the added two conceptual designs were modified as appropriate for a scenario with storage capacity for 15,000 MTU of spent fuel. Costs, area requirements, and technical and historical data pertaining to MRS storage were updated for each concept.

  9. Overview of UNSCEAR re-evaluation of public exposure

    International Nuclear Information System (INIS)

    Rochedo, Elaine R.R.

    2009-01-01

    The United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) has re-evaluated the levels of public radiation exposure for four broad categories of sources: natural sources of radiation, enhanced exposure to naturally occurring radioactive material (NORM), man-made sources used for peaceful purposes, and man-made sources used for military purposes. Regarding natural radiation sources, recent data confirmed the former results from the 2000 Report, but with a wider range. Very little information is available on public exposure from NORM: most works describe concentration levels, but dose assessments are usually restricted to occupational exposures. The use of source and by-product materials may, however, lead to doses of up to a few millisieverts to members of the public. The nuclear fuel cycle and electric energy generation make very small contributions to public exposure; uranium mining contributes the largest individual doses, mainly due to radon from tailings. The most relevant military use of nuclear energy was the atmospheric nuclear tests, halted in the 1960s. Residual radioactivity deposited worldwide is now responsible for a very small contribution to worldwide exposures, but the tests left a legacy of several contaminated sites. The use of depleted uranium in munitions in Kuwait, Kosovo, Serbia, Montenegro and Bosnia-Herzegovina has led to great public concern, although it is not usually associated with any major consequence for public exposure. Some accidents resulted in environmental contamination and exposure of members of the public; except for the Chernobyl accident, the areas affected were usually small and the exposure restricted to a small number of persons, up to a few hundred, without any significant contribution to worldwide exposures. Exposure to natural sources of radiation is still the major component of worldwide exposure to ionizing radiation, although for some highly developed countries medical exposure has surpassed the ...

  10. Pancreaticoduodenal injuries: re-evaluating current management approaches.

    Science.gov (United States)

    Chinnery, G E; Madiba, T E

    2010-02-01

    Pancreaticoduodenal injuries are uncommon owing to the protected position of the pancreas and duodenum in the retroperitoneum. Management depends on the extent of injury. This study was undertaken to document the outcome of pancreaticoduodenal injuries and to re-evaluate our approach. A prospective study was conducted of all patients treated for pancreaticoduodenal trauma in one surgical ward at King Edward VIII hospital over a 7-year period (1998-2004). Demographic data, clinical presentation, findings at laparotomy and outcome were documented. Prophylactic antibiotics were given at induction of anaesthesia. A total of 488 patients underwent laparotomy over this period, 43 (9%) of whom (all males) had pancreatic and duodenal injuries. Injury mechanisms were gunshot (30), stabbing (10) and blunt trauma (3). Mean age was 30.1 ± 9.6 years. Delay before laparotomy was 12.8 ± 29.1 hours. Seven were admitted in shock. Mean Injury Severity Score (ISS) was 14 ± 8.6. Management of 20 duodenal injuries was primary repair (14), repair and pyloric exclusion (3) and conservative (3). Management of 15 pancreatic injuries was drainage alone (13), conservative management of a pseudocyst (1) and distal pancreatectomy (1). Management of 8 combined pancreaticoduodenal injuries was primary duodenal repair and pancreatic drainage (5) and repair with pyloric exclusion of the duodenal injury plus pancreatic drainage (3). Twenty-one patients (49%) developed complications, and 28 required ICU admission with a median ICU stay of 4 days. Ten patients died (23%). Mean hospital stay was 18.3 ± 24.4 days. The overall mortality was comparable with that in the world literature. We still recommend adequate exploration of the pancreas and duodenum and conservative operative management where possible.

  11. Re-evaluating the treatment of acute optic neuritis.

    Science.gov (United States)

    Bennett, Jeffrey L; Nickerson, Molly; Costello, Fiona; Sergott, Robert C; Calkwood, Jonathan C; Galetta, Steven L; Balcer, Laura J; Markowitz, Clyde E; Vartanian, Timothy; Morrow, Mark; Moster, Mark L; Taylor, Andrew W; Pace, Thaddeus W W; Frohman, Teresa; Frohman, Elliot M

    2015-07-01

    Clinical case reports and prospective trials have demonstrated a reproducible benefit of hypothalamic-pituitary-adrenal (HPA) axis modulation on the rate of recovery from acute inflammatory central nervous system (CNS) demyelination. As a result, corticosteroid preparations and adrenocorticotrophic hormones are the current mainstays of therapy for the treatment of acute optic neuritis (AON) and acute demyelination in multiple sclerosis. Despite facilitating the pace of recovery, HPA axis modulation and corticosteroids have failed to demonstrate long-term benefit on functional recovery. After AON, patients frequently report visual problems, motion perception difficulties and abnormal depth perception despite 'normal' (20/20) vision. In light of this disparity, the efficacy of these and other therapies for acute demyelination requires re-evaluation using modern, high-precision paraclinical tools capable of monitoring tissue injury. In no arena is this more amenable than AON, where a new array of tools in retinal imaging and electrophysiology has advanced our ability to measure the anatomic and functional consequences of optic nerve injury. As a result, AON provides a unique clinical model for evaluating the treatment response of the derivative elements of acute inflammatory CNS injury: demyelination, axonal injury and neuronal degeneration. In this article, we examine current thinking on the mechanisms of immune injury in AON, discuss novel technologies for the assessment of optic nerve structure and function, and assess current and future treatment modalities. The primary aim is to develop a framework for rigorously evaluating interventions in AON and to assess their ability to preserve tissue architecture, re-establish normal physiology and restore optimal neurological function. Published by the BMJ Publishing Group Limited.

  12. Overview of UNSCEAR re-evaluation of occupational exposure

    International Nuclear Information System (INIS)

    Melo, Dunstana

    2008-01-01

    The United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) has re-evaluated the levels of occupational radiation exposure for two broad categories of sources: natural sources of radiation and man-made sources of radiation. The latter includes practices from the nuclear fuel cycle, medical uses of radiation, industrial uses, military activities, and miscellaneous sources. The evaluation was performed on the basis of data provided in response to the UNSCEAR Survey of Occupational Radiation Exposures and data from the literature. In general, the reporting of exposures arising in the commercial nuclear fuel cycle is more complete than that of exposures arising from other uses of radiation. The figures for occupational exposure for the periods 1995-1999 and 2000-2002 have changed compared with the estimates in the UNSCEAR 2000 Report. The collective effective dose resulting from exposures to natural sources (in excess of average levels of natural background) is estimated to be about 37 260 man Sv, about 3 times higher than the value estimated in the UNSCEAR 2000 Report. The worldwide average annual collective effective dose for workers involved in the use of man-made sources of radiation is around 4 730 man Sv, about 2 times higher than the value estimated in the UNSCEAR 2000 Report. Medical uses of radiation contribute about 75% of the collective effective dose; the nuclear fuel cycle contributes about 17%; and industrial uses, military activities and all other categories of worker contribute about 8% of the collective dose from man-made sources. In general, levels of occupational exposure have decreased: average effective doses are decreasing over time for all practices, and collective effective doses have fallen for most practices, except for medical uses, which is now estimated from more realistic data on the number of monitored workers. (author)

  13. Safety re-evaluation of the HOR reactor

    International Nuclear Information System (INIS)

    Verkooijen, A.H.M.; Vries, J.W. de

    2001-01-01

    ... Requirement C16 in the new licence asks for a periodic integral safety re-evaluation of the HOR reactor every 10 years, starting after 2 years.

  14. 75 FR 49930 - Stakeholder Meeting Regarding Re-Evaluation of Currently Approved Total Coliform Analytical Methods

    Science.gov (United States)

    2010-08-16

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9190-2] Stakeholder Meeting Regarding Re-Evaluation of... conferences during which the Agency will have a technical dialogue with stakeholders regarding re-evaluation of currently approved Total Coliform Rule (TCR) analytical methods. At these meetings, stakeholders...

  15. Prostate needle biopsies: interobserver variation and clinical consequences of histopathological re-evaluation

    DEFF Research Database (Denmark)

    Berg, Kasper Drimer; Toft, Birgitte Grønkaer; Røder, Martin Andreas

    2011-01-01

    Histopathological grading of prostate cancer (PCa) is associated with significant interobserver variability. This, as well as the clinical consequences of histopathological re-evaluation, was investigated. In 350 patients, histopathological re-evaluations of prostate biopsies were compared with the primary pathology reports and with the histopathology of the radical prostatectomy specimen, and the consequences of re-evaluation for the clinical workup and treatment of patients according to local algorithms were determined. For Gleason score (GS), complete agreement between the primary report and re-evaluation was found in 76.9% of cases. Cancers were assessed with a higher GS at re-evaluation in 25.0% of patients with primary GS ≤ 6, while scores were devaluated in 3.0% and 10.3% of patients with primary GS = 7 and ≥ 8, respectively. Strategies for clinical evaluation and treatment were changed as a result of the biopsy re-evaluations in 19.7% and 13.1% of patients, respectively. Gleason scoring based on the radical prostatectomy specimen was higher than in both the primary reports and the re-evaluation of biopsies. Although a relatively high degree of concordance was found between biopsy assessments, the significant trend towards higher ...

  17. Acoustic scaling: A re-evaluation of the acoustic model of Manchester Studio 7

    Science.gov (United States)

    Walker, R.

    1984-12-01

    The reasons for the reconstruction and re-evaluation of the acoustic scale model of a large music studio are discussed. The design and construction of the model using mechanical and structural considerations, rather than purely acoustic absorption criteria, are described and the results obtained are given. The results confirm that structural elements within the studio gave rise to unexpected and unwanted low-frequency acoustic absorption. They also show that, at least for the relatively well-understood mechanisms of sound energy absorption, physical modelling of the structural and internal components gives an acoustically accurate scale model, within the usual tolerances of acoustic design. The poor reliability of measurements of acoustic absorption coefficients is well illustrated. The conclusion is reached that such acoustic scale modelling is a valid and, for large-scale projects, financially justifiable technique for predicting fundamental acoustic effects. It is not appropriate for the prediction of fine details, because such small details are unlikely to be reproduced exactly at a different size without extensive measurements of the material's performance at both scales.

  18. Five-Year NRHP Re-Evaluation of Historic Buildings Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ullrich, R A; Heidecker, K R

    2011-09-12

    The Lawrence Livermore National Laboratory (LLNL) 'Draft Programmatic Agreement among the Department of Energy and the California State Historic Preservation Officer Regarding Operation of Lawrence Livermore National Laboratory' requires a review and re-evaluation of the eligibility of laboratory properties for the National Register of Historic Places (NRHP) every five years. The original evaluation was published in 2005; this report serves as the first five-year re-evaluation. This re-evaluation includes consideration of changes within LLNL to management, to mission, and to the built environment. It also determines the status of those buildings, objects, and districts that were recommended as NRHP-eligible in the 2005 report. Buildings that were omitted from the earlier building list, those that have reached 50 years of age since the original assessment, and new buildings are also addressed in the re-evaluation.

  19. Periodic safety re-evaluations in NPPs in EC member states, Finland and Sweden

    International Nuclear Information System (INIS)

    1990-01-01

    The work on periodic safety re-evaluations summarized in this report was performed by a Task Force of the CEC Working Group on the Safety of Thermal Reactors. The periodic safety re-evaluations under review in this study were those that are carried out in addition to other reviews which represent the primary means of safety assurance. The periodic safety re-evaluation is broader and more comprehensive in nature. The cumulative effects of experience (national and international), advances in knowledge and analysis techniques, improvements in safety standards and operating practices, overall effects of plant ageing, and the totality of all modifications over the period in question need to be taken into account. All countries have recognized the value of such periodic reviews, and licensees, either as a regulatory requirement or as a voluntary action, are carrying them out. The scope and contents of each country's review showed many similarities of approach, any differences being explained by the age and type of reactor in operation. Many similarities emerged in the topics selected for re-evaluation and in the approach to re-evaluation itself. The overall conclusion was that while approaches may differ in some respects, for practical purposes comparable levels of safety are achieved in the periodic safety re-evaluation of nuclear power plants

  20. Re-evaluation of the immunological Big Bang.

    Science.gov (United States)

    Flajnik, Martin F

    2014-11-03

    Classically the immunological 'Big Bang' of adaptive immunity was believed to have resulted from the insertion of a transposon into an immunoglobulin superfamily gene member, initiating antigen receptor gene rearrangement via the RAG recombinase in an ancestor of jawed vertebrates. However, the discovery of a second, convergent adaptive immune system in jawless fish, focused on the so-called variable lymphocyte receptors (VLRs), was arguably the most exciting finding of the past decade in immunology and has drastically changed the view of immune origins. The recent report of a new lymphocyte lineage in lampreys, defined by the antigen receptor VLRC, suggests that there were three lymphocyte lineages in the common ancestor of jawless and jawed vertebrates that co-opted different antigen receptor supertypes. The transcriptional control of these lineages during development is predicted to be remarkably similar in both the jawless (agnathan) and jawed (gnathostome) vertebrates, suggesting that an early 'division of labor' among lymphocytes was a driving force in the emergence of adaptive immunity. The recent cartilaginous fish genome project suggests that most effector cytokines and chemokines were also present in these fish, and further studies of the lamprey and hagfish genomes will determine just how explosive the Big Bang actually was. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Re-evaluation of internal exposure from the Chernobyl accident to the Czech population

    International Nuclear Information System (INIS)

    Malatova, I.; Skrkal, J.

    2006-01-01

    Doses to the Czech population from internal and external exposure due to the Chernobyl accident were estimated early in 1986. Later, with more experimental results, doses from internal exposure were calculated more precisely; the initial predictions were rather conservative, leading to higher doses than appeared justified one year later. Monitoring of the environment, the food chain and internal contamination has been performed over the whole territory of the country from 1986 to the present, enabling a re-evaluation of the original estimates and also prediction of future doses. This paper is focused mainly on the evaluation of in vivo measurements of people. Use of the software IMBA Professional Plus led to new estimates of committed effective doses, and the calculated inhalation intakes of radionuclides led to estimates of radionuclide content in the air. Ingestion intakes were also evaluated and compared with estimates from measurements of the food chain. Generally, the doses to the Czech population from the Chernobyl accident were low; however, as a few radionuclides have remained measurable in the environment, the food chain and the human body (¹³⁷Cs up to the present), it is a unique chance for studying the behaviour of radionuclides in the biosphere. The experience and conclusions which follow from the monitoring of the Chernobyl accident are valuable for the running and development of monitoring networks. Re-evaluation of internal doses to the Czech population from the Chernobyl accident, using an alternative approach, gave generally smaller doses than the original estimates, although the difference was not significant. It was shown that the doses from inhalation of ¹³¹I and ¹³⁷Cs were greater than originally estimated, whereas doses from ingestion intake were lower than originally estimated. (authors)

  2. General re-evaluation of the safety on the nuclear ship 'Mutsu' and its repair work

    International Nuclear Information System (INIS)

    1980-01-01

    According to the proposals of the Committee for Investigating the Radiation Leak on Mutsu, work on the general re-evaluation of safety was started after approval by the Committee for Investigating General Re-evaluation and Repair Techniques for Mutsu. The general re-evaluation of safety comprises the inspection of the machines and equipment in the nuclear reactor plant, a review of the design of the nuclear reactor plant, analysis of the behaviour of the nuclear reactor plant in accidents, and related experimental research. This work was carried out over five years, and no problems arose regarding the nuclear reactor; nevertheless, with a view to further improving safety and reliability, it was decided to carry out repair work based on the general re-evaluation of safety. The repair work comprises improvements to the emergency core-cooling system, the safety protection system, the radiation monitoring equipment, the containment vessel boundary, the actuators for the technological safety facilities, the method of controlling secondary water quality, and other items. The progress of the general re-evaluation of safety is reported. (Kako, I.)

  3. [Discussion on development of four diagnostic information scale for clinical re-evaluation of postmarketing herbs].

    Science.gov (United States)

    He, Wei; Xie, Yanming; Wang, Yongyan

    2011-12-01

    Post-marketing re-evaluation of Chinese herbs can well reflect the characteristics of Chinese medicine, yet it is the most easily overlooked part of clinical re-evaluation. Because little attention has been paid to it, research on the corresponding clinical trial design methods has been neglected, making it difficult to improve the effectiveness and safety of traditional Chinese medicine. Therefore, more attention should be paid to clinical trial design methods concerning TCM syndrome in the post-marketing re-evaluation of Chinese herbs, including the type of research program design, scales for collecting Chinese medical (four diagnostic) information, and statistical analysis methods.

  4. Re-evaluation of natural food colours—State of the art

    DEFF Research Database (Denmark)

    Dusemund, B.; Parent-Massin, D.; Mortensen, Alicja

    2011-01-01

    Having started the re-evaluation of food additives in accordance with Commission Regulation (EU) No 257/2010 of 25 March 2010, the Scientific Panel on Food Additives and Nutrient Sources added to Food (ANS Panel) of the European Food Safety Authority (EFSA) identified several complicating issues regarding the re-evaluation of natural food colours, including: (1) the extracts are often made from different natural sources, (2) the extracts can be made using a range of extraction solvents/methods, (3) chemical characterisation of different extracts is usually missing, and (4) detailed specifications ... levels as food additive to exposure resulting from the regular diet can be applied.

  5. U.S. experience in seismic re-evaluation and verification programs

    International Nuclear Information System (INIS)

    Stevenson, J.D.

    1995-01-01

    The purpose of this paper is to present a summary of the development of a seismic re-evaluation program for older nuclear power plants in the U.S. The principal focus of this re-evaluation is the use of actual strong-motion earthquake response data for structures and for mechanical and electrical systems and components, supplemented by generic shake-table test results. Use of this type of seismic re-evaluation has led to major cost reductions compared with more conventional analytical and component-specific testing procedures. (author)

  6. Expanding the net: The re-evaluation of the multidimensional nomogram calculating the upper limit of normal PTH (maxPTH) in the setting of secondary hyperparathyroidism and the development of the MultIdimensional Predictive hyperparaTHyroid model (Mi-PTH).

    Science.gov (United States)

    Rajhbeharrysingh, Uma; El Youssef, Joseph; Leon, Enrique; Lasarev, Michael R; Klein, Robert; Vanek, Chaim; Mattar, Samer; Berber, Eren; Siperstein, Allan; Shindo, Maisie; Milas, Mira

    2016-01-01

    The multidimensional nomogram calculating the upper limit of normal PTH (maxPTH) model identifies a personalized upper limit of normal parathyroid hormone (PTH) and successfully predicts classical primary hyperparathyroidism (PHP). We aimed to assess whether maxPTH can distinguish normocalcemic PHP (NCPHP) from secondary hyperparathyroidism (SHP), including in subjects who underwent bariatric surgery (BrS). A total of 172 subjects with 359 complete datasets of serum calcium (Ca), 25-OH vitamin D, and intact PTH from Oregon were analyzed: 123 subjects (212 datasets) with PHP and 47 (143) with SHP, including 28 (100) with previous BrS. An improved prediction model, MultIdimensional evaluation for Primary hyperparaTHyroidism (Mi-PTH), was created with the same variables as maxPTH by use of a combined cohort (995 subjects) including participants from previous studies. In the Oregon cohort, maxPTH's sensitivity was 100% for classical PHP and 89% for NCPHP, but only 50% for normohormonal PHP (NHPHP), and its specificity for SHP was only 40%. In comparison, although Mi-PTH's sensitivity for NCPHP was similar (89%), it vastly improved specificity for SHP (85%). In the combined cohort, Mi-PTH had better sensitivity (98.5% vs 95%) and specificity (97% vs 85%). MaxPTH was sensitive in detecting PHP; however, its specificity for SHP was low, especially in patients who underwent BrS. The creation of Mi-PTH provided improved performance measures but requires further prospective evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Re-evaluation of the haptoglobin reference values with the radial immunodiffusion technique

    NARCIS (Netherlands)

    Rijn, H.J.M. van; Schreurs, W.H.P.; Schrijver, J.

    1984-01-01

    The reference values of the three main types of serum haptoglobin, Hp 1-1, Hp 2-1, and Hp 2-2, as determined by radial immunodiffusion and with phenotype determination by polyacrylamide gel electrophoresis, have been re-evaluated for both sexes. For that purpose about 500 serum samples were collected.

  8. 12 CFR 560.172 - Re-evaluation of real estate owned.

    Science.gov (United States)

    2010-01-01

    Section 560.172 - Re-evaluation of real estate owned. A savings association shall appraise each parcel of real estate owned at the... under the particular circumstances. The foregoing requirement shall not apply to any parcel of real...

  9. Emotional Dissonance and Burnout: The Moderating Role of Team Reflexivity and Re-Evaluation.

    Science.gov (United States)

    Andela, Marie; Truchot, Didier

    2017-08-01

    The aim of the present study was to better understand the relationship between emotional dissonance and burnout by exploring the buffering effects of re-evaluation and team reflexivity. The study was conducted with a sample of 445 nurses and healthcare assistants from a general hospital. Team reflexivity was evaluated with the validated French version of the team reflexivity scale (Facchin, Tschan, Gurtner, Cohen, & Dupuis, 2006). Burnout was measured with the MBI General Survey (Schaufeli, Leiter, Maslach, & Jackson, 1996). Emotional dissonance and re-evaluation were measured with the scale developed by Andela, Truchot, & Borteyrou (2015). With reference to Rimé's theoretical model (2009), we suggested that the two dimensions of team reflexivity (task and social reflexivity) respond to the two psychological necessities induced by dissonance (cognitive clarification and socio-affective necessities). Firstly, results indicated that emotional dissonance was related to burnout. Secondly, regression analysis confirmed that re-evaluation and social reflexivity buffer the effect of emotional dissonance on emotional exhaustion. Overall, results contribute to the literature by highlighting the moderating effect of re-evaluation and team reflexivity in analysing the relationship between emotional dissonance and burnout. Copyright © 2016 John Wiley & Sons, Ltd.

  10. Economics of the specification 6M safety re-evaluation and regulatory requirements

    International Nuclear Information System (INIS)

    Hopper, C.M.

    1985-01-01

    The objective of this work was to examine the potential economic impact of the DOT Specification 6M criticality safety re-evaluation and regulatory requirements. The examination was based upon comparative analyses of current authorized fissile material load limits for the 6M, current Federal regulations (and interpretations) limiting the contents of Type B fissile material packages, limiting aggregates of fissile material packages, and recent proposed fissile material mass limits derived from specialized criticality safety analyses of the 6M package. The work examines influences on cost in transportation, handling, and storage of fissile materials. Depending upon facility throughput requirements (and assumed incremental costs of fissile material packaging, storage, and transport), operating, storage, and transportation costs can be reduced significantly. As an example of the pricing algorithm application based upon reasonable cost influences, the magnitude of the first year cost reductions could extend beyond four times the cost of the packaging nuclear criticality safety re-evaluation. 1 tab

  11. Technical guidelines for the seismic safety re-evaluation at Eastern European NPPs

    International Nuclear Information System (INIS)

    Godoy, A.R.; Guerpinar, A.

    2001-01-01

    The paper describes one of the outcomes of the Engineering Safety Review Services (ESRS) that the IAEA provides as an element of the Agency's national, regional and interregional technical assistance and co-operation programmes and other extrabudgetary programmes to assess the safety of nuclear facilities. This refers to the establishment of detailed guidelines for conducting the seismic safety re-evaluation of existing nuclear power plants in Eastern European countries in line with updated criteria and current international practice. (author)

  12. Orphan caribou, Rangifer tarandus, calves: A re-evaluation of overwinter survival data

    Science.gov (United States)

    Joly, Kyle

    2000-01-01

    Low sample size and high variation within populations reduce power of statistical tests. These aspects of statistical power appear to have affected an analysis comparing overwinter survival rates of non-orphan and orphan Caribou (Rangifer tarandus) calves by an earlier study for the Porcupine Caribou Herd. A re-evaluation of the data revealed that conclusions about a lack of significant difference in the overwinter survival rates between orphan and non-orphan calves were premature.

  13. ACCOUNTING AND TAX TREATMENT OF THE RE-EVALUATION OF THE TANGIBLE ASSETS

    Directory of Open Access Journals (Sweden)

    Daniela CRETU

    2013-01-01

    The methods of patrimonial evaluation are recognised on a large scale by specialists in Continental Europe, while specialists in North America largely ignore them, considering as a realistic economic value the one that results from updating the forecast cash flows. The Romanian financial school does not at present state a basic orientation towards either the continental or the American opinion. In general, it can be observed that Romanian authors specialised in the accounting domain favour the patrimonial methods, while those in the financial domain favour the financial and stock-based methods. According to the International Standards for business evaluation, the "asset based approach is the way to estimate the value of a business and/or the participations to it, using methods based on the market value of the individual assets of the business, decreasing its debts". The entities can proceed to the re-evaluation of the tangible assets that exist at the end of the financial exercise, so that they are presented at their true value in accounting, reflecting the results of this re-evaluation in the financial reports made for that exercise. In this context, the present paper analyses the accounting and tax treatment foreseen by the accounting regulations, in accordance with the European directives, and the procedures of evaluation and re-evaluation of the tangible assets.

  14. Seismic re-evaluation of the Tarapur atomic power plants 1 and 2

    International Nuclear Information System (INIS)

    Ingole, S.M.; Kumar, B.S.; Gupta, S.; Singh, U.P.; Giridhar, K.; Bhawsar, S.D.; Samota, A.; Chhatre, A.G.; Dixit, K.B.; Bhardwaj, S.A.

    2004-01-01

    Two Boiling Water Reactors (BWR) of 210 MWe each at Tarapur Atomic Power Station, Units-1 and 2 (TAPS-1-2), were commissioned in the year 1969. The safety related civil structures at TAPS had been designed for a seismic coefficient of 0.2 g and other structures for 0.1 g. The work of seismic re-evaluation of TAPS-1-2 was taken up in the year 2002. As two new Pressurized Heavy Water Reactor (PHWR) plants of 540 MWe each, Tarapur Atomic Power Project Units-3 and 4 (TAPP-3-4), are coming up in the vicinity of TAPS-1-2, detailed geological and seismological studies of the area around TAPS-1-2 are available. The same free-field ground motion as generated for TAPP-3-4 has been used for TAPS-1-2. The seismic re-evaluation of the plant has been performed as per the procedure given in the IAEA Safety Reports Series entitled 'Seismic Evaluation of Existing Nuclear Power Plants', meeting the various codes and standards, viz., ASME, ASCE and IEEE standards. The Safety Systems (SS) and Safety Support Systems (SSS) are qualified by adopting detailed analysis and testing methods. The equipment in the SS and SSS has been qualified by conducting a walk-down as per the procedure given in the Generic Implementation Procedure, Dept. of Energy (GIP-DOE), USA. The safety systems include the systems required for safe shutdown of the plant, one chain of decay heat removal, and containment of activity. The safety support systems, viz., Electrical and Instrumentation and Control, and systems other than SS and SSS, have been qualified by limited analysis, testing and mostly by following the walk-down procedure. The paper brings out the details of the work accomplished during seismic re-evaluation of the two units of BWR at Tarapur. (authors)

  15. Re-evaluation of emergency planning zone for 3 NPPS in Taiwan

    International Nuclear Information System (INIS)

    Chiou, S.-T.; Yin, H.-L.; Chen, C.-S.; Shih, C.-L.

    2004-01-01

    The emergency planning zones for the three nuclear power plants in Taiwan are re-evaluated. The analysis is performed by the CRAC2 code and the basic approach follows the NUREG-0396 evaluation procedure. Meteorological data are provided by Taiwan Power Company and reviewed by Taiwan University and the Central Weather Bureau. Accident source terms are provided by the Institute of Nuclear Energy Research (INER) using probabilistic risk assessment methods, with consideration of actual plant system improvements and/or modifications. The dose rate distribution and acute and latent cancer fatalities are evaluated and compared with proposed EPZ decision criteria, including protective action guide dose levels and individual and societal risk safety goals. (author)

  16. Scientific Opinion on the re-evaluation of carnauba wax (E 903) as a food additive

    OpenAIRE

    EFSA Panel on Food Additives and Nutrient Sources added to Food (ANS)

    2012-01-01

    The Panel on Food Additives and Nutrient Sources added to Food (ANS) delivers a scientific opinion re-evaluating the safety of carnauba wax (E 903). Carnauba wax (E 903) is authorised in the EU as a food additive for use as a glazing agent. It has been evaluated by the Scientific Committee on Food (SCF) and by the Joint FAO/WHO Expert Committee on Food Additives (JECFA), which allocated an Acceptable Daily Intake (ADI) of 7 mg/kg bw/day. The SCF did not establish an ADI but considered the use of ca...

  17. A re-evaluation of Scinaia (Nemaliales, Rhodophyta) in the Azores

    Science.gov (United States)

    León-Cisneros, K.; Riosmena-Rodríguez, R.; Neto, A. I.

    2011-06-01

    The genus Scinaia in the Azores is re-evaluated based on historical and recent collections. A combination of morphological and anatomical diagnostic characters was used for species segregation, and a key for Azorean species determination is presented. Anatomical information associated with hair development is described for the first time for the genus. The occurrence of S. furcellata and S. interrupta is confirmed for the archipelago. The presence of S. acuta is reported for the first time in the Azores, representing a spread from Australia to the N-Atlantic and specifically into the Macaronesian region. Its occurrence in the archipelago and the Canaries is discussed as a possible introduction.

  18. [Research about re-evaluation of screening of traditonal Chinese medicine symptoms item of post-marketing medicine Xuezhikang].

    Science.gov (United States)

    He, Wei; Xie, Yanming; Wang, Yongyan

    2011-10-01

    The purpose of post-marketing re-evaluation of Chinese medicine is to identify the clinical indications of a Chinese medicine, and designing scientific and rational Chinese medicine symptom items is important to the result of symptom re-evaluation. This study takes the screening of traditional Chinese medicine (TCM) symptom items for the post-marketing re-evaluation of Xuezhikang as an example: with reference to dyslipidemia clinical research principles, academic dissertations, the Xuezhikang directions for use, clinical experts' practical experience, etc., the symptom names were standardized and 41 common dyslipidemia symptoms were screened. Furthermore, this paper discusses the principles followed and the matters needing attention when screening symptom items, so as to provide a research thread for constructing a PRO chart for post-marketing medicine re-evaluation.

  19. Potential influence of new doses of A-bomb after re-evaluation of epidemiological research

    International Nuclear Information System (INIS)

    Maruyama, T.

    1983-01-01

    Since the peaceful use of atomic energy appears essential for future human existence, we must provide risk estimates from low-dose exposures to human beings. The largest body of human data has been derived from the studies of atomic bomb survivors in Hiroshima and Nagasaki. Recently, it was proposed by an Oak Ridge National Laboratory group that the current free-in-air doses of atomic bombs are significantly different from the doses recalculated on the basis of the new output spectra of neutrons and gamma rays from the atomic bombs which were declassified by the US Department of Energy in 1976. A joint commission on dose re-evaluation of the United States of America and Japan was established in 1981 to pursue the dose reassessment programme between US and Japanese research groups and to decide an agreed best estimate of organ or tissue doses in survivors as soon as possible. The paper reviews the physical concepts of the re-evaluation of atomic bomb doses and discusses the potential influence of new dosimetric parameters on the epidemiological studies of the atomic bomb survivors in future, although the re-assessment programme is still in progress. (author)

  20. Re-evaluating Traditional Predictors of Incoming Knowledge in Astronomy 101 and Implications for Course Revitalization

    Science.gov (United States)

    Berryhill, K. J.; Slater, T. F.; Slater, S. J.; Harbour, C.; Forrester, J. H.

    2016-12-01

    A wide range of incoming knowledge is seen in students taking introductory astronomy courses. Using the Test Of Astronomy STandards (TOAST) as a pre-course measure of incoming knowledge, an evaluation was completed to discover any explanation for this variation. It would be reasonable to suggest that this could result from the variety we see in students' motivation, self-efficacy, general scholastic achievement, their high school science experience, or even whether one or more of their parents is in a STEM field. In this re-evaluation, there was no correlation seen between the above and the students' pre-test scores. Instead, the only predictor of pre-test scores was students' exposure to astronomy through informal learning opportunities. This leads to important implications for faculty revitalizing their courses to improve student learning.

  1. IARC 1987 re-evaluation of carcinogenicity of mineral oils and bitumens - CONCAWE comments and interpretation

    Energy Technology Data Exchange (ETDEWEB)

    1988-10-01

    Following the IARC response to CONCAWE comments on their 1987 re-evaluation of carcinogenicity of mineral oils and bitumens, CONCAWE decided that a revised version of Report No. 87/63 should be published in this report. The objective is to ensure that all who may be concerned with carcinogenicity classifications of these petroleum products, including national and international regulatory authorities, are aware of the reasons for the differences between the 1984 and 1987 IARC classifications and of CONCAWE's reservations concerning the revised classifications. This document reviews important differences between the new and previous evaluations, which are summarized in tabular form, and also indicates how the differences have occurred. In addition, it provides CONCAWE's interpretation of the available evidence, taking into account points discussed with IARC. 1 tab.

  2. Generic results and conclusions of re-evaluating the flooding protection in French Nuclear Power Plants

    International Nuclear Information System (INIS)

    Vial, E.; Rebour, V.; Mattei, J.; Gorbatchev, A.

    2002-01-01

    The partial flooding of the Blayais site, which occurred in December 1999, led to a large-scale re-examination of the measures to prevent and limit the consequences associated with all contingencies, or combinations of them, which could lead to external flooding of any of the 19 French sites equipped with pressurized water reactors. An action program has been launched by Electricite de France and a methodology has been approved, consisting of: defining principles for re-evaluating external flooding risks together with the relevant arrangements; and applying the principles to each site and showing that the margins adopted are sufficient for achieving an acceptable safety level. The implementation of the program throughout all PWR sites in France will extend to 2005

  3. A re-evaluation of subspecific variation and canine dimorphism in woolly spider monkeys (Brachyteles arachnoides).

    Science.gov (United States)

    Leigh, S R; Jungers, W L

    1994-12-01

    A recent study suggests that differing populations of woolly spider monkeys exhibit a substantial degree of morphological, cytogenetic, and behavioral variation. We re-evaluate the differences between populations in the degree of canine tooth height sexual dimorphism and in the frequency of thumbs. Statistical analysis of variation in the degree of canine sexual dimorphism between these populations fails to provide strong evidence for subspecific variation: differences in the degree of canine dimorphism cannot be considered statistically significant. Differences between populations in the frequency of thumbs are, however, statistically significant. The lack of clear distinctions between populations in the degree of canine dimorphism complicates assessments of behavioral variation between these populations. We suggest that the level of geographic variation in woolly spider monkey canine dimorphism is not consistent with subspecific status.

  4. General misincorporation frequency: Re-evaluation of the fidelity of DNA polymerases.

    Science.gov (United States)

    Yang, Jie; Li, Bianbian; Liu, Xiaoying; Tang, Hong; Zhuang, Xiyao; Yang, Mingqi; Xu, Ying; Zhang, Huidong; Yang, Chun

    2018-02-19

    DNA replication in cells is performed in the presence of four dNTPs and four rNTPs. In this study, we re-evaluated the fidelity of DNA polymerases using the general misincorporation frequency consisting of three incorrect dNTPs and four rNTPs but not using the traditional special misincorporation frequency with only the three incorrect dNTPs. We analyzed both the general and special misincorporation frequencies of nucleotide incorporation opposite dG, rG, or 8-oxoG by Pseudomonas aeruginosa phage 1 (PaP1) DNA polymerase Gp90 or Sulfolobus solfataricus DNA polymerase Dpo4. Both misincorporation frequencies of other DNA polymerases published were also summarized and analyzed. The general misincorporation frequency is obviously higher than the special misincorporation frequency for many DNA polymerases, indicating the real fidelity of a DNA polymerase should be evaluated using the general misincorporation frequency. Copyright © 2018 Elsevier Inc. All rights reserved.
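
    The distinction the authors draw reduces to which error channels are summed before dividing by the efficiency of correct incorporation. A minimal sketch, with invented incorporation efficiencies rather than the paper's measured values:

      # Hypothetical single-nucleotide incorporation efficiencies opposite a
      # template dG (arbitrary units); all numbers are invented, not the paper's.
      correct_eff = 1.0                                   # dCTP (correct)
      incorrect_dntps = {"dATP": 1e-4, "dGTP": 5e-5, "dTTP": 2e-4}
      rntps = {"rATP": 3e-4, "rCTP": 1e-3, "rGTP": 1e-4, "rUTP": 2e-4}

      # Traditional "special" frequency: the three incorrect dNTPs only.
      special = sum(incorrect_dntps.values()) / correct_eff
      # "General" frequency: three incorrect dNTPs plus the four rNTPs.
      general = (sum(incorrect_dntps.values()) + sum(rntps.values())) / correct_eff

      print(f"special = {special:.2e}, general = {general:.2e}")
      # general >= special by construction: ignoring rNTP misincorporation
      # overstates the apparent fidelity of a polymerase.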

  5. Activities concerning a re-evaluation of gamma-ray buildup factors in Japan

    International Nuclear Information System (INIS)

    Hirayama, Hideo

    2000-01-01

    Research related to gamma-ray buildup factors in Japan is continuing to improve in accuracy and usefulness after the publication of new standard buildup factors as NUREG/CR-5740. Buildup factors for homogeneous materials were studied by three different calculation methods. Several improvements were made to calculate buildup factors up to 40 mfp for various materials over a wide energy range in each code. Systematic data production of buildup factors for multilayer materials was performed by using the EGS4 Monte Carlo code, and the results were used to improve the fitting formula. These research activities related to gamma-ray buildup factors performed in Japan are presented together with discussions concerning re-evaluation of buildup factors. (author)

  6. Diagnostic performance and useful findings of ultrasound re-evaluation for patients with equivocal CT features of acute appendicitis.

    Science.gov (United States)

    Kim, Mi Sung; Kwon, Heon-Ju; Kang, Kyung A; Do, In-Gu; Park, Hee-Jin; Kim, Eun Young; Hong, Hyun Pyo; Choi, Yoon Jung; Kim, Young Hwan

    2018-02-01

    This study evaluated the diagnostic performance of ultrasound and determined which ultrasound findings are useful to differentiate appendicitis from non-appendicitis in patients who underwent ultrasound re-evaluation owing to equivocal CT features of acute appendicitis. A total of 62 patients who underwent CT examinations for suspected appendicitis followed by ultrasound re-evaluation owing to equivocal CT findings were included. Equivocal CT findings were considered based on the presence of only one or two findings among the CT criteria, and ultrasound re-evaluation was done based on a predefined structured report form. The diagnostic performance of ultrasound and independent variables to discriminate appendicitis from non-appendicitis were assessed. There were 27 patients in the appendicitis group. The overall diagnostic performance of ultrasound re-evaluation was sensitivity of 96.3%, specificity of 91.2% and accuracy of 91.9%. In terms of the performance of individual ultrasound findings, probe-induced tenderness showed the highest accuracy (86.7%) with sensitivity of 74% and specificity of 97%, followed by non-compressibility (accuracy 71.7%, sensitivity 85.2% and specificity 60.6%). The independent ultrasound findings for discriminating appendicitis were non-compressibility (p = 0.002) and increased flow on the appendiceal wall (p = 0.001). Ultrasound re-evaluation can be used to improve diagnostic accuracy in cases with equivocal CT features for diagnosing appendicitis. The presence of non-compressibility and increased vascular flow on the appendix wall are useful ultrasound findings to discriminate appendicitis from non-appendicitis. Advances in knowledge: Ultrasound re-evaluation is useful to discriminate appendicitis from non-appendicitis when CT features are inconclusive.

  7. Threshold of toxicological concern values for non-genotoxic effects in industrial chemicals: re-evaluation of the Cramer classification.

    Science.gov (United States)

    Kalkhof, H; Herzler, M; Stahlmann, R; Gundert-Remy, U

    2012-01-01

    The TTC concept employs available data from animal testing to derive a distribution of NOAELs. Taking a probabilistic view, the 5th percentile of the distribution is taken as a threshold value for toxicity. In this paper, we use 824 NOAELs from repeated dose toxicity studies of industrial chemicals to re-evaluate the currently employed TTC values, which have been derived for substances grouped according to the Cramer scheme (Cramer et al. in Food Cosm Toxicol 16:255-276, 1978) by Munro et al. (Food Chem Toxicol 34:829-867, 1996) and refined by Kroes and Kozianowski (Toxicol Lett 127:43-46, 2002) and Kroes et al. (2000). In our data set, consisting of 756 NOAELs from 28-day repeated dose testing and 57 NOAELs from 90-day repeated dose testing, the experimental NOAELs had to be extrapolated to chronic TTC values using regulatory accepted extrapolation factors. The TTC values derived from our data set were higher than the currently used TTC values, confirming the safety of the latter. We analysed the predictions of the Cramer classification by comparing the classification by this tool with the guidance values for classification according to the Globally Harmonised System of classification and labelling of the United Nations (GHS). Nearly 90% of the chemicals were in Cramer class 3 and thus assumed to be highly toxic, compared with 22% according to the GHS. The Cramer classification underestimates the toxicity of chemicals in only 4.6% of cases. Hence, from a regulatory perspective, the Cramer classification scheme might be applied, as it overestimates the hazard of a chemical.
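
    The probabilistic step described above (fit a NOAEL distribution, extrapolate to chronic exposure, read off the 5th percentile) can be sketched as follows; the distribution parameters and the factor of 6 are illustrative assumptions, not the paper's values:

      import numpy as np

      rng = np.random.default_rng(0)
      # Invented stand-in for the 756 NOAELs from 28-day studies (mg/kg bw/day).
      noael_28d = rng.lognormal(mean=np.log(50.0), sigma=1.5, size=756)
      # Illustrative subchronic-to-chronic extrapolation factor (the paper uses
      # "regulatory accepted" factors; 6 is a common default, not their value).
      noael_chronic = noael_28d / 6.0
      # TTC-style threshold: the 5th percentile of the extrapolated distribution.
      print(f"5th percentile: {np.percentile(noael_chronic, 5):.3g} mg/kg bw/day")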

  8. Seismic re-evaluation of piping systems of heavy water plant, Kota

    International Nuclear Information System (INIS)

    Mishra, Rajesh; Soni, R.S.; Kushwaha, H.S.; Venkat Raj, V.

    2002-05-01

    Heavy Water Plant, Kota is the first indigenous heavy water plant built in India. The plant started operation in the year 1985 and it is approaching the completion of its originally stipulated design life. In view of the excellent record of plant operation for the past so many years, it has been planned to carry out various exercises for the life extension of the plant. In the first stage, evaluation of operation stresses was carried out for the process critical piping layouts and equipment, which are connected with 25 process critical nozzle locations, identified based on past history of the plant performance. Fatigue life evaluation has been carried out to find out the Cumulative Usage Factor, which helps in arriving at a decision regarding the life extension of the plant. The results of these exercises have already been reported separately vide BARC/2001/E/004. In the second stage, seismic re-evaluation of the plant has been carried out to assess its ability to maintain its integrity in case of a seismic event. The aim of this exercise is to assess the effects of the maximum probable earthquake at the plant site on the various systems and components of the plant. This exercise is further aimed at ensuring the adequacy of seismic supports to maintain the integrity of the system in case of a seismic event and to suggest some retrofitting measures, if required. Seismic re-evaluation of the piping of Heavy Water Plant, Kota has been performed taking into account the interaction effects from the connected equipment. Each layout has been qualified using the latest provisions of ASME Code Section III, Subsection ND wherein the earthquake loading has been considered as a reversing dynamic load. The maximum combined stresses for all the layouts due to pressure, weight and seismic loadings have been found to be well within the code allowable limit. Therefore, it has been concluded that during a maximum probable seismic event, the possibility of pipe rupture can be safely...
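
    The combined-stress check referred to above has, for straight pipe, the general ASME Section III form S = B1*P*Do/(2*tn) + B2*M/Z, compared against a code allowable. The sketch below is a generic illustration with invented stress indices and loads, not the plant's actual qualification calculation:

      import math

      def pipe_z_mm3(d_out: float, t: float) -> float:
          # Section modulus of a hollow circular pipe (mm^3).
          d_in = d_out - 2.0 * t
          return math.pi * (d_out**4 - d_in**4) / (32.0 * d_out)

      def primary_stress_mpa(p, d_out, t, m, b1=0.5, b2=1.0):
          # ASME III-style primary stress for straight pipe:
          # S = B1*P*Do/(2*tn) + B2*M/Z. Stress indices and inputs are invented.
          return b1 * p * d_out / (2.0 * t) + b2 * m / pipe_z_mm3(d_out, t)

      # Hypothetical 4-inch line: pressure plus deadweight-plus-seismic moment.
      s = primary_stress_mpa(p=1.0, d_out=114.3, t=6.0, m=2.0e6)
      print(f"S = {s:.1f} MPa; compare against the code allowable (e.g. 3*Sm)")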

  9. Dust Destruction in the ISM: A Re-Evaluation of Dust Lifetimes

    Science.gov (United States)

    Jones, A. P.; Nuth, J. A., III

    2011-01-01

    There is a long-standing conundrum in interstellar dust studies relating to the discrepancy between the time-scales for dust formation from evolved stars and the apparently more rapid destruction in supernova-generated shock waves. Aims. We re-examine some of the key issues relating to dust evolution and processing in the interstellar medium. Methods. We use recent and new constraints from observations, experiments, modelling and theory to re-evaluate dust formation in the interstellar medium (ISM). Results. We find that the discrepancy between the dust formation and destruction time-scales may not be as significant as has previously been assumed because of the very large uncertainties involved. Conclusions. The derived silicate dust lifetime could be compatible with its injection time-scale, given the inherent uncertainties in the dust lifetime calculation. The apparent need to re-form significant quantities of silicate dust in the tenuous interstellar medium may therefore not be a strong requirement. Carbonaceous matter, on the other hand, appears to be rapidly recycled in the ISM and, in contrast to silicates, there are viable mechanisms for its re-formation in the ISM.

  10. Re-evaluation and updating of the seismic hazard of Lebanon

    Science.gov (United States)

    Huijer, Carla; Harajli, Mohamed; Sadek, Salah

    2016-01-01

    This paper presents the results of a study undertaken to evaluate the implications of the newly mapped offshore Mount Lebanon Thrust (MLT) fault system on the seismic hazard of Lebanon and the current seismic zoning and design parameters used by the local engineering community. This re-evaluation is critical, given that the MLT is located in close proximity to the major cities and economic centers of the country. The updated seismic hazard was assessed using probabilistic methods of analysis. The potential sources of seismic activity that affect Lebanon were integrated, along with any and all newly established characteristics, within an updated database which includes the newly mapped fault system. The earthquake recurrence relationships of these sources were developed from instrumental seismology data, historical records, and earlier studies undertaken to evaluate the seismic hazard of neighboring countries. Maps of peak ground acceleration contours, based on 10% probability of exceedance in 50 years (as per Uniform Building Code (UBC) 1997), as well as 0.2 and 1 s peak spectral acceleration contours, based on 2% probability of exceedance in 50 years (as per International Building Code (IBC) 2012), were also developed. Finally, spectral charts for the main coastal cities of Beirut, Tripoli, Jounieh, Byblos, Saida, and Tyre are provided for use by designers.
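
    The two hazard levels quoted map onto standard return periods under the usual Poisson occurrence assumption, T = -t/ln(1-p); a quick check in Python:

      import math

      def return_period(p_exceed: float, t_years: float) -> float:
          # Mean return period for exceedance probability p in t years,
          # assuming Poisson (stationary) earthquake occurrence.
          return -t_years / math.log(1.0 - p_exceed)

      print(f"{return_period(0.10, 50):.0f} yr")   # ~475 yr (UBC 1997 level)
      print(f"{return_period(0.02, 50):.0f} yr")   # ~2475 yr (IBC 2012 level)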

  11. Recruitment processes in Baltic sprat - A re-evaluation of GLOBEC Germany hypotheses

    DEFF Research Database (Denmark)

    Voss, Rudiger; Peck, M.A.; Hinrichsen, H.-H.

    2012-01-01

    The GLOBEC Germany program (2002-2007) had the ambitious goal to resolve the processes impacting the recruitment dynamics of Baltic sprat (Sprattus sprattus L.) by examining various factors affecting early life history stages. At the start of the research program, a number of general recruitment hypotheses were formulated... The present study synthesizes the results of field sampling (2002 and 2003), laboratory experiments, and modeling studies to re-evaluate these hypotheses for the Baltic sprat stock. Recruitment success was quite different in the 2 years investigated. Despite a lower spawning stock biomass in 2003, the total number of recruits was almost 2-fold higher that year compared to 2002. The higher recruitment success in 2003 could be attributed to enhanced survival success during the post-larval/juvenile stage, a life phase that appears to be critical for recruitment dynamics. In the state of the Baltic ecosystem during the period of investigation, we consider bottom-up control (e.g. temperature, prey abundance...

  12. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.

    1999-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound master curve has the same inherent degree of safety as originally intended for the KIC reference curve. Similarly, the 1% lower bound master curve corresponds to the KIR reference curve. (orig.)

  13. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.; Rintamaa, R.

    1998-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the KIC reference curve. Similarly, the 1% lower bound Master curve corresponds to the KIR reference curve. (orig.)
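
    The two listings above refer to the master-curve fractile expression standardized in ASTM E1921 for 1T-size specimens. A small sketch of how the median and the 5% and 1% lower bound curves discussed in these abstracts are computed; the reference temperature T0 is an arbitrary example value, so the numbers are illustrative only:

      import math

      def kjc(t_c: float, t0_c: float, fractile: float) -> float:
          # ASTM E1921 master-curve fractile (MPa*sqrt(m), 1T specimens):
          # K_Jc(P) = 20 + [ln(1/(1-P))]^(1/4) * (11 + 77*exp(0.019*(T - T0))).
          scale = 11.0 + 77.0 * math.exp(0.019 * (t_c - t0_c))
          return 20.0 + math.log(1.0 / (1.0 - fractile)) ** 0.25 * scale

      T0 = -50.0  # illustrative reference temperature (deg C), not from the papers
      for T in (-100.0, -50.0, 0.0):
          print(f"T={T:6.1f} C  median={kjc(T, T0, 0.50):6.1f}  "
                f"5%={kjc(T, T0, 0.05):6.1f}  1%={kjc(T, T0, 0.01):6.1f}")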

  14. Re-evaluation of broiler carcass scalding protocols for impact on the recovery of Campylobacter from breast skin after defeathering

    Science.gov (United States)

    This research re-evaluated the impact of scalding protocols on the recovery of Campylobacter from breast skin following defeathering after preliminary processing trials detected Campylobacter from breast skin for 4/8 carcasses that had vents plugged and sutured prior to scalding. Published research...

  15. Re-evaluating the relevance of José Martí

    Directory of Open Access Journals (Sweden)

    Lillian Guerra

    2001-01-01

    [First paragraph] José Marti's "Our America": From National to Hemispheric Cultural Studies. JEFFREY BELNAP & RAÚL FERNANDEZ (eds.). Durham NC: Duke University Press, 1998. viii + 344 pp. (Cloth US$ 49.95, Paper US$ 17.95) Re-Reading José Marti (1853-1895): One Hundred Years Later. JULIO RODRÍGUEZ-LUIS (ed.). Albany: State University of New York Press, 1999. xxiii + 158 pp. (Paper US$ 16.95) José Marti Reader: Writings on the Americas. DEBORAH SHNOOKAL & MIRTA MUNIZ (eds.). Melbourne: Ocean Press, 1999. xiii + 276 pp. (Paper US$ 19.95) Generated by the tide of commemorations in 1995 that marked the one-hundredth anniversary of the death of José Marti, a number of scholarly volumes have washed into the mainstream of debates on the origins and contradictions of identities now taking place across disciplines and geographic regions. A new generation of Latin Americanists, specialists of U.S. history, politics, and literature, as well as scholars in the ever-widening domain of cultural studies, have published works on José Marti's writings and life experiences. Taken together, they represent an unprecedented re-evaluation and reinterpretation of José Marti as both a man and a myth. In general, the methods, approaches, and conceptual points of reference on which these new works rely make for an exciting and unique set of arguments about who Marti was, why he mattered and what his writings tell us about the time in which he lived and the multiple societies of which he formed a part.

  16. Water as a contrast medium: a re-evaluation using the multidetector-row computed tomography.

    Science.gov (United States)

    Makarawo, Tafadzwa P; Negussie, Edsa; Malde, Sachit; Tilak, Jacqueline; Gayagoy, Jennifer; Watson, Jenna; Francis, Faiz; Lincoln, Denis; Jacobs, Michael J

    2013-07-01

    Water as an intraluminal negative contrast medium produces improved image quality with reduced artefact. However, rapid absorption of oral water in the bowel relative to the speed and timing of image capture has limited its clinical application. These findings predate advances in multidetector-row computed tomography (CT). To re-evaluate differences in image quality, we studied image clarity and luminal distention in the same group of patients, each of whom received both a pancreas protocol CT (PPCT), which uses oral water, and a conventional positive oral contrast scan. We reviewed 66 patients who had previously undergone both a PPCT and an oral contrast abdominal CT. CT images were independently reviewed by two board-certified radiologists who scored degree of hollow viscus distention and visualization of mural detail using a Likert 5-point scale. Results were evaluated by using the Wilcoxon signed-rank test. Student's t test was applied to evaluate the differences in radiation dosage and Spearman's correlational test was used to evaluate interrater correlation between the radiologists. In comparing the mean radiation dosage, there was no statistical difference between the two protocols, and there was good interrater association with ratios of 0.595 and 0.51 achieved for the PPCT and conventional oral scan, respectively. The Wilcoxon signed-rank test showed statistical differences in the stomach (P ...), with water as a contrast medium causing better or equal distention in the bowel and better or equal clarity than routine barium contrast. This calls for a need to reconsider the use of water as a contrast medium in clinical practice.
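
    The paired comparison described (the same patients scored under both protocols on a 5-point Likert scale) maps directly onto a Wilcoxon signed-rank test; a sketch with invented scores, not the study's data:

      import numpy as np
      from scipy.stats import wilcoxon

      # Invented 5-point Likert distention scores for the same ten patients
      # under the water-based (PPCT) and positive-oral-contrast protocols.
      water = np.array([5, 4, 5, 4, 4, 5, 3, 4, 5, 4])
      positive = np.array([3, 4, 4, 3, 4, 4, 3, 3, 4, 3])

      stat, p = wilcoxon(water, positive)
      print(f"Wilcoxon signed-rank: statistic={stat}, p={p:.3f}")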

  17. The carcinogenic effects of aspartame: The urgent need for regulatory re-evaluation.

    Science.gov (United States)

    Soffritti, Morando; Padovani, Michela; Tibaldi, Eva; Falcioni, Laura; Manservisi, Fabiana; Belpoggi, Fiorella

    2014-04-01

    Aspartame (APM) is an artificial sweetener used since the 1980s, now present in >6,000 products, including over 500 pharmaceuticals. Since its discovery in 1965, and its first approval by the US Food and Drug Administration (FDA) in 1981, the safety of APM, and in particular its carcinogenic potential, has been controversial. The present commentary reviews the adequacy of the design and conduct of carcinogenicity bioassays on rodents submitted by G.D. Searle, in the 1970s, to the FDA for market approval. We also review how experimental and epidemiological data on the carcinogenic risks of APM, which became available in 2005, motivated the European Commission (EC) to call upon the European Food Safety Authority (EFSA) for urgent re-examination of the available scientific documentation (including the Searle studies). The EC has further requested that, if the results of the evaluation should suggest carcinogenicity, major changes must be made to the current APM-specific regulations. Taken together, the studies performed by G.D. Searle in the 1970s and other chronic bioassays do not provide adequate scientific support for APM safety. In contrast, recent results of life-span carcinogenicity bioassays on rats and mice published in peer-reviewed journals, and a prospective epidemiological study, provide consistent evidence of APM's carcinogenic potential. On the basis of the evidence of the potential carcinogenic effects of APM herein reported, a re-evaluation of the current position of international regulatory agencies must be considered an urgent matter of public health. © 2014 Wiley Periodicals, Inc.

  18. Accidents and undetermined deaths: re-evaluation of nationwide samples from the Scandinavian countries.

    Science.gov (United States)

    Tøllefsen, Ingvild Maria; Thiblin, Ingemar; Helweg-Larsen, Karin; Hem, Erlend; Kastrup, Marianne; Nyberg, Ullakarin; Rogde, Sidsel; Zahl, Per-Henrik; Østevold, Gunvor; Ekeberg, Øivind

    2016-05-27

    National mortality statistics should be comparable between countries that use the World Health Organization's International Classification of Diseases. Distinguishing between manners of death, especially suicides and accidents, is a challenge. Knowledge about accidents is important in prevention of both accidents and suicides. The aim of the present study was to assess the reliability of classifying deaths as accidents and undetermined manner of deaths in the three Scandinavian countries and to compare cross-national differences. The cause of death registers in Norway, Sweden and Denmark provided data from 2008 for samples of 600 deaths from each country, of which 200 were registered as suicides, 200 as accidents or undetermined manner of deaths and 200 as natural deaths. The information given to the eight experts was identical to the information used by the Cause of Death Register. This included death certificates, and if available external post-mortem examinations, forensic autopsy reports and police reports. In total, 69 % (Sweden and Norway) and 78 % (Denmark) of deaths registered in the official mortality statistics as accidents were confirmed by the experts. In the majority of the cases where disagreement was seen, the experts reclassified accidents to undetermined manner of death, in 26, 25 and 19 % of cases, respectively. Few cases were reclassified as suicides or natural deaths. Among the extracted accidents, the experts agreed least with the official mortality statistics concerning drowning and poisoning accidents. They also reported most uncertainty in these categories of accidents. In a second re-evaluation, where more information was made available, the Norwegian psychiatrist and forensic pathologist increased their agreement with the official mortality statistics from 76 to 87 %, and from 85 to 88 %, respectively, regarding the Norwegian and Swedish datasets. Among the extracted undetermined deaths in the Swedish dataset, the two experts

  19. Recruitment processes in Baltic sprat - A re-evaluation of GLOBEC Germany hypotheses

    Science.gov (United States)

    Voss, Rüdiger; Peck, Myron A.; Hinrichsen, Hans-Harald; Clemmesen, Catriona; Baumann, Hannes; Stepputtis, Daniel; Bernreuther, Matthias; Schmidt, Jörn O.; Temming, Axel; Köster, Fritz W.

    2012-12-01

    The GLOBEC Germany program (2002-2007) had the ambitious goal to resolve the processes impacting the recruitment dynamics of Baltic sprat (Sprattus sprattus L.) by examining various factors affecting early life history stages. At the start of the research program, a number of general recruitment hypotheses were formulated, i.e. focusing on (1) predation, (2) food availability, (3) physical parameters, (4) the impact of current systems, and finally (5) the importance of top-down vs bottom-up effects. The present study synthesizes the results of field sampling (2002 and 2003), laboratory experiments, and modeling studies to re-evaluate these hypotheses for the Baltic sprat stock. Recruitment success was quite different in the 2 years investigated. Despite a lower spawning stock biomass in 2003, the total number of recruits was almost 2-fold higher that year compared to 2002. The higher recruitment success in 2003 could be attributed to enhanced survival success during the post-larval/juvenile stage, a life phase that appears to be critical for recruitment dynamics. In the state of the Baltic ecosystem during the period of investigation, we consider bottom-up control (e.g. temperature, prey abundance) to be more important than top-down control (predation mortality). This ranking in importance does not vary seasonally. Prevailing water circulation patterns and the transport dynamics of larval cohorts have a strong influence on sprat recruitment success. Pronounced transport to coastal areas is detrimental for year-class strength particularly at high sprat stock sizes. A suggested mechanism is density-dependent regulation of survival via intra- and inter-specific competition for prey in coastal areas. A documented change in larval vertical migration behavior between the early 1990s and early 2000s increased the transport potential to the coast, strengthening the coupling between inter-annual differences in the magnitude and direction of wind-driven surface currents and

  20. THE 2005 WORLD HEALTH ORGANIZATION RE-EVALUATION OF HUMAN AND MAMMALIAN TOXIC EQUIVALENCY FACTORS FOR DIOXINS AND DIOXIN-LIKE COMPOUNDS

    Science.gov (United States)

    In June 2005 a WHO-IPCS expert meeting was held in Geneva during which the toxic equivalency factors (TEFs) for dioxin-like compounds, including some polychlorinated biphenyls (PCBs), were re-evaluated. For this re-evaluation process the refined TEF database recently published by...

  1. [Post-marketing re-evaluation about usage and dosage of Chinese medicine based on human population pharmacokinetics].

    Science.gov (United States)

    Jiang, Junjie; Xie, Yanming

    2011-10-01

    The usage and dosage of a Chinese patent medicine are determined by rigorous pre-marketing evaluation through clinical trial stages I, II and III, but they are not re-evaluated after marketing. This leads to an unchanging, fixed usage and dosage of Chinese patent medicine instead of doses adjusted to the different situations of individual patients. The clinical use of Chinese patent medicine is thus far from the traditional Chinese medicine idea of "treatment based on syndrome differentiation" and from personalized therapy. Human population pharmacokinetics provides data support for personalized therapy in clinical application and enables the post-marketing re-evaluation of the usage and dosage of Chinese patent medicine. This paper briefly introduces the present situation, significance and application of human population pharmacokinetics in the post-marketing re-evaluation of the usage and dosage of Chinese patent medicine.

  2. Re-evaluation of the macroseismic effects produced by the March 4, 1977, strong Vrancea earthquake in Romanian territory

    Directory of Open Access Journals (Sweden)

    Aurelian Pantea

    2013-04-01

    In this paper, the macroseismic effects of the subcrustal earthquake that occurred in Vrancea (Romania) on March 4, 1977, have been re-evaluated. This was the second strongest seismic event to occur in this area during the twentieth century, after the event of November 10, 1940; it is thus of importance for our understanding of the seismicity of the Vrancea zone. The earthquake was felt over a large area, which included the territories of the neighboring states, and it produced major damage. Due to its effects, macroseismic studies were developed by Romanian researchers soon after its occurrence, with foreign scientists also involved, such as Medvedev, the founder of the Medvedev-Sponheuer-Karnik (MSK) seismic intensity scale. The original macroseismic questionnaires were re-examined to take into account the recommendations for intensity assessment according to the MSK-64 macroseismic scale used in Romania. After the re-evaluation of the macroseismic field of this earthquake, an intensity dataset was obtained for 1,620 sites in Romanian territory. The re-evaluation was necessary, as it confirmed that intensities on the previous macroseismic map were underestimated. On the new map, only the intensity data points are plotted, without tracing the isoseismals.

  3. Fear but not fright: re-evaluating traumatic experience attenuates anxiety-like behaviors after fear conditioning

    Directory of Open Access Journals (Sweden)

    Marco eCostanzi

    2014-08-01

    Fear allows organisms to cope with dangerous situations, and remembering these situations has an adaptive role in preserving individuals from injury and death. However, recalling traumatic memories can induce re-experiencing of the trauma, thus resulting in maladaptive fear. A failure to properly regulate fear responses has been associated with anxiety disorders, like Posttraumatic Stress Disorder (PTSD). Thus, re-establishing the capability to regulate fear is of both adaptive and clinical relevance. Strategies aimed at erasing fear memories have been proposed, although there are limits to their efficiency in treating anxiety disorders. To re-establish fear regulation, here we propose a new approach, based on the re-evaluation of the aversive value of a traumatic experience. Mice were submitted to a contextual-fear-conditioning paradigm in which a neutral context was paired with an intense electric footshock. Three weeks after acquisition, conditioned mice were treated with a less intense footshock (pain threshold). The effectiveness of this procedure in reducing fear expression was assessed in terms of behavioral outcomes related to PTSD (e.g. hyper-reactivity to a neutral tone, anxiety levels in a plus-maze task, social avoidance, and learning deficits in a spatial water maze) and of amygdala activity, by evaluating c-fos expression. Furthermore, a possible role of the lateral orbitofrontal cortex (lOFC) in mediating the behavioral effects induced by the re-evaluation procedure was investigated. We observed that this treatment (i) significantly mitigates the abnormal behavioral outcomes induced by trauma, (ii) persistently attenuates fear expression without erasing contextual memory, (iii) prevents fear reinstatement, (iv) reduces amygdala activity and (v) requires an intact lOFC to be effective. The results suggest that an effective strategy to treat pathological anxiety should address cognitive re-evaluation of traumatic experiences.

  4. A safety re-evaluation of the AVR pebble bed reactor operation and its consequences for future HTR concepts

    Energy Technology Data Exchange (ETDEWEB)

    Moormann, R.

    2008-06-15

    The AVR pebble bed reactor (46 MWth) was operated 1967-88 at coolant outlet temperatures up to 990 C. A principal difference of pebble bed HTRs such as AVR from conventional reactors is the continuous movement of fuel element pebbles through the core, which complicates thermohydraulic, nuclear and safety estimations. Also because of a lack of other experience, AVR operation is still a relevant basis for future pebble bed HTRs and thus requires careful examination. This paper deals mainly with some insufficiently published, unresolved safety problems of AVR operation and of pebble bed HTRs, but skips the widely known advantageous features of pebble bed HTRs. The AVR primary circuit is heavily contaminated with metallic fission products (Sr-90, Cs-137) which create problems in current dismantling. The amount of this contamination is not exactly known, but the evaluation of fission product deposition experiments indicates that the end-of-life contamination reached several percent of a single core inventory, which is some orders of magnitude more than precalculated and far more than in large LWRs. A major fraction of this contamination is bound on graphitic dust and thus partly mobile in depressurization accidents, which has to be considered in safety analyses of future reactors. A re-evaluation of the AVR contamination is performed here in order to quantify consequences for future HTRs (400 MWth). It leads to the conclusion that the AVR contamination was mainly caused by inadmissibly high core temperatures, increasing fission product release rates, and not - as presumed in the past - by inadequate fuel quality only. The high AVR core temperatures were detected not earlier than one year before final AVR shut-down, because a pebble bed core cannot yet be equipped with instruments. The maximum core temperatures are still unknown but were more than 200 K higher than calculated. Further, azimuthal temperature differences at the active core margin of up to 200 K were observed.

  5. A safety re-evaluation of the AVR pebble bed reactor operation and its consequences for future HTR concepts

    International Nuclear Information System (INIS)

    Moormann, R.

    2008-06-01

    The AVR pebble bed reactor (46 MWth) was operated 1967-88 at coolant outlet temperatures up to 990 C. A principal difference of pebble bed HTRs such as AVR from conventional reactors is the continuous movement of fuel element pebbles through the core, which complicates thermohydraulic, nuclear and safety estimations. Also because of a lack of other experience, AVR operation is still a relevant basis for future pebble bed HTRs and thus requires careful examination. This paper deals mainly with some insufficiently published, unresolved safety problems of AVR operation and of pebble bed HTRs, but skips the widely known advantageous features of pebble bed HTRs. The AVR primary circuit is heavily contaminated with metallic fission products (Sr-90, Cs-137) which create problems in current dismantling. The amount of this contamination is not exactly known, but the evaluation of fission product deposition experiments indicates that the end-of-life contamination reached several percent of a single core inventory, which is some orders of magnitude more than precalculated and far more than in large LWRs. A major fraction of this contamination is bound on graphitic dust and thus partly mobile in depressurization accidents, which has to be considered in safety analyses of future reactors. A re-evaluation of the AVR contamination is performed here in order to quantify consequences for future HTRs (400 MWth). It leads to the conclusion that the AVR contamination was mainly caused by inadmissibly high core temperatures, increasing fission product release rates, and not - as presumed in the past - by inadequate fuel quality only. The high AVR core temperatures were detected not earlier than one year before final AVR shut-down, because a pebble bed core cannot yet be equipped with instruments. The maximum core temperatures are still unknown but were more than 200 K higher than calculated. Further, azimuthal temperature differences at the active core margin of up to 200 K were observed.

  6. A re-evaluation of the initial yield of the hydrated electron in the picosecond time range

    International Nuclear Information System (INIS)

    Muroya, Yusa; Lin Mingzhang; Wu, Guozhong; Iijima, Hokuto; Yoshii, Koji; Ueda, Toru; Kudo, Hisaaki; Katsumura, Yosuke

    2005-01-01

    The yield of the hydrated electron in the picosecond time range has been re-evaluated with an ultrafast pulse radiolysis system using a laser photocathode RF-gun in combination with a conventional one, and a value of 4.1±0.2 per 100 eV of absorbed energy at 20 ps was derived. This is consistent with recent experimental results using a time correlation method [Bartels et al., J. Phys. Chem. A 104, 1686-1691 (2000)] and with Monte-Carlo calculations [Muroya et al., Can. J. Chem. 80, 1367-1374 (2002)].
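
    For readers more used to SI radiation-chemical yields, the quoted G-value converts to micromoles per joule as follows (a pure unit conversion, not a result from the paper):

      EV_TO_J = 1.602176634e-19   # joules per electronvolt
      AVOGADRO = 6.02214076e23    # particles per mole

      g_per_100ev = 4.1  # hydrated electrons per 100 eV at 20 ps (paper's value)
      molecules_per_joule = g_per_100ev / (100.0 * EV_TO_J)
      g_si = molecules_per_joule / AVOGADRO  # mol/J

      print(f"G = {g_si * 1e6:.3f} umol/J")  # ~0.425 umol/J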

  7. Reference materials characterized for impurities in uranium matrices. An overview and re-evaluation of the NBL CRM 124 series

    International Nuclear Information System (INIS)

    Buerger, S.; Mathew, K.J.; Mason, P.; Narayanan, U.

    2009-01-01

    The characterized concentrations of 24 impurity elements in New Brunswick Laboratory (NBL) Certified Reference Material (CRM) 124 were reevaluated. A provisional certificate of analysis was issued in September 1983 based upon the 'as prepared' values (gravimetric mixing). The provisional certificate does not state uncertainties for the characterized values, or estimate the degree of homogeneity. Since release of the provisional certificate of analysis various laboratories have reported analytical results for CRM 124. Based upon the reported data a re-evaluation of the characterized values with an estimate of their uncertainties was performed in this work. An assessment of the degree of homogeneity was included. The overall difference between the re-evaluated values for the 24 impurity elements and the 'as prepared' values from the provisional certificate of analysis is negligible compared to the uncertainties. Therefore, NBL will establish the 'as prepared' values as the certified values and use the derived uncertainties from this work for the uncertainties of the certified values. The traceability of the 'as prepared' values was established by the gravimetric mixing procedure employed during the preparation of the CRM. NBL further recommends a minimum sample size of 1 g of the CRM material to ensure homogeneity. Samples should be dried by heating up to 110 deg C for one hour before use. (author)

  8. A re-evaluation of 32S(n,p) cross sections from threshold to 5 MeV

    International Nuclear Information System (INIS)

    Fu, C.Y.

    1989-01-01

    Two evaluations of the 32S(n,p) reaction cross sections, currently being used for the Nagasaki and Hiroshima dosimetry studies, yielded results that differ significantly. These two evaluations were reviewed and both were found to be quite old and without benefit of modern theoretical guidance and recent experimental data, hence inadequate in view of the reaction's relative importance for the present application. The necessity for a re-evaluation is further enhanced by the facts that: the present data search has uncovered a relatively high-quality data set that was not known previously; a generalized Bayes-theorem code is now available for averaging the various data sets with uncertainties and generating uncertainties for the results; effects on data combination of differing energy resolution in the various measurements can now be accounted for; and the ENDF/B-VI standards for 238U(n,f) cross sections have become available for renormalizing two of the available data sets. The re-evaluation is performed to 5 MeV, the upper energy limit for the present purpose. 8 refs., 2 figs
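
    The "generalized Bayes-theorem code" mentioned above is not described in detail; under Gaussian assumptions, its core combination step reduces to inverse-variance weighting, sketched here with invented cross-section points (the real evaluation additionally handles differing energy resolution and renormalization):

      import numpy as np

      def combine(values, sigmas):
          # Inverse-variance weighted mean and its 1-sigma uncertainty:
          # the Gaussian special case of Bayesian data combination.
          w = 1.0 / np.asarray(sigmas) ** 2
          mean = np.sum(w * np.asarray(values)) / np.sum(w)
          return mean, 1.0 / np.sqrt(np.sum(w))

      # Invented 32S(n,p) cross sections (mb) at one energy, three experiments.
      mean, sigma = combine([220.0, 235.0, 228.0], [15.0, 10.0, 20.0])
      print(f"combined: {mean:.1f} +/- {sigma:.1f} mb")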

  9. Grass carp in the Great Lakes region: establishment potential, expert perceptions, and re-evaluation of experimental evidence of ecological impact

    Science.gov (United States)

    Wittmann, Marion E.; Jerde, Christopher L.; Howeth, Jennifer G.; Maher, Sean P.; Deines, Andrew M.; Jenkins, Jill A.; Whitledge, Gregory W.; Burbank, Sarah B.; Chadderton, William L.; Mahon, Andrew R.; Tyson, Jeffrey T.; Gantz, Crysta A.; Keller, Reuben P.; Drake, John M.; Lodge, David M.

    2014-01-01

    Intentional introductions of nonindigenous fishes are increasing globally. While benefits of these introductions are easily quantified, assessments to understand the negative impacts to ecosystems are often difficult, incomplete, or absent. Grass carp (Ctenopharyngodon idella) was originally introduced to the United States as a biocontrol agent, and recent observations of wild, diploid individuals in the Great Lakes basin have spurred interest in re-evaluating its ecological risk. Here, we evaluate the ecological impact of grass carp using expert opinion and a suite of the most up-to-date analytical tools and data (ploidy assessment, eDNA surveillance, species distribution models (SDMs), and meta-analysis). The perceived ecological impact of grass carp by fisheries experts was variable, ranging from unknown to very high. Wild-caught triploid and diploid individuals occurred in multiple Great Lakes waterways, and eDNA surveillance suggests that grass carp are abundant in a major tributary of Lake Michigan. SDMs predicted suitable grass carp climate occurs in all Great Lakes. Meta-analysis showed that grass carp introductions impact both water quality and biota. Novel findings based on updated ecological impact assessment tools indicate that iterative risk assessment of introduced fishes may be warranted.

  10. Pressurized thermal shock re-evaluation studies for Korean PWR plant

    International Nuclear Information System (INIS)

    Jung, Sung Gyu; Kim, Hyun Su; Jin, Tae Eun; Jang, Chang Hee

    2001-01-01

    The PTS reference temperature of the reactor pressure vessel for one of the Korean NPPs has been predicted to exceed the screening criteria before the vessel reaches its design life. To cope with this issue, a plant-specific PTS analysis was performed in accordance with Regulatory Guide 1.154 in 1999. As a result of that analysis, it was found that the current methodology of RG 1.154 is very conservative. The objective of this study is to examine the effects of changing various input parameters and to determine the amount of conservatism in the current PTS analysis method. To do this, based on past PTS analysis experience, parametric studies were performed for various models using a modified VISA-II code. This paper discusses the analysis results and recommendations to reduce the conservatism of the current analysis method.

  11. Re-evaluating neonatal-age models for ungulates: does model choice affect survival estimates?

    Directory of Open Access Journals (Sweden)

    Troy W Grovenburg

    New-hoof growth is regarded as the most reliable metric for predicting age of newborn ungulates, but variation in estimated age among the hoof-growth equations that have been developed may affect estimates of survival in staggered-entry models. We used known-age newborns to evaluate variation in age estimates among existing hoof-growth equations and to determine the consequences of that variation for survival estimates. During 2001-2009, we captured and radiocollared 174 newborn (≤24-hrs old) ungulates: 76 white-tailed deer (Odocoileus virginianus) in Minnesota and South Dakota, 61 mule deer (O. hemionus) in California, and 37 pronghorn (Antilocapra americana) in South Dakota. Estimated age of known-age newborns differed among hoof-growth models and varied by >15 days for white-tailed deer, >20 days for mule deer, and >10 days for pronghorn. Accuracy (i.e., the proportion of neonates assigned to the correct age) in aging newborns using published equations ranged from 0.0% to 39.4% in white-tailed deer, 0.0% to 3.3% in mule deer, and was 0.0% for pronghorns. Results of survival modeling indicated that variability in estimates of age-at-capture affected short-term estimates of survival (i.e., 30 days) for white-tailed deer and mule deer, and survival estimates over a longer time frame (i.e., 120 days) for mule deer. Conversely, survival estimates for pronghorn were not affected by estimates of age. Our analyses indicate that modeling survival in daily intervals is too fine a temporal scale when age-at-capture is unknown, given the potential inaccuracies among equations used to estimate age of neonates. Instead, weekly survival intervals are more appropriate because most models accurately predicted ages within 1 week of the known age. Variation among results of neonatal-age models on short- and long-term estimates of survival for known-age young emphasizes the importance of selecting an appropriate hoof-growth equation and appropriately defining intervals (i.e., weekly rather than daily) when estimating survival.

  12. Re-evaluation of the AASHTO-flexible pavement design equation with neural network modeling.

    Science.gov (United States)

    Tiğdemir, Mesut

    2014-01-01

    Here we establish that equivalent single-axle load (ESAL) values can be estimated using artificial neural networks without the complex design equation of the American Association of State Highway and Transportation Officials (AASHTO). More importantly, we find that the neural network model yields coefficients that make it possible to obtain the actual load values from the AASHTO design values. Thus, design traffic values that might result in deterioration can be calculated better with the neural network model than with the AASHTO design equation. The existing AASHTO flexible pavement design equation does not currently predict the pavement performance of the Strategic Highway Research Program (Long Term Pavement Performance studies) test sections very accurately, and typically over-estimates the number of equivalent single-axle loads needed to cause a measured loss of the present serviceability index. Here we aimed to demonstrate that the proposed neural network model represents the load values data more accurately than the AASHTO formula. It is concluded that the neural network may be an appropriate tool for the development of data-based, nonparametric models of pavement performance.
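
    As a rough sketch of the modelling approach described above, the following fits a small feed-forward network that maps AASHTO-style design inputs to observed equivalent single-axle loads. The choice of input features, the architecture and every numerical value are invented for illustration; the abstract does not specify the paper's actual inputs or data.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Hypothetical training table: one row per test section.
        # Columns: structural number, loss of serviceability (delta-PSI),
        # subgrade resilient modulus (psi), reliability term Z_R*S_0.
        X = np.array([
            [4.2, 1.7, 7500.0, -1.28],
            [5.0, 2.0, 9000.0, -1.28],
            [3.5, 1.2, 6000.0, -0.84],
            [4.8, 1.9, 8200.0, -1.04],
        ])
        y = np.log10([2.1e6, 6.3e6, 4.0e5, 3.2e6])  # log10 of observed ESALs

        model = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000,
                         random_state=0),
        )
        model.fit(X, y)

        # Predict actual ESALs for a new (invented) design case:
        pred = 10.0 ** model.predict([[4.5, 1.8, 8000.0, -1.28]])
        print(f"predicted ESALs: {pred[0]:.3g}")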

  13. Re-evaluation of the age of the Brandon Lignite (Vermont, USA) based on plant megafossils.

    Energy Technology Data Exchange (ETDEWEB)

    Tiffney, B.H. (University of California at Santa Barbara, Santa Barbara, CA (United States). Dept. of Geological Sciences)

    1994-07-01

    The Brandon Lignite of west-central Vermont contains the northernmost megafossil flora of Cenozoic angiosperms, and one of the most diverse Cenozoic pollen floras in northeastern North America. While the floristic composition clearly indicates deposition of the Brandon sediments in the warmer parts of the Cenozoic, previous attempts at a more precise stratigraphic placement have been inconclusive, ranging from Cretaceous to Miocene. Re-evaluation of existing and new fruit, seed and wood data from the Brandon flora in the context of other floras in the Northern Hemisphere leads to the conservative conclusion that the deposit could range from earliest Oligocene to Early Miocene. Several lines of potentially weak evidence favor an Early Miocene age, in agreement with recent biostratigraphic data from the associated pollen flora. It is concluded that the Brandon Lignite is Early Miocene.

  14. Re-evaluation of the 1995 Hanford Large Scale Drum Fire Test Results

    International Nuclear Information System (INIS)

    Yang, J M

    2007-01-01

    A large-scale drum performance test was conducted at the Hanford Site in June 1995, in which over one hundred (100) 55-gal drums in each of two storage configurations were subjected to severe fuel pool fires. The two storage configurations in the test were pallet storage and rack storage. The description and results of the large-scale drum test at the Hanford Site were reported in WHC-SD-WM-TRP-246, ''Solid Waste Drum Array Fire Performance,'' Rev. 0, 1995. This was one of the main references used to develop the analytical methodology to predict drum failures in WHC-SD-SQA-ANAL-501, ''Fire Protection Guide for Waste Drum Storage Array,'' September 1996. Three drum failure modes were observed from the test reported in WHC-SD-WM-TRP-246: seal failure, lid warping, and catastrophic lid ejection. There was no discernible failure criterion that distinguished one failure mode from another; hence, all three failure modes were treated equally for the purpose of determining the number of failed drums. General observations from the results of the test are as follows: • Trash expulsion was negligible. • Flame impingement was identified as the main cause of failure. • The range of drum temperatures at failure was 600°C to 800°C, above the yield strength temperature for steel of approximately 540°C (1,000°F). • The critical heat flux required for failure is above 45 kW/m². • Fire propagation from one drum to the next was not observed. A statistical evaluation of the test results using, for example, the Student's t-distribution will demonstrate that the failure criteria for TRU waste drums currently employed at nuclear facilities are very conservative relative to the large-scale test results. Hence, a safety analysis utilizing the general criteria described in the five bullets above will lead to a technically robust and defensible product that bounds the potential consequences from postulated
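
    The statistical evaluation mentioned above is not detailed in the abstract. One plausible calculation of this kind, sketched below with hypothetical data, is a one-sided 95% lower confidence bound on the mean drum failure temperature using the Student's t-distribution; the report's exact procedure may differ.

        import numpy as np
        from scipy import stats

        # Hypothetical drum failure temperatures (deg C) from a fire test series:
        temps = np.array([640.0, 705.0, 660.0, 780.0, 720.0, 690.0])

        n = temps.size
        mean, s = temps.mean(), temps.std(ddof=1)

        # One-sided 95% lower confidence bound on the mean failure temperature:
        t95 = stats.t.ppf(0.95, df=n - 1)
        lower = mean - t95 * s / np.sqrt(n)
        print(f"95% lower confidence bound: {lower:.0f} C")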

  15. [Construction and realization of real world integrated data warehouse from HIS on re-evaluation of post-marketing traditional Chinese medicine].

    Science.gov (United States)

    Zhuang, Yan; Xie, Bangtie; Weng, Shengxin; Xie, Yanming

    2011-10-01

    To construct a real-world integrated data warehouse on re-evaluation of post-marketing traditional Chinese medicine, for research on key techniques of clinical re-evaluation, which mainly include indication of traditional Chinese medicine, dosage and usage, course of treatment, unit medication, combined disease and adverse reaction; this provides data for retrospective research on its safety, effectiveness and economy, and provides a foundation for prospective research. The integrated data warehouse extracts and integrates data from HIS by an information collection system and data warehouse techniques, and forms standard structures and data. Further research proceeds on the basis of these data. A data warehouse and several sub data warehouses were built, which focus on patients' main records, doctor orders, disease diagnoses, laboratory results and economic indicators in hospital. These data warehouses can provide research data for re-evaluation of post-marketing traditional Chinese medicine, and have clinical value. Besides, this points out the direction for further research.

  16. Revise and Re-evaluate Cross Cultural Understanding Curriculum at Akademi Bahasa Asing Balikpapan (Foreign Language Academy of Balikpapan)

    Directory of Open Access Journals (Sweden)

    Rachmi Sari Baso

    2014-02-01

    The study concerns a project to revise and re-evaluate the Cross Cultural Understanding curriculum unit taught at the Akademi Bahasa Asing Balikpapan. The unit is for fifth-semester students. The project aimed to provide students' perspectives on cross-cultural differences in the workplace, with materials and knowledge suitable for workplace demands. The information was gained by distributing questionnaires to 2 teachers and 2 employers of multinational companies in Balikpapan. The investigation for teachers focused on the content, learning activities and materials of the current curriculum. The investigation for employers focused on their perspectives on the cross-cultural understanding taught in higher education. The project used Nicholls' cycle model, a useful tool for regularly evaluating curriculum based on situational analysis. As a result, some materials on American business cultural encounters should be revised to meet the companies' demands, and table manners in cultural perspective should be added to the curriculum. The new curriculum will therefore incorporate these materials in line with the demands of the workplace.

  17. Re-evaluating concepts of biological function in clinical medicine: towards a new naturalistic theory of disease.

    Science.gov (United States)

    Chin-Yee, Benjamin; Upshur, Ross E G

    2017-08-01

    Naturalistic theories of disease appeal to concepts of biological function, and use the notion of dysfunction as the basis of their definitions. Debates in the philosophy of biology demonstrate how attributing functions in organisms and establishing the function-dysfunction distinction is by no means straightforward. This problematization of functional ascription has undermined naturalistic theories and led some authors to abandon the concept of dysfunction, favoring instead definitions based in normative criteria or phenomenological approaches. Although this work has enhanced our understanding of disease and illness, we need not necessarily abandon naturalistic concepts of function and dysfunction in the disease debate. This article attempts to move towards a new naturalistic theory of disease that overcomes the limitations of previous definitions and offers advantages in the clinical setting. Our approach involves a re-evaluation of concepts of biological function employed by naturalistic theories. Drawing on recent insights from the philosophy of biology, we develop a contextual and evaluative account of function that is better suited to clinical medicine and remains consistent with contemporary naturalism. We also show how an updated naturalistic view shares important affinities with normativist and phenomenological positions, suggesting a possibility for consilience in the disease debate.

  18. Re-evaluation of the criticality experiments of the ''Otto Hahn Nuclear Ship'' reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lengar, I.; Snoj, L.; Rogan, P.; Ravnik, M. [Jozef Stefan Institute, Ljubljana (Slovenia)]

    2008-11-15

    Several series of experiments with an FDR reactor core (an advanced pressurized light water reactor) were performed in 1972 in the Geesthacht critical facility ANEX. The experiments were performed to test the core prior to its use for the propulsion of the first German nuclear merchant ship, the ''Otto Hahn''. In the present paper a computational re-evaluation of the experiments is described, using up-to-date computer codes (the Monte-Carlo code MCNP5) and nuclear data (ENDF/B-VI release 6). It focuses on the determination of uncertainties in the benchmark model of the experimental set-up, originating mainly from the limited set of information still available about the experiments. Effects of the identified uncertainties on the multiplication factor were studied. The sensitivity studies include parametric variation of material composition and geometry. With the combined total uncertainty found to be 0.0050 in k{sub eff}, the experiments qualify as criticality safety benchmark experiments. (orig.)

  19. Fever of unknown origin; Re-evaluation of {sup 67}Ga scintigraphy in detecting causes of fever

    Energy Technology Data Exchange (ETDEWEB)

    Misaki, Takashi; Matsui, Akira; Tanaka, Fumiko; Okuno, Yoshishige; Mitsumori, Michihide; Torizuka, Tatsurou; Dokoh, Shigeharu; Hayakawa, Katsumi; Shimbo, Shin-ichirou (Kyoto City Hospital (Japan))

    1990-06-01

    Gallium-67 scintigraphy is a commonly performed imaging modality for detecting pyrogenic lesions in cases of long-standing unexplained fever. To re-evaluate the significance of gallium imaging in such cases, a retrospective review was made of 56 scans performed in febrile patients in whom sufficient clinical and laboratory findings were obtained. Gallium scans were true positive in 30 patients, false positive in 3, true negative in 19, and false negative in 4. In the true positive group, local inflammatory lesions were detected in 23 patients with final diagnoses of lung tuberculosis, urinary tract infection, and inflammatory joint disease. Abnormal gallium accumulation, as shown in the other 7 patients, provided clues to the diagnosis of generalized disorders, such as hematological malignancies (n=3), systemic autoimmune diseases (n=3), and severe infectious mononucleosis (n=1). In the false positive group, gallium imaging revealed intestinal excretion of gallium in 2 patients and physiological pulmonary hilar accumulation in one. In the true negative group of 19 patients, fever of unknown origin resolved spontaneously in 12 patients, and with antibiotics and corticosteroids in 2 and 5 patients, respectively. Four patients having false negative scans were finally diagnosed as having urinary tract infection (n=2), bacterial meningitis (n=1), and polyarteritis (n=1). Gallium imaging remains the technique of choice in searching for the origin of unknown fever. It may also be useful for early diagnosis of systemic disease, as well as focal inflammation. (N.K.).

  20. [Clinical re-evaluation of effects of two different "cocktail therapy" to prevent from phlebitis induced by Chansu injection].

    Science.gov (United States)

    Zhao, Yu-Bin; Hao, Zhe; Zhang, Hong-Dan; Xie, Yan-Ming

    2012-09-01

    To re-evaluate the effects of different "cocktail therapies" to prevent phlebitis induced by Chansu injection. Patients treated with Chansu injection were divided randomly into 4 groups of 90 each: a control group, a phentolaminum group, a magnesium sulfate-phentolaminum group, and an anisodamine-phentolaminum group. Patients in the control group received only routine nursing treatment, and patients in the experiment groups received the different interventions. Comparisons were made of the morbidity and starting time of phlebitis, the severity of pain, and the duration of pain. In the three test groups, the morbidity of phlebitis was 8%, 8% and 6%, respectively; the starting time of phlebitis was (22 +/- 4), (27 +/- 5) and (28 +/- 7) h, respectively; the NRS of pain was (4.75 +/- 1.51), (3.27 +/- 1.02) and (2.71 +/- 1.63), respectively; and the duration of pain was (4.25 +/- 1.36), (2.51 +/- 1.05) and (2.19 +/- 1.13) d, respectively. In the control group, the morbidity of phlebitis, the starting time of phlebitis, the severity of pain and the duration of pain were 30%, (16 +/- 4) h, (6.34 +/- 1.21) and (5.47 +/- 1.07) d, respectively. A significant difference was found between each of the three test groups and the control group (P < 0.05). The morbidity of phlebitis, the severity of pain and the duration of pain were significantly reduced by the two different "cocktail therapies".

  1. Statistical re-evaluation of the ASME K{sub IC} and K{sub IR} fracture toughness reference curves

    Energy Technology Data Exchange (ETDEWEB)

    Wallin, K.; Rintamaa, R. [Valtion Teknillinen Tutkimuskeskus, Espoo (Finland)]

    1998-11-01

    Historically, the ASME reference curves have been treated as representing absolute deterministic lower-bound curves of fracture toughness. In reality, this is not the case: they represent only deterministic lower bounds to a specific set of data, which corresponds to a certain probability range. A recently developed statistical lower-bound estimation method, the 'Master curve', has been proposed as a candidate for a new lower-bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on the application. In order to substitute the old ASME reference curves with lower-bound curves based on the Master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. To estimate the true inherent level of safety represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the Master curve concept. The analysis reveals that the 5% lower-bound Master curve has the same inherent degree of safety as originally intended for the K{sub IC} reference curve. Similarly, the 1% lower-bound Master curve corresponds to the K{sub IR} reference curve. (orig.)
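
    For orientation, the sketch below evaluates Master curve lower bounds at chosen failure probabilities, using the standard median shape 30 + 70*exp[0.019(T - T0)] and the Weibull scatter model (shape 4, threshold 20 MPa*sqrt(m)) standardized in ASTM E1921. The reference temperature T0 is a hypothetical value, and the identification of the 5% and 1% bounds with the K{sub IC} and K{sub IR} curves is the paper's finding, not something the sketch derives.

        import numpy as np

        def master_curve_bound(T, p, T0):
            # Fracture toughness K_Jc (MPa*sqrt(m)) at cumulative failure
            # probability p and temperature T (deg C), from the Master curve
            # median and the shape-4 Weibull scatter with threshold 20.
            K_med = 30.0 + 70.0 * np.exp(0.019 * (T - T0))
            K0 = 20.0 + (K_med - 20.0) / np.log(2.0) ** 0.25  # Weibull scale
            return 20.0 + (-np.log(1.0 - p)) ** 0.25 * (K0 - 20.0)

        T0 = -50.0                                # hypothetical reference temperature
        T = np.array([-150.0, -100.0, -50.0, 0.0])
        print("5% bound:", master_curve_bound(T, 0.05, T0).round(1))
        print("1% bound:", master_curve_bound(T, 0.01, T0).round(1))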

  2. Unit Roots in Economic and Financial Time Series: A Re-Evaluation at the Decision-Based Significance Levels

    Directory of Open Access Journals (Sweden)

    Jae H. Kim

    2017-09-01

    This paper re-evaluates key past results of unit root tests, emphasizing that the use of a conventional level of significance is not in general optimal due to the test having low power. The decision-based significance levels for popular unit root tests, chosen using the line of enlightened judgement under a symmetric loss function, are found to be much higher than conventional ones. We also propose simple calibration rules for the decision-based significance levels for a range of unit root tests. At the decision-based significance levels, many time series in Nelson and Plosser's (1982) extended data set are judged to be trend-stationary, including real income variables, employment variables and money stock. We also find that nearly all real exchange rates covered in Elliott and Pesavento's (2006) study are stationary, and that most of the real interest rates covered in Rapach and Weber's (2004) study are stationary. In addition, using a specific loss function, the U.S. nominal interest rate is found to be stationary under economically sensible values of relative loss and prior belief for the null hypothesis.
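
    In practice, a decision-based significance level simply replaces the conventional threshold in an otherwise standard test. The sketch below runs an augmented Dickey-Fuller test with trend on a simulated trend-stationary series; the 30% level used here is purely illustrative, whereas the paper's calibration rules would determine the level case by case.

        import numpy as np
        from statsmodels.tsa.stattools import adfuller

        rng = np.random.default_rng(0)

        # Simulated trend-stationary series: linear trend plus AR(1) noise.
        n, rho = 200, 0.7
        noise = np.zeros(n)
        for t in range(1, n):
            noise[t] = rho * noise[t - 1] + rng.normal()
        y = 0.02 * np.arange(n) + noise

        stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, regression="ct")
        alpha = 0.30  # illustrative decision-based level, not the paper's value
        verdict = "reject unit root" if pvalue < alpha else "cannot reject"
        print(f"ADF stat = {stat:.2f}, p = {pvalue:.3f} -> {verdict}")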

  3. The application of carbon monoxide in meat packaging needs to be re-evaluated within the EU: An overview.

    Science.gov (United States)

    Van Rooyen, Lauren Anne; Allen, Paul; O'Connor, David I

    2017-10-01

    Carbon monoxide (CO) has many value-added benefits in meat packaging due to its colour-stabilising effects and enhancement of meat quality attributes. The regulation of CO in meat packaging varies worldwide and remains a topical and controversial issue. CO is prohibited in the EU for use in meat packaging, mainly due to fears that it may mask spoilage and thereby mislead consumers. The issue of consumer acceptance of CO was not considered. This article reviews the most pertinent literature to assess whether the problems associated with the prohibition have been addressed. Applying CO pretreatments prior to vacuum packaging enhances colour while allowing discolouration to occur by the use-by date, thereby addressing concerns about safety. Recent work showing European consumer acceptance of CO in meat packaging demonstrates its future potential within the EU. The information provided may support the framing of future policies intended to assure consumer protection, safety, choice and interest. Re-evaluation of permitting CO as a packaging gas within the EU may be warranted.

  4. Male bladder outlet obstruction: Time to re-evaluate the definition and reconsider our diagnostic pathway? ICI-RS 2015.

    Science.gov (United States)

    Rademakers, Kevin; Drake, Marcus J; Gammie, Andrew; Djurhuus, Jens C; Rosier, Peter F W M; Abrams, Paul; Harding, Christopher

    2017-04-01

    The diagnosis of bladder outlet obstruction (BOO) in the male depends on measurements of pressure and flow made during urodynamic studies. The procedure of urodynamics and the indices used to delineate BOO are well standardized, largely as a result of the work of the International Continence Society. The clinical utility of the diagnosis of BOO is, however, less well defined, and there are several shortcomings and gaps in the currently available medical literature. Consequently, the International Consultation on Incontinence Research Society (ICI-RS) held a think tank session in 2015 entitled "Male bladder outlet obstruction: Time to re-evaluate the definition and reconsider our diagnostic pathway?" This manuscript details the discussions that took place within that think tank, setting out the pros and cons of the current definition of BOO and exploring alternative clinical tests (alone or in combination) which may be useful in the future investigation of male patients with lower urinary tract symptoms. The think tank panel concluded that pressure-flow studies remain the diagnostic gold standard for BOO, although there is still a lack of high-quality evidence. Newer, less invasive investigations have shown promise in terms of diagnostic accuracy for BOO, but similar criticisms can be levelled against these tests. Therefore, the think tank suggests further research with regard to these alternative indicators to determine their clinical utility.
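
    For background, pressure-flow studies are conventionally summarized by the ICS bladder outlet obstruction index, BOOI = PdetQmax - 2*Qmax, with BOOI > 40 classed as obstructed, 20-40 as equivocal and < 20 as unobstructed. The abstract does not restate this definition, so the sketch below is offered only as a reminder of the standard calculation; the patient values are invented.

        def boo_index(pdet_qmax_cmH2O, qmax_ml_s):
            # ICS bladder outlet obstruction index from a pressure-flow study.
            booi = pdet_qmax_cmH2O - 2.0 * qmax_ml_s
            if booi > 40.0:
                category = "obstructed"
            elif booi >= 20.0:
                category = "equivocal"
            else:
                category = "unobstructed"
            return booi, category

        print(boo_index(75.0, 9.0))  # hypothetical patient -> (57.0, 'obstructed')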

  5. Re-evaluation of DNA Index as a Prognostic Factor in Children with Precursor B Cell Acute Lymphoblastic Leukemia.

    Science.gov (United States)

    Noh, O Kyu; Park, Se Jin; Park, Hyeon Jin; Ju, HeeYoung; Han, Seung Hyon; Jung, Hyun Joo; Park, Jun Eun

    2017-09-01

    We aimed to investigate the prognostic value of the DNA index (DI) in children with precursor B cell acute lymphoblastic leukemia (pre-B ALL). From January 2003 to December 2014, 72 children diagnosed with pre-B ALL were analyzed. We analyzed the prognostic value of DI and its relations with other prognostic factors. The conventional DI cut-point of 1.16 did not discriminate significantly between high- and low-survival groups (DI ≥ 1.16 versus < 1.16), whereas the survival of children with a DI between 1.00 and 1.90 was significantly higher than that of children with a DI outside this range (5-year OS, 90.6% vs. 50.0%, P < 0.05). Thus, the single cut-point of 1.16 was not an informative prognostic marker in these children. However, the DI divided by specific ranges of values remained an independent prognostic factor. Further studies are warranted to re-evaluate the prognostic value and cut-point of DI in children treated with recent treatment protocols.

  6. A regulatory view of the seismic re-evaluation of existing nuclear power plants in the United Kingdom

    International Nuclear Information System (INIS)

    Inkester, J.E.; Bradford, P.M.

    1995-01-01

    The paper describes the background to the seismic re-evaluation of existing nuclear power plants in the United Kingdom. Nuclear installations in this country were not designed specifically to resist earthquakes until the nineteen-seventies, although older plants were robustly constructed. The seismic capability of these older installations is now being evaluated as part of the periodic safety reviews which nuclear licensees are required to carry out. The regulatory requirements which set the framework for these studies are explained and the approaches being adopted by the licensees for their assessment of the seismic capability of existing plants are outlined. The process of hazard appraisal is reported together with a general overview of UK seismicity. The paper then discusses the methodologies used to evaluate the response of plant to the hazard. Various other types of nuclear installation besides power plants are subject to licensing in the UK and the application of seismic evaluation to some of these is briefly described. Finally the paper provides some comments on future initiatives and possible areas of development. (author)

  7. A combined fluorescence spectroscopy, confocal and 2-photon microscopy approach to re-evaluate the properties of sphingolipid domains.

    Science.gov (United States)

    Pinto, Sandra N; Fernandes, Fábio; Fedorov, Alexander; Futerman, Anthony H; Silva, Liana C; Prieto, Manuel

    2013-09-01

    The aim of this study is to provide further insight into the interplay between important signaling lipids and to characterize the properties of the lipid domains formed by those lipids in membranes of distinct composition. To this end, we have used a combination of fluorescence spectroscopy, confocal and two-photon microscopy and a stepwise approach to re-evaluate the biophysical properties of sphingolipid domains, particularly lipid rafts and ceramide (Cer) platforms. By using this strategy we were able to show that, in binary mixtures, sphingolipids (Cer and sphingomyelin, SM) form more tightly packed gel domains than those formed by phospholipids with similar acyl chain length. In more complex lipid mixtures, the interaction between the different lipids is intricate and is strongly dictated by the Cer-to-Chol ratio. The results show that in quaternary phospholipid/SM/Chol/Cer mixtures, Cer forms gel domains that become less packed as Chol is increased. Moreover, the extent of gel phase formation is strongly reduced in these mixtures, even though the Cer molar fraction is increased. These results suggest that in biological membranes, lipid domains such as rafts and ceramide platforms might display distinctive biophysical properties depending on the local lipid composition at the site of the membrane where they are formed, further highlighting the potential role of membrane biophysical properties as an underlying mechanism for mediating specific biological processes.

  8. Measuring α in the early universe: CMB temperature, large-scale structure, and Fisher matrix analysis

    International Nuclear Information System (INIS)

    Martins, C. J. A. P.; Melchiorri, A.; Trotta, R.; Bean, R.; Rocha, G.; Avelino, P. P.; Viana, P. T. P.

    2002-01-01

    We extend our recent work on the effects of a time-varying fine-structure constant α in the cosmic microwave background by providing a thorough analysis of the degeneracies between α and the other cosmological parameters, and discussing ways to break these with both existing and/or forthcoming data. In particular, we present the state-of-the-art cosmic microwave background constraints on α through a combined analysis of the BOOMERanG, MAXIMA and DASI data sets. We also present a novel discussion of the constraints on α coming from large-scale structure observations, focusing in particular on the power spectrum from the 2dF survey. Our results are consistent with no variation in α from the epoch of recombination to the present day, and restrict any such variation to be less than about 4%. We show that the forthcoming Microwave Anisotropy Probe and Planck experiments will be able to break most of the currently existing degeneracies between α and other parameters, and measure α to better than percent accuracy

  9. Fisher matrix forecast on cosmological parameters from the dark energy survey 2-point angular correlation function

    Energy Technology Data Exchange (ETDEWEB)

    Sobreira, F.; Rosenfeld, R. [Universidade Estadual Paulista Julio de Mesquita Filho (IFT/UNESP), Sao Paulo, SP (Brazil). Inst. Fisica Teorica]; Simoni, F. de; Costa, L.A.N. da; Gaia, M.A.G.; Ramos, B.; Ogando, R.; Makler, M. [Laboratorio Interinstitucional de e-Astronomia (LIneA), Rio de Janeiro, RJ (Brazil)]

    2011-07-01

    We study the cosmological constraints expected for the upcoming Dark Energy Survey (DES) project with the full functional form of the 2-point angular correlation function. The angular correlation function model applied in this work includes the effects of linear redshift-space distortion, photometric redshift errors (assumed to be Gaussian) and non-linearities arising from gravitational infall. The Fisher information matrix is constructed with the full covariance matrix, which takes the correlation between nearby redshift shells into account in a proper manner. The survey was sliced into 20 redshift shells in the range 0.4 ≤ z ≤ 1.40, with a variable angular scale chosen to probe only the scales around the signal from the baryon acoustic oscillation, and therefore well within the validity of the non-linear model employed. We found that under those assumptions, and with a flat ΛCDM WMAP7 fiducial model, the DES will be able to constrain the dark energy equation of state parameter w with a precision of ~20% and the cold dark matter density with ~11% precision when marginalizing over the other 25 parameters (bias is treated as a free parameter for each shell). When applying WMAP7 priors on Ω_baryon, Ω_cdm, n_s, and HST priors on the Hubble parameter, w is constrained with ~9% precision. This shows that the full shape of the angular correlation function with DES data will be a powerful probe to constrain cosmological parameters. (author)
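
    As a toy illustration of the machinery behind such forecasts, the sketch below assembles a Gaussian Fisher matrix from parameter derivatives of a model vector and a fixed data covariance, reads off marginalized errors, and adds a Gaussian prior as an extra Fisher term (which is how WMAP7-like priors enter). All numbers are invented; the actual analysis uses the full angular correlation function and its covariance across the 20 redshift shells.

        import numpy as np

        def fisher_matrix(dmu, cov):
            # Gaussian Fisher matrix F_ij = (dmu/dtheta_i)^T C^-1 (dmu/dtheta_j)
            # for a model mean vector mu(theta) and data covariance C.
            return dmu @ np.linalg.inv(cov) @ dmu.T

        # Toy setup: 3 observables, 2 parameters (w, Omega_cdm); numbers invented.
        dmu = np.array([[0.8, 1.1, 0.5],    # d(mu)/d(w)
                        [1.5, 0.4, 0.9]])   # d(mu)/d(Omega_cdm)
        cov = np.diag([0.04, 0.02, 0.05])

        F = fisher_matrix(dmu, cov)
        sigma = np.sqrt(np.diag(np.linalg.inv(F)))  # marginalized 1-sigma errors
        print("sigma(w), sigma(Omega_cdm):", sigma.round(4))

        # A Gaussian prior enters as an additive Fisher term:
        prior = np.diag([0.0, 1.0 / 0.01 ** 2])     # sigma_prior(Omega_cdm)=0.01
        sigma_p = np.sqrt(np.diag(np.linalg.inv(F + prior)))
        print("with prior:", sigma_p.round(4))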

  10. Re-evaluation of the sorption behaviour of Bromide and Sulfamethazine under field conditions using leaching data and modelling methods

    Science.gov (United States)

    Gassmann, Matthias; Olsson, Oliver; Höper, Heinrich; Hamscher, Gerd; Kümmerer, Klaus

    2016-04-01

    The simulation of reactive transport in the aquatic environment is hampered by the ambiguity of environmental fate process conceptualizations for a specific substance in the literature. Concepts are usually identified by experimental studies and inverse modelling under controlled lab conditions in order to reduce environmental uncertainties such as uncertain boundary conditions and input data. However, since environmental conditions affect substance behaviour, a re-evaluation might be necessary under environmental conditions, which might, in turn, be affected by uncertainties. Using a combination of experimental data and simulations of the leaching behaviour of the veterinary antibiotic Sulfamethazine (SMZ; synonym: sulfadimidine) and the hydrological tracer Bromide (Br) in a field lysimeter, we re-evaluated the sorption concepts of both substances under uncertain field conditions. Sampling data from a field lysimeter experiment, in which both substances were applied twice a year with manure and sampled at the bottom of two lysimeters during three subsequent years, were used for model set-up and evaluation. The total amounts of leached SMZ and Br were 22 μg and 129 mg, respectively. A reactive transport model was parameterized to the conditions of the two lysimeters filled with monoliths (depth 2 m, area 1 m²) of a sandy soil with a low pH value, under which Bromide is sorptive. We used different sorption concepts such as constant and organic-carbon dependent sorption coefficients and instantaneous and kinetic sorption equilibrium. Combining the sorption concepts resulted in four scenarios per substance with different equations for sorption equilibrium and sorption kinetics. The GLUE (Generalized Likelihood Uncertainty Estimation) method was applied to each scenario using parameter ranges found in experimental and modelling studies. The parameter spaces for each scenario were sampled using a Latin Hypercube method, which was refined around local model efficiency maxima.
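
    A minimal sketch of the GLUE workflow named above: sample parameter sets with a Latin hypercube, run the model, score each run with a likelihood measure such as the Nash-Sutcliffe efficiency, and retain the behavioural sets. The leaching_model placeholder, the parameter ranges and the 0.5 acceptance threshold are all invented stand-ins for the study's reactive transport model and choices.

        import numpy as np
        from scipy.stats import qmc

        def nse(obs, sim):
            # Nash-Sutcliffe efficiency, a common GLUE likelihood measure.
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def leaching_model(kd, krate, t):
            # Placeholder stand-in for the reactive-transport model: returns a
            # hypothetical leachate concentration series for a sorption
            # coefficient kd and kinetic rate krate.
            return 100.0 * np.exp(-krate * t) / (1.0 + kd)

        t = np.linspace(0.0, 3.0, 36)  # three years of monthly samples
        rng = np.random.default_rng(1)
        obs = leaching_model(2.0, 0.8, t) + rng.normal(0.0, 1.0, t.size)

        sampler = qmc.LatinHypercube(d=2, seed=0)
        params = qmc.scale(sampler.random(n=1000),
                           l_bounds=[0.1, 0.1], u_bounds=[10.0, 2.0])

        scores = np.array([nse(obs, leaching_model(kd, kr, t))
                           for kd, kr in params])
        behavioural = params[scores > 0.5]  # GLUE acceptance threshold
        print(f"{len(behavioural)} behavioural parameter sets retained")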

  11. Re-evaluation of the shielding adequacy of the brachytherapy treatment room at Korle-Bu teaching hospital, Ghana

    International Nuclear Information System (INIS)

    Arwui, C. C.

    2009-06-01

    Staff and general public safety during the operation of the 137Cs brachytherapy unit at the Korle Bu teaching hospital depends on the adequacy of the shielding of the facility. The shielding design of the brachytherapy unit at the hospital was based on a postulated workload and postulated occupancy factors for critical locations at the facility where the public and staff may be present. This facility has been in existence for the past twelve (12) years and has accumulated operational workload data which differ from the postulated values. A study was carried out to re-evaluate the integrity of the biological shielding of the 137Cs brachytherapy unit. This study analyzed the accumulated workload data and used the information to perform shielding calculations to verify the adequacy of the biological shielding thicknesses in providing sufficient protection of staff and the public. Dose rate calculations were verified by measurements with calibrated dose rate meters. This provided the basis for determining the current state of protection and safety for staff and the general public. The results show that, despite the variation between actual and postulated workloads, the dose rates were below the reference values of 0.5 μSv/h for public areas and 7.5 μSv/h for controlled areas. It was confirmed that the present shielding thickness of 535 mm can accommodate a high dose rate (HDR) 192Ir source with activity in the range 370-570 GBq, with an operational workload of 30 patients per week and an average treatment time of 10 minutes.
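
    The verification described above rests on a standard point-source estimate: inverse-square fall-off multiplied by a broad-beam transmission factor 10^(-t/TVL). The sketch below applies it to the 192Ir case mentioned in the abstract; the gamma-ray constant and the concrete TVL are assumed textbook-order values, not figures taken from the study.

        def dose_rate_uSv_h(activity_GBq, gamma_uSv_h_per_GBq_at_1m,
                            distance_m, barrier_mm, tvl_mm):
            # Point-source dose rate behind a shield: inverse-square fall-off
            # times broad-beam transmission 10^(-t/TVL).
            unshielded = activity_GBq * gamma_uSv_h_per_GBq_at_1m / distance_m ** 2
            return unshielded * 10.0 ** (-barrier_mm / tvl_mm)

        # Hypothetical check: 500 GBq 192Ir source, 5 m to the point of concern,
        # 535 mm concrete barrier, TVL ~150 mm and gamma constant
        # ~130 uSv*m^2/(h*GBq) (both assumed values).
        rate = dose_rate_uSv_h(500.0, 130.0, 5.0, 535.0, 150.0)
        print(f"dose rate behind barrier: {rate:.2f} uSv/h")  # well below 7.5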

  12. [Clinical re-evaluation of effects of different treatments to prevent from phlebitis induced by Chansu injection].

    Science.gov (United States)

    Zhao, Yubin; Hao, Zhe; Zhang, Hongdan; Shi, Jian; Xie, Yanming

    2011-10-01

    To re-evaluate the effects of different treatments to prevent phlebitis induced by Chansu injection. Patients treated with Chansu injection were divided randomly into 4 groups of 50 each: a control group, a magnesium sulfate group, a phentolaminum group, and an anisodamine group. Patients in the control group received only routine nursing treatment, and patients in the experiment groups received the different interventions. Comparisons were made of the morbidity and starting time of phlebitis, the severity of pain, and the duration of pain. In the three test groups, the morbidity of phlebitis was 8%, 8% and 6%, respectively; the starting time of phlebitis was (21 +/- 9.31), (22.34 +/- 10.15) and (20.19 +/- 11.23) h, respectively; the NRS of pain was (4.15 +/- 1.03), (3.26 +/- 1.17) and (4.32 +/- 1.36), respectively; and the duration of pain was (4.05 +/- 1.21), (3.37 +/- 1.17) and (3.19 +/- 1.67) d, respectively. In the control group, the morbidity of phlebitis, the starting time of phlebitis, the severity of pain and the duration of pain were 24%, (17 +/- 6.32) h, (6.58 +/- 1.29) and (5.32 +/- 1.12) d, respectively. A significant difference was found between each of the three test groups and the control group (P < 0.05). The morbidity of phlebitis, the severity of pain and the duration of pain were significantly reduced by external application of magnesium sulfate, by anisodamine, and by intravenous drip infusion of phentolaminum, respectively.

  13. A re-evaluation of the Italian historical geomagnetic catalogue: implications for paleomagnetic dating at active Italian volcanoes

    Directory of Open Access Journals (Sweden)

    F. D'Ajello Caracciolo

    2011-06-01

    Paleomagnetism is proving to be one of the most powerful dating tools for volcanics emplaced in Italy during the last few centuries/millennia. This method requires that valuable proxies of the local geomagnetic field (paleosecular variation, PSV) are available. To this end, we re-evaluate the whole Italian geomagnetic directional dataset, consisting of 833 declination and 696 inclination measurements carried out since 1640 AD at several localities. All directions were relocated via the virtual geomagnetic pole method to Stromboli (38.8° N, 15.2° E), the rough centre of the active Italian volcanoes. For declination-only measurements, missing inclinations were derived (always by the pole method) from French data (for the period 1670–1789) and from nearby Italian sites/years (for the periods 1640–1657 and 1790–1962). Using post-1825 declination values, we obtain a 0.46 ± 0.19° yr−1 westward drift of the geomagnetic field for Italy. The original observation years were modified, considering this drift value, to arrive at a drift-corrected relocated dataset. Both datasets were found to be in substantial agreement with directions derived from the field models by Jackson et al. (2000) and Pavon-Carrasco et al. (2009). However, the drift-corrected dataset minimizes the differences between the Italian data and both field models, and eliminates a persistent 1.6° shift of 1933–1962 declination values from Castellaccio with respect to other nearly coeval Italian data. The relocated datasets were used to calculate two post-1640 Italian SV curves, with mean directions calculated every 30 and 10 years before and after 1790, respectively. The curve comparison suggests that both available field models yield the best available SV curve to perform paleomagnetic dating of 1600–1800 AD Italian volcanics, while the Italian drift-corrected curve is probably preferable for the 19th century. For the 20th century, the global model by

  14. [Designs and thoughts of real world integrated data warehouse from HIS on re-evaluation of post-marketing traditional Chinese medicine].

    Science.gov (United States)

    Zhuang, Yan; Xie, Bangtie; Weng, Shengxin; Xie, Yanming

    2011-10-01

    To discuss the feasibility and necessity of using HIS data integration to build a large data warehouse system to be used extensively in the re-evaluation of post-marketing traditional Chinese medicine, and to provide the thought and method behind its overall design. Drawing on analysis and comparison of domestic and overseas clinical experiment designs based on real-world electronic information systems, and on the characteristics of HIS in China, a general framework was designed and discussed, covering design thought, design characteristics, existing problems and solutions, and so on. A design scheme for an HIS data warehouse on re-evaluation of post-marketing traditional Chinese medicine is presented. The design scheme was shown to be highly cohesive and loosely coupled, safe, universal, efficient and easy to maintain, and can effectively solve the problems many hospitals face during HIS data integration.

  15. Re-evaluation of salt deposits. BGR investigates subhorizontally-bedded salt layers; Salzvorkommen neu bewertet. BGR untersucht flach lagernde salinare Schichten

    Energy Technology Data Exchange (ETDEWEB)

    Hammer, Joerg [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany). Fachbereich ''Geologisch-geotechnische Erkundung'']; Fahland, Sandra [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany). Fachbereich ''Geotechnische Sicherheitsnachweise'']

    2016-05-15

    The search for a site for a repository for high-level radioactive waste was restarted in 2013. All of the potential host rocks existing in Germany must be re-evaluated and compared as a result. The list now also includes so-called ''subhorizontally-bedded evaporite formations''. BGR is analysing today's knowledge base on these salt deposits as part of the BASAL project.

  16. Recollections of parental care and quality of intimate relationships : The role of re-evaluating past attachment experiences

    NARCIS (Netherlands)

    Gerlsma, C.

    2000-01-01

    Attachment theory predicts that lack of parental care in childhood may affect the ability to relate in adulthood. While original attachment formulations have primarily focused on actual parenting experiences, more recently attachment writers increasingly emphasize the role of individual differences.

  17. [A case of carbon monoxide poisoning by explosion of coal mine presenting as visual agnosia: re-evaluation after 40 years].

    Science.gov (United States)

    Takaiwa, Akiko; Yamashita, Kenichiro; Nomura, Takuo; Shida, Kenshiro; Taniwaki, Takayuki

    2005-11-01

    We re-evaluated a case of carbon monoxide poisoning presenting as visual agnosia in a patient who had been injured in the explosion of the Miike-Mikawa coal mine 40 years earlier. In the early stage, his main neuropsychological symptoms were visual agnosia, severe anterograde amnesia, alexia, agraphia, constructional apraxia, left hemispatial neglect and psychic paralysis of gaze, in addition to pyramidal and extrapyramidal signs. At the time of re-evaluation after 40 years, he still showed visual agnosia associated with agraphia and constructional apraxia. Concerning the visual agnosia, recognition of real objects was preserved, while recognition of object photographs and pictures was impaired. Thus, this case was considered to have picture agnosia, as he could not recognize objects from pictorial cues in two-dimensional space. MRI examination revealed low-signal-intensity lesions and cortical atrophy in the bilateral parieto-occipital lobes on T1-weighted images. Therefore, the bilateral parieto-occipital lesions are likely to be responsible for his picture agnosia.

  18. A case of carbon monoxide poisoning by explosion of coal mine presenting as visual agnosia: re-evaluation after 40 years

    Energy Technology Data Exchange (ETDEWEB)

    Takaiwa, A.; Yamashita, K.; Nomura, T.; Shida, K.; Taniwaki, T. [Kyushu University, Fukuoka (Japan). Department of Neurology, Graduate School of Medical Science]

    2005-11-15

    We re-evaluated a case of carbon monoxide poisoning presenting as visual agnosia in a patient who had been injured in the explosion of the Miike-Mikawa coal mine 40 years earlier. In the early stage, his main neuropsychological symptoms were visual agnosia, severe anterograde amnesia, alexia, agraphia, constructional apraxia, left hemispatial neglect and psychic paralysis of gaze, in addition to pyramidal and extrapyramidal signs. At the time of re-evaluation after 40 years, he still showed visual agnosia associated with agraphia and constructional apraxia. Concerning the visual agnosia, recognition of real objects was preserved, while recognition of object photographs and pictures was impaired. Thus, this case was considered to have picture agnosia, as he could not recognize objects from pictorial cues in two-dimensional space. MRI examination revealed low-signal-intensity lesions and cortical atrophy in the bilateral parieto-occipital lobes on T1-weighted images. Therefore, the bilateral parieto-occipital lesions are likely to be responsible for his picture agnosia.

  19. Is group membership necessary for understanding generalized prejudice? A re-evaluation of why prejudices are interrelated.

    Science.gov (United States)

    Bergh, Robin; Akrami, Nazar; Sidanius, Jim; Sibley, Chris G

    2016-09-01

    Many scholars have proposed that people who reject one outgroup tend to reject other outgroups. Studies examining a latent factor behind different prejudices (e.g., toward ethnic and sexual minorities) have referred to this as generalized prejudice. Such research has also documented robust relations between latent prejudice factors and basic personality traits. However, targets of generalized prejudice tend to be lower in power and status, and thus it remains an open question whether generalized prejudice, as traditionally studied, is about devaluing outgroups or devaluing marginalized groups. We present 7 studies, including experiments and national probability samples (N = 9,907 and 4,037), assessing the importance of outgroup devaluation, versus status- or power-based devaluations, for understanding the nature of generalized prejudice and its links to personality. Results show that (a) personality variables do not predict ingroup/outgroup biases in settings where power and status differences are absent, (b) women and overweight people who score high on generalized prejudice devalue their own groups, and (c) personality variables are far more predictive of prejudice toward low- compared with high-status targets. Together, these findings suggest that the personality explanation of prejudice, including the generalized prejudice concept, is not about ingroups versus outgroups per se, but rather about devaluing marginalized groups.

  20. Technical note: The US Dobson station network data record prior to 2015, re-evaluation of NDACC and WOUDC archived records with WinDobson processing software

    Science.gov (United States)

    Evans, Robert D.; Petropavlovskikh, Irina; McClure-Begley, Audra; McConville, Glen; Quincy, Dorothy; Miyagawa, Koji

    2017-10-01

    The United States government has operated Dobson ozone spectrophotometers at various sites, starting during the International Geophysical Year (1 July 1957 to 31 December 1958). A network of stations for long-term monitoring of the total column content (thickness of the ozone layer) of the atmosphere was established in the early 1960s and eventually grew to 16 stations, 14 of which are still operational and submit data to the United States of America's National Oceanic and Atmospheric Administration (NOAA). Seven of these sites are also part of the Network for the Detection of Atmospheric Composition Change (NDACC), an organization that maintains its own data archive. Due to recent changes in data processing software the entire dataset was re-evaluated for possible changes. To evaluate and minimize potential changes caused by the new processing software, the reprocessed data record was compared to the original data record archived in the World Ozone and UV Data Center (WOUDC) in Toronto, Canada. The history of the observations at the individual stations, the instruments used for the NOAA network monitoring at the station, the method for reducing zenith-sky observations to total ozone, and calibration procedures were re-evaluated using data quality control tools built into the new software. At the completion of the evaluation, the new datasets are to be published as an update to the WOUDC and NDACC archives, and the entire dataset is to be made available to the scientific community. The procedure for reprocessing Dobson data and the results of the reanalysis on the archived record are presented in this paper. A summary of historical changes to 14 station records is also provided.

  1. Re-evaluation of structural shielding designs of X-ray and Co-60 gamma-ray scanners at the Port of Tema, Ghana

    International Nuclear Information System (INIS)

    Ofori, K.

    2011-07-01

    This research work was conducted to re-evaluate the shielding designs of the 6 MeV x-ray and the 1.253 MeV Co-60 gamma-ray scanners used for cargo-container scanning at the Port of Tema. These scanners utilize ionizing radiation; therefore, adequate shielding must be provided to reduce the radiation exposure of persons in and around the facilities to acceptable levels. The purpose of radiation shielding is to protect workers and the general public from the harmful effects of ionizing radiation. Investigations of the facilities indicated that, after commissioning, no work had been carried out to re-evaluate the shielding designs. However, workloads have increased over time, necessitating a review of the installed shielding. Scanner units with higher radiation energy have been introduced (as in the case of the x-ray scanner), possibly increasing dose rates at various locations and requiring a review of the shielding. New structures have been dotted around the facilities without particular attention to their distances and locations with respect to the radiation source. Measurements were taken of the distances from the source axes to the points of concern for primary and leakage barrier shielding, and from source to container and container to the points of concern for scattered radiation shielding. The primary and secondary barrier thicknesses required for both scanners were determined based on current operational parameters and compared with the thicknesses constituted during the construction of the facilities. Calculated and measured dose rates beyond the shielding barriers were used to establish the adequacy or otherwise of the shielding employed by the shielding designers. Values obtained fell below the 20 µSv/hr specified by NCRP 151 (2005), which showed that the primary and secondary shields of both facilities were adequate, requiring no additional shielding. (author)

  2. Technical note: The US Dobson station network data record prior to 2015, re-evaluation of NDACC and WOUDC archived records with WinDobson processing software

    Directory of Open Access Journals (Sweden)

    R. D. Evans

    2017-10-01

    The United States government has operated Dobson ozone spectrophotometers at various sites, starting during the International Geophysical Year (1 July 1957 to 31 December 1958). A network of stations for long-term monitoring of the total column content (thickness of the ozone layer) of the atmosphere was established in the early 1960s and eventually grew to 16 stations, 14 of which are still operational and submit data to the United States of America's National Oceanic and Atmospheric Administration (NOAA). Seven of these sites are also part of the Network for the Detection of Atmospheric Composition Change (NDACC), an organization that maintains its own data archive. Due to recent changes in data processing software the entire dataset was re-evaluated for possible changes. To evaluate and minimize potential changes caused by the new processing software, the reprocessed data record was compared to the original data record archived in the World Ozone and UV Data Center (WOUDC) in Toronto, Canada. The history of the observations at the individual stations, the instruments used for the NOAA network monitoring at the station, the method for reducing zenith-sky observations to total ozone, and calibration procedures were re-evaluated using data quality control tools built into the new software. At the completion of the evaluation, the new datasets are to be published as an update to the WOUDC and NDACC archives, and the entire dataset is to be made available to the scientific community. The procedure for reprocessing Dobson data and the results of the reanalysis on the archived record are presented in this paper. A summary of historical changes to 14 station records is also provided.

  3. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping.

    Science.gov (United States)

    Baeßler, Bettina; Schaarschmidt, Frank; Treutlein, Melanie; Stehning, Christian; Schnackenburg, Bernhard; Michels, Guido; Maintz, David; Bunck, Alexander C

    2017-12-01

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA-model. Segmental T2-values, segmental pixel-standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD-parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80%, and showed a diagnostic performance similar to LLC in receiver-operating-characteristic analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a diagnostic performance superior to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE, with the inherent potential to overcome the current limitations of T2-mapping. • A novel quantitative approach to myocardial oedema imaging in myocarditis was re-evaluated. • The T2-mapping-derived parameters maxT2 and madSD were compared to traditional Lake-Louise criteria. • Using maxT2 and madSD with dedicated cut-offs performs similarly to Lake-Louise criteria. • Adding maxT2 and madSD to LGE results in further increased diagnostic performance. • This novel approach has the potential to overcome the limitations of T2-mapping.
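
    A sketch of how the segmental parameters above could be computed from a segmented T2-map. The abstract does not define madSD; it is taken here as the median absolute deviation of the 16 segmental pixel-SDs, which is an assumption, as are all numerical values.

        import numpy as np

        def t2_inhomogeneity_params(segments):
            # segments: list of 16 arrays, one per AHA segment, each holding
            # the pixel T2 times (ms) of that segment.
            seg_t2 = np.array([s.mean() for s in segments])
            seg_sd = np.array([s.std(ddof=1) for s in segments])
            max_t2 = seg_t2.max()                 # hottest segment
            max_sd = seg_sd.max()                 # most inhomogeneous segment
            # madSD assumed here: median absolute deviation of segmental SDs.
            mad_sd = np.median(np.abs(seg_sd - np.median(seg_sd)))
            return max_t2, max_sd, mad_sd

        rng = np.random.default_rng(0)
        segments = [rng.normal(60.0 + 12.0 * (i == 4), 4.0 + 4.0 * (i == 4), 200)
                    for i in range(16)]           # one synthetic oedematous segment
        max_t2, max_sd, mad_sd = t2_inhomogeneity_params(segments)
        print(f"maxT2={max_t2:.1f} ms, maxSD={max_sd:.1f} ms, madSD={mad_sd:.2f} ms")
        print("combined cut-off met:", mad_sd > 1.8 and max_t2 > 68.0)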

  4. ["Re-evaluation upon suspected event" is an approach for post-marketing clinical study: lessons from adverse drug events related to Bupleuri Radix preparations].

    Science.gov (United States)

    Wu, Shu-Xin; Sun, Hong-Feng; Yang, Xiao-Hui; Long, Hong-Zhu; Ye, Zu-Guang; Ji, Shao-Liang; Zhang, Li

    2014-08-01

    We revisited the "Xiao Chaihu Decoction event" (XCHDE) that occurred in Japan in the late 1980s and the Bupleuri Radix related adverse drug reaction (ADR) reports in China. After careful review, comparison, analysis and evaluation, we think the interstitial pneumonitis, drug-induced liver injury (DILI) and other severe adverse drug events (ADEs), including death, that happened in Japan probably resulted from multiple factors, including combinatory use of XCHD with interferon, Kampo usage under modern medicine theory guidance, and use of XCHD on the basis of disease diagnosis instead of traditional Chinese syndrome complex differentiation. There are fewer ADE case reports related to XCHD preparations in China compared to Japan, mostly manifesting as hypersensitivity responses of the skin and profuse perspiration. The symptoms of Radix Bupleuri injection related ADEs mainly manifest as hypersensitivity-like responses; 2 cases given intravenous infusion instead of intramuscular injection developed hypokalemia and renal failure, and one case died from severe hypersensitivity shock. In the Chinese literature, there is no report of the interstitial pneumonitis and DILI associated with XCHD in Japan. So far, there are no voluntary monitoring data and no large-sample clinical research data available. The authors elaborate the classification of "re-evaluation" and clarify that "re-evaluation upon events" includes the reaction to suspected safety and efficacy events. Based on the current status of clinical research on Radix Bupleuri preparations, the authors point out that post-marketing "re-evaluation upon suspected event" is not only a necessity for continuous evaluation of the safety and efficacy of drugs, but also a necessity for providing objective clinical research data to share with international and domestic drug administrations in risk-benefit evaluation. It is also the unavoidable pathway to cultivate and push the excellent species and famous brands of TCM to the international market, in

  5. Raman spectroscopy of gold chloro-hydroxy speciation in fluids at ambient temperature and pressure: a re-evaluation of the effects of pH and chloride concentration

    Science.gov (United States)

    Murphy, P. J.; LaGrange, M. S.

    1998-11-01

    Previous work on gold chloride and hydroxide speciation in fluids has shown differences in opinion as to the relative importance of gold(I) and gold(III) species, as well as for the Raman peak assignments for the various species. In addition, previous experimental work has not been consistent with theoretical predictions either of the number or of the frequencies of the peaks in the Raman spectrum. In order to re-evaluate the effect of pH on Raman spectra and speciation, solutions containing gold(III) chloride were analysed by Raman spectroscopy at ambient temperature and pressure, over a range of pH from 1 to 11. Total gold concentrations were from 0.001 to 0.02 M, with total chloride concentrations of 0.004-0.5 M. The spectra obtained are consistent with the hydrolysis sequence of square-planar Au(III) complex ions [AuClx(OH)4-x]-, where x = 0-4. The Au-Cl stretching peaks obtained were 348/325 cm-1 for [AuCl4]-, 348/335/325 cm-1 for [AuCl3(OH)]-, 337/355 cm-1 for [AuCl2(OH)2]-, and 355 cm-1 for [AuCl(OH)3]-. [Au(OH)4]- probably occurred alongside [AuCl(OH)3]- at pH values above 11. A dark purplish-grey precipitate (Au(I)OH) formed at high pH values. No evidence for Au(I) species was found. The spectra are more consistent with theory than previous data and show the predicted number of peaks for Au-Cl and Au-OH stretches for each species. However, the peak frequencies do not fit precisely with the predictions of Tossell (1996), particularly for Au-OH stretches. Hydrolysis of the simple chloride species occurs at lower pH values than found previously, and both gold and chloride concentration were found to affect the pH ranges of stability for the various chloro-hydroxy species. Decreasing gold concentration resulted in hydrolysis occurring at lower pH values. This is especially important in the absence of excess chloride (ΣCl = 4ΣAu). Substantial hydrolysis occurred below pH = 4 for 0.02 M Au/0.08 M Cl-, and below pH = 2 for 0.001 M

  6. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping

    International Nuclear Information System (INIS)

    Baessler, Bettina; Treutlein, Melanie; Maintz, David; Bunck, Alexander C.; Schaarschmidt, Frank; Stehning, Christian; Schnackenburg, Bernhard; Michels, Guido

    2017-01-01

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA model. Segmental T2 values, segmental pixel standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a similar diagnostic performance compared to LLC in receiver-operating-characteristic analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a superior diagnostic performance compared to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE, with the inherent potential to overcome the current limitations of T2-mapping. (orig.)

  7. Concrete reflected cylinders of highly enriched solutions of uranyl nitrate ICSBEP Benchmark: A re-evaluation by means of MCNPX using ENDF/B-VI cross section library

    International Nuclear Information System (INIS)

    Cruzate, J.A.; Carelli, J.L.

    2011-01-01

    This work presents a theoretical re-evaluation of a set of original experiments included in the 2009 issue of the International Handbook of Evaluated Criticality Safety Benchmark Experiments as “Concrete Reflected Cylinders of Highly Enriched Solutions of Uranyl Nitrate” (identification number: HEU-SOL-THERM-002) [4]. The present evaluation has been made according to the benchmark specifications [4], with added data taken from the original published report [3], but applying a different approach, resulting in a more realistic calculation model. In addition, calculations have been made using the latest version of the MCNPX Monte Carlo code, combined with an updated set of cross section data, the continuous-energy ENDF/B-VI library. This has resulted in a comprehensive model for the given experimental situation. Uncertainty analysis has been made based on the evaluation of experimental data presented in the HEU-SOL-THERM-002 report. Calculations with the present improved physical model have been able to reproduce the criticality of the configurations within 0.5%, in good agreement with experimental data. Results obtained in the analysis of uncertainties are in general agreement with those in the HEU-SOL-THERM-002 benchmark document. Qualitative results from the analyses made in the present work can be extended to similar fissile systems: well moderated units of 235U solutions, reflected with concrete from all directions. Results have confirmed that neutron absorbers, even as impurities, must be taken into account in calculations if at least approximate proportions are known. (authors)

  8. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping

    Energy Technology Data Exchange (ETDEWEB)

    Baessler, Bettina; Treutlein, Melanie; Maintz, David; Bunck, Alexander C. [University Hospital of Cologne, Department of Radiology, Cologne (Germany); Schaarschmidt, Frank [Leibniz Universitaet Hannover, Institute of Biostatistics, Faculty of Natural Sciences, Hannover (Germany); Stehning, Christian [Philips Research, Hamburg (Germany); Schnackenburg, Bernhard [Philips, Healthcare Germany, Hamburg (Germany); Michels, Guido [University Hospital of Cologne, Department III of Internal Medicine, Heart Centre, Cologne (Germany)

    2017-12-15

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA model. Segmental T2 values, segmental pixel standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a similar diagnostic performance compared to LLC in receiver-operating-characteristic analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a superior diagnostic performance compared to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE, with the inherent potential to overcome the current limitations of T2-mapping. (orig.)

  9. Calcium lignosulphonate: re-evaluation of relevant endpoints to re-confirm validity and NOAEL of a 90-day feeding study in rats.

    Science.gov (United States)

    Thiel, Anette; Braun, William; Cary, Maurice G; Engelhardt, Jeffery A; Goodman, Dawn G; Hall, William C; Romeike, Annette; Ward, Jerrold M

    2013-08-01

    A 90-day feeding study in Han/Wistar rats with calcium lignosulphonate was evaluated by the EFSA. The study was considered to be inadequate due to the potentially impaired health status of the animals, based upon a high incidence of minimal lymphoid hyperplasia in mesenteric/mandibular lymph nodes and Peyer's patches, and minimal lymphoid cell infiltration in the liver in all animals. The EFSA Panel further disagreed with the conclusion that the treatment-related observation of foamy histiocytosis in mesenteric lymph nodes was non-adverse and asked whether this observation would progress to something more adverse over time. A Pathology Working Group (PWG) was convened to assess the sections of lymph nodes, Peyer's patches and liver. In addition, all lymphoid tissues were re-examined. The clinical pathology and animal colony health screening data were re-evaluated. The question of whether the foamy histiocytosis could progress to an adverse finding with increasing exposure duration was addressed by read-across. In conclusion, the animals in the 90-day feeding study were in good health, the study was adequate for safety evaluation, and the foamy histiocytes in the mesenteric lymph nodes were not considered adverse, but rather an adaptive response that was considered unlikely to progress to an adverse condition with time. The NOAEL was re-affirmed to be 2000 mg/kg bw/day. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. p38 MAPK as an essential regulator of dorsal-ventral axis specification and skeletogenesis during sea urchin development: a re-evaluation.

    Science.gov (United States)

    Molina, Maria Dolores; Quirin, Magali; Haillot, Emmanuel; Jimenez, Felipe; Chessel, Aline; Lepage, Thierry

    2017-06-15

    Dorsal-ventral axis formation in the sea urchin embryo relies on the asymmetrical expression of the TGFβ Nodal. The p38-MAPK pathway has been proposed to be essential for dorsal-ventral axis formation by acting upstream of nodal expression. Here, we report that, in contrast to previous studies that used pharmacological inhibitors of p38, manipulating the activity of p38 by genetic means has no obvious impact on morphogenesis. Instead, we discovered that p38 inhibitors strongly disrupt specification of all germ layers by blocking signalling from the Nodal receptor and by interfering with the ERK pathway. Strikingly, while expression of a mutant p38 that is resistant to SB203580 did not rescue dorsal-ventral axis formation or skeletogenesis in embryos treated with this inhibitor, expression of mutant Nodal receptors that are resistant to SB203580 fully restored nodal expression in SB203580-treated embryos. Taken together, these results establish that p38 activity is required neither for dorsal-ventral axis formation through nodal expression nor for skeletogenesis. Our results prompt a re-evaluation of the conclusions of several recent studies that linked p38 activity to dorsal-ventral axis formation and to patterning of the skeleton. © 2017. Published by The Company of Biologists Ltd.

  11. Decadal re-evaluation of contaminant exposure and productivity of ospreys (Pandion haliaetus) nesting in Chesapeake Bay Regions of Concern

    International Nuclear Information System (INIS)

    Lazarus, Rebecca S.; Rattner, Barnett A.; McGowan, Peter C.; Hale, Robert C.; Schultz, Sandra L.; Karouna-Renier, Natalie K.; Ottinger, Mary Ann

    2015-01-01

    The last large-scale ecotoxicological study of ospreys (Pandion haliaetus) in Chesapeake Bay was conducted in 2000–2001 and focused on U.S. EPA-designated Regions of Concern (ROCs; Baltimore Harbor/Patapsco, Anacostia/middle Potomac, and Elizabeth Rivers). In 2011–2012, ROCs were re-evaluated to determine spatial and temporal trends in productivity and contaminants. Concentrations of p,p′-DDE were low in eggs and below the threshold associated with eggshell thinning. Eggs from the Anacostia/middle Potomac Rivers had lower total PCB concentrations in 2011 than in 2000; however, concentrations remained unchanged in Baltimore Harbor. Polybrominated diphenyl ether flame retardants declined by 40%, and five alternative brominated flame retardants were detected at low levels. Osprey productivity was adequate to sustain local populations, and there was no relation between productivity and halogenated contaminants. Our findings document continued recovery of the osprey population, declining levels of many persistent halogenated compounds, and modest evidence of genetic damage in nestlings from industrialized regions. - Highlights: • This study documents the continued recovery of the Chesapeake Bay osprey population. • Osprey eggshells have nearly returned to pre-DDT-era thickness. • Organochlorine pesticides are low in eggs, but PCB levels seem unchanged in industrialized areas. • PBDE flame retardants have declined in eggs, but seem to peak near wastewater treatment plants. • There is some evidence of genetic damage in nestling blood samples in the most industrialized areas. - While the Chesapeake Bay osprey population has recovered, concentrations of some persistent contaminants in eggs remain unchanged, and there is some evidence of genetic damage in nestlings

  12. A re-evaluation of the genus Myceliophthora (Sordariales, Ascomycota): its segregation into four genera and description of Corynascus fumimontanus sp. nov.

    Science.gov (United States)

    Marin-Felix, Yasmina; Stchigel, Alberto M; Miller, Andrew N; Guarro, Josep; Cano-Lira, José F

    2015-01-01

    Based on a number of isolates of Myceliophthora (Chaetomiaceae, Sordariales, Ascomycota) recently isolated from soil samples collected in the USA, the taxonomy of the genus was re-evaluated through phylogenetic analyses of sequences from the nuc rDNA internal transcribed spacer region and genes for the second largest subunit of RNA polymerase II and translation elongation factor 1α. Members of Myceliophthora were split into four monophyletic clades strongly supported by molecular and phenotypic data. These clades correspond with Myceliophthora, now restricted only to the type species of the genus; Corynascus, which is re-established with five species; the new monotypic genus Crassicarpon; and the new genus Thermothelomyces (comprising four species). Myceliophthora lutea is mesophilic and permanently asexual, in contrast to the members of the other three genera, which are also able to reproduce sexually, with experimentally proven links between their sexual and asexual morphs. The asexual morph of M. lutea is characterized by broadly ellipsoidal, smooth-walled conidia with a wide, truncate base. Crassicarpon thermophilum is thermophilic and heterothallic and produces spherical to cuneiform, smooth-walled conidia and cleistothecial ascomata of smooth-walled, angular cells, and ascospores with a germ pore at each end. Corynascus spp. are homothallic and mesophilic and produce spherical, mostly ornamented conidia and cleistothecial ascomata with textura epidermoidea composed of ornamented wall cells, and ascospores with one germ pore at each end. Thermothelomyces spp. are thermophilic and heterothallic and are characterized by ascomata and conidia similar to those of Corynascus spp., but their ascospores exhibit only a single germ pore. A dichotomous key to distinguish Myceliophthora from the other mentioned genera is provided, as well as dichotomous keys to identify the species of Corynascus and Thermothelomyces. A new species, namely Corynascus fumimontanus, characterized by

  13. Re-evaluation of Assay Data of Spent Nuclear Fuel obtained at Japan Atomic Energy Research Institute for validation of burnup calculation code systems

    Energy Technology Data Exchange (ETDEWEB)

    Suyama, Kenya, E-mail: suyama.kenya@jaea.go.jp [Office of International Relations, Nuclear Safety Division, Ministry of Education, Culture, Sports, Science and Technology - Japan, 3-2-2 Kasumigaseki, Chiyoda-ku, Tokyo 100-8959 (Japan); Murazaki, Minoru; Ohkubo, Kiyoshi [Fuel Cycle Safety Research Group, Nuclear Safety Research Center, Japan Atomic Energy Agency, 2-4 Shirakata Shirane, Tokai-mura, Ibaraki 319-1195 (Japan); Nakahara, Yoshinori [Research Group for Analytical Science, Nuclear Science and Engineering Directorate, Japan Atomic Energy Agency, 2-4 Shirakata Shirane, Tokai-mura, Ibaraki 319-1195 (Japan); Uchiyama, Gunzo [Fuel Cycle Safety Research Group, Nuclear Safety Research Center, Japan Atomic Energy Agency, 2-4 Shirakata Shirane, Tokai-mura, Ibaraki 319-1195 (Japan)

    2011-05-15

    Highlights: > The specifications required for the analyses of the destructive assay data taken from irradiated fuel in the Ohi-1 and Ohi-2 PWRs are documented in this paper. > These data were analyzed using the SWAT2.1 code, and the calculation results showed good agreement with experimental results. > These destructive assay data are suitable for the benchmarking of burnup calculation code systems. - Abstract: The isotopic composition of spent nuclear fuel is vital data for studies on the nuclear fuel cycle and reactor physics. The Japan Atomic Energy Agency (JAEA) has been active in obtaining such data for pressurized water reactor (PWR) and boiling water reactor (BWR) fuels, and some data have already been published. These data have been registered with the international Spent Fuel Isotopic Composition Database (SFCOMPO) and widely used as international benchmarks for burnup calculation codes and libraries. In this paper, assay data of spent nuclear fuel from two fuel assemblies irradiated in the Ohi-1 and Ohi-2 PWRs in Japan are presented. The destructive assay data from Ohi-2 have already been published. However, these data were not suitable for the benchmarking of calculation codes and libraries because several important specifications and data were not included. This paper summarizes the details of the destructive assay data and the specifications required for analyses of isotopic composition from Ohi-1 and Ohi-2. For precise burnup analyses, the burnup values of the destructive assay samples were re-evaluated in this study. These destructive assay data were analyzed using the SWAT2.1 code, and the calculation results showed good agreement with experimental results. This indicates that the quality of the destructive assay data from the Ohi-1 and Ohi-2 PWRs is high, and that these destructive assay data are suitable for the benchmarking of burnup calculation code systems.

  14. Covariance and decoupling of floral and vegetative traits in nine Neotropical plants: a re-evaluation of Berg's correlation-pleiades concept.

    Science.gov (United States)

    Armbruster, W S; Di Stilio, V S; Tuxill, J D; Flores, T C; Velásquez Runk, J L

    1999-01-01

    Nearly forty years ago, R. L. Berg proposed that plants with specialized pollination ecology evolve genetic and developmental systems that decouple floral morphology from phenotypic variation in vegetative traits. These species evolve separate floral and vegetative trait clusters, or as she termed them, "correlation pleiades." The predictions of this hypothesis have been generally supported, but only a small sample of temperate-zone herb and grass species has been tested. To further evaluate this hypothesis, especially its applicability to plants of other growth forms, we examined the patterns of phenotypic variation and covariation of floral and vegetative traits in nine species of Neotropical plants. We recognized seven specific predictions of Berg's hypothesis. Our results supported some predictions but not others. Species with specialized pollination systems usually had floral traits decoupled (weak correlation; Canna and Eichhornia) or buffered (relationship with shallow proportional slope; Calathea and Canna) from variation in vegetative traits. However, the same trend was also observed in three species with unspecialized pollination systems (Echinodorus, Muntingia, and Wedelia). One species with unspecialized pollination (Croton) and one wind-pollinated species (Cyperus) showed no decoupling or buffering, as predicted. While species with specialized pollination usually showed lower coefficients of variation for floral traits than vegetative traits (as predicted), the same was also true of species with unspecialized or wind pollination (unlike our prediction). Species with specialized pollination showed less variation in floral traits than did species with unspecialized or wind pollination, as predicted. However, the same was true of the corresponding vegetative traits, which was unexpected. Also in contrast to our prediction, plants with specialized pollination systems did not exhibit tighter phenotypic integration of floral characters than did species with
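
    Berg's predictions are tested with elementary statistics: coefficients of variation (CV) for floral versus vegetative traits, and floral-vegetative correlations as a measure of decoupling. A minimal Python sketch of such a comparison, on hypothetical measurements (all numbers invented for illustration):

    import numpy as np

    def cv(x):
        """Coefficient of variation: sample SD as a fraction of the mean."""
        x = np.asarray(x, dtype=float)
        return x.std(ddof=1) / x.mean()

    # Hypothetical measurements for one species (six plants)
    floral = np.array([12.1, 12.4, 11.9, 12.2, 12.0, 12.3])       # corolla length, mm
    vegetative = np.array([81.0, 95.5, 70.2, 110.3, 88.7, 64.9])  # leaf length, mm

    print("CV floral:", round(cv(floral), 3), " CV vegetative:", round(cv(vegetative), 3))
    # Decoupling shows up as a weak floral-vegetative correlation
    print("r:", round(np.corrcoef(floral, vegetative)[0, 1], 3))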

  15. Adaptation and Implementation of Predictive Maintenance Technique with Nondestructive Testing for Power Plants

    International Nuclear Information System (INIS)

    Jung, Gye Jo; Jung, Nam Gun

    2010-01-01

    Many forces are pressuring utilities to reduce operating and maintenance costs without cutting back on reliability or availability, and many utility managers are re-evaluating maintenance strategies to meet these demands. To help utilities reduce maintenance costs and extend the effective operating life of equipment, predictive maintenance techniques can be adopted. Predictive maintenance programs come in three types: in-house programs, engineering company programs and mixed programs. A successful predictive maintenance program can be approached with the 'smart trust' concept.

  16. Re: Evaluation of the size and type of free particulates collected from unused asbestos-containing brake components as related to potential for respirability.

    Science.gov (United States)

    Paustenbach, Dennis J; Finley, Brent L; Sheehan, Patrick J; Brorby, Gregory P

    2006-01-01

    In Atkinson et al. (2004), rinsates of unused brake components were analyzed by transmission electron microscopy (TEM) for the presence of asbestos fibers. We do not believe that the findings of Atkinson et al. are informative: they could have been predicted from the study design and the fact that one would expect to find measurable TEM asbestos fibers on an unused brake component. We also find that the paper did not provide a full, or even partial, discussion of the published literature with respect to industrial hygiene or epidemiology data. The findings of Atkinson et al. do not, in our view, "further raise concerns" about historical asbestos exposures experienced by automotive mechanics, given the vast amount of published literature to the contrary. Am. J. Ind. Med. 49:60-61, 2006. (c) 2005 Wiley-Liss, Inc.

  17. SU-F-T-377: Monte Carlo Re-Evaluation of Volumetric-Modulated Arc Plans of Advanced Stage Nasopharyngeal Cancers Optimized with Convolution-Superposition Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Lee, K; Leung, R; Law, G; Wong, M; Lee, V; Tung, S; Cheung, S; Chan, M [Tuen Mun Hospital, Hong Kong (Hong Kong)

    2016-06-15

    Background: The commercial treatment planning system Pinnacle3 (Philips, Fitchburg, WI, USA) employs a convolution-superposition (CS) algorithm for volumetric-modulated arc radiotherapy (VMAT) optimization and dose calculation. Study of Monte Carlo (MC) dose recalculation of VMAT plans for advanced-stage nasopharyngeal cancers (NPC) is currently limited. Methods: Twenty-nine VMAT plans prescribing 70 Gy, 60 Gy, and 54 Gy to the planning target volumes (PTVs) were included. These clinical plans, achieved with a CS dose engine in Pinnacle3 v9.0, were recalculated by the Monaco TPS v5.0 (Elekta, Maryland Heights, MO, USA) with an XVMC-based MC dose engine. The MC virtual source model was built using the same measurement beam dataset as for the Pinnacle beam model. All MC recalculations were based on absorbed dose to medium in medium (Dm,m). Differences in dose constraint parameters per our institutional protocol (Supplementary Table 1) were analyzed. Results: Only the differences in maximum dose to the left brachial plexus, the left temporal lobe and PTV54Gy were found to be statistically insignificant (p > 0.05). Dosimetric differences for the other tumor targets and normal organs are found in Supplementary Table 1. Generally, doses outside the PTV in the normal organs are lower with MC than with CS. This is also true of the PTV54-70Gy doses, but a higher dose in the nasal cavity near the bone interfaces is consistently predicted by MC, possibly due to increased backscattering of short-range scattered photons and secondary electrons that is not properly modeled by the CS algorithm. The straight shoulders of the PTV dose-volume histograms (DVHs) initially resulting from the CS optimization are merely preserved after MC recalculation. Conclusion: Significant dosimetric differences in VMAT NPC plans were observed between CS and MC calculations. Adjustments of the planning dose constraints to incorporate the physics differences from the conventional CS algorithm should be made when VMAT optimization is carried out directly

  18. Re-evaluation of the technical basis for the regulation of pressurized thermal shock in U.S. pressurized water reactor vessels

    Energy Technology Data Exchange (ETDEWEB)

    Malik, S.N.; Kirk, M.T.; Jackson, D.A.; Hackett, E.M.; Chokshi, N.C.; Siu, N.O.; Woods, H.W.; Bessette, D.E. [Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission, Washington, D.C. (United States); Dickson, T.L. [Oak Ridge National Lab., Computational Physics and Engineering Div., Oak Ridge, TN (United States)

    2001-07-01

    The current federal regulation ensuring that pressurized-water nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to potential pressurized thermal shock (PTS) events during the life of the plant was derived from computational models and technologies that were developed in the early-to-mid 1980s. Since that time, there have been several advancements and refinements to the relevant fracture technology, materials characterization methods, probabilistic risk assessment (PRA) and thermal-hydraulics (TH) computational methods. Preliminary studies performed in 1998 (applying this new technology) indicated the potential that technical bases can be established to support a relaxation of the current federal regulation (10 CFR 50.61) for PTS. A revision of the PTS regulation could have significant implications for plants reaching their end-of-license periods and for future plant license-extension considerations. Based on the above, in 1999 the United States Nuclear Regulatory Commission initiated a comprehensive project, with the nuclear industry as a participant, to revisit the technical bases for the current regulations on PTS. This paper provides an overview and status of the methodology that has evolved over the last two years through interactions between experts in the relevant disciplines (TH, PRA, materials and fracture mechanics, and non-destructive and destructive examination to predict the distribution of fabrication-induced flaws in the belt-line region of PWR vessels) from the NRC staff, their contractors, and representatives from the nuclear industry. This updated methodology is currently being implemented in the FAVOR (Fracture Analysis of Vessels: Oak Ridge) computer code for application to re-examine the adequacy of the current regulations and to determine whether a technical basis can be established for relaxing the current regulation. It is anticipated that the effort will be completed in 2002. (authors)

  19. Re-evaluating the resource potential of lomas fog oasis environments for Preceramic hunter-gatherers under past ENSO modes on the south coast of Peru

    Science.gov (United States)

    Beresford-Jones, David; Pullen, Alexander G.; Whaley, Oliver Q.; Moat, Justin; Chauca, George; Cadwallader, Lauren; Arce, Susana; Orellana, Alfonso; Alarcón, Carmela; Gorriti, Manuel; Maita, Patricia K.; Sturt, Fraser; Dupeyron, Agathe; Huaman, Oliver; Lane, Kevin J.; French, Charles

    2015-12-01

    Lomas - ephemeral seasonal oases sustained by ocean fogs - were critical to ancient human ecology on the desert Pacific coast of Peru: one of humanity's few independent hearths of agriculture and "pristine" civilisation. The role of climate change since the Late Pleistocene in determining the productivity and extent of past lomas ecosystems has been much debated. Here we reassess the resource potential of the poorly studied lomas of the south coast of Peru during the long Middle Preceramic period (c. 8000-4500 BP): a period critical to the transition to agriculture, the onset of modern El Niño Southern Oscillation ('ENSO') conditions, and eustatic sea-level rise, stabilisation and beach progradation. Our method combines vegetation survey and herbarium collection with archaeological survey and excavation to make inferences about both Preceramic hunter-gatherer ecology and the changed palaeoenvironments in which it took place. Our analysis of newly discovered archaeological sites - and their resource context - shows how lomas formations defined human ecology until the end of the Middle Preceramic period, thereby corroborating recent reconstructions of ENSO history based on other data. Together, these suggest that a five-millennia period of significantly colder seas on the south coast induced conditions of abundance and seasonal predictability in lomas and maritime ecosystems that enabled Middle Preceramic hunter-gatherers to reduce mobility by settling in strategic locations at the confluence of multiple eco-zones at the river estuaries. Here the foundations of agriculture lay in a Broad Spectrum Revolution that unfolded, not through population pressure in deteriorating environments, but rather as an outcome of resource abundance.

  20. Re-evaluation of the technical basis for the regulation of pressurized thermal shock in U.S. pressurized water reactor vessels

    International Nuclear Information System (INIS)

    Malik, S.N.; Kirk, M.T.; Jackson, D.A.; Hackett, E.M.; Chokshi, N.C.; Siu, N.O.; Woods, H.W.; Bessette, D.E.; Dickson, T.L.

    2001-01-01

    The current federal regulation ensuring that pressurized-water nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to potential pressurized thermal shock (PTS) events during the life of the plant was derived from computational models and technologies that were developed in the early-to-mid 1980s. Since that time, there have been several advancements and refinements to the relevant fracture technology, materials characterization methods, probabilistic risk assessment (PRA) and thermal-hydraulics (TH) computational methods. Preliminary studies performed in 1998 (applying this new technology) indicated the potential that technical bases can be established to support a relaxation of the current federal regulation (10 CFR 50.61) for PTS. A revision of the PTS regulation could have significant implications for plants reaching their end-of-license periods and for future plant license-extension considerations. Based on the above, in 1999 the United States Nuclear Regulatory Commission initiated a comprehensive project, with the nuclear industry as a participant, to revisit the technical bases for the current regulations on PTS. This paper provides an overview and status of the methodology that has evolved over the last two years through interactions between experts in the relevant disciplines (TH, PRA, materials and fracture mechanics, and non-destructive and destructive examination to predict the distribution of fabrication-induced flaws in the belt-line region of PWR vessels) from the NRC staff, their contractors, and representatives from the nuclear industry. This updated methodology is currently being implemented in the FAVOR (Fracture Analysis of Vessels: Oak Ridge) computer code for application to re-examine the adequacy of the current regulations and to determine whether a technical basis can be established for relaxing the current regulation. It is anticipated that the effort will be completed in 2002. (authors)

  1. Re-evaluation of temperature at the updip limit of locked portion of Nankai megasplay inferred from IODP Site C0002 temperature observatory

    Science.gov (United States)

    Sugihara, Takamitsu; Kinoshita, Masataka; Araki, Eichiro; Kimura, Toshinori; Kyo, Masanori; Namba, Yasuhiro; Kido, Yukari; Sanada, Yoshinori; Thu, Moe Kyaw

    2014-12-01

    In 2010, the first long-term borehole monitoring system was deployed at approximately 900 m below the sea floor (mbsf), a depth assumed to lie above the updip limit of the seismogenic zone in the Nankai Trough off Kumano (Site C0002). Four temperature records show that the thermal effect of drilling diminished in less than 2 years. Based on in situ temperatures and thermal conductivities measured on core samples, the temperature and heat flow at 900 mbsf are estimated to be 37.9°C and 56 ± 1 mW/m2, respectively. This heat flow value is in excellent agreement with that from the shallow borehole temperature corrected for rapid sedimentation in the Kumano Basin. We use these values in the present study to extrapolate the temperature below 900 mbsf for a megasplay fault at approximately 5,200 mbsf and a plate boundary fault at approximately 7,000 mbsf. To extrapolate the temperature downward, we use logging-while-drilling (LWD) bit resistivity data as a proxy for porosity and estimate thermal conductivity from this porosity using a geometric mean model. The one-dimensional (1-D) thermal conduction model used for the extrapolation includes radioactive heat production and frictional heat production at the plate boundary fault. The estimated temperature at the megasplay ranges from 132°C to 149°C, depending on the assumed thermal conductivity and radioactive heat production values. These values are significantly higher, by up to 40°C, than some previous two-dimensional (2-D) numerical model predictions that account for the high heat flow seaward of the deformation front, including hydrothermal circulation within the subducted igneous oceanic crust. However, our results are in good agreement with those of the 2-D model that does not include the advective cooling effect. The results imply that 2-D geometrical effects as well as the influence of advective cooling may be critical and should be evaluated more quantitatively. Revision of 2-D simulation by
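
    The downward extrapolation described here reduces to integrating a steady-state 1-D conduction model: the upward heat flow q decreases with depth by the radiogenic production A, and temperature accumulates as q/k with a depth-dependent conductivity k(z). A Python sketch under simplified assumptions (uniform A, an exponential-porosity stand-in for the LWD-derived profile, frictional heating omitted; only the 37.9°C and 56 mW/m2 anchor values are taken from the record):

    import numpy as np

    z0, T0, q0 = 900.0, 37.9, 56e-3        # anchor at 900 mbsf: m, degC, W/m2
    A = 1.0e-6                             # W/m3, assumed radiogenic heat production
    z = np.linspace(z0, 5200.0, 4000)      # down to the megasplay (~5,200 mbsf)

    # Assumed porosity decay; geometric mean of water (0.6) and grain (2.8)
    # conductivities, standing in for the resistivity-derived profile.
    phi = 0.6 * np.exp(-z / 2000.0)
    k = (0.6 ** phi) * (2.8 ** (1.0 - phi))            # W/m/K

    q = q0 - A * (z - z0)                  # steady state: flow decreases downward
    dTdz = q / k                           # conductive gradient, K/m
    T = T0 + np.concatenate(([0.0], np.cumsum(0.5 * (dTdz[1:] + dTdz[:-1]) * np.diff(z))))
    print(f"T at ~5200 mbsf: {T[-1]:.0f} degC")        # lands near the 132-149 degC range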

  2. Re-evaluating DSM-I.

    Science.gov (United States)

    Cooper, R; Blashfield, R K

    2016-02-01

    The DSM-I is currently viewed as a psychoanalytic classification, and therefore unimportant. There are four reasons to challenge the belief that DSM-I was a psychoanalytic system. First, psychoanalysts were a minority on the committee that created DSM-I. Second, psychoanalysts of the time did not use DSM-I. Third, DSM-I was as infused with Kraepelinian concepts as it was with psychoanalytic concepts. Fourth, contemporary writers who commented on DSM-I did not perceive it as psychoanalytic. The first edition of the DSM arose from a blending of concepts from the Statistical Manual for the Use of Hospitals of Mental Diseases, the military psychiatric classifications developed during World War II, and the International Classification of Diseases (6th edition). As a consensual, clinically oriented classification, DSM-I was popular, leading to 20 printings and international recognition. From the perspective inherent in this paper, the continuities between classifications from the first half of the 20th century and the systems developed in the second half (e.g. DSM-III to DSM-5) become more visible.

  3. Overview of seismic re-evaluation methodologies

    International Nuclear Information System (INIS)

    Campbell, R.D.; Johnson, J.J.

    1993-01-01

    Several seismic licensing and safety issues have emerged over the past fifteen years for commercial U.S. nuclear power plants and U.S. government research reactors, production reactors and process facilities. The methodologies for resolution of these issues have been developed in numerous government- and utility-sponsored research programs. The resolution criteria have included conservative deterministic design criteria, deterministic seismic margin assessment (SMA) criteria and seismic probabilistic safety assessment (SPSA) criteria. The criteria for SMAs and SPSAs have been based on realistically considering the inelastic energy absorption capability of ductile structures, equipment and piping, and have incorporated the use of earthquake and testing experience to evaluate the operability of complex mechanical and electrical equipment. Most of the applications to date have been confined to the U.S., but there have been several applications to Asian, Western European and Eastern European reactors. This paper summarizes the major issues addressed, the development of re-evaluation criteria and selected applications to non-U.S. reactors, including WWER reactors. (author)

  4. Re-evaluation of traditional Mediterranean foods. The local landraces of 'Cipolla di Giarratana' (Allium cepa L.) and long-storage tomato (Lycopersicon esculentum L.): quality traits and polyphenol content.

    Science.gov (United States)

    Siracusa, Laura; Avola, Giovanni; Patanè, Cristina; Riggi, Ezio; Ruberto, Giuseppe

    2013-11-01

    The heightened consumer awareness of food safety is reflected in the demand for products with well-defined individual characteristics due to specific production methods, composition and origin. In this context, the re-evaluation of folk/traditional foods, by properly characterizing them in terms of peculiarity and nutritional value, is of pivotal importance. The subjects of this study are two typical Mediterranean edible products. The main morphological, biometrical and productive traits and polyphenol contents of three onion genotypes ('Cipolla di Giarratana', 'Iblea' and 'Tonda Musona') and three long-storage tomato landraces ('Montallegro', 'Filicudi' and 'Principe Borghese') were investigated. Sicilian onion landraces were characterized by large bulbs, with 'Cipolla di Giarratana' showing the highest bulb weight (605 g), yield (151 t ha(-1)) and total polyphenol content (123.5 mg kg(-1)). Landraces of long-storage tomato were characterized by low productivity (up to 20 t ha(-1)), but more than 70% of the total production was obtained with the first harvest, allowing harvest costs to be reduced. High contents of polyphenols were found, probably related to the typical small fruit size and thick skin characterizing these landraces. The present study overviews some of the most important traits that could support traditional landrace characterization and their nutritional value assessment. © 2013 Society of Chemical Industry.

  5. A re-evaluation of k0 and related nuclear data for the 555.8 keV gamma-line emitted by the 104mRh-104Rh mother-daughter pair for use in NAA

    International Nuclear Information System (INIS)

    Corte, Frans de; Lierde, Stijn van; Simonits, Andras; Bossus, Danieel; Sluijs, Robbert van; Pomme, Stefaan

    1999-01-01

    A re-evaluation is made of the k0-factor and related nuclear data for the 555.8 keV gamma-ray of the 104mRh-104Rh mother-daughter pair that are important in neutron activation analysis (NAA). This study considers that the relevant level is also fed by the 4.34 min 104mRh mother (with an absolute gamma-ray emission probability γ2 = 0.13%) and not only, as assumed in former work, by the 42.3 s 104Rh daughter isotope (with γ3 = 2.0%). In view of this, generalised equations were developed for both the experimental determination and the analytical use of the k0-factor and of the associated parameters k0(m)/k0(g), Q0(m) and Q0(g) [(m): 104mRh; (g): 104Rh], requiring the introduction of the γ2 and γ3 data and also of the 104mRh → 104Rh fractional decay factor F2 (= 0.9987). The experimental determinations were based on irradiations performed in the BR1 reactor in Mol and the WWR-M reactor in Budapest. Furthermore, considering the special formation of the 555.8 keV gamma-ray, the procedure for true-coincidence correction was revised as well. All this led to the compilation and recommendation of a new set of 'k0-NAA' data.
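
    The complication treated here is that the 555.8 keV line collects contributions from both members of the decay chain. For illustration only, the Python sketch below propagates an initially pure 104mRh population through a standard two-member Bateman solution and sums the two photon sources, using the half-lives, emission probabilities and F2 quoted above (detector efficiency, irradiation build-up and the k0 formalism itself are omitted):

    import numpy as np

    T_m, T_g = 4.34 * 60.0, 42.3             # s, half-lives of 104mRh and 104Rh
    lam_m, lam_g = np.log(2) / T_m, np.log(2) / T_g
    g2, g3, F2 = 0.0013, 0.020, 0.9987       # gamma2, gamma3, fractional decay factor

    def emission_rate(t, Nm0=1e6):
        """555.8 keV photons/s at time t from an initially pure 104mRh sample."""
        Nm = Nm0 * np.exp(-lam_m * t)                        # mother
        Ng = (F2 * Nm0 * lam_m / (lam_g - lam_m)             # daughter (Bateman)
              * (np.exp(-lam_m * t) - np.exp(-lam_g * t)))
        return g2 * lam_m * Nm + g3 * lam_g * Ng

    for ti in (0.0, 60.0, 300.0, 900.0, 1800.0):
        print(f"t = {ti:6.0f} s   R = {emission_rate(ti):8.2f} photons/s")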

  6. The Rhetoric of Inferiority of African Slaves in John Fawcett’s Obi; or, Three-Fingered Jack (1800) Re-evaluated in Charlie Haffner’s Amistad Kata-Kata (1987)

    Directory of Open Access Journals (Sweden)

    Ulrich Pallua

    2014-02-01

    John Fawcett’s Obi; or, Three-Finger’d Jack (1800) draws a distorted picture of the life of slaves in Jamaica. This paper investigates the ambivalence in this distortion, as Fawcett creates two kinds of slaves by pitting them against each other: the loyal and obedient (but still inferior) slaves vs. the superstition-ridden and rebellious slaves deeply rooted in old traditions, thus considered inferior, uneducated, immoral and dangerous. The juxtaposition of what I call ‘anglicised’ slaves instrumentalised by the coloniser and the heathen ‘savages’ who are beyond the reach of the imperial ideology enables Fawcett to substantiate the claim that Christianity successfully promotes slaves to ‘anglicised’ mimic men/women who are then able to carry out its mission: to eradicate the pagan practice of obeah, Three-Finger’d Jack, and all those slaves that threaten the stability of the coloniser’s superiority. Charlie Haffner’s play Amistad Kata-Kata (1987) is about the heroism of Shengbe Pieh and his fellow slaves on board La Amistad: on their way to the colonies they revolted, were sent to prison, tried, finally freed, and taken back home after 3 years. The paper shows how Haffner repositions the ‘Amistad trope’ in the 20th century by effacing the materiality of the body of the African slaves, thus re-evaluating the corporeality of the colonised slave in the 19th-century post-abolition debate by coming to terms with the cultural trauma that post-independence African collective identity has been experiencing. The re-staging of the play by the ‘Freetong Players’ in 2007/8 commemorated the bicentenary of the abolition of the Atlantic slave trade, a unique opportunity to direct attention to asserting the identity of ‘Post-European’ Africa.

  7. Re-evaluation of the definition of remission on the 17-item Hamilton Depression Rating Scale based on recovery in health-related quality of life in an observational post-marketing study.

    Science.gov (United States)

    Sawamura, Jitsuki; Ishigooka, Jun; Nishimura, Katsuji

    2018-01-16

    Although a score of less than 7 on the 17-item Hamilton Depression Rating Scale (HAM-D17) has been widely adopted to define remission of depression, a full recovery from depression is closely related to the patient's quality of life as well. Accordingly, we re-evaluated this definition of remission using the HAM-D17 in comparison with the corresponding score for health-related quality of life (HRQOL) measured by the SF-36. Using the data for depressive patients reported by GlaxoSmithKline K.K. (Study No. BRL29060A/863) in a post-marketing observational study of paroxetine, with a sample size of n = 722, multivariate logistic regression was performed with the HAM-D17 score as a dependent variable and with each of the eight domain scores of HRQOL (from the SF-36) transformed into a binomial form according to the national standard value for Japan. Then, receiver-operating-characteristic (ROC) analyses of the area under the curve (AUC) were conducted. Based on the obtained results, a multivariate analysis was performed using the HAM-D17 score in a binomial form with HAM-D17 as a dependent variable and with each of the eight HRQOL domain scores (SF-36) as binomialized independent variables. A cutoff value for the HAM-D17 score of 5 provided the maximum ROC-AUC of 0.864. The significantly associated scores of the eight HRQOL domains (SF-36) were identified for the HAM-D17 cutoff values of ≥5 and ≤4. The scores for physical functioning (odds ratio, 0.473), bodily pain (0.557), vitality (0.379), social functioning (0.540), role-emotion (0.265), and mental health (0.467) had a significant negative association with the HAM-D17 score (p < 0.05), and HRQOL domain scores for HAM-D17 ≥ 5 were significantly lower compared with those for HAM-D17 ≤ 4. A cutoff value for HAM-D17 of less than or equal to 4 was the best candidate for indicating remission of depression when the recovery of HRQOL is considered. Restoration of social function and performance should be considered
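
    The cutoff search described here is a standard receiver-operating-characteristic exercise: scan candidate HAM-D17 cutoffs against a binary recovery label and keep the threshold that maximizes a criterion such as Youden's J. A Python sketch on simulated data (the BRL29060A/863 dataset is not public, so the scores and labels below are invented):

    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(1)
    n = 722                                     # sample size quoted in the record
    hamd = rng.integers(0, 25, n)               # simulated HAM-D17 totals
    p = 1.0 / (1.0 + np.exp(0.6 * (hamd - 5)))  # toy link to HRQOL recovery
    recovered = rng.random(n) < p               # binary HRQOL label

    # Lower HAM-D17 means better, so score with the negated value
    auc = roc_auc_score(recovered, -hamd)
    fpr, tpr, thr = roc_curve(recovered, -hamd)
    cut = -thr[np.argmax(tpr - fpr)]            # Youden-optimal threshold
    print(f"AUC = {auc:.3f}; remission rule: HAM-D17 <= {cut:.0f}")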

  8. ‘Historical narratives and historical desires: re-evaluating American art criticism of the mid-nineteenth century’: Karen Georgi, Critical Shift: Rereading Jarves, Cook, Stillman, and the Narratives of Nineteenth-Century American Art, The Pennsylvania State University Press, 2013

    OpenAIRE

    Emily Gephart

    2014-01-01

    Striving to establish their authority and demonstrate their professionalism, art critics James Jackson Jarves, Clarence Cook, and William James Stillman wrote exhibition reviews, essays, and increasingly self-conscious histories of American art and artists in the mid-nineteenth century. Whereas their writing has often been employed to establish a model of opposed pre- and post-war periodization in American art, Karen Georgi challenges this view, re-evaluating the rhetorical structures t...

  9. Re-evaluation of microscopic and integral cross-section data for important dosimetry reactions. Re-evaluation of the excitation functions for the 24Mg(n,p)24Na, 32S(n,p)32P, 60Ni(n,p)60m+gCo, 63Cu(n,2n)62Cu, 65Cu(n,2n)64Cu, 64Zn(n,p)64Cu, 115In(n,2n)114mIn, 127I(n,2n)126I, 197Au(n,2n)196Au and 199Hg(n,n')199mHg reactions

    International Nuclear Information System (INIS)

    Zolotarev, K.I.

    2008-08-01

    Re-evaluations of cross sections and their associated covariance matrices have been carried out for ten dosimetry reactions: - excitation functions for the 63Cu(n,2n)62Cu, 65Cu(n,2n)64Cu, 64Zn(n,p)64Cu, 115In(n,2n)114mIn and 199Hg(n,n')199mHg reactions were re-evaluated over the neutron energy range from threshold to 20 MeV; - excitation functions for the 24Mg(n,p)24Na, 32S(n,p)32P and 60Ni(n,p)60m+gCo reactions were re-evaluated in the energy range from threshold to 21 MeV; - excitation functions for the 127I(n,2n)126I and 197Au(n,2n)196Au reactions were re-evaluated in the energy range from threshold to 32 and 40 MeV, respectively. Benchmark calculations performed for 235U thermal fission and 252Cf spontaneous fission neutron spectra show that the integral cross sections derived from the newly evaluated excitation functions exhibit improved agreement with related experimental data when compared with the equivalent data from the IRDF-2002 library. (author)
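
    The integral quantities benchmarked here are spectrum-averaged cross sections, <σ> = ∫σ(E)φ(E)dE / ∫φ(E)dE. A Python sketch with a Watt representation of the 252Cf spontaneous-fission spectrum (parameters a ≈ 1.025 MeV, b ≈ 2.926 MeV-1, as commonly tabulated) and a toy threshold excitation function standing in for a real evaluation:

    import numpy as np

    def watt(E, a=1.025, b=2.926):
        """Unnormalised Watt fission spectrum, E in MeV (252Cf sf parameters)."""
        return np.exp(-E / a) * np.sinh(np.sqrt(b * E))

    def sigma_toy(E, Eth=3.0, smax=0.1):
        """Toy threshold excitation function in barns (not a real evaluation)."""
        return np.where(E > Eth, smax * (1.0 - np.exp(-(E - Eth))), 0.0)

    def trapint(y, x):
        """Trapezoidal integral, spelled out for NumPy-version portability."""
        return float(np.dot(0.5 * (y[1:] + y[:-1]), np.diff(x)))

    E = np.linspace(1e-3, 20.0, 20000)     # MeV
    phi = watt(E)
    avg = trapint(sigma_toy(E) * phi, E) / trapint(phi, E)
    print(f"<sigma> over the 252Cf sf spectrum: {avg * 1e3:.2f} mb")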

  10. 18F-FDG PET for assessment of therapy response and preoperative re-evaluation after neoadjuvant radio-chemotherapy in stage III non-small cell lung cancer

    International Nuclear Information System (INIS)

    Eschmann, Susanne M.; Reimold, Matthias; Bares, Roland; Friedel, Godehard; Paulsen, Frank; Hehr, Thomas; Budach, Wilfried; Langen, Heinz-Jakob

    2007-01-01

    The aim of this study was to evaluate FDG-PET for assessment of therapy response and for prediction of patient outcome after neo-adjuvant radio-chemotherapy (NARCT) for advanced non-small cell lung cancer (NSCLC). Seventy patients with histologically proven stage III NSCLC underwent FDG-PET investigations before and after NARCT. Changes in FDG uptake and PET findings after completion of NARCT were compared with (1) the histology of tumour samples obtained at surgery or repeat mediastinoscopy, and (2) treatment results in terms of achieved operability and long-term survival. The mean FDG uptake of the primary tumours in the patient group decreased significantly during NARCT (p = 0.004). Sensitivity, specificity and overall accuracy of FDG-PET were 94.5%, 80% and 91%, respectively, for the detection of residual viable primary tumour, and 77%, 68% and 73%, respectively, for the presence of lymph node metastases. A negative PET scan or a reduction in the standardised uptake value (SUV) of more than 80% was the best predictive factor for a favourable outcome of further treatment. Progressive disease according to PET (new tumour manifestations or increasing SUV) was significantly correlated with an unfavourable outcome (p = 0.005). In this subgroup, survival of patients who underwent surgery was not significantly different from survival among those who did not undergo surgery, whereas for the whole patient group complete tumour resection had a significant influence on outcome. FDG-PET is suitable for accurately assessing response to NARCT in patients with stage III NSCLC. It was highly predictive of treatment outcome and patient survival. PET may help improve restaging after NARCT by allowing reliable assessment of residual tumour viability. (orig.)
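
    The response criterion reported here is easy to operationalize. A small Python sketch, with the 80% threshold taken from the abstract and everything else (names, example values) hypothetical:

    def favourable_response(suv_pre, suv_post, pet_negative):
        """Predictor from the abstract: negative post-NARCT PET, or an
        SUV reduction of more than 80% (illustrative implementation)."""
        reduction = (suv_pre - suv_post) / suv_pre
        return pet_negative or reduction > 0.80

    print(favourable_response(12.0, 1.8, pet_negative=False))  # True: 85% reduction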

  11. WALS Prediction

    NARCIS (Netherlands)

    Magnus, J.R.; Wang, W.; Zhang, Xinyu

    2012-01-01

    Abstract: Prediction under model uncertainty is an important and difficult issue. Traditional prediction methods (such as pretesting) are based on model selection followed by prediction in the selected model, but the reported prediction and the reported prediction variance ignore the uncertainty

  12. Forecasts of non-Gaussian parameter spaces using Box-Cox transformations

    Science.gov (United States)

    Joachimi, B.; Taylor, A. N.

    2011-09-01

    Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, performing the Fisher matrix calculation on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between matter density parameter and normalization of matter density fluctuations is reproduced for several cases, and the capabilities of breaking this degeneracy by weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
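
    The recipe in this record can be prototyped in a few lines: estimate a Box-Cox parameter per model parameter from an initial likelihood evaluation, form the Fisher matrix (here, the inverse sample covariance) in the Gaussianized space, and map the resulting ellipse back through the inverse transformation. A Python sketch with a toy skewed posterior sample standing in for the initial evaluation (scipy's boxcox requires positive values, so real parameters may first need shifting):

    import numpy as np
    from scipy.stats import boxcox
    from scipy.special import inv_boxcox

    rng = np.random.default_rng(42)
    # Toy skewed "posterior samples" for two positive parameters
    p1 = rng.lognormal(0.0, 0.4, 5000)
    p2 = 0.5 * p1 + rng.gamma(3.0, 0.2, 5000)

    # Gaussianize each parameter; boxcox returns the transformed sample
    # and the maximum-likelihood lambda
    t1, lam1 = boxcox(p1)
    t2, lam2 = boxcox(p2)
    T = np.column_stack([t1, t2])

    F = np.linalg.inv(np.cov(T, rowvar=False))   # "Fisher matrix" in Gaussianized space

    # 1-sigma ellipse in the transformed space, mapped back to the
    # original (non-Gaussian) parameter space
    w, V = np.linalg.eigh(np.linalg.inv(F))
    ang = np.linspace(0.0, 2.0 * np.pi, 200)
    circle = np.column_stack([np.cos(ang), np.sin(ang)])
    ell = T.mean(axis=0) + circle @ np.diag(np.sqrt(w)) @ V.T
    c1, c2 = inv_boxcox(ell[:, 0], lam1), inv_boxcox(ell[:, 1], lam2)
    print(f"lambda1 = {lam1:.2f}, lambda2 = {lam2:.2f}; "
          f"contour p1 range: {c1.min():.2f}-{c1.max():.2f}")

    Transforming each parameter separately is the simplest variant; the paper's formalism is more general, and this sketch does not attempt to reproduce it.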

  13. Climate prediction and predictability

    Science.gov (United States)

    Allen, Myles

    2010-05-01

    Climate prediction is generally accepted to be one of the grand challenges of the geophysical sciences. What is less widely acknowledged is that fundamental issues have yet to be resolved concerning the nature of the challenge, even after decades of research in this area. How do we verify or falsify a probabilistic forecast of a singular event such as anthropogenic warming over the 21st century? How do we determine the information content of a climate forecast? What does it mean for a modelling system to be "good enough" to forecast a particular variable? How will we know when models and forecasting systems are "good enough" to provide detailed forecasts of weather at specific locations or, for example, of the risks associated with global geo-engineering schemes? This talk will provide an overview of these questions in the light of recent developments in multi-decade climate forecasting, drawing on concepts from information theory, machine learning and statistics. I will draw extensively but not exclusively on the experience of the climateprediction.net project, running multiple versions of climate models on personal computers.

  14. Cordaiteans in paleotropical wetlands: An ecological re-evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Raymond, Anne [Dept. of Geology and Geophysics, Texas A and M University, College Station, TX 77843-3115 (United States); Ecology and Evolutionary Biology, Texas A and M University, College Station, TX 77843-3115 (United States); Lambert, Lance [Dept. of Geological Sciences, University of Texas at San Antonio, San Antonio, TX 78249 (United States); Costanza, Suzanne [Paleobotanical Museum, Harvard University, Cambridge, MA 02138 (United States); Slone, E.J. [Dept. of Geology and Geophysics, Texas A and M University, College Station, TX 77843-3115 (United States); Cutlip, P.C. [Dept. of Natural Science, St. Petersburg College, St. Petersburg, FL 33733-3489 (United States)

    2010-08-01

    Cordaiteans in cordaite-dominated permineralized peat from Pennsylvanian coals in Iowa have been reconstructed as mangroves using root anatomy, peat taphonomy, and geochemical data. Macrofloral, palynofloral, and conodont biostratigraphy indicate that these peats come from the latest Atokan Blackoak coal and earliest Desmoinesian Cliffland coal (mid-Moscovian), both in the Kalo Formation. Thus, their depositional setting can be used to evaluate the mangrove hypothesis. In Recent mires, thick mangrove peats have accumulated in tropical to subtropical carbonate systems; in contrast, thick tropical freshwater peats have accumulated in siliciclastic systems. Kalo Formation coals, which we interpret as freshwater deposits, formed in siliciclastic depositional settings, similar to those of modern tropical freshwater peat, and to other Pennsylvanian coals in North America interpreted as freshwater deposits. In the late Atokan and earliest Desmoinesian (mid-Moscovian), cordaiteans and tree ferns predominated in the Western Interior and Illinois Basins; lycopsids and cordaiteans predominated in the Appalachian and Donets Basins. The scarcity of lycopsid-only mires in North America during the late Atokan-earliest Desmoinesian (mid-Moscovian) suggests drier climates than during the mid-to-late Desmoinesian (late Moscovian). Rather than indicating mangrove swamps, cordaite-dominated peat may indicate climates with a 'low-rain' season. Although most plants in cordaite-dominated peat probably grew in freshwater, coastal mires in climate zones with seasons of 'low-rain' may harbor mangrove taxa. The Changuinola Swamp of Panama, a modern peat-accumulating wetland that has a 'low-rain' season, is a possible analog of ancient cordaite-dominated mires. In Changuinola, most plants require freshwater; however, mangroves, sustained by salt-water influx into the swamp, grow along the seaward edge and along blackwater creeks. The 'low-rain' season hypothesis has implications for understanding rainfall amount and continuity during Pennsylvanian cyclothem deposition. The floral succession in diverse cordaite coals, from cordaiteans to tree ferns to lycopsids, suggests an increasingly wet climate during coal accumulation. The position of these coals immediately above the sequence boundary suggests a humid climate during early glacial melting for these cyclothems. (author)

  15. A re-evaluation of nuclear plant offsite power supplies

    International Nuclear Information System (INIS)

    William E Berger; Robert E Henry

    2005-01-01

    De-regulation of the electric power industry has resulted in separate ownership of the transmission and power generation facilities, as well as a revised format for operating the transmission facilities. Currently we see the transfer of large blocks of bulk power between markets, which can impact the voltage regulation at the offsite power supply. Where nuclear plant operators once knew with a large degree of certainty the operating range of the system supplying the offsite power supply, this may no longer be the case, and more challenges to the safety systems could result. These challenges may manifest themselves as either a loss of offsite power or voltage levels approaching the degraded-level setpoints. In this paper we first explore what challenges are caused by deregulation and how they impact offsite power supply operations. Next we incorporate the knowledge gained regarding accidents and consequences from the Individual Plant Evaluations (IPEs) to see how the offsite power supply could be operated to mitigate the challenges and extend the capacity of the auxiliary power system. Various scenarios are examined using the Modular Accident Analysis Program (MAAP) as an integral plant model. MAAP simulations that include both the plant thermal-hydraulic responses and the corresponding electric power demand are presented to demonstrate the impact of alternate approaches to offsite power system operation. The original design phase of the offsite and onsite power distribution systems was based on a criterion relating to the starting of all safety loads if a safety injection signal was present, independent of the accident or its progression. The IPE and risk-informed insights that are readily available today are applied in the re-analyses of the offsite distribution system response. (authors)

  16. Carboniferous Psammichnites: Systematic re-evaluation, taphonomy and autecology

    Science.gov (United States)

    Mángano, M. Gabriela; Rindsberg, Andrew K.

    2002-01-01

    The ichnogenus Psammichnites Torell 1870 includes a wide variety of predominantly horizontal, sinuous to looped, backfilled traces, characterized by a distinctive median dorsal structure. Though commonly preserved in full relief on upper bedding surfaces, some ichnospecies of Psammichnites may be preserved in negative hyporelief. Psammichnites records the feeding activities of a subsurface animal using a siphon-like device. Several ichnogenera reflect this general behavioral pattern, including Plagiogmus Roedel 1929 and the Carboniferous ichnogenera Olivellites Fenton and Fenton 1937a and Aulichnites Fenton and Fenton 1937b. Based on analysis of specimens from the United States, Spain, and the United Kingdom, three Carboniferous ichnospecies of Psammichnites are reviewed in this paper: P. plummeri (Fenton and Fenton, 1937a), P. grumula (Romano and Meléndez 1979), and P. implexus (Rindsberg 1994). Psammichnites plummeri is the most common Carboniferous ichnospecies and is characterized by a relatively straight, continuous dorsal ridge/groove, fine transverse ridges, larger size range, and non-looping geometric pattern. It represents a grazing trace of deposit feeders. Psammichnites grumula differs from the other ichnospecies of Psammichnites by having median dorsal holes or protruding mounds. The presence of mounds or holes in P. grumula suggests a siphon that was regularly connected to the sediment-water interface. This ichnospecies is interpreted as produced by a deposit feeder using the siphon for respiration or as a device for a chemosymbiotic strategy. Psammichnites implexus is characterized by its consistently smaller size range, subtle backfill structure, and tendency to scribble. Although displaying similarities with Dictyodora scotica, P. implexus is a very shallow-tier, grazing trace. Changes in behavioral pattern, preservational style, and bedform morphology suggest a complex interplay of ecological and taphonomic controls in Carboniferous tidal-flat Psammichnites. A first distributional pattern consists of guided meandering specimens preserved in ripple troughs, probably reflecting food-searching of buried organic matter concentrated in troughs. A second is recorded by concentration of Psammichnites on ripple crests and slopes. In some cases, the course is almost straight to slightly sinuous and closely follows topographic highs, suggesting a direct control of bedform morphology on trace pattern. Occurrences of Carboniferous Psammichnites most likely represent an opportunistic strategy in marginal-marine settings. Analysis of Carboniferous Psammichnites indicates the presence of a siphon-like device in the producer and reestablishes the possibility of a molluscan tracemaker.

  17. Seismic re-evaluation of Kozloduy NPP criteria, methodology, implementation

    International Nuclear Information System (INIS)

    Kostov, M.

    2003-01-01

    The paper describes some features of the methodology applied for seismic upgrading of civil structures at the site of the Kozloduy NPP. The essence of the methodology is the use of as-built data, realistic damping, and inelastic reduction factors. As an example of seismic upgrading, the analyses of units 3 and 4 are presented. The analyses show that effective seismic upgrading requires detailed investigations in order to understand the significant response modes of the structures. In the present case, this is the rotation of the flexible structures attached to the stiff reactor building. Based on this, an upgrading approach is applied to increase the seismic resistance for the predominant motion. The second significant approach applied is the strengthening of the prefabricated element joints. Although very simple, it allows use of the available element capacity. (author)

  18. Tell el Yahudiyeh Ware: a re-evaluation

    International Nuclear Information System (INIS)

    Kaplan, M.F.; Harbottle, G.; Sayre, E.V.

    1980-01-01

    The TY (Tell el Yahudiyeh ware) project has implications for understanding cultural interactions during the Second Intermediate Period (1750-1550 B.C.), a period during which centralized government in Egypt collapsed and, it is generally assumed, so did her trade network. Foreigners - the Hyksos - were able to enter the country and rule at least part of it. Results of this study (which includes activation analysis), however, indicate that TY is primarily an Egyptian pottery which appeared before the Hyksos entered and may have continued in use after they left. It cannot, therefore, be tightly associated with the Hyksos, nor can it be used to judge the extent of their influence. Its wide distribution shows that Egypt continued to trade goods outside her boundaries throughout this period. Finally, not only did goods travel between what were generally considered to have been hostile neighbors, but the trade appears to have included ideas and technology as well.

  19. Re-evaluating Gondwana breakup: Magmatism, movement and microplates

    Science.gov (United States)

    Ferraccioli, F.; Jordan, T. A.

    2017-12-01

    Gondwana breakup is thought to have initiated in the Early- to Mid-Jurassic between South Africa and East Antarctica. The critical stages of continental extension and magmatism which preceded breakup remain controversial. It is agreed that extensive magmatism struck this region ca. 180 Ma, and that significant extension occurred in the Weddell Sea Rift System (WSRS) and around the Falkland Plateau. However, the timing and volume of magmatism, the extent and mechanism of continental extension, and the links with the wider plate circuit are poorly constrained. Jordan et al. (Gondwana Research, 2017) recently proposed a two-stage model for the formation of the WSRS: initial extension and movement of the Ellsworth Whitmore Mountains microplate along the margin of the East Antarctic continent on a sinistral strike-slip fault zone, followed by transtensional extension closer to the continental margin. Here we identify some key questions raised by the two-stage model, and identify regions where these can be tested. Firstly, is the magmatism inferred to have facilitated extension in the WSRS directly linked to the onshore Dufek Intrusion? This question relates to both the uncertainty in the volume of magmatism and potentially the timing of extension, and requires improved resolution of aeromagnetic data in the eastern WSRS. Secondly, did extension in the WSRS terminate against a single strike-slip fault zone or into a distributed fault system? By integrating new and existing aeromagnetic data along the margin of East Antarctica, we evaluate the possibility of a distributed shear zone penetrating the East Antarctic continent, and identify critical remaining data gaps. Finally, we question how extension within the WSRS could fit into the wider plate circuit. By integrating the two-stage model into GPlates reconstructions, we identify regions of overlap and areas where tracers of past plate motion could be identified.

  20. Re-evaluation of characterisation and classification of Apa ( Afzelia ...

    African Journals Online (AJOL)

    As a result of effect of geographical location on timber properties, there is need for constant determination of properties of timber. This paper presents the results of experimental tests carried out on three Apa (Afzelia bipindensis) timber logs grown in Kwara State, south-western periphery of the North Central Zone of Nigeria ...

  1. Kinetic Re-Evaluation of Fuel Neutralization by AKGA

    Science.gov (United States)

    Oropeza, Cristina; Kosiba, Mike; Davis, Chuck

    2010-01-01

    Baseline characterization testing previously identified alpha-ketoglutaric acid (AKGA) as a potential alternative to the current standard hydrazine (HZ) family fuel neutralization techniques in use at Kennedy Space Center (KSC). Thus far, the reagent shows promise for use in hardware decontamination operations and as a drop-in replacement for the scrubber liquor currently used in KSC's four-tower vapor scrubbers. Implementation of AKGA could improve process safety and reduce or eliminate generation of hydrazine-laden waste streams. This paper focuses on evaluation of the kinetics of these decontamination reactions in solution. Pseudo-first-order reaction rate constants with respect to the pyridazine products (6-oxo-4,5-dihydro-1H-pyridazine-3-carboxylic acid (PCA) and 1-methyl-6-oxo-4,5-dihydro-pyridazine-3-carboxylic acid (mPCA)) in the presence of excess AKGA were determined by monitoring product formation using an ultraviolet-visible absorption spectroscopy method. The results are presented here in comparison to previous data obtained by monitoring reactant depletion by gas chromatography with a nitrogen-phosphorus detector (GC-NPD).
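
    As a rough illustration of the kinetic analysis described above (a sketch under assumed values, not the authors' procedure or data), the snippet below fits a pseudo-first-order rate constant to simulated product-formation absorbance; the model function and all numbers are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def product_absorbance(t, a_inf, k_obs):
    # Pseudo-first-order product growth: A(t) = A_inf * (1 - exp(-k_obs * t)).
    # With AKGA in large excess, k_obs is constant for a given run.
    return a_inf * (1.0 - np.exp(-k_obs * t))

# Hypothetical UV-vis time series (minutes, absorbance units) for PCA formation.
t = np.linspace(0, 120, 25)
rng = np.random.default_rng(0)
a_obs = product_absorbance(t, 0.85, 0.045) + rng.normal(0, 0.01, t.size)

(a_inf, k_obs), _ = curve_fit(product_absorbance, t, a_obs, p0=(1.0, 0.01))
print(f"k_obs = {k_obs:.4f} min^-1, plateau absorbance = {a_inf:.3f}")
```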

  2. Ben Okri's The Famished Road: A re-evaluation

    African Journals Online (AJOL)

    imaginative effect. In the same way, ... of that word, in the god. J. K. Rowling's imaginative world in Harry Potter and the Philosopher's Stone is a ... The Famished Road cannot affirm that spirit children exist in the real world. What, as a vehicle ...

  3. Re-evaluating the Disengagement Process: the Case of Fatah

    Directory of Open Access Journals (Sweden)

    Gordon Clubb

    2010-11-01

    Full Text Available Recently, a number of studies have looked at the disengagement/de-radicalisation of terrorist groups and individuals. This article critically assesses part of this literature in relation to the process of voluntary collective disengagement, using the case of the Palestinian Fatah organization as an example. It questions the specific focus of most de-radicalisation studies upon solely ending the use of the terrorist tactic, arguing that the disengagement process should be studied in conjunction with groups ceasing to use other forms of political violence as well. Although the article favours an objective definition of terrorism, it also recognises the salience of the term's normative power and argues that both perspectives can play a role in the disengagement process. This process can be divided into a number of stages: (i) declarative disengagement, (ii) behavioural disengagement, (iii) organisational disengagement, and (iv) de-radicalisation. Fatah's disengagement process demonstrates that the process can be conditional, reversible, and selective. Consequently, a number of problems arise in terms of defining when an organisation has actually ceased to use terrorism and other forms of political violence. The article argues that Fatah represents a case of mixed disengagement; it was selective, conditional and mostly only behavioural. However, despite the disengagement process only being partially successful during the Oslo period - and reversed considerably during the al-Aqsa Intifada - it has had some lasting effects on the organisation, making it less likely to re-engage in terrorism.

  4. Risk in Enterprise Cloud Computing: Re-Evaluated

    Science.gov (United States)

    Funmilayo, Bolonduro, R.

    2016-01-01

    A quantitative study was conducted to get the perspectives of IT experts about risks in enterprise cloud computing. In businesses, these IT experts are often not in positions to prioritize business needs. The business experts commonly known as business managers mostly determine an organization's business needs. Even if an IT expert classified a…

  5. Taxonomic re-evaluation of black koji molds

    NARCIS (Netherlands)

    Hong, S.B.; Yamada, O.; Samson, R.A.

    2013-01-01

    Black koji molds including its albino mutant, the white koji mold, have been widely used for making the distilled spirit shochu in Northeast Asia because they produce citric acid which prevents undesirable contamination from bacteria. Since Inui reported Aspergillus luchuensis from black koji in

  6. Seismic re-evaluation criteria for Bohunice V1 reconstruction

    International Nuclear Information System (INIS)

    Campbell, R.; Schlund, H.; Warnken, L.

    2001-01-01

    Bohunice V1 in Slovakia is a Russian-designed, two-unit WWER 440, Model 230 Pressurized Water Reactor. The plant was not originally designed for earthquake. Subsequent and ongoing reassessments now confirm that the seismic hazard at the site is significant. EBO, the plant owner, has contracted with a consortium led by Siemens AG (REKON) to do major reconstruction of the plant to significantly enhance its safety systems by the addition of new systems and the upgrading of existing systems. As part of the reconstruction, a complete seismic assessment and upgrading is required for existing safety-relevant structures, systems and components. It is not practical to conduct this reassessment and upgrading using criteria applied to the design of new nuclear power plants. Alternate criteria may be used to achieve adequate safety goals. Utilities in the U.S. have faced several seismic issues with operating NPPs, and to resolve these issues, alternate criteria have been developed which are much more cost-effective than the criteria for new design. These alternate criteria incorporate the knowledge obtained from investigation of the performance of equipment in major earthquakes and include provisions for structures and passive equipment to deform beyond the yield point, yet still provide their essential function. IAEA has incorporated features of these alternate criteria into draft Technical Guidelines for application to Bohunice V1 and V2. REKON has developed plant-specific criteria and procedures for the Bohunice V1 reconstruction that incorporate major features of the U.S.-developed alternate criteria, comply with local codes, and envelop the draft IAEA Technical Guidelines. Included in these criteria and procedures are comprehensive walkdown screening criteria for equipment, piping, HVAC and cable raceways; analytical criteria which include inelastic energy absorption factors defined on an element basis; and testing criteria which include specific guidance on interpretation of existing single-axis, single-frequency testing and on amplification factors for electrical cabinets. (author)

  7. Re-telling, Re-evaluating and Re-constructing

    Directory of Open Access Journals (Sweden)

    Gorana Tolja

    2013-11-01

    Full Text Available 'Graphic History: Essays on Graphic Novels and/as History' (2012) is a collection of 14 unique essays, edited by scholar Richard Iadonisi, that explores a variety of complex issues within the graphic novel medium as a means of historical narration. The essays address the issues of accuracy of re-counting history, history as re-constructed, and the ethics surrounding historical narration.

  8. Re-evaluating the 1940s CO2 plateau

    Science.gov (United States)

    Bastos, Ana; Ciais, Philippe; Barichivich, Jonathan; Bopp, Laurent; Brovkin, Victor; Gasser, Thomas; Peng, Shushi; Pongratz, Julia; Viovy, Nicolas; Trudinger, Cathy M.

    2016-09-01

    The high-resolution CO2 record from Law Dome ice core reveals that atmospheric CO2 concentration stalled during the 1940s (so-called CO2 plateau). Since the fossil-fuel emissions did not decrease during the period, this stalling implies the persistence of a strong sink, perhaps sustained for as long as a decade or more. Double-deconvolution analyses have attributed this sink to the ocean, conceivably as a response to the very strong El Niño event in 1940-1942. However, this explanation is questionable, as recent ocean CO2 data indicate that the range of variability in the ocean sink has been rather modest in recent decades, and El Niño events have generally led to higher growth rates of atmospheric CO2 due to the offsetting terrestrial response. Here, we use the most up-to-date information on the different terms of the carbon budget: fossil-fuel emissions, four estimates of land-use change (LUC) emissions, ocean uptake from two different reconstructions, and the terrestrial sink modelled by the TRENDY project to identify the most likely causes of the 1940s plateau. We find that they greatly overestimate atmospheric CO2 growth rate during the plateau period, as well as in the 1960s, in spite of giving a plausible explanation for most of the 20th century carbon budget, especially from 1970 onwards. The mismatch between reconstructions and observations during the CO2 plateau epoch of 1940-1950 ranges between 0.9 and 2.0 Pg C yr-1, depending on the LUC dataset considered. This mismatch may be explained by (i) decadal variability in the ocean carbon sink not accounted for in the reconstructions we used, (ii) a further terrestrial sink currently missing in the estimates by land-surface models, or (iii) LUC processes not included in the current datasets. Ocean carbon models from CMIP5 indicate that natural variability in the ocean carbon sink could explain an additional 0.5 Pg C yr-1 uptake, but it is unlikely to be higher. The impact of the 1940-1942 El Niño on the observed stabilization of atmospheric CO2 cannot be confirmed nor discarded, as TRENDY models do not reproduce the expected concurrent strong decrease in terrestrial uptake. Nevertheless, this would further increase the mismatch between observed and modelled CO2 growth rate during the CO2 plateau epoch. Tests performed using the OSCAR (v2.2) model indicate that changes in land use not correctly accounted for during the period (coinciding with drastic socioeconomic changes during the Second World War) could contribute to the additional sink required. Thus, the previously proposed ocean hypothesis for the 1940s plateau cannot be confirmed by independent data. Further efforts are required to reduce uncertainty in the different terms of the carbon budget during the first half of the 20th century and to better understand the long-term variability of the ocean and terrestrial CO2 sinks.
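
    The bookkeeping behind this argument is the budget identity: atmospheric growth = fossil-fuel emissions + LUC emissions - ocean sink - land sink, so the required additional sink is whatever closes the identity. A toy calculation with purely illustrative numbers (not the reconstructions used in the study):

```python
# Toy atmospheric CO2 budget for a plateau decade (all values in Pg C per year).
# The numbers below are illustrative assumptions, not data from the paper.
fossil_fuel = 1.3      # fossil-fuel emissions
land_use_change = 1.0  # LUC emissions
ocean_sink = 0.9       # reconstructed ocean uptake
land_sink = 0.5        # modelled terrestrial uptake
observed_growth = 0.0  # a flat ice-core CO2 record implies near-zero growth

expected_growth = fossil_fuel + land_use_change - ocean_sink - land_sink
missing_sink = expected_growth - observed_growth
print(f"reconstructed growth rate: {expected_growth:+.1f} Pg C/yr")
print(f"additional sink required : {missing_sink:.1f} Pg C/yr")
```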

  9. Peace Operations in the Former Yugoslavia: A Re-Evaluation

    Science.gov (United States)

    2010-12-01

    currency, and were susceptible to bribery and “taxation” by the Serbs, with up to a quarter of the 103...sponsored, and were even supported by the peace operations. Some UN peacekeepers, including the Turkish, Malaysian, Bangladeshi, and Maltese, are known

  10. A report on seismic re-evaluation of Cirus systems

    International Nuclear Information System (INIS)

    Varma, Veto; Reddy, G.R.; Vaze, K.K.; Kushwaha, H.S.

    2003-06-01

    Cirus was initiated in 1955, and its design followed the methods prevailing at that time. The design codes and safety standards have changed since then, particularly with respect to seismic design criteria. As the containment is an important safety-related structure, it is mandatory to meet the present statutory requirements. This report contains the seismic qualification for some of the Cirus systems. The report has four parts. Part I gives the analytical studies performed on the containment building, Part II describes experimental studies carried out to validate the analytical studies for the containment building, Part III explains the seismic retrofitting of the battery bank, and Part IV summarizes the seismic qualification of the inlet and exhaust dampers of Cirus. (author)

  11. Re-evaluating occupational heat stress in a changing climate.

    Science.gov (United States)

    Spector, June T; Sheffield, Perry E

    2014-10-01

    The potential consequences of occupational heat stress in a changing climate on workers, workplaces, and global economies are substantial. Occupational heat stress risk is projected to become particularly high in middle- and low-income tropical and subtropical regions, where optimal controls may not be readily available. This commentary presents occupational heat stress in the context of climate change, reviews its impacts, and reflects on implications for heat stress assessment and control. Future efforts should address limitations of existing heat stress assessment methods and generate economical, practical, and universal approaches that can incorporate data of varying levels of detail, depending on resources. Validation of these methods should be performed in a wider variety of environments, and data should be collected and analyzed centrally for both local and large-scale hazard assessments and to guide heat stress adaptation planning. Heat stress standards should take into account variability in worker acclimatization, other vulnerabilities, and workplace resources. The effectiveness of controls that are feasible and acceptable should be evaluated. Exposure scientists are needed, in collaboration with experts in other areas, to effectively prevent and control occupational heat stress in a changing climate. © The Author 2014. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  12. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  13. A re-evaluation of k0 and related nuclear data for the 555.8 keV gamma-line emitted by the 104mRh-104Rh mother-daughter pair for use in NAA

    CERN Document Server

    Corte, F D; Simonits, A; Bossus, D; Sluijs, R V; Pommé, S

    1999-01-01

    A re-evaluation is made of the k0-factor and related nuclear data for the 555.8 keV gamma-ray of the 104mRh-104Rh mother-daughter pair that are important in neutron activation analysis (NAA). This study considers that the relevant level is also fed by the 4.34 min 104mRh mother (with an absolute gamma-ray emission probability gamma2 = 0.13%) and not only, as assumed in former work, by the 42.3 s 104Rh daughter isotope (with gamma3 = 2.0%). In view of this, generalised equations were developed for both the experimental determination and the analytical use of the k0-factor and of the associated parameters k0(m)/k0(g), Q0(m) and Q0(g) [(m): 104mRh; (g): 104Rh], requiring the introduction of the gamma2 and gamma3 data and also of the 104mRh -> 104Rh fractional decay factor F2 (= 0.9987). The experimental determinations...

  14. Predictive medicine

    NARCIS (Netherlands)

    Boenink, Marianne; ten Have, Henk

    2015-01-01

    In the last part of the twentieth century, predictive medicine has gained currency as an important ideal in biomedical research and health care. Research in the genetic and molecular basis of disease suggested that the insights gained might be used to develop tests that predict the future health

  15. Prediction Markets

    DEFF Research Database (Denmark)

    Horn, Christian Franz; Ivens, Bjørn Sven; Ohneberg, Michael

    2014-01-01

    In recent years, Prediction Markets gained growing interest as a forecasting tool among researchers as well as practitioners, which resulted in an increasing number of publications. In order to track the latest development of research, comprising the extent and focus of research, this article provides a comprehensive review and classification of the literature related to the topic of Prediction Markets. Overall, 316 relevant articles, published in the timeframe from 2007 through 2013, were identified and assigned to a herein presented classification scheme, differentiating between descriptive works, articles of theoretical nature, application-oriented studies and articles dealing with the topic of law and policy. The analysis of the research results reveals that more than half of the literature pool deals with the application and actual function tests of Prediction Markets. The results...

  16. Predicting unpredictability

    Science.gov (United States)

    Davis, Steven J.

    2018-04-01

    Analysts and markets have struggled to predict a number of phenomena, such as the rise of natural gas, in US energy markets over the past decade or so. Research shows the challenge may grow because the industry — and consequently the market — is becoming increasingly volatile.

  17. Unification predictions

    International Nuclear Information System (INIS)

    Ghilencea, D.; Ross, G.G.; Lanzagorta, M.

    1997-07-01

    The unification of gauge couplings suggests that there is an underlying (supersymmetric) unification of the strong, electromagnetic and weak interactions. The prediction of the unification scale may be the first quantitative indication that this unification may extend to unification with gravity. We make a precise determination of these predictions for a class of models which extend the multiplet structure of the Minimal Supersymmetric Standard Model to include the heavy states expected in many Grand Unified and/or superstring theories. We show that there is a strong cancellation between the 2-loop and threshold effects. As a result the net effect is smaller than previously thought, giving a small increase in both the unification scale and the value of the strong coupling at low energies. (author). 15 refs, 5 figs

  18. Life prediction methodology for ceramic components of advanced vehicular heat engines: Volume 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Khandelwal, P.K.; Provenzano, N.J.; Schneider, W.E. [Allison Engine Co., Indianapolis, IN (United States)

    1996-02-01

    One of the major challenges involved in the use of ceramic materials is ensuring adequate strength and durability. This activity has developed methodology which can be used during the design phase to predict the structural behavior of ceramic components. The effort involved the characterization of injection molded and hot isostatically pressed (HIPed) PY-6 silicon nitride, the development of nondestructive evaluation (NDE) technology, and the development of analytical life prediction methodology. Four failure modes are addressed: fast fracture, slow crack growth, creep, and oxidation. The techniques deal with failures initiating at the surface as well as internal to the component. The life prediction methodologies for fast fracture and slow crack growth have been verified using a variety of confirmatory tests. The verification tests were conducted at room and elevated temperatures up to a maximum of 1371 °C. The tests involved (1) flat circular disks subjected to bending stresses and (2) high-speed rotating spin disks. Reasonable correlation was achieved for a variety of test conditions and failure mechanisms. The predictions associated with surface failures proved to be optimistic, requiring re-evaluation of the components' initial fast fracture strengths. Correlation was achieved for the spin disks which failed in fast fracture from internal flaws. Time-dependent elevated-temperature slow crack growth spin disk failures were also successfully predicted.

  19. Performance of local information-based link prediction: a sampling perspective

    Science.gov (United States)

    Zhao, Jichang; Feng, Xu; Dong, Li; Liang, Xiao; Xu, Ke

    2012-08-01

    Link prediction is pervasively employed to uncover the missing links in the snapshots of real-world networks, which are usually obtained through different kinds of sampling methods. In the previous literature, in order to evaluate the performance of the prediction, known edges in the sampled snapshot are divided into the training set and the probe set randomly, without considering the underlying sampling approaches. However, different sampling methods might lead to different missing links, especially biased ones. For this reason, random partition-based evaluation of performance is no longer convincing if we take the sampling method into account. In this paper, we re-evaluate the performance of local information-based link predictions through a division of the training set and the probe set that is governed by the sampling method. Interestingly, we find that each prediction approach performs unevenly across different sampling methods. Moreover, most of these predictions perform weakly when the sampling method is biased, which indicates that the performance of these methods may have been overestimated in prior works.
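
    To make the evaluation protocol concrete, here is a minimal sketch (our own illustration, not the authors' code) of scoring node pairs with the common-neighbours index and estimating AUC against a probe set drawn either at random or with a degree-biased rule standing in for a biased sampler; all function names are invented.

```python
import random

def common_neighbors_score(adj, u, v):
    # Local-information index: number of shared neighbours.
    return len(adj[u] & adj[v])

def split_edges(edges, frac_probe=0.1, biased=False, seed=0):
    # Random split vs. a degree-biased split that preferentially hides
    # edges touching high-degree nodes (a stand-in for biased sampling).
    rng = random.Random(seed)
    edges = list(edges)
    n_probe = int(frac_probe * len(edges))
    if biased:
        deg = {}
        for u, v in edges:
            deg[u] = deg.get(u, 0) + 1
            deg[v] = deg.get(v, 0) + 1
        edges.sort(key=lambda e: deg[e[0]] + deg[e[1]], reverse=True)
        return edges[n_probe:], edges[:n_probe]
    rng.shuffle(edges)
    return edges[n_probe:], edges[:n_probe]

def auc(train, probe, nodes, trials=10000, seed=0):
    # Probability that a probe (missing) edge outscores a random non-edge.
    rng = random.Random(seed)
    adj = {n: set() for n in nodes}
    for u, v in train:
        adj[u].add(v); adj[v].add(u)
    edge_set = {frozenset(e) for e in train} | {frozenset(e) for e in probe}
    hits = 0.0
    for _ in range(trials):
        pu, pv = rng.choice(probe)
        while True:
            nu, nv = rng.sample(nodes, 2)
            if frozenset((nu, nv)) not in edge_set:
                break
        sp = common_neighbors_score(adj, pu, pv)
        sn = common_neighbors_score(adj, nu, nv)
        hits += 1.0 if sp > sn else 0.5 if sp == sn else 0.0
    return hits / trials
```

    Comparing auc() on the random and the degree-biased probe sets is the style of comparison the paper performs; the two values can differ noticeably.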

  20. Predictable Medea

    Directory of Open Access Journals (Sweden)

    Elisabetta Bertolino

    2010-01-01

    Full Text Available By focusing on the tragedy of the 'unpredictable' infanticide perpetrated by Medea, the paper speculates on the possibility of a non-violent ontological subjectivity for women victims of gendered violence and whether it is possible to respond to violent actions in non-violent ways; it argues that Medea did not act in an unpredictable way, rather through the very predictable subject of resentment and violence. 'Medea' represents the story of all of us who require justice as retribution against any wrong. The presupposition is that the empowered female subjectivity of women’s rights contains the same desire of mastering others of the masculine current legal and philosophical subject. The subject of women’s rights is grounded on the emotions of resentment and retribution and refuses the categories of the private by appropriating those of the righteous, masculine and public subject. The essay opposes the essentialised stereotypes of the feminine and the maternal with an ontological approach of people as singular, corporeal, vulnerable and dependent. There is therefore an emphasis on the excluded categories of the private. Forgiveness is taken into account as a category of the private and a possibility of responding to violence with newness. A violent act is seen in relations to the community of human beings rather than through an isolated setting as in the case of the individual of human rights. In this context, forgiveness allows to risk again and being with. The result is also a rethinking of feminist actions, feminine subjectivity and of the maternal. Overall the paper opens up the Arendtian category of action and forgiveness and the Cavarerian unique and corporeal ontology of the selfhood beyond gendered stereotypes.

  1. HIGH-PRECISION PREDICTIONS FOR THE ACOUSTIC SCALE IN THE NONLINEAR REGIME

    International Nuclear Information System (INIS)

    Seo, Hee-Jong; Eckel, Jonathan; Eisenstein, Daniel J.; Mehta, Kushal; Metchnik, Marc; Pinto, Phillip; Xu Xiaoying; Padmanabhan, Nikhil; Takahashi, Ryuichi; White, Martin

    2010-01-01

    We measure shifts of the acoustic scale due to nonlinear growth and redshift distortions to high precision using a very large volume of high-force-resolution simulations. We compare results from various sets of simulations that differ in their force, volume, and mass resolution. We find a consistency within 1.5σ for shift values from different simulations and derive the shift α(z) - 1 = (0.300 ± 0.015)% [D(z)/D(0)]² using our fiducial set. We find a strong correlation with a non-unity slope between shifts in real space and in redshift space and a weak correlation between the initial redshift and low redshift. Density-field reconstruction not only removes the mean shifts and reduces errors on the mean, but also tightens the correlations. After reconstruction, we recover a slope of near unity for the correlation between the real and redshift space and restore a strong correlation between the initial and the low redshifts. We derive propagators and mode-coupling terms from our N-body simulations and compare with the Zel'dovich approximation and the shifts measured from the χ² fitting, respectively. We interpret the propagator and the mode-coupling term of a nonlinear density field in the context of an average and a dispersion of its complex Fourier coefficients relative to those of the linear density field; from these two terms, we derive a signal-to-noise ratio of the acoustic peak measurement. We attempt to improve our reconstruction method by implementing 2LPT and iterative operations, but we obtain little improvement. The Fisher matrix estimates of uncertainty in the acoustic scale are tested using 5000 h⁻³ Gpc³ of cosmological Particle-Mesh simulations from Takahashi et al. At an expected sample variance level of 1%, the agreement between the Fisher matrix estimates based on Seo and Eisenstein and the N-body results is better than 10%.
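
    As a quick numerical reading of the quoted fitting formula (a sketch only; the growth-factor value below is a placeholder that one would normally take from a Boltzmann code or a fitting function):

```python
def acoustic_scale_shift(growth_ratio, amp=0.300e-2, err=0.015e-2):
    """Evaluate alpha(z) - 1 = (0.300 +/- 0.015)% * [D(z)/D(0)]^2.

    growth_ratio is D(z)/D(0); returns the mean shift and its 1-sigma error.
    """
    return amp * growth_ratio**2, err * growth_ratio**2

# Example with a hypothetical D(z)/D(0) = 0.61, roughly z ~ 1 in LambdaCDM.
shift, sigma = acoustic_scale_shift(0.61)
print(f"alpha - 1 = {shift:.5f} +/- {sigma:.5f}")
```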

  2. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  3. Predictive value of the transtheoretical model to smoking cessation in hospitalized patients with cardiovascular disease.

    Science.gov (United States)

    Chouinard, Maud-Christine; Robichaud-Ekstrand, Sylvie

    2007-02-01

    Several authors have questioned the transtheoretical model. Determining the predictive value of each cognitive-behavioural element within this model could explain the multiple successes reported in smoking cessation programmes. The purpose of this study was to predict point-prevalent smoking abstinence at 2 and 6 months, using the constructs of the transtheoretical model, when applied to a pooled sample of individuals who were hospitalized for a cardiovascular event. The study follows a predictive correlational design. Recently hospitalized patients (n=168) with cardiovascular disease were pooled from a randomized, controlled trial. Independent variables of the predictive transtheoretical model comprise stages and processes of change, pros and cons of quitting smoking (decisional balance), self-efficacy, and social support. These were evaluated at baseline, 2 and 6 months. Compared to smokers, individuals who abstained from smoking at 2 and 6 months were more confident at baseline of remaining non-smokers, perceived fewer pros and cons of continuing smoking, made less use of the consciousness-raising and self-re-evaluation experiential processes of change, and received more positive reinforcement from their social network with regard to their smoke-free behaviour. Self-efficacy and stages of change at baseline were predictive of smoking abstinence after 6 months. Other variables found to be predictive of smoking abstinence at 6 months were an increase in self-efficacy, an increase in positive social support behaviour, and a decrease in the pros within the decisional balance. The results partially support the predictive value of the transtheoretical model constructs in smoking cessation for cardiovascular disease patients.
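
    A predictive-correlational design of this kind typically reduces to a logistic regression of abstinence on baseline constructs. The sketch below shows the general shape of such an analysis with synthetic data and invented variable names (not the study's dataset or model specification):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 168  # cohort size from the abstract; the data below are synthetic

# Hypothetical baseline TTM constructs: self-efficacy, pros, cons,
# experiential-process use, positive social support.
X = rng.normal(size=(n, 5))
# Synthetic outcome loosely tied to self-efficacy and support.
logit = 0.9 * X[:, 0] - 0.4 * X[:, 2] + 0.6 * X[:, 4]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
for name, coef in zip(
    ["self_efficacy", "pros", "cons", "processes", "support"], model.coef_[0]
):
    print(f"{name:>14}: odds ratio {np.exp(coef):.2f}")
```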

  4. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  5. Substantial proportion of MODY among multiplex families participating in a Type 1 diabetes prediction programme.

    Science.gov (United States)

    Petruzelkova, L; Dusatkova, P; Cinek, O; Sumnik, Z; Pruhova, S; Hradsky, O; Vcelakova, J; Lebl, J; Kolouskova, S

    2016-12-01

    Patients with maturity-onset diabetes of the young (MODY) might be over-represented in families with histories of Type 1 diabetes. Our aim was to re-evaluate families participating in the Czech T1D Prediction Programme (PREDIA.CZ) with at least two members affected with diabetes, to assess the proportion of MODY among these families and determine its most significant clinical predictors. Of the 557 families followed up by the PREDIA.CZ, 53 (9.5%) had two or more family members with diabetes. One proband with diabetes from these families was chosen for direct sequencing of the GCK, HNF1A, HNF4A and INS genes. Non-parametric tests and a linear logistic regression model were used to evaluate differences between MODY and non-MODY families. MODY was genetically diagnosed in 24 of the 53 families with multiple occurrences of diabetes (45%). Mutations were detected most frequently in GCK (58%), followed by HNF1A (38%) and INS (4%). MODY families were more likely to have a parent with diabetes and had a higher proportion of females with diabetes than non-MODY families. Higher age also distinguished MODY families already presenting with diabetes. A prediction programme for Type 1 diabetes would provide a useful new source of patients with MODY most likely to benefit from an accurate diagnosis. This identification has implications for patient treatment and disease prognosis. © 2015 Diabetes UK.

  6. Preoperative Aspartate Aminotransferase-to-Platelet Ratio Index Predicts Perioperative Liver-Related Complications Following Liver Resection for Colorectal Cancer Metastases

    DEFF Research Database (Denmark)

    Amptoulach, S.; Gross, G.; Sturesson, C.

    2017-01-01

    Background and Aims: There are limited data on the potential role of preoperative non-invasive markers, specifically the aspartate-to-alanine aminotransferase ratio and the aspartate aminotransferase-to-platelet ratio index, in predicting perioperative liver-related complications after hepatectomy... ... collected from medical records. The nontumorous liver parenchyma in the surgical specimens of 31 patients was re-evaluated. Results: Overall, 215 patients were included. In total, 40% underwent neoadjuvant chemotherapy and 47% major resection, while 47% had perioperative complications (6% liver-related). In multivariate regression analysis, the aspartate aminotransferase-to-platelet ratio index was independently associated with liver-related complications (odds ratio: 1.149, p = 0.003) and perioperative liver failure (odds ratio: 1.155, p = 0.012). The latter was also true in the subcohort of patients...

  7. Predicting outdoor sound

    CERN Document Server

    Attenborough, Keith; Horoshenkov, Kirill

    2014-01-01

    1. Introduction  2. The Propagation of Sound Near Ground Surfaces in a Homogeneous Medium  3. Predicting the Acoustical Properties of Outdoor Ground Surfaces  4. Measurements of the Acoustical Properties of Ground Surfaces and Comparisons with Models  5. Predicting Effects of Source Characteristics on Outdoor Sound  6. Predictions, Approximations and Empirical Results for Ground Effect Excluding Meteorological Effects  7. Influence of Source Motion on Ground Effect and Diffraction  8. Predicting Effects of Mixed Impedance Ground  9. Predicting the Performance of Outdoor Noise Barriers  10. Predicting Effects of Vegetation, Trees and Turbulence  11. Analytical Approximations including Ground Effect, Refraction and Turbulence  12. Prediction Schemes  13. Predicting Sound in an Urban Environment.

  8. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system significantly. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studie...

  9. Predictable or not predictable? The MOV question

    International Nuclear Information System (INIS)

    Thibault, C.L.; Matzkiw, J.N.; Anderson, J.W.; Kessler, D.W.

    1994-01-01

    Over the past 8 years, the nuclear industry has struggled to understand the dynamic phenomena experienced during motor-operated valve (MOV) operation under differing flow conditions. For some valves and designs, operational functionality has been found to be predictable; for others, unpredictable. Although much has been accomplished over this period of time, especially on modeling valve dynamics, the unpredictability of many valves and designs still exists. A few valve manufacturers are focusing on improving design and fabrication techniques to enhance product reliability and predictability. However, this approach does not address these issues for installed and unpredictable valves. This paper presents some of the more promising techniques that Wyle Laboratories has explored with potential for transforming unpredictable valves into predictable valves and for retrofitting installed MOVs. These techniques include optimized valve tolerancing, surrogate material evaluation, and enhanced surface treatments.

  10. Predictive systems ecology.

    Science.gov (United States)

    Evans, Matthew R; Bithell, Mike; Cornell, Stephen J; Dall, Sasha R X; Díaz, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J; Lewis, Simon L; Mace, Georgina M; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim; Norris, K J; Petchey, Owen; Smith, Matthew; Travis, Justin M J; Benton, Tim G

    2013-11-22

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of predictive systems ecology, explicitly to understand and predict the properties and behaviour of ecological systems. We discuss the necessary and desirable features of predictive systems ecology models. There are places where predictive systems ecology is already being practised and we summarize a range of terrestrial and marine examples. Significant challenges remain but we suggest that ecology would benefit both as a scientific discipline and increase its impact in society if it were to embrace the need to become more predictive.

  11. Seismology for rockburst prediction.

    CSIR Research Space (South Africa)

    De Beer, W

    2000-02-01

    Full Text Available Project GAP409 presents a method (SOOTHSAY) for predicting larger mining-induced seismic events in gold mines, as well as a pattern recognition algorithm (INDICATOR) for characterising the seismic response of rock to mining and inferring future... State. Defining the time series of a specific function on a catalogue as a prediction strategy, the algorithm currently has a success rate of 53% and 65%, respectively, of large events claimed as being predicted in these two cases, with uncertainties...

  12. Predictability of Conversation Partners

    Science.gov (United States)

    Takaguchi, Taro; Nakamura, Mitsuhiro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-08-01

    Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one’s conversation partners is defined as the degree to which one’s next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, a predictability also exists in the data, apart from the contribution of their long-tailed nature. In addition, an individual’s predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community—in the sense of an abundance of surrounding triangles—tend to have low predictability, and those bridging different communities tend to have high predictability.
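
    The predictability measure described here is the mutual information between the current and the next conversation partner, estimated from each individual's partner sequence. A minimal sketch with a made-up sequence:

```python
from collections import Counter
from math import log2

def partner_mutual_information(sequence):
    """Mutual information (bits) between current and next conversation
    partner, estimated from empirical pair frequencies."""
    pairs = list(zip(sequence, sequence[1:]))
    n = len(pairs)
    p_pair = Counter(pairs)
    p_cur = Counter(cur for cur, _ in pairs)
    p_nxt = Counter(nxt for _, nxt in pairs)
    mi = 0.0
    for (cur, nxt), count in p_pair.items():
        p_xy = count / n
        mi += p_xy * log2(p_xy / ((p_cur[cur] / n) * (p_nxt[nxt] / n)))
    return mi

# Hypothetical partner sequence for one individual.
print(partner_mutual_information(list("ABABCABACBAB")))
```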

  13. Predictability of Conversation Partners

    Directory of Open Access Journals (Sweden)

    Taro Takaguchi

    2011-09-01

    Full Text Available Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one’s conversation partners is defined as the degree to which one’s next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, a predictability also exists in the data, apart from the contribution of their long-tailed nature. In addition, an individual’s predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community—in the sense of an abundance of surrounding triangles—tend to have low predictability, and those bridging different communities tend to have high predictability.

  14. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the real-time domain have started to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time-predictable, i.e., architectures where it is possible to statically derive a tight bound of the worst-case execution time. To compare different approaches we would like to quantify time predictability. That means we need to measure time predictability. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  15. Predicting scholars' scientific impact.

    Directory of Open Access Journals (Sweden)

    Amin Mazloumian

    Full Text Available We tested the underlying assumption that citation counts are reliable predictors of future success, analyzing complete citation data on the careers of ~150,000 scientists. Our results show that (i) among all citation indicators, the annual citations at the time of prediction is the best predictor of future citations; (ii) future citations of a scientist's published papers can be predicted accurately (r² = 0.80 for a 1-year prediction, P < 0.001); but (iii) future citations of future work are hardly predictable.

  16. Identification and prediction of diabetic sensorimotor polyneuropathy using individual and simple combinations of nerve conduction study parameters.

    Directory of Open Access Journals (Sweden)

    Alanna Weisman

    Full Text Available OBJECTIVE: Evaluation of diabetic sensorimotor polyneuropathy (DSP) is hindered by the need for complex nerve conduction study (NCS) protocols and the lack of predictive biomarkers. We aimed to determine the performance of single and simple combinations of NCS parameters for identification and future prediction of DSP. MATERIALS AND METHODS: 406 participants (61 with type 1 diabetes and 345 with type 2 diabetes) with a broad spectrum of neuropathy, from none to severe, underwent NCS to determine presence or absence of DSP for cross-sectional (concurrent validity) analysis. The 109 participants without baseline DSP were re-evaluated for its future onset (predictive validity). Performance of NCS parameters was compared by area under the receiver operating characteristic curve (AROC). RESULTS: At baseline there were 246 (60%) Prevalent Cases. After a mean follow-up of 3.9 years, 25 (23%) of the 109 Prevalent Controls that were followed became Incident DSP Cases. Threshold values for peroneal conduction velocity and sural amplitude potential best identified Prevalent Cases (AROC 0.90 and 0.83, sensitivity 80 and 83%, specificity 89 and 72%, respectively). Baseline tibial F-wave latency, peroneal conduction velocity and the sum of three lower limb nerve conduction velocities (sural, peroneal, and tibial) best predicted 4-year incidence (AROC 0.79, 0.79, and 0.85; sensitivity 79, 70, and 81%; specificity 63, 74 and 77%, respectively). DISCUSSION: Individual NCS parameters or their simple combinations are valid measures for identification and future prediction of DSP. Further research into the predictive roles of tibial F-wave latencies, peroneal conduction velocity, and the sum of conduction velocities as markers of incipient nerve injury is needed to risk-stratify individuals for clinical and research protocols.
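
    Threshold-based screening on a single NCS parameter is evaluated like any scalar classifier. A hedged sketch with synthetic values (not study data), assuming scikit-learn is available:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(2)

# Synthetic sural amplitudes (lower in DSP cases) and case labels.
dsp = rng.integers(0, 2, size=200)
sural_amp = np.where(dsp == 1, rng.normal(4, 2, 200), rng.normal(10, 3, 200))
sural_amp = sural_amp.clip(min=0)

# Lower amplitude indicates disease, so score by the negated value.
auc = roc_auc_score(dsp, -sural_amp)
fpr, tpr, thresholds = roc_curve(dsp, -sural_amp)
best = np.argmax(tpr - fpr)  # Youden's J picks a single operating threshold
print(f"AROC = {auc:.2f}; best cut-off at amplitude {-thresholds[best]:.1f}")
```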

  17. Flood design recipes vs. reality: can predictions for ungauged basins be trusted?

    Science.gov (United States)

    Efstratiadis, A.; Koussis, A. D.; Koutsoyiannis, D.; Mamassis, N.

    2014-06-01

    Despite the great scientific and technological advances in flood hydrology, everyday engineering practices still follow simplistic approaches that are easy to formally implement in ungauged areas. In general, these "recipes" have been developed many decades ago, based on field data from typically few experimental catchments. However, many of them have been neither updated nor validated across all hydroclimatic and geomorphological conditions. This has an obvious impact on the quality and reliability of hydrological studies, and, consequently, on the safety and cost of the related flood protection works. Preliminary results, based on historical flood data from Cyprus and Greece, indicate that a substantial revision of many aspects of flood engineering procedures is required, including the regionalization formulas as well as the modelling concepts themselves. In order to provide a consistent design framework and to ensure realistic predictions of the flood risk (a key issue of the 2007/60/EU Directive) in ungauged basins, it is necessary to rethink the current engineering practices. In this vein, the collection of reliable hydrological data would be essential for re-evaluating the existing "recipes", taking into account local peculiarities, and for updating the modelling methodologies as needed.

  18. Flood design recipes vs. reality: can predictions for ungauged basins be trusted?

    Science.gov (United States)

    Efstratiadis, A.; Koussis, A. D.; Koutsoyiannis, D.; Mamassis, N.

    2013-12-01

    Despite the great scientific and technological advances in flood hydrology, everyday engineering practices still follow simplistic approaches, such as the rational formula and the SCS-CN method combined with the unit hydrograph theory that are easy to formally implement in ungauged areas. In general, these "recipes" have been developed many decades ago, based on field data from few experimental catchments. However, many of them have been neither updated nor validated across all hydroclimatic and geomorphological conditions. This has an obvious impact on the quality and reliability of hydrological studies, and, consequently, on the safety and cost of the related flood protection works. Preliminary results, based on historical flood data from Cyprus and Greece, indicate that a substantial revision of many aspects of flood engineering procedures is required, including the regionalization formulas as well as the modelling concepts themselves. In order to provide a consistent design framework and to ensure realistic predictions of the flood risk (a key issue of the 2007/60/EU Directive) in ungauged basins, it is necessary to rethink the current engineering practices. In this vein, the collection of reliable hydrological data would be essential for re-evaluating the existing "recipes", taking into account local peculiarities, and for updating the modelling methodologies as needed.
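
    For reference, the "recipes" in question are one-line formulas; the rational method, for instance, estimates peak discharge as Q = C·i·A. A toy evaluation with illustrative inputs:

```python
def rational_peak_discharge(c, i_mm_per_hr, area_km2):
    """Rational formula Q = C * i * A, returned in m^3/s.

    c: dimensionless runoff coefficient (illustrative value below),
    i: rainfall intensity in mm/h, area in km^2.
    The 0.278 factor converts mm/h * km^2 into m^3/s.
    """
    return 0.278 * c * i_mm_per_hr * area_km2

# Hypothetical small catchment: C = 0.45, i = 30 mm/h, A = 12 km^2.
print(f"Q_peak = {rational_peak_discharge(0.45, 30, 12):.1f} m^3/s")
```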

  19. The Prediction Value

    NARCIS (Netherlands)

    Koster, M.; Kurz, S.; Lindner, I.; Napel, S.

    2013-01-01

    We introduce the prediction value (PV) as a measure of players’ informational importance in probabilistic TU games. The latter combine a standard TU game and a probability distribution over the set of coalitions. Player i’s prediction value equals the difference between the conditional expectations

  20. Predictability of Stock Returns

    Directory of Open Access Journals (Sweden)

    Ahmet Sekreter

    2017-06-01

    Full Text Available The predictability of stock returns has been shown by empirical studies over time. This article collects the most important theories on forecasting stock returns and investigates the factors affecting the behavior of stock prices and the market as a whole. Identifying these factors and how to estimate them are the key issues in the predictability of stock returns.

  1. Predicting AD conversion

    DEFF Research Database (Denmark)

    Liu, Yawu; Mattila, Jussi; Ruiz, Miguel Ángel Muñoz

    2013-01-01

    To compare the accuracies of predicting AD conversion by using a decision support system (PredictAD tool) and current research criteria of prodromal AD as identified by combinations of episodic memory impairment of hippocampal type and visual assessment of medial temporal lobe atrophy (MTA) on MRI...

  2. Predicting Free Recalls

    Science.gov (United States)

    Laming, Donald

    2006-01-01

    This article reports some calculations on free-recall data from B. Murdock and J. Metcalfe (1978), with vocal rehearsal during the presentation of a list. Given the sequence of vocalizations, with the stimuli inserted in their proper places, it is possible to predict the subsequent sequence of recalls--the predictions taking the form of a…

  3. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  4. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
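
    A minimal Python sketch of the approach described above (replicated Latin hypercube sampling plus variance-ratio importance indicators); the sample sizes, the toy model and all names are illustrative, not taken from the report:

        import numpy as np

        def mckay_variance_ratios(model, n=64, d=3, r=10, seed=0):
            """Estimate Var(E[Y|X_j]) / Var(Y) for each input X_j using
            replicated Latin hypercube sampling: one set of stratified values
            per input, independently re-paired in each replicate."""
            rng = np.random.default_rng(seed)
            base = (np.arange(n)[:, None] + rng.random((n, d))) / n  # stratified U(0,1) values
            perms = np.array([[rng.permutation(n) for _ in range(d)]
                              for _ in range(r)])                    # (r, d, n) pairings
            y = np.empty((r, n))
            for m in range(r):                                       # run the model on each replicate
                x = np.column_stack([base[perms[m, j], j] for j in range(d)])
                y[m] = np.apply_along_axis(model, 1, x)
            ratios = []
            for j in range(d):
                inv = np.argsort(perms[:, j], axis=1)                # stratum index -> sample row
                cond = np.array([y[m, inv[m]] for m in range(r)])    # outputs grouped by stratum of X_j
                ratios.append(cond.mean(axis=0).var() / y.var())     # Var(E[Y|X_j]) / Var(Y)
            return np.array(ratios)

        # Toy model: Y depends strongly on x0, weakly on x1, not at all on x2.
        print(mckay_variance_ratios(lambda x: 10 * x[0] + x[1]))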

  5. Ground motion predictions

    Energy Technology Data Exchange (ETDEWEB)

    Loux, P C [Environmental Research Corporation, Alexandria, VA (United States)

    1969-07-01

    Nuclear generated ground motion is defined and then related to the physical parameters that cause it. Techniques employed for prediction of ground motion peak amplitude, frequency spectra and response spectra are explored, with initial emphasis on the analysis of data collected at the Nevada Test Site (NTS). NTS postshot measurements are compared with pre-shot predictions. Applicability of these techniques to new areas, for example, Plowshare sites, must be questioned. Fortunately, the Atomic Energy Commission is sponsoring complementary studies to improve prediction capabilities primarily in new locations outside the NTS region. Some of these are discussed in the light of anomalous seismic behavior, and comparisons are given showing theoretical versus experimental results. In conclusion, current ground motion prediction techniques are applied to events off the NTS. Predictions are compared with measurements for the event Faultless and for the Plowshare events, Gasbuggy, Cabriolet, and Buggy I. (author)

  6. Ground motion predictions

    International Nuclear Information System (INIS)

    Loux, P.C.

    1969-01-01

    Nuclear generated ground motion is defined and then related to the physical parameters that cause it. Techniques employed for prediction of ground motion peak amplitude, frequency spectra and response spectra are explored, with initial emphasis on the analysis of data collected at the Nevada Test Site (NTS). NTS postshot measurements are compared with pre-shot predictions. Applicability of these techniques to new areas, for example, Plowshare sites, must be questioned. Fortunately, the Atomic Energy Commission is sponsoring complementary studies to improve prediction capabilities primarily in new locations outside the NTS region. Some of these are discussed in the light of anomalous seismic behavior, and comparisons are given showing theoretical versus experimental results. In conclusion, current ground motion prediction techniques are applied to events off the NTS. Predictions are compared with measurements for the event Faultless and for the Plowshare events, Gasbuggy, Cabriolet, and Buggy I. (author)

  7. Structural prediction in aphasia

    Directory of Open Access Journals (Sweden)

    Tessa Warren

    2015-05-01

    Full Text Available There is considerable evidence that young healthy comprehenders predict the structure of upcoming material, and that their processing is facilitated when they encounter material matching those predictions (e.g., Staub & Clifton, 2006; Yoshida, Dickey & Sturt, 2013). However, less is known about structural prediction in aphasia. There is evidence that lexical prediction may be spared in aphasia (Dickey et al., 2014; Love & Webb, 1977; cf. Mack et al., 2013). However, predictive mechanisms supporting facilitated lexical access may not necessarily support structural facilitation. Given that many people with aphasia (PWA) exhibit syntactic deficits (e.g. Goodglass, 1993), PWA with such impairments may not engage in structural prediction. However, recent evidence suggests that some PWA may indeed predict upcoming structure (Hanne, Burchert, De Bleser, & Vashishth, 2015). Hanne et al. tracked the eyes of PWA (n=8) with sentence-comprehension deficits while they listened to reversible subject-verb-object (SVO) and object-verb-subject (OVS) sentences in German, in a sentence-picture matching task. Hanne et al. manipulated case and number marking to disambiguate the sentences’ structure. Gazes to an OVS or SVO picture during the unfolding of a sentence were assumed to indicate prediction of the structure congruent with that picture. According to this measure, the PWA’s structural prediction was impaired compared to controls, but they did successfully predict upcoming structure when morphosyntactic cues were strong and unambiguous. Hanne et al.’s visual-world evidence is suggestive, but their forced-choice sentence-picture matching task places tight constraints on possible structural predictions. Clearer evidence of structural prediction would come from paradigms where the content of upcoming material is not as constrained. The current study used a self-paced reading study to examine structural prediction among PWA in less constrained contexts. PWA (n=17) who…

  8. Prediction of bull fertility.

    Science.gov (United States)

    Utt, Matthew D

    2016-06-01

    Prediction of male fertility is an often sought-after endeavor for many species of domestic animals. This review will primarily focus on providing some examples of dependent and independent variables to stimulate thought about the approach and methodology of identifying the most appropriate of those variables to predict bull (bovine) fertility. Although the list of variables will continue to grow with advancements in science, the principles behind making predictions will likely not change significantly. The basic principle of prediction requires identifying a dependent variable that is an estimate of fertility and an independent variable or variables that may be useful in predicting the fertility estimate. Fertility estimates vary in which parts of the process leading to conception they capture, in the amount of variation that influences the estimate, and in the uncertainty thereof. The list of potential independent variables can be divided into measures of sperm competence based on performance in bioassays and direct measurements of sperm attributes. A good prediction will use a sample population of bulls that is representative of the population to which an inference will be made. Both dependent and independent variables should have a dynamic range in their values. Careful selection of independent variables includes reasonable measurement repeatability and minimal correlation among variables. Proper estimation, and an appreciation of the degree of uncertainty, of dependent and independent variables are crucial for using predictions to make decisions regarding bull fertility. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Bootstrap prediction and Bayesian prediction under misspecified models

    OpenAIRE

    Fushiki, Tadayoshi

    2005-01-01

    We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...
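
    A minimal Python sketch of the contrast drawn above, under a deliberately misspecified model (a Gaussian-error fit to heavy-tailed data); the data-generating process and all numbers are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(1)
        x = rng.normal(size=40)
        y = 2.0 * x + rng.standard_t(df=3, size=40)      # heavy-tailed noise: the assumed model is wrong

        def plugin_predict(xt, yt, x_new):
            # Plug-in prediction: fit once (least squares through the origin),
            # then predict with the point estimate.
            slope = (xt @ yt) / (xt @ xt)
            return slope * x_new

        def bootstrap_predict(xt, yt, x_new, b=500):
            # Breiman-style bagging of the plug-in rule: average predictions
            # over bootstrap resamples of the training data.
            n = len(xt)
            preds = []
            for _ in range(b):
                idx = rng.integers(0, n, n)              # resample with replacement
                preds.append(plugin_predict(xt[idx], yt[idx], x_new))
            return np.mean(preds)

        print(plugin_predict(x, y, 1.5), bootstrap_predict(x, y, 1.5))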

  10. Prediction ranges. Annual review

    Energy Technology Data Exchange (ETDEWEB)

    Parker, J.C.; Tharp, W.H.; Spiro, P.S.; Keng, K.; Angastiniotis, M.; Hachey, L.T.

    1988-01-01

    Prediction ranges equip the planner with one more tool for improved assessment of the outcome of a course of action. One of their major uses is in financial evaluations, where corporate policy requires the performance of uncertainty analysis for large projects. This report gives an overview of the uses of prediction ranges, with examples; and risks and uncertainties in growth, inflation, and interest and exchange rates. Prediction ranges and standard deviations of 80% and 50% probability are given for various economic indicators in Ontario, Canada, and the USA, as well as for foreign exchange rates and Ontario Hydro interest rates. An explanatory note on probability is also included. 23 tabs.
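
    As a sketch of how such ranges translate into numbers, assuming normally distributed forecast errors (an assumption of this sketch, not necessarily of the report), a central range of probability p is the point forecast plus or minus z standard deviations:

        from scipy.stats import norm

        mean, sd = 3.2, 0.8        # hypothetical growth forecast (%) and its standard deviation
        for p in (0.80, 0.50):
            z = norm.ppf(0.5 + p / 2)                    # half-width in standard deviations
            print(f"{p:.0%} prediction range: {mean - z * sd:.2f} .. {mean + z * sd:.2f}")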

  11. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
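
    A minimal Python sketch of such a stochastic simulation model: a correlated Gaussian driver mapped through the inverse CDF of a target distribution, so the samples reproduce both a chosen speed distribution and hour-to-hour persistence. The Weibull marginal and every parameter below are assumptions of this sketch, not the Goldstone model itself:

        import numpy as np
        from scipy import stats

        def wind_series(n_hours, k=2.0, c=7.0, rho=0.9, seed=0):
            """Hourly wind speeds with a Weibull marginal (shape k, scale c
            in m/s) and AR(1)-style persistence; the rank-preserving map
            keeps the correlation approximately, not exactly."""
            rng = np.random.default_rng(seed)
            z = np.empty(n_hours)
            z[0] = rng.normal()
            for t in range(1, n_hours):                  # correlated Gaussian driver
                z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.normal()
            u = stats.norm.cdf(z)                        # map to uniforms, preserving ordering
            return stats.weibull_min.ppf(u, k, scale=c)  # impose the target marginal

        speeds = wind_series(24 * 7)                     # one simulated week
        print(round(speeds.mean(), 2), round(speeds.std(), 2))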

  12. Protein Sorting Prediction

    DEFF Research Database (Denmark)

    Nielsen, Henrik

    2017-01-01

    Many computational methods are available for predicting protein sorting in bacteria. When comparing them, it is important to know that they can be grouped into three fundamentally different approaches: signal-based, global-property-based and homology-based prediction. In this chapter, the strengths and drawbacks of each of these approaches are described through many examples of methods that predict secretion, integration into membranes, or subcellular locations in general. The aim of this chapter is to provide a user-level introduction to the field with a minimum of computational theory.

  13. 'Red Flag' Predictions

    DEFF Research Database (Denmark)

    Hallin, Carina Antonia; Andersen, Torben Juul; Tveterås, Sigbjørn

    This conceptual article introduces a new way to predict firm performance based on aggregation of sensing among frontline employees about changes in operational capabilities to update strategic action plans and generate innovations. We frame the approach in the context of first- and second-generation prediction markets and outline its unique features as a third-generation prediction market. It is argued that frontline employees gain deep insights when they execute operational activities on an ongoing basis in the organization. The experiential learning from close interaction with internal and external…

  14. Towards Predictive Association Theories

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios; Tsivintzelis, Ioannis; Michelsen, Michael Locht

    2011-01-01

    Association equations of state like SAFT, CPA and NRHB have been previously applied to many complex mixtures. In this work we focus on two of these models, the CPA and the NRHB equations of state, and the emphasis is on the analysis of their predictive capabilities for a wide range of applications. We use the term predictive in two situations: (i) with no use of binary interaction parameters, and (ii) multicomponent calculations using binary interaction parameters based solely on binary data. It is shown that the CPA equation of state can satisfactorily predict CO2–water–glycols–alkanes VLE…

  15. Prediction of intermetallic compounds

    International Nuclear Information System (INIS)

    Burkhanov, Gennady S; Kiselyova, N N

    2009-01-01

    The problems of predicting not yet synthesized intermetallic compounds are discussed. It is noted that the use of classical physicochemical analysis in the study of multicomponent metallic systems is faced with the complexity of presenting multidimensional phase diagrams. One way of predicting new intermetallics with specified properties is the use of modern information technologies in which a computer is trained to recognize patterns among known compounds. The algorithms used most often in these methods are briefly considered and the efficiency of their use for predicting new compounds is demonstrated.
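
    The review does not name the algorithms. As one hedged illustration of the pattern-recognition idea, a classifier can be trained on descriptors of known systems and applied to unexplored ones; the descriptors, values and model choice below are all invented:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Hypothetical training set, one row per binary A-B system: electronegativity
        # difference, atomic-radius ratio, total valence-electron count;
        # label = 1 if a compound is known to form. All values are illustrative only.
        X = np.array([[0.4, 1.10, 7], [1.2, 1.45, 9], [0.1, 1.02, 6],
                      [0.9, 1.30, 8], [1.5, 1.60, 10], [0.2, 1.05, 5]])
        y = np.array([0, 1, 0, 1, 1, 0])

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        print(clf.predict([[1.0, 1.35, 8]]))             # screen a not-yet-synthesized system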

  16. Re-Evaluation of Morphological Characters Questions Current Views of Pinniped Origins

    Directory of Open Access Journals (Sweden)

    Koretsky I. A.

    2016-08-01

    Full Text Available The origin of pinnipeds has been a contentious issue, with opposite sides debating monophyly or diphyly. This review uses evidence from the fossil record, combined with comparative morphology, molecular and cytogenetic investigations to evaluate the evolutionary history and phylogenetic relationships of living and fossil otarioid and phocoid pinnipeds. Molecular investigations support a monophyletic origin of pinnipeds, but disregard vital morphological data. Likewise, morphological studies support diphyly, but overlook molecular analyses. This review will demonstrate that a monophyletic origin of pinnipeds should not be completely accepted, as is the current ideology, and a diphyletic origin remains viable due to morphological and paleobiological analyses. Critical examination of certain characters, used by supporters of pinniped monophyly, reveals different polarities, variability, or simply convergence. The paleontological record and our morphological analysis of important characters support a diphyletic origin of pinnipeds, with otarioids likely arising in the North Pacific from large, bear-like animals and phocids arising in the North Atlantic from smaller, otter-like ancestors. Although members of both groups are known by Late Oligocene time, each developed and invaded the aquatic environment separately from their much earlier, common arctoid ancestor. Therefore, we treat the superfamily Otarioidea as being monophyletic, including the families Enaliarctidae, Otariidae (fur seals/sea lions), Desmatophocidae, and Odobenidae (walruses and extinct relatives), and the superfamily Phocoidea as monophyletic, including only the family Phocidae, with four subfamilies (Devinophocinae, Phocinae, Monachinae, and Cystophorinae).

  17. Re-evaluation of peroxide value as an indicator of the quality of edible oils.

    Science.gov (United States)

    Shiozawa, Satoshi; Tanaka, Masaharu; Ohno, Katsutoshi; Nagao, Yasuhiro; Yamada, Toshihiro

    2007-06-01

    The oxidation of oils has important effects on the quality of oily foods, such as instant noodles. In particular, the generation of aldehydes, which accompanies the oxidation of oils, is one of the first factors to reduce food quality. We examined various indicators of oil quality during temperature-accelerated storage and found that peroxide value (POV) was highly correlated with the total concentration of major odorants. Moreover, the correlation of POV with the total concentration of five unsaturated aldehydes (t-2-heptenal, t-2-octenal, t-2-decenal, t-2-undecenal and t,t-2,4-decadienal) that show strong cytotoxicity was greater than the correlation of POV with the total concentration of major odorants. The maximum allowable concentration of the five aldehydes was calculated based on the 'no observed adverse-effect level' of the aldehyde that showed the highest cytotoxicity, t,t-2,4-decadienal, along with the human daily oil intake. We showed that it is useful to utilize POV as an indicator to control food quality and safety.

  18. Oxygen fugacity control in piston-cylinder experiments: a re-evaluation

    Science.gov (United States)

    Jakobsson, Sigurdur; Blundy, Jon; Moore, Gordon

    2014-06-01

    Jakobsson (Contrib Miner Petrol 164(3):397-407, 2012) investigated a double capsule assembly for use in piston-cylinder experiments that would allow hydrous, high-temperature, and high-pressure experiments to be conducted under controlled oxygen fugacity conditions. Using a platinum outer capsule containing a metal oxide oxygen buffer (Ni-NiO or Co-CoO) and H2O, with an inner gold-palladium capsule containing hydrous melt, this study was able to compare the oxygen fugacity imposed by the outer capsule oxygen buffer with an oxygen fugacity estimated by the AuPdFe ternary system calibrated by Barr and Grove (Contrib Miner Petrol 160(5):631-643, 2010). H2O loss or gain, as well as iron loss to the capsule walls and carbon contamination, are often observed in piston-cylinder experiments and often go unexplained. Only a few studies have attempted to actually quantify various aspects of these changes (Brooker et al. in Am Miner 83(9-10):985-994, 1998; Truckenbrodt and Johannes in Am Miner 84:1333-1335, 1999). It was one of the goals of Jakobsson (Contrib Miner Petrol 164(3):397-407, 2012) to address these issues by using and testing the AuPdFe solution model of Barr and Grove (Contrib Miner Petrol 160(5):631-643, 2010), as well as to constrain the oxygen fugacity of the inner capsule. The oxygen fugacities of the analyzed melts were assumed to be equal to those of the solid Ni-NiO and Co-CoO buffers, which is incorrect since the melts are all undersaturated in H2O and the oxygen fugacities should therefore be lower than that of the buffer by 2 log units.

  19. A re-evaluation of the phylogeny of Old World Treefrogs | Channing ...

    African Journals Online (AJOL)

    Seven subfamilies are recognized; six are monophyletic (Hyperoliidae: Hyperoliinae, Kassininae, Leptopelinae, Tachycneminae; Rhacophoridae: Buergeriinae, Mantellinae), while the Rhacophorinae are polyphyletic. The taxonomic changes from the standard Amphibian Species of the World (Frost 1985) proposed are: ...

  20. Latent dimensions of social anxiety disorder: A re-evaluation of the Social Phobia Inventory (SPIN).

    Science.gov (United States)

    Campbell-Sills, Laura; Espejo, Emmanuel; Ayers, Catherine R; Roy-Byrne, Peter; Stein, Murray B

    2015-12-01

    The Social Phobia Inventory (SPIN; Connor et al., 2000) is a well-validated instrument for assessing severity of social anxiety disorder (SAD). However, evaluations of its factor structure have produced inconsistent results and this aspect of the scale requires further study. Primary care patients with SAD (N=397) completed the SPIN as part of baseline assessment for the Coordinated Anxiety Learning and Management study (Roy-Byrne et al., 2010). These data were used for exploratory and confirmatory factor analysis of the SPIN. A 3-factor model provided the best fit for the data and factors were interpreted as Fear of Negative Evaluation, Fear of Physical Symptoms, and Fear of Uncertainty in Social Situations. Tests of a second-order model showed that the three factors loaded strongly on a single higher-order factor that was labeled Social Anxiety. Findings are consistent with theories identifying Fear of Negative Evaluation as the core feature of SAD, and with evidence that anxiety sensitivity and intolerance of uncertainty further contribute to SAD severity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Re-evaluation of superheat conditions postulated in NRC Information Notice 84-90

    International Nuclear Information System (INIS)

    Alsammarae, A.; Kruger, D.; Beutel, D.; Spisak, M.

    1994-01-01

    Information Notice 84-90, "Main Steam Line Break Effect on Environmental Qualification of Equipment," describes a potential problem regarding existing plant analyses and Equipment Qualification (EQ) related to a postulated Main Steam Line Break (MSLB) with releases of superheated steam. This notice states that certain methodologies for computing mass and energy releases for a postulated MSLB did not account for heat transfer from the steam generator tube bundles if they were uncovered. Due to this potential change in the original environmental analysis, the EQ of various components may not consider the thermal environment which could result from superheated steam. Subsequent technical assessments may determine that the existing qualification basis for equipment/components does not envelop the postulated superheat condition. Corrective actions need to be taken to demonstrate that the affected equipment is qualified.

  2. Re-Evaluating the Treatment of Nongonococcal Urethritis: Emphasizing Emerging Pathogens–A Randomized Clinical Trial

    Science.gov (United States)

    Rompalo, A.; Taylor, S.; Seña, A. C.; Martin, D. H.; Lopez, L. M.; Lensing, S.; Lee, J. Y.

    2011-01-01

    Background. Nongonococcal urethritis (NGU) is a common chlamydia-associated syndrome in men; however, Trichomonas vaginalis and Mycoplasma genitalium are associated with its etiology and should be considered in approaches to therapy. We sought to determine whether the addition of tinidazole, an anti-trichomonal agent, to the treatment regimen would result in higher cure rates than those achieved with treatment with doxycycline or azithromycin alone. A secondary aim was to compare the efficacy of doxycycline therapy with that of azithromycin therapy. Methods. Randomized, controlled, double-blinded phase IIB trial of men with NGU. Participants were randomized to receive doxycycline plus or minus tinidazole or azithromycin plus or minus tinidazole and were observed for up to 45 days. Results. The prevalences of Chlamydia trachomatis, M. genitalium, and T. vaginalis were 43%, 31%, and 13%, respectively. No pathogens were identified in 29% of participants. Clinical cure rates at the first follow-up visit were 74.5% (111 of 149 patients) for doxycycline-containing regimens and 68.6% (107 of 156 patients) for azithromycin-containing regimens. By the final visit, cure rates were 49% (73 of 149 patients) for doxycycline-containing regimens and 43.6% (68 of 156 patients) for azithromycin-containing regimens. There were no significant differences in clinical response rates among the treatment arms. However, the chlamydia clearance rate was 94.8% (55 of 58 patients) for the doxycycline arm and 77.4% (41 of 53 patients) for the azithromycin arm (P = .011), and the M. genitalium clearance rate was 30.8% (12 of 39 patients) for the doxycycline arm and 66.7% (30 of 45 patients) for the azithromycin arm (P = .002). Conclusions. Addition of tinidazole to the treatment regimen did not result in higher cure rates but effectively eradicated trichomonas. Clinical cure rates were not significantly different between patients treated with doxycycline and those treated with azithromycin; however, doxycycline had significantly better efficacy against Chlamydia, whereas azithromycin was superior to doxycycline for the treatment of M. genitalium. PMID:21288838

  3. Pandemic of Pregnant Obese Women: Is It Time to Re-Evaluate Antenatal Weight Loss?

    Directory of Open Access Journals (Sweden)

    Anne M. Davis

    2015-08-01

    Full Text Available The obesity pandemic will afflict future generations without successful prevention, intervention and management. Attention to reducing obesity before, during and after pregnancy is essential for mothers and their offspring. Preconception weight loss is difficult given that many pregnancies are unplanned. Interventions aimed at limiting gestational weight gain have produced minimal effects on maternal and infant outcomes. Therefore, increased research to develop evidence-based clinical practice is needed to adequately care for obese pregnant women, especially during antenatal care. This review evaluates the current evidence on various obesity interventions during pregnancy, including weight loss, for safety and efficacy. Recommendations are provided with the end goal being a healthy pregnancy, optimal conditions for breastfeeding, and prevention of the progression of obesity in future generations.

  4. Race, Slavery, and the Re-evaluation of the T'ang Canon

    OpenAIRE

    Rutledge, Gregory E.

    2014-01-01

    In his article "Race, Slavery, and the Revaluation of the T'ang Canon" Gregory E. Rutledge re-evaluates—from the purview of African Diaspora literary studies—historiography that considers the place of East African slave lore in T'ang Dynasty fiction. Julie Wilensky's "The Magical Kunlun and 'Devil Slaves': Chinese Perceptions of Dark-skinned People and Africa before 1500" (2002), a revision of Chang Hsing-lang's "The Importation of Negro Slaves to China Under the T'ang Dynasty (A.D. 618-907)"...

  5. Relapse surveillance in AFP-positive hepatoblastoma: re-evaluating the role of imaging

    Energy Technology Data Exchange (ETDEWEB)

    Rojas, Yesenia; Vasudevan, Sanjeev A.; Nuchtern, Jed G. [Baylor College of Medicine, Pediatric Surgery Division, Michael E. DeBakey Department of Surgery, Texas Children's Hospital, Houston, TX (United States); Guillerman, R.P. [Baylor College of Medicine, Department of Pediatric Radiology, Texas Children's Hospital, Houston, TX (United States); Zhang, Wei [Texas Children's Hospital, Surgical Outcomes Center, Houston, TX (United States); Thompson, Patrick A. [Baylor College of Medicine, Hematology-Oncology Division, Department of Pediatrics, Texas Children's Cancer Center, Texas Children's Hospital, Houston, TX (United States); University of North Carolina, Hematology-Oncology Division, Department of Pediatrics, North Carolina Children's Hospital, Chapel Hill, NC (United States)

    2014-10-15

    Children with hepatoblastoma routinely undergo repetitive surveillance imaging, with CT scans for several years after therapy, increasing the risk of radiation-induced cancer. The purpose of this study was to determine the utility of surveillance CT scans compared to serum alpha-fetoprotein (AFP) levels for the detection of hepatoblastoma relapse. This was a retrospective study of all children diagnosed with AFP-positive hepatoblastoma from 2001 to 2011 at a single institution. Twenty-six children with hepatoblastoma were identified, with a mean age at diagnosis of 2 years 4 months (range 3 months to 11 years). Mean AFP level at diagnosis was 132,732 ng/ml (range 172.8-572,613 ng/ml). Five of the 26 children had hepatoblastoma relapse. A total of 105 imaging exams were performed following completion of therapy; 88 (84%) CT, 8 (8%) MRI, 5 (5%) US and 4 (4%) FDG PET/CT exams. A total of 288 alpha-fetoprotein levels were drawn, with a mean of 11 per child. The AFP level was elevated in all recurrences and no relapses were detected by imaging before AFP elevation. Two false-positive AFP levels and 15 false-positive imaging exams were detected. AFP elevation was found to be significantly more specific than PET/CT and CT imaging at detecting relapse. We recommend using serial serum AFP levels as the preferred method of surveillance in children with AFP-positive hepatoblastoma, reserving imaging for the early postoperative period, for children at high risk of relapse, and for determination of the anatomical site of clinically suspected recurrence. Given the small size of this preliminary study, validation in a larger patient population is warranted. (orig.)

  6. Re-evaluation of SiC permeation coefficients at high temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Yasushi, E-mail: yama3707@kansai-u.ac.jp [Faculty of Engineering Science, Kansai Univ., Yamate-cho, Suita, Osaka 564-8680 (Japan); Murakami, Yuichiro; Yamaguchi, Hirosato; Yamamoto, Takehiro; Yonetsu, Daigo [Faculty of Engineering Science, Kansai Univ., Yamate-cho, Suita, Osaka 564-8680 (Japan); Noborio, Kazuyuki [Hydrogen Isotope Research Center, Univ. of Toyama, Toyama, Toyama 930-8555 (Japan); Konishi, Satoshi [Institute of Advanced Energy, Kyoto Univ., Gokasho, Uji, Kyoto 611-0011 (Japan)

    2016-11-01

    Highlights: • The deuterium permeation coefficients of CVD-SiC at 600–950 °C were evaluated. • The wraparound flow was reduced to less than 1/100th of the permeation flow. • CVD-SiC materials are very effective as hydrogen isotope permeation barriers. - Abstract: Since 2007, our group has studied the deuterium permeation and diffusion coefficients for SiC materials at temperatures above 600 °C as a means of evaluating the tritium inventory and permeation in fusion blankets. During such measurements, control and evaluation of the wraparound flow through the sample holder are important, and so the heated sample holder is enclosed by a glass tube and kept under vacuum during experimental trials. However, detailed studies regarding the required degree of vacuum based on model calculations have shown that the wraparound flow is much larger than expected, and so can affect measurements at high temperatures. We therefore modified the measurement apparatus based on calculations involving reduced pressure in the glass tube, and are now confident that the measurement error is only several percent, even at 950 °C. In this paper, recent experimental results obtained with a chemical vapor deposition (CVD)-SiC sample over the temperature range of 600–950 °C are presented, showing that the permeation coefficient for CVD-SiC is more than three orders of magnitude smaller than that for stainless steel (SS316) at 600 °C, and that at 950 °C, the coefficient for CVD-SiC is almost equal to that for SS316 at 550 °C.
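
    Permeation coefficients of this kind are conventionally reported in Arrhenius form, which is how a cross-temperature comparison such as "CVD-SiC at 950 °C versus SS316 at 550 °C" is read off (the general form only; the paper's fitted values are not reproduced here):

        \Phi(T) = \Phi_0 \exp\!\left(-\frac{E_a}{RT}\right)   % \Phi_0: pre-exponential factor,
                                                              % E_a: activation energy, R: gas constant

    On a log Φ versus 1/T plot each material is then a straight line, and per the abstract the CVD-SiC line lies orders of magnitude below that of stainless steel over this temperature range.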

  7. A re-evaluation of the final step of vanillin biosynthesis in the orchid Vanilla planifolia.

    Science.gov (United States)

    Yang, Hailian; Barros-Rios, Jaime; Kourteva, Galina; Rao, Xiaolan; Chen, Fang; Shen, Hui; Liu, Chenggang; Podstolski, Andrzej; Belanger, Faith; Havkin-Frenkel, Daphna; Dixon, Richard A

    2017-07-01

    A recent publication describes an enzyme from the vanilla orchid Vanilla planifolia with the ability to convert ferulic acid directly to vanillin. The authors propose that this represents the final step in the biosynthesis of vanillin, which is then converted to its storage form, glucovanillin, by glycosylation. The existence of such a "vanillin synthase" could enable biotechnological production of vanillin from ferulic acid using a "natural" vanilla enzyme. The proposed vanillin synthase exhibits high identity to cysteine proteases, and is identical at the protein sequence level to a protein identified in 2003 as being associated with the conversion of 4-coumaric acid to 4-hydroxybenzaldehyde. We here demonstrate that the recombinant cysteine protease-like protein, whether expressed in an in vitro transcription-translation system, E. coli, yeast, or plants, is unable to convert ferulic acid to vanillin. Rather, the protein is a component of an enzyme complex that preferentially converts 4-coumaric acid to 4-hydroxybenzaldehyde, as demonstrated by the purification of this complex and peptide sequencing. Furthermore, RNA sequencing provides evidence that this protein is expressed in many tissues of V. planifolia irrespective of whether or not they produce vanillin. On the basis of our results, V. planifolia does not appear to contain a cysteine protease-like "vanillin synthase" that can, by itself, directly convert ferulic acid to vanillin. The pathway to vanillin in V. planifolia is yet to be conclusively determined. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Re-Evaluation of Geomagnetic Field Observation Data at Syowa Station, Antarctica

    Directory of Open Access Journals (Sweden)

    K Takahashi

    2013-05-01

    Full Text Available The Japanese Antarctic Research Expedition has conducted geomagnetic observations at Syowa Station, Antarctica, since 1966. Geomagnetic variation data measured with a fluxgate magnetometer are not absolute but are relative to a baseline and show drift. To enhance the importance of the geomagnetic data at Syowa Station, therefore, it is necessary to correct the continuous variation data by using absolute baseline values acquired by a magnetic theodolite and proton magnetometer. However, the database of baseline values contains outliers. We detected outliers in the database and then converted the geomagnetic variation data to absolute values by using the reliable baseline values.
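
    A minimal Python sketch of the correction described above: reject outlying baseline observations with a robust rule, then add an interpolated baseline to the relative fluxgate record. The median/MAD criterion and all names are assumptions of this sketch; the paper's actual outlier test is not stated in the abstract.

        import numpy as np

        def clean_baseline(t_obs, b_obs, thresh=3.5):
            # Flag outlying absolute-baseline observations (median/MAD rule).
            med = np.median(b_obs)
            mad = np.median(np.abs(b_obs - med)) + 1e-12
            keep = np.abs(b_obs - med) / (1.4826 * mad) < thresh
            return t_obs[keep], b_obs[keep]

        def to_absolute(t, variation, t_obs, b_obs):
            # Absolute field = relative fluxgate variation + drifting baseline,
            # interpolated between reliable absolute observations
            # (assumes t_obs is sorted in increasing order).
            t_ok, b_ok = clean_baseline(np.asarray(t_obs), np.asarray(b_obs))
            return np.asarray(variation) + np.interp(t, t_ok, b_ok)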

  9. Re-evaluation of 60Co treatment facility of Korle-Bu Teaching Hospital

    International Nuclear Information System (INIS)

    Adu, S.

    2008-06-01

    The radiological protection assessment based on the shielding of the Co-60 radiotherapy facility at the Korle Bu Teaching Hospital after the source replacement has been carried out. The results indicate that the concrete biological shield is adequate to attenuate the gamma photons from the new 222 TBq Co-60 source in use. The dose rates at critical locations of the public access area are within the recommended dose rate limits of 0.5 µSv/h and 7.5 µSv/h for public and staff respectively. Thus the shielding has not deteriorated and still provides adequate protection for members of the public and the operating staff (au).

  10. A re-evaluation of physical protection standards for irradiated HEU fuel

    International Nuclear Information System (INIS)

    Lyman, Edwin; Kuperman, Alan

    2002-01-01

    In the post-September 11 era, it is essential to reconsider all the assumptions upon which the physical protection systems of the past were based and determine whether these assumptions are still appropriate in light of the current terrorist threat. For instance, the U.S. Nuclear Regulatory Commission definition of a 'formula quantity' of special nuclear material is derived from the belief that a terrorist plot to carry out multiple coordinated attacks on different facilities with the goal of acquiring enough SNM for a nuclear weapon is incredible. This assumption has clearly been proven wrong by the September 11 attacks. Another standard that needs to be revisited is the 'self-protection' threshold that determines whether or not an item containing SNM is considered to be 'irradiated' for physical protection purposes. The current value of this threshold, 1 Sv/hr unshielded at 1 meter, is of questionable value as a deterrent to determined terrorists who would be willing to sustain long-term injury as long as they could accomplish their near-term goals. A more credible threshold would be set at a level that would have a high likelihood of disabling the perpetrators before they could complete their mission. Most irradiated nonpower reactor fuels would be unable to meet such a standard. This raises serious questions about the adequacy of the level of physical protection applied today to the large inventories of irradiated HEU fuels now scattered in storage sites around the world. The absence of a coherent global policy for dealing with these materials has created a situation rife with vulnerabilities that terrorists could exploit. The international community, now seized with concern about unused stockpiles of unirradiated HEU fuels around the world, also needs to appreciate the dangers posed by lightly irradiated spent fuels as well. A U.S. proposal to import Russian HEU for supplying U.S. nonpower reactors will only prolong this situation. This paper will review policy options to mitigate this threat. (author)

  11. Gudden's Ventral Tegmental Nucleus Is Vital for Memory: Re-Evaluating Diencephalic Inputs for Amnesia

    Science.gov (United States)

    Vann, Seralynne D.

    2009-01-01

    Mammillary body atrophy is present in a number of neurological conditions and recent clinical findings highlight the importance of these nuclei for memory. While most accounts of diencephalic amnesia emphasize the functional importance of the hippocampal projections to the mammillary bodies, the present study tested the importance of the other…

  12. Re-evaluating Open Source for Sustaining Competitive Advantage for Hosted Applications

    Directory of Open Access Journals (Sweden)

    Daniel Crenna

    2010-03-01

    Full Text Available The use of open source in hosted solutions is undoubtedly widespread. However, it is seldom considered important in its own right, nor do the majority of hosted solutions providers contribute to or create open source as natural artifacts of doing good business. In this exploration of the nature of hosted solutions and their developers, it is suggested that not only consuming open source, but creating and disseminating it to collaborators and competitors alike, is essential to success. By establishing an open source ecosystem where hosted solutions compete on differentiation rather than lose time and money to concerns that are expected by users, do not add value, and benefit from public scrutiny, hosted solution providers can reduce the cost of their solution, the time it takes to deliver new ones, and improve their quality without additional resources.

  13. The importance of laboratory re-evaluation in cases of suspected child abuse - A case report.

    Science.gov (United States)

    Woydt, L; König, C; Bernhard, M K; Nickel, P; Dreßler, J; Ondruschka, B

    2017-09-01

    In order to accurately diagnose child abuse or neglect, a physician needs to be familiar with diseases and medical conditions that can simulate maltreatment. Unrecognized cases of abuse may lead to insufficient child protection, whereas, on the other hand, over-diagnosis could be the cause of various problems for the family and their potentially accused members. Regarding child abuse, numerous cases of false diagnoses with undetected causes of bleeding are described in the scientific literature, but, specifically concerning leukemia in childhood, only very few case reports exist. Here, for the first time, we report a case of a 2-year-old boy who was hospitalized twice because of suspicious injuries and psychosocial conspicuities, in a family situation known for repeated endangerment of the child's well-being. After his first hospitalization with injuries typical for child abuse, but without paraclinical abnormalities, medical inspections were arranged periodically. The child was hospitalized with signs of repeated child abuse again five months later. During the second admission, an acute lymphoblastic leukemia was revealed by intermittent laboratory examination, ordered due to new bruises with changes in morphology, identifiable as petechial hemorrhages. This case adds to the discussion of known cases of leukemia in childhood associated with suspected child abuse in order to provide an overview of possible diseases mimicking maltreatment. To arrange necessary supportive examinations, a skillful interaction between pediatrician and forensic pathologist is crucial in the differentiation between accidental and non-accidental injury. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Re-evaluation of the reactivity of hydroxylamine with O2-/HO2

    International Nuclear Information System (INIS)

    Bielski, B.H.J.; Arudi, R.L.; Cabelli, D.E.; Bors, W.

    1984-01-01

    The reactivity of hydroxylamine with HO2/O2- radicals was studied by pulse radiolysis and stopped-flow photolysis over a pH range of 1.1-10.5. Upper limits for the rate of reaction indicate that hydroxylamine, if it reacts at all, reacts at a very slow rate. Its use as an indicator for O2- and an assay for superoxide dismutase is, therefore, inappropriate. 20 references, 1 table

  15. Biogeography of Mediterranean Hotspot Biodiversity: Re-Evaluating the 'Tertiary Relict' Hypothesis of Macaronesian Laurel Forests.

    Science.gov (United States)

    Kondraskov, Paulina; Schütz, Nicole; Schüßler, Christina; de Sequeira, Miguel Menezes; Guerra, Arnoldo Santos; Caujapé-Castells, Juli; Jaén-Molina, Ruth; Marrero-Rodríguez, Águedo; Koch, Marcus A; Linder, Peter; Kovar-Eder, Johanna; Thiv, Mike

    2015-01-01

    The Macaronesian laurel forests (MLF) are dominated by trees with a laurophyll habit comparable to evergreen humid forests which were scattered across Europe and the Mediterranean in the Paleogene and Neogene. Therefore, MLF are traditionally regarded as an old, 'Tertiary relict' vegetation type. Here we address the question whether key taxa of the MLF are relictual. We evaluated the relict hypothesis consulting fossil data and analyses based on molecular phylogenies of 18 representative species. For molecular dating we used the program BEAST, for ancestral trait reconstructions BayesTraits, and Lagrange to infer ancestral areas. Our molecular dating showed that the origins of four species date back to the Upper Miocene while 14 originated in the Plio-Pleistocene. This coincides with the decline of fossil laurophyllous elements in Europe since the middle Miocene. Ancestral trait and area reconstructions indicate that MLF evolved partly from pre-adapted taxa from the Mediterranean, Macaronesia and the tropics. According to the fossil record laurophyllous taxa existed in Macaronesia since the Plio- and Pleistocene. MLF are composed of species with a heterogeneous origin. The taxa dated to the Pleistocene are likely not 'Tertiary relicts'. Some species may be interpreted as relictual. In this case, the establishment of most species in the Plio-Pleistocene suggests that there was a massive species turnover before this time. Alternatively, MLF were largely newly assembled through global recruitment rather than surviving as relicts of a once more widespread vegetation. This process may have been triggered by the intensification of the trade winds at the end of the Pliocene as indicated by proxy data.

  16. Stories and story telling in first-level language learning: a re-evaluation

    Directory of Open Access Journals (Sweden)

    Robert W. Blair

    2013-02-01

    Full Text Available This paper proposes that in the midst of all our theories on language teaching and language learning, we might have overlooked an age-old tool that has always been at the disposal of mankind: the telling of stories. Attention is drawn to how some have found in stories and story telling a driving force of natural language acquisition, a key that can unlock the intuitive faculties of the mind. A case is made for the re-instalment of stories and associated activities as a means of real, heart-felt functional communication in a foreign language, rather than through a direct assault on the structure of the language itself.

  17. Vike-Freiberga calls on Russia to re-evaluate its history / Aaron Eglitis

    Index Scriptorium Estoniae

    Eglitis, Aaron

    2003-01-01

    At an international conference on Holocaust research, Latvian President Vaira Vike-Freiberga expressed regret that Russia's unwillingness to acknowledge the occupation of Latvia by the Soviet Union in June 1940 creates problems in Latvian-Russian relations.

  18. Re-evaluating the functional landscape of the cardiovascular system during development.

    Science.gov (United States)

    Takada, Norio; Omae, Madoka; Sagawa, Fumihiko; Chi, Neil C; Endo, Satsuki; Kozawa, Satoshi; Sato, Thomas N

    2017-11-15

    The cardiovascular system facilitates body-wide distribution of oxygen, a vital process for the development and survival of virtually all vertebrates. However, the zebrafish, a vertebrate model organism, appears to form organs and survive mid-larval periods without a functional cardiovascular system. Despite such dispensability, it is the first organ to develop. This enigma prompted us to hypothesize additional cardiovascular functions that are important for developmental and/or physiological processes. Hence, systematic cellular ablations and functional perturbations were performed on the zebrafish cardiovascular system to gain comprehensive and body-wide understanding of such functions and to elucidate the underlying mechanisms. This approach identifies a set of organ-specific genes, each implicated for important functions. The study also unveils distinct cardiovascular mechanisms, each differentially regulating their expressions in organ-specific and oxygen-independent manners. Such mechanisms are mediated by organ-vessel interactions, circulation-dependent signals, and circulation-independent beating-heart-derived signals. A comprehensive and body-wide functional landscape of the cardiovascular system reported herein may provide clues as to why it is the first organ to develop. Furthermore, these data could serve as a resource for the study of organ development and function. © 2017. Published by The Company of Biologists Ltd.

  19. Re-evaluation of the life cycle of Eimeria maxima Tyzzer, 1929 in chickens (Gallus domesticus).

    Science.gov (United States)

    Dubey, J P; Jenkins, M C

    2017-12-14

    A time-course study was conducted to resolve discrepancies in the literature and better define aspects of the Eimeria maxima life cycle, such as sites of development and both the morphology and number of asexual stages. Broiler chickens were inoculated orally with five million E. maxima oocysts (APU1), and were necropsied at regular intervals from 12 to 120 h p.i. Small intestine tissue sections and smears were examined for developmental stages. The jejunum contained the highest numbers of developmental stages. At 12 h p.i., sporozoites were observed inside a parasitophorous vacuole (PV) in the epithelial villi and the lamina propria. By 24 h, sporozoites enclosed by a PV were observed in enterocytes of the glands of Lieberkühn. At 48 h p.i., sporozoites, elongated immature and mature schizonts, were all seen in the glands with merozoites budding off from a residual body. By 60 h, second-generation, sausage-shaped schizonts containing up to 12 merozoites were observed around a residual body in the villar tip of invaded enterocytes. At 72 and 96 h, profuse schizogony associated with third- and fourth-generation schizonts was observed throughout the villus. At 120 h, another generation (fifth) of schizonts were seen in villar tips as well as in subepithelium where gamonts and oocysts were also present; a few gamonts were in epithelium. Our finding of maximum parasitization of E. maxima in jejunum is important because this region is critical for nutrient absorption and weight gain.

  20. Re-evaluating Sustainability Assessment: Aligning the vision and the practice

    International Nuclear Information System (INIS)

    Bond, Alan J.; Morrison-Saunders, Angus

    2011-01-01

    Sustainable Development is the core goal of the expanding field of Sustainability Assessment (SA). However, we find that three key areas of debate in relation to SA practice in England and Western Australia can be classified as policy controversies. Through literature review and analysis of documentary evidence we consider the problem of reductionism (breaking down complex processes to simple terms or component parts) as opposed to holism (considering systems as wholes); the issue of contested understandings of the meaning of sustainability (and of the purpose of SA); and the definition of 'inter-generational' in the context of sustainable development and how this is reflected in the timescales considered in SA. We argue that SA practice is based on particular framings of the policy controversies and that the critical role of SA in facilitating deliberation over these controversies needs to be recognised if there is to be a move towards a new deliberative sustainability discourse which can accommodate these different framings.

  1. Re-evaluation of the effectiveness of the central A/M Area recovery well network

    International Nuclear Information System (INIS)

    Haselow, J.S.

    1991-06-01

    A groundwater recovery well network has been operating in the central portion of the A/M Area of the Savannah River Site (SRS) since 1985 to retrieve chlorinated volatile organic solvents. In 1986, a groundwater modeling study was performed to evaluate the effectiveness of the recovery well network that included planned recovery wells (RWM 1 through 11) and process water wells (S. S. Papadopulous, 1986). Since the original modeling study, use of some of the process wells has been discontinued and pumping rates at other wells have changed. Also, the understanding of the hydrologic system in the A/M Area has improved because additional monitoring wells have been installed in the area. As a result, an updated groundwater flow model (Beaudoin et al., 1991) for the area was used to evaluate the effectiveness of the existing recovery network. The results of this study indicate that the estimated effectiveness of the recovery well network has not changed dramatically since the original groundwater modeling study. However, slight differences do exist between the original study and this study because the recent model more accurately reflects the A/M Area subsurface hydrologic system.

  2. Clinical vampirism A presentation of 3 cases and a re-evaluation of ...

    African Journals Online (AJOL)

    From childhood they cut themselves, drank their own, exogenous human or animal blood to relieve a craving, dreamed of bloodshed, associated with the dead, and had a changing identity. They were intelligent, with no family mental or social pathology. Some self-cutters are auto-vampirists; females are not likely to assault ...

  3. [Significance of re-evaluation and development of Chinese herbal drugs].

    Science.gov (United States)

    Gao, Yue; Ma, Zengchun; Zhang, Boli

    2012-01-01

    Research on new herbal drugs involves both the development of new drugs and the re-evaluation of old ones, and it should be grounded in the theory of traditional Chinese medicine (TCM). The current development of famous TCM formulas focuses on the manufacturing process, quality control standards, material basis and clinical research, but systematic management of safety evaluation is deficient, and a dedicated system for the safety assessment of TCM has not been established. This paper discusses the causes of safety problems, safety risks, target organs of toxicity, weak links in safety evaluation, and approaches to safety evaluation. Toxicology research on Chinese herbal drugs should follow the standard of good laboratory practice (GLP), and the characteristics of Chinese herbal drugs need to be fully integrated into safety evaluation. Safety research needs to be integrated throughout the entire process of new drug development, and more attention must be paid to the safety of famous Chinese medicines in the future.

  4. Re-evaluation of in vitro radiosensitivity of human fibroblasts of different genetic origins

    Energy Technology Data Exchange (ETDEWEB)

    Deschavanne, P.J.; Debieu, D.; Malaise, E.P.; Fertil, B.

    1986-08-01

    Statistical analysis of the radiosensitivity of 204 survival curves of non-transformed human fibroblast cell strains of different genetic origins was made using the multi-target one-hit model (characterized by parameters η and D₀), the surviving fraction for a 2 Gy dose (S₂) and the mean inactivation dose (D̄). D̄ is found to be the parameter for characterization of anomalous radiosensitivity linked to a genetic disorder and discrimination between groups of cell strains of differing radiosensitivity. It allows the description of a range of 'normal' radiosensitivity for control fibroblasts and classification of genetic disorders as a function of their mean radiosensitivity expressed in terms of D̄. Nine groups of cell strains appear to exhibit radiosensitivity differing significantly from the controls: seven groups are hypersensitive (ataxia-telangiectasia homozygotes and heterozygotes, Cockayne's syndrome, Gardner's syndrome, 5-oxoprolinuria homozygotes and heterozygotes, Fanconi's anaemia) and two groups are more radioresistant (fibroblasts from retinoblastoma patients and individuals with chromosome 13 anomalies). Since the coupled parameters η and D₀ failed to discriminate between the radiosensitivity of the different genetic groups, the use of D̄ to make an intercomparison of intrinsic radiosensitivity of non-transformed human fibroblasts is recommended. (U.K.).
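
    For reference, the quantities compared in this study have standard definitions in this literature (general forms, not the paper's fitted values):

        S(D) = 1 - \left(1 - e^{-D/D_0}\right)^{\eta}          % multi-target one-hit survival curve
        S_2 = S(2\ \mathrm{Gy})                                % surviving fraction at 2 Gy
        \bar{D} = \int_0^{\infty} S(D)\,\mathrm{d}D            % mean inactivation dose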

  5. Re-evaluation of model-based light-scattering spectroscopy for tissue spectroscopy

    Science.gov (United States)

    Lau, Condon; Šćepanović, Obrad; Mirkovic, Jelena; McGee, Sasha; Yu, Chung-Chieh; Fulghum, Stephen; Wallace, Michael; Tunnell, James; Bechtel, Kate; Feld, Michael

    2009-01-01

    Model-based light scattering spectroscopy (LSS) seemed a promising technique for in-vivo diagnosis of dysplasia in multiple organs. In those studies, the residual spectrum, the difference between the observed and modeled diffuse reflectance spectra, was attributed to single elastic light scattering from epithelial nuclei, and diagnostic information due to nuclear changes was extracted from it. We show that this picture is incorrect. The actual single scattering signal arising from epithelial nuclei is much smaller than the previously computed residual spectrum, and does not have the wavelength dependence characteristic of Mie scattering. Rather, the residual spectrum largely arises from assuming a uniform hemoglobin distribution. In fact, hemoglobin is packaged in blood vessels, which alters the reflectance. When we include vessel packaging, which accounts for an inhomogeneous hemoglobin distribution, in the diffuse reflectance model, the reflectance is modeled more accurately, greatly reducing the amplitude of the residual spectrum. These findings are verified via numerical estimates based on light propagation and Mie theory, tissue phantom experiments, and analysis of published data measured from Barrett’s esophagus. In future studies, vessel packaging should be included in the model of diffuse reflectance and use of model-based LSS should be discontinued. PMID:19405760
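
    One widely used form of the vessel-packaging correction replaces the homogeneous hemoglobin absorption with an effective coefficient; whether this exact parameterization is the one used in the study is not stated in the abstract:

        \mu_a^{\mathrm{eff}}(\lambda) = f_{\mathrm{blood}}\, C(\lambda)\, \mu_{a,\mathrm{bl}}(\lambda),
        \qquad C(\lambda) = \frac{1 - \exp\left(-2\, \mu_{a,\mathrm{bl}}(\lambda)\, R\right)}{2\, \mu_{a,\mathrm{bl}}(\lambda)\, R}

    where f_blood is the blood volume fraction, μ_a,bl the absorption coefficient of whole blood and R the effective vessel radius; C approaches 1 for small vessels (the homogeneous limit) and flattens strong absorption bands as R grows.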

  6. Re-evaluation of a radiation protection cost benefit analysis study in brachytherapy

    International Nuclear Information System (INIS)

    Broek, J.G. van den; Weatherburn, H.

    1994-01-01

    This study investigates changes in the NRPB advice concerning cost benefit analysis over the last 10 years by correcting all figures for inflation and applying them to a particular radiation protection example, a previously published case of the introduction of afterloading brachytherapy equipment at the Christie Hospital, Manchester. It has been shown that for this example NRPB advice at one time led to a large cost benefit, at another time led to a large cost deficit and later still it again gives a large cost benefit. Application of cost benefit analysis to decision making in radiation protection is therefore shown to be in need of further investigation and clarification. (author)

  7. Learning to love what we despise: Experiential re-evaluation of stigmatised technologies

    DEFF Research Database (Denmark)

    Scholderer, Joachim; Grunert, Klaus G.; Søndergaard, Helle Alsted

    , preferences for genetically modified products were found to be mainly dependent on the quality of the product participants had tried. Both effects were robust under contextual variations. Results are discussed in terms of theory and practice, focusing on point-of-sale promotions that could be the key element...

  8. Re-evaluation Of The Shallow Seismicity On Mt Etna Applying Probabilistic Earthquake Location Algorithms.

    Science.gov (United States)

    Tuve, T.; Mostaccio, A.; Langer, H. K.; di Grazia, G.

    2005-12-01

    A recent research project carried out together with the Italian Civil Protection concerns the study of amplitude decay laws in various areas of the Italian territory, including Mt Etna. A particular feature of the seismic activity is the presence of moderate-magnitude earthquakes that frequently cause considerable damage in the epicentre areas. These earthquakes are supposed to occur at rather shallow depth, no more than 5 km. Given the geological context, however, these shallow earthquakes would originate in rather weak sedimentary material. In this study we check the reliability of standard earthquake location, in particular with respect to the calculated focal depth, using standard location methods as well as more advanced approaches such as the NONLINLOC software proposed by Lomax et al. (2000), with its various options (i.e., Grid Search, Metropolis-Gibbs and Oct-Tree) and a 3D velocity model (Cocina et al., 2005). All three options of NONLINLOC gave comparable results with respect to hypocenter locations and quality. Compared to standard locations we note a significant improvement in location quality and, in particular, a considerable difference in focal depths (in the order of 1.5-2 km). However, we cannot find a clear bias towards greater or lower depth. Further analyses concern the assessment of the stability of the locations. For this purpose we carry out various Monte Carlo experiments, randomly perturbing the travel-time readings. Further investigations are devoted to possible biases which may arise from the use of an unsuitable velocity model.
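
    The stability test described at the end of this abstract amounts to: perturb each travel-time reading with random noise, relocate, and examine the scatter of the resulting hypocentres, particularly in depth. A self-contained toy version follows; the homogeneous velocity model and grid-search locator are both simplifications (the study itself used NONLINLOC with a 3D model):

      import numpy as np

      rng = np.random.default_rng(0)
      vp = 5.0  # km/s, homogeneous model (a simplification)
      stations = np.array([[0, 0], [10, 2], [4, 12], [-8, 6], [3, -9]], float)
      true_src = (1.0, 2.0, 4.0)  # x, y, depth in km

      def travel_times(src):
          d = np.sqrt((stations[:, 0] - src[0]) ** 2 +
                      (stations[:, 1] - src[1]) ** 2 + src[2] ** 2)
          return d / vp

      def locate(t_obs):
          # Toy grid search minimizing RMS travel-time misfit.
          best, best_rms = None, np.inf
          for x in np.linspace(-5, 7, 25):
              for y in np.linspace(-4, 8, 25):
                  for z in np.linspace(0.5, 10, 20):
                      rms = np.sqrt(np.mean((travel_times((x, y, z)) - t_obs) ** 2))
                      if rms < best_rms:
                          best, best_rms = (x, y, z), rms
          return best

      t_clean = travel_times(true_src)
      depths = [locate(t_clean + rng.normal(0, 0.05, len(stations)))[2]
                for _ in range(50)]  # 50 Monte Carlo perturbations
      print(f"depth scatter: {np.std(depths):.2f} km (0.05 s pick noise)")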

  9. Aragonian stratigraphy reconsidered, and a re-evaluation of the middle Miocene mammal biochronology in Europe

    NARCIS (Netherlands)

    Daams, R.; Meulen, A.J. van der; Alvarez Sierra, M.A.; Peláez-Campomanes, P.; Krijgsman, W.

    1998-01-01

    The recently collected fauna of Armantes 1A in Chron C5Br of the Armantes section necessitates reinterpretation of the previous bio- and magnetostratigraphical correlations between the Armantes and Vargas sections (Calatayud-Daroca Basin, Central Spain) [W. Krijgsman, M. Garcés, C.G. Langereis, R.

  10. Beyond the ‘dyad’: a qualitative re-evaluation of the changing clinical consultation

    Science.gov (United States)

    Swinglehurst, Deborah; Roberts, Celia; Li, Shuangyu; Weber, Orest; Singy, Pascal

    2014-01-01

    Objective To identify characteristics of consultations that do not conform to the traditionally understood communication ‘dyad’, in order to highlight implications for medical education and develop a reflective ‘toolkit’ for use by medical practitioners and educators in the analysis of consultations. Design A series of interdisciplinary research workshops spanning 12 months explored the social impact of globalisation and computerisation on the clinical consultation, focusing specifically on contemporary challenges to the clinician–patient dyad. Researchers presented detailed case studies of consultations, taken from their recent research projects. Drawing on concepts from applied sociolinguistics, further analysis of selected case studies prompted the identification of key emergent themes. Setting University departments in the UK and Switzerland. Participants Six researchers with backgrounds in medicine, applied linguistics, sociolinguistics and medical education. One workshop was also attended by PhD students conducting research on healthcare interactions. Results The contemporary consultation is characterised by a multiplicity of voices. Incorporation of additional voices in the consultation creates new forms of order (and disorder) in the interaction. The roles ‘clinician’ and ‘patient’ are blurred as they become increasingly distributed between different participants. These new consultation arrangements make new demands on clinicians, which lie beyond the scope of most educational programmes for clinical communication. Conclusions The consultation is changing. Traditional consultation models that assume a ‘dyadic’ consultation do not adequately incorporate the realities of many contemporary consultations. A paradox emerges between the need to manage consultations in a ‘super-diverse’ multilingual society, while also attending to increasing requirements for standardised protocol-driven approaches to care prompted by computer use. The tension between standardisation and flexibility requires addressing in educational contexts. Drawing on concepts from applied sociolinguistics and the findings of these research observations, the authors offer a reflective ‘toolkit’ of questions to ask of the consultation in the context of enquiry-based learning. PMID:25270858

  11. Beyond the 'dyad': a qualitative re-evaluation of the changing clinical consultation.

    Science.gov (United States)

    Swinglehurst, Deborah; Roberts, Celia; Li, Shuangyu; Weber, Orest; Singy, Pascal

    2014-09-29

    To identify characteristics of consultations that do not conform to the traditionally understood communication 'dyad', in order to highlight implications for medical education and develop a reflective 'toolkit' for use by medical practitioners and educators in the analysis of consultations. A series of interdisciplinary research workshops spanning 12 months explored the social impact of globalisation and computerisation on the clinical consultation, focusing specifically on contemporary challenges to the clinician-patient dyad. Researchers presented detailed case studies of consultations, taken from their recent research projects. Drawing on concepts from applied sociolinguistics, further analysis of selected case studies prompted the identification of key emergent themes. University departments in the UK and Switzerland. Six researchers with backgrounds in medicine, applied linguistics, sociolinguistics and medical education. One workshop was also attended by PhD students conducting research on healthcare interactions. The contemporary consultation is characterised by a multiplicity of voices. Incorporation of additional voices in the consultation creates new forms of order (and disorder) in the interaction. The roles 'clinician' and 'patient' are blurred as they become increasingly distributed between different participants. These new consultation arrangements make new demands on clinicians, which lie beyond the scope of most educational programmes for clinical communication. The consultation is changing. Traditional consultation models that assume a 'dyadic' consultation do not adequately incorporate the realities of many contemporary consultations. A paradox emerges between the need to manage consultations in a 'super-diverse' multilingual society, while also attending to increasing requirements for standardised protocol-driven approaches to care prompted by computer use. The tension between standardisation and flexibility requires addressing in educational contexts. Drawing on concepts from applied sociolinguistics and the findings of these research observations, the authors offer a reflective 'toolkit' of questions to ask of the consultation in the context of enquiry-based learning.

  12. 75 FR 55846 - Draft Re-Evaluation for Environmental Impact Statement: Sikorsky Memorial Airport, Stratford, CT

    Science.gov (United States)

    2010-09-14

    ..., Environmental Program Manager, Federal Aviation Administration New England, 12 New England Executive Park... Memorial Airport in Stratford, Connecticut. The document will assist the FAA in determining the suitability... following locations: FAA New England Region, 16 New England Executive Park, Burlington, MA, 781-238-7613...

  13. Natural Selection or Problem Solving. Critical Re-evaluation of Karl Popper's Evolutionism

    Directory of Open Access Journals (Sweden)

    Boldachev Alexander

    2014-09-01

    Full Text Available Among philosophers and the educated public, the name of Sir Karl Popper is usually associated with the critical method, evolutionary epistemology, falsification as a criterion for the demarcation of scientific knowledge, the concept of the third world, and his dislike of dialectics and contradictions. This article aims to show how all these things are connected in the philosopher's evolutionary research and what new concepts he contributed to the study of the mechanisms of evolution. It also attempts to assess Popper's evolutionary views, test them against his own falsification criterion, relate his epistemology to the claims he makes for the theory of the evolution of objective knowledge, and show the contradiction between them.

  14. A revision of Sphaeria pilosa Pers. and re-evaluation of the Trichosphaeriales

    Czech Academy of Sciences Publication Activity Database

    Réblová, Martina; Gams, W.

    2016-01-01

    Vol. 15, No. 6 (2016), pp. 1-8, article no. 52. ISSN 1617-416X R&D Projects: GA ČR GAP506/12/0038 Institutional support: RVO:67985939 Keywords: Chaetosphaeria * nomenclature * Trichosphaeria Subject RIV: EF - Botanics Impact factor: 1.616, year: 2016

  15. Re-evaluation of in vitro radiosensitivity of human fibroblasts of different genetic origins

    International Nuclear Information System (INIS)

    Deschavanne, P.J.; Debieu, D.; Malaise, E.P.; Fertil, B.

    1986-01-01

    Statistical analysis of the radiosensitivity of 204 survival curves of non-transformed human fibroblast cell strains of different genetic origins was made using the multi-target one-hit model (characterized by parameters eta and D0), the surviving fraction for a 2 Gy dose (S2) and the mean inactivation dose (D-bar). D-bar is found to be the parameter for characterization of anomalous radiosensitivity linked to a genetic disorder and discrimination between groups of cell strains of differing radiosensitivity. It allows the description of a range of 'normal' radiosensitivity for control fibroblasts and classification of genetic disorders as a function of their mean radiosensitivity expressed in terms of D-bar. Nine groups of cell strains appear to exhibit radiosensitivity differing significantly from the controls: seven groups are hypersensitive (ataxia-telangiectasia homozygotes and heterozygotes, Cockayne's syndrome, Gardner's syndrome, 5-oxoprolinuria homozygotes and heterozygotes, Fanconi's anaemia) and two groups are more radioresistant (fibroblasts from retinoblastoma patients and individuals with chromosome 13 anomalies). Since the coupled parameters eta and D0 failed to discriminate between the radiosensitivity of the different genetic groups, the use of D-bar to make an intercomparison of intrinsic radiosensitivity of non-transformed human fibroblasts is recommended. (U.K.)

  16. Evaluation and re-evaluation of genetic radiation hazards in man

    International Nuclear Information System (INIS)

    Sankaranarayanan, K.

    1976-01-01

    The arm number hypothesis proposed by Brewen and colleagues in 1973 has been examined in the light of information thus far available from mammalian studies. In experiments with peripheral blood lymphocytes (radiation in vitro), a linear relationship between dicentric yield and the effective chromosome arm number of the species was obtained in the mouse, Chinese hamster, goat, sheep, pig, wallaby and man. However, the data are not consistent with such a relationship in several primate species (marmoset, rhesus monkey, cynomolgus monkey, squirrel monkey and the slow loris), the cat and the dog. In the rabbit, the data are conflicting. In the mouse and the Chinese hamster the frequencies of reciprocal translocations recorded in spermatocytes descended from irradiated spermatogonia are in line with the expectation based on the arm number hypothesis, whereas in the golden hamster, rabbit and the rhesus they are not. In man and the marmoset, the limited data are not inconsistent with a 2-fold higher sensitivity of these species relative to the mouse although they do not rule out a difference as high as 4-fold. In the guinea-pig, the situation is unclear. New data on the transmission of reciprocal translocations in mice suggest that the frequency in the F1 progeny may be close to one-quarter of that recorded in the spermatocytes of the irradiated fathers (spermatogonial irradiation) at an exposure level of 150 R, whereas at higher exposures, the reduction factor is about one-eighth, the latter being in line with the earlier finding. All these results taken together suggest that inter-specific extrapolation from the radiosensitivity of somatic cells (to dicentric induction) to that of germ cells (to translocation induction) is fraught with uncertainty at present. Certain aspects that need to be studied in more detail in the context of induced chromosome aberrations are discussed

  17. Phylogenetic and morphological re-evaluation of the Botryosphaeria species causing diseases of Mangifera indica

    NARCIS (Netherlands)

    Slippers, B.; Johnson, G.I.; Crous, P.W.; Coutinho, T.A.; Wingfield, B.D.; Wingfield, M.J.

    2005-01-01

    Species of Botryosphaeria are among the most serious pathogens that affect mango trees and fruit. Several species occur on mangoes, and these are identified mainly on the morphology of the anamorphs. Common taxa include Dothiorella dominicana, D. mangiferae (= Nattrassia mangiferae), D. aromatica and

  18. Re-evaluation of the socio-economical cost for straw

    International Nuclear Information System (INIS)

    Nikolaisen, L.

    1993-03-01

    The socio-economic price of straw is calculated. The calculated prices are, for 1984, 230 Danish kroner per ton (15.53 kroner per gigajoule); for 1990, 291 kroner per ton (19.40 kroner per gigajoule); and for 1992, 171 kroner per ton (11.81 kroner per gigajoule). This development reflects falling prices for some machines and, in many instances, increased operating capacity. The calculation of storage costs must be seen in relation to the current discussion on acceptable straw qualities. The calculated storage costs are based on the consideration of 6 storage possibilities. The handling methods were priced on the basis of the equipment most commonly used by farmers and lorry drivers in daily operation. An example of this is that transportation calculations are based on the probable use of second-hand renovated lorries, as new ones cannot be afforded. Sliced straw is not included in the calculations. (AB) (16 refs.)
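
    As a consistency check on the quoted figures, the per-gigajoule prices follow from the per-ton prices divided by the heating value of straw; the numbers above imply roughly 14.5-15 GJ per ton, an inferred figure that is not stated in this record:

      # DKK/GJ = (DKK/ton) / (GJ/ton); assume a heating value of 14.8 GJ/ton
      LHV = 14.8
      for year, dkk_per_ton, quoted in [(1984, 230, 15.53),
                                        (1990, 291, 19.40),
                                        (1992, 171, 11.81)]:
          print(year, round(dkk_per_ton / LHV, 2), "DKK/GJ vs quoted", quoted)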

  19. Re-evaluating luminescence burial doses and bleaching of fluvial deposits using Bayesian computational statistics.

    NARCIS (Netherlands)

    Cunningham, A.C.; Wallinga, J.; Versendaal, Alice; Makaske, A.; Middelkoop, H.; Hobo, N.

    2015-01-01

    The optically stimulated luminescence (OSL) signal from fluvial sediment often contains a remnant from the previous deposition cycle, leading to a partially bleached equivalent-dose distribution. Although identification of the burial dose is of primary concern, the degree of bleaching could

  20. Re-evaluating luminescence burial doses and bleaching of fluvial deposits using Bayesian computational statistics

    NARCIS (Netherlands)

    Cunningham, A. C.; Wallinga, J.; Hobo, N.; Versendaal, A. J.; Makaske, B.; Middelkoop, H.

    2015-01-01

    The optically stimulated luminescence (OSL) signal from fluvial sediment often contains a remnant from the previous deposition cycle, leading to a partially bleached equivalent-dose distribution. Although identification of the burial dose is of primary concern, the degree of bleaching could

  1. Re-evaluation of P-T paths across the Himalayan Main Central Thrust

    Science.gov (United States)

    Catlos, E. J.; Harrison, M.; Kelly, E. D.; Ashley, K.; Lovera, O. M.; Etzel, T.; Lizzadro-McPherson, D. J.

    2016-12-01

    The Main Central Thrust (MCT) is the dominant crustal thickening structure in the Himalayas, juxtaposing high-grade Greater Himalayan Crystalline rocks over the lower-grade Lesser Himalaya Formations. The fault is underlain by a 2 to 12-km-thick sequence of deformed rocks characterized by an apparent inverted metamorphic gradient, termed the MCT shear zone. Garnet-bearing rocks sampled from across the MCT along the Marsyandi River in central Nepal contain monazite grains that decrease in age from Early Miocene (ca. 20 Ma) in the hanging wall to Late Miocene-Pliocene (ca. 7 Ma and 3 Ma) towards structurally lower levels in the shear zone. We obtained high-resolution garnet-zoning pressure-temperature (P-T) paths from 11 of the same rocks used for monazite geochronology using a recently-developed semi-automated Gibbs-free-energy-minimization technique. Quartz-in-garnet Raman barometry refined the locations of the paths. Diffusional re-equilibration of garnet zoning in hanging wall samples prevented accurate path determinations from most Greater Himalayan Crystalline samples, but one that shows a bell-shaped Mn zoning profile shows a slight decrease in P (from 8.2 to 7.6 kbar) with increase in T (from 590 to 640°C). Three MCT shear zone samples were modeled: one yields a simple path increasing in both P and T (6 to 7 kbar, 540 to 580°C); the others yield N-shaped paths that occupy similar P-T space (4 to 5.5 kbar, 500 to 560°C). Five lower Lesser Himalaya garnet-bearing rocks were modeled. One yields a path increasing in both P and T (6 to 7 kbar, 525 to 550°C) but others show either sharp compression/decompression or N-shaped paths (within 4.5-6 kbar and 530-580°C). The lowermost sample decreases in P (5.5 to 5 kbar) over increasing T (540 to 580°C). No progressive change is seen from one type of path to another within the Lesser Himalayan Formations to the MCT zone. The results using the modeling approach yield lower P-T conditions compared to the Gibbs method and lower core/rim P-T conditions compared to traditional thermometers and barometers. Inclusion barometry suggests that baric estimates from the modeling may be underestimated by 2-4 kbar. Despite uncertainty, path shapes are consistent with a model in which the MCT shear zone experienced a progressive accretion of footwall slivers.

  2. Forensic geotechniques in the re-evaluation of Ruskin Dam foundation shear strength

    Energy Technology Data Exchange (ETDEWEB)

    Rigbey, S.; Lawrence, M.S. [BC Hydro, Vancouver, BC (Canada); Daw, D. [Hatch Energy, Vancouver, BC (Canada)

    2008-07-01

    The 59 metre high Ruskin Dam was constructed in the 1930s at the south end of Hayward Lake in British Columbia. The concrete gravity dam is founded nearly entirely on rock. Although the dam has performed satisfactorily since its construction, it is categorized as a very high consequence structure based on criteria established in the British Columbia Dam Safety Regulations. It was considered to have insufficient capacity to withstand the Maximum Design Earthquake (MDE). Stability analyses performed in the late 1990s relied on simplified geometry with presumed planar concrete-rock interfaces and relatively conservative estimates of sliding resistance, with no consideration of canyon geometry. The analyses suggested that the concrete base may need to be anchored to the rock foundation to achieve satisfactory seismic performance. The sliding resistance of the dam's foundation had to be assessed in order to determine if remedial measures were needed to meet updated design criteria. A reliable 3-dimensional topographic model of the Ruskin Dam was created in 2006 following a review of construction records and drilling investigation programs. Irregularities were found in the rock-concrete contact, and the canyon walls showed a positive downstream converging geometry. The potential critical failure modes were determined along the contact, along the potential subhorizontal joints within the foundation, and through a broken rock mass under the contact. Roughness for each selected case was evaluated and the Barton-Bandis basic friction angle for the rock was determined by laboratory testing. The resulting shear strengths were used in a series of dynamic stability analyses which revealed that the body of the dam would be stable under the updated design earthquake. The 3-D geotechnical model was the key to the new analyses, which showed that the abutment wedges are stable under seismic loading. As such, costly base anchoring of the dam was deemed unnecessary. 6 refs., 6 tabs., 12 figs.

  3. Ultrasound in gynecological cancer: is it time for re-evaluation of its uses?

    Science.gov (United States)

    Fischerova, Daniela; Cibula, David

    2015-06-01

    Ultrasound is the primary imaging modality in gynecological oncology. Over the last decade, there has been massive technological development which has led to a dramatic improvement in the quality of ultrasound imaging. If performed by an experienced sonographer, ultrasound has an invaluable role in the primary diagnosis of gynecological cancer, in the assessment of tumor extent in the pelvis and abdominal cavity, in the evaluation of the treatment response, and in follow-up. Ultrasound is also a valuable procedure for monitoring patients treated with fertility-sparing surgery. Furthermore, it is an ideal technique to guide tru-cut biopsy for the collection of material for histology. Taking into consideration that, besides its accuracy, ultrasound is a commonly available, non-invasive, and inexpensive imaging method that can be carried out without any risk or discomfort to the patient, it is time to reconsider its role in gynecologic oncology and to allocate resources for a specialized education of future experts in ultrasound imaging in gynecology.

  4. Natural Icing Re-Evaluation of the EH-60A Quick Fix Helicopter

    Science.gov (United States)

    1989-05-01

    [The record text is OCR residue from the report's DD Form 1473 documentation page; the only recoverable citation is to Project No. 88-06, Artificial and Natural Icing Tests of the EH-60A Quick Fix Helicopter, June 1988, and a letter from AVSCOM.]

  5. A critical re-evaluation of multilocus sequence typing (MLST) efforts in Wolbachia.

    Science.gov (United States)

    Bleidorn, Christoph; Gerth, Michael

    2018-01-01

    Wolbachia (Alphaproteobacteria, Rickettsiales) is the most common, and arguably one of the most important, inherited symbionts. Molecular differentiation of Wolbachia strains is routinely performed with a set of five multilocus sequence typing (MLST) markers. However, since its inception in 2006, the performance of MLST in Wolbachia strain typing has not been assessed objectively. Here, we evaluate the properties of Wolbachia MLST markers and compare them to 252 other single copy loci present in the genome of most Wolbachia strains. Specifically, we investigated how well MLST performs at strain differentiation, at reflecting genetic diversity of strains, and as a phylogenetic marker. We find that MLST loci are outperformed by other loci at all tasks they are currently employed for, and thus that they do not reflect the properties of a Wolbachia strain very well. We argue that whole genome typing approaches should be used for Wolbachia typing in the future. Alternatively, if few-locus approaches are necessary, we provide a characterisation of 252 single copy loci for a number of criteria, which may assist in designing specific typing systems or phylogenetic studies.

  6. The Lesser Antillean Ameiva (Sauria, Teiidae) Re-evaluation, zoogeography and the effects of predation

    NARCIS (Netherlands)

    Baskin, Jonathan N.; Williams, Ernest E.

    1966-01-01

    The Ameiva of the Lesser Antilles present an interesting case of isolated populations of related animals on a chain of islands that differ in size and proximity among themselves but form a geographic group. The situation is made still more interesting by the fact that at times in the Pleistocene the

  7. Re-evaluation of schistosomiasis mansoni in Minas Gerais, Brazil. III. "Noroeste de Minas" mesoregion.

    Science.gov (United States)

    Carvalho, O S; Massara, C L; Guerra, H L; Campos, Y R; Caldeira, R L; Chaves, A; Katz, N

    1998-01-01

    This study was conducted to assess the presence of schistosomiasis mansoni in the "Noroeste de Minas" mesoregion, an area considered non-endemic. A malacologic survey and parasitologic stool examinations were undertaken in 13 municipalities of the mesoregion. A sample of 3,283 primary school students was submitted to fecal examination by the Kato-Katz method. A total of 3,627 planorbids was collected and examined. The molluscs were identified as Biomphalaria straminea in seven municipalities (Unaí, Bonfinópolis de Minas, Paracatu, João Pinheiro, Vazante, Lagamar and Lagoa Grande) and as Biomphalaria peregrina in one (Presidente Olegário). All planorbids were negative for Schistosoma mansoni. Four students were diagnosed with schistosomiasis in the municipalities of Buritis, Formoso, Paracatu and Unaí, but none of these cases was considered autochthonous. The data obtained indicate that the "Noroeste de Minas" mesoregion continues to be non-endemic for schistosomiasis mansoni, although the presence of intermediate hosts associated with parasitized individuals emphasizes the need for epidemiological surveillance of schistosomiasis in this mesoregion.

  8. RE-EVALUATION OF SCHISTOSOMIASIS MANSONI IN MINAS GERAIS, BRAZIL. III. "NOROESTE DE MINAS" MESOREGION

    Directory of Open Access Journals (Sweden)

    CARVALHO Omar S.

    1998-01-01

    Full Text Available This study was conducted to assess the presence of schistosomiasis mansoni in the "Noroeste de Minas" mesoregion, an area considered non-endemic. A malacologic survey and parasitologic stool examinations were undertaken in 13 municipalities of the mesoregion. A sample of 3,283 primary school students was submitted to fecal examination by the Kato-Katz method. A total of 3,627 planorbids was collected and examined. The molluscs were identified as Biomphalaria straminea in seven municipalities (Unaí, Bonfinópolis de Minas, Paracatu, João Pinheiro, Vazante, Lagamar and Lagoa Grande) and as Biomphalaria peregrina in one (Presidente Olegário). All planorbids were negative for Schistosoma mansoni. Four students were diagnosed with schistosomiasis in the municipalities of Buritis, Formoso, Paracatu and Unaí, but none of these cases was considered autochthonous. The data obtained indicate that the "Noroeste de Minas" mesoregion continues to be non-endemic for schistosomiasis mansoni, although the presence of intermediate hosts associated with parasitized individuals emphasizes the need for epidemiological surveillance of schistosomiasis in this mesoregion.

  9. Re-Evaluation of Constant versus Varied Punishers Using Empirically Derived Consequences

    Science.gov (United States)

    Toole, Lisa M.; DeLeon, Iser G.; Kahng, Sung Woo; Ruffin, Geri E.; Pletcher, Carrie A.; Bowman, Lynn G.

    2004-01-01

    Charlop, Burgio, Iwata, and Ivancic [J. Appl. Behav. Anal. 21 (1988) 89] demonstrated that varied punishment procedures produced greater or more consistent reductions of problem behavior than a constant punishment procedure. More recently, Fisher and colleagues [Res. Dev. Disabil. 15 (1994) 133; J. Appl. Behav. Anal. 27 (1994) 447] developed a…

  10. Weight loss versus muscle loss: re-evaluating inclusion criteria for future cancer cachexia interventional trials.

    Science.gov (United States)

    Roeland, Eric J; Ma, Joseph D; Nelson, Sandahl H; Seibert, Tyler; Heavey, Sean; Revta, Carolyn; Gallivan, Andrea; Baracos, Vickie E

    2017-02-01

    Participation in cancer cachexia clinical trials requires a defined weight loss (WL) over time. A loss in skeletal muscle mass, measured by cross-sectional computed tomography (CT) image analysis, represents a possible alternative. Our aim was to compare WL versus muscle loss in patients who were screened to participate in a cancer cachexia clinical trial. This was a single-center, retrospective analysis in metastatic colorectal cancer patients screened for an interventional cancer cachexia trial requiring a ≥5 % WL over the preceding 6 months. Concurrent CT images obtained as part of standard oncology care were analyzed for changes in total muscle and fat (visceral, subcutaneous, and total). Of patients screened (n = 36), 3 (8 %) enrolled in the trial, 17 (47 %) were excluded due to insufficient WL (<5 %) or excessive WL (>20 %), and 16 (44 %) met inclusion criteria for WL. Patients who met screening criteria for WL (5-20 %) had a mean ± SD of 7.7 ± 8.7 % muscle loss, 24.4 ± 37.5 % visceral adipose loss, 21.6 ± 22.3 % subcutaneous adipose loss, and 22.1 ± 24.7 % total adipose loss. Patients excluded due to insufficient WL had 2 ± 6.4 % muscle loss, but a gain of 8.5 ± 39.8 % visceral adipose, a 4.2 ± 28.2 % subcutaneous adipose loss, and a 0.8 ± 28.4 % total adipose loss. Some of the patients excluded due to WL <5 % nonetheless had muscle loss >5 %. Defining cancer cachexia by WL over time may be limited as it does not capture skeletal muscle loss. Cross-sectional CT body composition analysis may improve early detection of muscle loss and patient participation in future cancer cachexia clinical trials.
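
    The screening comparison reduces to two percent-change computations over the same interval, one on body weight and one on CT-derived skeletal muscle area. A minimal sketch with hypothetical patient values (all numbers are invented for illustration):

      def pct_loss(before, after):
          return 100.0 * (before - after) / before

      # Hypothetical patient: weight in kg, L3 muscle area in cm^2 from CT.
      wl = pct_loss(72.0, 70.5)      # 2.1 % weight loss
      ml = pct_loss(160.0, 148.0)    # 7.5 % muscle loss
      meets_wl_criterion = 5.0 <= wl <= 20.0   # the trial's WL window
      print(f"WL {wl:.1f}% (eligible: {meets_wl_criterion}), muscle {ml:.1f}%")
      # Excluded by the WL criterion despite substantial muscle loss --
      # exactly the mismatch the authors highlight.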

  11. The mineral treasure that almost got away: Re-evaluating yesterday's mine waste

    Science.gov (United States)

    Högdahl, K.; Jonsson, E.; Troll, V.; Majka, J.

    2012-04-01

    Rare metals and semi-metals such as In, Ga, Se, Te and rare earth elements (REE) are increasing in demand for use in "new" and "green" technology. Yet, before the end of the 20th century the applications and thus the markets for these elements were limited. In many mines, the exploration paradigms and current knowledge as well as contemporary analytical methodology likely resulted in minerals hosting these metals ending up as waste, that is, on the mine dumps. In other cases, they were identified, but considered as mineralogical "exotica". Even extremely well-known and traditionally valuable metals such as gold went undetected on the dumps in some mine fields. This was due to a combination of factors: the deposits were "of the wrong type", assays were expensive, and suitable laboratory capacity was sparse. This implies that in many regions, this old mine waste is a potential resource for several sought-after metals and semi-metals, including the ones increasingly used in modern high-tech applications. Admittedly, many older dumps and dump fields host only minor to moderate total amounts of material, but in today's society - increasingly focused on sustainability and related needs for recycling - this is likely to become an asset. In Sweden, many mine dumps date back hundreds of years or more as mining has been documented to go back at least 1000 years. Before the 20th century, only a single or, at best, a couple of metals were extracted from any given mine. Due to modern developments in analytical techniques, the concentrations of trace elements, including highly sought-after metals and semi-metals, can be obtained at moderate costs today. The presence of variable amounts of precious and rare elements along with the main ore commodity has now been documented in several cases. A recently started project in the classic, Palaeoproterozoic Bergslagen ore province in central Sweden is aimed at resolving the potential for finding and utilising these "unknown treasures". A conservative estimate based on SGU databases is that in this province alone, there are over 6500 mineralisations/deposits. A majority of these have associated mine dumps and, in the case of more recently mined deposits, different types of tailings. Initial results highlight the high average contents of REEs and identify their mineralogical and textural distribution in apatite-iron oxide ore present in both dumps and tailings. In addition, we report the occurrence of previously undetected mineralisation of indium and tungsten in different mine dumps in the western part of the province.

  12. Reproductive outcome re-evaluation for women with primary ovarian insufficiency using office microlaparoscopy

    Directory of Open Access Journals (Sweden)

    Mohamed T. Gad Al Rab

    2015-12-01

    Full Text Available Objective & Aim: The objective of this study was to analyze the usefulness of office microlaparoscopy in the reassessment of ovarian morphology, the relevant clinical types, and the future fertility prognosis of primary ovarian insufficiency (POI). Methods: Forty-five patients with POI, diagnosed in a private fertility care center between October 2009 and December 2014, who gave informed consent and underwent office microlaparoscopy were studied. Pelvic ultrasound had failed to visualize and morphologically assess both ovaries in the women included. The cases were divided into four groups based on the microlaparoscopic ovarian morphology: Group N (near to normal), Group G (gyrus shaped), Group A (atrophied), and Group S (streak shaped). These groups were analyzed with respect to patient background, blood hormone levels, the level of antinuclear antibodies measured, and their individual fertility prognosis. Results: No significant differences in patient background and serum hormone levels were observed between groups. Both ovaries were completely absent in 5 of the patients included. Groups N and G showed some improvement, such as regular spontaneous menstruation and, in one case in Group N, a subsequent pregnancy. Many other internal genital anomalies could be diagnosed during the same office procedure. Conclusion: Office microlaparoscopy under augmented local anesthesia is a useful procedure for the definite demarcation of, and the differentiation between, the types of POI with regard to menstrual regularity and future fertility prognosis.

  13. Re-evaluation of the Haarlem Archaeopteryx and the radiation of maniraptoran theropod dinosaurs.

    Science.gov (United States)

    Foth, Christian; Rauhut, Oliver W M

    2017-12-02

    Archaeopteryx is an iconic fossil that has long been pivotal for our understanding of the origin of birds. Remains of this important taxon have only been found in the Late Jurassic lithographic limestones of Bavaria, Germany. Twelve skeletal specimens are reported so far. Archaeopteryx was long the only pre-Cretaceous paravian theropod known, but recent discoveries from the Tiaojishan Formation, China, yielded a remarkable diversity of this clade, including the possibly oldest and most basal known clade of avialans, here named Anchiornithidae. However, Archaeopteryx remains the only Jurassic paravian theropod based on diagnostic material reported outside China. Re-examination of the incomplete Haarlem Archaeopteryx specimen did not find any diagnostic features of this genus. In contrast, the specimen markedly differs in proportions from other Archaeopteryx specimens and shares two distinct characters with anchiornithids. Phylogenetic analysis confirms it as the first anchiornithid recorded outside the Tiaojishan Formation of China, for which the new generic name Ostromia is proposed here. In combination with a biogeographic analysis of coelurosaurian theropods and palaeogeographic and stratigraphic data, our results indicate an explosive radiation of maniraptoran coelurosaurs probably in isolation in eastern Asia in the late Middle Jurassic and a rapid, at least Laurasian dispersal of the different subclades in the Late Jurassic. Small body size and, possibly, a multiple origin of flight capabilities enhanced dispersal capabilities of paravian theropods and might thus have been crucial for their evolutionary success.

  14. Re-evaluation of Moisture Controls During ARIES Oxide Processing, Packaging and Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Karmiol, Benjamin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wayne, David Matthew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-12-18

    DOE-STD-3013 [1] requires limiting the relative humidity (RH) in the glovebox during processing of the oxide product for specific types of plutonium oxides. This requirement is mandated in order to limit corrosion of the stainless steel containers by deliquescence of chloride salts if present in the PuO2. DOE-STD-3013 also specifies the need to limit and monitor internal pressure buildup in the 3013 containers due to the potential for the generation of free H2 and O2 gas from the radiolysis of surface-adsorbed water. DOE-STD-3013 requires that the oxide sample taken for moisture content verification be representative of the stabilized material in the 3013 container. This is accomplished by either limiting the time between sampling and packaging, or by control of the glovebox relative humidity (%RH). This requirement ensures that the sample is not only representative, but also conservative from the standpoint of moisture content.

  15. Oral agents for ovulation induction:Old drugs revisited and new drugs re-evaluated

    NARCIS (Netherlands)

    Badawy, A.M.M.

    2008-01-01

    The aim of this thesis was to address a number of questions regarding oral agents used for ovulation induction. We were motivated to run the presented trials because of many reasons. Firstly, although oral agents, namely CC, have been in the market for decades, many basic aspects regarding the

  16. Re-Evaluating the Human Curriculum: The Change from Bureaucratic to Professional.

    Science.gov (United States)

    Karr, P. J.

    The relationship between organizational theory and the development of the human curriculum needs further assessment. The author suggests that Management by Objectives (MBO) can further the goals of human curriculum by altering the modes of organizational communication. (Author/DS)

  17. Brazilian Alcohol Program (Proalcool): economic re-evaluation and demand adjustments

    International Nuclear Information System (INIS)

    Motta, R.S. da; Rocha Ferreira, L. da

    1987-01-01

    The aim of this paper is to discuss the economic impact on the Brazilian National Alcohol Programme of changes in the energy scenario, in view of the recent fall of oil prices in the international market, and to evaluate the necessary adjustments to the Programme according to the new Brazilian economic reality. The economic analysis concludes that alcohol production, considering current production capacity and its investments, could be economically feasible at international oil prices near US$ 30.00. Excluding investments, its feasibility would lie between US$ 18.00 and US$ 20.00 per barrel of oil equivalent. Based on these conclusions, proposals for adjusting PROALCOOL are discussed, including alternative pricing, fiscal and credit policies to control alcohol-fuel demand. (author)

  18. Re-evaluating the role of vitamin D in the periodontium.

    Science.gov (United States)

    Stein, S H; Livada, R; Tipton, D A

    2014-10-01

    The importance of vitamin D in maintaining skeletal health via the regulation of calcium has long been recognized as a critical function of this secosteroid. An abundance of literature shows an association between oral bone mineral density and some measure of systemic osteoporosis and suggests that osteoporosis/low bone mass may be a risk factor for periodontal disease. Recently, nonskeletal functions of vitamin D have gained attention for several reasons. Many cells that are not associated with calcium homeostasis have been demonstrated to possess membrane receptors for vitamin D. These include activated T and B lymphocytes, and skin, placenta, pancreas, prostate and colon cancer cells. In addition, vitamin D "insufficiency" is a worldwide epidemic and epidemiologic evidence has linked this condition to multiple chronic health problems, including cardiovascular and autoimmune diseases, hypertension and a variety of cancers. Interestingly, there is mounting evidence connecting diminished serum levels of vitamin D with increased gingival inflammation and supporting the concept of "continual vitamin D sufficiency" in maintaining periodontal health. The ability of vitamin D to regulate both the innate and the adaptive components of the host response may play an important role in this process. This review will examine the skeletal and nonskeletal functions of vitamin D, and explore its potential role in protecting the periodontium as well as in regulating periodontal wound healing.

  19. Seismic re-evaluation of Mochovce nuclear power plant. Seismic reevaluation of civil structures

    International Nuclear Information System (INIS)

    Podrouzek, P.

    1997-01-01

    In this contribution, an overview is presented of the seismic design procedures used for reassessment of the seismic safety of civil structures at the Mochovce NPP in the Slovak Republic. As an introduction, the objectives, history, and current status of seismic design of the NPP are explained. The general philosophy of the design methods, seismic classification of buildings, seismic data, calculation methods, assumptions on structural behavior under seismic loading, and reliability assessment are described in detail in the subsequent section. Examples of calculation models used for dynamic calculations of seismic response are given in the last section. (author)

  20. Rasch analysis of the Knee injury and Osteoarthritis Outcome Score (KOOS): a statistical re-evaluation

    DEFF Research Database (Denmark)

    Comins, J; Brodersen, J; Krogsgaard, M

    2008-01-01

    The knee injury and Osteoarthritis Outcome Score (KOOS), based on the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), is widely used to evaluate subjective outcome in anterior cruciate ligament (ACL) reconstructed patients. However, the validity of KOOS has not been assessed...

  1. Culling and the Common Good: Re-evaluating Harms and Benefits under the One Health Paradigm.

    Science.gov (United States)

    Degeling, Chris; Lederman, Zohar; Rock, Melanie

    2016-11-01

    One Health (OH) is a novel paradigm that recognizes that human and non-human animal health is interlinked through our shared environment. Increasingly prominent in public health responses to zoonoses, OH differs from traditional approaches to animal-borne infectious risks, because it also aims to promote the health of animals and ecological systems. Despite the widespread adoption of OH, culling remains a key component of institutional responses to the risks of zoonoses. Using the threats posed by highly pathogenic avian influenza viruses to human and animal health, economic activity and food security as a case exemplar, we explore whether culling and other standard control measures for animal-borne infectious disease might be justified as part of OH approaches. Our central premise is that OH requires us to reformulate 'health' as a universal good that is best shared across species boundaries, such that human health and well-being are contingent upon identifying and meeting the relevant sets of human and non-human interests and shared dependencies. Our purpose is to further nascent discussions about the ethical dimensions of OH and to begin to describe the principles around which a public health agenda that truly seeks to co-promote human and non-human health could potentially begin to be implemented.

  2. Advanced Parkinson's or "complex phase" Parkinson's disease? Re-evaluation is needed.

    Science.gov (United States)

    Titova, Nataliya; Martinez-Martin, Pablo; Katunina, Elena; Chaudhuri, K Ray

    2017-12-01

    Holistic management of Parkinson's disease, now recognised as a combined motor and nonmotor disorder, remains a key unmet need. Such management requires a relatively accurate definition of the various stages of Parkinson's, from early untreated to late palliative, as each stage calls for personalised therapies. Management also requires robust knowledge of the progression pattern and clinical heterogeneity of the presentation of Parkinson's, which may manifest in a motor-dominant or nonmotor-dominant manner. The "advanced" stages of Parkinson's disease qualify for advanced treatments such as continuous infusion or stereotactic surgery, yet the concept of "advanced Parkinson's disease" (APD) remains controversial in spite of growing knowledge of the natural history of the motor syndrome of PD. Advanced PD is currently largely defined on the basis of consensus opinion and thus with several caveats. Nonmotor aspects of PD may also reflect the advancing course of the disorder, so far not captured by the usual scale-based assessments, which are largely focused on motor symptoms. In this paper, we discuss the problems with current definitions of "advanced" PD and propose the term "complex phase" Parkinson's disease as an alternative, which takes into account a multimodal, symptom- and biomarker-based approach in addition to patient preference.

  3. Re-evaluating the NO 2 hotspot over the South African Highveld

    Directory of Open Access Journals (Sweden)

    Alexandra S.M. Lourens

    2012-10-01

    Full Text Available Globally, numerous pollution hotspots have been identified using satellite-based instruments. One of these hotspots is the prominent NO2 hotspot over the South African Highveld. The tropospheric NO2 column density of this area is comparable to that observed for central and northern Europe, eastern North America and south-east Asia. The most well-known pollution source in this area is a large array of coal-fired power stations. Upon closer inspection, long-term means of satellite observations also show a smaller area, approximately 100 km west of the Highveld hotspot, with a seemingly less substantial NO2 column density. This area correlates with the geographical location of the Johannesburg–Pretoria conurbation or megacity, one of the 40 largest metropolitan areas in the world. Ground-based measurements indicate that NO2 concentrations in the megacity have diurnal peaks in the early morning and late afternoon, which coincide with peak traffic hours and domestic combustion. During these times, NO2 concentrations in the megacity are higher than those in the Highveld hotspot. These diurnal NO2 peaks in the megacity have generally been overlooked by satellite observations because the satellites have fixed local overpass times that do not coincide with these peak periods. Consequently, the importance of NO2 over the megacity has been underestimated. We examined the diurnal cycles of NO2 ground-based measurements for the two areas – the megacity and the Highveld hotspot – and compared them with the satellite-based NO2 observations. Results show that the Highveld hotspot is accompanied by a second hotspot over the megacity, which is of significance for the more than 10 million people living in this megacity.
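
    The sampling problem described here is easy to illustrate: a diurnal cycle with traffic-hour peaks, read off at a single fixed overpass time, misses both maxima. All numbers below are synthetic, chosen only to mimic the pattern described above:

      import numpy as np

      hours = np.linspace(0, 23, 24)
      # Synthetic diurnal NO2 cycle with ~07:00 and ~18:00 peaks (arb. units)
      no2 = (10 + 18 * np.exp(-0.5 * ((hours - 7) / 1.5) ** 2)
                + 15 * np.exp(-0.5 * ((hours - 18) / 2.0) ** 2))

      overpass = 13.5  # an assumed early-afternoon local overpass time
      seen = np.interp(overpass, hours, no2)
      print(f"daily max {no2.max():.0f}, value seen at overpass {seen:.0f}")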

  4. A re-evaluation of isotope screening for skeletal metastases in node ...

    African Journals Online (AJOL)

    clinical T1-2 M0 breast cancer who had skeletal scintigraphy between 1974 and 1987, and who had been ... (11.4%) were suggestive or diagnostic of metastatic disease, with radiological confirmation in 3 (initial ... scintigraphy remains the most common screening test for asymptomatic skeletal metastases, and its superiority ...

  5. Site response of the Ganges basin inferred from re-evaluated ...

    Indian Academy of Sciences (India)

    This subsurface slip stressed the shallower regions of the ... where T is a tension factor and L is the Laplacian operator. ... 'curve-fitting' exercise so that we can analyze the ... Duda S 1965 Secular seismic energy release in circum-Pacific.

  6. Critical Data-Based Re-Evaluation of Minocycline as a Putative Specific Microglia Inhibitor

    NARCIS (Netherlands)

    Moller, Thomas; Bard, Frederique; Bhattacharya, Anindya; Biber, Knut; Campbell, Brian; Dale, Elena; Eder, Claudia; Gan, Li; Garden, Gwenn A.; Hughes, Zoe A.; Pearse, Damien D.; Staal, Roland G. W.; Sayed, Faten A.; Wes, Paul D.; Boddeke, Hendrikus W. G. M.

    2016-01-01

    Minocycline, a second generation broad-spectrum antibiotic, has been frequently postulated to be a "microglia inhibitor." A considerable number of publications have used minocycline as a tool and concluded, after achieving a pharmacological effect, that the effect must be due to "inhibition" of

  7. The effects of family policies in the German Democratic Republic: a re-evaluation.

    Science.gov (United States)

    Monnier, A

    1990-01-01

    The author examines "the impact which various pro-natalist measures adopted since 1976 in the German Democratic Republic have had on women's birth cohorts....A period analysis of subsequent birth and fertility trends would seem to indicate that this policy was remarkably effective. The annual number of births...started to rise rapidly in 1976 and reached a peak of 245,132 births in 1980, an increase of 37%....Furthermore, comparison with the period fertility trend in the Federal Republic of Germany...shows that the gap between the two Germanys has widened since 1977...whereas the trends had been very similar in the two countries before that date....However...other factors should be taken into account: in particular, the number of marriages has fallen steeply during the last few years, and at the same time the number of births outside marriage has soared. These changes, which were in all probability prompted by the adoption of social legislation which favoured single mothers (or fathers)...must be taken into account when assessing the consequences of the new family policy." excerpt

  8. Re-evaluating Palermo: The case of Burmese women as Chinese brides

    Directory of Open Access Journals (Sweden)

    Laura K Hackney

    2015-04-01

    Full Text Available The definition of human trafficking as set out in the Trafficking Protocol (also known as the Palermo Protocol) functionally centres most of the response to the phenomenon in the criminal justice system. This occludes many of the sociopolitical determinants of vulnerability that lead to trafficking. It also discourages any real debate about the various forms of oppression and even structural violence that act as catalysts to the human trafficking market. The Trafficking Protocol, and a vast number of international organisations, non-governmental organisations and governments, focuses on statistics of prosecution rates, arrests, victim typology and organised crime. I use the example of bride trafficking along the Sino-Burmese border to illustrate the complications and, in certain instances, harm that befall an anti-trafficking regime that does not use a wider lens of migration, agency, development and gender equality to address the factors leading to exploitation.

  9. Short Communications A re-evaluation of the taxonomic status of ...

    African Journals Online (AJOL)

    1990-06-05

    Jun 5, 1990 ... X. b. var. lineatus Roux, 1907 was found to differ sufficiently from typical bicolor to allow subspecific status (FitzSimons 1946). In 1915, Werner described Micaela pernasUla, later placed in the genus Xenocalamus by Hewitt (1926) and treated as X. b. pernasutus by FitzSimons (1946; 1962) and Witte &.

  10. Evaluation and re-evaluation of genetic radiation hazards in man

    International Nuclear Information System (INIS)

    Schalet, A.P.; Sankaranarayanan, K.

    1976-01-01

    A detailed presentation is made of the experimental data from the various systems used by Abrahamson to conclude that the per locus per rad (low LET) radiation-induced forward mutation rates in organisms whose DNA content varies by a factor of about 1000, is proportional to genome size. Additional information pertinent in this context is also reviewed. It is emphasized that the mutation rates cited by Abrahamson, although considered as pertaining to mutations at specific loci, actually derive from a broad variety of genetic end-points. It is argued that an initial (if not sufficient) condition for sound interspecific mutation rate comparisons, covering a wide range of organisms and detecting systems of various sensitivities, requires a reasonably consistent biological definition of a specific locus mutation, namely, a transmissible intralocus change. Granting the differences between systems in their resolving power to detect intergenic change, the data cited in this paper do not support the existence of a simple proportionality between radiation-induced intralocus mutation rate and genome size for the different species reviewed here

  11. Re-evaluating the role of bacteria in gerbera vase life

    NARCIS (Netherlands)

    Schouten, Rob E.; Verdonk, Julian C.; Meeteren, van Uulke

    2018-01-01

    The relation between bacteria numbers in vase water and vase life of gerbera cut flowers has recently been challenged because of reported negative effects of bactericidal compounds. This relation is investigated using two types of experiments that do not rely on antimicrobial compounds. The first

  12. Re-evaluation of the prolactin receptor expression in human breast cancer

    DEFF Research Database (Denmark)

    Galsgaard, Elisabeth Douglas; Rasmussen, Birgitte Bruun; Folkesson, Charlotta Grånäs

    2009-01-01

    , we evaluated the specificity of commercially available anti-human PRLR antibodies (B6.2, U5, PRLRi pAb, 1A2B1, 250448 and H-300). The latter three antibodies were found to specifically recognise PRLR. The relative PRLR expression level detected with these antibodies closely reflected the level ... to be sufficient to mediate PRL responsiveness in breast cancer cell lines.

  13. The Mollö Cog Re-Examined and Re-Evaluated

    DEFF Research Database (Denmark)

    Von Arbin, Staffan; Daly, Aoife

    2012-01-01

    As part of a research project on medieval trade and maritime transportation in the former Norwegian province of Bohuslän, western Sweden, a dendrochronological analysis of the so-called Mollö cog was undertaken. The wreck, which was first salvaged in 1980, was previously dated by 14C analysis ...

  14. Seismic re-evaluation of piping systems of heavy water plant, Kota

    CERN Document Server

    Mishra, R; Soni, R S; Venkat-Raj, V

    2002-01-01

    Heavy Water Plant, Kota is the first indigenous heavy water plant built in India. The plant started operation in the year 1985 and is approaching the completion of its originally stipulated design life. In view of the excellent record of plant operation over the past many years, it has been planned to carry out various exercises for the life extension of the plant. In the first stage, evaluation of operating stresses was carried out for the process-critical piping layouts and equipment connected with 25 process-critical nozzle locations, identified based on the past history of plant performance. Fatigue life evaluation has been carried out to find the Cumulative Usage Factor, which helps in arriving at a decision regarding the life extension of the plant. The results of these exercises have already been reported separately vide BARC/2001/E/004. In the second stage, seismic re-evaluation of the plant has been carried out to assess its ability to maintain its integrity in case of a seismic e...

  15. A mitogenomic re-evaluation of the bdelloid phylogeny and relationships among the Syndermata.

    Directory of Open Access Journals (Sweden)

    Erica Lasek-Nesselquist

    Full Text Available Molecular and morphological data regarding the relationships among the three classes of Rotifera (Bdelloidea, Seisonidea, and Monogononta) and the phylum Acanthocephala are inconclusive. In particular, Bdelloidea lacks molecular-based phylogenetic appraisal. I obtained coding sequences from the mitochondrial genomes of twelve bdelloids and two monogononts to explore the molecular phylogeny of Bdelloidea and provide insight into the relationships among lineages of Syndermata (Rotifera + Acanthocephala). With additional sequences taken from previously published mitochondrial genomes, the total dataset included nine species of bdelloids, three species of monogononts, and two species of acanthocephalans. A supermatrix of these 10-12 mitochondrial proteins consistently recovered a bdelloid phylogeny that questions the validity of a generally accepted classification scheme despite different methods of inference and various parameter adjustments. Specifically, results showed that neither the family Philodinidae nor the order Philodinida are monophyletic as currently defined. The application of a similar analytical strategy to assess syndermate relationships recovered either a tree with Bdelloidea and Monogononta as sister taxa (Eurotatoria) or Bdelloidea and Acanthocephala as sister taxa (Lemniscea). Both outgroup choice and method of inference affected the topological outcome, emphasizing the need for sequences from more closely related outgroups and more sophisticated methods of analysis that can account for the complexity of the data.
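
    The "supermatrix" in this analysis is, in essence, a concatenation of per-gene protein alignments, with gaps padded in for taxa that lack a given gene. A minimal sketch of that bookkeeping (gene and taxon names are placeholders, not the study's data):

      from collections import defaultdict

      genes = {
          "cox1": {"bdelloid_A": "MLNRW-", "monogonont_B": "MLNKWY"},
          "nad5": {"bdelloid_A": "IVFSTL"},   # monogonont_B lacks this gene
      }
      taxa = sorted({t for aln in genes.values() for t in aln})
      supermatrix = defaultdict(str)
      for gene, aln in genes.items():
          length = len(next(iter(aln.values())))
          for t in taxa:
              # Pad missing taxa with gap characters of the gene's length.
              supermatrix[t] += aln.get(t, "-" * length)
      for t in taxa:
          print(t, supermatrix[t])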

  16. Re-evaluating the link between brain size and behavioural ecology in primates.

    Science.gov (United States)

    Powell, Lauren E; Isler, Karin; Barton, Robert A

    2017-10-25

    Comparative studies have identified a wide range of behavioural and ecological correlates of relative brain size, with results differing between taxonomic groups, and even within them. In primates for example, recent studies contradict one another over whether social or ecological factors are critical. A basic assumption of such studies is that with sufficiently large samples and appropriate analysis, robust correlations indicative of selection pressures on cognition will emerge. We carried out a comprehensive re-examination of correlates of primate brain size using two large comparative datasets and phylogenetic comparative methods. We found evidence in both datasets for associations between brain size and ecological variables (home range size, diet and activity period), but little evidence for an effect of social group size, a correlation which has previously formed the empirical basis of the Social Brain Hypothesis. However, reflecting divergent results in the literature, our results exhibited instability across datasets, even when they were matched for species composition and predictor variables. We identify several potential empirical and theoretical difficulties underlying this instability and suggest that these issues raise doubts about inferring cognitive selection pressures from behavioural correlates of brain size. © 2017 The Author(s).

  17. The Cascade Mountains revisited: A re-evaluation in light of new lead isotopic data

    International Nuclear Information System (INIS)

    Church, S.E.

    1976-01-01

    Lead isotopic analyses have been repeated using silica gel for several samples from the Cascade Mountains which were previously analyzed by lead sulfide. The improved precision indicates that some of the scatter in the original data was due to thermal fractionation; however, the bulk of the data have not changed significantly. Two-point mixing lines are demonstrated for main cone-satellitic cone pairs from Glacier Peak, Mt. Baker and Mt. Shasta. Comparison with data on oceanic basalts from the Juan de Fuca and Gorda Ridge area indicates that the hypothesis of mixing of mid-ocean ridge (MOR) basalt lead and 'alkali basalt-like' lead from the oceanic crust is not tenable. Lead isotope analyses of pre-Astoria Fan sediments from DSDP Leg 18 sites and from the Eocene Tyee Formation indicate that the sedimentary continental detritus from the North American continent has the correct lead isotopic composition to be the continental component necessary to account for the Cascade Mountains lead isotopic array by mixing with Juan de Fuca-Gorda Ridge MOR basalts. However, from recent work on the structure of oceanic trenches by Karig and Sharman (1975), it does not appear that subduction of sediments is the rule. A model of crustal contamination and/or assimilation at the crust/mantle interface is the preferred explanation for the lead isotopic data from the Cascade Mountains. (Auth.)

  18. A RE-EVALUATION OF THE EVOLVED STARS IN THE GLOBULAR CLUSTER M13

    International Nuclear Information System (INIS)

    Sandquist, Eric L.; Gordon, Mark; Levine, Daniel; Bolte, Michael

    2010-01-01

    We have analyzed photometry from space- and ground-based cameras to identify all bright red giant branch (RGB), horizontal branch (HB), and asymptotic giant branch (AGB) stars within 10' of the center of the globular cluster M13. We identify a modest (7%) population of HB stars redder than the primary peak (including RR Lyrae variables at the blue end of the instability strip) that is somewhat more concentrated to the cluster core than the rest of the evolved stars. We find support for the idea that they are noticeably evolved and in the late stages of depleting helium in their cores. This resolves a disagreement between distance moduli derived from the tip of the RGB and from stars in or near the RR Lyrae instability strip. We identified disagreements between HB model sets on whether stars with T_eff ∼ 22,000 K behave as previously suggested. These stars are brighter than other stars of similar color (either redder or bluer), and may be examples of 'early hot flashers' that ignite core helium fusion shortly after leaving the RGB. We used ultraviolet photometry to identify hot post-HB stars, and based on their numbers (relative to canonical AGB stars) we estimate the position on the HB where the morphology of the post-HB tracks changes to be I ∼ 17.3, between the two peaks in the HB distribution. Concerning the possibility of helium enrichment in M13, we revisited the helium-sensitive R ratio, applying a new method for correcting star counts for the larger lifetimes of hot HB stars. We find that M13's R ratio is in agreement with theoretical values for a primordial helium abundance Y_P = 0.245 and inconsistent with a helium enhancement ΔY = 0.04. The brightness of the HB (both in comparison to the end of the canonical HB and to the tip of the RGB) also appears to rule out the idea that the envelopes of the reddest HB stars have been significantly enriched in helium. The absolute colors of the turnoffs of M3 and M13 potentially may be used to look for differences in their mean helium abundances, but there are inconsistencies in current data sets between colors using different filters that prevent a solid conclusion. The numbers of stars on the lower RGB and in the red giant bump agree very well with recent theoretical models, although there are slight indications of a deficit of red giant stars above the bump. There is not convincing evidence that a large fraction of stars leave the RGB before undergoing a core helium flash.

  19. Re-evaluating the relationships among filtering activity, unnecessary storage, and visual working memory capacity.

    Science.gov (United States)

    Emrich, Stephen M; Busseri, Michael A

    2015-09-01

    The amount of task-irrelevant information encoded in visual working memory (VWM), referred to as unnecessary storage, has been proposed as a potential mechanism underlying individual differences in VWM capacity. In addition, a number of studies have provided evidence for activity originating in the frontal cortex and basal ganglia that initiates the filtering process, a crucial step in the proposed link between unnecessary storage and VWM capacity. Here, we re-examine data from two prominent studies that identified unnecessary storage activity as a predictor of VWM capacity by directly testing the implied path model linking filtering-related activity, unnecessary storage, and VWM capacity. Across both studies, we found that unnecessary storage was not a significant predictor of individual differences in VWM capacity once activity associated with filtering was accounted for; instead, activity associated with filtering better explained variation in VWM capacity. These findings suggest that unnecessary storage is not a limiting factor in VWM performance, whereas neural activity associated with filtering may play a more central role in determining VWM performance that goes beyond preventing unnecessary storage.

  20. A re-evaluation of the phylogeny of Old World treefrogs

    African Journals Online (AJOL)

    1988-05-27

    May 27, 1988 ... Ranidae, while most African treefrogs, excluding the genus Chiromantis, were ... (coded as 1110): Y-shaped terminal phalanx, the distal ends are pointed and ... ting fauna and flora (Stoddart 1984). Although such an early ...

  1. Males Resemble Females: Re-Evaluating Sexual Dimorphism in Protoceratops andrewsi (Neoceratopsia, Protoceratopsidae).

    Directory of Open Access Journals (Sweden)

    Leonardo Maiorino

    Protoceratops andrewsi (Neoceratopsia, Protoceratopsidae) is a well-known dinosaur from the Upper Cretaceous of Mongolia. Some previous workers hypothesized sexual dimorphism in the cranial shape of this taxon, using qualitative and quantitative observations. In particular, the width and height of the frill as well as the development of a nasal horn have been hypothesized as potentially sexually dimorphic. Here, we reassess potential sexual dimorphism in skulls of Protoceratops andrewsi by applying two-dimensional geometric morphometrics to 29 skulls in lateral and dorsal views. Principal Component Analyses and nonparametric MANOVAs recover no clear separation between hypothetical "males" and "females" within the overall morphospace. Males and females thus possess similar overall cranial morphologies. No differences in size between "males" and "females" are recovered using nonparametric ANOVAs. Contrary to what several authors previously proposed, sexual dimorphism within Protoceratops andrewsi is thus not strongly supported by our results. Anatomical traits such as the height and width of the frill, and skull size, may not be sexually dimorphic. Based on PCA for a data set focusing on the rostrum and associated ANOVA results, nasal horn height is the only feature with potential dimorphism. As a whole, most purported dimorphic variation is probably primarily the result of ontogenetic cranial shape changes as well as intraspecific cranial variation independent of sex.

  2. Re-evaluating neptunium in uranyl phases derived from corroded spent fuel

    International Nuclear Information System (INIS)

    Fortner, J. A.; Finch, R. J.; Kropf, A. J.; Cunnane, J. C.; Chemical Engineering

    2004-01-01

    Interest in mechanisms that may control radioelement release from corroded commercial spent nuclear fuel (CSNF) has been heightened by the selection of the Yucca Mountain site in Nevada as the repository for high-level nuclear waste in the United States. Neptunium is an important radionuclide in repository models owing to its relatively long half-life and its high aqueous mobility as neptunyl [Np(V)O2+]. The possibility of neptunium sequestration into uranyl alteration phases produced by corroding CSNF would suggest a process for lowering neptunium concentration and subsequent migration from a geologic repository. However, there remains little experimental evidence that uranyl compounds will, in fact, serve as long-term host phases for the retention of neptunium under conditions expected in a deep geologic repository. To directly explore this possibility, we examined specimens of uranyl alteration phases derived from humid-air-corroded CSNF by X-ray absorption spectroscopy to better determine neptunium uptake in these phases. Although neptunium fluorescence was readily observed from as-received CSNF, it was not observed from the uranyl alteration rind. We establish upper limits for neptunium incorporation into CSNF alteration phases that are significantly below previously reported concentrations obtained by using electron energy loss spectroscopy (EELS). We attribute the discrepancy to a plural-scattering event that creates a spurious EELS peak at the neptunium M_V energy.

  3. Thermal waters along the Konocti Bay fault zone, Lake County, California: a re-evaluation

    Science.gov (United States)

    Thompson, J.M.; Mariner, R.H.; White, L.D.; Presser, T.S.; Evans, William C.

    1992-01-01

    The Konocti Bay fault zone (KBFZ), initially regarded by some as a promising target for liquid-dominated geothermal systems, has been a disappointment. At least five exploratory wells were drilled in the vicinity of the KBFZ, but none were successful. Although the Na-K-Ca and Na-Li geothermometers indicate that the thermal waters discharging in the vicinity of Howard and Seigler Springs may have equilibrated at temperatures greater than 200°C, the spring temperatures and fluid discharges are low. Most thermal waters along the KBFZ contain >100 mg/l Mg. High concentrations of dissolved magnesium are usually indicative of relatively cool hydrothermal systems. Dissolution of serpentine at shallow depths may contribute dissolved silica and magnesium to rising thermal waters. Most thermal waters are saturated with respect to amorphous silica at the measured spring temperature. Silica geothermometers and mixing models are useless because the dissolved silica concentration is not controlled by the solubility of either quartz or chalcedony. Cation geothermometry indicates the possibility of a high-temperature fluid (>200°C) only in the vicinity of Howard and Seigler Springs. However, even if the fluid temperature is as high as that indicated by the geothermometers, the permeability may be low. Deuterium and oxygen-18 values of the thermal waters indicate that they recharged locally and became enriched in oxygen-18 by exchange with rock. Diluting meteoric water and the thermal water appear to have the same deuterium value. The lack of tritium in the diluted spring waters suggests that the diluting water is old. © 1992.

  4. Re-evaluation of Cr concentration in some geostandard rocks by INAA

    International Nuclear Information System (INIS)

    Togashi, Shigeko; Kamioka, Hikari; Tanaka, Tsuyoshi; Ando, Atsushi

    1990-01-01

    Chromium in geological standard igneous rocks is precisely determined with a fully automated, non-destructive neutron activation analysis. Samples are GSJ standard rocks (JP-1, JB-1, JB-1a, JA-3, JGb-1, JB-2, JA-1) and USGS ones (BCR-1 and G-2). Chromium concentration is determined relative to a chemical standard instead of a natural rock standard. Multiple aliquots of a relatively large amount (200-300 mg) of sample powder are analyzed to examine heterogeneity in chromium concentration. The results agree with the consensus values, within the errors of those values, which have large coefficients of variation. The precise analysis and the examination of the distribution of reported values reveal heterogeneity in the chromium concentration of the sample powder. In particular, basaltic samples are heterogeneous in chromium concentration because of small amounts of chromite with extremely high chromium content. A chemical standard, rather than natural standard materials, is useful for achieving high accuracy in chromium determination. (author)

  5. Non-linear transient behavior during soil liquefaction based on re-evaluation of seismic records

    OpenAIRE

    Kamagata, S.; Takewaki, Izuru

    2015-01-01

    Focusing on soil liquefaction, the seismic records during the Niigata-ken earthquake in 1964, the southern Hyogo prefecture earthquake in 1995 and the 2011 off the Pacific coast of Tohoku earthquake are analyzed by the non-stationary Fourier spectra. The shift of dominant frequency in the seismic record of Kawagishi-cho during the Niigata-ken earthquake is evaluated based on the time-variant property of dominant frequencies. The reduction ratio of the soil stiffness is evaluated from the shif...

  6. Filtering and prediction

    CERN Document Server

    Fristedt, B; Krylov, N

    2007-01-01

    Filtering and prediction is about observing moving objects when the observations are corrupted by random errors. The main focus is then on filtering out the errors and extracting from the observations the most precise information about the object, which itself may or may not be moving in a somewhat random fashion. Next comes the prediction step where, using information about the past behavior of the object, one tries to predict its future path. The first three chapters of the book deal with discrete probability spaces, random variables, conditioning, Markov chains, and filtering of discrete Markov chains. The next three chapters deal with the more sophisticated notions of conditioning in nondiscrete situations, filtering of continuous-space Markov chains, and of Wiener process. Filtering and prediction of stationary sequences is discussed in the last two chapters. The authors believe that they have succeeded in presenting necessary ideas in an elementary manner without sacrificing the rigor too much. Such rig...

  7. CMAQ predicted concentration files

    Data.gov (United States)

    U.S. Environmental Protection Agency — CMAQ predicted ozone. This dataset is associated with the following publication: Gantt, B., G. Sarwar, J. Xing, H. Simon, D. Schwede, B. Hutzell, R. Mathur, and A....

  8. Methane prediction in collieries

    CSIR Research Space (South Africa)

    Creedy, DP

    1999-06-01

    The primary aim of the project was to assess the current status of research on methane emission prediction for collieries in South Africa, in comparison with methods used and advances achieved elsewhere in the world.

  9. Climate Prediction Center - Outlooks

    Science.gov (United States)


  10. CMAQ predicted concentration files

    Data.gov (United States)

    U.S. Environmental Protection Agency — model predicted concentrations. This dataset is associated with the following publication: Muñiz-Unamunzaga, M., R. Borge, G. Sarwar, B. Gantt, D. de la Paz, C....

  11. Comparing Spatial Predictions

    KAUST Repository

    Hering, Amanda S.; Genton, Marc G.

    2011-01-01

    Under a general loss function, we develop a hypothesis test to determine whether a significant difference in the spatial predictions produced by two competing models exists on average across the entire spatial domain of interest. The null hypothesis

  12. Genomic prediction using subsampling

    OpenAIRE

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-01-01

    Background: Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in a Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of subsampling bootstrap Markov chain in genomic prediction. Such method consists of fitting whole-genome regression models by subsampling observations in each rou...

  13. Predicting Online Purchasing Behavior

    OpenAIRE

    W.R BUCKINX; D. VAN DEN POEL

    2003-01-01

    This empirical study investigates the contribution of different types of predictors to the purchasing behaviour at an online store. We use logit modelling to predict whether or not a purchase is made during the next visit to the website using both forward and backward variable-selection techniques, as well as Furnival and Wilson’s global score search algorithm to find the best subset of predictors. We contribute to the literature by using variables from four different categories in predicting...

  14. Empirical Flutter Prediction Method.

    Science.gov (United States)

    1988-03-05

    been used in this way to discover species or subspecies of animals, and to discover different types of voter or consumer requiring different persuasions ... respect to behavior or performance or response variables. Once this were done, corresponding clusters might be sought among descriptive or predictive or ... jump in a response. The first sort of usage does not apply to the flutter prediction problem. Here the types of behavior are the different kinds of

  15. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed

    2016-03-10

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.

  16. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed; Alsolami, Fawaz; Chikalov, Igor; Algharbi, Salem; Aboudi, Faisal; Khudiri, Musab

    2016-01-01

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.
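
    The two records above disclose the same approach, which is simple enough to sketch: fit a straight line of hook load against bit depth from historical readings, then flag a current reading that sits well above the predicted normal load. Below is a minimal illustration in Python; the readings and the 10% alert margin are invented for the example (the records do not specify a threshold).

    ```python
    import numpy as np

    def fit_hookload_trend(bit_depths, hook_loads):
        """Least-squares line giving the 'normal' hook load at a given bit depth."""
        slope, intercept = np.polyfit(bit_depths, hook_loads, 1)
        return slope, intercept

    def stuck_pipe_likely(slope, intercept, depth, load, margin=1.10):
        """Flag a reading more than `margin` times the predicted normal load."""
        normal_load = slope * depth + intercept
        return load > margin * normal_load

    # Invented historical readings, then a suspiciously high current load
    depths = np.array([1000.0, 1200.0, 1400.0, 1600.0, 1800.0])  # metres
    loads = np.array([80.0, 88.0, 95.0, 103.0, 110.0])           # tonnes
    m, b = fit_hookload_trend(depths, loads)
    print(stuck_pipe_likely(m, b, depth=2000.0, load=140.0))     # True: well above trend
    ```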

  17. Genomic prediction using subsampling.

    Science.gov (United States)

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-03-24

    Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of subsampling bootstrap Markov chain in genomic prediction. Such method consists of fitting whole-genome regression models by subsampling observations in each round of a Markov Chain Monte Carlo. We evaluated the effect of subsampling bootstrap on prediction and computational parameters. Across datasets, we observed an optimal subsampling proportion of observations around 50% with replacement, and around 33% without replacement. Subsampling provided a substantial decrease in computation time, reducing the time to fit the model by half. On average, losses on predictive properties imposed by subsampling were negligible, usually below 1%. For each dataset, an optimal subsampling point that improves prediction properties was observed, but the improvements were also negligible. Combining subsampling with Gibbs sampling is an interesting ensemble algorithm. The investigation indicates that the subsampling bootstrap Markov chain algorithm substantially reduces computational burden associated with model fitting, and it may slightly enhance prediction properties.
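
    The subsampling idea in this record lends itself to a toy illustration: each round of the chain updates marker effects using only a random ~50% subsample drawn with replacement, and the rounds are averaged. The sketch below uses a closed-form ridge step as a stand-in for the authors' Gibbs update; the data and penalty are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 300, 200
    X = rng.choice([0.0, 1.0, 2.0], size=(n, p))        # toy marker matrix
    y = X @ rng.normal(0.0, 0.1, p) + rng.normal(0.0, 1.0, n)

    lam, n_rounds, share = 100.0, 100, 0.5              # penalty, chain length, subsample share
    beta_sum = np.zeros(p)
    for _ in range(n_rounds):
        idx = rng.choice(n, size=int(share * n), replace=True)  # ~50% with replacement
        Xs, ys = X[idx], y[idx]
        # Ridge solution on the subsample, standing in for one Gibbs round
        beta_sum += np.linalg.solve(Xs.T @ Xs + lam * np.eye(p), Xs.T @ ys)
    beta_hat = beta_sum / n_rounds                      # average over rounds
    print(round(float(np.corrcoef(X @ beta_hat, y)[0, 1]), 3))
    ```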

  18. Deep Visual Attention Prediction

    Science.gov (United States)

    Wang, Wenguan; Shen, Jianbing

    2018-05-01

    In this work, we aim to predict human eye fixation in free-viewing scenes with an end-to-end deep learning architecture. Although Convolutional Neural Networks (CNNs) have brought substantial improvement to human attention prediction, CNN-based attention models still need to leverage multi-scale features more efficiently. Our visual attention network is proposed to capture hierarchical saliency information, from deep, coarse layers with global saliency information to shallow, fine layers with local saliency response. Our model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various receptive fields. Final saliency prediction is achieved via the cooperation of those global and local predictions. Our model is learned in a deep supervision manner, where supervision is fed directly into multi-level layers, instead of the previous approach of providing supervision only at the output layer and propagating it back to earlier layers. Our model thus incorporates multi-level saliency predictions within a single network, which significantly decreases the redundancy of previous approaches that learn multiple network streams with different input scales. Extensive experimental analysis on various challenging benchmark datasets demonstrates that our method yields state-of-the-art performance with competitive inference time.
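
    The skip-layer structure described above can be sketched compactly in PyTorch: side outputs from several convolutional stages are upsampled to input resolution and fused into a single saliency map. The stage sizes and the averaging fusion below are illustrative assumptions, not the authors' architecture.

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SkipLayerSaliency(nn.Module):
        def __init__(self):
            super().__init__()
            self.stages = nn.ModuleList([
                nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
                for c_in, c_out in ((3, 16), (16, 32), (32, 64))
            ])
            # One 1x1 'side output' per stage predicts a coarse saliency map
            self.side = nn.ModuleList([nn.Conv2d(c, 1, 1) for c in (16, 32, 64)])

        def forward(self, x):
            h, w = x.shape[2:]
            feats, preds = x, []
            for stage, side in zip(self.stages, self.side):
                feats = stage(feats)
                preds.append(F.interpolate(side(feats), size=(h, w), mode="bilinear",
                                           align_corners=False))
            # Fuse coarse (global) and fine (local) predictions by averaging
            return torch.sigmoid(torch.stack(preds).mean(0))

    print(SkipLayerSaliency()(torch.randn(1, 3, 64, 64)).shape)  # torch.Size([1, 1, 64, 64])
    ```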

  19. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

    This document lists candidate prediction models for Work Package 3 (WP3) of the PSO-project called "Intelligent wind power prediction systems" (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines the possibilities w.r.t. different numerical weather predictions actually available to the project.

  20. Transionospheric propagation predictions

    Science.gov (United States)

    Klobucher, J. A.; Basu, S.; Basu, S.; Bernhardt, P. A.; Davies, K.; Donatelli, D. E.; Fremouw, E. J.; Goodman, J. M.; Hartmann, G. K.; Leitinger, R.

    1979-01-01

    The current status and future prospects of the capability to make transionospheric propagation predictions are addressed, highlighting the effects of the ionized media, which dominate for frequencies below 1 to 3 GHz, depending upon the state of the ionosphere and the elevation angle through the Earth-space path. The primary concerns are the predictions of time delay of signal modulation (group path delay) and of radio wave scintillation. Progress in these areas is strongly tied to knowledge of variable structures in the ionosphere ranging from the large scale (thousands of kilometers in horizontal extent) to the fine scale (kilometer size). Ionospheric variability and the relative importance of various mechanisms responsible for the time histories observed in total electron content (TEC), proportional to signal group delay, and in irregularity formation are discussed in terms of capability to make both short and long term predictions. The data base upon which predictions are made is examined for its adequacy, and the prospects for prediction improvements by more theoretical studies as well as by increasing the available statistical data base are examined.
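
    Since group path delay is directly proportional to TEC, the magnitude of the effect is easy to check numerically with the standard first-order term, delay = 40.3 * TEC / (c * f^2). The TEC value and frequency below are illustrative assumptions.

    ```python
    C = 299_792_458.0          # speed of light, m/s

    def group_delay_s(tec_el_per_m2: float, freq_hz: float) -> float:
        """First-order ionospheric group delay in seconds."""
        return 40.3 * tec_el_per_m2 / (C * freq_hz ** 2)

    tec = 50 * 1e16            # 50 TEC units, in electrons per m^2
    f_l1 = 1.57542e9           # GPS L1 frequency, Hz
    dt = group_delay_s(tec, f_l1)
    print(f"delay = {dt * 1e9:.1f} ns (= {dt * C:.1f} m of extra range)")
    ```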

  1. Predictable grammatical constructions

    DEFF Research Database (Denmark)

    Lucas, Sandra

    2015-01-01

    My aim in this paper is to provide evidence from diachronic linguistics for the view that some predictable units are entrenched in grammar and consequently in human cognition, in a way that makes them functionally and structurally equal to nonpredictable grammatical units, suggesting that these predictable units should be considered grammatical constructions on a par with the nonpredictable constructions. Frequency has usually been seen as the only possible argument speaking in favor of viewing some formally and semantically fully predictable units as grammatical constructions. However, this paper ... semantically and formally predictable. Despite this difference, [méllo INF], like the other future periphrases, seems to be highly entrenched in the cognition (and grammar) of Early Medieval Greek language users, and consequently a grammatical construction. The syntactic evidence speaking in favor of [méllo ...

  2. Essays on Earnings Predictability

    DEFF Research Database (Denmark)

    Bruun, Mark

    This dissertation addresses the prediction of corporate earnings. The thesis aims to examine whether the degree of precision in earnings forecasts can be increased by basing them on historical financial ratios. Furthermore, the intent of the dissertation is to analyze whether accounting standards ... forecasts are not more accurate than the simpler forecasts based on a historical time series of earnings. Secondly, the dissertation shows how accounting standards affect analysts' earnings predictions. Accounting conservatism contributes to a more volatile earnings process, which lowers the accuracy of analysts' earnings forecasts. Furthermore, the dissertation shows how the stock market's reaction to the disclosure of information about corporate earnings depends on how well corporate earnings can be predicted. The dissertation indicates that the stock market's reaction to the disclosure of earnings ...

  3. Pulverized coal devolatilization prediction

    International Nuclear Information System (INIS)

    Rojas, Andres F; Barraza, Juan M

    2008-01-01

    The aim of this study was to predict the devolatilization of two bituminous coals at a low heating rate (50 °C/min) with the FG-DVC program (Functional Group - Depolymerization, Vaporization and Crosslinking), and to compare the devolatilization profiles predicted by the program with those obtained in a thermogravimetric analyzer. Volatile release at a high heating rate (10^4 K/s) in a drop-tube furnace was also studied. The formation-rate profiles of tar, methane, carbon monoxide and carbon dioxide, and the elemental distribution of hydrogen, oxygen, nitrogen and sulphur in the devolatilization products, were obtained with FG-DVC at the low heating rate, and the volatile release and R factor were calculated at the high heating rate. It was found that the program predicts the devolatilization of bituminous coals at the low heating rate; at the high heating rate, a volatile release of around 30% was obtained.

  4. Predicting Ideological Prejudice.

    Science.gov (United States)

    Brandt, Mark J

    2017-06-01

    A major shortcoming of current models of ideological prejudice is that although they can anticipate the direction of the association between participants' ideology and their prejudice against a range of target groups, they cannot predict the size of this association. I developed and tested models that can make specific size predictions for this association. A quantitative model that used the perceived ideology of the target group as the primary predictor of the ideology-prejudice relationship was developed with a representative sample of Americans (N = 4,940) and tested against models using the perceived status of and choice to belong to the target group as predictors. In four studies (total N = 2,093), ideology-prejudice associations were estimated, and these observed estimates were compared with the models' predictions. The model that was based only on perceived ideology was the most parsimonious with the smallest errors.

  5. Inverse and Predictive Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Syracuse, Ellen Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-27

    The LANL Seismo-Acoustic team has a strong capability in developing data-driven models that accurately predict a variety of observations. These models range from the simple – one-dimensional models that are constrained by a single dataset and can be used for quick and efficient predictions – to the complex – multidimensional models that are constrained by several types of data and result in more accurate predictions. Team members typically build models of geophysical characteristics of Earth and source distributions at scales of 1 to 1000s of km; the techniques used are applicable to other types of physical characteristics at an even greater range of scales. The following cases provide a snapshot of some of the modeling work done by the Seismo-Acoustic team at LANL.

  6. Tide Predictions, California, 2014, NOAA

    Data.gov (United States)

    U.S. Environmental Protection Agency — The predictions from the web based NOAA Tide Predictions are based upon the latest information available as of the date of the user's request. Tide predictions...

  7. Predictive maintenance primer

    International Nuclear Information System (INIS)

    Flude, J.W.; Nicholas, J.R.

    1991-04-01

    This Predictive Maintenance Primer provides utility plant personnel with a single-source reference to predictive maintenance analysis methods and technologies used successfully by utilities and other industries. It is intended to be a ready reference to personnel considering starting, expanding or improving a predictive maintenance program. This Primer includes a discussion of various analysis methods and how they overlap and interrelate. Additionally, eighteen predictive maintenance technologies are discussed in sufficient detail for the user to evaluate the potential of each technology for specific applications. This document is designed to allow inclusion of additional technologies in the future. To gather the information necessary to create this initial Primer the Nuclear Maintenance Applications Center (NMAC) collected experience data from eighteen utilities plus other industry and government sources. NMAC also contacted equipment manufacturers for information pertaining to equipment utilization, maintenance, and technical specifications. The Primer includes a discussion of six methods used by analysts to study predictive maintenance data. These are: trend analysis; pattern recognition; correlation; test against limits or ranges; relative comparison data; and statistical process analysis. Following the analysis methods discussions are detailed descriptions for eighteen technologies analysts have found useful for predictive maintenance programs at power plants and other industrial facilities. Each technology subchapter has a description of the operating principles involved in the technology, a listing of plant equipment where the technology can be applied, and a general description of the monitoring equipment. Additionally, these descriptions include a discussion of results obtained from actual equipment users and preferred analysis techniques to be used on data obtained from the technology. 5 refs., 30 figs

  8. Does EMS Perceived Anatomic Injury Predict Trauma Center Need?

    Science.gov (United States)

    Lerner, E. Brooke; Roberts, Jennifer; Guse, Clare E.; Shah, Manish N.; Swor, Robert; Cushman, Jeremy T.; Blatt, Alan; Jurkovich, Gregory J.; Brasel, Karen

    2013-01-01

    Objective Our objective was to determine the predictive value of the anatomic step of the 2011 Field Triage Decision Scheme for identifying trauma center need. Methods EMS providers caring for injured adults transported to regional trauma centers in 3 midsized communities were interviewed over two years. Patients were included, regardless of injury severity, if they were at least 18 years old and were transported by EMS with a mechanism of injury that was an assault, motor vehicle or motorcycle crash, fall, or pedestrian or bicyclist struck. The interview was conducted upon ED arrival and collected physiologic condition and anatomic injury data. Patients who met the physiologic criteria were excluded. Trauma center need was defined as non-orthopedic surgery within 24 hours, intensive care unit admission, or death prior to hospital discharge. Data were analyzed by calculating descriptive statistics including positive likelihood ratios (+LR) with 95% confidence intervals. Results 11,892 interviews were conducted. One was excluded because of missing outcome data and 1,274 were excluded because they met the physiologic step. EMS providers identified 1,167 cases that met the anatomic criteria, of which 307 (26%) needed the resources of a trauma center (38% sensitivity, 91% specificity, +LR 4.4; CI: 3.9 - 4.9). Criteria with a +LR ≥5 were flail chest (9.0; CI: 4.1 - 19.4), paralysis (6.8; CI: 4.2 - 11.2), two or more long bone fractures (6.3; CI: 4.5 - 8.9), and amputation (6.1; CI: 1.5 - 24.4). Criteria with a +LR >2 and <5 were penetrating injury (4.8; CI: 4.2 - 5.6), and skull fracture (4.8; CI: 3.0 - 7.7). Only pelvic fracture (1.9; CI: 1.3 - 2.9) had a +LR less than 2. Conclusions The anatomic step of the Field Triage Guidelines as determined by EMS providers is a reasonable tool for determining trauma center need. Use of EMS perceived pelvic fracture as an indicator for trauma center need should be re-evaluated. PMID:23627418
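
    The positive likelihood ratios reported here follow directly from 2x2 counts. A quick re-computation of the overall +LR, with counts reconstructed approximately from the abstract (307 true positives among 1,167 positives, 38% sensitivity, 91% specificity) and the usual log-scale variance approximation for the confidence interval:

    ```python
    import math

    def positive_lr(tp, fn, fp, tn):
        """+LR = sensitivity / (1 - specificity), with a 95% CI on the log scale."""
        sens = tp / (tp + fn)
        spec = tn / (fp + tn)
        lr = sens / (1 - spec)
        se = math.sqrt(1/tp - 1/(tp + fn) + 1/fp - 1/(fp + tn))  # SE of ln(LR+)
        half = 1.96 * se
        lo, hi = math.exp(math.log(lr) - half), math.exp(math.log(lr) + half)
        return round(lr, 2), round(lo, 2), round(hi, 2)

    # Approximate counts reconstructed from the abstract's percentages
    print(positive_lr(tp=307, fn=501, fp=860, tn=8949))  # ~(4.3, 3.9, 4.8)
    ```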

  9. Serial assessment of pulmonary lesion volume by computed tomography allows survival prediction in invasive pulmonary aspergillosis

    Energy Technology Data Exchange (ETDEWEB)

    Vehreschild, J.J.; Vehreschild, M.J.G.T. [University Hospital of Cologne, Department I of Internal Medicine, Cologne (Germany); German Centre for Infection Research, Partner Site Bonn-Cologne, Cologne (Germany); Heussel, C.P. [Chest Clinic at University Hospital Heidelberg, Diagnostic and Interventional Radiology with Nuclear Medicine, Heidelberg (Germany); University Hospital of Heidelberg, Department of Diagnostic and Interventional Radiology, Heidelberg (Germany); Translational Lung Research Center Heidelberg (TLRC), Member of the German Center for Lung Research (DZL), Heidelberg (Germany); Groll, A.H. [University Children' s Hospital, Infectious Disease Research Program, Department of Paediatric Haematology/Oncology, Muenster (Germany); Silling, G. [University of Muenster, Department of Medicine A, Haematology/Oncology, Muenster (Germany); Wuerthwein, G. [University Hospital Muenster, Centre for Clinical Trials, ZKS Muenster (Germany); Brecht, M. [Chest Clinic at University Hospital Heidelberg, Diagnostic and Interventional Radiology with Nuclear Medicine, Heidelberg (Germany); Cornely, O.A. [University Hospital of Cologne, Department I of Internal Medicine, Cologne (Germany); University of Cologne, Clinical Trials Center Cologne, ZKS Koeln (BMBF 01KN1106), Cologne (Germany); Center for Integrated Oncology CIO Koeln Bonn, Cologne (Germany); University of Cologne, Cologne Excellence Cluster on Cellular Stress Responses in Aging-Associated Diseases (CECAD), Cologne (Germany)

    2017-08-15

    Serial chest CT is the standard of care to establish treatment success in invasive pulmonary aspergillosis (IPA), but data on how response should be defined are lacking. Digital CT images from a clinical trial on treatment of IPA were re-evaluated and compared with available biomarkers. The total volume of pneumonia was added up after manual measurement of each lesion, followed by statistical analysis. One hundred and ninety CT scans and 309 follow-up datasets from 40 patients were available for analysis. Thirty-one were neutropenic. Baseline galactomannan (OR 4.06, 95%CI: 1.08-15.31) and lesion volume (OR 3.14, 95%CI: 0.73-13.52) were predictive of death. Lesion volume at d7 and the trend between d7 and d14 were strong predictors of death (OR 20.01, 95%CI: 1.42-282.00 and OR 15.97, 95%CI: 1.62-157.32) and of treatment being rated as unsuccessful (OR 4.75, 95%CI: 0.94-24.05 and OR 40.69, 95%CI: 2.55-649.03), which was confirmed by a Cox proportional hazards model using time-dependent covariates. Any increase in CT lesion volume between day 7 and day 14 was a sensitive marker of a lethal outcome (>50%), supporting a CT rescan one and two weeks after initial detection of IPA. The predictive value exceeded that of all other biomarkers. Further CT follow-up after response at day 14 was of low additional value. (orig.)

  10. Predicting tile drainage discharge

    DEFF Research Database (Denmark)

    Iversen, Bo Vangsø; Kjærgaard, Charlotte; Petersen, Rasmus Jes

    used in the analysis. For the dynamic modelling, a simple linear reservoir model was used, in which different outlets in the model represented tile-drain and groundwater discharge outputs. This modelling was based on daily measured tile drain discharge values. The statistical predictive model ... was based on a polynomial regression predicting yearly tile drain discharge values using site-specific parameters such as soil type, catchment topography, etc. as predictors. Values of calibrated model parameters from the dynamic modelling were compared to the same site-specific parameters as used ...
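
    A linear reservoir of the kind mentioned here is a one-line differential equation, dS/dt = recharge - (k_drain + k_gw) * S, with the tile-drain flow taken as the k_drain * S component. The sketch below is a daily-step illustration; the rate constants, initial storage, and recharge series are invented.

    ```python
    import numpy as np

    def linear_reservoir(recharge, k_drain=0.15, k_gw=0.02, s0=10.0):
        """Daily simulation of one storage draining through two outlets (mm)."""
        s, q_drain = s0, []
        for r in recharge:
            s += r                        # add today's recharge
            q_drain.append(k_drain * s)   # tile-drain component
            s -= (k_drain + k_gw) * s     # total outflow leaves the store
        return np.array(q_drain)

    recharge = np.array([0, 12, 5, 0, 0, 20, 3, 0, 0, 0], dtype=float)  # mm/day
    print(linear_reservoir(recharge).round(2))
    ```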

  11. Linguistic Structure Prediction

    CERN Document Server

    Smith, Noah A

    2011-01-01

    A major part of natural language processing now depends on the use of text data to build linguistic analyzers. We consider statistical, computational approaches to modeling linguistic structure. We seek to unify across many approaches and many kinds of linguistic structures. Assuming a basic understanding of natural language processing and/or machine learning, we seek to bridge the gap between the two fields. Approaches to decoding (i.e., carrying out linguistic structure prediction) and supervised and unsupervised learning of models that predict discrete structures as outputs are the focus. W

  12. Predicting Anthracycline Benefit

    DEFF Research Database (Denmark)

    Bartlett, John M S; McConkey, Christopher C; Munro, Alison F

    2015-01-01

    PURPOSE: Evidence supporting the clinical utility of predictive biomarkers of anthracycline activity is weak, with a recent meta-analysis failing to provide strong evidence for either HER2 or TOP2A. Having previously shown that duplication of chromosome 17 pericentromeric alpha satellite as measu...

  13. Prediction of Antibody Epitopes

    DEFF Research Database (Denmark)

    Nielsen, Morten; Marcatili, Paolo

    2015-01-01

    Antibodies recognize their cognate antigens in a precise and effective way. In order to do so, they target regions of the antigenic molecules that have specific features such as large exposed areas, presence of charged or polar atoms, specific secondary structure elements, and lack of similarity to self-proteins. Given the sequence or the structure of a protein of interest, several methods exploit such features to predict the residues that are more likely to be recognized by an immunoglobulin. Here, we present two methods (BepiPred and DiscoTope) to predict linear and discontinuous antibody ...

  14. Basis of predictive mycology.

    Science.gov (United States)

    Dantigny, Philippe; Guilmart, Audrey; Bensoussan, Maurice

    2005-04-15

    For over 20 years, predictive microbiology focused on food-pathogenic bacteria. Few studies concerned modelling fungal development. On one hand, most food mycologists are not familiar with modelling techniques; on the other hand, people involved in modelling are developing tools dedicated to bacteria. Therefore, there is a tendency to extend the use of models that were developed for bacteria to moulds. However, some mould specificities should be taken into account. The use of specific models for predicting germination and growth of fungi was advocated previously. This paper provides a short review of fungal modelling studies.

  15. Dopamine reward prediction error coding

    OpenAIRE

    Schultz, Wolfram

    2016-01-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less...

  16. Predicting Young Adults Binge Drinking in Nightlife Scenes: An Evaluation of the D-ARIANNA Risk Estimation Model.

    Science.gov (United States)

    Crocamo, Cristina; Bartoli, Francesco; Montomoli, Cristina; Carrà, Giuseppe

    2018-05-25

    Binge drinking (BD) among young people has significant public health implications, so there is a need to target the users most at risk. We estimated the discriminative accuracy of an innovative model nested in a recently developed e-Health app (Digital-Alcohol RIsk Alertness Notifying Network for Adolescents and young adults [D-ARIANNA]) for BD in young people, examining its performance in predicting short-term BD episodes. We consecutively recruited young adults in pubs, discos, or live music events. Participants self-administered the app D-ARIANNA, which incorporates an evidence-based risk estimation model for the dependent variable BD. They were re-evaluated after 2 weeks using a single-item BD behavior measure as reference. We estimated D-ARIANNA's discriminative ability through measures of sensitivity and specificity, as well as likelihood ratios. ROC curve analyses were carried out, exploring the variability of discriminative ability across subgroups. The analyses included 507 subjects, of whom 18% reported at least 1 BD episode at follow-up. The majority of these had been identified as at high/moderate or high risk (65%) at induction. Higher scores from the D-ARIANNA risk estimation model reflected an increase in the likelihood of BD. Additional risk factors such as high pocket money availability and alcohol expectancies influence the predictive ability of the model. The D-ARIANNA model showed an appreciable, though modest, predictive ability for subsequent BD episodes. A post-hoc model showed slightly better predictive properties. Using up-to-date technology, D-ARIANNA appears an innovative and promising screening tool for BD among young people. Its long-term impact remains to be established, as does the role of additional social and environmental factors.

  17. Steering smog prediction

    NARCIS (Netherlands)

    R. van Liere (Robert); J.J. van Wijk (Jack)

    1997-01-01

    The use of computational steering for smog prediction is described. This application is representative of many underlying issues found in steering high-performance applications: high computing times, large data sets, and many different input parameters. After a short description of the

  18. Predicting Sustainable Work Behavior

    DEFF Research Database (Denmark)

    Hald, Kim Sundtoft

    2013-01-01

    Sustainable work behavior is an important issue for operations managers – it has implications for most outcomes of OM. This research explores the antecedents of sustainable work behavior. It revisits and extends the sociotechnical model developed by Brown et al. (2000) on predicting safe behavior...

  19. Gate valve performance prediction

    International Nuclear Information System (INIS)

    Harrison, D.H.; Damerell, P.S.; Wang, J.K.; Kalsi, M.S.; Wolfe, K.J.

    1994-01-01

    The Electric Power Research Institute is carrying out a program to improve the performance prediction methods for motor-operated valves. As part of this program, an analytical method to predict the stem thrust required to stroke a gate valve has been developed and has been assessed against data from gate valve tests. The method accounts for the loads applied to the disc by fluid flow and for the detailed mechanical interaction of the stem, disc, guides, and seats. To support development of the method, two separate-effects test programs were carried out. One test program determined friction coefficients for contacts between gate valve parts by using material specimens in controlled environments. The other test program investigated the interaction of the stem, disc, guides, and seat using a special fixture with full-sized gate valve parts. The method has been assessed against flow-loop and in-plant test data. These tests include valve sizes from 3 to 18 in. and cover a considerable range of flow, temperature, and differential pressure. Stem thrust predictions for the method bound measured results. In some cases, the bounding predictions are substantially higher than the stem loads required for valve operation, as a result of the bounding nature of the friction coefficients in the method

  20. Prediction method abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This conference was held December 4-8, 1994 in Asilomar, California. The purpose of this meeting was to provide a forum for the exchange of state-of-the-art information concerning the prediction of protein structure. Attention is focused on the following: comparative modeling; sequence-to-fold assignment; and ab initio folding.

  1. Predicting Intrinsic Motivation

    Science.gov (United States)

    Martens, Rob; Kirschner, Paul A.

    2004-01-01

    Intrinsic motivation can be predicted from participants' perceptions of the social environment and the task environment (Ryan & Deci, 2000) in terms of control, relatedness and competence. To determine the degree of independence of these factors, 251 students in higher vocational education (physiotherapy and hotel management) indicated the…

  2. Predicting visibility of aircraft.

    Directory of Open Access Journals (Sweden)

    Andrew Watson

    Visual detection of aircraft by human observers is an important element of aviation safety. To assess and ensure safety, it would be useful to be able to predict the visibility, to a human observer, of an aircraft of specified size, shape, distance, and coloration. Examples include assuring safe separation among aircraft and between aircraft and unmanned vehicles, design of airport control towers, and efforts to enhance or suppress the visibility of military and rescue vehicles. We have recently developed a simple metric of pattern visibility, the Spatial Standard Observer (SSO). In this report we examine whether the SSO can predict visibility of simulated aircraft images. We constructed a set of aircraft images from three-dimensional computer graphic models, and measured the luminance contrast threshold for each image from three human observers. The data were well predicted by the SSO. Finally, we show how to use the SSO to predict visibility range for aircraft of arbitrary size, shape, distance, and coloration.

  3. Climate Prediction Center

    Science.gov (United States)


  4. Predicting Commissary Store Success

    Science.gov (United States)

    2014-12-01

    stores or if it is possible to predict that success. Multiple studies of private commercial grocery consumer preferences, habits and demographics have ... appropriate number of competitors due to the nature of international cultures and consumer preferences. ... 2. Missing Data: Four of the remaining stores

  5. Predicting Job Satisfaction.

    Science.gov (United States)

    Blai, Boris, Jr.

    Psychological theories about human motivation and accommodation to environment can be used to achieve a better understanding of the human factors that function in the work environment. Maslow's theory of human motivational behavior provided a theoretical framework for an empirically-derived method to predict job satisfaction and explore the…

  6. Ocean Prediction Center

    Science.gov (United States)


  7. Predicting Reasoning from Memory

    Science.gov (United States)

    Heit, Evan; Hayes, Brett K.

    2011-01-01

    In an effort to assess the relations between reasoning and memory, in 8 experiments, the authors examined how well responses on an inductive reasoning task are predicted from responses on a recognition memory task for the same picture stimuli. Across several experimental manipulations, such as varying study time, presentation frequency, and the…

  8. Predicting coronary heart disease

    DEFF Research Database (Denmark)

    Sillesen, Henrik; Fuster, Valentin

    2012-01-01

    Atherosclerosis is the leading cause of death and disabling disease. Whereas risk factors are well known and constitute therapeutic targets, they are not useful for prediction of risk of future myocardial infarction, stroke, or death. Therefore, methods to identify atherosclerosis itself have bee...

  9. ANTHROPOMETRIC PREDICTIVE EQUATIONS FOR ...

    African Journals Online (AJOL)

    Keywords: Anthropometry, Predictive Equations, Percentage Body Fat, Nigerian Women, Bioelectric Impedance ... such as Asians and Indians (Pranav et al., 2009), ... size (n) of at least 30 is adjudged as sufficient for the ... of people, gender and age (Vogel et al., 1984). ... Fish Sold at Ile-Ife Main Market, South West Nigeria.

  10. Predicting Pilot Retention

    Science.gov (United States)

    2012-06-15

    ... over the last 20 years. Airbus predicted that these trends would continue as emerging economies, especially in Asia, were creating a fast growing ... US economy, pay differential and hiring by the major airlines contributed most to the decision to separate from the Air Force (Fullerton, 2003: 354

  11. Predicting ideological prejudice

    NARCIS (Netherlands)

    Brandt, M.J.

    2018-01-01

    A major shortcoming of current models of ideological prejudice is that although they can anticipate the direction of the association between participants’ ideology and their prejudice against a range of target groups, they cannot predict the size of this association. I developed and tested models

  12. Predictive and prognostic value of tumor volume and its changes during radical radiotherapy of stage III non-small cell lung cancer. A systematic review

    International Nuclear Information System (INIS)

    Kaesmann, Lukas; Niyazi, Maximilian; Fleischmann, Daniel; Blanck, Oliver; Baumann, Rene; Baues, Christian; Klook, Lisa; Rosenbrock, Johannes; Trommer-Nestler, Maike; Dobiasch, Sophie; Eze, Chukwuka; Gauer, Tobias; Goy, Yvonne; Giordano, Frank A.; Sautter, Lisa; Hausmann, Jan; Henkenberens, Christoph; Kaul, David; Thieme, Alexander H.; Krug, David; Schmitt, Daniela; Maeurer, Matthias; Panje, Cedric M.; Suess, Christoph; Ziegler, Sonia; Ebert, Nadja; Medenwald, Daniel; Ostheimer, Christian

    2018-01-01

    Lung cancer remains the leading cause of cancer-related mortality worldwide. Stage III non-small cell lung cancer (NSCLC) includes heterogeneous presentations of the disease, including lymph node involvement and large tumour volumes with infiltration of the mediastinum, heart or spine. In the treatment of stage III NSCLC, an interdisciplinary approach including radiotherapy is considered standard of care, with acceptable toxicity and improved clinical outcome concerning local control. Furthermore, gross tumour volume (GTV) changes during definitive radiotherapy would allow for adaptive replanning, which offers normal-tissue sparing and dose escalation. A literature review was conducted to describe the predictive value of GTV changes during definitive radiotherapy, especially focussing on overall survival. The literature search was conducted in a two-step review process using PubMed®/Medline® with the key words "stage III non-small cell lung cancer" and "radiotherapy" and "tumour volume" and "prognostic factors". After final consideration, 17, 14 and 9 studies, with a total of 2516, 784 and 639 patients respectively, on the predictive impact of GTV, of GTV changes, and of their impact on overall survival in definitive radiotherapy for stage III NSCLC were included in this review. Initial GTV is an important prognostic factor for overall survival in several studies, but the time of evaluation and the value of histology need to be further investigated. GTV changes during RT differ widely; the optimal timing for re-evaluation of GTV and the predictive value of such changes for prognosis need to be clarified. The prognostic value of GTV changes is unclear due to varying study quality, re-evaluation times and conflicting results. The main finding was that the clinical impact of GTV changes during definitive radiotherapy is still unclear due to heterogeneous study designs of varying quality.

  13. Dopamine reward prediction error coding.

    Science.gov (United States)

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
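
    The rule summarized here has a standard textbook form: delta = received reward - predicted value, with the prediction nudged toward the outcome by a learning rate. A minimal sketch (learning rate and reward values are arbitrary):

    ```python
    def update_value(V, reward, alpha=0.1):
        delta = reward - V           # reward prediction error
        return V + alpha * delta, delta

    V = 0.0
    for trial in range(5):           # repeated rewards: delta shrinks toward zero
        V, delta = update_value(V, reward=1.0)
        print(f"trial {trial}: V={V:.3f}, delta={delta:+.3f}")

    # After learning, omitting the reward yields a negative prediction error
    V, delta = update_value(V, reward=0.0)
    print(f"omission: delta={delta:+.3f}")
    ```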

  14. Urban pluvial flood prediction

    DEFF Research Database (Denmark)

    Thorndahl, Søren Liedtke; Nielsen, Jesper Ellerbæk; Jensen, David Getreuer

    2016-01-01

    Flooding produced by high-intensity local rainfall and drainage system capacity exceedance can have severe impacts in cities. In order to prepare cities for these types of flood events, especially in the future climate, it is valuable to be able to simulate these events numerically, both historically and in real-time. There is a rather untested potential in real-time prediction of urban floods. In this paper radar data observations with different spatial and temporal resolution, radar nowcasts of 0-2 h lead time, and numerical weather models with lead times up to 24 h are used as inputs to an integrated flood and drainage systems model in order to investigate the relative difference between different inputs in predicting future floods. The system is tested on the small town of Lystrup in Denmark, which was flooded in 2012 and 2014. Results show it is possible to generate detailed flood maps...

  15. Predicting Bankruptcy in Pakistan

    Directory of Open Access Journals (Sweden)

    Abdul RASHID

    2011-09-01

    This paper aims to identify the financial ratios that are most significant in bankruptcy prediction for the non-financial sector of Pakistan, based on a sample of companies that became bankrupt over the period 1996-2006. Twenty-four financial ratios covering four important financial attributes, namely profitability, liquidity, leverage, and turnover, were examined for a five-year period prior to bankruptcy. The discriminant analysis produced a parsimonious model of three variables: sales to total assets, EBIT to current liabilities, and the cash flow ratio. Our estimates provide evidence that firms with a Z-value below zero fall into the "bankrupt" category, whereas firms with a Z-value above zero fall into the "non-bankrupt" category. The model achieved 76.9% prediction accuracy when applied to forecast bankruptcies in the underlying sample.
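
    As a toy illustration of the sign-based classification described above (the discriminant coefficients below are invented; the paper's fitted weights are not reproduced here):

        # Hypothetical Z-score classifier using the paper's three ratios.
        # Coefficients are made up for illustration only.

        def z_score(sales_to_assets, ebit_to_curr_liab, cash_flow_ratio):
            return 1.2 * sales_to_assets + 0.8 * ebit_to_curr_liab + 0.5 * cash_flow_ratio - 1.0

        z = z_score(0.9, 0.4, 0.2)
        print("non-bankrupt" if z > 0 else "bankrupt", round(z, 3))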

  16. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Jørgensen, Claus Bjørn; Suetens, Sigrid; Tyran, Jean-Robert

    We investigate the “law of small numbers” using a unique panel data set on lotto gambling. Because we can track individual players over time, we can measure how they react to outcomes of recent lotto drawings. We can therefore test whether they behave as if they believe they can predict lotto numbers based on recent drawings. While most players pick the same set of numbers week after week without regard to the numbers drawn or anything else, we find that those who do change act on average in the way predicted by the law of small numbers as formalized in recent behavioral theory. In particular, on average they move away from numbers that have recently been drawn, as suggested by the “gambler’s fallacy”, and move toward numbers that are on streak, i.e. have been drawn several weeks in a row, consistent with the “hot hand fallacy”.

  17. Comparing Spatial Predictions

    KAUST Repository

    Hering, Amanda S.

    2011-11-01

    Under a general loss function, we develop a hypothesis test to determine whether a significant difference in the spatial predictions produced by two competing models exists on average across the entire spatial domain of interest. The null hypothesis is that of no difference, and a spatial loss differential is created based on the observed data, the two sets of predictions, and the loss function chosen by the researcher. The test assumes only isotropy and short-range spatial dependence of the loss differential but does allow it to be non-Gaussian, non-zero-mean, and spatially correlated. Constant and nonconstant spatial trends in the loss differential are treated in two separate cases. Monte Carlo simulations illustrate the size and power properties of this test, and an example based on daily average wind speeds in Oklahoma is used for illustration. Supplemental results are available online.
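
    The flavour of such a loss-differential test can be sketched in a few lines. The version below is a naive, independence-assuming analogue on synthetic data; the paper's test additionally accounts for spatial correlation in the differential.

        # Naive loss-differential (Diebold-Mariano-style) test sketch; synthetic data.
        import numpy as np

        rng = np.random.default_rng(0)
        observed = rng.normal(size=500)
        pred_a = observed + rng.normal(scale=0.5, size=500)   # predictions from model A
        pred_b = observed + rng.normal(scale=0.6, size=500)   # predictions from model B

        d = (observed - pred_a) ** 2 - (observed - pred_b) ** 2   # squared-error loss differential
        z = d.mean() / (d.std(ddof=1) / np.sqrt(d.size))          # z statistic for zero mean
        print(f"mean differential {d.mean():.4f}, z = {z:.2f}")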

  18. Chaos detection and predictability

    CERN Document Server

    Gottwald, Georg; Laskar, Jacques

    2016-01-01

    Distinguishing chaoticity from regularity in deterministic dynamical systems and specifying the subspace of the phase space in which instabilities are expected to occur is of utmost importance in areas as disparate as astronomy, particle physics and climate dynamics. To address these issues there exists a plethora of methods for chaos detection and predictability. The most commonly employed technique for investigating chaotic dynamics, i.e. the computation of Lyapunov exponents, however, may suffer from a number of problems and drawbacks, for example when applied to noisy experimental data. In the last two decades, several novel methods have been developed for the fast and reliable determination of the regular or chaotic nature of orbits, aimed at overcoming the shortcomings of more traditional techniques. This set of lecture notes and tutorial reviews serves as an introduction to and overview of modern chaos detection and predictability techniques for graduate students and non-specialists. The book cover...
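
    The Lyapunov-exponent computation referred to above is easy to demonstrate on a one-dimensional map. The sketch below estimates the exponent of the logistic map by averaging log|f'(x)| along an orbit; for r = 4 the exact value is ln 2 ≈ 0.693. Illustrative only, not code from the book.

        # Largest Lyapunov exponent of the logistic map x -> r x (1 - x).
        import math

        r, x = 4.0, 0.2
        total, n_iter = 0.0, 100_000
        for _ in range(n_iter):
            total += math.log(abs(r * (1.0 - 2.0 * x)))   # log |f'(x)|
            x = r * x * (1.0 - x)
        print(f"estimated Lyapunov exponent: {total / n_iter:.4f}")   # ~0.6931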

  19. Time-predictable architectures

    CERN Document Server

    Rochange, Christine; Uhrig, Sascha

    2014-01-01

    Building computers that can be used to design embedded real-time systems is the subject of this title. Real-time embedded software requires increasingly high performance. The authors therefore consider processors that implement advanced mechanisms such as pipelining, out-of-order execution, branch prediction, cache memories, multi-threading, multicore architectures, etc. The authors of this book investigate the time predictability of such schemes.

  20. Cultural Resource Predictive Modeling

    Science.gov (United States)

    2017-10-01

    Acronyms used: CR, cultural resource; CRM, cultural resource management; CRPM, Cultural Resource Predictive Modeling; DoD, Department of Defense; ESTCP, Environmental... To meet cultural resource management (CRM) legal obligations under NEPA and the NHPA, military installations need to demonstrate that CRM decisions are based on objective... the maxim “one size does not fit all,” and demonstrate that DoD installations have many different CRM needs that can and should be met through a variety...

  1. Predictive Game Theory

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game; there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy one must use a loss function (external to the game's players). This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  2. Predicting appointment breaking.

    Science.gov (United States)

    Bean, A G; Talaga, J

    1995-01-01

    The goal of physician referral services is to schedule appointments, but if too many patients fail to show up, the value of the service will be compromised. The authors found that appointment breaking can be predicted by the number of days to the scheduled appointment, the doctor's specialty, and the patient's age and gender. They also offer specific suggestions for modifying the marketing mix to reduce the incidence of no-shows.
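
    A standard way to operationalize such predictors is a logistic regression; the sketch below does this on synthetic data with hypothetical coefficients (the authors' actual model and estimates are not reproduced here).

        # Logistic no-show model on synthetic data; predictors follow the abstract.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 400
        lead_days = rng.integers(1, 60, n).astype(float)   # days until the appointment
        age = rng.integers(18, 90, n).astype(float)
        female = rng.integers(0, 2, n).astype(float)
        dermatology = rng.integers(0, 2, n).astype(float)  # one specialty dummy of several

        logit = -2.0 + 0.04 * lead_days - 0.01 * age + 0.2 * female + 0.3 * dermatology
        no_show = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

        X = sm.add_constant(np.column_stack([lead_days, age, female, dermatology]))
        print(sm.Logit(no_show, X).fit(disp=0).params)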

  3. Adjusting estimative prediction limits

    OpenAIRE

    Masao Ueki; Kaoru Fueda

    2007-01-01

    This note presents a direct adjustment of the estimative prediction limit to reduce the coverage error from a target value to third-order accuracy. The adjustment is asymptotically equivalent to those of Barndorff-Nielsen & Cox (1994, 1996) and Vidoni (1998). It has a simpler form with a plug-in estimator of the coverage probability of the estimative limit at the target value.

  4. Space Weather Prediction

    Science.gov (United States)

    2014-10-31

    ...prominence eruptions and the ensuing coronal mass ejections. The ProMag is a spectro-polarimeter, consisting of a dual-beam polarization modulation unit... feeding a visible camera and an infrared camera. The instrument is designed to measure magnetic fields in solar prominences by simultaneous spectro... as a result of coronal hole regions, we expect to improve UV predictions by incorporating an estimate of the Earth-side coronal hole regions.

  5. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty
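
    One generic pretest method of the kind such reports describe is root-sum-square combination of independent elemental errors; the numbers below are hypothetical, and the report's exact procedure may differ.

        # Root-sum-square (RSS) combination of elemental instrument errors.
        import math

        elemental = {"calibration": 0.5, "linearity": 0.2, "drift": 0.3, "readout": 0.1}  # % of reading
        combined = math.sqrt(sum(e * e for e in elemental.values()))
        print(f"combined standard uncertainty: {combined:.2f} % of reading")
        print(f"expanded (k = 2, ~95 %): {2 * combined:.2f} % of reading")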

  6. Predictive Systems Toxicology

    KAUST Repository

    Kiani, Narsis A.; Shang, Ming-Mei; Zenil, Hector; Tegner, Jesper

    2018-01-01

    In this review we address to what extent computational techniques can augment our ability to predict toxicity. The first section provides a brief history of empirical observations on toxicity dating back to the dawn of Sumerian civilization. Interestingly, the concept of dose emerged very early on, leading up to the modern emphasis on kinetic properties, which in turn encodes the insight that toxicity is not solely a property of a compound but instead depends on the interaction with the host organism. The next logical step is the current conception of evaluating drugs from a personalized medicine point-of-view. We review recent work on integrating what could be referred to as classical pharmacokinetic analysis with emerging systems biology approaches incorporating multiple omics data. These systems approaches employ advanced statistical analytical data processing complemented with machine learning techniques and use both pharmacokinetic and omics data. We find that such integrated approaches not only provide improved predictions of toxicity but also enable mechanistic interpretations of the molecular mechanisms underpinning toxicity and drug resistance. We conclude the chapter by discussing some of the main challenges, such as how to balance the inherent tension between the predictive capacity of models, which in practice amounts to constraining the number of features in the models versus allowing for rich mechanistic interpretability, i.e. equipping models with numerous molecular features. This challenge also requires patient-specific predictions on toxicity, which in turn requires proper stratification of patients as regards how they respond, with or without adverse toxic effects. In summary, the transformation of the ancient concept of dose is currently successfully operationalized using rich integrative data encoded in patient-specific models.

  7. Predictive systems ecology

    OpenAIRE

    Evans, Matthew R.; Bithell, Mike; Cornell, Stephen J.; Dall, Sasha R. X.; D?az, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J.; Lewis, Simon L.; Mace, Georgina M.; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim

    2013-01-01

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of ...

  8. UXO Burial Prediction Fidelity

    Science.gov (United States)

    2017-07-01

    ...models to capture detailed projectile dynamics during the early phases of water entry are wasted with regard to sediment-penetration depth prediction... ordnance (UXO) migrates and becomes exposed over time in response to water and sediment motion. Such models need initial sediment penetration estimates... the munition’s initial penetration depth into the sediment, the velocity of water at the water-sediment boundary (i.e., the bottom water velocity...

  10. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  11. Predicting Human Cooperation.

    Directory of Open Access Journals (Sweden)

    John J Nay

    The Prisoner's Dilemma has been a subject of extensive research due to its importance in understanding the ever-present tension between individual self-interest and social benefit. A strictly dominant strategy in a Prisoner's Dilemma (defection, when played by both players, is mutually harmful. Repetition of the Prisoner's Dilemma can give rise to cooperation as an equilibrium, but defection is as well, and this ambiguity is difficult to resolve. The numerous behavioral experiments investigating the Prisoner's Dilemma highlight that players often cooperate, but the level of cooperation varies significantly with the specifics of the experimental predicament. We present the first computational model of human behavior in repeated Prisoner's Dilemma games that unifies the diversity of experimental observations in a systematic and quantitatively reliable manner. Our model relies on data we integrated from many experiments, comprising 168,386 individual decisions. The model is composed of two pieces: the first predicts the first-period action using solely the structural game parameters, while the second predicts dynamic actions using both game parameters and history of play. Our model is successful not merely at fitting the data, but in predicting behavior at multiple scales in experimental designs not used for calibration, using only information about the game structure. We demonstrate the power of our approach through a simulation analysis revealing how to best promote human cooperation.

  12. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10^-5 ≤ (D/H)_P ≤ 10.0×10^-5 and (³He/H)_P ≤ 2.6×10^-5. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/-1.8)×10^-5. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons.

  13. Evolutionary relatedness does not predict competition and co-occurrence in natural or experimental communities of green algae

    Science.gov (United States)

    Alexandrou, Markos A.; Cardinale, Bradley J.; Hall, John D.; Delwiche, Charles F.; Fritschie, Keith; Narwani, Anita; Venail, Patrick A.; Bentlage, Bastian; Pankey, M. Sabrina; Oakley, Todd H.

    2015-01-01

    ...species interactions and community assembly in both natural and experimental systems. Our results challenge the generality of the CRH and suggest it may be time to re-evaluate the validity and assumptions of this hypothesis. PMID:25473009

  14. Disruption prediction at JET

    International Nuclear Information System (INIS)

    Milani, F.

    1998-12-01

    The sudden loss of plasma magnetic confinement, known as a disruption, is one of the major issues in a nuclear fusion machine such as JET (Joint European Torus). Disruptions pose very serious problems for the safety of the machine. The energy stored in the plasma is released to the machine structure in a few milliseconds, resulting in forces that at JET reach several meganewtons. The problem is even more severe in a nuclear fusion power station, where the forces would be of the order of one hundred meganewtons. The events that occur during a disruption are still not well understood, even if some mechanisms that can lead to a disruption have been identified and can be used to predict them. Unfortunately, it is always a combination of these events that generates a disruption, and therefore it is not possible to use simple algorithms to predict it. This thesis analyses the possibility of using neural network algorithms to predict plasma disruptions in real time. This involves the determination of plasma parameters every few milliseconds. A plasma boundary reconstruction algorithm, XLOC, has been developed in collaboration with Dr. D. O'Brien and Dr. J. Ellis, capable of determining the plasma-wall distance every 2 milliseconds. The XLOC output has been used to develop a multilayer perceptron network to determine plasma parameters such as l_i and q_ψ, with which a machine operational space has been experimentally defined. If the limits of this operational space are breached, the disruption probability increases considerably. Another approach to predicting disruptions is to use neural network classification methods to define the JET operational space. Two methods have been studied. The first method uses a multilayer perceptron network with a softmax activation function for the output layer. This method can be used for classifying the input patterns into various classes. In this case the plasma input patterns have been divided between disrupting and safe patterns, giving the possibility of...
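
    The classifier architecture described, a multilayer perceptron with a softmax output over "safe" and "disrupting" classes, can be sketched as follows; the weights are random stand-ins, and the four input features merely stand in for plasma parameters such as l_i and q_ψ.

        # Forward pass of a two-class softmax MLP; illustrative, not the thesis code.
        import numpy as np

        rng = np.random.default_rng(1)
        W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)   # hidden layer (4 features in)
        W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)   # output layer: [safe, disrupting]

        def predict(x):
            h = np.tanh(W1 @ x + b1)                    # hidden activations
            logits = W2 @ h + b2
            p = np.exp(logits - logits.max())           # numerically stable softmax
            return p / p.sum()

        print(predict(np.array([0.9, 1.2, 3.5, 0.1])))  # class probabilities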

  15. Genomic Prediction in Barley

    DEFF Research Database (Denmark)

    Edriss, Vahid; Cericola, Fabio; Jensen, Jens D

    2015-01-01

    ...to the next generation. The main goal of this study was to assess the potential of using genomic prediction in a commercial barley breeding program. The data used in this study were from the Nordic Seed company, located in Denmark. Around 350 advanced lines were genotyped with the Illumina 9K barley chip. Traits used in this study were grain yield, plant height and heading date. Heading date is the number of days after 1 June until the plant heads. Heritabilities were 0.33, 0.44 and 0.48 for yield, height and heading, respectively, for the average of nine plots. The GBLUP model was used for genomic...
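
    The GBLUP idea can be sketched compactly: build a genomic relationship matrix from centred markers, then shrink phenotypes onto it. Everything below is synthetic and the variance ratio is an assumed value; this is not the study's pipeline.

        # Minimal GBLUP-style breeding-value sketch on synthetic genotypes.
        import numpy as np

        rng = np.random.default_rng(2)
        n_lines, n_markers = 50, 200
        M = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)  # 0/1/2 genotypes
        P = M - M.mean(axis=0)                                           # centre each marker
        G = P @ P.T / n_markers                                          # genomic relationship matrix
        y = rng.normal(size=n_lines)                                     # stand-in phenotypes

        lam = 1.0   # assumed ratio of residual to genetic variance
        u_hat = G @ np.linalg.solve(G + lam * np.eye(n_lines), y - y.mean())  # BLUP values
        print(u_hat[:5])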

  16. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Suetens, Sigrid; Galbo-Jørgensen, Claus B.; Tyran, Jean-Robert Karl

    2016-01-01

    We investigate the ‘law of small numbers’ using a data set on lotto gambling that allows us to measure players’ reactions to draws. While most players pick the same set of numbers week after week, we find that those who do change react on average as predicted by the law of small numbers as formalized in recent behavioral theory. In particular, players tend to bet less on numbers that have been drawn in the preceding week, as suggested by the ‘gambler’s fallacy’, and bet more on a number if it was frequently drawn in the recent past, consistent with the ‘hot-hand fallacy’.

  17. Predictable return distributions

    DEFF Research Database (Denmark)

    Pedersen, Thomas Quistgaard

    ...trace out the entire distribution. A univariate quantile regression model is used to examine stock and bond return distributions individually, while a multivariate model is used to capture their joint distribution. An empirical analysis on US data shows that certain parts of the return distributions... Out-of-sample analyses show that the relative accuracy of the state variables in predicting future returns varies across the distribution. A portfolio study shows that an investor with power utility can obtain economic gains by applying the empirical return distribution in portfolio decisions instead of imposing...
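
    The quantile-regression approach can be illustrated directly: estimating several conditional quantiles of returns given a state variable traces out the conditional distribution. The data below are synthetic, not the paper's US data.

        # Conditional return quantiles via quantile regression (synthetic data).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        state = rng.normal(size=300)   # a predictive state variable (stand-in)
        returns = 0.02 + 0.1 * state + rng.normal(scale=0.2 + 0.05 * np.abs(state), size=300)

        X = sm.add_constant(state)
        for q in (0.1, 0.5, 0.9):
            print(q, sm.QuantReg(returns, X).fit(q=q).params)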

  18. Predicting Ground Illuminance

    Science.gov (United States)

    Lesniak, Michael V.; Tregoning, Brett D.; Hitchens, Alexandra E.

    2015-01-01

    Our Sun outputs 3.85 × 10^26 W of radiation, of which roughly 37% is in the visible band. It is directly responsible for nearly all natural illuminance experienced on Earth's surface, either in the form of direct/refracted sunlight or in reflected light bouncing off the surfaces and/or atmospheres of our Moon and the visible planets. Ground illuminance, defined as the amount of visible light intercepting a unit area of surface (from all incident angles), varies over 7 orders of magnitude from day to night. It is highly dependent on well-modeled factors such as the relative positions of the Sun, Earth, and Moon. It is also dependent on less predictable factors such as local atmospheric conditions and weather. Several models have been proposed to predict ground illuminance, including Brown (1952) and Shapiro (1982, 1987). The Brown model is a set of empirical data collected from observation points around the world that has been reduced to a smooth fit of illuminance against a single variable, solar altitude. It provides limited applicability to the Moon and for cloudy conditions via multiplicative reduction factors. The Shapiro model is a theoretical model that treats the atmosphere as a three-layer system of light reflectance and transmittance. It has different sets of reflectance and transmittance coefficients for various cloud types. In this paper we compare the models' predictions to ground illuminance data from an observing run at the White Sands missile range (data was obtained from the United Kingdom's Meteorology Office). Continuous illuminance readings were recorded under various cloud conditions, during both daytime and nighttime hours. We find that under clear skies, the Shapiro model tends to better fit the observations during daytime hours, with typical discrepancies under 10%. Under cloudy skies, both models tend to poorly predict ground illuminance. However, the Shapiro model, with typical average daytime discrepancies of 25% or less in many cases...

  19. Predicting sports betting outcomes

    OpenAIRE

    Flis, Borut

    2014-01-01

    We wish to build a model that can predict the outcome of basketball games. The goal was to achieve sufficient accuracy to make a profit in sports betting. One learning example is a game in the NBA regular season. Every example has multiple features, which describe the opposing teams. We tried many methods, which return the probability of the home team winning and the probability of the away team winning. These probabilities are used for risk analysis. We used the best model in h...

  20. Predicting chaotic time series

    International Nuclear Information System (INIS)

    Farmer, J.D.; Sidorowich, J.J.

    1987-01-01

    We present a forecasting technique for chaotic data. After embedding a time series in a state space using delay coordinates, we "learn" the induced nonlinear mapping using local approximation. This allows us to make short-term predictions of the future behavior of a time series, using information based only on past values. We present an error estimate for this technique, and demonstrate its effectiveness by applying it to several examples, including data from the Mackey-Glass delay differential equation, Rayleigh-Bénard convection, and Taylor-Couette flow.
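
    The technique is simple to sketch: embed the series with delay coordinates, then predict by local approximation, here in its crudest form, a single nearest neighbour. The data and parameters below are arbitrary stand-ins, not the paper's examples.

        # Delay embedding + nearest-neighbour forecast of a chaotic series.
        import numpy as np

        def logistic_series(n, x=0.3, r=3.9):
            out = []
            for _ in range(n):
                x = r * x * (1.0 - x)
                out.append(x)
            return np.array(out)

        series, m = logistic_series(1000), 3            # series and embedding dimension
        N = len(series)
        X = np.column_stack([series[i:N - m + i] for i in range(m)])  # delay vectors
        y = series[m:]                                                # next value for each vector

        query, truth = X[-1], y[-1]
        j = np.argmin(np.linalg.norm(X[:-1] - query, axis=1))         # nearest past state
        print("prediction:", y[j], "truth:", truth)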

  1. Lattice of quantum predictions

    Science.gov (United States)

    Drieschner, Michael

    1993-10-01

    What is the structure of reality? Physics is supposed to answer this question, but a purely empiristic view is not sufficient to explain its ability to do so. Quantum mechanics has forced us to think more deeply about what a physical theory is. There are preconditions every physical theory must fulfill. It has to contain, e.g., rules for empirically testable predictions. Those preconditions give physics a structure that is “a priori” in the Kantian sense. An example is given how the lattice structure of quantum mechanics can be understood along these lines.

  2. Foundations of predictive analytics

    CERN Document Server

    Wu, James

    2012-01-01

    Drawing on the authors' two decades of experience in applied modeling and data mining, Foundations of Predictive Analytics presents the fundamental background required for analyzing data and building models for many practical applications, such as consumer behavior modeling, risk and marketing analytics, and other areas. It also discusses a variety of practical topics that are frequently missing from similar texts. The book begins with the statistical and linear algebra/matrix foundation of modeling methods, from distributions to cumulant and copula functions to Cornish-Fisher expansion and o...

  3. Prediction of regulatory elements

    DEFF Research Database (Denmark)

    Sandelin, Albin

    2008-01-01

    Finding the regulatory mechanisms responsible for gene expression remains one of the most important challenges for biomedical research. A major focus in cellular biology is to find functional transcription factor binding sites (TFBS) responsible for the regulation of a downstream gene. As wet-lab methods are time consuming and expensive, it is not realistic to identify TFBS for all uncharacterized genes in the genome by purely experimental means. Computational methods aimed at predicting potential regulatory regions can increase the efficiency of wet-lab experiments significantly. Here, methods...
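
    The standard computational approach, scanning a sequence with a position weight matrix (PWM), fits in a few lines; the matrix below is a toy, not a real transcription factor model.

        # Scan a DNA sequence with a toy log-odds PWM and report the best window.
        import numpy as np

        bases = {"A": 0, "C": 1, "G": 2, "T": 3}
        pwm = np.log2(np.array([            # rows A, C, G, T; columns = motif positions
            [0.7, 0.1, 0.1, 0.8],
            [0.1, 0.1, 0.1, 0.05],
            [0.1, 0.7, 0.1, 0.05],
            [0.1, 0.1, 0.7, 0.1],
        ]) / 0.25)                          # log-odds against a uniform background

        seq, w = "TTAGTAGGTACT", pwm.shape[1]
        scores = [sum(pwm[bases[seq[i + j]], j] for j in range(w))
                  for i in range(len(seq) - w + 1)]
        print(int(np.argmax(scores)), round(max(scores), 2))   # best window start, score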

  4. Age and Stress Prediction

    Science.gov (United States)

    2000-01-01

    Genoa is a software product that predicts progressive aging and failure in a variety of materials. It is the result of an SBIR contract between the Glenn Research Center and Alpha Star Corporation. Genoa allows designers to determine if the materials they plan on applying to a structure are up to the task or if alternate materials should be considered. Genoa's two feature applications are its progressive failure simulations and its test verification. It allows for a reduction in inspection frequency, rapid design solutions, and manufacturing with low cost materials. It will benefit the aerospace, airline, and automotive industries, with future applications for other uses.

  5. Prediction of Biomolecular Complexes

    KAUST Repository

    Vangone, Anna

    2017-04-12

    Almost all processes in living organisms occur through specific interactions between biomolecules. Any dysfunction of those interactions can lead to pathological events. Understanding such interactions is therefore a crucial step in the investigation of biological systems and a starting point for drug design. In recent years, experimental studies have been devoted to unravel the principles of biomolecular interactions; however, due to experimental difficulties in solving the three-dimensional (3D) structure of biomolecular complexes, the number of available, high-resolution experimental 3D structures does not fulfill the current needs. Therefore, complementary computational approaches to model such interactions are necessary to assist experimentalists since a full understanding of how biomolecules interact (and consequently how they perform their function) only comes from 3D structures which provide crucial atomic details about binding and recognition processes. In this chapter we review approaches to predict biomolecular complexes, introducing the concept of molecular docking, a technique which uses a combination of geometric, steric and energetics considerations to predict the 3D structure of a biological complex starting from the individual structures of its constituent parts. We provide a mini-guide about docking concepts, its potential and challenges, along with post-docking analysis and a list of related software.

  6. Nuclear criticality predictability

    International Nuclear Information System (INIS)

    Briggs, J.B.

    1999-01-01

    As a result of extensive effort, a large portion of the tedious and redundant research and processing of critical experiment data has been eliminated. The necessary step in criticality safety analyses of validating computer codes with benchmark critical data is greatly streamlined, and valuable criticality safety experimental data are preserved. Criticality safety personnel in 31 different countries are now using the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments'. Much has been accomplished by the work of the ICSBEP. However, evaluation and documentation represent only one element of a successful Nuclear Criticality Safety Predictability Program, and this element only exists as a separate entity because this work was not completed in conjunction with the experimentation process. I believe, however, that the work of the ICSBEP has also served to unify the other elements of nuclear criticality predictability. All elements are interrelated, but for a time it seemed that communication between these elements was not adequate. The ICSBEP has highlighted gaps in data, has retrieved lost data, has helped to identify errors in cross section processing codes, and has helped bring the international criticality safety community together in a common cause as true friends and colleagues. It has been a privilege to associate with those who work so diligently to make the project a success. (J.P.N.)

  7. Ratchetting strain prediction

    International Nuclear Information System (INIS)

    Noban, Mohammad; Jahed, Hamid

    2007-01-01

    A time-efficient method for predicting ratchetting strain is proposed. The ratchetting strain at any cycle is determined by finding the ratchetting rate at only a few cycles. This determination is done by first defining the trajectory of the origin of stress in the deviatoric stress space and then incorporating this moving origin into a cyclic plasticity model. It is shown that at the beginning of the loading, the starting point of this trajectory coincides with the initial stress origin and approaches the mean stress, displaying a power-law relationship with the number of loading cycles. The method of obtaining this trajectory from a standard uniaxial asymmetric cyclic loading is presented. Ratchetting rates are calculated with the help of this trajectory and through the use of a constitutive cyclic plasticity model which incorporates deviatoric stresses and back stresses that are measured with respect to this moving frame. The proposed model is used to predict the ratchetting strain of two types of steels under single- and multi-step loadings. Results obtained agree well with the available experimental measurements

  8. Predicting space climate change

    Science.gov (United States)

    Balcerak, Ernie

    2011-10-01

    Galactic cosmic rays and solar energetic particles can be hazardous to humans in space, damage spacecraft and satellites, pose threats to aircraft electronics, and expose aircrew and passengers to radiation. A new study shows that these threats are likely to increase in coming years as the Sun approaches the end of the period of high solar activity known as “grand solar maximum,” which has persisted through the past several decades. High solar activity can help protect the Earth by repelling incoming galactic cosmic rays. Understanding the past record can help scientists predict future conditions. Barnard et al. analyzed a 9300-year record of galactic cosmic ray and solar activity based on cosmogenic isotopes in ice cores as well as on neutron monitor data. They used this to predict future variations in galactic cosmic ray flux, near-Earth interplanetary magnetic field, sunspot number, and probability of large solar energetic particle events. The researchers found that the risk of space weather radiation events will likely increase noticeably over the next century compared with recent decades and that lower solar activity will lead to increased galactic cosmic ray levels. (Geophysical Research Letters, doi:10.1029/2011GL048489, 2011)

  10. Energy Predictions 2011

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-10-15

    Even as the recession begins to subside, the energy sector is still likely to experience challenging conditions as we enter 2011. It should be remembered how important a role energy plays in driving the global economy. Serving as a simple yet global and unified measure of economic recovery, it is oil's price range and the strength and sustainability of the recovery which will impact the ways in which all forms of energy are produced and consumed. The report aims for a closer insight into these predictions: What will happen with M&A (mergers and acquisitions) in the energy industry? What are the prospects for renewables? Will the water-energy nexus grow in importance? How will technological leaps and bounds affect E&P (exploration and production) operations? What about electric cars? This is the second year Deloitte's Global Energy and Resources Group has published its predictions for the year ahead. The report is based on in-depth interviews with clients, industry analysts, and senior energy practitioners from Deloitte member firms around the world.

  12. Predicting Alloreactivity in Transplantation

    Directory of Open Access Journals (Sweden)

    Kirsten Geneugelijk

    2014-01-01

    Human leukocyte antigen (HLA) mismatching leads to severe complications after solid-organ transplantation and hematopoietic stem-cell transplantation. The alloreactive responses underlying the posttransplantation complications include both direct recognition of allogeneic HLA by HLA-specific alloantibodies and T cells, and indirect T-cell recognition. However, the immunogenicity of HLA mismatches is highly variable; some HLA mismatches lead to severe clinical B-cell- and T-cell-mediated alloreactivity, whereas others are well tolerated. Definition of the permissibility of HLA mismatches prior to transplantation allows selection of donor-recipient combinations that will have a reduced chance of developing deleterious host-versus-graft responses after solid-organ transplantation and graft-versus-host responses after hematopoietic stem-cell transplantation. Therefore, several methods have been developed to predict permissible HLA-mismatch combinations. In this review we aim to give a comprehensive overview of the current knowledge regarding HLA-directed alloreactivity and of several in vitro and in silico tools developed to predict direct and indirect alloreactivity.

  13. Generalized Predictive Control and Neural Generalized Predictive Control

    Directory of Open Access Journals (Sweden)

    Sadhana CHIDRAWAR

    2008-12-01

    As Model Predictive Control (MPC) relies on the predictive... control using a multilayer feed-forward network as the plant's linear model is presented. In using Newton-Raphson as the optimization algorithm, the number of iterations needed for convergence is significantly reduced compared with other techniques. This paper presents a detailed derivation of Generalized Predictive Control and Neural Generalized Predictive Control with Newton-Raphson as the minimization algorithm. Taking three separate systems, the performance of the systems has been tested. Simulation results show the effect of the neural network on Generalized Predictive Control. The performance comparison of these three system configurations has been given in terms of ISE and IAE.
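
    The role of Newton-Raphson here is ordinary numerical minimisation of the predictive-control cost; a one-dimensional caricature (with a toy quadratic cost standing in for the real GPC cost) looks like this:

        # Newton-Raphson minimisation of a scalar cost J(u): u <- u - J'(u)/J''(u).
        def newton_minimise(dJ, d2J, u=0.0, iters=5):
            for _ in range(iters):
                u -= dJ(u) / d2J(u)
            return u

        # Toy cost J(u) = (u - 3)^2 + 1, so the minimiser is u* = 3.
        print(newton_minimise(dJ=lambda u: 2.0 * (u - 3.0), d2J=lambda u: 2.0))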

  14. Numerical prediction of rose growth

    NARCIS (Netherlands)

    Bernsen, E.; Bokhove, Onno; van der Sar, D.M.

    2006-01-01

    A new mathematical model is presented for the prediction of rose growth in a greenhouse. Given the measured ambient environmental conditions, the model consists of a local photosynthesis model, predicting the photosynthesis per unit leaf area, coupled to a global greenhouse model, which predicts the...

  15. Confidence scores for prediction models

    DEFF Research Database (Denmark)

    Gerds, Thomas Alexander; van de Wiel, MA

    2011-01-01

    In medical statistics, many alternative strategies are available for building a prediction model based on training data. Prediction models are routinely compared by means of their prediction performance in independent validation data. If only one data set is available for training and validation,...

  16. Protein docking prediction using predicted protein-protein interface

    Directory of Open Access Journals (Sweden)

    Li Bin

    2012-01-01

    Background: Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. Results: We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we have developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, which is followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves the docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction as post-filtering. Conclusion: We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy.

  17. Protein docking prediction using predicted protein-protein interface.

    Science.gov (United States)

    Li, Bin; Kihara, Daisuke

    2012-01-10

    Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we have developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, which is followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves the docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction as post-filtering. We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy over alternative methods in the series of benchmark experiments, including docking using actual docking interface site predictions as well as unbound docking cases.

  18. Epitope prediction methods

    DEFF Research Database (Denmark)

    Karosiene, Edita

    ...immune responses. Therefore, it is of great importance to be able to identify peptides that bind to MHC molecules, in order to understand the nature of immune responses and discover T cell epitopes useful for designing new vaccines and immunotherapies. MHC molecules in humans, referred to as human... Analysis. The chapter provides detailed explanations on how to use different methods for T cell epitope discovery research, explaining how input should be given as well as how to interpret the output. In the last chapter, I present the results of a bioinformatics analysis of epitopes from the yellow fever... peptide-MHC interactions. Furthermore, using yellow fever virus epitopes, we demonstrated the power of the %Rank score when compared with the binding affinity score of MHC prediction methods, suggesting that this score should be considered for selecting potential T cell epitopes. In summary...

  19. Motor degradation prediction methods

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, J.R.; Kelly, J.F.; Delzingaro, M.J.

    1996-12-01

    Motor Operated Valve (MOV) squirrel cage AC motor rotors are susceptible to degradation under certain conditions. Premature failure can result due to high humidity/temperature environments, high running load conditions, extended periods at locked rotor conditions (i.e. > 15 seconds) or exceeding the motor's duty cycle by frequent starts or multiple valve stroking. Exposure to high heat and moisture due to packing leaks, pressure seal ring leakage or other causes can significantly accelerate the degradation. ComEd and Liberty Technologies have worked together to provide and validate a non-intrusive method using motor power diagnostics to evaluate MOV rotor condition and predict failure. These techniques have provided a quick, low radiation dose method to evaluate inaccessible motors, identify degradation and allow scheduled replacement of motors prior to catastrophic failures.

  20. Filter replacement lifetime prediction

    Science.gov (United States)

    Hamann, Hendrik F.; Klein, Levente I.; Manzer, Dennis G.; Marianno, Fernando J.

    2017-10-25

    Methods and systems for predicting a filter lifetime include building a filter effectiveness history based on contaminant sensor information associated with a filter; determining a rate of filter consumption with a processor based on the filter effectiveness history; and determining a remaining filter lifetime based on the determined rate of filter consumption. Methods and systems for increasing filter economy include measuring contaminants in an internal and an external environment; determining a cost of a corrosion rate increase if unfiltered external air intake is increased for cooling; determining a cost of increased air pressure to filter external air; and if the cost of filtering external air exceeds the cost of the corrosion rate increase, increasing an intake of unfiltered external air.
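
    The rate-based idea in the abstract reduces to fitting a consumption rate from the effectiveness history and extrapolating to a replacement threshold; the numbers below are invented for illustration.

        # Extrapolate remaining filter life from an effectiveness history.
        effectiveness = [1.00, 0.97, 0.93, 0.90, 0.86]   # one reading per week (hypothetical)
        rate = (effectiveness[0] - effectiveness[-1]) / (len(effectiveness) - 1)   # per week
        threshold = 0.60                                  # replace below this level (assumed)
        print(f"~{(effectiveness[-1] - threshold) / rate:.1f} weeks of filter life remaining")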

  1. Neurological abnormalities predict disability

    DEFF Research Database (Denmark)

    Poggesi, Anna; Gouw, Alida; van der Flier, Wiesje

    2014-01-01

    To investigate the role of neurological abnormalities and magnetic resonance imaging (MRI) lesions in predicting global functional decline in a cohort of initially independent-living elderly subjects. The Leukoaraiosis And DISability (LADIS) Study, involving 11 European centres, was primarily aimed...... at evaluating age-related white matter changes (ARWMC) as an independent predictor of the transition to disability (according to Instrumental Activities of Daily Living scale) or death in independent elderly subjects that were followed up for 3 years. At baseline, a standardized neurological examination.......0 years, 45 % males), 327 (51.7 %) presented at the initial visit with ≥1 neurological abnormality and 242 (38 %) reached the main study outcome. Cox regression analyses, adjusting for MRI features and other determinants of functional decline, showed that the baseline presence of any neurological...

  3. Predictability in community dynamics.

    Science.gov (United States)

    Blonder, Benjamin; Moulton, Derek E; Blois, Jessica; Enquist, Brian J; Graae, Bente J; Macias-Fauria, Marc; McGill, Brian; Nogué, Sandra; Ordonez, Alejandro; Sandel, Brody; Svenning, Jens-Christian

    2017-03-01

    The coupling between community composition and climate change spans a gradient from no lags to strong lags. The no-lag hypothesis is the foundation of many ecophysiological models, correlative species distribution modelling and climate reconstruction approaches. Simple lag hypotheses have become prominent in disequilibrium ecology, proposing that communities track climate change following a fixed function or with a time delay. However, more complex dynamics are possible and may lead to memory effects and alternate unstable states. We develop graphical and analytic methods for assessing these scenarios and show that these dynamics can appear in even simple models. The overall implications are that (1) complex community dynamics may be common and (2) detailed knowledge of past climate change and community states will often be necessary yet sometimes insufficient to make predictions of a community's future state.

  4. Neonatal heart rate prediction.

    Science.gov (United States)

    Abdel-Rahman, Yumna; Jeremic, Aleksander; Tan, Kenneth

    2009-01-01

    Technological advances have caused a decrease in the number of infant deaths. Pre-term infants now have a substantially increased chance of survival. One of the mechanisms that is vital to saving the lives of these infants is continuous monitoring and early diagnosis. With continuous monitoring, huge amounts of data are collected, with much information embedded in them. By using statistical analysis this information can be extracted and used to aid diagnosis and to understand development. In this study we have a large dataset containing over 180 pre-term infants whose heart rates were recorded over the length of their stay in the Neonatal Intensive Care Unit (NICU). We test two types of models, empirical Bayesian and autoregressive moving average. We then attempt to predict future values. The autoregressive moving average model showed better results but required more computation.
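
    The ARMA-type approach mentioned above can be sketched with an off-the-shelf estimator; the series below is synthetic, since the NICU data are not public.

        # Fit an ARMA(2,1) model and forecast the next five samples.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(5)
        hr = 150 + 0.1 * np.cumsum(rng.normal(scale=0.5, size=300)) + rng.normal(scale=2, size=300)

        fit = ARIMA(hr, order=(2, 0, 1)).fit()   # ARMA(2,1) with a constant
        print(fit.forecast(steps=5))             # predicted next five heart-rate samples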

  5. Chloride ingress prediction

    DEFF Research Database (Denmark)

    Frederiksen, Jens Mejer; Geiker, Mette Rica

    2008-01-01

    Prediction of chloride ingress into concrete is an important part of durability design of reinforced concrete structures exposed to chloride containing environment. This paper presents experimentally based design parameters for Portland cement concretes with and without silica fume and fly ash...... in marine atmospheric and submersed South Scandinavian environment. The design parameters are based on sequential measurements of 86 chloride profiles taken over ten years from 13 different types of concrete. The design parameters provide the input for an analytical model for chloride profiles as function...... of depth and time, when both the surface chloride concentration and the diffusion coefficient are allowed to vary in time. The model is presented in a companion paper....
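
    For orientation, the classical constant-parameter solution of Fick's second law, C(x, t) = C_s · erfc(x / (2·sqrt(D·t))), is sketched below; the paper's model generalizes this by letting both C_s and D vary in time, and the values used here are hypothetical.

        # Error-function chloride profile with constant surface level and diffusivity.
        from math import erfc, sqrt

        Cs = 0.5                       # surface chloride, % by mass of binder (hypothetical)
        D = 1.0e-12                    # apparent diffusion coefficient, m^2/s (hypothetical)
        t = 10 * 365.25 * 24 * 3600    # 10 years in seconds

        for x_mm in (10, 25, 50):
            x = x_mm / 1000.0
            print(f"{x_mm} mm: C = {Cs * erfc(x / (2 * sqrt(D * t))):.4f}")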

  6. Strontium 90 fallout prediction

    International Nuclear Information System (INIS)

    Sarmiento, J.L.; Gwinn, E.

    1986-01-01

    An empirical formula is developed for predicting monthly sea level strontium 90 fallout (F) in the northern hemisphere as a function of time (t), precipitation rate (P), latitude (φ), longitude (λ), and the sea level concentration of strontium 90 in air (C): F(λ, φ, t) = C(t, φ)[v_d(φ) + v_w(λ, φ, t)], where v_w(λ, φ, t) = a(φ)[P(λ, φ, t)/P_0]^b(φ) is the wet removal, v_d(φ) is the dry removal, and P_0 is 1 cm/month. The constants v_d, a, and b are determined as functions of latitude by fitting land-based observations. The concentration of 90Sr in air is calculated as a function of the deseasonalized concentration at a reference latitude (C̄_ref), the ratio of the observations at the latitude of interest to the reference latitude (R), and a function representing the seasonal trend in the air concentration (1 + g): C̄(t, φ) = C̄_ref(t)R(φ)[1 + g(m, φ)]; m is the month. Zonal trends in C are shown to be relatively small. This formula can be used in conjunction with precipitation observations and/or estimates to predict fallout in the northern hemisphere for any month in the years 1954 to 1974. Error estimates are given; they do not include uncertainty due to errors in precipitation data.
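
    The fitted formula is direct to evaluate once the latitude-dependent constants are known; the sketch below uses made-up values for a(φ), b(φ) and v_d(φ) purely to show the structure.

        # Evaluate F = C (v_d + a (P/P0)^b) with hypothetical constants (P0 = 1 cm/month).
        def fallout(C_air, P_cm_per_month, v_d=0.1, a=1.0, b=0.8):
            v_w = a * P_cm_per_month ** b      # wet removal
            return C_air * (v_d + v_w)         # monthly 90Sr deposition

        print(fallout(C_air=2.0, P_cm_per_month=8.0))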

  7. Plume rise predictions

    International Nuclear Information System (INIS)

    Briggs, G.A.

    1976-01-01

    Anyone involved with diffusion calculations becomes well aware of the strong dependence of maximum ground concentrations on the effective stack height, h_e. For most conditions χ_max is approximately proportional to h_e^-2, as has been recognized at least since 1936 (Bosanquet and Pearson). Making allowance for the gradual decrease in the ratio of vertical to lateral diffusion at increasing heights, the exponent is slightly larger, say χ_max ∝ h_e^-2.3. In inversion breakup fumigation, the exponent is somewhat smaller; very crudely, χ_max ∝ h_e^-1.5. In any case, for an elevated emission the dependence of χ_max on h_e is substantial. It is postulated that a really clever ignorant theoretician can disguise his ignorance with dimensionless constants. For most sources the effective stack height is considerably larger than the actual source height, h_s. For instance, for power plants with no downwash problems, h_e is more than twice h_s whenever the wind is less than 10 m/sec, which is most of the time. This is unfortunate for anyone who has to predict ground concentrations, for he is likely to have to calculate the plume rise, Δh. Using h_e = h_s + Δh instead of h_s may reduce χ_max by a factor of anywhere from 4 to infinity. Factors to be considered in making plume rise predictions are discussed
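
    A quick numerical illustration of the χ_max ∝ h_e^-2 scaling discussed above: the factor by which accounting for plume rise reduces the predicted maximum ground concentration relative to using the stack height alone.

      # How much does using h_e = h_s + dh instead of h_s reduce chi_max,
      # assuming the chi_max ~ h_e^-2 scaling quoted in the abstract?
      def chi_max_reduction(h_s, dh, exponent=2.0):
          h_e = h_s + dh
          return (h_e / h_s) ** exponent  # factor by which chi_max drops

      # A 100 m stack with 150 m of rise: (250/100)^2 = 6.25-fold reduction.
      print(chi_max_reduction(h_s=100.0, dh=150.0))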

  8. Predictive coarse-graining

    Energy Technology Data Exchange (ETDEWEB)

    Schöberl, Markus, E-mail: m.schoeberl@tum.de [Continuum Mechanics Group, Technical University of Munich, Boltzmannstraße 15, 85748 Garching (Germany); Zabaras, Nicholas [Institute for Advanced Study, Technical University of Munich, Lichtenbergstraße 2a, 85748 Garching (Germany); Department of Aerospace and Mechanical Engineering, University of Notre Dame, 365 Fitzpatrick Hall, Notre Dame, IN 46556 (United States); Koutsourelakis, Phaedon-Stelios [Continuum Mechanics Group, Technical University of Munich, Boltzmannstraße 15, 85748 Garching (Germany)

    2017-03-15

    We propose a data-driven, coarse-graining formulation in the context of equilibrium statistical mechanics. In contrast to existing techniques which are based on a fine-to-coarse map, we adopt the opposite strategy by prescribing a probabilistic coarse-to-fine map. This corresponds to a directed probabilistic model where the coarse variables play the role of latent generators of the fine scale (all-atom) data. From an information-theoretic perspective, the framework proposed provides an improvement upon the relative entropy method and is capable of quantifying the uncertainty due to the information loss that unavoidably takes place during the coarse-graining process. Furthermore, it can be readily extended to a fully Bayesian model where various sources of uncertainties are reflected in the posterior of the model parameters. The latter can be used to produce not only point estimates of fine-scale reconstructions or macroscopic observables, but more importantly, predictive posterior distributions on these quantities. Predictive posterior distributions reflect the confidence of the model as a function of the amount of data and the level of coarse-graining. The issues of model complexity and model selection are seamlessly addressed by employing a hierarchical prior that favors the discovery of sparse solutions, revealing the most prominent features in the coarse-grained model. A flexible and parallelizable Monte Carlo – Expectation–Maximization (MC-EM) scheme is proposed for carrying out inference and learning tasks. A comparative assessment of the proposed methodology is presented for a lattice spin system and the SPC/E water model.

  9. Data-Based Predictive Control with Multirate Prediction Step

    Science.gov (United States)

    Barlow, Jonathan S.

    2010-01-01

    Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is computational requirements increasing with prediction horizon length. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with multirate prediction step. One result is a reduced influence of prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.

  10. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electomagnetics, Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu Tokyo, 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062 (Japan); Fuji Security Systems. Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction is defined as prospective prediction on a time scale of about one week, and it is considered one of the most important and urgent topics for human beings. If short-term prediction is realized, casualties will be drastically reduced. Unlike conventional seismic measurement, we proposed the use of electromagnetic phenomena as precursors to EQs, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews short-term EQ prediction, including the myth that EQ prediction by seismometers is impossible, the reasons why we are interested in electromagnetics, the history of seismo-electromagnetics, ionospheric perturbation as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  11. Performance Prediction Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-25

    The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains the hardware and middleware models, which accept proxy applications as input in runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua, that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their task through message exchanges to remain active, sleep, wake up, begin and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels, their respective sizes, cache lines, access times for different cache levels, average cycle counts of ALU operations, etc. These parameters are ideally read off a spec sheet or are learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes as input a tasklist. A tasklist is an unordered set of ALU and other CPU-type operations (in particular virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with a call to the hardware model's time_compute() function, giving tasklists as input that model the compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher level loop structure that we like to think of as pseudo code. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU core level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, where our previous alternatives explicitly include the L1, L2, L3 hit rates as inputs to the tasklists. Explicit hit rates inevitably only reflect the application modeler's best guess, perhaps informed by a few
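
    A toy mock of the tasklist/time_compute() interface described above; the parameter names, task categories and numbers below are illustrative guesses, not the actual PPT API.

      # Hypothetical mock of a core model exposing time_compute(tasklist).
      CORE_PARAMS = {
          "clock_ghz": 2.3,         # e.g. a Haswell-like core
          "cycles_alu": 1.0,        # average cycles per ALU op
          "cycles_load_l1": 4.0,    # cycles per L1 hit
          "cycles_load_mem": 200.0  # cycles per main-memory access
      }

      def time_compute(tasklist, params=CORE_PARAMS):
          """Translate an unordered tasklist into predicted seconds on the core."""
          cycles = sum(params["cycles_" + op] * count for op, count in tasklist)
          return cycles / (params["clock_ghz"] * 1e9)

      # A kernel modelled as 1e9 ALU ops, 2e8 L1 hits and 1e6 memory accesses.
      kernel = [("alu", 1_000_000_000), ("load_l1", 200_000_000),
                ("load_mem", 1_000_000)]
      print(time_compute(kernel), "seconds")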

  12. Introduction: Long term prediction

    International Nuclear Information System (INIS)

    Beranger, G.

    2003-01-01

    Making the right choice of a material appropriate to a given application should be based on taking into account several parameters: cost, standards, regulations, safety, recycling, chemical properties, supply, transformation, forming, assembly, mechanical and physical properties, as well as behaviour under practical conditions. Data taken from a private communication (J.H. Davidson) are reproduced, presenting the lifetime range of materials from a couple of minutes to half a million hours, corresponding to applications from missile technology up to high-temperature nuclear reactors or steam turbines. In the case of deep storage of nuclear waste the time required is completely different from these values, since we have to ensure the integrity of the storage system for several thousand years. Vitrified nuclear waste is to be stored in metallic canisters made of iron and carbon steels, stainless steels, copper and copper alloys, nickel alloys or titanium alloys. Some of these materials are passivating metals, i.e. they develop a thin protective film, 2 or 3 nm thick - the so-called passive film. These films prevent general corrosion of the metal over a large range of chemical conditions of the environment. Under some specific conditions, localized corrosion, such as pitting, occurs. Consequently, it is absolutely necessary to determine these chemical conditions and their stability in time to understand the behaviour of a given material. In other words, the corrosion system is constituted by the complex material/surface/medium. For high-level nuclear wastes the main features for resolving the problem concern: geological disposal; deep storage in clay; waste metallic canisters; backfill mixture (clay-gypsum) or concrete; long-term behaviour; data needed for modelling and for predicting; choice of an appropriate solution among several metallic candidates. The analysis of the complex material/surface/medium is of great importance

  13. Predictability of blocking

    International Nuclear Information System (INIS)

    Tosi, E.; Ruti, P.; Tibaldi, S.; D'Andrea, F.

    1994-01-01

    Tibaldi and Molteni (1990, hereafter referred to as TM) had previously investigated operational blocking predictability by the ECMWF model and the possible relationships between model systematic error and blocking in the winter season of the Northern Hemisphere, using seven years of ECMWF operational archives of analyses and day 1 to 10 forecasts. They showed that fewer blocking episodes than in the real atmosphere were generally simulated by the model, and that this deficiency increased with increasing forecast time. As a consequence of this, a major contribution to the systematic error in the winter season was shown to derive from the inability of the model to properly forecast blocking. In this study, the analysis performed in TM for the first seven winter seasons of the ECMWF operational model is extended to the subsequent five winters, during which model development, reflecting both resolution increases and parametrisation modifications, continued unabated. In addition, the objective blocking index developed by TM has been applied to the observed data to study the natural low-frequency variability of blocking. The ability of some climate models to simulate blocking has also been tested

  14. GABA predicts visual intelligence.

    Science.gov (United States)

    Cook, Emily; Hammett, Stephen T; Larsson, Jonas

    2016-10-06

    Early psychological researchers proposed a link between intelligence and low-level perceptual performance. It was recently suggested that this link is driven by individual variations in the ability to suppress irrelevant information, evidenced by the observation of strong correlations between perceptual surround suppression and cognitive performance. However, the neural mechanisms underlying such a link remain unclear. A candidate mechanism is neural inhibition by gamma-aminobutyric acid (GABA), but direct experimental support for GABA-mediated inhibition underlying suppression is inconsistent. Here we report evidence consistent with a global suppressive mechanism involving GABA underlying the link between sensory performance and intelligence. We measured visual cortical GABA concentration, visuo-spatial intelligence and visual surround suppression in a group of healthy adults. Levels of GABA were strongly predictive of both intelligence and surround suppression, with higher levels of intelligence associated with higher levels of GABA and stronger surround suppression. These results indicate that GABA-mediated neural inhibition may be a key factor determining cognitive performance and suggest a physiological mechanism linking surround suppression and intelligence. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  15. Predictability in cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Chira, Camelia; Giuclea, Marius

    2014-01-01

    Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one also needs a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature really exists. It is the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1-D, 2-D and 3-D asynchronous automata with neighborhood three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighborhood three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case.
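
    A sketch of the kind of computer experiment described: simulate a synchronous probabilistic CA on a ring and tabulate how often configurations with a given number of zero-one borders occur. The transition rule here is an arbitrary choice with probabilities in (0, 1), not the one from the paper.

      # Monte Carlo tally of configuration frequency by zero-one border count.
      import numpy as np

      rng = np.random.default_rng(1)
      N, STEPS = 8, 100_000

      def step(state):
          left, right = np.roll(state, 1), np.roll(state, -1)
          # Probability of becoming 1 grows with the neighbourhood sum; in (0,1).
          p_one = 0.1 + 0.8 * (left + state + right) / 3.0
          return (rng.random(N) < p_one).astype(int)

      def borders(state):
          return int(np.sum(state != np.roll(state, -1)))  # 01-borders on a ring

      counts = {}
      state = rng.integers(0, 2, N)
      for _ in range(STEPS):
          state = step(state)
          counts[borders(state)] = counts.get(borders(state), 0) + 1

      for b in sorted(counts):
          print(b, counts[b] / STEPS)  # frequency should fall roughly geometrically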

  16. Predictive Manufacturing: A Classification Strategy to Predict Product Failures

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Kulahci, Murat

    2018-01-01

    manufacturing analytics model that employs a big data approach to predicting product failures; third, we illustrate the issue of high dimensionality, along with statistically redundant information; and, finally, our proposed method will be compared against the well-known classification methods (SVM, K......-nearest neighbor, artificial neural networks). The results from real data show that our predictive manufacturing analytics approach, using genetic algorithms and Voronoi tessellations, is capable of predicting product failure with reasonable accuracy. The potential application of this method contributes...... to accurately predicting product failures, which would enable manufacturers to reduce production costs without compromising product quality....

  17. House Price Prediction Using LSTM

    OpenAIRE

    Chen, Xiaochen; Wei, Lai; Xu, Jiaxin

    2017-01-01

    In this paper, we use house price data ranging from January 2004 to October 2016 to predict the average house price of November and December in 2016 for each district in Beijing, Shanghai, Guangzhou and Shenzhen. We apply the Autoregressive Integrated Moving Average (ARIMA) model to generate the baseline and LSTM networks to build the prediction model. These algorithms are compared in terms of Mean Squared Error. The result shows that the LSTM model has excellent properties with respect to predicting time...
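
    A sketch of the LSTM side of such a setup, on a synthetic monthly series; the window length, layer size and training settings are arbitrary, and the data are not the paper's.

      # Windowed LSTM one-step-ahead forecaster on a synthetic price series.
      import numpy as np
      from tensorflow.keras.models import Sequential
      from tensorflow.keras.layers import LSTM, Dense

      rng = np.random.default_rng(2)
      prices = np.cumsum(rng.normal(0.5, 1.0, 156)) + 100  # ~13 years of months

      def windows(series, w=12):
          X = np.stack([series[i:i + w] for i in range(len(series) - w)])
          y = series[w:]
          return X[..., None], y  # LSTM expects (samples, timesteps, features)

      X, y = windows(prices)
      model = Sequential([LSTM(32, input_shape=(12, 1)), Dense(1)])
      model.compile(optimizer="adam", loss="mse")
      model.fit(X, y, epochs=20, verbose=0)

      # One-step-ahead forecast from the last 12 months.
      print(model.predict(prices[-12:].reshape(1, 12, 1), verbose=0))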

  18. Long Range Aircraft Trajectory Prediction

    OpenAIRE

    Magister, Tone

    2009-01-01

    The subject of the paper is the improvement of the aircraft future trajectory prediction accuracy for long-range airborne separation assurance. The strategic planning of safe aircraft flights and effective conflict avoidance tactics demand timely and accurate conflict detection based upon future four-dimensional airborne traffic situation prediction which is as accurate as each aircraft flight trajectory prediction. The improved kinematics model of aircraft relative flight considering flight ...

  19. Review of Nearshore Morphologic Prediction

    Science.gov (United States)

    Plant, N. G.; Dalyander, S.; Long, J.

    2014-12-01

    The evolution of the world's erodible coastlines will determine the balance between the benefits and costs associated with human and ecological utilization of shores, beaches, dunes, barrier islands, wetlands, and estuaries. So, we would like to predict coastal evolution to guide management and planning of human and ecological response to coastal changes. After decades of research investment in data collection, theoretical and statistical analysis, and model development we have a number of empirical, statistical, and deterministic models that can predict the evolution of the shoreline, beaches, dunes, and wetlands over time scales of hours to decades, and even predict the evolution of geologic strata over the course of millennia. Comparisons of predictions to data have demonstrated that these models can have meaningful predictive skill. But these comparisons also highlight the deficiencies in fundamental understanding, formulations, or data that are responsible for prediction errors and uncertainty. Here, we review a subset of predictive models of the nearshore to illustrate tradeoffs in complexity, predictive skill, and sensitivity to input data and parameterization errors. We identify where future improvement in prediction skill will result from improved theoretical understanding, data collection, and model-data assimilation.

  20. PREDICTED PERCENTAGE DISSATISFIED (PPD) MODEL ...

    African Journals Online (AJOL)

    their low power requirements, are relatively cheap and are environment friendly. ... The performance of direct evaporative coolers is a.

  1. Model Prediction Control For Water Management Using Adaptive Prediction Accuracy

    NARCIS (Netherlands)

    Tian, X.; Negenborn, R.R.; Van Overloop, P.J.A.T.M.; Mostert, E.

    2014-01-01

    In the field of operational water management, Model Predictive Control (MPC) has gained popularity owing to its versatility and flexibility. The MPC controller, which takes predictions, time delay and uncertainties into account, can be designed for multi-objective management problems and for

  2. Predictability and Prediction for an Experimental Cultural Market

    Science.gov (United States)

    Colbaugh, Richard; Glass, Kristin; Ormerod, Paul

    Individuals are often influenced by the behavior of others, for instance because they wish to obtain the benefits of coordinated actions or infer otherwise inaccessible information. In such situations this social influence decreases the ex ante predictability of the ensuing social dynamics. We claim that, interestingly, these same social forces can increase the extent to which the outcome of a social process can be predicted very early in the process. This paper explores this claim through a theoretical and empirical analysis of the experimental music market described and analyzed in [1]. We propose a very simple model for this music market, assess the predictability of market outcomes through formal analysis of the model, and use insights derived through this analysis to develop algorithms for predicting market share winners, and their ultimate market shares, in the very early stages of the market. The utility of these predictive algorithms is illustrated through analysis of the experimental music market data sets [2].

  3. Predicting epileptic seizures in advance.

    Directory of Open Access Journals (Sweden)

    Negin Moghim

    Epilepsy is the second most common neurological disorder, affecting 0.6-0.8% of the world's population. In this neurological disorder, abnormal activity of the brain causes seizures, the nature of which tend to be sudden. Antiepileptic Drugs (AEDs) are used as long-term therapeutic solutions that control the condition. Of those treated with AEDs, 35% become resistant to medication. The unpredictable nature of seizures poses risks for the individual with epilepsy. It is clearly desirable to find more effective ways of preventing seizures for such patients. The automatic detection of oncoming seizures, before their actual onset, can facilitate timely intervention and hence minimize these risks. In addition, advance prediction of seizures can enrich our understanding of the epileptic brain. In this study, drawing on the body of work behind automatic seizure detection and prediction from digitised Invasive Electroencephalography (EEG) data, a prediction algorithm, ASPPR (Advance Seizure Prediction via Pre-ictal Relabeling), is described. ASPPR facilitates the learning of predictive models targeted at recognizing patterns in EEG activity that are in a specific time window in advance of a seizure. It then exploits advanced machine learning coupled with the design and selection of appropriate features from EEG signals. Results, from evaluating ASPPR independently on 21 different patients, suggest that seizures for many patients can be predicted up to 20 minutes in advance of their onset. Compared to benchmark performance represented by a mean S1-Score (harmonic mean of Sensitivity and Specificity) of 90.6% for predicting seizure onset between 0 and 5 minutes in advance, ASPPR achieves mean S1-Scores of: 96.30% for prediction between 1 and 6 minutes in advance, 96.13% for prediction between 8 and 13 minutes in advance, 94.5% for prediction between 14 and 19 minutes in advance, and 94.2% for prediction between 20 and 25 minutes in advance.

  4. Quadratic prediction of factor scores

    NARCIS (Netherlands)

    Wansbeek, T

    1999-01-01

    Factor scores are naturally predicted by means of their conditional expectation given the indicators y. Under normality this expectation is linear in y, but in general it is an unknown function of y. It is discussed that under nonnormality factor scores can be more precisely predicted by a quadratic

  5. Predictions for Excited Strange Baryons

    Energy Technology Data Exchange (ETDEWEB)

    Fernando, Ishara P.; Goity, Jose L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-04-01

    An assessment is made of predictions for excited hyperon masses which follow from flavor symmetry and consistency with a 1/N_c expansion of QCD. Such predictions are based on presently established baryonic resonances. Low-lying hyperon resonances which do not seem to fit into the proposed scheme are discussed.

  6. Climate Prediction Center - Seasonal Outlook

    Science.gov (United States)

    Prognostic discussion for the monthly outlook from the NWS Climate Prediction Center, College Park, MD: a mid-month assessment of low-frequency climate variability and its influence on the monthly-averaged climate.

  7. Dividend Predictability Around the World

    DEFF Research Database (Denmark)

    Rangvid, Jesper; Schmeling, Maik; Schrimpf, Andreas

    2014-01-01

    We show that dividend-growth predictability by the dividend yield is the rule rather than the exception in global equity markets. Dividend predictability is weaker, however, in large and developed markets where dividends are smoothed more, the typical firm is large, and volatility is lower. Our f...

  8. Dividend Predictability Around the World

    DEFF Research Database (Denmark)

    Rangvid, Jesper; Schmeling, Maik; Schrimpf, Andreas

    We show that dividend growth predictability by the dividend yield is the rule rather than the exception in global equity markets. Dividend predictability is weaker, however, in large and developed markets where dividends are smoothed more, the typical firm is large, and volatility is lower. Our f...

  9. Decadal climate prediction (project GCEP).

    Science.gov (United States)

    Haines, Keith; Hermanson, Leon; Liu, Chunlei; Putt, Debbie; Sutton, Rowan; Iwi, Alan; Smith, Doug

    2009-03-13

    Decadal prediction uses climate models forced by changing greenhouse gases, as in Intergovernmental Panel on Climate Change (IPCC) projections, but unlike longer range predictions they also require initialization with observations of the current climate. In particular, the upper-ocean heat content and circulation have a critical influence. Decadal prediction is still in its infancy and there is an urgent need to understand the important processes that determine predictability on these timescales. We have taken the first Hadley Centre Decadal Prediction System (DePreSys) and implemented it on several NERC institute compute clusters in order to study a wider range of initial condition impacts on decadal forecasting, eventually including the state of the land and cryosphere. The eScience methods are used to manage submission and output from the many ensemble model runs required to assess predictive skill. Early results suggest initial condition skill may extend for several years, even over land areas, but this depends sensitively on the definition used to measure skill, and alternatives are presented. The Grid for Coupled Ensemble Prediction (GCEP) system will allow the UK academic community to contribute to international experiments being planned to explore decadal climate predictability.

  10. Prediction during natural language comprehension

    NARCIS (Netherlands)

    Willems, R.M.; Frank, S.L.; Nijhof, A.D.; Hagoort, P.; Bosch, A.P.J. van den

    2016-01-01

    The notion of prediction is studied in cognitive neuroscience with increasing intensity. We investigated the neural basis of 2 distinct aspects of word prediction, derived from information theory, during story comprehension. We assessed the effect of entropy of next-word probability distributions as

  11. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    Science.gov (United States)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

    Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated in a four year period. Within the 50 ensemble members, which are initialized every 12 hours and are run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.
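
    A sketch of the occurrence-probability step described above: fit a bivariate normal to ensemble storm-track positions and estimate the probability that the storm centre falls in a target box. The positions here are synthetic.

      # Fit a bivariate normal to ensemble storm positions; Monte Carlo box prob.
      import numpy as np
      from scipy.stats import multivariate_normal

      rng = np.random.default_rng(3)
      # 50 ensemble members' predicted storm positions (lon, lat).
      positions = rng.normal(loc=[10.0, 52.0], scale=[2.0, 1.0], size=(50, 2))

      mean = positions.mean(axis=0)
      cov = np.cov(positions, rowvar=False)
      dist = multivariate_normal(mean=mean, cov=cov)

      # Monte Carlo estimate of P(storm centre in box 8-12E, 51-53N).
      samples = dist.rvs(size=100_000, random_state=4)
      inside = ((samples[:, 0] > 8) & (samples[:, 0] < 12) &
                (samples[:, 1] > 51) & (samples[:, 1] < 53))
      print(inside.mean())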

  12. Psychometric prediction of penitentiary recidivism.

    Science.gov (United States)

    Medina García, Pedro M; Baños Rivera, Rosa M

    2016-05-01

    Attempts to predict prison recidivism based on personality have not been very successful. This study aims to provide data on recidivism prediction based on the scores on a personality questionnaire. For this purpose, a predictive model combining the actuarial procedure with a posteriori probability was developed, consisting of the probabilistic calculation of the effective verification of the event once it has already occurred. The Cuestionario de Personalidad Situacional (CPS; Fernández, Seisdedos, & Mielgo, 1998) was applied to 978 male inmates classified as recidivists or non-recidivists. High predictive power was achieved, with an area under the curve (AUC) of 0.85 (p < .001; SE = 0.012; 95% CI [0.826, 0.873]). The answers to the CPS items made it possible to properly discriminate 77.3% of the participants. These data indicate the important role of the personality as a key factor in understanding delinquency and predicting recidivism.

  13. Predictive Biomarkers for Asthma Therapy.

    Science.gov (United States)

    Medrek, Sarah K; Parulekar, Amit D; Hanania, Nicola A

    2017-09-19

    Asthma is a heterogeneous disease characterized by multiple phenotypes. Treatment of patients with severe disease can be challenging. Predictive biomarkers are measurable characteristics that reflect the underlying pathophysiology of asthma and can identify patients that are likely to respond to a given therapy. This review discusses current knowledge regarding predictive biomarkers in asthma. Recent trials evaluating biologic therapies targeting IgE, IL-5, IL-13, and IL-4 have utilized predictive biomarkers to identify patients who might benefit from treatment. Other work has suggested that using composite biomarkers may offer enhanced predictive capabilities in tailoring asthma therapy. Multiple biomarkers including sputum eosinophil count, blood eosinophil count, fractional concentration of nitric oxide in exhaled breath (FeNO), and serum periostin have been used to identify which patients will respond to targeted asthma medications. Further work is needed to integrate predictive biomarkers into clinical practice.

  14. Female song rate and structure predict reproductive success in a socially monogamous bird.

    Directory of Open Access Journals (Sweden)

    Dianne Heather Brunton

    2016-03-01

    Bird song is commonly regarded as a male trait that has evolved through sexual selection. However, recent research has prompted a re-evaluation of this view by demonstrating that female song is an ancestral and phylogenetically widespread trait. Species with female song provide opportunities to study selective pressures and mechanisms specific to females within the wider context of social competition. We investigated the relationship between reproductive success and female song performance in the New Zealand bellbird (Anthornis melanura), a passerine resident year round in New Zealand temperate forests. We monitored breeding behavior and song over three years on Tiritiri Matangi Island. Female bellbirds contributed significantly more towards parental care than males (solely incubating young and provisioning chicks at more than twice the rate of males). Female song rate in the vicinity of the nest was higher than that of males during incubation and chick-rearing stages but similar during early-nesting and post-breeding stages. Using GLMs, we found that female song rates during both incubation and chick-rearing stages strongly predicted the number of fledged chicks. However, male song rate and male and female chick provisioning rates had no effect on fledging success. Two measures of female song complexity (the number of syllable types and the number of transitions between different syllable types) were also good predictors of breeding success (GLM on PC scores). In contrast, song duration, the total number of syllables, and the number of ‘stutter’ syllables per song were not correlated with fledging success. It is unclear why male song rate was not associated with reproductive success and we speculate that extra-pair paternity might play a role. While we have previously demonstrated that female bellbird song is important in intrasexual interactions, we clearly demonstrate here that female song predicts reproductive success. These results, with others

  15. Are abrupt climate changes predictable?

    Science.gov (United States)

    Ditlevsen, Peter

    2013-04-01

    It is taken for granted that the limited predictability in the initial value problem, the weather prediction, and the predictability of the statistics are two distinct problems. Lorenz (1975) dubbed this predictability of the first and the second kind respectively. Predictability of the first kind in a chaotic dynamical system is limited due to the well-known critical dependence on initial conditions. Predictability of the second kind is possible in an ergodic system, where either the dynamics is known and the phase space attractor can be characterized by simulation or the system can be observed for such long times that the statistics can be obtained from temporal averaging, assuming that the attractor does not change in time. For the climate system the distinction between predictability of the first and the second kind is fuzzy. This difficulty in distinction between predictability of the first and of the second kind is related to the lack of scale separation between fast and slow components of the climate system. The non-linear nature of the problem furthermore opens the possibility of multiple attractors, or multiple quasi-steady states. As the ice-core records show, the climate has been jumping between different quasi-stationary climates, stadials and interstadials, through the Dansgaard-Oeschger events. Such a jump happens very fast when a critical tipping point has been reached. The question is: Can such a tipping point be predicted? This is a new kind of predictability: the third kind. If the tipping point is reached through a bifurcation, where the stability of the system is governed by some control parameter, changing in a predictable way to a critical value, the tipping is predictable. If the sudden jump occurs because internal chaotic fluctuations, noise, push the system across a barrier, the tipping is as unpredictable as the triggering noise. In order to hint at an answer to this question, a careful analysis of the high temporal resolution NGRIP isotope

  16. Emerging approaches in predictive toxicology.

    Science.gov (United States)

    Zhang, Luoping; McHale, Cliona M; Greene, Nigel; Snyder, Ronald D; Rich, Ivan N; Aardema, Marilyn J; Roy, Shambhu; Pfuhler, Stefan; Venkatactahalam, Sundaresan

    2014-12-01

    Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described. © 2014 Wiley Periodicals, Inc.

  17. Earthquake prediction by Kina Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of man's earliest desires, and scientists have long worked hard to predict earthquakes. The results of these efforts can generally be divided into two methods of prediction: (1) statistical methods and (2) empirical methods. In the first, earthquakes are predicted using statistics and probabilities, while the second utilizes a variety of precursors for earthquake prediction. The latter method is time consuming and more costly. However, the results of neither method have fully satisfied us up to now. In this paper a new method entitled the 'Kiana Method' is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method the electrical and magnetic precursors are measured in an area. Then, the time and the magnitude of a future earthquake are calculated using electrical formulas, in particular those for electrical capacitors. In this method, daily measurement of electrical resistance in an area makes clear whether or not the area is capable of earthquake occurrence in the future. If the result shows a positive sign, then the occurrence time and the magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  18. Collective motion of predictive swarms.

    Directory of Open Access Journals (Sweden)

    Nathaniel Rupprecht

    Theoretical models of populations and swarms typically start with the assumption that the motion of agents is governed by local stimuli. However, an intelligent agent, with some understanding of the laws that govern its habitat, can anticipate the future, and make predictions to gather resources more efficiently. Here we study a specific model of this kind, where agents aim to maximize their consumption of a diffusing resource by attempting to predict the future of the resource field and the actions of other agents. Once the agents make a prediction, they are attracted to move towards regions that have, and will have, denser resources. We find that the further the agents attempt to see into the future, the more their attempts at prediction fail, and the less resources they consume. We also study the case where predictive agents compete against non-predictive agents and find the predictors perform better than the non-predictors only when their relative numbers are very small. We conclude that predictivity pays off either when the predictors do not see too far into the future or the number of predictors is small.

  19. Dividend Predictability Around the World

    DEFF Research Database (Denmark)

    Rangvid, Jesper; Schrimpf, Andreas

    The common perception in the literature, mainly based on U.S. data, is that current dividend yields are uninformative about future dividends. We show that this finding changes substantially when looking at a broad international panel of countries, as aggregate dividend growth rates are found...... that in countries where the quality of institutions is high, dividend predictability is weaker. These findings indicate that the apparent lack of dividend predictability in the U.S. does not, in general, extend to other countries. Rather, dividend predictability is driven by cross-country differences in firm...

  20. The Theory of Linear Prediction

    CERN Document Server

    Vaidyanathan, PP

    2007-01-01

    Linear prediction theory has had a profound impact in the field of digital signal processing. Although the theory dates back to the early 1940s, its influence can still be seen in applications today. The theory is based on very elegant mathematics and leads to many beautiful insights into statistical signal processing. Although prediction is only a part of the more general topics of linear estimation, filtering, and smoothing, this book focuses on linear prediction. This has enabled detailed discussion of a number of issues that are normally not found in texts. For example, the theory of vecto
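
    As a flavour of the material, the classic forward linear predictor can be obtained by solving the autocorrelation (normal) equations; the sketch below does so for a synthetic AR(2) signal.

      # Order-p linear prediction via the Toeplitz normal equations.
      import numpy as np
      from scipy.linalg import solve_toeplitz

      rng = np.random.default_rng(5)
      n, p = 5000, 2
      x = np.zeros(n)
      for t in range(2, n):  # AR(2): x_t = 1.5 x_{t-1} - 0.7 x_{t-2} + noise
          x[t] = 1.5 * x[t - 1] - 0.7 * x[t - 2] + rng.normal()

      # Biased autocorrelation estimates r[0..p].
      r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])

      # Solve R a = r[1:] with R the p x p Toeplitz autocorrelation matrix.
      a = solve_toeplitz(r[:p], r[1:p + 1])
      print("estimated coefficients:", a)               # ~ [1.5, -0.7]
      print("prediction:", np.dot(a, x[-1:-p - 1:-1]))  # next-sample prediction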

  1. Practical aspects of geological prediction

    International Nuclear Information System (INIS)

    Mallio, W.J.; Peck, J.H.

    1981-01-01

    Nuclear waste disposal requires that geology be a predictive science. The prediction of future events rests on (1) recognizing the periodicity of geologic events; (2) defining a critical dimension of effect, such as the area of a drainage basin, the length of a fault trace, etc.; and (3) using our understanding of active processes to project the frequency and magnitude of future events in the light of geological principles. Of importance to nuclear waste disposal are longer term processes such as continental denudation and removal of materials by glacial erosion. Constant testing of projections will allow the practical limits of predicting geological events to be defined. 11 refs

  2. Adaptive filtering prediction and control

    CERN Document Server

    Goodwin, Graham C

    2009-01-01

    Preface; 1. Introduction to Adaptive Techniques; Part 1. Deterministic Systems: 2. Models for Deterministic Dynamical Systems; 3. Parameter Estimation for Deterministic Systems; 4. Deterministic Adaptive Prediction; 5. Control of Linear Deterministic Systems; 6. Adaptive Control of Linear Deterministic Systems; Part 2. Stochastic Systems: 7. Optimal Filtering and Prediction; 8. Parameter Estimation for Stochastic Dynamic Systems; 9. Adaptive Filtering and Prediction; 10. Control of Stochastic Systems; 11. Adaptive Control of Stochastic Systems; Appendices: A. A Brief Review of Some Results from Systems Theory; B. A Summary o

  3. Predicting emergency diesel starting performance

    International Nuclear Information System (INIS)

    DeBey, T.M.

    1989-01-01

    The US Department of Energy effort to extend the operational lives of commercial nuclear power plants has examined methods for predicting the performance of specific equipment. This effort focuses on performance prediction as a means for reducing equipment surveillance, maintenance, and outages. Realizing these goals will result in nuclear plants that are more reliable, have lower maintenance costs, and have longer lives. This paper describes a monitoring system that has been developed to predict starting performance in emergency diesels. A prototype system has been built and tested on an engine at Sandia National Laboratories. 2 refs

  4. Defining a glycated haemoglobin (HbA1c) level that predicts increased risk of penile implant infection.

    Science.gov (United States)

    Habous, Mohamad; Tal, Raanan; Tealab, Alaa; Soliman, Tarek; Nassar, Mohammed; Mekawi, Zenhom; Mahmoud, Saad; Abdelwahab, Osama; Elkhouly, Mohamed; Kamr, Hatem; Remeah, Abdallah; Binsaleh, Saleh; Ralph, David; Mulhall, John

    2018-02-01

    To re-evaluate the role of diabetes mellitus (DM) as a risk factor for penile implant infection by exploring the association between glycated haemoglobin (HbA1c) levels and penile implant infection rates and to define a threshold value that predicts implant infection. We conducted a multicentre prospective study including all patients undergoing penile implant surgery between 2009 and 2015. Preoperative, perioperative and postoperative management were identical for the entire cohort. Univariate analysis was performed to define predictors of implant infection. The HbA1c levels were analysed as continuous variables and sequential analysis was conducted using 0.5% increments to define a threshold level predicting implant infection. Multivariable analysis was performed with the following factors entered in the model: DM, HbA1C level, patient age, implant type, number of vascular risk factors (VRFs), presence of Peyronie's disease (PD), body mass index (BMI), and surgeon volume. A receiver operating characteristic (ROC) curve was generated to define the optimal HbA1C threshold for infection prediction. In all, 902 implant procedures were performed over the study period. The mean patient age was 56.6 years. The mean HbA1c level was 8.0%, with 81% of men having a HbA1c level of >6%. In all, 685 (76%) implants were malleable and 217 (24%) were inflatable devices; 302 (33.5%) patients also had a diagnosis of PD. The overall infection rate was 8.9% (80/902). Patients who had implant infection had significantly higher mean HbA1c levels, 9.5% vs 7.8% (P HbA1c level, we found infection rates were: 1.3% with HbA1c level of 9.5% (P HbA1c level, whilst a high-volume surgeon had a protective effect and was associated with a reduced infection risk. Using ROC analysis, we determined that a HbA1c threshold level of 8.5% predicted infection with a sensitivity of 80% and a specificity of 65%. Uncontrolled DM is associated with increased risk of infection after penile implant surgery
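
    A sketch of the ROC-threshold step reported above, on simulated data loosely mirroring the quoted group means (9.5% vs 7.8%); none of the numbers reproduce the study cohort.

      # Choosing a biomarker threshold from a ROC curve on simulated HbA1c data.
      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(7)
      # Simulated HbA1c: infected cases shifted upwards.
      hba1c = np.concatenate([rng.normal(7.8, 1.2, 820), rng.normal(9.5, 1.2, 80)])
      infected = np.concatenate([np.zeros(820), np.ones(80)])

      fpr, tpr, thresholds = roc_curve(infected, hba1c)
      print("AUC:", round(roc_auc_score(infected, hba1c), 2))

      # First (largest) threshold achieving at least 80% sensitivity.
      i = np.argmax(tpr >= 0.80)
      print("threshold:", round(thresholds[i], 2),
            "sensitivity:", round(tpr[i], 2),
            "specificity:", round(1 - fpr[i], 2))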

  5. Predictive Models and Computational Embryology

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  6. Fatigue life prediction in composites

    CSIR Research Space (South Africa)

    Huston, RJ

    1994-01-01

    Because of the relatively large number of possible failure mechanisms in fibre reinforced composite materials, the prediction of fatigue life in a component is not a simple process. Several mathematical and statistical models have been proposed...

  7. Trading network predicts stock price.

    Science.gov (United States)

    Sun, Xiao-Qian; Shen, Hua-Wei; Cheng, Xue-Qi

    2014-01-16

    Stock price prediction is an important and challenging problem for studying financial markets. Existing studies are mainly based on the time series of stock price or the operation performance of listed company. In this paper, we propose to predict stock price based on investors' trading behavior. For each stock, we characterize the daily trading relationship among its investors using a trading network. We then classify the nodes of trading network into three roles according to their connectivity pattern. Strong Granger causality is found between stock price and trading relationship indices, i.e., the fraction of trading relationship among nodes with different roles. We further predict stock price by incorporating these trading relationship indices into a neural network based on time series of stock price. Experimental results on 51 stocks in two Chinese Stock Exchanges demonstrate the accuracy of stock price prediction is significantly improved by the inclusion of trading relationship indices.
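
    A sketch of the Granger-causality step on synthetic series, where a stand-in "trading relationship index" leads price; the network construction itself is omitted and all names are illustrative.

      # Granger test: does yesterday's relationship index help predict price?
      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      rng = np.random.default_rng(8)
      n = 300
      index = rng.normal(size=n)  # e.g. fraction of ties between node roles
      price = np.zeros(n)
      for t in range(1, n):       # price change led by yesterday's index
          price[t] = 0.5 * price[t - 1] + 0.4 * index[t - 1] + rng.normal(scale=0.1)

      # Column 0 is the series being predicted; column 1 the candidate cause.
      # The returned dict maps each lag to its test statistics.
      res = grangercausalitytests(np.column_stack([price, index]), maxlag=3)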

  8. Prediction based on mean subset

    DEFF Research Database (Denmark)

    Øjelund, Henrik; Brown, P. J.; Madsen, Henrik

    2002-01-01

    Shrinkage methods have traditionally been applied in prediction problems. In this article we develop a shrinkage method (mean subset) that forms an average of regression coefficients from individual subsets of the explanatory variables. A Bayesian approach is taken to derive an expression of how...... the coefficient vectors from each subset should be weighted. It is not computationally feasible to calculate the mean subset coefficient vector for larger problems, and thus we suggest an algorithm to find an approximation to the mean subset coefficient vector. In a comprehensive Monte Carlo simulation study......, it is found that the proposed mean subset method has better prediction performance than prediction based on the best subset method, and in some settings is also better than the ridge regression and lasso methods. The conclusions drawn from the Monte Carlo study are corroborated in an example in which prediction......
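
    A toy version of the subset-averaging idea: fit OLS on many random subsets of the predictors and average the zero-padded coefficient vectors. Equal weights are used for simplicity, whereas the article derives Bayesian weights.

      # Equal-weight average of OLS coefficients over random predictor subsets.
      import numpy as np

      rng = np.random.default_rng(6)
      n, p = 100, 8
      X = rng.normal(size=(n, p))
      beta_true = np.array([2.0, -1.0, 0.5, 0, 0, 0, 0, 0])
      y = X @ beta_true + rng.normal(scale=0.5, size=n)

      n_subsets, subset_size = 200, 3
      beta_mean = np.zeros(p)
      for _ in range(n_subsets):
          idx = rng.choice(p, size=subset_size, replace=False)
          coef, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
          beta_mean[idx] += coef / n_subsets  # zero-padded, equal-weight average

      # Note the shrinkage toward zero: each coordinate only enters a fraction
      # of the subsets, so the average is pulled down relative to beta_true.
      print(np.round(beta_mean, 2))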

  9. EPRI MOV performance prediction program

    International Nuclear Information System (INIS)

    Hosler, J.F.; Damerell, P.S.; Eidson, M.G.; Estep, N.E.

    1994-01-01

    An overview of the EPRI Motor-Operated Valve (MOV) Performance Prediction Program is presented. The objectives of this Program are to better understand the factors affecting the performance of MOVs and to develop and validate methodologies to predict MOV performance. The Program involves valve analytical modeling, separate-effects testing to refine the models, and flow-loop and in-plant MOV testing to provide a basis for model validation. The ultimate product of the Program is an MOV Performance Prediction Methodology applicable to common gate, globe, and butterfly valves. The methodology predicts thrust and torque requirements at design-basis flow and differential pressure conditions, assesses the potential for gate valve internal damage, and provides test methods to quantify potential variations in actuator output thrust with loading condition. Key findings and their potential impact on MOV design and engineering application are summarized.

  10. In silico prediction of genotoxicity.

    Science.gov (United States)

    Wichard, Jörg D

    2017-08-01

    The in silico prediction of genotoxicity has made considerable progress during the last years. The main driver for the pharmaceutical industry is the ICH M7 guideline about the assessment of DNA reactive impurities. An important component of this guideline is the use of in silico models as an alternative approach to experimental testing. The in silico prediction of genotoxicity provides an established and accepted method that defines the first step in the assessment of DNA reactive impurities. This was made possible by the growing amount of reliable Ames screening data, the attempts to understand the activity pathways and the subsequent development of computer-based prediction systems. This paper gives an overview of how the in silico prediction of genotoxicity is performed under the ICH M7 guideline. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. New Tool to Predict Glaucoma

    Science.gov (United States)

    Glaucoma can be difficult to detect and diagnose. Measurement ...

  12. Dynamical Predictability of Monthly Means.

    Science.gov (United States)

    Shukla, J.

    1981-12-01

    We have attempted to determine the theoretical upper limit of dynamical predictability of monthly means for prescribed nonfluctuating external forcings. We have extended the concept of 'classical' predictability, which primarily refers to the lack of predictability due mainly to the instabilities of synoptic-scale disturbances, to the predictability of time averages, which are determined by the predictability of low-frequency planetary waves. We have carried out 60-day integrations of a global general circulation model with nine different initial conditions but identical boundary conditions of sea surface temperature, snow, sea ice and soil moisture. Three of these initial conditions are the observed atmospheric conditions on 1 January of 1975, 1976 and 1977. The other six initial conditions are obtained by superimposing over the observed initial conditions a random perturbation comparable to the errors of observation. The root-mean-square (rms) error of the random perturbations at all the grid points and all the model levels is 3 m s^-1 in the u and v components of wind. The rms vector wind error between the observed initial conditions is >15 m s^-1. It is hypothesized that, for a given averaging period, if the rms error among the time averages predicted from largely different initial conditions becomes comparable to the rms error among the time averages predicted from randomly perturbed initial conditions, the time averages are dynamically unpredictable. We have carried out an analysis of variance to compare the variability, among the three groups, due to largely different initial conditions, and within each group due to random perturbations. It is found that the variances among the first 30-day means, predicted from largely different initial conditions, are significantly different from the variances due to random perturbations in the initial conditions, whereas the variances among 30-day means for days 31-60 are not distinguishable from the variances due to random initial

  13. Predictive coding in Agency Detection

    DEFF Research Database (Denmark)

    Andersen, Marc Malmdorf

    2017-01-01

    Agency detection is a central concept in the cognitive science of religion (CSR). Experimental studies, however, have so far failed to lend support to some of the most common predictions that follow from current theories on agency detection. In this article, I argue that predictive coding, a highly...... promising new framework for understanding perception and action, may solve pending theoretical inconsistencies in agency detection research, account for the puzzling experimental findings mentioned above, and provide hypotheses for future experimental testing. Predictive coding explains how the brain......, unbeknownst to consciousness, engages in sophisticated Bayesian statistics in an effort to constantly predict the hidden causes of sensory input. My fundamental argument is that most false positives in agency detection can be seen as the result of top-down interference in a Bayesian system generating high...

  14. Time-predictable Stack Caching

    DEFF Research Database (Denmark)

    Abbaspourseyedi, Sahar

    completely. Thus, in systems with hard deadlines the worst-case execution time (WCET) of the real-time software running on them needs to be bounded. Modern architectures use features such as pipelining and caches for improving the average performance. These features, however, make the WCET analysis more...... addresses, provides an opportunity to predict and tighten the WCET of accesses to data in caches. In this thesis, we introduce the time-predictable stack cache design and implementation within a time-predictable processor. We introduce several optimizations to our design for tightening the WCET while...... keeping the timepredictability of the design intact. Moreover, we provide a solution for reducing the cost of context switching in a system using the stack cache. In design of these caches, we use custom hardware and compiler support for delivering time-predictable stack data accesses. Furthermore...

  15. Age at disease onset and peak ammonium level rather than interventional variables predict the neurological outcome in urea cycle disorders.

    Science.gov (United States)

    Posset, Roland; Garcia-Cazorla, Angeles; Valayannopoulos, Vassili; Teles, Elisa Leão; Dionisi-Vici, Carlo; Brassier, Anaïs; Burlina, Alberto B; Burgard, Peter; Cortès-Saladelafont, Elisenda; Dobbelaere, Dries; Couce, Maria L; Sykut-Cegielska, Jolanta; Häberle, Johannes; Lund, Allan M; Chakrapani, Anupam; Schiff, Manuel; Walter, John H; Zeman, Jiri; Vara, Roshni; Kölker, Stefan

    2016-09-01

    Patients with urea cycle disorders (UCDs) have an increased risk of neurological disease manifestation. Determining the effect of diagnostic and therapeutic interventions on the neurological outcome. Evaluation of baseline, regular follow-up and emergency visits of 456 UCD patients prospectively followed between 2011 and 2015 by the E-IMD patient registry. About two-thirds of UCD patients remained asymptomatic until age 12 days [i.e. the median age at diagnosis of patients identified by newborn screening (NBS)] suggesting a potential benefit of NBS. In fact, NBS lowered the age at diagnosis in patients with late onset of symptoms (>28 days), and a trend towards improved long-term neurological outcome was found for patients with argininosuccinate synthetase and lyase deficiency as well as argininemia identified by NBS. Three to 17 different drug combinations were used for maintenance therapy, but superiority of any single drug or specific drug combination above other combinations was not demonstrated. Importantly, non-interventional variables of disease severity, such as age at disease onset and peak ammonium level of the initial hyperammonemic crisis (cut-off level: 500 μmol/L) best predicted the neurological outcome. Promising results of NBS for late onset UCD patients are reported and should be re-evaluated in a larger and more advanced age group. However, non-interventional variables affect the neurological outcome of UCD patients. Available evidence-based guideline recommendations are currently heterogeneously implemented into practice, leading to a high variability of drug combinations that hamper our understanding of optimised long-term and emergency treatment.

  16. Innovative biomarkers for predicting type 2 diabetes mellitus: relevance to dietary management of frailty in older adults.

    Science.gov (United States)

    Ikwuobe, John; Bellary, Srikanth; Griffiths, Helen R

    2016-06-01

    Type 2 diabetes mellitus (T2DM) increases in prevalence in the elderly. There is evidence for significant muscle loss and accelerated cognitive impairment in older adults with T2DM; these comorbidities are critical features of frailty. In the early stages of T2DM, insulin sensitivity can be improved by a "healthy" diet. Management of insulin resistance by diet in people over 65 years of age should be carefully re-evaluated because of the risk for falling due to hypoglycaemia. To date, an optimal dietary programme for older adults with insulin resistance and T2DM has not been described. The use of biomarkers to identify those at risk for T2DM will enable clinicians to offer early dietary advice that will delay onset of disease and of frailty. Here we have used an in silico literature search for putative novel biomarkers of T2DM risk and frailty. We suggest that plasma bilirubin, plasma, urinary DPP4-positive microparticles and plasma pigment epithelium-derived factor merit further investigation as predictive biomarkers for T2DM and frailty risk in older adults. Bilirubin is screened routinely in clinical practice. Measurement of specific microparticle frequency in urine is less invasive than a blood sample so is a good choice for biomonitoring. Future studies should investigate whether early dietary changes, such as increased intake of whey protein and micronutrients that improve muscle function and insulin sensitivity, affect biomarkers and can reduce the longer term complication of frailty in people at risk for T2DM.

  17. Endoscopic and Histological Findings Are Predicted by Fecal Calprotectin in Acute Intestinal Graft-Versus-Host-Disease.

    Science.gov (United States)

    Adam, Birgit; Koldehoff, Michael; Ditschkowski, Markus; Gromke, Tanja; Hlinka, Michal; Trenschel, Rudolf; Kordeals, Lambros; Steckel, Nina K; Beelen, Dietrich W; Liebregts, Tobias

    2016-07-01

    Gastrointestinal graft-versus-host-disease (GI-GVHD) is a major cause of nonrelapse mortality after hematopoietic stem cell transplantation (HSCT), necessitating endoscopic examinations and biopsies for diagnosis. Fecal calprotectin (CPT) has been widely used in gastrointestinal inflammation, but comprehensive data in GI-GVHD are lacking. We aimed to identify an association of CPT with endoscopic findings, mucosal damage and symptoms for diagnosing and monitoring acute GI-GVHD. Symptoms were prospectively evaluated in 110 consecutive HSCT recipients by standardized questionnaires and the Bristol Stool Scale (BSS). CPT was assayed by ELISA. Symptom assessment and CPT were performed weekly and at the onset of first symptoms. GVHD was diagnosed according to the Glucksberg criteria and by endoscopic biopsies. Patients with GI-GVHD received standard high-dose corticosteroid therapy, and follow-up CPT and symptom evaluation were performed after 28 days. Patients not responding to steroid treatment were re-evaluated by colonoscopy. GI-GVHD was diagnosed in 40 patients. Twelve patients with GI symptoms and CMV colitis and 24 patients with isolated skin GVHD were included as control subjects. CPT was significantly higher in GI-GVHD compared to skin GVHD and CMV colitis. Endoscopic findings, histological grading, abdominal cramps, diarrhea, urgency and BSS correlated with CPT. At follow-up, CPT correlated with abdominal cramps, diarrhea, urgency and BSS. In steroid-refractory patients, the CPT level was still significantly associated with the severity of mucosal damage. CPT predicts endoscopic and histological findings in GI-GVHD and correlates with lower GI symptoms. It makes it possible to discriminate GVHD from CMV colitis and to monitor therapeutic success.

  18. NASA/MSFC prediction techniques

    International Nuclear Information System (INIS)

    Smith, R.E.

    1987-01-01

    The NASA/MSFC method of forecasting is more formal than NOAA's. The data are smoothed by the Lagrangian method and linear regression prediction techniques are used. The solar activity period is fixed at 11 years--the mean period of all previous cycles. Interestingly, the present prediction for the time of the next solar minimum is February or March of 1987, which, within the uncertainties of the two methods, can be taken to be the same as the NOAA result.

  19. Prediction of molecular crystal structures

    International Nuclear Information System (INIS)

    Beyer, Theresa

    2001-01-01

    The ab initio prediction of molecular crystal structures is a scientific challenge. Reliability of first-principle prediction calculations would show a fundamental understanding of crystallisation. Crystal structure prediction is also of considerable practical importance, as different crystalline arrangements of the same molecule in the solid state (polymorphs) are likely to have different physical properties. A method of crystal structure prediction based on lattice energy minimisation has been developed in this work. The choice of the intermolecular potential and of the molecular model is crucial for the results of such studies and both of these criteria have been investigated. An empirical atom-atom repulsion-dispersion potential for carboxylic acids has been derived and applied in a crystal structure prediction study of formic, benzoic and the polymorphic system of tetrolic acid. As many experimental crystal structure determinations at different temperatures are available for the polymorphic system of paracetamol (acetaminophen), the influence of the variations of the molecular model on the crystal structure lattice energy minima has also been studied. The general problem of prediction methods based on the assumption that the experimental thermodynamically stable polymorph corresponds to the global lattice energy minimum is that more hypothetical low lattice energy structures are found within a few kJ mol⁻¹ of the global minimum than are likely to be experimentally observed polymorphs. This is illustrated by the results for molecule I, 3-oxabicyclo(3.2.0)hepta-1,4-diene, studied for the first international blind test for small organic crystal structures organised by the Cambridge Crystallographic Data Centre (CCDC) in May 1999. To reduce the number of predicted polymorphs, additional factors beyond thermodynamic criteria have to be considered. Therefore the elastic constants and vapour growth morphologies have been calculated for the lowest lattice energy

  20. Does Carbon Dioxide Predict Temperature?

    OpenAIRE

    Mytty, Tuukka

    2013-01-01

    Does carbon dioxide predict temperature? No it does not, in the time period of 1880-2004 with the carbon dioxide and temperature data used in this thesis. According to the Intergovernmental Panel on Climate Change (IPCC), carbon dioxide is the most important factor in raising the global temperature. Therefore, it is reasonable to assume that carbon dioxide truly predicts temperature. Because this paper uses observational data, it has to be kept in mind that no causality interpretation can be ma...

  1. Prediction of molecular crystal structures

    Energy Technology Data Exchange (ETDEWEB)

    Beyer, Theresa

    2001-07-01

    The ab initio prediction of molecular crystal structures is a scientific challenge. Reliability of first-principle prediction calculations would show a fundamental understanding of crystallisation. Crystal structure prediction is also of considerable practical importance, as different crystalline arrangements of the same molecule in the solid state (polymorphs) are likely to have different physical properties. A method of crystal structure prediction based on lattice energy minimisation has been developed in this work. The choice of the intermolecular potential and of the molecular model is crucial for the results of such studies and both of these criteria have been investigated. An empirical atom-atom repulsion-dispersion potential for carboxylic acids has been derived and applied in a crystal structure prediction study of formic, benzoic and the polymorphic system of tetrolic acid. As many experimental crystal structure determinations at different temperatures are available for the polymorphic system of paracetamol (acetaminophen), the influence of the variations of the molecular model on the crystal structure lattice energy minima has also been studied. The general problem of prediction methods based on the assumption that the experimental thermodynamically stable polymorph corresponds to the global lattice energy minimum is that more hypothetical low lattice energy structures are found within a few kJ mol⁻¹ of the global minimum than are likely to be experimentally observed polymorphs. This is illustrated by the results for molecule I, 3-oxabicyclo(3.2.0)hepta-1,4-diene, studied for the first international blind test for small organic crystal structures organised by the Cambridge Crystallographic Data Centre (CCDC) in May 1999. To reduce the number of predicted polymorphs, additional factors beyond thermodynamic criteria have to be considered. Therefore the elastic constants and vapour growth morphologies have been calculated for the lowest lattice energy

  2. MCMC exploration of supermassive black hole binary inspirals

    International Nuclear Information System (INIS)

    Cornish, Neil J; Porter, Edward K

    2006-01-01

    The Laser Interferometer Space Antenna will be able to detect the inspiral and merger of supermassive black hole binaries (SMBHBs) anywhere in the universe. Standard matched filtering techniques can be used to detect and characterize these systems. Markov Chain Monte Carlo (MCMC) methods are ideally suited to this and other LISA data analysis problems as they are able to efficiently handle models with large dimensions. Here we compare the posterior parameter distributions derived by an MCMC algorithm with the distributions predicted by the Fisher information matrix. We find excellent agreement for the extrinsic parameters, while the Fisher matrix slightly overestimates errors in the intrinsic parameters
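
    The comparison described here lends itself to a compact illustration. The sketch below (Python, a toy linear model; not the authors' LISA analysis) contrasts the 1-sigma errors predicted by the Fisher information matrix with those recovered by a deliberately simple Metropolis sampler. For a model that is linear in its parameters with Gaussian noise, the two should agree up to sampling noise.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 50)
        sigma = 0.1
        true = np.array([1.5, -0.3])                      # slope a, intercept b
        data = true[0] * t + true[1] + rng.normal(0.0, sigma, t.size)

        def log_like(p):
            r = data - (p[0] * t + p[1])
            return -0.5 * np.sum((r / sigma) ** 2)

        # Fisher matrix for Gaussian noise: F = J^T J / sigma^2, J = d(model)/d(params).
        J = np.column_stack([t, np.ones_like(t)])
        F = J.T @ J / sigma**2
        fisher_err = np.sqrt(np.diag(np.linalg.inv(F)))

        # Plain Metropolis sampler; any MCMC package would do as well.
        chain = np.empty((20000, 2))
        p, lp = true.copy(), log_like(true)
        for i in range(chain.shape[0]):
            q = p + rng.normal(0.0, 0.02, 2)
            lq = log_like(q)
            if np.log(rng.random()) < lq - lp:
                p, lp = q, lq
            chain[i] = p
        mcmc_err = chain[5000:].std(axis=0)               # discard burn-in

        print("Fisher 1-sigma:", fisher_err)
        print("MCMC   1-sigma:", mcmc_err)                # matches for this linear model

    The disagreement reported in the abstract for intrinsic parameters arises only once the signal depends nonlinearly on the parameters, which this toy model deliberately avoids.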

  3. Prediction of interannual climate variations

    International Nuclear Information System (INIS)

    Shukla, J.

    1993-01-01

    It has been known for some time that the behavior of the short-term fluctuations of the earth's atmosphere resembles that of a chaotic non-linear dynamical system, and that the day-to-day weather cannot be predicted beyond a few weeks. However, it has also been found that the interactions of the atmosphere with the underlying oceans and the land surfaces can produce fluctuations whose time scales are much longer than the limits of deterministic prediction of weather. It is, therefore, natural to ask whether it is possible that the seasonal and longer time averages of climate fluctuations can be predicted with sufficient skill to be beneficial for social and economic applications, even though the details of day-to-day weather cannot be predicted beyond a few weeks. The main objective of the workshop was to address this question by assessing the current state of knowledge on predictability of seasonal and interannual climate variability and to investigate various possibilities for its prediction. (orig./KW)

  4. Postprocessing for Air Quality Predictions

    Science.gov (United States)

    Delle Monache, L.

    2017-12-01

    In recent years, air quality (AQ) forecasting has made significant progress towards better predictions with the goal of protecting the public from harmful pollutants. This progress is the result of improvements in weather and chemical transport models, their coupling, and more accurate emission inventories (e.g., with the development of new algorithms to account in near real-time for fires). Nevertheless, AQ predictions are still affected at times by significant biases which stem from limitations in both weather and chemistry transport models. Those are the result of numerical approximations and the poor representation (and understanding) of important physical and chemical processes. Moreover, although the quality of emission inventories has been significantly improved, they are still one of the main sources of uncertainty in AQ predictions. For operational real-time AQ forecasting, a significant portion of these biases can be reduced with the implementation of postprocessing methods. We will review some of the techniques that have been proposed to reduce both systematic and random errors of AQ predictions, and improve the correlation between predictions and observations of ground-level ozone and surface particulate matter less than 2.5 µm in diameter (PM2.5). These methods, which can be applied to both deterministic and probabilistic predictions, include simple bias-correction techniques, corrections inspired by the Kalman filter, regression methods, and the more recently developed analog-based algorithms. These approaches will be compared and contrasted, and the strengths and weaknesses of each will be discussed.
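
    Of the postprocessing techniques listed, the Kalman-filter-inspired bias correction is the simplest to sketch. The following is a minimal illustration under assumed inputs; the gain `ratio` and the arrays are placeholders, not values from any operational AQ system.

        import numpy as np

        def kf_bias_correct(preds, obs, ratio=0.1):
            """Recursively estimate the running forecast bias and subtract it.

            `ratio` acts like a fixed Kalman gain; its value here is an
            illustrative tuning assumption.
            """
            bias = 0.0
            corrected = np.empty(len(preds), dtype=float)
            for i, (p, o) in enumerate(zip(preds, obs)):
                corrected[i] = p - bias                        # apply yesterday's bias estimate
                bias = (1.0 - ratio) * bias + ratio * (p - o)  # then update it
            return corrected

        # Toy usage: a forecast that runs ~5 units too high is pulled down over time.
        print(kf_bias_correct(np.array([50.0, 55.0, 60.0]), np.array([45.0, 50.0, 55.0])))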

  5. Predictive value of diminutive colonic adenoma trial: the PREDICT trial.

    Science.gov (United States)

    Schoenfeld, Philip; Shad, Javaid; Ormseth, Eric; Coyle, Walter; Cash, Brooks; Butler, James; Schindler, William; Kikendall, Walter J; Furlong, Christopher; Sobin, Leslie H; Hobbs, Christine M; Cruess, David; Rex, Douglas

    2003-05-01

    Diminutive adenomas (1-9 mm in diameter) are frequently found during colon cancer screening with flexible sigmoidoscopy (FS). This trial assessed the predictive value of these diminutive adenomas for advanced adenomas in the proximal colon. In a multicenter, prospective cohort trial, we matched 200 patients with normal FS and 200 patients with diminutive adenomas on FS for age and gender. All patients underwent colonoscopy. The presence of advanced adenomas (adenoma ≥10 mm in diameter, villous adenoma, adenoma with high-grade dysplasia, and colon cancer) and adenomas (any size) was recorded. Before colonoscopy, patients completed questionnaires about risk factors for adenomas. The prevalence of advanced adenomas in the proximal colon was similar in patients with diminutive adenomas and patients with normal FS (6% vs. 5.5%, respectively) (relative risk, 1.1; 95% confidence interval [CI], 0.5-2.6). Diminutive adenomas on FS did not accurately predict advanced adenomas in the proximal colon: sensitivity, 52% (95% CI, 32%-72%); specificity, 50% (95% CI, 49%-51%); positive predictive value, 6% (95% CI, 4%-8%); and negative predictive value, 95% (95% CI, 92%-97%). Male gender (odds ratio, 1.63; 95% CI, 1.01-2.61) was associated with an increased risk of proximal colon adenomas. Diminutive adenomas on sigmoidoscopy may not accurately predict advanced adenomas in the proximal colon.
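
    For readers less familiar with these screening statistics, the quantities reported above follow directly from a 2x2 table of test results against disease status. A generic helper is sketched below; the example counts are invented for illustration, not the trial's data.

        def diagnostic_metrics(tp, fp, fn, tn):
            """Standard screening-test metrics from a 2x2 confusion table."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),   # positive predictive value
                "npv": tn / (tn + fn),   # negative predictive value
            }

        # Hypothetical counts for illustration only:
        print(diagnostic_metrics(tp=11, fp=189, fn=10, tn=190))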

  6. Reward positivity: Reward prediction error or salience prediction error?

    Science.gov (United States)

    Heydari, Sepideh; Holroyd, Clay B

    2016-08-01

    The reward positivity is a component of the human ERP elicited by feedback stimuli in trial-and-error learning and guessing tasks. A prominent theory holds that the reward positivity reflects a reward prediction error signal that is sensitive to outcome valence, being larger for unexpected positive events relative to unexpected negative events (Holroyd & Coles, 2002). Although the theory has found substantial empirical support, most of these studies have utilized either monetary or performance feedback to test the hypothesis. However, in apparent contradiction to the theory, a recent study found that unexpected physical punishments also elicit the reward positivity (Talmi, Atkinson, & El-Deredy, 2013). The authors of this report argued that the reward positivity reflects a salience prediction error rather than a reward prediction error. To investigate this finding further, in the present study participants navigated a virtual T maze and received feedback on each trial under two conditions. In a reward condition, the feedback indicated that they would either receive a monetary reward or not and in a punishment condition the feedback indicated that they would receive a small shock or not. We found that the feedback stimuli elicited a typical reward positivity in the reward condition and an apparently delayed reward positivity in the punishment condition. Importantly, this signal was more positive to the stimuli that predicted the omission of a possible punishment relative to stimuli that predicted a forthcoming punishment, which is inconsistent with the salience hypothesis. © 2016 Society for Psychophysiological Research.

  7. Climate Prediction - NOAA's National Weather Service

    Science.gov (United States)

    Long-range climate forecasts across the U.S., from the NOAA National Weather Service climate prediction web sites.

  8. Weighted-Average Least Squares Prediction

    NARCIS (Netherlands)

    Magnus, Jan R.; Wang, Wendun; Zhang, Xinyu

    2016-01-01

    Prediction under model uncertainty is an important and difficult issue. Traditional prediction methods (such as pretesting) are based on model selection followed by prediction in the selected model, but the reported prediction and the reported prediction variance ignore the uncertainty from the

  9. Potential Predictability and Prediction Skill for Southern Peru Summertime Rainfall

    Science.gov (United States)

    WU, S.; Notaro, M.; Vavrus, S. J.; Mortensen, E.; Block, P. J.; Montgomery, R. J.; De Pierola, J. N.; Sanchez, C.

    2016-12-01

    The central Andes receive over 50% of annual climatological rainfall during the short period of January-March. This summertime rainfall exhibits strong interannual and decadal variability, including severe drought events that incur devastating societal impacts and cause agricultural communities and mining facilities to compete for limited water resources. Improved seasonal prediction skill for summertime rainfall would aid water resource planning and allocation across water-limited southern Peru. While various underlying mechanisms have been proposed by past studies for the drivers of interannual variability in summertime rainfall across southern Peru, such as the El Niño-Southern Oscillation (ENSO), the Madden-Julian Oscillation (MJO), and extratropical forcings, operational forecasts continue to be largely based on rudimentary ENSO-based indices, such as NINO3.4, justifying further exploration of predictive skill. In order to bridge this gap between the understanding of driving mechanisms and the operational forecast, we performed systematic studies on the predictability and prediction skill of southern Peru summertime rainfall by constructing statistical forecast models using the best available weather station and reanalysis datasets. First, by assuming that the first two empirical orthogonal functions (EOFs) of summertime rainfall are predictable, the potential predictability skill was evaluated for southern Peru. Then, we constructed a simple regression model, based on the time series of tropical Pacific sea-surface temperatures (SSTs), and a more advanced Linear Inverse Model (LIM), based on the EOFs of tropical ocean SSTs and large-scale atmosphere variables from reanalysis. Our results show that the LIM consistently outperforms the more rudimentary regression models in forecast skill for the domain-averaged precipitation index and for individual station indices. The improvement of forecast correlation skill ranges from 10% to over 200% for different
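
    The statistical chain described (EOFs of tropical SSTs feeding a linear forecast model) can be sketched compactly. Array shapes, variable names and the random data below are illustrative assumptions, not the study's dataset.

        import numpy as np

        rng = np.random.default_rng(1)
        years, ngrid = 40, 500
        sst = rng.normal(size=(years, ngrid))   # SST anomaly field, year x gridpoint
        rain = rng.normal(size=years)           # summertime rainfall index

        # EOFs via SVD of the centred anomaly matrix; PCs become the predictors.
        sst -= sst.mean(axis=0)
        u, s, vt = np.linalg.svd(sst, full_matrices=False)
        pcs = u[:, :2] * s[:2]                  # leading two principal components

        # Ordinary least squares: rainfall index regressed on the leading PCs.
        X = np.column_stack([np.ones(years), pcs])
        beta, *_ = np.linalg.lstsq(X, rain, rcond=None)
        print("in-sample correlation:", np.corrcoef(X @ beta, rain)[0, 1])

    With real SST fields the leading PCs would carry ENSO-related variance, and a Linear Inverse Model would additionally fit the dynamics of the PCs themselves rather than a single static regression.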

  10. Prediction of GNSS satellite clocks

    International Nuclear Information System (INIS)

    Broederbauer, V.

    2010-01-01

    This thesis deals with the characterisation and prediction of GNSS satellite clocks. A prerequisite for developing powerful algorithms for the prediction of clock corrections is a thorough study of the behaviour of the different clock types on the satellites. In this context the predicted part of the IGU clock corrections provided by the Analysis Centers (ACs) of the IGS was compared to the IGS Rapid clock solutions to determine reasonable estimates of the quality of already existing, well-performing predictions. For the shortest investigated interval (three hours) all ACs obtain almost the same accuracy of 0.1 to 0.4 ns. For longer intervals the individual prediction results start to diverge. Thus, for a 12-hour interval the differences range from nearly 10 ns (GFZ, CODE) up to some tens of ns. Based on the estimated clock corrections provided via the IGS Rapid products, a simple quadratic polynomial turns out to be sufficient to describe the time series of rubidium clocks. On the other hand, cesium clocks show a periodical behaviour (revolution period) with an amplitude of up to 6 ns. A clear correlation between these amplitudes and the Sun elevation angle above the orbital planes can be demonstrated. The variability of the amplitudes is supposed to be caused by temperature variations affecting the oscillator. To account for this periodical behaviour, a quadratic polynomial with an additional sine term was finally chosen as the prediction model both for the cesium and for the rubidium clocks. The three polynomial parameters as well as the amplitude and phase shift of the periodic term are estimated within a least-squares adjustment by means of the program GNSS-VC/static. Input data are time series of the observed part of the IGU clock corrections. With the estimated parameters, clock corrections are predicted for various durations. The mean error of the prediction of rubidium clock corrections for an interval of six hours reaches up to 1.5 ns. For the 12-hours
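
    The chosen model (quadratic polynomial plus one sinusoidal term) stays linear in its unknowns if the sine is expanded into sine and cosine components, so it can be fitted by ordinary least squares. A minimal sketch with an assumed revolution period and synthetic clock data:

        import numpy as np

        REV_PERIOD = 43082.0  # assumed satellite revolution period [s]

        def design(t, w=2.0 * np.pi / REV_PERIOD):
            # amp*sin(w*t + phase) == a*sin(w*t) + b*cos(w*t): linear in (a, b)
            return np.column_stack([np.ones_like(t), t, t**2,
                                    np.sin(w * t), np.cos(w * t)])

        t_obs = np.arange(0.0, 86400.0, 900.0)                  # one day, 15-min steps
        clk_obs = 1e-9 * (5.0 + 2e-5 * t_obs                    # synthetic corrections
                          + 3.0 * np.sin(2 * np.pi * t_obs / REV_PERIOD + 0.4))

        coef, *_ = np.linalg.lstsq(design(t_obs), clk_obs, rcond=None)
        t_pred = t_obs + 6 * 3600.0                             # six hours ahead
        clk_pred = design(t_pred) @ coef

    The fitted sine/cosine coefficients recover the amplitude and phase shift named in the abstract via amp = sqrt(a^2 + b^2) and phase = atan2(b, a).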

  11. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  12. Neural Elements for Predictive Coding

    Directory of Open Access Journals (Sweden)

    Stewart SHIPP

    2016-11-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backwards in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many 'illusory' instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forwards and backwards pathways should be completely separate, given their functional distinction; this aspect of circuitry – that neurons with extrinsically bifurcating axons do not project in both directions – has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic 'canonical microcircuit' and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made

  13. Neural Elements for Predictive Coding.

    Science.gov (United States)

    Shipp, Stewart

    2016-01-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backward in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many 'illusory' instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forward and backward pathways should be completely separate, given their functional distinction; this aspect of circuitry - that neurons with extrinsically bifurcating axons do not project in both directions - has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic 'canonical microcircuit' and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made possible by transgenic neural

  14. Quantifying prognosis with risk predictions.

    Science.gov (United States)

    Pace, Nathan L; Eberhart, Leopold H J; Kranke, Peter R

    2012-01-01

    Prognosis is a forecast, based on present observations in a patient, of their probable outcome from disease, surgery and so on. Research methods for the development of risk probabilities may not be familiar to some anaesthesiologists. We briefly describe methods for identifying risk factors and risk scores. A probability prediction rule assigns a risk probability P_i to a patient for the occurrence of a specific event. Probability reflects the continuum between absolute certainty (P_i = 1) and certified impossibility (P_i = 0). Biomarkers and clinical covariates that modify risk are known as risk factors. The P_i as modified by risk factors can be estimated by identifying the risk factors and their weighting; these are usually obtained by stepwise logistic regression. The accuracy of probabilistic predictors can be separated into the concepts of 'overall performance', 'discrimination' and 'calibration'. Overall performance is the mathematical distance between predictions and outcomes. Discrimination is the ability of the predictor to rank-order observations with different outcomes. Calibration is the correctness of prediction probabilities on an absolute scale. Statistical methods include the Brier score, coefficient of determination (Nagelkerke R²), C-statistic and regression calibration. External validation is the comparison of the actual outcomes to the predicted outcomes in a new and independent patient sample. External validation uses the statistical methods of overall performance, discrimination and calibration and is uniformly recommended before acceptance of the prediction model. Evidence from randomised controlled clinical trials should be obtained to show the effectiveness of risk scores for altering patient management and patient outcomes.
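
    Two of the accuracy concepts named above are easy to make concrete: the Brier score (overall performance) and the C-statistic (discrimination). The helper below uses invented risks and outcomes, not data from this chapter.

        import numpy as np

        def brier_score(p, y):
            """Mean squared distance between predicted risks p and outcomes y."""
            p, y = np.asarray(p, float), np.asarray(y, float)
            return np.mean((p - y) ** 2)

        def c_statistic(p, y):
            """Probability that a random event case outranks a random non-event."""
            p, y = np.asarray(p, float), np.asarray(y, int)
            pos, neg = p[y == 1], p[y == 0]
            diff = pos[:, None] - neg[None, :]       # all event/non-event pairs
            return np.mean(diff > 0) + 0.5 * np.mean(diff == 0)

        risks, outcomes = [0.1, 0.4, 0.35, 0.8], [0, 0, 1, 1]
        print(brier_score(risks, outcomes), c_statistic(risks, outcomes))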

  15. PREDICTING DEMAND FOR COTTON YARNS

    Directory of Open Access Journals (Sweden)

    SALAS-MOLINA Francisco

    2017-05-01

    Predicting demand for fashion products is crucial for textile manufacturers. In an attempt to both avoid out-of-stocks and minimize holding costs, different forecasting techniques are used by production managers. Both linear and non-linear time-series analysis techniques are suitable options for forecasting purposes. However, demand for fashion products presents a number of particular characteristics such as short life-cycles, short selling seasons, high impulse purchasing, high volatility, low predictability, tremendous product variety and a high number of stock-keeping units. In this paper, we focus on predicting demand for cotton yarns using a non-linear forecasting technique that has been fruitfully used in many areas, namely, random forests. To this end, we first identify a number of explanatory variables to be used as key inputs to forecasting with random forests. We consider explanatory variables usually labeled either as causal variables, when some correlation is expected between them and the forecasted variable, or as time-series features, when extracted from time-related attributes such as seasonality. Next, we evaluate the predictive power of each variable by means of out-of-sample accuracy measurement. We experiment on a real data set from a textile company in Spain. The numerical results show that simple time-series features have more predictive ability than other, more sophisticated explanatory variables.
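
    A skeletal version of the set-up described, with placeholder features (a calendar month for seasonality and a price as a causal variable) standing in for the company's data:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(2)
        n = 200
        month = rng.integers(1, 13, n)            # time-series (seasonality) feature
        price = rng.normal(10.0, 1.0, n)          # causal variable
        demand = (50.0 + 5.0 * np.sin(2 * np.pi * month / 12)
                  - 2.0 * price + rng.normal(0.0, 1.0, n))

        X = np.column_stack([month, price])
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X[:150], demand[:150])          # train on the first 150 rows
        print("out-of-sample R^2:", model.score(X[150:], demand[150:]))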

  16. Lightning prediction using radiosonde data

    Energy Technology Data Exchange (ETDEWEB)

    Weng, L.Y.; Bin Omar, J.; Siah, Y.K.; Bin Zainal Abidin, I.; Ahmad, S.K. [Univ. Tenaga, Darul Ehsan (Malaysia). College of Engineering

    2008-07-01

    Lightning is a natural phenomenon in tropical regions. Malaysia experiences very high cloud-to-ground lightning density, posing both health and economic concerns to individuals and industries. In the commercial sector, power lines, telecommunication towers and buildings are most frequently hit by lightning. In the event that a power line is hit and the protection system fails, industries which rely on that power line would cease operations temporarily, resulting in significant monetary loss. Current technology is unable to prevent lightning occurrences. However, the ability to predict lightning would significantly reduce damages from direct and indirect lightning strikes. For that reason, this study focused on developing a method to predict lightning from radiosonde data using only a simple back-propagation neural network model written in C code. The study was performed at the Kuala Lumpur International Airport (KLIA). In this model, the parameters related to wind were disregarded. Preliminary results indicate that this method shows some promise in predicting lightning. However, a larger dataset is needed in order to obtain more accurate predictions. It was concluded that future work should include wind parameters to fully capture all properties relevant to lightning formation, and subsequently its prediction. 8 refs., 5 figs.
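
    As a present-day stand-in for the back-propagation model described (the original was hand-written in C), a small feed-forward network on two radiosonde-style predictors might look as follows; the features, thresholds and synthetic labels are assumptions for illustration only.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        n = 500
        cape = rng.gamma(2.0, 500.0, n)          # convective available potential energy
        humidity = rng.uniform(20.0, 100.0, n)   # mid-level relative humidity [%]
        lightning = ((cape > 1000.0) & (humidity > 60.0)).astype(int)

        X = StandardScaler().fit_transform(np.column_stack([cape, humidity]))
        net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        net.fit(X[:400], lightning[:400])
        print("holdout accuracy:", net.score(X[400:], lightning[400:]))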

  17. Re-evaluating your nuclear program needs: how to benefit from your vendor's Q.A. program

    International Nuclear Information System (INIS)

    Cocoros, A.E.

    1979-01-01

    The quality assurance component control and verification program presented here provides a cost-effective approach to monitoring and controlling the implementation of a supplier's design, fabrication, inspection and shipping plans. It attempts to coordinate and integrate the quality control and verification effort of a supplier with the control and verification effort of the purchaser to obtain a composite which accomplishes the total need. Depending on the competency and capabilities of the supplier, the purchaser can either rely mainly on the effort the supplier performs or must maximize his own effort to obtain an optimum mix. The ultimate goal is to utilize the supplier's quality assurance program to the greatest benefit in assuring maximum quality.

  18. Re-evaluation of the petrogenesis of the Proterozoic Jabiluka unconformity-related uranium deposit, Northern Territory, Australia

    Science.gov (United States)

    Polito, Paul A.; Kurt Kyser, T.; Thomas, David; Marlatt, Jim; Drever, Garth

    2005-11-01

    The world-class Jabiluka unconformity-related uranium deposit in the Alligator Rivers Uranium Field, Australia, contains >163,000 tons of U3O8. Mineralization is hosted by shallow-to-steeply dipping basement rocks comprising graphitic units of chlorite-biotite-muscovite schist. These rocks are overlain by flat-lying coarse-grained sandstones belonging to the Kombolgie Subgroup. The deposit was discovered in 1971, but has never been mined. The construction of an 1,150 m decline into the upper eastern sector of the Jabiluka II deposit combined with closely spaced underground drilling in 1998 and 1999 allowed mapping and sampling from underground for the first time. Structural mapping, drill core logging and petrographic studies on polished thin sections established a detailed paragenesis that provided the framework for subsequent electron microprobe and X-ray diffraction, fluid inclusion, and O-H, U-Pb and 40Ar/39Ar isotope analysis. Uranium mineralization is structurally controlled within semi-brittle shears that are sub-conformable to the basement stratigraphy, and breccias that are developed within the hinge zone of fault-related folds adjacent to the shears. Uraninite is intimately associated with chlorite, sericite, hematite ± quartz. Electron microprobe and X-ray diffraction analysis of syn-ore illite and chlorite indicates a mineralization temperature of 200°C. Pre- and syn-ore minerals extracted from the Kombolgie Subgroup overlying the deposit and syn-ore alteration minerals in the Cahill Formation have δ18Ofluid and δ D fluid values of 4.0±3.7 and -27±17‰, respectively. These values are indistinguishable from illite separates extracted from diagenetic aquifers in the Kombolgie Subgroup up to 70 km to the south and east of the deposit and believed to be the source of the uraniferous fluid. New fluid inclusion microthermometry data reveal that the mineralising brine was saline, but not saturated. U-Pb and 207Pb/206Pb ratios of uraninite by laser-ablation ICP-MS suggest that massive uraninite first precipitated at ca. 1,680 Ma, which is coincident with the timing of brine migration out from the Kombolgie Subgroup as indicated by 40Ar/39Ar ages of 1,683±11 Ma from sandstone-hosted illite. Unmineralized breccias cemented by chlorite, quartz and sericite cross-cut the mineralized breccias and are in turn cut by straight-sided, high-angle veins of drusy quartz, sulphide and dolomite. U-Pb and 207Pb/206Pb ratios combined with fluid inclusion and stable isotope data indicate that these post-ore minerals formed when mixing between two fluids occurred sometime between ca. 1,450 and 550 Ma. Distinct 207Pb/206Pb age populations occur at ca. 1,302±37, 1,191±27 and 802±57 Ma, which respectively correlate with the intrusion of the Maningkorrirr/Mudginberri phonolitic dykes and the Derim Derim Dolerite between 1,370 and 1,316 Ma, the amalgamation of Australia and Laurentia during the Grenville Orogen at ca. 1,140 Ma, and the break-up of Rodinia between 1,000 and 750 Ma.

  19. Metabolic enzyme activities of abyssal and hadal fishes: pressure effects and a re-evaluation of depth-related changes

    Science.gov (United States)

    Gerringer, M. E.; Drazen, J. C.; Yancey, P. H.

    2017-07-01

    Metabolic enzyme activities of muscle tissue have been useful and widely-applied indicators of whole animal metabolic capacity, particularly in inaccessible systems such as the deep sea. Previous studies have been conducted at atmospheric pressure, regardless of organism habitat depth. However, maximum reaction rates of some of these enzymes are pressure dependent, complicating the use of metabolic enzyme activities as proxies of metabolic rates. Here, we show pressure-related rate changes in lactate and malate dehydrogenase (LDH, MDH) and pyruvate kinase (PK) in six fish species (2 hadal, 2 abyssal, 2 shallow). LDH maximal reaction rates decreased with pressure for the two shallow species, but, in contrast to previous findings, it increased for the four deep species, suggesting evolutionary changes in LDH reaction volumes. MDH maximal reaction rates increased with pressure in all species (up to 51±10% at 60 MPa), including the tide pool snailfish, Liparis florae (activity increase at 60 MPa 44±9%), suggesting an inherent negative volume change of the reaction. PK was inhibited by pressure in all species tested, including the hadal liparids (up to 34±3% at 60 MPa), suggesting a positive volume change during the reaction. The addition of 400 mM TMAO counteracted this inhibition at both 0.5 and 2.0 mM ADP concentrations for the hadal liparid, Notoliparis kermadecensis. We revisit depth-related trends in metabolic enzyme activities according to these pressure-related rate changes and new data from seven abyssal and hadal species from the Kermadec and Mariana trenches. Results show that, with abyssal and hadal species, pressure-related rate changes are another variable to be considered in the use of enzyme activities as proxies for metabolic rate, in addition to factors such as temperature and body mass. Intraspecific increases in tricarboxylic acid cycle enzymes with depth of capture, independent of body mass, in two hadal snailfishes suggest improved nutritional condition for individuals deeper in the hadal zone, likely related to food availability. These new data inform the discussion of factors controlling metabolism in the deep sea, including the visual interactions hypothesis and extend published trends to the planet's deepest-living fishes.

  20. Re-evaluation of the fludrocortisone test: duration, NaCl supplementation and cut-off limits for aldosterone.

    Science.gov (United States)

    Westerdahl, Christina; Bergenfelz, Anders; Larsson, Johanna; Nerbrand, Christina; Valdemarsson, Stig; Wihl, Anders; Isaksson, Anders

    2009-01-01

    Primary aldosteronism (PA) is the most common form of secondary hypertension. The aims of this study were: (1) to clarify whether the fludrocortisone suppression test (FST), which confirms autonomous aldosterone secretion, is reliable when carried out over a shorter period of time, and (2) to confirm the importance of NaCl supplementation. The cut-off limits obtained for aldosterone in healthy subjects during the FST were then applied in hypertensive patients with a high aldosterone-to-renin ratio (ARR). The healthy subjects were allocated to three groups. Fludrocortisone was administered 4 times daily over 4 days and sodium chloride was supplemented in 3 different doses. The result was applied in 24 hypertensive patients, in 24 healthy subjects (10 women (23-38 years old) and 14 men (23-58 years old)) and in 24 patients with hypertension and a high ARR (16 women (45-74 years old) and 8 men (56-73 years old)). Blood pressure, aldosterone, renin, potassium and sodium were measured. After three days of FST, there was a significant decrease in the serum level of aldosterone in the healthy subjects, regardless of high or low sodium chloride supplementation (p<0.001). The decrease in serum aldosterone was significantly less pronounced in patients with PA than in healthy subjects and hypertensive patients without PA (p<0.001). The 95th percentile of plasma aldosterone at the end of the test was 225 pmol/L. The FST can be shortened to 3 days and a daily 500 mg NaCl supplementation is sufficient. A cut-off value for aldosterone of 225 pmol/L after 4 days of FST is appropriate.

  1. (2012) 30 Windsor Y B Access Just 225: Re-evaluating Independence: The Emerging Problem of Crown-Police Alignment

    Directory of Open Access Journals (Sweden)

    Jeremy Tatum

    2012-10-01

    Part II will highlight the distinct roles of the police and Crown prosecutors and provide some background on the problem of Crown-police alignment. Part III will examine a number of recent cases in which prosecution offices and prosecutors, while exercising their prosecutorial functions, appear to have acted in the interest of the police rather than in the public interest. Finally, because the problem of Crown-police alignment is provincial and localized in nature, the author will attempt to demonstrate, in Part IV, the crucial role that law societies and other public bodies play in overseeing the conduct of prosecutions and in ensuring that attorneys general effectively see to it that public affairs are administered in accordance with the law.

  2. A Re-evaluation of Discarded Deceased Donor Kidneys in the UK: Are Usable Organs Still Being Discarded?

    Science.gov (United States)

    Mittal, Shruti; Adamusiak, Anna; Horsfield, Catherine; Loukopoulos, Ioannis; Karydis, Nikolaos; Kessaris, Nicos; Drage, Martin; Olsburgh, Jonathon; Watson, Christopher Je; Callaghan, Chris J

    2017-07-01

    A significant proportion of procured deceased donor kidneys are subsequently discarded. The UK Kidney Fast-Track Scheme (KFTS) was introduced in 2012, enabling kidneys at risk of discard to be simultaneously offered to participating centers. We undertook an analysis of discarded kidneys to determine if unnecessary organ discard was still occurring since the KFTS was introduced. Between April and June 2015, senior surgeons independently inspected 31 consecutive discarded kidneys from throughout the United Kingdom. All kidneys were biopsied. Organs were categorized as usable, possibly usable pending histology, or not usable for implantation. After histology reports were available, final assessments of usability were made. There were 19 donors (6 donations after brain death, 13 donations after circulatory death), with a median (range) donor age of 67 (29-83) years and Kidney Donor Profile Index of 93 (19-100). Reasons for discard were variable. Only 3 discarded kidneys had not entered the KFTS. After initial assessment postdiscard, 11 kidneys were assessed as usable, with 9 kidneys thought to be possibly usable. Consideration of histological data reduced the number of kidneys thought usable to 10 (10/31; 32%). The KFTS scheme is successfully identifying organs at high risk of discard, though potentially transplantable organs are still being discarded. Analyses of discarded organs are essential to identify barriers to organ utilization and develop strategies to reduce unnecessary discard.

  3. Re-evaluating the Rose approach: comparative benefits of the population and high-risk preventive strategies.

    LENUS (Irish Health Repository)

    Cooney, Marie-Therese

    2009-10-01

    Options for the prevention of cardiovascular disease, the greatest global cause of death, include population preventive measures (the Rose approach), or specifically seeking out and managing high-risk cases. However, the likely benefit of a population approach has been recently questioned.

  4. Re-evaluation of forest biomass carbon stocks and lessons from the world's most carbon-dense forests.

    Science.gov (United States)

    Keith, Heather; Mackey, Brendan G; Lindenmayer, David B

    2009-07-14

    From analysis of published global site biomass data (n = 136) from primary forests, we discovered (i) the world's highest known total biomass carbon density (living plus dead) of 1,867 tonnes carbon per ha (average value from 13 sites) occurs in Australian temperate moist Eucalyptus regnans forests, and (ii) average values of the global site biomass data were higher for sampled temperate moist forests (n = 44) than for sampled tropical (n = 36) and boreal (n = 52) forests (n is number of sites per forest biome). Spatially averaged Intergovernmental Panel on Climate Change biome default values are lower than our average site values for temperate moist forests, because the temperate biome contains a diversity of forest ecosystem types that support a range of mature carbon stocks or have a long land-use history with reduced carbon stocks. We describe a framework for identifying forests important for carbon storage based on the factors that account for high biomass carbon densities, including (i) relatively cool temperatures and moderately high precipitation producing rates of fast growth but slow decomposition, and (ii) older forests that are often multiaged and multilayered and have experienced minimal human disturbance. Our results are relevant to negotiations under the United Nations Framework Convention on Climate Change regarding forest conservation, management, and restoration. Conserving forests with large stocks of biomass from deforestation and degradation avoids significant carbon emissions to the atmosphere, irrespective of the source country, and should be among allowable mitigation activities. Similarly, management that allows restoration of a forest's carbon sequestration potential also should be recognized.

  5. Re-evaluating the green versus red signal in eukaryotes with secondary plastid of red algal origin

    KAUST Repository

    Burki, Fabien; Flegontov, Pavel; Oborník, Miroslav; Cihlář, Jaromír; Pain, Arnab; Lukeš, Julius; Keeling, Patrick J.

    2012-01-01

    genomes by reanalyzing the recently published EST dataset for Chromera velia, an interesting test case of a photosynthetic alga closely related to apicomplexan parasites. Previously, 513 genes were reported to originate from red and green algae in a 1

  6. Trends and variability of midlatitude stratospheric water vapour deduced from the re-evaluated Boulder balloon series and HALOE

    Directory of Open Access Journals (Sweden)

    M. Scherer

    2008-03-01

    This paper presents an updated trend analysis of water vapour in the lower midlatitude stratosphere from the Boulder balloon-borne NOAA frostpoint hygrometer measurements and from the Halogen Occultation Experiment (HALOE). Two corrections for instrumental bias are applied to homogenise the frostpoint data series, and a quality assessment of all soundings after 1991 is presented. Linear trend estimates based on the corrected data for the period 1980–2000 are up to 40% lower than previously reported. Vertically resolved trends and variability are calculated with a multiple regression analysis including the quasi-biennial oscillation and equivalent latitude as explanatory variables. In the range of 380 to 640 K potential temperature (≈14 to 25 km), the frostpoint data from 1981 to 2006 show positive linear trends between 0.3±0.3 and 0.7±0.1%/yr. The same dataset shows trends between −0.2±0.3 and 1.0±0.3%/yr for the period 1992 to 2005. HALOE data over the same time period suggest negative trends ranging from −1.1±0.2 to −0.1±0.1%/yr. In the lower stratosphere, a rapid drop of water vapour is observed in 2000/2001 with little change since. At higher altitudes, the transition is more gradual, with slowly decreasing concentrations between 2001 and 2007. This pattern is consistent with a change induced by a drop of water concentrations at entry into the stratosphere. Previously noted differences in trends and variability between frostpoint and HALOE remain for the homogenised data. Due to uncertainties in reanalysis temperatures and stratospheric transport combined with uncertainties in observations, no quantitative inference about changes of water entering the stratosphere in the tropics could be made with the midlatitude measurements analysed here.

  7. Re-evaluating the Rose approach: comparative benefits of the population and high-risk preventive strategies

    DEFF Research Database (Denmark)

    Cooney, Marie-Therese; Dudina, Alexandra; Whincup, Peter

    2009-01-01

    BACKGROUND: Options for the prevention of cardiovascular disease, the greatest global cause of death, include population preventive measures (the Rose approach), or specifically seeking out and managing high-risk cases. However, the likely benefit of a population approach has been recently...

  8. A re-evaluation of the size of the white shark (Carcharodon carcharias) population off California, USA.

    Science.gov (United States)

    Burgess, George H; Bruce, Barry D; Cailliet, Gregor M; Goldman, Kenneth J; Grubbs, R Dean; Lowe, Christopher G; MacNeil, M Aaron; Mollet, Henry F; Weng, Kevin C; O'Sullivan, John B

    2014-01-01

    White sharks are highly migratory and segregate by sex, age and size. Unlike marine mammals, they neither surface to breathe nor frequent haul-out sites, hindering generation of abundance data required to estimate population size. A recent tag-recapture study used photographic identifications of white sharks at two aggregation sites to estimate abundance in "central California" at 219 mature and sub-adult individuals. They concluded this represented approximately one-half of the total abundance of mature and sub-adult sharks in the entire eastern North Pacific Ocean (ENP). This low estimate generated great concern within the conservation community, prompting petitions for governmental endangered species designations. We critically examine that study and find violations of model assumptions that, when considered in total, lead to population underestimates. We also use a Bayesian mixture model to demonstrate that the inclusion of transient sharks, characteristic of white shark aggregation sites, would substantially increase abundance estimates for the adults and sub-adults in the surveyed sub-population. Using a dataset obtained from the same sampling locations and widely accepted demographic methodology, our analysis indicates a minimum all-life stages population size of >2000 individuals in the California subpopulation is required to account for the number and size range of individual sharks observed at the two sampled sites. Even accounting for methodological and conceptual biases, an extrapolation of these data to estimate the white shark population size throughout the ENP is inappropriate. The true ENP white shark population size is likely several-fold greater as both our study and the original published estimate exclude non-aggregating sharks and those that independently aggregate at other important ENP sites. Accurately estimating the central California and ENP white shark population size requires methodologies that account for biases introduced by sampling a limited number of sites and that account for all life history stages across the species' range of habitats.
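
    For context, the simplest mark-recapture abundance estimator (Chapman's bias-corrected version of the Lincoln-Petersen formula) is sketched below; it illustrates the class of calculation at issue, not the photographic-identification or Bayesian mixture models discussed in the abstract, and the counts are hypothetical.

        def chapman_estimate(marked, caught, recaptured):
            """Abundance estimate from a single mark-recapture experiment."""
            return (marked + 1) * (caught + 1) / (recaptured + 1) - 1

        # Hypothetical numbers for illustration only:
        print(chapman_estimate(marked=120, caught=90, recaptured=15))  # ~687 animals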

  9. Re-evaluation and extension of the scope of elements in US Geological Survey Standard Reference Water Samples

    Science.gov (United States)

    Peart, D.B.; Antweiler, Ronald C.; Taylor, Howard E.; Roth, D.A.; Brinton, T.I.

    1998-01-01

    More than 100 US Geological Survey (USGS) Standard Reference Water Samples (SRWSs) were analyzed for numerous trace constituents, including Al, As, B, Ba, Be, Bi, Br, Cd, Cr, Co, Cu, I, Fe, Pb, Li, Mn, Mo, Ni, Rb, Sb, Se, Sr, Te, Tl, U, V, Zn and major elements (Ca, Mg, Na, SiO2, SO4, Cl) by inductively coupled plasma mass spectrometry and inductively coupled plasma atomic emission spectrometry. In addition, 15 USGS SRWSs and National Institute of Standards and Technology (NIST) standard reference material (SRM) 1641b were analyzed for mercury using cold vapor atomic fluorescence spectrometry. Also USGS SRWS Hg-7 was analyzed using isotope dilution-inductively coupled plasma mass spectrometry. The results were compared with the reported certified values of the following standard reference materials: NIST SRM 1643a, 1643b, 1643c and 1643d and National Research Council of Canada Riverine Water Reference Materials for Trace Metals SLRS-1, SLRS-2 and SLRS-3. New concentration values for trace and major elements in the SRWSs, traceable to the certified standards, are reported. Additional concentration values are reported for elements that were neither previously published for the SRWSs nor traceable to the certified reference materials. Robust statistical procedures were used that were insensitive to outliers. These data can be used for quality assurance/quality control purposes in analytical laboratories.
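
    The "robust statistical procedures insensitive to outliers" mentioned above can be illustrated with a median/MAD-based location estimate; the cut-off k and the sample values below are assumptions for illustration, not the USGS procedure itself.

        import numpy as np

        def robust_mean(x, k=3.0):
            x = np.asarray(x, float)
            med = np.median(x)
            mad = 1.4826 * np.median(np.abs(x - med))    # MAD rescaled to ~sigma
            return x[np.abs(x - med) <= k * mad].mean()  # drop gross outliers

        print(robust_mean([10.1, 10.3, 9.9, 10.2, 45.0]))  # the 45.0 outlier is ignored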

  10. Re-evaluation of the neutron scattering dynamics in heavy water, generation of multigroup cross sections for THERM-126

    International Nuclear Information System (INIS)

    Keinert, J.

    1982-06-01

    In providing THERM-126 with cross-section matrices for deuterium bound in heavy water, the IKE phonon spectrum was re-evaluated. The changes are modifications in the acoustic part and in the frequency of the second oscillator. Contrary to the phonon spectrum model for D in D2O in ENDF/B-IV, the broad band of hindered rotations is assumed to be temperature dependent, taking into account the diffusive motion of the molecule. With the new model, scattering law data S(α, β) are generated in the temperature range 293.6 K–673.6 K. The THERM-126 scattering cross-section matrices are calculated up to P3. As a validity check, numerous differential and integral cross sections are compared to experiments and benchmarks are recalculated. (orig.) [de]

  11. Low-Frequency Otolith Function in Microgravity: A Re-Evaluation of the Otolith Tilt-Translation Reinterpretation (OTTR) Hypothesis

    Science.gov (United States)

    Moore, Steven T.; Cohen, Bernard; Clement, Gilles; Raphan, Theodore

    1999-01-01

    On Earth, the low-frequency afferent signal from the otoliths encodes head tilt with respect to the gravitational vertical, and the higher frequency components reflect both tilt and linear acceleration of the head. In microgravity, static tilt of the head does not influence otolith output, and the relationship between sensory input from the vestibular organs, and the visual, proprioceptive and somatosensory systems, would be disrupted. Several researchers have proposed that in 0-g this conflict may induce a reinterpretation of all otolith signals by the brain to encode only linear translation (otolith tilt-translation reinterpretation or OTTR). Ocular counter-rolling (OCR) is a low-frequency otolith-mediated reflex, which generates compensatory torsional eye movements (rotation about the visual axis) towards the spatial vertical during static roll tilt with a gain of approximately 10%. Transient linear acceleration and off-axis centrifugation at a constant angular velocity can also generate OCR. According to the OTTR hypothesis, OCR should be reduced in microgravity, and immediately upon return from a 0-g environment. Results to date have been inconclusive. OCR was reduced following the 10 day Spacelab-1 mission in response to leftward roll tilts (28-56% in 3 subjects and unchanged in one subject), and sinusoidal linear oscillations at 0.4 and 0.8 Hz. OCR gain declined 70% in four monkeys following a 14 day COSMOS mission. Following a 30 day MIR mission OCR gain decreased in one astronaut, but increased in two others following a 180 day mission. We have studied the effect of microgravity on low-frequency otolith function as part of a larger study of the interaction of vision and the vestibular system. This experiment (E-047) involved off-axis centrifugation of payload crewmembers and flew aboard the recent Neurolab mission (STS 90). Presented below are preliminary results focusing on perception and the OCR response during both centrifugation and static tilt.

  12. Re-evaluating the health of coral reef communities: baselines and evidence for human impacts across the central Pacific.

    Science.gov (United States)

    Smith, Jennifer E; Brainard, Rusty; Carter, Amanda; Grillo, Saray; Edwards, Clinton; Harris, Jill; Lewis, Levi; Obura, David; Rohwer, Forest; Sala, Enric; Vroom, Peter S; Sandin, Stuart

    2016-01-13

    Numerous studies have documented declines in the abundance of reef-building corals over the last several decades, and in some, but not all, cases phase shifts to dominance by macroalgae have occurred. These assessments, however, often ignore the remainder of the benthos and thus provide limited information on the present-day structure and function of coral reef communities. Here, using an unprecedentedly large dataset collected within the last 10 years across 56 islands spanning five archipelagos in the central Pacific, we examine how benthic reef communities differ in the presence and absence of human populations. Using islands as replicates, we examine whether benthic community structure is associated with human habitation within and among archipelagos and across latitude. While there was no evidence for coral-to-macroalgal phase shifts across our dataset, we did find that the majority of reefs on inhabited islands were dominated by fleshy non-reef-building organisms (turf algae, fleshy macroalgae and non-calcifying invertebrates). By contrast, benthic communities from uninhabited islands were more variable but in general supported more calcifiers and active reef builders (stony corals and crustose coralline algae). Our results suggest that cumulative human impacts across the central Pacific may be causing a reduction in the abundance of reef builders, resulting in island-scale phase shifts to dominance by fleshy organisms. © 2016 The Author(s).
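
    A sketch of the islands-as-replicates comparison implied here, using hypothetical percent-cover values and a rank-based two-sample test in Python (the study's actual statistical models are not specified in this abstract):

    import numpy as np
    from scipy.stats import mannwhitneyu

    # Hypothetical % cover of calcifiers (corals + crustose coralline algae), one value per island.
    inhabited = np.array([22.0, 18.5, 30.1, 15.2, 25.4])
    uninhabited = np.array([48.3, 35.6, 52.0, 41.7, 60.2])

    stat, p = mannwhitneyu(inhabited, uninhabited, alternative="two-sided")
    print(f"U = {stat}, p = {p:.4f}")  # small p: calcifier cover differs with habitation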

  13. Re-evaluation of groundwater monitoring data for glyphosate and bentazone by taking detection limits into account

    DEFF Research Database (Denmark)

    Hansen, Claus Toni; Ritz, Christian; Gerhard, Daniel

    2015-01-01

    ...i.e. samples with concentrations below the detection limit, as left-censored observations. The median calculated pesticide concentrations are shown to be reduced 10^4- to 10^5-fold for two representative herbicides (glyphosate and bentazone) relative to the median concentrations based upon observations above...
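
    The censored-data idea is straightforward to sketch: non-detects contribute the cumulative probability below the detection limit to the likelihood, rather than a substituted value. A minimal Python example assuming log-normal concentrations (all numbers invented):

    import numpy as np
    from scipy import optimize, stats

    detects = np.array([0.02, 0.05, 0.11, 0.04])  # quantified concentrations (ug/L)
    limits = np.array([0.01, 0.01, 0.02])         # detection limits of the non-detects

    def neg_loglik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)                 # keeps sigma positive
        ll = stats.norm.logpdf(np.log(detects), mu, sigma).sum()
        ll += stats.norm.logcdf(np.log(limits), mu, sigma).sum()  # left-censored terms
        return -ll

    res = optimize.minimize(neg_loglik, x0=[np.log(0.05), 0.0])
    mu_hat = res.x[0]
    print(f"estimated median concentration: {np.exp(mu_hat):.4f} ug/L")  # lognormal median = exp(mu)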

  14. Re-evaluation of pulmonary titanium dioxide nanoparticle distribution using the "relative deposition index": Evidence for clearance through microvasculature

    Directory of Open Access Journals (Sweden)

    Gehr Peter

    2007-08-01

    Background: Translocation of nanoparticles (NP) from the pulmonary airways into other pulmonary compartments or the systemic circulation is controversially discussed in the literature. In a previous study it was shown that titanium dioxide (TiO2) NP were "distributed in four lung compartments (air-filled spaces, epithelium/endothelium, connective tissue, capillary lumen) in correlation with compartment size". It was concluded that particles can move freely between these tissue compartments. To analyze whether the distribution of TiO2 NP in the lungs is truly random or shows preferential targeting, we applied a newly developed method for comparing NP distributions. Methods: Rat lungs exposed to an aerosol containing TiO2 NP were prepared for light and electron microscopy at 1 h and at 24 h after exposure. The number of TiO2 NP associated with each compartment was counted using energy-filtering transmission electron microscopy. Compartment size was estimated by unbiased stereology from systematically sampled light micrographs. Particle numbers were related to compartment size using a relative deposition index and chi-squared analysis. Results: Nanoparticle distribution within the four compartments was not random at 1 h or at 24 h after exposure. At 1 h the connective tissue was the preferential target of the particles; at 24 h the NP were preferentially located in the capillary lumen. Conclusion: We conclude that TiO2 NP do not move freely between pulmonary tissue compartments, although they can pass from one compartment to another with relative ease. The residence time of NP in each tissue compartment of the respiratory system depends on the compartment and the time after exposure. It is suggested that a small fraction of TiO2 NP are rapidly transported from the airway lumen to the connective tissue and subsequently released into the systemic circulation.
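
    The relative deposition index itself is easy to illustrate: observed particle counts per compartment are divided by the counts expected if deposition simply scaled with compartment size, with a chi-squared test for overall randomness. A Python sketch with invented counts and volume fractions:

    import numpy as np
    from scipy.stats import chisquare

    compartments = ["air-filled spaces", "epithelium/endothelium",
                    "connective tissue", "capillary lumen"]
    observed = np.array([120, 35, 80, 25])              # particle counts (hypothetical)
    size_fraction = np.array([0.55, 0.20, 0.15, 0.10])  # stereological volume fractions

    expected = observed.sum() * size_fraction
    rdi = observed / expected                           # RDI > 1: preferential target

    chi2_stat, p = chisquare(observed, expected)
    for name, r in zip(compartments, rdi):
        print(f"{name}: RDI = {r:.2f}")
    print(f"chi-squared = {chi2_stat:.1f}, p = {p:.3g}")  # small p: distribution not random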

  15. (Re)evaluating the Implications of the Autoregressive Latent Trajectory Model Through Likelihood Ratio Tests of Its Initial Conditions.

    Science.gov (United States)

    Ou, Lu; Chow, Sy-Miin; Ji, Linying; Molenaar, Peter C M

    2017-01-01

    The autoregressive latent trajectory (ALT) model synthesizes the autoregressive model and the latent growth curve model. The ALT model is flexible enough to produce a variety of discrepant model-implied change trajectories. While some researchers consider this a virtue, others have cautioned that this may confound interpretations of the model's parameters. In this article, we show that some, but not all, of these interpretational difficulties may be clarified mathematically and tested explicitly via likelihood ratio tests (LRTs) imposed on the initial conditions of the model. We show analytically the nested relations among three variants of the ALT model and the constraints needed to establish equivalences. A Monte Carlo simulation study indicated that LRTs, particularly when used in combination with information criterion measures, can allow researchers to test targeted hypotheses about the functional forms of the change process under study. We further demonstrate when and how such tests may justifiably be used to facilitate our understanding of the underlying process of change using a subsample (N = 3,995) of longitudinal family income data from the National Longitudinal Survey of Youth.
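
    The LRT machinery referred to here is standard for nested models: twice the log-likelihood difference is compared against a chi-square distribution with degrees of freedom equal to the number of constrained parameters. A minimal Python sketch with placeholder log-likelihoods:

    from scipy.stats import chi2

    ll_full = -10234.6        # fitted log-likelihood, unconstrained ALT variant (hypothetical)
    ll_restricted = -10241.9  # fitted log-likelihood with initial conditions constrained
    df = 2                    # number of parameters fixed by the constraints

    lrt = 2.0 * (ll_full - ll_restricted)  # asymptotically chi-square(df) under the null
    p_value = chi2.sf(lrt, df)
    print(f"LRT = {lrt:.2f}, df = {df}, p = {p_value:.4f}")  # small p: reject the restricted variant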

  16. Re-evaluating the green versus red signal in eukaryotes with secondary plastid of red algal origin

    KAUST Repository

    Burki, Fabien

    2012-05-16

    The transition from endosymbiont to organelle in eukaryotic cells involves the transfer of significant numbers of genes to the host genomes, a process known as endosymbiotic gene transfer (EGT). In the case of plastid organelles, EGTs have been shown to leave a footprint in the nuclear genome that can be indicative of ancient photosynthetic activity in present-day plastid-lacking organisms, or even hint at the existence of cryptic plastids. Here, we evaluated the impact of EGT on eukaryote genomes by reanalyzing the recently published EST dataset for Chromera velia, an interesting test case of a photosynthetic alga closely related to apicomplexan parasites. Previously, 513 genes were reported to originate from red and green algae in a 1:1 ratio. In contrast, by manually inspecting newly generated trees indicating putative algal ancestry, we recovered only 51 genes congruent with EGT, of which 23 and 9 were of red and green algal origin, respectively, whereas 19 were ambiguous regarding the algal provenance. Our approach also uncovered 109 genes that branched within a monocot angiosperm clade, most likely representing a contamination. We emphasize the lack of congruence and the subjectivity resulting from independent phylogenomic screens for EGT, which appear to call for extreme caution when drawing conclusions for major evolutionary events. © 2012 The Author(s).

  17. Re-Evaluating the Role of Social Capital in the Career Decision-Making Behaviour of Working-Class Students

    Science.gov (United States)

    Greenbank, Paul

    2009-01-01

    The evidence suggests that working-class students are disadvantaged in the graduate labour market. This article focuses on the extent to which students from working-class backgrounds are disadvantaged in the career decision-making process because of their lack of social capital. The study is based on in-depth interviews with 30 final-year…

  18. Re-evaluation of Yellowstone grizzly bear population dynamics not supported by empirical data: response to Doak & Cutler

    Science.gov (United States)

    van Manen, Frank T.; Ebinger, Michael R.; Haroldson, Mark A.; Harris, Richard B.; Higgs, Megan D.; Cherry, Steve; White, Gary C.; Schwartz, Charles C.

    2014-01-01

    Doak and Cutler critiqued methods used by the Interagency Grizzly Bear Study Team (IGBST) to estimate grizzly bear population size and trend in the Greater Yellowstone Ecosystem. Here, we focus on the premise, implementation, and interpretation of the simulations they used to support their arguments. They argued that the population increases documented by IGBST based on females with cubs-of-the-year were an artifact of increased search effort. However, we demonstrate that their simulations were not reflective of the true observation process, nor did their results provide statistical support for their conclusion. They further argued that survival and reproductive senescence should be incorporated into population projections, but we demonstrate that their choice of extreme mortality risk beyond age 20 and an incompatible baseline fecundity led to erroneous conclusions. The conclusions of Doak and Cutler are unsubstantiated when placed within the context of a thorough understanding of the data, study system, and previous research findings and publications.
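
    To make the projection argument concrete, a toy age-structured sketch in Python contrasts constant adult survival with an imposed sharp decline after age 20; all vital rates below are invented for illustration, not IGBST estimates:

    import numpy as np

    max_age = 30
    ages = np.arange(max_age + 1)
    survival = np.full(max_age + 1, 0.95)       # constant annual female survival
    survival_senescent = survival.copy()
    survival_senescent[ages > 20] = 0.70        # extreme post-20 mortality risk
    fecundity = np.where((ages >= 5) & (ages <= 25), 0.3, 0.0)  # female cubs per female per year

    def project(surv, fec, years=20):
        n = np.ones(max_age + 1)                # initial females per age class
        for _ in range(years):
            births = (fec * n).sum()
            n[1:] = surv[:-1] * n[:-1]          # survivors age one year
            n[0] = births
        return n.sum()

    print(f"no senescence:   {project(survival, fecundity):.1f}")
    print(f"with senescence: {project(survival_senescent, fecundity):.1f}")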

  19. Local or systemic treatment for New World cutaneous leishmaniasis? Re-evaluating the evidence for the risk of mucosal leishmaniasis

    NARCIS (Netherlands)

    Blum, Johannes; Lockwood, Diana N. J.; Visser, Leo; Harms, Gundel; Bailey, Mark S.; Caumes, Eric; Clerinx, Jan; van Thiel, Pieter P. A. M.; Morizot, Gloria; Hatz, Christoph; Buffet, Pierre

    2012-01-01

    This review addresses the question of whether the risk of developing mucosal leishmaniasis (ML) warrants systemic treatment in all patients with New World cutaneous leishmaniasis (CL) or whether local treatment might be an acceptable alternative. The risk of patients with New World CL developing ML...

  20. Advanced Parkinson’s or “complex phase” Parkinson’s disease? Re-evaluation is needed

    OpenAIRE

    Titova, Nataliya; Martinez-Martin, Pablo; Katunina, Elena; Chaudhuri, K. Ray

    2017-01-01

    Holistic management of Parkinson’s disease, now recognised as a combined motor and nonmotor disorder, remains a key unmet need. Such management needs a relatively accurate definition of the various stages of Parkinson’s, from early untreated to late palliative, as each stage calls for personalised therapies. Management also needs a robust knowledge of the progression pattern and clinical heterogeneity of the presentation of Parkinson’s, which may manifest in a motor dominant or nonmotor do...