WorldWideScience

Sample records for continuous endpoints calculation

  1. End-point detection in potentiometric titration by continuous wavelet transform.

    Science.gov (United States)

    Jakubowska, Małgorzata; Baś, Bogusław; Kubiak, Władysław W

    2009-10-15

    The aim of this work was the construction of a new wavelet function and verification that a continuous wavelet transform with a specially defined, dedicated mother wavelet is a useful tool for precise detection of the end-point in a potentiometric titration. The proposed algorithm does not require any initial information about the nature or type of analyte and/or the shape of the titration curve. Signal imperfections, as well as random noise or spikes, have no influence on the operation of the procedure. The new algorithm was first optimized using simulated curves, and experimental data were then considered. In the case of well-shaped and noise-free titration data, the proposed method gives the same accuracy and precision as commonly used algorithms. In the case of noisy or badly shaped curves, however, the presented approach works well (relative error mainly below 2% and coefficients of variability below 5%) while traditional procedures fail. Therefore, the proposed algorithm may be useful in interpreting experimental data and in automating typical titration analysis, especially when random noise interferes with the analytical signal.
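
    A minimal sketch of the idea, assuming PyWavelets is available and substituting the standard 'gaus1' (derivative-of-Gaussian) wavelet for the paper's purpose-built mother wavelet; the synthetic curve and all parameter choices below are illustrative only:

        # Sketch: locating a titration end-point with a continuous wavelet
        # transform. A derivative-like wavelet responds most strongly at the
        # inflection of the sigmoid titration curve.
        import numpy as np
        import pywt

        rng = np.random.default_rng(0)

        # Synthetic potentiometric curve: sigmoid + noise, end-point at 12.5 mL.
        v = np.linspace(0.0, 25.0, 500)            # titrant volume, mL
        v0 = 12.5
        emf = 300.0 / (1.0 + np.exp(-(v - v0) * 4.0)) + rng.normal(0, 2.0, v.size)

        # CWT over a band of scales with a first-derivative-of-Gaussian wavelet.
        scales = np.arange(5, 50)
        coeffs, _ = pywt.cwt(emf, scales, 'gaus1')

        # Average the coefficient magnitude over scales for robustness to noise,
        # then take the position of the maximum as the end-point estimate.
        profile = np.abs(coeffs).mean(axis=0)
        v_endpoint = v[np.argmax(profile)]
        print(f"estimated end-point: {v_endpoint:.2f} mL (true: {v0} mL)")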

  2. Summary statistics for end-point conditioned continuous-time Markov chains

    DEFF Research Database (Denmark)

    Hobolth, Asger; Jensen, Jens Ledet

    Continuous-time Markov chains are a widely used modelling tool. Applications include DNA sequence evolution, ion channel gating behavior and mathematical finance. We consider the problem of calculating properties of summary statistics (e.g. mean time spent in a state, mean number of jumps between two states and the distribution of the total number of jumps) for discretely observed continuous time Markov chains. Three alternative methods for calculating properties of summary statistics are described and the pros and cons of the methods are discussed. The methods are based on (i) an eigenvalue decomposition of the rate matrix, (ii) the uniformization method, and (iii) integrals of matrix exponentials. In particular we develop a framework that allows for analyses of rather general summary statistics using the uniformization method.
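
    Of the three methods, uniformization is the easiest to sketch. The illustrative fragment below (not from the paper) computes the transition probabilities P(t) = exp(Qt) as a Poisson-weighted sum of powers of the uniformized jump matrix; the two-state rate matrix is made up:

        # Uniformization: P(t) = sum_k Poisson(k; mu*t) * R**k with
        # R = I + Q/mu and mu >= max_i(-Q_ii), truncating once the
        # remaining Poisson tail mass is negligible.
        import numpy as np
        from scipy.stats import poisson

        def transition_probs(Q, t, tol=1e-12):
            n = Q.shape[0]
            mu = max(-Q[i, i] for i in range(n))   # uniformization rate
            R = np.eye(n) + Q / mu                 # uniformized jump matrix
            P = np.zeros_like(Q, dtype=float)
            term = np.eye(n)                       # R**k, built iteratively
            k = 0
            while poisson.sf(k - 1, mu * t) > tol: # stop once tail mass < tol
                P += poisson.pmf(k, mu * t) * term
                term = term @ R
                k += 1
            return P

        Q = np.array([[-1.0,  1.0],
                      [ 2.0, -2.0]])               # simple on/off process
        print(transition_probs(Q, 0.5))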

  3. Continuous reactivity calculation for subcritical system

    International Nuclear Information System (INIS)

    Silva, Cristiano; Goncalves, Alessandro C.; Martinez, Aquilino S.; Silva, Fernando C. da

    2011-01-01

    With the rise of a new generation of nuclear reactors, such as the ADS (Accelerator-Driven System), it is important to have a fast and accurate prediction of the variation in reactivity during a possible variation in the intensity of external sources. This paper presents a formulation for the calculation of reactivity in subcritical systems using the inverse method, expressed only in terms of derivatives of the nuclear power. One of the applications of the proposed method is the possibility of developing reactimeters that allow the continuous monitoring of subcritical systems. (author)

  4. Continuous reactivity calculation for subcritical system

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Cristiano; Goncalves, Alessandro C.; Martinez, Aquilino S.; Silva, Fernando C. da, E-mail: cristiano@herzeleid.net, E-mail: aquilino@lmp.ufrj.br, E-mail: fernando@con.ufrj.br [Coordenacao dos Programas de Pos-Graduacao em Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Palma, Daniel A.P., E-mail: dapalma@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2011-07-01

    With the rise of a new generation of nuclear reactors, such as the ADS (Accelerator-Driven System), it is important to have a fast and accurate prediction of the variation in reactivity during a possible variation in the intensity of external sources. This paper presents a formulation for the calculation of reactivity in subcritical systems using the inverse method, expressed only in terms of derivatives of the nuclear power. One of the applications of the proposed method is the possibility of developing reactimeters that allow the continuous monitoring of subcritical systems. (author)

  5. SIMULATION FROM ENDPOINT-CONDITIONED, CONTINUOUS-TIME MARKOV CHAINS ON A FINITE STATE SPACE, WITH APPLICATIONS TO MOLECULAR EVOLUTION.

    Science.gov (United States)

    Hobolth, Asger; Stone, Eric A

    2009-09-01

    Analyses of serially-sampled data often begin with the assumption that the observations represent discrete samples from a latent continuous-time stochastic process. The continuous-time Markov chain (CTMC) is one such generative model whose popularity extends to a variety of disciplines ranging from computational finance to human genetics and genomics. A common theme among these diverse applications is the need to simulate sample paths of a CTMC conditional on realized data that is discretely observed. Here we present a general solution to this sampling problem when the CTMC is defined on a discrete and finite state space. Specifically, we consider the generation of sample paths, including intermediate states and times of transition, from a CTMC whose beginning and ending states are known across a time interval of length T. We first unify the literature through a discussion of the three predominant approaches: (1) modified rejection sampling, (2) direct sampling, and (3) uniformization. We then give analytical results for the complexity and efficiency of each method in terms of the instantaneous transition rate matrix Q of the CTMC, its beginning and ending states, and the length of sampling time T. In doing so, we show that no method dominates the others across all model specifications, and we give explicit proof of which method prevails for any given Q, T, and endpoints. Finally, we introduce and compare three applications of CTMCs to demonstrate the pitfalls of choosing an inefficient sampler.
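
    A hedged sketch of the simplest of the three approaches, plain rejection sampling (the paper's modified variant additionally forces a first jump when the endpoints coincide, which is omitted here); the rate matrix is invented for illustration and absorbing states are assumed absent:

        # Rejection sampling of an endpoint-conditioned CTMC path: simulate
        # forward from state a with the Gillespie algorithm and keep the path
        # only if it occupies state b at time T.
        import numpy as np

        rng = np.random.default_rng(1)

        def sample_conditioned_path(Q, a, b, T, max_tries=100000):
            for _ in range(max_tries):
                t, state = 0.0, a
                path = [(0.0, a)]                  # (time of entry, state)
                while True:
                    rate = -Q[state, state]        # assumes no absorbing states
                    t += rng.exponential(1.0 / rate)
                    if t >= T:
                        break
                    probs = Q[state].copy()
                    probs[state] = 0.0
                    state = rng.choice(len(probs), p=probs / probs.sum())
                    path.append((t, state))
                if state == b:                     # accept only matching endpoint
                    return path
            raise RuntimeError("no accepted path; rejection is inefficient here")

        Q = np.array([[-1.0,  0.7,  0.3],
                      [ 0.5, -1.0,  0.5],
                      [ 0.2,  0.8, -1.0]])
        print(sample_conditioned_path(Q, a=0, b=2, T=1.0))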

  6. Dose-response regressions for algal growth and similar continuous endpoints: Calculation of effective concentrations

    DEFF Research Database (Denmark)

    Christensen, Erik R.; Kusk, Kresten Ole; Nyholm, Niels

    2009-01-01

    We derive equations for the effective concentration giving 10% inhibition (EC10), with 95% confidence limits, for probit (log-normal), Weibull, and logistic dose-response models on the basis of experimentally derived median effective concentrations (EC50s) and the curve slope at the central point (50% inhibition). For illustration, data from closed, freshwater algal assays are analyzed using the green alga Pseudokirchneriella subcapitata with growth rate as the response parameter. Dose-response regressions for the four test chemicals (tetraethylammonium bromide, musculamine, benzonitrile, and 4-4-(trifluoromethyl)phenoxy-phenol) are fitted with a regression program with variance weighting and proper inverse estimation. The Weibull model provides the best fit to the data for all four chemicals. Predicted EC10s (95% confidence limits) from our derived equations are quite accurate; for example, with 4-4-(trifluoromethyl)phenoxy-phenol and the probit …
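
    For concreteness, closed-form ECp values (p being the fractional inhibition) can be written down once each model is parameterized by its EC50 and a slope parameter b. The sketch below uses standard textbook parameterizations, which may differ in detail from the equations derived in the paper:

        # ECp from EC50 and slope b for three common dose-response models.
        import numpy as np
        from scipy.stats import norm

        def ecp_loglogistic(ec50, b, p):
            # inhibition I(c) = 1 / (1 + (ec50/c)**b)
            return ec50 * (p / (1.0 - p)) ** (1.0 / b)

        def ecp_weibull(ec50, b, p):
            # I(c) = 1 - exp(-ln(2) * (c/ec50)**b), so that I(ec50) = 0.5
            return ec50 * (-np.log(1.0 - p) / np.log(2.0)) ** (1.0 / b)

        def ecp_probit(ec50, b, p):
            # I(c) = Phi(b * log10(c / ec50))
            return ec50 * 10.0 ** (norm.ppf(p) / b)

        for f in (ecp_loglogistic, ecp_weibull, ecp_probit):
            print(f.__name__, f(ec50=1.0, b=2.0, p=0.10))   # EC10s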

  7. Simulation from endpoint-conditioned, continuous-time Markov chains on a finite state space, with applications to molecular evolution

    DEFF Research Database (Denmark)

    Hobolth, Asger; Stone, Eric

    2009-01-01

    … computational finance to human genetics and genomics. A common theme among these diverse applications is the need to simulate sample paths of a CTMC conditional on realized data that is discretely observed. Here we present a general solution to this sampling problem when the CTMC is defined on a discrete and finite state space. … In doing so, we show that no method dominates the others across all model specifications, and we give explicit proof of which method prevails for any given Q, T, and endpoints. Finally, we introduce and compare three applications of CTMCs to demonstrate the pitfalls of choosing an inefficient sampler.

  8. Approving cancer treatments based on endpoints other than overall survival: an analysis of historical data using the PACE Continuous Innovation Indicators™ (CII).

    Science.gov (United States)

    Brooks, Neon; Campone, Mario; Paddock, Silvia; Shortenhaus, Scott; Grainger, David; Zummo, Jacqueline; Thomas, Samuel; Li, Rose

    2017-01-01

    There is an active debate about the role that endpoints other than overall survival (OS) should play in the drug approval process. Yet the term 'surrogate endpoint' implies that OS is the only critical metric for regulatory approval of cancer treatments. We systematically analyzed the relationship between U.S. Food and Drug Administration (FDA) approval and publication of OS evidence to better understand the risks and benefits of delaying approval until OS evidence is available. Using the PACE Continuous Innovation Indicators (CII) platform, we analyzed the effects of cancer type, treatment goal, and year of approval on the lag time between FDA approval and publication of the first significant OS finding for 53 treatments approved between 1952 and 2016 for 10 cancer types (n = 71 approved indications). More than 59% of treatments were approved before significant OS data for the approved indication were published. Of the drugs in the sample, 31% had lags between approval and first published OS evidence of 4 years or longer. The average number of years between approval and first OS evidence varied by cancer type and did not reliably predict the eventual amount of OS evidence accumulated. Striking the right balance between early access and minimizing risk is a central challenge for regulators worldwide. We illustrate that endpoints other than OS have long helped to provide timely access to new medicines, including many current standards of care. We found that many critical drugs are approved many years before OS data are published, and that OS may not be the most appropriate endpoint in some treatment contexts. Our examination of approved treatments without significant OS data suggests contexts where OS may not be the most relevant endpoint and highlights the importance of using a wide variety of fit-for-purpose evidence types in the approval process.

  9. Generating bessel functions in mie scattering calculations using continued fractions.

    Science.gov (United States)

    Lentz, W J

    1976-03-01

    A new method of generating the Bessel functions and ratios of Bessel functions necessary for Mie calculations is presented. Accuracy is improved while eliminating the need for extended precision word lengths or large storage capability. The algorithm uses a new technique of evaluating continued fractions that starts at the beginning rather than the tail and has a built-in error check. The continued fraction representations for both spherical Bessel functions and ratios of Bessel functions of consecutive order are presented.
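
    The algorithm that evaluates a continued fraction from its beginning is now usually presented in its modified form with tiny-value guards (a later refinement of the 1976 method). A sketch, tested here on the textbook continued fraction for tan(x) rather than on Bessel-function ratios:

        # Lentz's forward evaluation of f = b0 + a1/(b1 + a2/(b2 + ...)),
        # using the C/D recurrences so the fraction is built from the
        # beginning, with a built-in convergence check.
        import math

        def lentz(b0, a, b, tol=1e-14, tiny=1e-30, max_iter=1000):
            f = b0 if b0 != 0.0 else tiny
            C, D = f, 0.0
            for j in range(1, max_iter):
                aj, bj = a(j), b(j)
                D = bj + aj * D
                D = tiny if D == 0.0 else D        # guard against zero divisors
                C = bj + aj / C
                C = tiny if C == 0.0 else C
                D = 1.0 / D
                delta = C * D
                f *= delta
                if abs(delta - 1.0) < tol:         # built-in error check
                    return f
            raise RuntimeError("continued fraction did not converge")

        # Test: tan(x) = x / (1 - x^2/(3 - x^2/(5 - ...))).
        x = 1.0
        val = lentz(0.0, lambda j: x if j == 1 else -x * x,
                         lambda j: 2 * j - 1)
        print(val, math.tan(x))                    # both ~1.5574077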

  10. Calculation of Wind Power Limit adjusting the Continuation Power Flow

    International Nuclear Information System (INIS)

    Santos Fuentefria, Ariel; Castro Fernández, Miguel; Martínez García, Antonio

    2012-01-01

    The insertion of wind power into a power system is an important issue and can create voltage and frequency instability problems due to the stochastic nature of wind. Knowing the wind power limit is therefore essential, yet only a few methods for calculating it exist in the literature, based on static constraints, dynamic constraints, or both. In this paper, a method for calculating the wind power limit is developed by adjusting the continuation power flow, taking the static constraints into account. The method is complemented with a Minimal Power Production Criterion and is tested on the Isla de la Juventud electric system. The software used in the simulations was the Power System Analysis Toolbox (PSAT). (author)

  11. Molecular recognition in a diverse set of protein-ligand interactions studied with molecular dynamics simulations and end-point free energy calculations.

    Science.gov (United States)

    Wang, Bo; Li, Liwei; Hurley, Thomas D; Meroueh, Samy O

    2013-10-28

    End-point free energy calculations using MM-GBSA and MM-PBSA provide a detailed understanding of molecular recognition in protein-ligand interactions. The binding free energy can be used to rank-order protein-ligand structures in virtual screening for compound or target identification. Here, we carry out free energy calculations for a diverse set of 11 proteins bound to 14 small molecules using extensive explicit-solvent MD simulations. The structures of these complexes were previously solved by crystallography and their binding studied with isothermal titration calorimetry (ITC), enabling direct comparison to the MM-GBSA and MM-PBSA calculations. Four MM-GBSA and three MM-PBSA calculations reproduced the ITC free energy within 1 kcal·mol⁻¹, highlighting the challenges in reproducing the absolute free energy from end-point free energy calculations. MM-GBSA exhibited better rank-ordering with a Spearman ρ of 0.68, compared to 0.40 for MM-PBSA with dielectric constant ε = 1. An increase in ε resulted in significantly better rank-ordering for MM-PBSA (ρ = 0.91 for ε = 10), but larger ε significantly reduced the contributions of electrostatics, suggesting that the improvement is due to the nonpolar and entropy components rather than a better representation of the electrostatics. The SVRKB scoring function applied to MD snapshots resulted in excellent rank-ordering (ρ = 0.81). Calculations of the configurational entropy using normal-mode analysis led to free energies that correlated significantly better with the ITC free energy than the MD-based quasi-harmonic approach, but the computed entropies showed no correlation with the ITC entropy. When the adaptation energy is taken into consideration by running separate simulations for complex, apo, and ligand (MM-PBSA-ADAPT), there is less agreement with the ITC data for the individual free energies, but remarkably good rank-ordering is observed (ρ = 0.89). Interestingly, filtering MD snapshots by prescoring …
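
    The rank-ordering metric here is plain Spearman correlation between computed and calorimetric free energies; a minimal sketch with invented numbers (SciPy assumed available):

        # Spearman's rho between end-point binding free energies and ITC
        # values; the data below are made up for illustration only.
        from scipy.stats import spearmanr

        dg_itc    = [-10.2, -8.9, -7.4, -6.1, -9.5, -5.8]    # kcal/mol
        dg_mmgbsa = [-35.0, -28.1, -22.4, -17.9, -30.3, -19.6]

        rho, pval = spearmanr(dg_itc, dg_mmgbsa)
        print(f"Spearman rho = {rho:.2f} (p = {pval:.3f})")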

  12. A general theory of effect size, and its consequences for defining the benchmark response (BMR) for continuous endpoints.

    Science.gov (United States)

    Slob, Wout

    2017-04-01

    A general theory on effect size for continuous data predicts a relationship between maximum response and within-group variation of biological parameters, which is empirically confirmed by results from dose-response analyses of 27 different biological parameters. The theory shows how effect sizes observed in distinct biological parameters can be compared and provides a basis for a generic definition of small, intermediate and large effects. While the theory is useful for experimental science in general, it has specific consequences for risk assessment: it solves the current debate on the appropriate metric for the Benchmark response in continuous data. The theory shows that scaling the BMR expressed as a percent change in means to the maximum response (in the way specified) automatically takes "natural variability" into account. Thus, the theory supports the underlying rationale of the BMR 1 SD. For various reasons, it is, however, recommended to use a BMR in terms of a percent change that is scaled to maximum response and/or within group variation (averaged over studies), as a single harmonized approach.

  13. A prospective, randomized, blinded-endpoint, controlled study - continuous epidural infusion versus programmed intermittent epidural bolus in labor analgesia

    Directory of Open Access Journals (Sweden)

    Joana Nunes

    Background: There is evidence that administration of a programmed intermittent epidural bolus (PIEB) compared to continuous epidural infusion (CEI) leads to greater analgesia efficacy and maternal satisfaction with decreased anesthetic interventions. Methods: In this study, 166 women with viable pregnancies were included. After an epidural loading dose of 10 mL with Ropivacaine 0.16% plus Sufentanil 10 µg, parturients were randomly assigned to one of three regimens: A - Ropivacaine 0.15% plus Sufentanil 0.2 µg/mL solution as continuous epidural infusion (5 mL/h, beginning immediately after the initial bolus); B - Ropivacaine 0.1% plus Sufentanil 0.2 µg/mL as programmed intermittent epidural bolus; and C - the same solution as group A as programmed intermittent epidural bolus. PIEB regimens were programmed as 10 mL/h starting 60 min after the initial bolus. Rescue boluses of 5 mL of the same solution were administered with the infusion pump. We evaluated maternal satisfaction using a verbal numeric scale from 0 to 10, and also evaluated adverse, maternal, and neonatal outcomes. Results: We analyzed 130 parturients (A = 60; B = 33; C = 37). The median verbal numeric scale score for maternal satisfaction was 8.8 in group A, 8.6 in group B, and 8.6 in group C (p = 0.83). We found a higher caesarean delivery rate in group A (56.7%; p = 0.02). No differences in motor block, instrumental delivery rate, or neonatal outcomes were observed. Conclusions: Maintenance of epidural analgesia with programmed intermittent epidural bolus is associated with a reduced incidence of caesarean delivery with equally high maternal satisfaction and no adverse outcomes.

  14. A prospective, randomized, blinded-endpoint, controlled study - continuous epidural infusion versus programmed intermittent epidural bolus in labor analgesia.

    Science.gov (United States)

    Nunes, Joana; Nunes, Sara; Veiga, Mariano; Cortez, Mara; Seifert, Isabel

    2016-01-01

    There is evidence that administration of a programmed intermittent epidural bolus (PIEB) compared to continuous epidural infusion (CEI) leads to greater analgesia efficacy and maternal satisfaction with decreased anesthetic interventions. In this study, 166 women with viable pregnancies were included. After an epidural loading dose of 10 mL with Ropivacaine 0.16% plus Sufentanil 10 μg, parturients were randomly assigned to one of three regimens: A - Ropivacaine 0.15% plus Sufentanil 0.2 μg/mL solution as continuous epidural infusion (5 mL/h, beginning immediately after the initial bolus); B - Ropivacaine 0.1% plus Sufentanil 0.2 μg/mL as programmed intermittent epidural bolus; and C - the same solution as group A as programmed intermittent epidural bolus. PIEB regimens were programmed as 10 mL/h starting 60 min after the initial bolus. Rescue boluses of 5 mL of the same solution were administered with the infusion pump. We evaluated maternal satisfaction using a verbal numeric scale from 0 to 10, and also evaluated adverse, maternal, and neonatal outcomes. We analyzed 130 parturients (A = 60; B = 33; C = 37). The median verbal numeric scale score for maternal satisfaction was 8.8 in group A, 8.6 in group B, and 8.6 in group C (p = 0.83). We found a higher caesarean delivery rate in group A (56.7%; p = 0.02). No differences in motor block, instrumental delivery rate, or neonatal outcomes were observed. Maintenance of epidural analgesia with programmed intermittent epidural bolus is associated with a reduced incidence of caesarean delivery with equally high maternal satisfaction and no adverse outcomes. Copyright © 2015 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.

  15. Development and validation of continuous energy adjoint-weighted calculations

    International Nuclear Information System (INIS)

    Truchet, Guillaume

    2015-01-01

    A key issue in present-day reactor physics is to propagate input data uncertainties (e.g. nuclear data, manufacturing tolerances, etc.) to the final results of nuclear codes (e.g. k(eff), reaction rates, etc.). To propagate uncertainties, one typically assumes small variations around a reference and first evaluates sensitivity profiles. The problem is that nuclear Monte Carlo codes were, until very recently, unable to straightforwardly compute such sensitivity profiles, even though they are considered reference codes. The first goal of this PhD thesis is to implement a method to calculate k(eff) sensitivity profiles with respect to nuclear data, or to any perturbation, in TRIPOLI-4, the CEA Monte Carlo neutron transport code. To achieve this goal, a method was first developed to calculate the adjoint flux using the Iterated Fission Probability (IFP) principle, which states that the adjoint flux at a given phase-space point is proportional to the neutron importance in a just-critical core after several power iterations. Thanks to these developments, it became possible, for the first time, to calculate the continuous adjoint flux for an actual and complete reactor core configuration. From this new feature, we elaborated a new method able to apply exact perturbation theory directly in Monte Carlo codes. Exact perturbation theory does not rely on small variations, which makes it possible to analyze very complex experiments. Finally, after a deep analysis of the IFP method, this PhD thesis also reproduces and improves an existing method to calculate adjoint-weighted kinetic parameters as well as reference migration areas. (author) [fr

  16. A prospective, randomized, blinded-endpoint, controlled study – continuous epidural infusion versus programmed intermittent epidural bolus in labor analgesia

    Directory of Open Access Journals (Sweden)

    Joana Nunes

    2016-09-01

    Background: There is evidence that administration of a programmed intermittent epidural bolus (PIEB) compared to continuous epidural infusion (CEI) leads to greater analgesia efficacy and maternal satisfaction with decreased anesthetic interventions. Methods: In this study, 166 women with viable pregnancies were included. After an epidural loading dose of 10 mL with Ropivacaine 0.16% plus Sufentanil 10 μg, parturients were randomly assigned to one of three regimens: A – Ropivacaine 0.15% plus Sufentanil 0.2 μg/mL solution as continuous epidural infusion (5 mL/h, beginning immediately after the initial bolus); B – Ropivacaine 0.1% plus Sufentanil 0.2 μg/mL as programmed intermittent epidural bolus; and C – the same solution as group A as programmed intermittent epidural bolus. PIEB regimens were programmed as 10 mL/h starting 60 min after the initial bolus. Rescue boluses of 5 mL of the same solution were administered with the infusion pump. We evaluated maternal satisfaction using a verbal numeric scale from 0 to 10, and also evaluated adverse, maternal, and neonatal outcomes. Results: We analyzed 130 parturients (A = 60; B = 33; C = 37). The median verbal numeric scale score for maternal satisfaction was 8.8 in group A, 8.6 in group B, and 8.6 in group C (p = 0.83). We found a higher caesarean delivery rate in group A (56.7%; p = 0.02). No differences in motor block, instrumental delivery rate, or neonatal outcomes were observed. Conclusions: Maintenance of epidural analgesia with programmed intermittent epidural bolus is associated with a reduced incidence of caesarean delivery with equally high maternal satisfaction and no adverse outcomes.

  17. A calculation technique to improve continuous monitoring of containment integrity

    International Nuclear Information System (INIS)

    Dick, J.E.

    1990-01-01

    The containment envelope of nuclear plants is a passive and extremely effective safety feature. World experience indicates, however, that inadvertent breaches of envelope integrity can go undetected for substantial time periods. Consequently, continuous monitoring of integrity is being closely examined by many containment designers and operators. The most promising approach is to use sensors and systems that automatically measure changes in the mass of air in containment, time integrate any known air mass flow rates across containment boundaries, and perform a mass balance to obtain the air mass leaked. As fluctuations in such measurements are typically too large to enable leakage to be calculated to the desired precision, filtering and statistical techniques must be used to filter out random and time-dependent fluctuations. Current approaches cannot easily deal with nonrandom or systematic fluctuations in the measurements, including pressure changes within the containment. As a result, sampling periods must be kept short, or data measured during periods of varying containment pressure must be discarded. The technique described allows for much longer sampling periods under conditions of fluctuating containment pressure and eliminates the invalidation of data when the containment pressure fluctuation is nonrandom. It should therefore yield a much more precise value for the containment leakage characteristic. It also promises to be able to distinguish the presence of systematic errors unrelated to systematic pressure changes and to establish whether the containment leakage characteristic is laminar or turbulent

  18. Refractory Graft-Versus-Host Disease-Free, Relapse-Free Survival as an Accurate and Easy-to-Calculate Endpoint to Assess the Long-Term Transplant Success.

    Science.gov (United States)

    Kawamura, Koji; Nakasone, Hideki; Kurosawa, Saiko; Yoshimura, Kazuki; Misaki, Yukiko; Gomyo, Ayumi; Hayakawa, Jin; Tamaki, Masaharu; Akahoshi, Yu; Kusuda, Machiko; Kameda, Kazuaki; Wada, Hidenori; Ishihara, Yuko; Sato, Miki; Terasako-Saito, Kiriko; Kikuchi, Misato; Kimura, Shun-Ichi; Tanihara, Aki; Kako, Shinichi; Kanamori, Heiwa; Mori, Takehiko; Takahashi, Satoshi; Taniguchi, Shuichi; Atsuta, Yoshiko; Kanda, Yoshinobu

    2018-02-21

    The aim of this study was to develop a new composite endpoint that accurately reflects the long-term success of allogeneic hematopoietic stem cell transplantation (allo-HSCT), as the conventional graft-versus-host disease (GVHD)-free, relapse-free survival (GRFS) overestimates the impact of GVHD. First, we validated current GRFS (cGRFS), which was recently proposed as a more accurate endpoint of long-term transplant success. cGRFS was defined as survival without disease relapse/progression or active chronic GVHD at a given time after allo-HSCT, calculated using 2 distinct methods: a linear combination of Kaplan-Meier estimates and a multistate modelling approach. Next, we developed a new composite endpoint, refractory GRFS (rGRFS). rGRFS was calculated similarly to conventional GRFS, treating grade III to IV acute GVHD, chronic GVHD requiring systemic treatment, and disease relapse/progression as events, except that GVHD that resolved and did not require systemic treatment at the last evaluation was excluded as an event in rGRFS. The 2 cGRFS curves obtained using the 2 different approaches were superimposed, and both were superior to that of conventional GRFS, reflecting the proportion of patients with resolved chronic GVHD. Finally, the curves of cGRFS and rGRFS overlapped after the first 2 years of post-transplant follow-up. These results suggest that cGRFS and rGRFS more accurately reflect transplant success than conventional GRFS. In particular, rGRFS can be more easily calculated than cGRFS and analyzed with widely used statistical approaches, whereas cGRFS more accurately represents the burden of GVHD-related morbidity in the first 2 years after transplantation. Copyright © 2018 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.

  19. Application of wavelet scaling function expansion continuous-energy resonance calculation method to MOX fuel problem

    International Nuclear Information System (INIS)

    Yang, W.; Wu, H.; Cao, L.

    2012-01-01

    More and more MOX fuel has been used all over the world in the past several decades. Compared with UO2 fuel, it has some new features: the neutron spectrum is harder, and more resonance interference effects arise within the resonance energy range because the MOX fuel contains more resonant nuclides. In this paper, the wavelet scaling function expansion method is applied to study the resonance behavior of plutonium isotopes within MOX fuel. The wavelet scaling function expansion continuous-energy self-shielding method was developed recently and has been validated and verified by comparison to Monte Carlo calculations. In this method, continuous-energy cross-sections are utilized within the resonance energy range, which means it is capable of solving problems with serious resonance interference effects without iterative calculations. The method is therefore naturally suited to MOX fuel resonance calculations. Furthermore, plutonium isotopes show strong oscillations of the total cross-section within the thermal energy range, especially 240Pu and 242Pu. To take the thermal resonance effect of plutonium isotopes into consideration, the wavelet scaling function expansion continuous-energy resonance calculation code WAVERESON is enhanced by applying the free-gas scattering kernel to obtain the continuous-energy scattering source within the thermal energy range (2.1 eV to 4.0 eV), in contrast to the resonance energy range, in which the elastic scattering kernel is utilized. Finally, all of the WAVERESON results are compared with MCNP calculations. (authors)

  20. TBI Endpoints Development

    Science.gov (United States)

    2015-10-01

    … therapy, and early mild physical activity, which result in fewer symptoms, lower mean severity of symptoms, less social disability, and fewer days off work … developing more precise TBI diagnostic tools, clinical endpoints, and effective therapies. We designed and executed an interactive program that combined … surgery, neuropsychology, neuroradiology, psychiatry, neurology, sports medicine, pediatrics, geriatrics, health economics, biostatistics, and informatics.

  1. Development of continuous energy Monte Carlo burn-up calculation code MVP-BURN

    International Nuclear Information System (INIS)

    Okumura, Keisuke; Nakagawa, Masayuki; Sasaki, Makoto

    2001-01-01

    Burn-up calculations based on the continuous energy Monte Carlo method became possible with the development of MVP-BURN. To confirm the reliability of MVP-BURN, it was applied to two numerical benchmark problems: cell burn-up calculations for a High Conversion LWR lattice and for a BWR lattice with burnable poison rods. Major burn-up parameters showed good agreement with the results obtained by a deterministic code (SRAC95). Furthermore, the spent fuel composition calculated by MVP-BURN was compared with a measured one. Atomic number densities of major actinides at 34 GWd/t could be predicted within 10% accuracy. (author)

  2. Continuous Energy, Multi-Dimensional Transport Calculations for Problem Dependent Resonance Self-Shielding

    International Nuclear Information System (INIS)

    Downar, T.

    2009-01-01

    The overall objective of the work here has been to eliminate the approximations used in current resonance treatments by developing continuous energy multi-dimensional transport calculations for problem dependent self-shielding calculations. The work builds on the existing resonance treatment capabilities in the ORNL SCALE code system. Specifically, the methods here utilize the existing continuous energy SCALE5 module, CENTRM, and the multi-dimensional discrete ordinates solver, NEWT, to develop a new code, CENTRM-NEWT. The work addresses specific theoretical limitations in the existing CENTRM resonance treatment, and investigates advanced numerical and parallel computing algorithms for CENTRM and NEWT in order to reduce the computational burden. The result of the work is a new computer code capable of performing problem dependent self-shielding analysis for both existing and proposed GEN-IV fuel designs. The objective of the work was to have an immediate impact on the safety analysis of existing reactors through improvements in the calculation of fuel temperature effects, as well as on the analysis of more sophisticated GEN-IV/NGNP systems through improvements in the depletion/transmutation of actinides for Advanced Fuel Cycle Initiatives.

  3. Using meta-differential evolution to enhance a calculation of a continuous blood glucose level.

    Science.gov (United States)

    Koutny, Tomas

    2016-09-01

    We developed a new model of glucose dynamics that calculates the blood glucose level as a function of transcapillary glucose transport. In previous studies, we validated the model with animal experiments and used an analytical method to determine model parameters. In this study, we validate the model with subjects with type 1 diabetes and, in addition, combine the analytical method with meta-differential evolution. To validate the model with human patients, we obtained a data set from a type 1 diabetes study coordinated by the Jaeb Center for Health Research. We calculated a continuous blood glucose level from the continuously measured interstitial fluid glucose level, using 6 different scenarios to ensure robust validation of the calculation. Over 96% of the calculated blood glucose levels fall within zones A and B of the Clarke Error Grid. No data set required any correction of model parameters during the time course of measurement. We successfully verified the possibility of calculating a continuous blood glucose level for subjects with type 1 diabetes. This study signals a successful transition of our research from animal experiments to human patients. Researchers can test our model with their data online at https://diabetes.zcu.cz. Copyright © 2016 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
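
    As an illustration of the optimization step, the sketch below fits the time constant of a simple first-order interstitial-to-blood transport model (BG ≈ IG + tau·dIG/dt, a common textbook form standing in for the paper's model) to synthetic data, using SciPy's ordinary differential evolution rather than the paper's meta-differential evolution:

        # Fit the transport time constant tau by differential evolution.
        import numpy as np
        from scipy.optimize import differential_evolution

        rng = np.random.default_rng(0)
        t = np.arange(0, 120, 5.0)                 # minutes
        bg_true = 6.0 + 2.5 * np.sin(t / 30.0)     # mmol/L, synthetic profile
        tau_true = 10.0
        # Interstitial glucose lags blood glucose (simple lag proxy + noise).
        ig = 6.0 + 2.5 * np.sin((t - tau_true) / 30.0) + rng.normal(0, 0.05, t.size)

        def bg_from_ig(tau):
            return ig + tau * np.gradient(ig, t)   # invert the transport model

        def loss(params):
            (tau,) = params
            return np.mean((bg_from_ig(tau) - bg_true) ** 2)

        res = differential_evolution(loss, bounds=[(0.0, 30.0)], seed=0)
        print(f"fitted tau = {res.x[0]:.1f} min (true {tau_true})")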

  4. Evaluation of the HTTR criticality and burnup calculations with continuous-energy and multigroup cross sections

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Min-Han; Wang, Jui-Yu [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China); Sheu, Rong-Jiun, E-mail: rjsheu@mx.nthu.edu.tw [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China); Department of Engineering System and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China); Liu, Yen-Wan Hsueh [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China); Department of Engineering System and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China)

    2014-05-01

    The High Temperature Engineering Test Reactor (HTTR) in Japan is a helium-cooled graphite-moderated reactor designed and operated for the future development of high-temperature gas-cooled reactors. Two detailed full-core models of HTTR have been established by using SCALE6 and MCNP5/X, respectively, to study its neutronic properties. Several benchmark problems were repeated first to validate the calculation models. Careful code-to-code comparisons were made to ensure that two calculation models are both correct and equivalent. Compared with experimental data, the two models show a consistent bias of approximately 20–30 mk overestimation in effective multiplication factor for a wide range of core states. Most of the bias could be related to the ENDF/B-VII.0 cross-section library or incomplete modeling of impurities in graphite. After that, a series of systematic analyses was performed to investigate the effects of cross sections on the HTTR criticality and burnup calculations, with special interest in the comparison between continuous-energy and multigroup results. Multigroup calculations in this study were carried out in 238-group structure and adopted the SCALE double-heterogeneity treatment for resonance self-shielding. The results show that multigroup calculations tend to underestimate the system eigenvalue by a constant amount of ∼5 mk compared to their continuous-energy counterparts. Further sensitivity studies suggest the differences between multigroup and continuous-energy results appear to be temperature independent and also insensitive to burnup effects.

  5. Evaluation of the HTTR criticality and burnup calculations with continuous-energy and multigroup cross sections

    International Nuclear Information System (INIS)

    Chiang, Min-Han; Wang, Jui-Yu; Sheu, Rong-Jiun; Liu, Yen-Wan Hsueh

    2014-01-01

    The High Temperature Engineering Test Reactor (HTTR) in Japan is a helium-cooled graphite-moderated reactor designed and operated for the future development of high-temperature gas-cooled reactors. Two detailed full-core models of HTTR have been established by using SCALE6 and MCNP5/X, respectively, to study its neutronic properties. Several benchmark problems were repeated first to validate the calculation models. Careful code-to-code comparisons were made to ensure that two calculation models are both correct and equivalent. Compared with experimental data, the two models show a consistent bias of approximately 20–30 mk overestimation in effective multiplication factor for a wide range of core states. Most of the bias could be related to the ENDF/B-VII.0 cross-section library or incomplete modeling of impurities in graphite. After that, a series of systematic analyses was performed to investigate the effects of cross sections on the HTTR criticality and burnup calculations, with special interest in the comparison between continuous-energy and multigroup results. Multigroup calculations in this study were carried out in 238-group structure and adopted the SCALE double-heterogeneity treatment for resonance self-shielding. The results show that multigroup calculations tend to underestimate the system eigenvalue by a constant amount of ∼5 mk compared to their continuous-energy counterparts. Further sensitivity studies suggest the differences between multigroup and continuous-energy results appear to be temperature independent and also insensitive to burnup effects

  6. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains.

    Science.gov (United States)

    Tataru, Paula; Hobolth, Asger

    2011-12-05

    Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications, past evolutionary events (exact times and types of changes) are inaccessible and the past must be inferred from DNA sequence data observed in the present. We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned on the end-points of the chain, and compare their performance with respect to accuracy and running time. The first algorithm is based on an eigenvalue decomposition of the rate matrix (EVD), the second on uniformization (UNI), and the third on integrals of matrix exponentials (EXPM). The implementation in R of the algorithms is available at http://www.birc.au.dk/~paula/. We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore we find that UNI is usually faster than EVD.
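
    The EXPM approach has a particularly compact implementation via Van Loan's block-matrix identity: the upper-right block of expm([[Q, E], [0, Q]]·T) equals the integral of exp(Qs)·E·exp(Q(T-s)) over [0, T]. A sketch for the conditional expected time spent in a state, with an invented rate matrix:

        # Expected time in state i on [0,T], conditioned on X_0 = a, X_T = b.
        import numpy as np
        from scipy.linalg import expm

        def expected_time_in_state(Q, i, a, b, T):
            n = Q.shape[0]
            E = np.zeros((n, n))
            E[i, i] = 1.0                          # indicator "payoff" matrix
            A = np.zeros((2 * n, 2 * n))
            A[:n, :n] = Q
            A[:n, n:] = E
            A[n:, n:] = Q
            M = expm(A * T)
            integral = M[:n, n:]                   # int_0^T e^{Qs} E e^{Q(T-s)} ds
            P = expm(Q * T)
            return integral[a, b] / P[a, b]        # condition on the endpoints

        Q = np.array([[-1.0,  0.7,  0.3],
                      [ 0.5, -1.0,  0.5],
                      [ 0.2,  0.8, -1.0]])
        times = [expected_time_in_state(Q, i, a=0, b=2, T=1.0) for i in range(3)]
        print(times, "sum =", sum(times))          # conditional times sum to T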

  7. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains

    Directory of Open Access Journals (Sweden)

    Tataru Paula

    2011-12-01

    Background: Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications, past evolutionary events (exact times and types of changes) are inaccessible and the past must be inferred from DNA sequence data observed in the present. Results: We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned on the end-points of the chain, and compare their performance with respect to accuracy and running time. The first algorithm is based on an eigenvalue decomposition of the rate matrix (EVD), the second on uniformization (UNI), and the third on integrals of matrix exponentials (EXPM). The implementation in R of the algorithms is available at http://www.birc.au.dk/~paula/. Conclusions: We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore we find that UNI is usually faster than EVD.

  8. New sampling method in continuous energy Monte Carlo calculation for pebble bed reactors

    International Nuclear Information System (INIS)

    Murata, Isao; Takahashi, Akito; Mori, Takamasa; Nakagawa, Masayuki.

    1997-01-01

    A pebble bed reactor generally has double heterogeneity arising from two kinds of spherical fuel element: the core contains many fuel balls piled up randomly at a high packing fraction, and each fuel ball contains a large number of small fuel particles that are also distributed randomly. In this study, to realize precise neutron transport calculation of such reactors with the continuous energy Monte Carlo method, a new sampling method has been developed. The new method has been implemented in the general purpose Monte Carlo code MCNP to develop a modified version, MCNP-BALL. The method was validated by calculating the inventory of spherical fuel elements sampled successively during the transport calculation, and by performing criticality calculations in ordered packing models. The results confirmed that the inventory of spherical fuel elements could be reproduced using MCNP-BALL within a sufficient accuracy of 0.2%. The comparison of criticality calculations in ordered packing models between MCNP-BALL and the reference method shows excellent agreement in neutron spectrum as well as multiplication factor. MCNP-BALL enables us to analyze pebble bed type cores such as PROTEUS precisely with the continuous energy Monte Carlo method. (author)

  9. Calculation of the gamma-dose rate from a continuously emitted plume

    International Nuclear Information System (INIS)

    Huebschmann, W.; Papadopoulos, D.

    1975-06-01

    A computer model is presented which calculates the long-term gamma dose rate caused by the radioactive off-gas continuously emitted from a stack. The statistical distribution of wind direction, wind velocity, and stability categories is taken into account. The emitted activity, distributed in the atmosphere according to these statistics, is assumed to be concentrated at the mesh points of a three-dimensional grid. The grid spacing and the integration limits determine the accuracy as well as the computing time needed. When calculating the dose rate in a given wind direction, the contribution of the activity emitted into the neighbouring sectors is evaluated. This influence is demonstrated in the results, which are calculated with an error below 3% and compared to the dose rate distribution curves of the submersion model and the model developed by K.J. Vogt. (orig.) [de
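
    A toy point-kernel sketch of the grid idea (not the report's code): the plume activity is lumped at mesh points, and the photon flux at a receptor is the sum of attenuated point-source kernels with a crude linear build-up factor. The attenuation coefficient and activity field below are placeholders, assuming one photon per decay:

        # Sum of point-source kernels over a 3-D activity grid.
        import numpy as np

        mu = 0.01                                  # 1/m, air attenuation (assumed)
        receptor = np.array([0.0, 0.0, 1.0])       # m

        # Hypothetical mesh: activities A_k (Bq) concentrated at points r_k.
        points = np.array([[x, y, z] for x in (50, 100, 200)
                                     for y in (-50, 0, 50)
                                     for z in (20, 60, 100)], dtype=float)
        activity = np.full(len(points), 1.0e9)     # Bq per mesh point (assumed)

        r = np.linalg.norm(points - receptor, axis=1)
        buildup = 1.0 + mu * r                     # crude linear build-up factor
        flux = np.sum(activity * buildup * np.exp(-mu * r) / (4.0 * np.pi * r**2))
        print(f"photon flux at receptor: {flux:.3e} photons m^-2 s^-1")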

  10. Continuous energy Monte Carlo calculations for randomly distributed spherical fuels based on statistical geometry model

    Energy Technology Data Exchange (ETDEWEB)

    Murata, Isao [Osaka Univ., Suita (Japan); Mori, Takamasa; Nakagawa, Masayuki; Itakura, Hirofumi

    1996-03-01

    The method to calculate neutronics parameters of a core composed of randomly distributed spherical fuels has been developed based on a statistical geometry model with a continuous energy Monte Carlo method. This method was implemented in a general purpose Monte Carlo code MCNP, and a new code MCNP-CFP had been developed. This paper describes the model and method how to use it and the validation results. In the Monte Carlo calculation, the location of a spherical fuel is sampled probabilistically along the particle flight path from the spatial probability distribution of spherical fuels, called nearest neighbor distribution (NND). This sampling method was validated through the following two comparisons: (1) Calculations of inventory of coated fuel particles (CFPs) in a fuel compact by both track length estimator and direct evaluation method, and (2) Criticality calculations for ordered packed geometries. This method was also confined by applying to an analysis of the critical assembly experiment at VHTRC. The method established in the present study is quite unique so as to a probabilistic model of the geometry with a great number of spherical fuels distributed randomly. Realizing the speed-up by vector or parallel computations in future, it is expected to be widely used in calculation of a nuclear reactor core, especially HTGR cores. (author).

  11. Calculating a Continuous Metabolic Syndrome Score Using Nationally Representative Reference Values.

    Science.gov (United States)

    Guseman, Emily Hill; Eisenmann, Joey C; Laurson, Kelly R; Cook, Stephen R; Stratbucker, William

    2018-02-26

    The prevalence of metabolic syndrome in youth varies on the basis of the classification system used, prompting implementation of continuous scores; however, the use of these scores is limited to the sample from which they were derived. We sought to describe the derivation of the continuous metabolic syndrome score using nationally representative reference values in a sample of obese adolescents and a national sample obtained from National Health and Nutrition Examination Survey (NHANES) 2011-2012. Clinical data were collected from 50 adolescents seeking obesity treatment at a stage 3 weight management center. A second analysis relied on data from adolescents included in NHANES 2011-2012, performed for illustrative purposes. The continuous metabolic syndrome score was calculated by regressing individual values onto nationally representative age- and sex-specific standards (NHANES III). Resultant z scores were summed to create a total score. The final sample included 42 obese adolescents (15 male and 35 female subjects; mean age, 14.8 ± 1.9 years) and an additional 445 participants from NHANES 2011-2012. Among the clinical sample, the mean continuous metabolic syndrome score was 4.16 ± 4.30, while the NHANES sample mean was quite a bit lower, at -0.24 ± 2.8. We provide a method to calculate the continuous metabolic syndrome by comparing individual risk factor values to age- and sex-specific percentiles from a nationally representative sample. Copyright © 2018 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
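
    A simplified sketch of the scoring step, summing component z-scores against reference values (the paper regresses onto age- and sex-specific NHANES III standards; the flat reference means and SDs below are invented placeholders):

        # Continuous metabolic syndrome score as a sum of component z-scores.
        ref = {  # component: (reference mean, reference SD), hypothetical
            "waist_cm":     (78.0, 10.5),
            "sbp_mmHg":     (110.0, 9.0),
            "triglyc_mgdl": (90.0, 35.0),
            "hdl_mgdl":     (52.0, 11.0),
            "glucose_mgdl": (88.0, 7.5),
        }

        def cmets_score(values):
            score = 0.0
            for name, (mean, sd) in ref.items():
                z = (values[name] - mean) / sd
                if name == "hdl_mgdl":
                    z = -z                         # high HDL is protective
                score += z
            return score

        patient = {"waist_cm": 96, "sbp_mmHg": 124, "triglyc_mgdl": 150,
                   "hdl_mgdl": 38, "glucose_mgdl": 101}
        print(f"continuous MetS score: {cmets_score(patient):.2f}")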

  12. Calculating the electron temperature in the lightning channel by continuous spectrum

    Science.gov (United States)

    Xiangcheng, DONG; Jianhong, CHEN; Xiufang, WEI; Ping, YUAN

    2017-12-01

    Based on the theory of plasma continuous radiation, the relationship between the emission intensity of bremsstrahlung and recombination radiation and the plasma electron temperature is obtained. During the development of a return stroke of a ground flash, the intensity of the continuous radiation spectrum is separated on the basis of spectra with obviously different luminous intensity at two moments. The electron temperature of the lightning discharge channel is then obtained through curve fitting of the continuous spectrum intensity. It is found that the electron temperature increases with increasing wavelength and begins to decrease after a peak; the peak temperature of the two spectra is close to 25 000 K. For comparison with the discrete-spectrum result, the electron temperature is also fitted from the O I and N II lines of the spectrum. The comparison shows that the high temperature value is in good agreement with the temperature of the lightning core current channel obtained from the ion line information, while the low temperature is close to the atomic-line result in the high band and lower than the atomic-line result in the low band, which reflects the temperature of the luminous channel of the outer corona.

  13. Highly parallel demagnetization field calculation using the fast multipole method on tetrahedral meshes with continuous sources

    Science.gov (United States)

    Palmesi, P.; Exl, L.; Bruckner, F.; Abert, C.; Suess, D.

    2017-11-01

    The long-range magnetic field is the most time-consuming part of micromagnetic simulations, and computational improvements can relieve problems related to this bottleneck. This work presents an efficient implementation of the Fast Multipole Method [FMM] for the magnetic scalar potential as used in micromagnetics. The novelty lies in extending the FMM to linearly magnetized tetrahedral sources, making it interesting also for other areas of computational physics. We treat the near field directly and use (exact) numerical integration on the multipole expansion in the far field. This approach tackles important issues like the vectorial and continuous nature of the magnetic field. By using the FMM, the calculations scale linearly in time and memory.

  14. Continuous energy Monte Carlo method based homogenization multi-group constants calculation

    International Nuclear Information System (INIS)

    Li Mancang; Wang Kan; Yao Dong

    2012-01-01

    The efficiency of the standard two-step reactor physics calculation relies on the accuracy of the multi-group constants produced by the assembly-level homogenization process. In contrast to traditional deterministic methods, generating the homogenized cross sections via the Monte Carlo method overcomes the difficulties in geometry and treats energy as a continuum, thus providing more accurate parameters. Besides, the same code and data bank can be used for a wide range of applications, making Monte Carlo codes versatile for homogenization. As the first stage in realizing Monte Carlo based lattice homogenization, the track length scheme is used as the foundation of cross section generation, which is straightforward. The scattering matrix and Legendre components, however, require special techniques, and the Scattering Event method was proposed to solve this problem. There are no continuous-energy counterparts in the Monte Carlo calculation for neutron diffusion coefficients, so P1 cross sections were used to calculate the diffusion coefficients for diffusion reactor simulator codes. B_N theory is applied to take the leakage effect into account when an infinite lattice of identical symmetric motives is assumed. The MCMC code was developed and applied to four assembly configurations to assess its accuracy and applicability. At the core level, a PWR prototype core is examined. The results show that the Monte Carlo based multi-group constants behave well on average, and the method could be applied to complicated nuclear reactor core configurations to gain higher accuracy. (authors)
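
    The track-length scheme itself is compact: the homogenized group constant is the track-length-weighted average of the pointwise cross section over flight segments whose energy falls in the group. A toy sketch with a made-up cross section and tally:

        # Track-length estimator for a homogenized one-group cross section.
        import numpy as np

        rng = np.random.default_rng(0)

        def sigma_t(E):                            # placeholder cross section, barns
            return 2.0 + 1.0 / np.sqrt(E)

        # Fake tally: (energy in MeV, track length in cm) per flight segment.
        segments = [(rng.uniform(1e-8, 2.0), rng.exponential(1.0))
                    for _ in range(100000)]

        g_lo, g_hi = 0.1, 1.0                      # one energy group (MeV)
        num = sum(l * sigma_t(E) for E, l in segments if g_lo <= E < g_hi)
        den = sum(l for E, l in segments if g_lo <= E < g_hi)  # group flux estimate
        print(f"homogenized sigma_t for the group: {num / den:.3f} barns")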

  15. POPFOOD - a computer code for calculating ingestion collective doses from continuous atmospheric releases

    International Nuclear Information System (INIS)

    Hotson, J.; Stacey, A.; Nair, S.

    1980-07-01

    The basic methodology incorporated in the POPFOOD computer code is described, which may be used to calculate equilibrium collective dose rates associated with continuous atmospheric releases and arising from consumption of a broad range of food products. The standard data libraries associated with the code are also described. These include a data library, based on the 1972 agricultural census, describing the spatial distribution of production, in England, Wales and Scotland, of the following food products: milk; beef and veal; pork, bacon and ham; poultrymeat; eggs; mutton and lamb; root vegetables; green vegetables; fruit; cereals. Illustrative collective dose calculations were made for the case of 1 Ci per year emissions of 131I, tritium and 14C from a typical rural UK site. The calculations indicate that the ingestion pathway results in a greater collective dose than that via inhalation, with the contributions from consumption of root and green vegetables and cereals being of comparable significance to that from liquid milk consumption, in all three cases. (author)

  16. Electromagnetic field modeling and ion optics calculations for a continuous-flow AMS system

    International Nuclear Information System (INIS)

    Han, B.X.; Reden, K.F. von; Roberts, M.L.; Schneider, R.J.; Hayes, J.M.; Jenkins, W.J.

    2007-01-01

    A continuous-flow 14C AMS (CFAMS) system is under construction at the NOSAMS facility. This system is based on a NEC Model 1.5SDH-1 0.5 MV Pelletron accelerator and will utilize a combination of a microwave ion source (MIS) and a charge exchange canal (CXC) to produce negative carbon ions from a continuously flowing stream of CO2 gas. For high-efficiency transmission of the large-emittance, large-energy-spread beam from the ion source unit, a large-acceptance and energy-achromatic injector consisting of a 45° electrostatic spherical analyzer (ESA) and a 90° double-focusing magnet has been designed. The 45° ESA is rotatable to accommodate a 134-sample MC-SNICS as a second ion source. The high-energy achromat (90° double-focusing magnet and 90° ESA) has also been customized for large acceptance. Electromagnetic field modeling and ion optics calculations of the beamline were done with Infolytica MagNet, ElecNet, and Trajectory Evaluator. PBGUNS and SIMION were used for the modeling of the ion source unit.

  17. Shared Contract-Obedient Endpoints

    Directory of Open Access Journals (Sweden)

    Étienne Lozes

    2012-12-01

    Most of the existing verification techniques for message-passing programs suppose either that channel endpoints are used in a linear fashion, where at most one thread may send or receive from an endpoint at any given time, or that endpoints may be used arbitrarily by any number of threads. The former approach usually forbids the sharing of channels while the latter limits what is provable about programs. In this paper we propose a midpoint between these techniques by extending a proof system based on separation logic to allow sharing of endpoints. We identify two independent mechanisms for supporting sharing: an extension of fractional shares to endpoints, and a new technique based on what we call reflexive ownership transfer. We demonstrate on a number of examples that a linear treatment of sharing is possible.

  18. The intermediate endpoint effect in logistic and probit regression

    Science.gov (United States)

    MacKinnon, DP; Lockwood, CM; Brown, CH; Wang, W; Hoffman, JM

    2010-01-01

    Background: An intermediate endpoint is hypothesized to be in the middle of the causal sequence relating an independent variable to a dependent variable. The intermediate variable is also called a surrogate or mediating variable and the corresponding effect is called the mediated, surrogate endpoint, or intermediate endpoint effect. Clinical studies are often designed to change an intermediate or surrogate endpoint and, through this intermediate change, influence the ultimate endpoint. In many intermediate endpoint clinical studies the dependent variable is binary, and logistic or probit regression is used. Purpose: The purpose of this study is to describe a limitation of a widely used approach to assessing intermediate endpoint effects and to propose an alternative method, based on products of coefficients, that yields more accurate results. Methods: The intermediate endpoint model for a binary outcome is described for a true binary outcome and for a dichotomization of a latent continuous outcome. Plots of true values and a simulation study are used to evaluate the different methods. Results: Distorted estimates of the intermediate endpoint effect and incorrect conclusions can result from the application of widely used methods to assess the intermediate endpoint effect. The same problem occurs for the proportion of an effect explained by an intermediate endpoint, which has been suggested as a useful measure for identifying intermediate endpoints. A solution to this problem is given based on the relationship between latent variable modeling and logistic or probit regression. Limitations: More complicated intermediate variable models are not addressed in the study, although the methods described in the article can be extended to these more complicated models. Conclusions: Researchers are encouraged to use an intermediate endpoint method based on the product of regression coefficients. A common method based on the difference in coefficients can lead to distorted …
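
    A sketch of the recommended product-of-coefficients estimate for a binary outcome, on simulated data (statsmodels assumed available; the coefficients a and b follow the usual mediation-literature notation rather than anything specific to this paper):

        # Product-of-coefficients mediated effect: a from the linear model
        # M ~ X, b from the logit model Y ~ X + M; mediated effect ~ a*b.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(42)
        n = 5000
        x = rng.normal(size=n)                     # treatment / exposure
        m = 0.5 * x + rng.normal(size=n)           # intermediate endpoint (a = 0.5)
        logit_p = -0.5 + 0.1 * x + 0.8 * m         # direct effect 0.1, b = 0.8
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

        a = sm.OLS(m, sm.add_constant(x)).fit().params[1]
        fit_y = sm.Logit(y, sm.add_constant(np.column_stack([x, m]))).fit(disp=0)
        b = fit_y.params[2]                        # coefficient on M
        print(f"a = {a:.3f}, b = {b:.3f}, mediated effect (a*b) = {a * b:.3f}")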

  19. Calorimetry end-point predictions

    International Nuclear Information System (INIS)

    Fox, M.A.

    1981-01-01

    This paper describes a portion of the work presently in progress at Rocky Flats in the field of calorimetry. In particular, calorimetry end-point predictions are outlined. The problems associated with end-point predictions and the progress made in overcoming these obstacles are discussed. The two major problems, noise and an accurate description of the heat function, are dealt with to obtain the most accurate results. Data are taken from an actual calorimeter and are processed by means of three different noise reduction techniques. The processed data are then utilized by one to four algorithms, depending on the accuracy desired, to determine the end-point.
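
    The record does not give the algorithms themselves, so the following Python sketch illustrates one common style of end-point prediction under an assumed model: smooth the noisy calorimeter readings, fit an exponential approach to equilibrium, and read off the asymptote as the predicted end-point. The functional form and all numbers are hypothetical.

      import numpy as np
      from scipy.optimize import curve_fit

      def heat_curve(t, p_end, a, tau):
          # assumed exponential approach to the equilibrium (end-point) reading
          return p_end - a * np.exp(-t / tau)

      t = np.linspace(0, 60, 121)                      # minutes, early part of a run
      true = heat_curve(t, p_end=2.50, a=1.8, tau=25.0)
      noisy = true + np.random.default_rng(1).normal(0, 0.01, t.size)

      # one of several possible noise-reduction choices: a moving average
      smoothed = np.convolve(noisy, np.ones(5) / 5, mode="same")

      popt, _ = curve_fit(heat_curve, t, smoothed, p0=(noisy[-1], 1.0, 10.0))
      print("predicted end-point power: %.3f W" % popt[0])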

  20. 40 CFR 63.5885 - How do I calculate percent reduction to demonstrate compliance for continuous lamination/casting...

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 ... Pollutants: Reinforced Plastic Composites Production Testing and Initial Compliance Requirements § 63.5885 How do I calculate percent reduction to demonstrate compliance for continuous lamination/casting...

  1. Kinetics and dose calculations of ampicillin and gentamicin given as continuous intravenous infusion during parenteral nutrition in 88 newborn infants

    DEFF Research Database (Denmark)

    Colding, H; Møller, S; Bentzon, M W

    1983-01-01

    Ampicillin and gentamicin were administered continuously intravenously to 88 newborn infants using individually calculated dosages. For infants with a mean value of plasma clearance of the antibiotics, it was calculated that the serum ampicillin and gentamicin concentrations would be between 35-5...
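
    The individual dose calculation implied here rests on the standard steady-state relation for a continuous infusion, Css = R0 / CL; a minimal sketch with illustrative numbers (not the study's dosages):

      # Steady-state concentration for a continuous IV infusion: Css = R0 / CL.
      dose_rate_mg_per_h = 25.0   # infusion rate, hypothetical
      clearance_l_per_h = 0.50    # individually estimated plasma clearance
      css = dose_rate_mg_per_h / clearance_l_per_h
      print("predicted steady-state concentration: %.1f mg/L" % css)  # 50.0 mg/L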

  2. Calculations of the mean regional dispersion of a radioactive gas emitted from a continuous source

    International Nuclear Information System (INIS)

    Persson, C.

    1974-10-01

    The mean dispersion of a radioactive gas over distances of the order of 1000 kilometers is estimated with the aid of a statistical treatment of computed geostrophic trajectories and simplified vertical diffusion calculations based on the eddy diffusivity theory. (author)

  3. Endpoints in pediatric pain studies

    NARCIS (Netherlands)

    M. van Dijk (Monique); I. Ceelie (Ilse); D. Tibboel (Dick)

    2011-01-01

    Assessing pain intensity in (preverbal) children is more difficult than in adults. Tools to measure pain are being used as primary endpoints [e.g., pain intensity, time to first (rescue) analgesia, total analgesic consumption, adverse effects, and long-term effects] in studies on the

  4. Endpoint singularities in unintegrated parton distributions

    CERN Document Server

    Hautmann, F

    2007-01-01

    We examine the singular behavior from the endpoint region x -> 1 in parton distributions unintegrated in both longitudinal and transverse momenta. We identify and regularize the singularities by using the subtraction method, and compare this with the cut-off regularization method. The counterterms for the distributions with subtractive regularization are given in coordinate space by compact all-order expressions in terms of eikonal-line operators. We carry out an explicit calculation at one loop for the unintegrated quark distribution. We discuss the relation of the unintegrated parton distributions in subtractive regularization with the ordinary parton distributions.

  5. Evolutionary calculations for planetary nebula nuclei with continuing mass loss and realistic starting conditions

    International Nuclear Information System (INIS)

    Faulkner, D.J.; Wood, P.R.

    1984-01-01

    Evolutionary calculations for nuclei of planetary nebulae (NPN) are described. They were made using assumptions regarding the mass of the NPN, the phase in the He shell flash cycle at which the NPN leaves the AGB, and the time variation of the mass loss rate. Comparison of the evolutionary tracks with the observational Harman-Seaton sequence indicates that some recently published NPN luminosities may be too low by a factor of three. Comparison of the calculated timescales with the observed properties of NPN and of white dwarfs provides marginal evidence that the PN ejection is initiated by the helium shell flash itself.

  6. A Simplified Calculation of a Continuous Flow Packed Contactor with Help of Characteristic Times

    OpenAIRE

    Sovová, Helena

    2011-01-01

    The proposed approach is illustrated with several examples of supercritical fluid extraction kinetics. Simple expressions for the calculation of characteristic times of both the overall extraction and the individual extraction steps are derived from mass balance equations and applied to experimental data from typical extractions from plants with supercritical CO2, such as the extraction of oils from seeds or of essential oils from aromatic plants.

  7. Calculation of thermal stress condition in long metal cylinder under heating by continuous laser radiation

    International Nuclear Information System (INIS)

    Uglov, A.A.; Uglov, S.A.; Kulik, A.N.

    1997-01-01

    A method for determining the temperature field and the induced thermal stresses in a long metallic cylinder heated by a normally distributed cw-laser heat flux is presented. The graphically presented calculation results show that the stress maximum lies behind the center of the laser heat spot, along its line of movement on the cylinder surface.

  8. Multi-Group Library Generation with Explicit Resonance Interference Using Continuous Energy Monte Carlo Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ho Jin; Cho, Jin Young [KAERI, Daejeon (Korea, Republic of); Kim, Kang Seog [Oak Ridge National Laboratory, Oak Ridge (United States); Hong, Ser Gi [Kyung Hee University, Yongin (Korea, Republic of)

    2016-05-15

    In this study, multi-group cross section libraries for the DeCART code were generated using a new procedure. The new procedure includes generating the RI tables based on MC calculations, correcting the effective fission product yield calculations, and considering most of the fission products as resonant nuclides. KAERI (Korea Atomic Energy Research Institute) has developed the transport lattice codes KARMA (Kernel Analyzer by Ray-tracing Method for fuel Assembly) and DeCART (Deterministic Core Analysis based on Ray Tracing) for multi-group neutron transport analysis of light water reactors (LWRs). These codes adopt the method of characteristics (MOC) to solve the multi-group transport equation and resonance fixed source problem, and the subgroup and direct iteration methods with resonance integral tables for resonance treatment. With the development of the DeCART and KARMA codes, KAERI has established its own library generation system for multi-group transport calculations. In the KAERI library generation system, the multi-group average cross sections and resonance integral (RI) tables are generated and edited using PENDF (point-wise ENDF) and GENDF (group-wise ENDF) files produced by the NJOY code. The new method does not need additional processing because the MC method can handle any geometry information and material composition. In this study, the new method is applied to the dominant resonance nuclides such as U-235 and U-238, and the conventional method is applied to the minor resonance nuclides. To examine the newly generated multi-group cross section libraries, various benchmark calculations such as pin-cell, FA, and core depletion problems were performed and the results compared with reference solutions. Overall, the results of the new method agree well with the reference solutions. The new procedure based on the MC method was verified and provides a multi-group library that can be used in SMR nuclear design analysis.
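
    The flux-weighted condensation underlying any multi-group library can be shown in a few lines; this Python sketch collapses a toy point-wise cross section over an assumed 1/E spectrum and is not a reproduction of the KAERI/NJOY processing chain.

      import numpy as np

      def trapz(y, x):  # small helper to avoid numpy version differences
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      E = np.logspace(-5, 7, 20_000)        # energy grid, eV
      phi = 1.0 / E                         # toy 1/E slowing-down spectrum
      sigma = 5.0 + 50.0 / np.sqrt(E)       # toy 1/v-like cross section, barns

      # sigma_g = integral(sigma * phi) / integral(phi) over each group
      bounds = np.logspace(-5, 7, 13)       # 12 groups
      sigma_g = []
      for lo, hi in zip(bounds[:-1], bounds[1:]):
          sel = (E >= lo) & (E < hi)
          sigma_g.append(trapz(sigma[sel] * phi[sel], E[sel]) / trapz(phi[sel], E[sel]))
      print(np.round(sigma_g, 2))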

  9. Continuous hemofiltration dose calculation in a newborn patient with congenital heart disease and preoperative renal failure.

    Science.gov (United States)

    Ricci, Z; Polito, A; Giorni, C; Di Chiara, L; Ronco, C; Picardo, S

    2007-03-01

    To report a case of a newborn patient with renal failure due to polycystic kidneys requiring renal replacement therapy, and total anomalous pulmonary venous return requiring major cardiosurgical intervention. Pediatric cardiosurgery operating room and pediatric cardiologic intensive care. A 6-day-old newborn child weighing 3.1 kg. Renal function (creatinine value and urine output) was monitored during the course of the operation and intraoperative renal replacement therapy was not initiated. Serum creatinine concentration decreased from 4.4 to 3 mg/dL at cardiopulmonary bypass (CPB) start and to 1.5 at the end of surgery: the creatinine decrease was provided by the dilutional effect of CPB priming and the infusion of fresh blood from transfusions together with an adequate filtration rate (800 mL in about 120 minutes). After the operation, extracorporeal membrane oxygenation (ECMO) for ventricular dysfunction and continuous hemofiltration for anuria refractory to medical therapy were prescribed. The hemofiltration machine was set in parallel with the ECMO machine at a blood flow rate of 60 ml/min and a predilution replacement solution infusion of 600 ml/h (4.5 ml/min of creatinine clearance once adjusted for the extracorporeal circuits; 3000 mL/m2 hemofiltration): after a single hemofiltration session lasting 96 hours, serum creatinine reached optimal steady state levels around 0.5 mg/dL on postoperative days 2 and 3. Administration of intraoperative continuous hemofiltration is not mandatory in the case of a 3-kg newborn patient with established renal failure needing major cardiosurgery: hemodilution secondary to CPB, transfusion of hemoderivates, and an optimal UF rate appear to be effective methods for achieving solute removal. If postoperative continuous hemofiltration is started, however, a "dialytic dose" of 4.5 ml/min allows an adequate creatinine clearance, quick achievement of a steady state of serum creatinine concentration and an eventual acceptable rate of
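
    A rough sketch of the clearance arithmetic behind the prescribed "dialytic dose", using a common textbook formula for predilution CVVH with the case report's flow settings (the sieving coefficient and the circuit correction are assumptions):

      # Predilution CVVH: effluent clearance is reduced by dilution of the blood
      # entering the filter. A common textbook estimate, sieving coefficient
      # S ~ 1 for creatinine (assumed):
      q_blood = 60.0          # ml/min, from the case report
      q_rep = 600.0 / 60.0    # ml/min (600 ml/h predilution replacement fluid)
      s_coeff = 1.0
      cl_predilution = q_rep * s_coeff * q_blood / (q_blood + q_rep)
      print("estimated creatinine clearance: %.1f ml/min" % cl_predilution)  # ~8.6
      # The case report quotes 4.5 ml/min once adjusted for the extracorporeal
      # (ECMO) circuits, so circuit corrections matter; this is only a sketch.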

  10. Matrix continued-fraction calculation of localization length in disordered systems

    International Nuclear Information System (INIS)

    Pastawski, H.M.; Weisz, J.F.

    1983-01-01

    A matrix continued-fraction method is used to study the localization length of the states at the band center of a two-dimensional crystal with disorder given by the Anderson model. It is found that exponentially localized states, which scale according to the work of MacKinnon and Kramer, become weakly localized as the disorder becomes weaker, and there is some critical disorder for which the localization length does not saturate with the width of the strips; this confirms the results found by Pichard and Sarma. Weakly localized states are also found in one dimension for w/v

  12. Multi-Group Covariance Data Generation from Continuous-Energy Monte Carlo Transport Calculations

    International Nuclear Information System (INIS)

    Lee, Dong Hyuk; Shim, Hyung Jin

    2015-01-01

    The sensitivity and uncertainty (S/U) methodology in deterministic tools has been utilized for quantifying uncertainties of nuclear design parameters induced by those of nuclear data. S/U analyses based on multi-group cross sections can be conducted with a simple error propagation formula using the sensitivities of nuclear design parameters to multi-group cross sections and the covariance of the multi-group cross sections. The multi-group covariance data required for S/U analysis have been produced by nuclear data processing codes such as ERRORJ or PUFF from the covariance data in evaluated nuclear data files. However, in the existing nuclear data processing codes, an asymptotic neutron flux energy spectrum, not the exact one, has been applied to the multi-group covariance generation, since the flux spectrum is unknown before the neutron transport calculation. This can cause an inconsistency between the sensitivity profiles and the covariance data of the multi-group cross sections, especially in the resolved resonance energy region, because the sensitivities we usually use are resonance self-shielded while the multi-group cross sections produced from an asymptotic flux spectrum are infinitely diluted. In order to estimate the multi-group covariance in the ongoing MC simulation, mathematical derivations for converting the double integral into a single one by a sampling method are introduced, along with the procedure for the multi-group covariance tally.
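
    The error propagation formula referred to above is the usual "sandwich rule"; a minimal numpy sketch with invented sensitivities and covariances:

      import numpy as np

      # First-order ("sandwich") propagation of multi-group cross-section
      # covariance C to the variance of a response R with sensitivity vector S:
      #   var(R)/R^2 = S^T C S   (relative quantities)
      S = np.array([0.02, -0.15, 0.40])          # dR/R per dsigma_g/sigma_g
      C = np.array([[4.0e-4, 1.0e-4, 0.0],
                    [1.0e-4, 9.0e-4, 2.0e-4],
                    [0.0,    2.0e-4, 1.6e-3]])   # relative covariance matrix
      rel_var = S @ C @ S
      print("relative uncertainty of R: %.3f%%" % (100 * np.sqrt(rel_var)))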

  13. Mathematical simulation and calculation of continuous countercurrent process of ion-exchange extraction of strontium from strongly mineralized solutions

    International Nuclear Information System (INIS)

    Nikashina, V.A.; Venitsianov, E.V.; Ivanov, V.A.; Gur'yanova, L.N.; Nikolaev, N.P.; Baturova, L.L.; Moskovskij Gosudarstvennyj Univ., Moscow

    1993-01-01

    A program 'Countercurrent' is developed for the simulation of a continuous ion-exchange extraction of strontium from strongly mineralized solutions containing NaCl and CaCl2, using the carboxylic cation exchanger KB-4 in countercurrent columns. The program allows one to calculate the conditions of Ca and Sr separation depending on the mode of operation at the sorption and regeneration stages, to calculate the residual Sr content on an overloaded sorbent and the Sr separation on incompletely regenerated KB-4, and to find the optimal separation conditions.

  14. Application of a generalized Leibniz rule for calculating electromagnetic fields within continuous source regions

    International Nuclear Information System (INIS)

    Silberstein, M.

    1991-01-01

    In deriving the electric and magnetic fields in a continuous source region by differentiating the vector potential, Yaghjian (1985) explains that the central obstacle is the dependence of the integration limits on the differentiation variable. Since it is not mathematically rigorous to assume the curl and integral signs are interchangeable, he uses an integration variable substitution to circumvent this problematic dependence. Here, an alternative derivation is presented, which evaluates the curl of the vector potential volume integral directly, retaining the dependence of the limits of integration on the differentiation variable. It involves deriving a three-dimensional version of Leibniz' rule for differentiating an integral with variable limits of integration, and using the generalized rule to find the Maxwellian and cavity fields in the source region. 7 refs
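
    For reference, the one-dimensional Leibniz rule being generalized is standard; the three-dimensional statement is given here in the familiar transport-theorem form as an illustration, not necessarily the paper's exact generalization:

      \frac{d}{dx}\int_{a(x)}^{b(x)} f(x,t)\,dt
        = \int_{a(x)}^{b(x)} \frac{\partial f}{\partial x}\,dt
        + f\bigl(x,b(x)\bigr)\,b'(x) - f\bigl(x,a(x)\bigr)\,a'(x)

      \frac{d}{ds}\int_{V(s)} f(\mathbf{r}',s)\,dV'
        = \int_{V(s)} \frac{\partial f}{\partial s}\,dV'
        + \oint_{S(s)} f\,\mathbf{v}_S\cdot\hat{\mathbf{n}}\,dS'

    Here v_S is the velocity of the bounding surface with respect to the parameter s. The surface term, which survives because the integration region depends on the differentiation variable, is the boundary contribution that distinguishes the cavity field from the Maxwellian field in the source region.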

  15. Calculation and simulation on mid-spatial frequency error in continuous polishing

    International Nuclear Information System (INIS)

    Xie Lei; Zhang Yunfan; You Yunfeng; Ma Ping; Liu Yibin; Yan Dingyao

    2013-01-01

    Based on a theoretical model of continuous polishing, the influence of processing parameters on the polishing result was discussed. Possible causes of mid-spatial frequency error in the process were analyzed. The simulation results demonstrated that the low spatial frequency error was mainly caused by a large rotating ratio. The mid-spatial frequency error decreases as the low spatial frequency error becomes lower. The regular groove shape was the primary cause of the mid-spatial frequency error. When irregular and fitful grooves were adopted, the mid-spatial frequency error could be lessened. Moreover, workpiece swing can make the polishing process more uniform and reduce the mid-spatial frequency error caused by fixed-eccentric plane polishing. (authors)

  16. Comparison of methods for calculating conditional expectations of sufficient statistics for continuous time Markov chains

    DEFF Research Database (Denmark)

    Tataru, Paula Cristina; Hobolth, Asger

    2011-01-01

    BACKGROUND: Continuous time Markov chains (CTMCs) are a widely used model for describing the evolution of DNA sequences on the nucleotide, amino acid or codon level. The sufficient statistics for CTMCs are the time spent in a state and the number of changes between any two states. In applications past evolutionary events (exact times and types of changes) are unaccessible and the past must be inferred from DNA sequence data observed in the present. RESULTS: We describe and implement three algorithms for computing linear combinations of expected values of the sufficient statistics, conditioned on the end-points of the chain. An implementation of the algorithms is available at www.birc.au.dk/~paula/. CONCLUSIONS: We use two different models to analyze the accuracy and eight experiments to investigate the speed of the three algorithms. We find that they have similar accuracy and that EXPM is the slowest method. Furthermore we find that UNI is usually ...
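
    The conditional expectation these algorithms target can be checked by brute force; the following Python sketch evaluates E[time spent in state c | X(0)=a, X(T)=b] = integral over [0,T] of P(t)_{ac} P(T-t)_{cb} dt, divided by P(T)_{ab}, by direct quadrature for a toy Jukes-Cantor chain. This is the naive reference computation, not the EVD, UNI or EXPM algorithms themselves.

      import numpy as np
      from scipy.linalg import expm

      # Jukes-Cantor-style 4-state rate matrix (off-diagonals mu/3, rows sum to 0)
      mu = 1.0
      Q = (mu / 3.0) * (np.ones((4, 4)) - 4.0 * np.eye(4))

      T, a, b, c = 0.5, 0, 2, 2        # condition on X(0)=a and X(T)=b
      ts = np.linspace(0.0, T, 201)
      f = np.array([expm(Q * t)[a, c] * expm(Q * (T - t))[c, b] for t in ts])
      num = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(ts)))  # trapezoid rule
      print("E[time in c | endpoints] = %.4f" % (num / expm(Q * T)[a, b]))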

  17. Establishing a group of endpoints to support collective operations without specifying unique identifiers for any endpoints

    Science.gov (United States)

    Archer, Charles J.; Blocksom, Michael A.; Ratterman, Joseph D.; Smith, Brian E.; Xue, Hanghon

    2016-02-02

    A parallel computer executes a number of tasks; each task includes a number of endpoints, and the endpoints are configured to support collective operations. In such a parallel computer, establishing a group of endpoints includes: receiving a user specification of a set of endpoints included in a global collection of endpoints, where the user specification defines the set in accordance with a predefined virtual representation of the endpoints, the predefined virtual representation being a data structure setting forth an organization of the tasks and endpoints included in the global collection, and the user specification defines the set without naming any particular endpoint; and defining a group of endpoints in dependence upon the predefined virtual representation and the user specification.

  18. From Risk Models to Loan Contracts: Austerity as the Continuation of Calculation by Other Means

    Directory of Open Access Journals (Sweden)

    Pierre Pénet

    2014-06-01

    This article analyses how financial actors sought to minimise financial uncertainties during the European sovereign debt crisis by employing simulations as legal instruments of market regulation. We first contrast two roles that simulations can play in sovereign debt markets: 'simulation-hypotheses', which work as bundles of constantly updated hypotheses with the goal of better predicting financial risks; and 'simulation-fictions', which provide fixed narratives about the present with the purpose of postponing the revision of market risks. Using ratings reports published by Moody's on Greece and European Central Bank (ECB) regulations, we show that Moody's stuck to a simulation-fiction and displayed rating inertia on Greece's trustworthiness to prevent the destabilising effects that further downgrades would have on Greek borrowing costs. We also show that the multi-notch downgrade issued by Moody's in June 2010 followed the ECB's decision to remove ratings from its collateral eligibility requirements. Then, as regulators moved from 'regulation through model' to 'regulation through contract', ratings stopped functioning as simulation-fictions. Indeed, the conditions of the Greek bailout implemented in May 2010 replaced the CRAs' models as the main simulation-fiction, which market actors employed to postpone the prospect of a Greek default. We conclude by presenting austerity measures as instruments of calculative governance rather than ideological compacts.

  19. Sample size determination in clinical trials with multiple endpoints

    CERN Document Server

    Sozu, Takashi; Hamasaki, Toshimitsu; Evans, Scott R

    2015-01-01

    This book integrates recent methodological developments for calculating the sample size and power in trials with more than one endpoint considered as multiple primary or co-primary, offering an important reference work for statisticians working in this area. The determination of sample size and the evaluation of power are fundamental and critical elements in the design of clinical trials. If the sample size is too small, important effects may go unnoticed; if the sample size is too large, it represents a waste of resources and unethically puts more participants at risk than necessary. Recently many clinical trials have been designed with more than one endpoint considered as multiple primary or co-primary, creating a need for new approaches to the design and analysis of these clinical trials. The book focuses on the evaluation of power and sample size determination when comparing the effects of two interventions in superiority clinical trials with multiple endpoints. Methods for sample size calculation in clin...
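
    As a flavor of the calculations the book covers, this Python sketch computes the power of a trial with two co-primary endpoints, where both one-sided tests must be significant; the effect sizes, correlation and sample size are invented inputs.

      import numpy as np
      from scipy.stats import multivariate_normal, norm

      n, alpha = 100, 0.025               # per-group size, one-sided alpha
      delta = np.array([0.35, 0.30])      # standardized effect sizes
      rho = 0.5                           # correlation between endpoints
      mean_z = delta * np.sqrt(n / 2.0)   # means of the two z-statistics
      z_crit = norm.ppf(1 - alpha)
      R = np.array([[1.0, rho], [rho, 1.0]])

      # P(Z1 > z_crit, Z2 > z_crit) via symmetry of the centered bivariate normal
      power = multivariate_normal(mean=np.zeros(2), cov=R).cdf(mean_z - z_crit)
      print("power with both endpoints required: %.3f" % power)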

  20. Doppler Temperature Coefficient Calculations Using Adjoint-Weighted Tallies and Continuous Energy Cross Sections in MCNP6

    Science.gov (United States)

    Gonzales, Matthew Alejandro

    The calculation of the thermal neutron Doppler temperature reactivity feedback coefficient, a key parameter in the design and safe operation of advanced reactors, using first order perturbation theory in continuous energy Monte Carlo codes is challenging as the continuous energy adjoint flux is not readily available. Traditional approaches of obtaining the adjoint flux attempt to invert the random walk process as well as require data corresponding to all temperatures and their respective temperature derivatives within the system in order to accurately calculate the Doppler temperature feedback. A new method has been developed using adjoint-weighted tallies and On-The-Fly (OTF) generated continuous energy cross sections within the Monte Carlo N-Particle (MCNP6) transport code. The adjoint-weighted tallies are generated during the continuous energy k-eigenvalue Monte Carlo calculation. The weighting is based upon the iterated fission probability interpretation of the adjoint flux, which is the steady state population in a critical nuclear reactor caused by a neutron introduced at that point in phase space. The adjoint-weighted tallies are produced in a forward calculation and do not require an inversion of the random walk. The OTF cross section database uses a high order functional expansion between points on a user-defined energy-temperature mesh, in which the coefficients of a polynomial fitting in temperature are stored. The coefficients of the fits are generated before runtime and called upon during the simulation to produce cross sections at any given energy and temperature. The polynomial form of the OTF cross sections allows the possibility of obtaining temperature derivatives of the cross sections on-the-fly. The use of Monte Carlo sampling of adjoint-weighted tallies and the capability of computing derivatives of continuous energy cross sections with respect to temperature are used to calculate the Doppler temperature coefficient in a research
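
    The polynomial-in-temperature storage scheme lends itself to a compact illustration; the sketch below fits a cubic to toy cross-section values on a temperature grid and evaluates both sigma(T) and its analytic temperature derivative, in the spirit of (but not reproducing) the OTF database.

      import numpy as np

      T_grid = np.array([300., 600., 900., 1200., 1500.])     # K
      sigma_grid = np.array([11.2, 9.8, 9.1, 8.7, 8.4])       # barns, toy data

      # fit a cubic in T once (before runtime), store only the coefficients
      coeffs = np.polynomial.polynomial.polyfit(T_grid, sigma_grid, deg=3)
      dcoeffs = np.polynomial.polynomial.polyder(coeffs)

      # during the random walk: evaluate sigma(T) and d(sigma)/dT on the fly
      T = 875.0
      sigma = np.polynomial.polynomial.polyval(T, coeffs)
      dsigma_dT = np.polynomial.polynomial.polyval(T, dcoeffs)
      print("sigma(T)=%.3f b, dsigma/dT=%.5f b/K" % (sigma, dsigma_dT))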

  1. Mathematical simulation and calculation of the continuous countercurrent process of ion-exchange extraction of strontium from strongly mineralized solutions

    International Nuclear Information System (INIS)

    Nikashina, V.A.; Guryanova, L.N.; Baturova, L.L.; Venetsianov, E.V.; Ivanov, V.A.; Nikolaev, N.P.

    1993-01-01

    The program "Countercurrent" is developed for the simulation of a continuous ion-exchange extraction of strontium from strongly mineralized NaCl and CaCl2 solutions using a KB-4 carboxylic cation-exchanger in countercurrent columns. The program allows one to calculate the conditions of Ca and Sr separation depending on the mode of operation at the sorption and regeneration stages, the residual Sr content on the overloaded sorbent, and the Sr separation on incompletely regenerated KB-4. It also makes it possible to find the optimal separation conditions. The program "Countercurrent" can also be used to simulate other ion-exchange processes.

  2. How to calculate clearance of highly protein-bound drugs during continuous venovenous hemofiltration demonstrated with flucloxacillin.

    Science.gov (United States)

    Meyer, Brigitte; Ahmed el Gendy, Salwa; Delle Karth, Georg; Locker, Gottfried J; Heinz, Gottfried; Jaeger, Walter; Thalhammer, Florian

    2003-01-01

    Flucloxacillin is an important antimicrobial drug in the treatment of infections with Staphylococcus aureus and therefore is often used in staphylococcal infections. Furthermore, flucloxacillin has a high protein binding rate, as do, for example, ceftriaxone and teicoplanin, drugs which were formerly characterized as not being dialyzable. The pharmacokinetic parameters of 4.0 g flucloxacillin every 8 h were examined in 10 intensive care patients during continuous venovenous hemofiltration (CVVH) using a polyamide capillary hemofilter. In addition, the difficulty of calculating the hemofiltration clearance of a highly protein-bound drug is described. Flucloxacillin serum levels were significantly lowered (56.9 +/- 24.0%) even though only 15% of the drug was detected in the ultrafiltrate. Elimination half-life, total body clearance and sieving coefficient were 4.9 +/- 0.7 h, 117.2 +/- 79.1 ml/min and 0.21 +/- 0.09, respectively. These discrepancies can be explained by the high protein binding of flucloxacillin, the adsorbing property of polyamide and the equation used to calculate hemofiltration clearance. The unbound fraction of a 4.0 g flucloxacillin dosage facilitates a time above the minimum inhibitory concentration (T > MIC) of 60% only for strains up to a MIC of 0.5 mg/l. Based on the data of this study, we conclude that intensive care patients with staphylococcal infections on CVVH should be treated with 4.0 g flucloxacillin every 8 h, which was safe and well tolerated. Moreover, further studies with highly protein-bound drugs are recommended to check the classical 'hemodialysis' equation as the standard equation in calculating the CVVH clearance of highly protein-bound drugs. Copyright 2003 S. Karger AG, Basel
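
    The core of the difficulty can be seen in two lines of arithmetic: the filter clears mainly the unbound fraction, so a clearance estimate built from total plasma concentration misleads. A sketch using the study's measured sieving coefficient and an illustrative ultrafiltration rate:

      # CL_CVVH ~= Q_uf * S, with the sieving coefficient S = C_uf / C_plasma
      # tracking the free fraction for a highly protein-bound drug.
      q_uf = 25.0        # ml/min ultrafiltration rate, illustrative assumption
      f_unbound = 0.05   # ~95% protein binding, flucloxacillin-like
      s = 0.21           # sieving coefficient measured in the study (membrane
                         # adsorption pushes S above the free fraction alone)
      print("clearance from free fraction: %.2f ml/min" % (q_uf * f_unbound))
      print("clearance from measured S:    %.2f ml/min" % (q_uf * s))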

  3. Pollutant threshold concentration determination in marine ecosystems using an ecological interaction endpoint

    International Nuclear Information System (INIS)

    Wang, Changyou; Liang, Shengkang; Guo, Wenting; Yu, Hua; Xing, Wenhui

    2015-01-01

    The threshold concentrations of pollutants are determined by extrapolating single-species effect data to community-level effects. This assumes the most sensitive endpoint of the life cycle of individuals and the species sensitivity distribution from single-species toxic effect tests, thus ignoring the ecological interactions. The uncertainties due to this extrapolation can be partially overcome using the equilibrium point of a customized ecosystem. This method incorporates ecological interactions and integrates the effects on growth, survival, and ingestion into a single effect measure, the equilibrium point excursion in the customized ecosystem, in order to describe the toxic effects on plankton. A case study showed that the threshold concentration of copper calculated with the endpoint of the equilibrium point was 10 μg L⁻¹, which is significantly different from the threshold calculated with a single-species endpoint. The endpoint calculated using this method provides a more relevant measure of the ecological impact than any single individual-level endpoint.

  4. Establishing a group of endpoints in a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.; Xue, Hanhong

    2016-02-02

    A parallel computer executes a number of tasks; each task includes a number of endpoints, and the endpoints are configured to support collective operations. In such a parallel computer, establishing a group of endpoints includes: receiving a user specification of a set of endpoints included in a global collection of endpoints, where the user specification defines the set in accordance with a predefined virtual representation of the endpoints, the predefined virtual representation being a data structure setting forth an organization of the tasks and endpoints included in the global collection, and the user specification defines the set without naming any particular endpoint; and defining a group of endpoints in dependence upon the predefined virtual representation and the user specification.

  5. SWAT4.0 - The integrated burnup code system driving continuous energy Monte Carlo codes MVP, MCNP and deterministic calculation code SRAC

    International Nuclear Information System (INIS)

    Kashima, Takao; Suyama, Kenya; Takada, Tomoyuki

    2015-03-01

    There have been two versions of SWAT depending on details of its development history: the revised SWAT, which uses the deterministic calculation code SRAC as a neutron transport solver, and SWAT3.1, which uses the continuous energy Monte Carlo code MVP or MCNP5 for the same purpose. It takes several hours, however, to execute one calculation with a continuous energy Monte Carlo code, even on the supercomputer of the Japan Atomic Energy Agency. Moreover, two-dimensional burnup calculation is not practical with the revised SWAT because of problems in producing effective cross section data and applying them to arbitrary fuel geometries when a calculation model has multiple burnup zones. Therefore, SWAT4.0 has been developed by adding to SWAT3.1 a function that utilizes the deterministic code SRAC2006, which has a shorter calculation time, as an outer neutron transport module for burnup calculation. SWAT4.0 enables two-dimensional burnup calculation by providing an input data template of SRAC2006 in the SWAT4.0 input data and updating the atomic number densities of the burnup zones in each burnup step. This report describes the outline, input data instructions, and calculation examples of SWAT4.0. (author)
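
    The burnup step that such a system iterates is, at its core, a matrix exponential solution of the depletion (Bateman) equations; a toy Python illustration with an invented three-nuclide chain (not SWAT4.0's actual solver):

      import numpy as np
      from scipy.linalg import expm

      # dN/dt = A N  =>  N(t) = expm(A t) N(0), with A built from decay
      # constants and flux-weighted reaction rates. Illustrative values only:
      phi = 3.0e14                          # n/cm^2/s
      sig_c = np.array([1.0e-24, 5.0e-24])  # capture cross sections, cm^2
      lam3 = 1.0e-6                         # decay constant of nuclide 3, 1/s
      A = np.array([
          [-sig_c[0] * phi, 0.0,             0.0],
          [ sig_c[0] * phi, -sig_c[1] * phi, 0.0],
          [ 0.0,             sig_c[1] * phi, -lam3]])
      N0 = np.array([1.0e22, 0.0, 0.0])     # atoms/cm^3
      N = expm(A * 30 * 86400.0) @ N0       # one 30-day step at constant flux
      print(N)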

  6. A summation free β+-endpoint spectrometer

    International Nuclear Information System (INIS)

    Keller, H.; Kirchner, R.; Klepper, O.; Roeckl, E.; Schardt, D.; Simon, R.S.; Kleinheinz, P.; Liang, C.F.; Paris, P.

    1990-08-01

    A β+ endpoint spectrometer is described, in which positrons are observed in an 11-mm thick silicon detector in coincidence with subsequent γ-rays measured in a germanium detector, and in which summing of the positron energy with the annihilation radiation is prevented by detecting both 511-keV quanta in opposite segments of a BGO ring surrounding the silicon detector. The procedure for measuring and analyzing the data is outlined for the decay of the 11/2⁻ isomer of 149Tb; its endpoint energy is determined to be 1853(10) keV, in agreement with the literature. The accuracy and reliability of β+ endpoint measurements is discussed in comparison to the EC/β+ ratio method. (orig.)

  7. End-point sharpness in thermometric titrimetry.

    Science.gov (United States)

    Tyrrell, H J

    1967-07-01

    It is shown that the sharpness of an end-point in a thermometric titration in which the simple reaction A + B ⇌ AB takes place depends on Kc(A'), where K is the equilibrium constant for the reaction and c(A') is the total concentration of the titrand (A) in the reaction mixture. The end-point is sharp if (i) the enthalpy change in the reaction is not negligible, and (ii) Kc(A') > 10³. This shows that it should, for example, be possible to titrate 0.1 M acid, pK(A) = 10, using a thermometric end-point. Some aspects of thermometric titrimetry when Kc(A') < 10³ are also considered.

  8. Reducing sample size by combining superiority and non-inferiority for two primary endpoints in the Social Fitness study.

    Science.gov (United States)

    Donkers, Hanneke; Graff, Maud; Vernooij-Dassen, Myrra; Nijhuis-van der Sanden, Maria; Teerenstra, Steven

    2017-01-01

    In randomized controlled trials, two endpoints may be necessary to capture the multidimensional concept of the intervention and the objectives of the study adequately. We show how to calculate sample size when defining success of a trial by combinations of superiority and/or non-inferiority aims for the endpoints. The randomized controlled trial design of the Social Fitness study uses two primary endpoints, which can be combined into five different scenarios for defining success of the trial. We show how to calculate power and sample size for each scenario and compare these for different settings of power of each endpoint and correlation between them. Compared to a single primary endpoint, using two primary endpoints often gives more power when success is defined as: improvement in one of the two endpoints and no deterioration in the other. This also gives better power than when success is defined as: improvement in one prespecified endpoint and no deterioration in the remaining endpoint. When two primary endpoints are equally important, but a positive effect in both simultaneously is not per se required, the objective of having one superior and the other (at least) non-inferior could make sense and reduce sample size. Copyright © 2016 Elsevier Inc. All rights reserved.
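
    Success criteria that mix superiority and non-inferiority are easy to evaluate by simulation; this Python sketch estimates the power of "superiority on at least one endpoint and non-inferiority on the other" for two correlated normal endpoints, with all inputs assumed rather than taken from the Social Fitness study.

      import numpy as np

      rng = np.random.default_rng(7)
      n, reps = 80, 20_000
      delta = np.array([0.30, 0.20])      # true standardized effects
      rho, margin, z = 0.4, -0.20, 1.96   # correlation, NI margin, critical value
      se = np.sqrt(2.0 / n)               # SE of each standardized difference
      cov = np.array([[1.0, rho], [rho, 1.0]]) * se**2

      est = rng.multivariate_normal(delta, cov, size=reps)
      sup = est / se > z                  # superiority test per endpoint
      ni = (est - margin) / se > z        # non-inferiority test per endpoint
      success = (sup[:, 0] & ni[:, 1]) | (sup[:, 1] & ni[:, 0])
      print("power: %.3f" % success.mean())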

  9. Forecasting interest rates with shifting endpoints

    DEFF Research Database (Denmark)

    Van Dijk, Dick; Koopman, Siem Jan; Wel, Michel van der

    2014-01-01

    We consider forecasting the term structure of interest rates with the assumption that factors driving the yield curve are stationary around a slowly time-varying mean or 'shifting endpoint'. The shifting endpoints are captured using either (i) time series methods (exponential smoothing), (ii) long-range survey forecasts of either interest rates or inflation and output growth, or (iii) exponentially smoothed realizations of these macro variables. Allowing for shifting endpoints in yield curve factors provides substantial and significant gains in out-of-sample predictive accuracy, relative to stationary and random walk benchmarks. Forecast improvements are largest for long-maturity interest rates and for long-horizon forecasts.

  10. A suggestion of a new method for the calculation of the coating thickness in continuous hot-dip galvanizing

    Energy Technology Data Exchange (ETDEWEB)

    Jo, C. M.; Kwon, Y. D.; Kwon, S. B. [Kyungpook National University, Daegu (Korea, Republic of); Kim, G. Y. [POSCO Technical Research laboratories, Gumgo-dong (Korea, Republic of)

    2011-11-15

    It is known that the distributions of the impinging pressure gradient and the shear stress at the strip surface play a decisive role in determining the coating thickness in hot-dip galvanizing. To predict the exact coating thickness, it is therefore essential that the distributions of the impinging wall jet pressure and of the shear stress acting between the liquid film and the jet stream be measured (or calculated) exactly for each specific coating condition. So far, to obtain the impinging wall jet pressure, it has been assumed that the jet issuing from an air-knife is similar to the Hiemenz plane stagnation flow, and the wall shear stress has generally been predicted by an equation that assumes a non-negative Gaussian profile of the impinging wall jet pressure, so that it is intrinsically unreliable for some impinging wall jet regions and nozzle systems. Nevertheless, no suitable method has been available to cope with the difficulties in measuring or calculating the shear stress and the impinging wall jet pressure, a difficulty which causes inaccuracy in the coating thickness prediction. In the present study, we therefore suggest a new method, named a two-step calculation method, to calculate the final coating thickness, consisting of an air jet analysis and a coating thickness calculation. From a comparison of the results, one may confirm the validity of the newly suggested method.

  11. Application of the single-channel continuous synthesis method to criticality and power distribution calculations in thermal reactors

    International Nuclear Information System (INIS)

    Medrano Asensio, Gregorio.

    1976-06-01

    A detailed power distribution calculation in a large power reactor requires the solution of the multigroup 3D diffusion equations. Using the finite difference method, this computation is too expensive to be performed for design purposes. This work is devoted to the single-channel continuous synthesis method: the choice of the trial functions and the determination of the mixing functions are discussed in detail; 2D and 3D results are presented. The method is applied to the calculation of the IAEA "Benchmark" reactor, and the results obtained are compared with a finite element resolution and with published results.

  12. Gene expression profiling reveals multiple toxicity endpoints induced by hepatotoxicants

    Energy Technology Data Exchange (ETDEWEB)

    Huang Qihong; Jin Xidong; Gaillard, Elias T.; Knight, Brian L.; Pack, Franklin D.; Stoltz, James H.; Jayadev, Supriya; Blanchard, Kerry T

    2004-05-18

    Microarray technology continues to gain increased acceptance in the drug development process, particularly at the stage of toxicology and safety assessment. In the current study, microarrays were used to investigate gene expression changes associated with hepatotoxicity, the most commonly reported clinical liability of pharmaceutical agents. Acetaminophen, methotrexate, methapyrilene, furan and phenytoin were used as benchmark compounds capable of inducing specific but different types of hepatotoxicity. The goal of the work was to define gene expression profiles capable of distinguishing the different subtypes of hepatotoxicity. Sprague-Dawley rats were orally dosed with acetaminophen (single dose, 4500 mg/kg for 6, 24 and 72 h), methotrexate (1 mg/kg per day for 1, 7 and 14 days), methapyrilene (100 mg/kg per day for 3 and 7 days), furan (40 mg/kg per day for 1, 3, 7 and 14 days) or phenytoin (300 mg/kg per day for 14 days). Hepatic gene expression was assessed using toxicology-specific gene arrays containing 684 target genes or expressed sequence tags (ESTs). Principal component analysis (PCA) of the gene expression data provided a clear distinction of each compound, suggesting that gene expression data can be used to discern different hepatotoxic agents and toxicity endpoints. Gene expression data were applied to the multiplicity-adjusted permutation test, and significantly changed genes were categorized and correlated to hepatotoxic endpoints. Repression of enzymes involved in lipid oxidation (medium-chain acyl-CoA dehydrogenase, enoyl-CoA hydratase, very long-chain acyl-CoA synthetase) was associated with microvesicular lipidosis. Likewise, subsets of genes associated with hepatocellular necrosis, inflammation, hepatitis, bile duct hyperplasia and fibrosis have been identified. The current study illustrates that expression profiling can be used to: (1) distinguish different hepatotoxic endpoints; (2) predict the development of toxic endpoints; and

  13. Biomarkers and correlative endpoints for immunotherapy trials.

    Science.gov (United States)

    Morse, Michael A; Osada, Takuya; Hobeika, Amy; Patel, Sandip; Lyerly, H Kim

    2013-01-01

    Immunotherapies for lung cancer are reaching phase III clinical trials, but ultimate success will likely depend on developing biomarkers to guide development and on choosing the patient populations most likely to benefit. Because the immune response to cancer involves multiple cell types and cytokines, some spatially and temporally separated, it is likely that multiple biomarkers will be required to fully characterize the efficacy of a vaccine and predict eventual benefit. Peripheral blood markers of response, such as the ELISPOT assay and cytokine flow cytometry analyses of peripheral blood mononuclear cells following immunotherapy, remain the standard approach, but it is increasingly important to obtain tissue to study the immune response at the site of the tumor. Earlier clinical endpoints such as response rate and progression-free survival do not correlate with the overall survival demonstrated for some immunotherapies, suggesting the need to develop other intermediary clinical endpoints. Insofar as all these biomarkers and surrogate endpoints are relevant in multiple malignancies, it may be possible to extrapolate findings to immunotherapy of lung cancer.

  14. Improved Endpoints for Cancer Immunotherapy Trials

    Science.gov (United States)

    Eggermont, Alexander M. M.; Janetzki, Sylvia; Hodi, F. Stephen; Ibrahim, Ramy; Anderson, Aparna; Humphrey, Rachel; Blumenstein, Brent; Wolchok, Jedd

    2010-01-01

    Unlike chemotherapy, which acts directly on the tumor, cancer immunotherapies exert their effects on the immune system and demonstrate new kinetics that involve building a cellular immune response, followed by changes in tumor burden or patient survival. Thus, adequate design and evaluation of some immunotherapy clinical trials require a new development paradigm that includes reconsideration of established endpoints. Between 2004 and 2009, several initiatives facilitated by the Cancer Immunotherapy Consortium of the Cancer Research Institute and partner organizations systematically evaluated an immunotherapy-focused clinical development paradigm and created the principles for redefining trial endpoints. On this basis, a body of clinical and laboratory data was generated that supports three novel endpoint recommendations. First, cellular immune response assays generate highly variable results. Assay harmonization in multicenter trials may minimize variability and help to establish cellular immune response as a reproducible biomarker, thus allowing investigation of its relationship with clinical outcomes. Second, immunotherapy may induce novel patterns of antitumor response not captured by Response Evaluation Criteria in Solid Tumors or World Health Organization criteria. New immune-related response criteria were defined to more comprehensively capture all response patterns. Third, delayed separation of Kaplan–Meier curves in randomized immunotherapy trials can affect results. Altered statistical models describing hazard ratios as a function of time and recognizing differences before and after separation of curves may allow improved planning of phase III trials. These recommendations may improve our tools for cancer immunotherapy trials and may offer a more realistic and useful model for clinical investigation. PMID:20826737

  15. Advanced burnup calculation code system in a subcritical state with continuous-energy Monte Carlo code for fusion-fission hybrid reactor

    International Nuclear Information System (INIS)

    Matsunaka, Masayuki; Ohta, Masayuki; Miyamaru, Hiroyuki; Murata, Isao

    2009-01-01

    The fusion-fission (FF) hybrid reactor is a promising energy source that is thought to act as a bridge between the existing fission reactor and the genuine fusion reactor of the future. A burnup calculation system aimed at precise burnup calculations of a subcritical system was developed for the detailed design of the FF hybrid reactor; the system consists of MCNP, ORIGEN, and post-processing codes. In the present study, the calculation system was substantially modified to improve the calculation accuracy and, at the same time, the calculation speed. The reaction rate estimation can be carried out accurately with the present system, which uses track-length (TL) data in the continuous-energy treatment. As for the speed-up of the reaction rate calculation, a new TL data bunching scheme was developed so that only the necessary TL data are used, as long as the accuracy of the point-wise nuclear data is conserved. An example analysis result for our proposed FF hybrid reactor is described, showing that computation time could be saved with the same accuracy as before. (author)
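
    A track-length estimator of the kind the TL data feed is simple to state; the following toy Python sketch scores a reaction rate from stored (weight, length, energy) triples with an invented cross section, illustrating why only the TL data need be retained.

      import numpy as np

      # Track-length estimator of a reaction rate in a cell of volume V:
      #   RR = (1/V) * mean over histories of  sum over tracks of  w * l * Sigma(E)
      def sigma_total(E):                 # toy 1/v macroscopic cross section, 1/cm
          return 0.05 + 0.2 / np.sqrt(E)

      rng = np.random.default_rng(3)
      w = rng.uniform(0.5, 1.0, 10_000)   # particle weights
      l = rng.exponential(1.0, 10_000)    # track lengths, cm
      E = rng.uniform(0.01, 2.0, 10_000)  # energies, MeV
      V = 100.0                           # cell volume, cm^3
      rr = np.sum(w * l * sigma_total(E)) / V / w.size
      print("reaction rate per source particle: %.3e" % rr)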

  16. Continuous-energy adjoint flux and perturbation calculation using the iterated fission probability method in Monte-Carlo code TRIPOLI-4 and underlying applications

    International Nuclear Information System (INIS)

    Truchet, G.; Leconte, P.; Peneliau, Y.; Santamarina, A.

    2013-01-01

    The first goal of this paper is to present an exact method able to precisely evaluate very small reactivity effects with a Monte Carlo code (<10 pcm). It was decided to implement the exact perturbation theory in TRIPOLI-4 and, consequently, to calculate a continuous-energy adjoint flux. The Iterated Fission Probability (IFP) method was chosen because it has shown great results in some other Monte Carlo codes. The IFP method uses a forward calculation to compute the adjoint flux and, consequently, does not rely on complex code modifications but on the physical definition of the adjoint flux as a phase-space neutron importance. In the first part of this paper, the IFP method implemented in TRIPOLI-4 is described. To illustrate the efficiency of the method, several adjoint fluxes are calculated and compared with their equivalents obtained by the deterministic code APOLLO-2. The new implementation can calculate the angular adjoint flux. In the second part, a procedure to carry out an exact perturbation calculation is described. A single-cell benchmark has been used to test the accuracy of the method, compared with the 'direct' estimation of the perturbation. Once again the method based on the IFP shows good agreement, for a calculation time far shorter than that of the 'direct' method. The main advantage of the method is that the relative accuracy of the reactivity variation does not depend on the magnitude of the variation itself, which allows us to calculate very small reactivity perturbations with high precision. It offers the possibility to split reactivity contributions over both isotopes and reactions. Other applications of this perturbation method, such as the calculation of exact kinetic parameters (βeff, Λeff) or sensitivity parameters, are presented and tested.

  17. Evaluation of an open access software for calculating glucose variability parameters of a continuous glucose monitoring system applied at pediatric intensive care unit.

    Science.gov (United States)

    Marics, Gábor; Lendvai, Zsófia; Lódi, Csaba; Koncz, Levente; Zakariás, Dávid; Schuster, György; Mikos, Borbála; Hermann, Csaba; Szabó, Attila J; Tóth-Heyn, Péter

    2015-04-24

    Continuous Glucose Monitoring (CGM) has become an increasingly investigated tool, especially with regard to the monitoring of diabetic and critical care patients. Continuous glucose data allow the calculation of several glucose variability parameters; however, without a specific application the interpretation of the results is time-consuming and labor-intensive. Our aim was to create open access software [the Glycemic Variability Analyzer Program (GVAP)], readily available to calculate the most common parameters of glucose variability, and to test its usability. The GVAP was developed in the MATLAB® 2010b environment. The calculated parameters were the following: average area above/below the target range (Avg. AUC-H/L); Percentage Spent Above/Below the Target Range (PATR/PBTR); Continuous Overall Net Glycemic Action (CONGA); Mean of Daily Differences (MODD); and Mean Amplitude of Glycemic Excursions (MAGE). For verification purposes we selected 14 CGM curves of pediatric critical care patients. A Medtronic® Guardian® Real-Time with Enlite® sensor was used. The reference values were obtained from Medtronic®'s own software for Avg. AUC-H/L and PATR/PBTR, from GlyCulator for MODD and CONGA, and by manual calculation for MAGE. The Pearson and Spearman correlation coefficients were above 0.99 for all parameters. The initial execution took 30 minutes; for further analysis with the Windows® Standalone Application, approximately 1 minute was needed. The GVAP is a reliable open access program for analyzing different glycemic variability parameters and hence could be a useful tool for the study of glycemic control among critically ill patients.
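
    Two of the indices GVAP reports have especially compact definitions; a Python sketch computing MODD and CONGA(1) from a synthetic 5-minute CGM series (the definitions follow the common literature forms, and the series is simulated):

      import numpy as np

      # MODD     = mean |G(t) - G(t - 24 h)|
      # CONGA(n) = SD of the differences G(t) - G(t - n hours)
      def modd(g, samples_per_day=288):
          return np.abs(g[samples_per_day:] - g[:-samples_per_day]).mean()

      def conga(g, n_hours=1, samples_per_hour=12):
          lag = n_hours * samples_per_hour
          return np.std(g[lag:] - g[:-lag], ddof=1)

      g = (6.0 + np.sin(np.linspace(0, 4 * np.pi, 576))
           + np.random.default_rng(5).normal(0, 0.3, 576))  # 2 days, mmol/L
      print("MODD   = %.2f mmol/L" % modd(g))
      print("CONGA1 = %.2f mmol/L" % conga(g))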

  18. Pollutant threshold concentration determination in marine ecosystems using an ecological interaction endpoint.

    Science.gov (United States)

    Wang, Changyou; Liang, Shengkang; Guo, Wenting; Yu, Hua; Xing, Wenhui

    2015-09-01

    The threshold concentrations of pollutants are determined by extrapolating single-species effect data to community-level effects. This assumes the most sensitive endpoint of the life cycle of individuals and the species sensitivity distribution from single-species toxic effect tests, thus ignoring the ecological interactions. The uncertainties due to this extrapolation can be partially overcome using the equilibrium point of a customized ecosystem. This method incorporates ecological interactions and integrates the effects on growth, survival, and ingestion into a single effect measure, the equilibrium point excursion in the customized ecosystem, in order to describe the toxic effects on plankton. A case study showed that the threshold concentration of copper calculated with the endpoint of the equilibrium point was 10 μg L⁻¹, which is significantly different from the threshold calculated with a single-species endpoint. The endpoint calculated using this method provides a more relevant measure of the ecological impact than any single individual-level endpoint. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Helicopter EMS: Research Endpoints and Potential Benefits

    Directory of Open Access Journals (Sweden)

    Stephen H. Thomas

    2012-01-01

    Patients, EMS systems, and healthcare regions benefit from Helicopter EMS (HEMS) utilization. This article discusses these benefits in terms of specific endpoints utilized in research projects. The endpoint of interest, be it primary, secondary, or surrogate, is important to understand in the deployment of HEMS resources or in planning further HEMS outcomes research. The most important outcomes are those which show potential benefits to the patients, such as functional survival, pain relief, and earlier ALS care. Case reports are also important "outcomes" publications. The benefit of HEMS in the rural setting is the ability to provide timely access to Level I or Level II trauma centers and, in nontrauma cases, interfacility transport of cardiac, stroke, and even sepsis patients. Many HEMS crews have pharmacologic and procedural capabilities that bring a different level of care to a trauma scene or small referring hospital, especially in the rural setting. Regional healthcare and EMS systems benefit from HEMS through its capability to extend an advanced level of care throughout a region, provide a "backup" for areas with limited ALS coverage, minimize transport times, make available direct transport to specialized centers, and offer flexibility of transport in overloaded hospital systems.

  20. Transverse acceptance calculation for continuous ion beam injection into the electron beam ion trap charge breeder of the ReA post-accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Kittimanapun, K., E-mail: kritsadak@slri.or.th [National Superconducting Cyclotron Laboratory (NSCL), Michigan State University (MSU), 640 S. Shaw Lane, East Lansing, Michigan 48824 (United States); Synchrotron Light Research Institute (SLRI), 111 University Avenue, Muang District, Nakhon Ratchasima, 30000 (Thailand); Baumann, T.M.; Lapierre, A.; Schwarz, S. [National Superconducting Cyclotron Laboratory (NSCL), Michigan State University (MSU), 640 S. Shaw Lane, East Lansing, Michigan 48824 (United States); Bollen, G. [National Superconducting Cyclotron Laboratory (NSCL), Michigan State University (MSU), 640 S. Shaw Lane, East Lansing, Michigan 48824 (United States); Facility for Rare Isotope Beams (FRIB), Michigan State University, 640 S. Shaw Lane, East Lansing, Michigan 48824 (United States)

    2015-11-11

    The ReA post-accelerator at the National Superconducting Cyclotron Laboratory (NSCL) employs an electron beam ion trap (EBIT) as a charge breeder. A Monte-Carlo simulation code was developed to calculate the transverse acceptance phase space of the EBIT for continuously injected ion beams and to determine the capture efficiency as a function of the transverse beam emittance. For this purpose, the code records the position and time of changes in charge state of injected ions, leading either to capture or to loss of ions. To benchmark and validate the code, calculated capture efficiencies were compared with results from a geometrical model and with measurements. The results of the code agree with the experimental findings within a few tens of percent. The code predicts a maximum total capture efficiency of 50% for readily achievable EBIT parameters, and an efficiency of up to 80% for an electron beam current density of 1900 A/cm².

  1. Log-gamma directed polymer with fixed endpoints via the replica Bethe Ansatz

    International Nuclear Information System (INIS)

    Thiery, Thimothée; Le Doussal, Pierre

    2014-01-01

    We study the model of a discrete directed polymer (DP) on a square lattice with homogeneous inverse gamma distribution of site random Boltzmann weights, introduced by Seppalainen (2012 Ann. Probab. 40 19–73). The integer moments of the partition sum, $\overline{Z^n}$, are studied using a transfer matrix formulation, which appears as a generalization of the Lieb–Liniger quantum mechanics of bosons to discrete time and space. In the present case of the inverse gamma distribution the model is integrable in terms of a coordinate Bethe Ansatz, as discovered by Brunet. Using the Brunet–Bethe eigenstates we obtain an exact expression for the integer moments $\overline{Z^n}$ for polymers of arbitrary lengths and fixed endpoint positions. Although these moments do not exist for all integer n, we are nevertheless able to construct a generating function which reproduces all existing integer moments and which takes the form of a Fredholm determinant (FD). This suggests an analytic continuation via a Mellin–Barnes transform, and we thereby propose an FD ansatz representation for the probability distribution function (PDF) of Z and its Laplace transform. In the limit of a very long DP, this ansatz yields that the distribution of the free energy converges to the Gaussian unitary ensemble (GUE) Tracy–Widom distribution, up to a non-trivial average and variance that we calculate. Our asymptotic predictions coincide with a result by Borodin et al (2013 Commun. Math. Phys. 324 215–32) based on a formula obtained by Corwin et al (2011 arXiv:1110.3489) using the geometric Robinson–Schensted–Knuth (gRSK) correspondence. In addition we obtain the dependence on the endpoint position and the exact elastic coefficient at large time. We argue the equivalence between our formula and that of Borodin et al. As we will discuss, this provides a connection between quantum integrability and tropical combinatorics. (paper)

  2. Free Energy, Enthalpy and Entropy from Implicit Solvent End-Point Simulations.

    Science.gov (United States)

    Fogolari, Federico; Corazza, Alessandra; Esposito, Gennaro

    2018-01-01

    Free energy is the key quantity to describe the thermodynamics of biological systems. In this perspective we consider the calculation of free energy, enthalpy and entropy from end-point molecular dynamics simulations. Since the enthalpy may be calculated as the ensemble average over equilibrated simulation snapshots, the difficulties related to free energy calculation are ultimately related to the calculation of the entropy of the system, and in particular of the solvent entropy. In the last two decades implicit solvent models have been used to circumvent the problem and to take solvent entropy into account implicitly in the solvation terms. More recently, outstanding advancements in both implicit solvent models and entropy calculations are making the goal of free energy estimation from end-point simulations more feasible than ever before. We review briefly the basic theory and discuss the advancements in light of practical applications.
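
    To make the decomposition concrete: the enthalpy is simply the mean of the per-snapshot energies (molecular mechanics plus implicit-solvent solvation), while the hardest term is the solute configurational entropy. A common end-point estimator for the latter is Schlitter's quasi-harmonic upper bound, sketched below on synthetic coordinate fluctuations; the masses, fluctuation scale and temperature are placeholders, and the formula is the standard Schlitter (1993) bound rather than anything specific to this paper.

```python
import numpy as np

kB = 1.380649e-23        # Boltzmann constant, J/K
hbar = 1.054571817e-34   # reduced Planck constant, J*s
T = 300.0                # temperature, K

def schlitter_entropy(coords, masses):
    """Upper bound on the configurational entropy (Schlitter, 1993):
    S <= (kB/2) ln det[1 + (kB T e^2 / hbar^2) M sigma], where sigma is
    the Cartesian coordinate covariance over the ensemble.
    coords: (n_frames, 3N) in metres; masses: (3N,) in kg."""
    x = coords - coords.mean(axis=0)
    cov = x.T @ x / len(coords)                  # coordinate covariance
    M = np.diag(masses)
    arg = np.eye(len(masses)) + (kB * T * np.e ** 2 / hbar ** 2) * (M @ cov)
    sign, logdet = np.linalg.slogdet(arg)
    return 0.5 * kB * logdet                     # J/K

# Synthetic 'simulation': three carbon-like atoms jiggling ~0.1 A rms
rng = np.random.default_rng(2)
coords = 1e-11 * rng.standard_normal((5000, 9))
masses = np.repeat(12 * 1.66054e-27, 9)

S = schlitter_entropy(coords, masses)
print(f"T*S = {S * T * 6.022e23 / 1e3:.1f} kJ/mol")
# G would then be estimated as <E_MM + G_solv> (snapshot average) - T*S
```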

  3. Free Energy, Enthalpy and Entropy from Implicit Solvent End-Point Simulations

    Directory of Open Access Journals (Sweden)

    Federico Fogolari

    2018-02-01

    Free energy is the key quantity to describe the thermodynamics of biological systems. In this perspective we consider the calculation of free energy, enthalpy and entropy from end-point molecular dynamics simulations. Since the enthalpy may be calculated as the ensemble average over equilibrated simulation snapshots, the difficulties related to free energy calculation are ultimately related to the calculation of the entropy of the system, and in particular of the solvent entropy. In the last two decades implicit solvent models have been used to circumvent the problem and to take solvent entropy into account implicitly in the solvation terms. More recently, outstanding advancements in both implicit solvent models and entropy calculations are making the goal of free energy estimation from end-point simulations more feasible than ever before. We review briefly the basic theory and discuss the advancements in light of practical applications.

  4. MVP/GMVP 2: general purpose Monte Carlo codes for neutron and photon transport calculations based on continuous energy and multigroup methods

    International Nuclear Information System (INIS)

    Nagaya, Yasunobu; Okumura, Keisuke; Mori, Takamasa; Nakagawa, Masayuki

    2005-06-01

    In order to realize fast and accurate Monte Carlo simulation of neutron and photon transport problems, two vectorized Monte Carlo codes MVP and GMVP have been developed at JAERI. MVP is based on the continuous energy model and GMVP on the multigroup model. Compared with conventional scalar codes, these codes achieve higher computation speed by a factor of 10 or more on vector super-computers. Both codes have sufficient functions for production use, adopting accurate physics models, geometry description capabilities and variance reduction techniques. The first version of the codes was released in 1994. They have been extensively improved and new functions have been implemented. The major improvements and new functions are (1) capability to treat the scattering model expressed with File 6 of the ENDF-6 format, (2) time-dependent tallies, (3) reaction rate calculation with the pointwise response function, (4) flexible source specification, (5) continuous-energy calculation at arbitrary temperatures, (6) estimation of real variances in eigenvalue problems, (7) point detector and surface crossing estimators, (8) statistical geometry model, (9) function of reactor noise analysis (simulation of the Feynman-α experiment), (10) arbitrary shaped lattice boundary, (11) periodic boundary condition, (12) parallelization with standard libraries (MPI, PVM), (13) supporting many platforms, etc. This report describes the physical model, geometry description method used in the codes, new functions and how to use them. (author)
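
    None of MVP/GMVP's machinery fits in a snippet, but the analog Monte Carlo idea underlying such transport codes — sampling free flights from the total cross section and collision outcomes from reaction probabilities — can be illustrated with a bare-bones 1-D slab transmission estimate. The cross sections and geometry below are arbitrary and bear no relation to MVP's physics models.

```python
import numpy as np

rng = np.random.default_rng(7)

def slab_transmission(sigma_t=1.0, scatter_prob=0.7, width=5.0, n=100000):
    """Analog Monte Carlo transmission of neutrons through a 1-D slab:
    sample free flights from the total cross section sigma_t, then
    scatter isotropically or absorb at each collision."""
    transmitted = 0
    for _ in range(n):
        x, mu = 0.0, 1.0                                  # start on the left face
        while True:
            x += mu * (-np.log(rng.random()) / sigma_t)   # free flight
            if x >= width:
                transmitted += 1                          # escaped to the right
                break
            if x < 0.0:                                   # leaked back out
                break
            if rng.random() > scatter_prob:               # absorbed
                break
            mu = 2.0 * rng.random() - 1.0                 # isotropic scatter
    return transmitted / n

print("transmission fraction:", slab_transmission())
```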

  5. Extracting gluino endpoints with event topology patterns

    Energy Technology Data Exchange (ETDEWEB)

    Pietsch, N. [Hamburg Univ. (Germany). Inst. fuer Experimentalphysik; Reuter, J.; Sakurai, K.; Wiesler, D. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-06-15

    In this paper we study the gluino dijet mass edge measurement at the LHC in a realistic situation including both SUSY and combinatorial backgrounds together with effects of initial and final state radiation as well as a finite detector resolution. Three benchmark scenarios are examined in which the dominant SUSY production process and also the decay modes are different. Several new kinematical variables are proposed to minimize the impact of SUSY and combinatorial backgrounds in the measurement. By selecting events with a particular number of jets and leptons, we attempt to measure two distinct gluino dijet mass edges originating from the wino g → jjW and bino g → jjB decay modes, separately. We determine the endpoints of distributions of proposed and existing variables and show that those two edges can be disentangled and measured with good accuracy, irrespective of the presence of ISR, FSR, and detector effects.

  6. Radiological endpoints relevant to ecological risk assessment

    International Nuclear Information System (INIS)

    Harrison, F.

    1997-01-01

    Because of the potential risk from radiation due to releases of radionuclides from anthropogenic activities, considerable research has been performed to determine, for humans, the levels of dose received, the responses to those doses, and the mechanisms of action of radioactivity on living matter. More recently, there has been increased interest in the effects of radioactivity on non-human species. There are differences in approach between risk assessment for humans and for ecosystems. For protection of humans, the focus is the individual and the endpoint of primary concern is cancer induction. For protection of ecosystems, the focus is on population stability and the endpoint of concern is reproductive success for organisms that are important ecologically and economically. For these organisms, information is needed on their responses to irradiation and the potential impact of the doses absorbed on their reproductive success. Considerable information is available on the effects of radiation on organisms from different phyla and types of ecosystems. Databases useful for assessing risk from exposures of populations to radioactivity cover the effects of irradiation on mortality, fertility and sterility, the latter two of which are important components of reproductive success. Data on radiation effects on mortality are available from both acute and chronic irradiation. In relation to radiation effects, reproductive success for a given population is related to a number of characteristics of the species, including the inherent radiosensitivity of reproductive tissues and early life stages, processes occurring during gametogenesis, reproductive strategy and exposure history. The available data on acute and chronic radiation doses are reviewed for invertebrates, fishes and mammals. The information reviewed indicates that wide ranges in responses among species can be expected. Parameters that most likely contribute to inherent radiosensitivity are discussed. (author)

  7. MVP/GMVP Version 3. General purpose Monte Carlo codes for neutron and photon transport calculations based on continuous energy and multigroup methods (Translated document)

    International Nuclear Information System (INIS)

    Nagaya, Yasunobu; Okumura, Keisuke; Sakurai, Takeshi; Mori, Takamasa

    2017-03-01

    In order to realize fast and accurate Monte Carlo simulation of neutron and photon transport problems, two Monte Carlo codes MVP (continuous-energy method) and GMVP (multigroup method) have been developed at Japan Atomic Energy Agency. The codes have adopted a vectorized algorithm and have been developed for vector-type supercomputers. They also support parallel processing with a standard parallelization library MPI and thus a speed-up of Monte Carlo calculations can be achieved on general computing platforms. The first and second versions of the codes were released in 1994 and 2005, respectively. They have been extensively improved and new capabilities have been implemented. The major improvements and new capabilities are as follows: (1) perturbation calculation for effective multiplication factor, (2) exact resonant elastic scattering model, (3) calculation of reactor kinetics parameters, (4) photo-nuclear model, (5) simulation of delayed neutrons, (6) generation of group constants. This report describes the physical model, geometry description method used in the codes, new capabilities and input instructions. (author)

  8. MVP/GMVP version 3. General purpose Monte Carlo codes for neutron and photon transport calculations based on continuous energy and multigroup methods

    International Nuclear Information System (INIS)

    Nagaya, Yasunobu; Okumura, Keisuke; Sakurai, Takeshi; Mori, Takamasa

    2017-03-01

    In order to realize fast and accurate Monte Carlo simulation of neutron and photon transport problems, two Monte Carlo codes MVP (continuous-energy method) and GMVP (multigroup method) have been developed at Japan Atomic Energy Agency. The codes have adopted a vectorized algorithm and have been developed for vector-type supercomputers. They also support parallel processing with a standard parallelization library MPI and thus a speed-up of Monte Carlo calculations can be achieved on general computing platforms. The first and second versions of the codes were released in 1994 and 2005, respectively. They have been extensively improved and new capabilities have been implemented. The major improvements and new capabilities are as follows: (1) perturbation calculation for effective multiplication factor, (2) exact resonant elastic scattering model, (3) calculation of reactor kinetics parameters, (4) photo-nuclear model, (5) simulation of delayed neutrons, (6) generation of group constants. This report describes the physical model, geometry description method used in the codes, new capabilities and input instructions. (author)

  9. Endpoint distinctiveness facilitates analogical mapping in pigeons.

    Science.gov (United States)

    Hagmann, Carl Erick; Cook, Robert G

    2015-03-01

    Analogical thinking necessitates mapping shared relations across two separate domains. We investigated whether pigeons could learn faster with ordinal mapping of relations across two physical dimensions (circle size & choice spatial position) relative to random mapping of these relations. Pigeons were trained to relate six circular samples of different sizes to horizontally positioned choice locations in a six alternative matching-to-sample task. Three pigeons were trained in a mapped condition in which circle size mapped directly onto choice spatial position. Three other pigeons were trained in a random condition in which the relations between size and choice position were arbitrarily assigned. The mapped group showed an advantage over the random group in acquiring this task. In a subsequent second phase, relations between the dimensions were ordinally reversed for the mapped group and re-randomized for the random group. There was no difference in how quickly matching accuracy re-emerged in the two groups, although the mapped group eventually performed more accurately. Analyses suggested this mapped advantage was likely due to endpoint distinctiveness and the benefits of proximity errors during choice responding rather than a conceptual or relational advantage attributable to the common or ordinal mapping of the two dimensions. This potential difficulty in mapping relations across dimensions may limit the pigeons' capacity for more advanced types of analogical reasoning.

  10. Endpoint Distinctiveness Facilitates Analogical Mapping in Pigeons

    Science.gov (United States)

    Hagmann, Carl Erick; Cook, Robert G.

    2015-01-01

    Analogical thinking necessitates mapping shared relations across two separate domains. We investigated whether pigeons could learn faster with ordinal mapping of relations across two physical dimensions (circle size & choice spatial position) relative to random mapping of these relations. Pigeons were trained to relate six circular samples of different sizes to horizontally positioned choice locations in a six alternative matching-to-sample task. Three pigeons were trained in a mapped condition in which circle size mapped directly onto choice spatial position. Three other pigeons were trained in a random condition in which the relations between size and choice position were arbitrarily assigned. The mapped group showed an advantage over the random group in acquiring this task. In a subsequent reassignment phase, relations between the dimensions were ordinally reversed for the mapped group and re-randomized for the random group. There was no difference in how quickly matching accuracy re-emerged in the two groups, although the mapped group eventually performed more accurately. Analyses suggested this mapped advantage was likely due to endpoint distinctiveness and the benefits of proximity errors during choice responding rather than a conceptual or relational advantage attributable to the common or ordinal mapping of the two dimensions. This potential difficulty in mapping relations across dimensions may limit the pigeons' capacity for more advanced types of analogical reasoning. PMID:25447511

  11. Comparison of treatment with continuous subcutaneous insulin infusion versus multiple daily insulin injections with bolus calculator in patients with type 1 diabetes.

    Science.gov (United States)

    Pérez-García, L; Goñi-Iriarte, M J; García-Mouriz, M

    2015-01-01

    A study of glycemic control, quality of life, and fear and perception of hypoglycemia, comparing a continuous subcutaneous insulin infusion (CSII) group with a multiple daily injections (MDI) with bolus calculator group. This is a retrospective cohort study following patients during the first 12 months after the CSII group (n=30) began using the bolus wizard and the MDI-calculator group (n=30) began using the bolus calculator (Accu-Chek® Aviva Expert). HbA1c was measured at 3, 6 and 12 months. Questionnaires used: EsDQOL (quality of life), FH-15 (fear of hypoglycemia), and Clarke (perception of hypoglycemia); Student's t and nonparametric tests were applied. The average reduction in HbA1c during the study was significantly higher in the CSII group (-0.56±0.84%) compared with the MDI group (0.097±0.94%), P=.028. The average basal insulin dose was significantly higher in the MDI group (at baseline, 6 and 12 months). No significant differences were found between the two treatment groups after analyzing the EsDQOL, FH-15 and Clarke questionnaires. In the CSII group, perceived quality of life assessed by the EsDQOL questionnaire was found to be better at the end of the study than at the beginning of insulin pump use. In summary, the average reduction in HbA1c was significantly higher in the CSII group, and perceived quality of life in that group improved over the course of the study.

  12. Sample size determination for equivalence assessment with multiple endpoints.

    Science.gov (United States)

    Sun, Anna; Dong, Xiaoyu; Tsong, Yi

    2014-01-01

    Equivalence assessment between a reference and test treatment is often conducted by two one-sided tests (TOST). The corresponding power function and sample size determination can be derived from a joint distribution of the sample mean and sample variance. When an equivalence trial is designed with multiple endpoints, it often involves several sets of two one-sided tests. A naive approach for sample size determination in this case would select the largest sample size required for each endpoint. However, such a method ignores the correlation among endpoints. With the objective of rejecting all endpoints, and when the endpoints are uncorrelated, the power function is the product of the power functions for the individual endpoints. With correlated endpoints, the sample size and power should be adjusted for such a correlation. In this article, we propose the exact power function for the equivalence test with multiple endpoints adjusted for correlation under both crossover and parallel designs. We further discuss the differences in sample size between the naive method without correlation adjustment and the adjusted method, and illustrate with an in vivo bioequivalence crossover study with area under the curve (AUC) and maximum concentration (Cmax) as the two endpoints.
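
    The correlation adjustment can be checked quickly by simulation: draw the two endpoint mean-differences from a correlated bivariate normal and count how often both TOST margins are met. The sketch below is a crude known-variance approximation with invented design values (log-scale equivalence bounds of ln 1.25, sigma = 0.25), not the exact power function derived in the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def power_two_endpoints(n, rho, delta=np.log(1.25), sigma=0.25,
                        n_sim=20000, alpha=0.05):
    """Monte Carlo power of jointly passing TOST equivalence bounds
    (-delta, delta) on two correlated endpoints (e.g. log AUC and
    log Cmax), parallel design, n subjects per arm, true difference 0."""
    se = sigma * np.sqrt(2.0 / n)                      # SE of mean difference
    cov = [[se**2, rho * se**2], [rho * se**2, se**2]]
    diffs = rng.multivariate_normal([0.0, 0.0], cov, n_sim)
    t_crit = stats.t.ppf(1 - alpha, df=2 * n - 2)
    # TOST passes when the estimate sits inside the shrunken margin
    pass_both = np.all(np.abs(diffs) < delta - t_crit * se, axis=1)
    return pass_both.mean()

for rho in (0.0, 0.5, 0.9):
    print(f"rho={rho}: joint power ~ {power_two_endpoints(n=24, rho=rho):.3f}")
```

    Increasing rho raises the joint power, which is exactly why the naive per-endpoint sample size is conservative for correlated endpoints.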

  13. Ordered kinematic endpoints for 5-body cascade decays

    Energy Technology Data Exchange (ETDEWEB)

    Klimek, Matthew D. [Theory Group, Department of Physics and Texas Cosmology Center,University of Texas at Austin, 2515 Speedway, Stop C1608, Austin, TX, 78712 (United States)

    2016-12-23

    We present expressions for the kinematic endpoints of 5-body cascade decay chains proceeding through all possible combinations of 2-body and 3-body decays, with one stable invisible particle in the final decay stage. When an invariant mass can be formed in multiple ways by choosing different final state particles from a common vertex, we introduce techniques for finding the sub-leading endpoints for all indistinguishable versions of the invariant mass. In contrast to short decay chains, where sub-leading endpoints are linearly related to the leading endpoints, we find that in 5-body decays, they provide additional independent constraints on the mass spectrum.
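
    The paper's 5-body endpoint expressions are lengthy, but the flavour of such formulas can be seen in the textbook endpoint for the simplest two-step two-body chain, sketched below; the masses are invented, and the formula is the standard dilepton-edge result quoted for context rather than taken from this paper.

```python
import numpy as np

def mll_edge(mC, mB, mA):
    """Kinematic endpoint of the dilepton invariant mass in the chain
    C -> B + l_near, B -> A + l_far, with A invisible (standard result):
    (m_ll^max)^2 = (mC^2 - mB^2)(mB^2 - mA^2) / mB^2."""
    return np.sqrt((mC**2 - mB**2) * (mB**2 - mA**2)) / mB

# Illustrative spectrum in GeV
print(f"m_ll edge: {mll_edge(400.0, 300.0, 100.0):.1f} GeV")
```

    In longer chains of the kind studied here, several indistinguishable invariant masses share a leading edge, and the sub-leading endpoints supply the extra independent constraints on the spectrum that the abstract refers to.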

  14. Cardiovascular risk calculation

    African Journals Online (AJOL)

    James A. Ker

    2014-08-20

    smoking and elevated blood sugar levels (diabetes mellitus). These risk ... These are risk charts, e.g. FRS, a non-laboratory-based risk calculation, and ... for hard cardiovascular end-points, such as coronary death, myocardial ...

  15. Hippeastrum hybridum anthocyanins as indicators of endpoint in acid

    African Journals Online (AJOL)

    Anthocyanins from Hippeastrum hybridum (Amaryllis) were investigated as indicators of endpoint in acid- base titrations. Extraction of the anthocyanins was done using distilled water, methanol and methanol containing 0.5% acetic acid. The extracts were used in determination of endpoint in titrations between strong.

  16. Hippeastrum hybridum anthocyanins as indicators of endpoint in ...

    African Journals Online (AJOL)

    Anthocyanins from Hippeastrum hybridum (Amaryllis) were investigated as indicators of endpoint in acid- base titrations. Extraction of the anthocyanins was done using distilled water, methanol and methanol containing 0.5% acetic acid. The extracts were used in determination of endpoint in titrations between strong ...

  17. The impact of endpoint measures in rheumatoid arthritis clinical trials

    NARCIS (Netherlands)

    van der Heide, A.; Jacobs, J. W.; Dinant, H. J.; Bijlsma, J. W.

    1992-01-01

    In clinical trials on the effectiveness of disease-modifying antirheumatic drugs (DMARDs) in patients with rheumatoid arthritis (RA), it is common to apply a large number of endpoint measures. This practice has several disadvantages. To determine which endpoint measures are most valuable, reports of

  18. A new method suitable for calculating accurately wetting temperature over a wide range of conditions: Based on the adaptation of continuation algorithm to classical DFT

    Science.gov (United States)

    Zhou, Shiqi

    2017-11-01

    A new scheme is put forward to determine the wetting temperature (Tw) by adapting the arc-length continuation algorithm to classical density functional theory (DFT), as used originally by Frink and Salinger. Its advantages can be summarized in four points: (i) the new scheme is applicable whether the wetting occurs near a planar or a non-planar surface, whereas a zero contact angle method is applicable only to a perfectly flat solid surface, as demonstrated previously and in this work, and is essentially unsuitable for non-planar surfaces. (ii) The new scheme is devoid of an uncertainty that plagues the pre-wetting extrapolation method and originates from the unattainability of an infinitely thick film in the theoretical calculation. (iii) The new scheme can be applied similarly and easily to extreme instances characterized by lower temperatures and/or stronger surface attraction force fields, which cannot be dealt with by the pre-wetting extrapolation method because the pre-wetting transition becomes mixed with many layering transitions and the varieties of surface phase transitions are difficult to differentiate. (iv) The new scheme still works in instances wherein the wetting transition occurs close to the bulk critical temperature; this case cannot be managed by the pre-wetting extrapolation method at all, because near the bulk critical temperature the pre-wetting region is extremely narrow and not enough pre-wetting data are available for the extrapolation procedure.
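
    The record's key tool is arc-length continuation, which tracks a solution branch of a nonlinear equation smoothly through turning points where stepping in the bare parameter fails. The sketch below applies generic pseudo-arclength continuation, with a secant tangent and a Newton corrector, to a scalar toy equation with two folds; it illustrates the algorithm only, not the Frink-Salinger DFT implementation.

```python
import numpy as np

def continue_branch(f, fx, flam, x0, lam0, ds=0.02, n_steps=400):
    """Pseudo-arclength continuation of f(x, lam) = 0 from a known
    solution (x0, lam0); follows the branch through folds where naive
    stepping in lam would fail."""
    # tangent (tx, tl) solves fx*tx + flam*tl = 0; sign sets direction
    tx, tl = flam(x0, lam0), -fx(x0, lam0)
    nrm = np.hypot(tx, tl); tx, tl = tx / nrm, tl / nrm
    x, lam = x0, lam0
    pts = [(x, lam)]
    for _ in range(n_steps):
        xp, lp = x + ds * tx, lam + ds * tl               # predictor
        for _ in range(25):                               # Newton corrector
            F = np.array([f(xp, lp),
                          tx * (xp - x) + tl * (lp - lam) - ds])
            if np.abs(F).max() < 1e-12:
                break
            J = np.array([[fx(xp, lp), flam(xp, lp)],
                          [tx, tl]])
            xp, lp = np.array([xp, lp]) - np.linalg.solve(J, F)
        tx, tl = xp - x, lp - lam                         # secant tangent
        nrm = np.hypot(tx, tl); tx, tl = tx / nrm, tl / nrm
        x, lam = xp, lp
        pts.append((x, lam))
    return np.array(pts)

# Toy branch with two folds: f(x, lam) = x**3 - x + lam = 0
branch = continue_branch(lambda x, l: x**3 - x + l,
                         lambda x, l: 3.0 * x**2 - 1.0,
                         lambda x, l: 1.0,
                         x0=-1.2, lam0=1.2**3 - 1.2)
print(branch[::80])   # (x, lam) samples along the branch, through both folds
```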

  19. Ecosystem services as assessment endpoints for ecological risk assessment.

    Science.gov (United States)

    Munns, Wayne R; Rea, Anne W; Suter, Glenn W; Martin, Lawrence; Blake-Hedges, Lynne; Crk, Tanja; Davis, Christine; Ferreira, Gina; Jordan, Steve; Mahoney, Michele; Barron, Mace G

    2016-07-01

    Ecosystem services are defined as the outputs of ecological processes that contribute to human welfare or have the potential to do so in the future. Those outputs include food and drinking water, clean air and water, and pollinated crops. The need to protect the services provided by natural systems has been recognized previously, but ecosystem services have not been formally incorporated into ecological risk assessment practice in a general way in the United States. Endpoints used conventionally in ecological risk assessment, derived directly from the state of the ecosystem (e.g., biophysical structure and processes), and endpoints based on ecosystem services serve different purposes. Conventional endpoints are ecologically important and susceptible entities and attributes that are protected under US laws and regulations. Ecosystem service endpoints are a conceptual and analytical step beyond conventional endpoints and are intended to complement conventional endpoints by linking and extending endpoints to goods and services with more obvious benefit to humans. Conventional endpoints can be related to ecosystem services even when the latter are not considered explicitly during problem formulation. To advance the use of ecosystem service endpoints in ecological risk assessment, the US Environmental Protection Agency's Risk Assessment Forum has added generic endpoints based on ecosystem services (ES-GEAE) to the original 2003 set of generic ecological assessment endpoints (GEAEs). Like conventional GEAEs, ES-GEAEs are defined by an entity and an attribute. Also like conventional GEAEs, ES-GEAEs are broadly described and will need to be made specific when applied to individual assessments. Adoption of ecosystem services as a type of assessment endpoint is intended to improve the value of risk assessment to environmental decision making, linking ecological risk to human well-being, and providing an improved means of communicating those risks.

  20. Planning and analyzing clinical trials with composite endpoints

    CERN Document Server

    Rauch, Geraldine; Kieser, Meinhard

    2017-01-01

    This book addresses the most important aspects of how to plan and evaluate clinical trials with a composite primary endpoint to guarantee a clinically meaningful and valid interpretation of the results. Composite endpoints are often used as primary efficacy variables for clinical trials, particularly in the fields of oncology and cardiology. These endpoints combine several variables of interest within a single composite measure, and as a result, all variables that are of major clinical relevance can be considered in the primary analysis without the need to adjust for multiplicity. Moreover, composite endpoints are intended to increase the size of the expected effects thus making clinical trials more powerful. The book offers practical advice for statisticians and medical experts involved in the planning and analysis of clinical trials. For readers who are mainly interested in the application of the methods, all the approaches are illustrated with real-world clinical trial examples, and the software codes requ...

  1. Gastroenterological endpoints in drug trials for cystic fibrosis

    NARCIS (Netherlands)

    Bodewes, Frank A. J. A.; Verkade, Henkjan J.; Wilschanski, Micheal

    2016-01-01

    The phenotype of cystic fibrosis includes a wide variety of clinical and biochemical gastrointestinal presentations. These gastrointestinal characteristics of the disease have come under renewed interest as potential outcome measures and clinical endpoints for therapeutic trials in cystic fibrosis.

  2. Endpoint behavior of high-energy scattering cross sections

    International Nuclear Information System (INIS)

    Chay, Junegone; Kim, Chul

    2010-01-01

    In high-energy processes near the endpoint, there emerge new contributions associated with spectator interactions. Away from the endpoint region, these new contributions are suppressed compared to the leading contribution, but the leading contribution becomes suppressed as we approach the endpoint and the new contributions become comparable. We present how the new contributions scale as we reach the endpoint and show that they are comparable to the suppressed leading contributions in deep inelastic scattering by employing a power-counting analysis. The hadronic tensor in deep inelastic scattering is shown to factorize including the spectator interactions, and it can be expressed in terms of the light cone distribution amplitudes of initial hadrons. We also consider the contribution of the spectator contributions in Drell-Yan processes. Here the spectator interactions are suppressed compared to double parton annihilation according to the power counting.

  3. Real-Time Continuous Response Spectra Exceedance Calculation Displayed in a Web-Browser Enables Rapid and Robust Damage Evaluation by First Responders

    Science.gov (United States)

    Franke, M.; Skolnik, D. A.; Harvey, D.; Lindquist, K.

    2014-12-01

    A novel and robust approach is presented that provides near real-time earthquake alarms for critical structures at distributed locations and large facilities using real-time estimation of response spectra obtained from near free-field motions. Influential studies dating back to the 1980s identified spectral response acceleration as a key ground motion characteristic that correlates well with observed damage in structures. Thus, monitoring and reporting on exceedance of spectra-based thresholds are useful tools for assessing the potential for damage to facilities or multi-structure campuses based on input ground motions only. With as little as one strong-motion station per site, this scalable approach can provide rapid alarms on the damage status of remote towns, critical infrastructure (e.g., hospitals, schools) and points of interests (e.g., bridges) for a very large number of locations enabling better rapid decision making during critical and difficult immediate post-earthquake response actions. Details on the novel approach are presented along with an example implementation for a large energy company. Real-time calculation of PSA exceedance and alarm dissemination are enabled with Bighorn, an extension module based on the Antelope software package that combines real-time spectral monitoring and alarm capabilities with a robust built-in web display server. Antelope is an environmental data collection software package from Boulder Real Time Technologies (BRTT) typically used for very large seismic networks and real-time seismic data analyses. The primary processing engine produces continuous time-dependent response spectra for incoming acceleration streams. It utilizes expanded floating-point data representations within object ring-buffer packets and waveform files in a relational database. This leads to a very fast method for computing response spectra for a large number of channels. A Python script evaluates these response spectra for exceedance of one or more
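
    The core computation here — continuous response spectra from incoming acceleration streams — amounts to integrating a damped single-degree-of-freedom oscillator for each period of interest and tracking the peak displacement. The sketch below uses the standard Newmark average-acceleration integrator on a synthetic record with an invented alarm threshold; it illustrates the PSA-exceedance idea, not Bighorn's actual implementation.

```python
import numpy as np

def response_spectrum(acc, dt, periods, zeta=0.05):
    """Pseudo-spectral acceleration (PSA) of a ground-acceleration record
    for a set of oscillator periods, via the Newmark average-acceleration
    method (gamma=1/2, beta=1/4); acc in m/s^2, dt in s, unit mass."""
    psa = np.empty(len(periods))
    for k, T in enumerate(periods):
        wn = 2 * np.pi / T
        c, kst = 2 * zeta * wn, wn**2
        u = v = 0.0
        a = -acc[0] - c * v - kst * u
        umax = 0.0
        keff = kst + 2 * c / dt + 4 / dt**2       # effective stiffness
        for ag in acc[1:]:
            p = -ag + (4 / dt**2 + 2 * c / dt) * u + (4 / dt + c) * v + a
            un = p / keff
            vn = 2 * (un - u) / dt - v
            an = 4 * (un - u) / dt**2 - 4 * v / dt - a
            u, v, a = un, vn, an
            umax = max(umax, abs(u))
        psa[k] = wn**2 * umax                     # PSA = wn^2 * Sd
    return psa

# Synthetic 20 s record; alarm if PSA at any period exceeds 0.5 g (assumed)
dt = 0.01
t = np.arange(0, 20, dt)
acc = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.2 * t)
periods = np.array([0.2, 0.5, 1.0, 2.0])
psa = response_spectrum(acc, dt, periods)
print("PSA (g):", psa / 9.81, " alarm:", bool(np.any(psa > 0.5 * 9.81)))
```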

  4. OPERA models for predicting physicochemical properties and environmental fate endpoints.

    Science.gov (United States)

    Mansouri, Kamel; Grulke, Chris M; Judson, Richard S; Williams, Antony J

    2018-03-08

    The collection of chemical structure information and associated experimental data for quantitative structure-activity/property relationship (QSAR/QSPR) modeling is facilitated by an increasing number of public databases containing large amounts of useful data. However, the performance of QSAR models highly depends on the quality of the data and modeling methodology used. This study aims to develop robust QSAR/QSPR models for chemical properties of environmental interest that can be used for regulatory purposes. This study primarily uses data from the publicly available PHYSPROP database consisting of a set of 13 common physicochemical and environmental fate properties. These datasets have undergone extensive curation using an automated workflow to select only high-quality data, and the chemical structures were standardized prior to calculation of the molecular descriptors. The modeling procedure was developed based on the five Organization for Economic Cooperation and Development (OECD) principles for QSAR models. A weighted k-nearest neighbor approach was adopted using a minimum number of required descriptors calculated using PaDEL, an open-source software. The genetic algorithms selected only the most pertinent and mechanistically interpretable descriptors (2-15, with an average of 11 descriptors). The sizes of the modeled datasets varied from 150 chemicals for biodegradability half-life to 14,050 chemicals for logP, with an average of 3222 chemicals across all endpoints. The optimal models were built on randomly selected training sets (75%) and validated using fivefold cross-validation (CV) and test sets (25%). The CV Q² of the models varied from 0.72 to 0.95, with an average of 0.86, and the R² test value from 0.71 to 0.96, with an average of 0.82. Modeling and performance details are described in QSAR model reporting format and were validated by the European Commission's Joint Research Center to be OECD compliant. All models are freely available as an open
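
    The core predictive machinery described, a weighted k-nearest-neighbour model over curated descriptors, is straightforward to sketch. The toy below uses inverse-distance weighting on standardized synthetic descriptors; OPERA's actual descriptor sets, distance measure and genetic-algorithm feature selection are not reproduced here.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, X_query, k=5):
    """Distance-weighted k-nearest-neighbour prediction of a continuous
    property (e.g. logP) from molecular descriptors; Euclidean distance
    on standardized descriptors, inverse-distance weights."""
    mu, sd = X_train.mean(0), X_train.std(0) + 1e-12
    Xt, Xq = (X_train - mu) / sd, (X_query - mu) / sd
    preds = np.empty(len(Xq))
    for i, x in enumerate(Xq):
        d = np.linalg.norm(Xt - x, axis=1)
        nn = np.argsort(d)[:k]                   # k nearest training chemicals
        w = 1.0 / (d[nn] + 1e-12)                # inverse-distance weights
        preds[i] = np.dot(w, y_train[nn]) / w.sum()
    return preds

# Synthetic stand-in for a descriptor table (11 PaDEL-like descriptors)
rng = np.random.default_rng(4)
X = rng.standard_normal((500, 11))
y = 1.5 * X[:, 0] - X[:, 3] + 0.1 * rng.standard_normal(500)
print(weighted_knn_predict(X[:400], y[:400], X[400:405]))
```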

  5. Surrogate Endpoint Evaluation: Principal Stratification Criteria and the Prentice Definition.

    Science.gov (United States)

    Gilbert, Peter B; Gabriel, Erin E; Huang, Ying; Chan, Ivan S F

    2015-09-01

    A common problem of interest within a randomized clinical trial is the evaluation of an inexpensive response endpoint as a valid surrogate endpoint for a clinical endpoint, where a chief purpose of a valid surrogate is to provide a way to make correct inferences on clinical treatment effects in future studies without needing to collect the clinical endpoint data. Within the principal stratification framework for addressing this problem based on data from a single randomized clinical efficacy trial, a variety of definitions and criteria for a good surrogate endpoint have been proposed, all based on or closely related to the "principal effects" or "causal effect predictiveness (CEP)" surface. We discuss CEP-based criteria for a useful surrogate endpoint, including (1) the meaning and relative importance of proposed criteria including average causal necessity (ACN), average causal sufficiency (ACS), and large clinical effect modification; (2) the relationship between these criteria and the Prentice definition of a valid surrogate endpoint; and (3) the relationship between these criteria and the consistency criterion (i.e., assurance against the "surrogate paradox"). This includes the result that ACN plus a strong version of ACS generally do not imply the Prentice definition nor the consistency criterion, but they do have these implications in special cases. Moreover, the converse does not hold except in a special case with a binary candidate surrogate. The results highlight that assumptions about the treatment effect on the clinical endpoint before the candidate surrogate is measured are influential for the ability to draw conclusions about the Prentice definition or consistency. In addition, we emphasize that in some scenarios that occur commonly in practice, the principal strata sub-populations for inference are identifiable from the observable data, in which cases the principal stratification framework has relatively high utility for the purpose of effect

  6. Surrogate Endpoint Evaluation: Principal Stratification Criteria and the Prentice Definition

    Science.gov (United States)

    Gilbert, Peter B.; Gabriel, Erin E.; Huang, Ying; Chan, Ivan S.F.

    2015-01-01

    A common problem of interest within a randomized clinical trial is the evaluation of an inexpensive response endpoint as a valid surrogate endpoint for a clinical endpoint, where a chief purpose of a valid surrogate is to provide a way to make correct inferences on clinical treatment effects in future studies without needing to collect the clinical endpoint data. Within the principal stratification framework for addressing this problem based on data from a single randomized clinical efficacy trial, a variety of definitions and criteria for a good surrogate endpoint have been proposed, all based on or closely related to the “principal effects” or “causal effect predictiveness (CEP)” surface. We discuss CEP-based criteria for a useful surrogate endpoint, including (1) the meaning and relative importance of proposed criteria including average causal necessity (ACN), average causal sufficiency (ACS), and large clinical effect modification; (2) the relationship between these criteria and the Prentice definition of a valid surrogate endpoint; and (3) the relationship between these criteria and the consistency criterion (i.e., assurance against the “surrogate paradox”). This includes the result that ACN plus a strong version of ACS generally do not imply the Prentice definition nor the consistency criterion, but they do have these implications in special cases. Moreover, the converse does not hold except in a special case with a binary candidate surrogate. The results highlight that assumptions about the treatment effect on the clinical endpoint before the candidate surrogate is measured are influential for the ability to draw conclusions about the Prentice definition or consistency. In addition, we emphasize that in some scenarios that occur commonly in practice, the principal strata sub-populations for inference are identifiable from the observable data, in which cases the principal stratification framework has relatively high utility for the purpose of

  7. Sensitivity of submersed freshwater macrophytes and endpoints in laboratory toxicity tests

    International Nuclear Information System (INIS)

    Arts, Gertie H.P.; Belgers, J. Dick M.; Hoekzema, Conny H.; Thissen, Jac T.N.M.

    2008-01-01

    The toxicological sensitivity and variability of a range of macrophyte endpoints were statistically tested with data from chronic, non-axenic, macrophyte toxicity tests. Five submersed freshwater macrophytes, four pesticides/biocides and 13 endpoints were included in the statistical analyses. Root endpoints, reflecting root growth, were most sensitive in the toxicity tests, while endpoints relating to biomass, growth and shoot length were less sensitive. The endpoints with the lowest coefficients of variability were not necessarily the endpoints that were toxicologically most sensitive. Differences in sensitivity were in the range of 10-1000 for different macrophyte-specific endpoints. No macrophyte species was consistently the most sensitive. Criteria to select endpoints in macrophyte toxicity tests should include toxicological sensitivity, variance and ecological relevance. Hence, macrophyte toxicity tests should comprise an array of endpoints, including very sensitive endpoints like those relating to root growth. - A range of endpoints is more representative of macrophyte fitness than biomass and growth only

  8. Analytical continuation in coupling constant method; application to the calculation of resonance energies and widths for organic molecules: Glycine, alanine and valine and dimer of formic acid

    Czech Academy of Sciences Publication Activity Database

    Papp, P.; Matejčík, Š.; Mach, P.; Urban, J.; Paidarová, Ivana; Horáček, J.

    2013-01-01

    Vol. 418, June 2013, pp. 8-13. ISSN 0301-0104. R&D Projects: GA ČR GAP203/12/0665. Institutional support: RVO:61388955. Keywords: analytic continuation * resonances * vertical attachment energy. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 2.028, year: 2013

  9. Sodium chloride as a reference substance for the three growth endpoints used in the Lemna minor L. (1753) test

    Directory of Open Access Journals (Sweden)

    Aline Andrade Godoy

    2017-01-01

    Lemna sp. growth inhibition test standardized protocols suggest the use of compounds such as 3,5-dichlorophenol as reference substances for routinely checking the test organism's sensitivity. However, this and other recommended chemicals present risks to human health and to the environment. Sodium chloride (NaCl) appears as a less toxic alternative reference substance which has been successfully used in routine ecotoxicological tests. However, the evaluation of this compound on the multiple growth endpoints used in the L. minor test, which is required for recommending it as a reference substance for this test organism, has not yet been reported. In the present study, NaCl was tested with L. minor for the growth endpoints frond number, total frond area and fresh weight. Results showed acceptable sensitivity and reproducibility (coefficient of variation < 15.0%) for all three of the measured endpoints. Statistically significant differences were observed between the EC50 values calculated from the three endpoints (p < 0.05). Total frond area was the most sensitive one, with an average EC50 value of 2742.80 ± 245.7 mg L⁻¹. These results suggest that NaCl can be a suitable alternative reference substance and that total frond area should be the endpoint of choice for sensitivity toxicity tests using NaCl.
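
    An EC50 such as the frond-area value reported here is typically obtained by fitting a dose-response curve to growth-inhibition data. A minimal sketch with a two-parameter log-logistic model and invented data points follows; the abstract does not state which regression model the authors actually used.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, ec50, slope):
    """Two-parameter log-logistic dose-response: fraction of control
    growth as a function of concentration."""
    return 1.0 / (1.0 + (conc / ec50) ** slope)

# Hypothetical NaCl frond-area data: concentration (mg/L), fraction of control
conc = np.array([500.0, 1000.0, 2000.0, 3000.0, 4000.0, 6000.0])
resp = np.array([0.96, 0.85, 0.62, 0.45, 0.30, 0.12])

popt, pcov = curve_fit(log_logistic, conc, resp, p0=[2500.0, 2.0])
ec50, slope = popt
print(f"EC50 = {ec50:.0f} mg/L, slope = {slope:.2f}")
```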

  10. Individuals versus organisms versus populations in the definition of ecological assessment endpoints.

    Science.gov (United States)

    Suter, Glenn W; Norton, Susan B; Fairbrother, Anne

    2005-11-01

    Discussions and applications of the policies and practices of the U.S. Environmental Protection Agency (USEPA) in ecological risk assessment will benefit from continued clarification of the concepts of assessment endpoints and of levels of biological organization. First, assessment endpoint entities and attributes can be defined at different levels of organization. Hence, an organism-level attribute, such as growth or survival, can be applied collectively to a population-level entity such as the brook trout in a stream. Second, assessment endpoints for ecological risk assessment are often mistakenly described as "individual level," which leads to the idea that such assessments are intended to protect individuals. Finally, populations play a more important role in risk assessments than is generally recognized. Organism-level attributes are used primarily for population-level assessments. In addition, the USEPA and other agencies already are basing management decisions on population or community entities and attributes such as production of fisheries, abundance of migratory bird populations, and aquatic community composition.

  11. Quantitative 4D Transcatheter Intraarterial Perfusion MR Imaging as a Method to Standardize Angiographic Chemoembolization Endpoints

    Science.gov (United States)

    Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.

    2011-01-01

    PURPOSE: We aimed to test the hypothesis that subjective angiographic endpoints during transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) exhibit consistency and correlate with objective intraprocedural reductions in tumor perfusion as determined by quantitative four-dimensional (4D) transcatheter intraarterial perfusion (TRIP) magnetic resonance (MR) imaging. MATERIALS AND METHODS: This prospective study was approved by the institutional review board. Eighteen consecutive patients underwent TACE in a combined MR/interventional radiology (MR-IR) suite. Three board-certified interventional radiologists independently graded the angiographic endpoint of each procedure based on a previously described subjective angiographic chemoembolization endpoint (SACE) scale. A consensus SACE rating was established for each patient. Patients underwent quantitative 4D TRIP-MR imaging immediately before and after TACE, from which mean whole tumor perfusion (Fρ) was calculated. Consistency of SACE ratings between observers was evaluated using the intraclass correlation coefficient (ICC). The relationship between SACE ratings and intraprocedural TRIP-MR imaging perfusion changes was evaluated using Spearman's rank correlation coefficient. RESULTS: The SACE rating scale demonstrated very good consistency among all observers (ICC = 0.80). The consensus SACE rating was significantly correlated with both absolute (r = 0.54, P = 0.022) and percent (r = 0.85, P < 0.001) intraprocedural perfusion reductions. CONCLUSIONS: The SACE rating scale demonstrates very good consistency between raters, and significantly correlates with objectively measured intraprocedural perfusion reductions during TACE. These results support the use of the SACE scale as a standardized alternative to quantitative 4D TRIP-MR imaging to classify patients based on embolic endpoints of TACE. PMID:22021520

  12. ESCLOUD: A computer program to calculate the air concentration, deposition rate and external dose rate from a continuous discharge of radioactive material to atmosphere

    International Nuclear Information System (INIS)

    Jones, J.A.

    1980-03-01

    Radioactive material may be discharged to atmosphere in small quantities during the normal operation of a nuclear installation as part of a considered waste management practice. Estimates of the individual and collective dose equivalent rates resulting from such a discharge are required in a number of contexts: for example, in assessing compliance with dose limits, in estimating the radiological impact of the discharge and as an input into optimisation studies. The suite of programs which has been developed to undertake such calculations is made up of a number of independent modules one of which, ESCLOUD, is described in this report. The ESCLOUD program evaluates, as a function of distance and direction from the release point, the air concentration, deposition rate and external β and γ doses from airborne and deposited activity. The air concentration and deposition rate can be used as input to other modules for calculating inhalation and ingestion doses. (author)
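
    The abstract does not give ESCLOUD's dispersion formulas, but the canonical calculation of air concentration from a continuous stack release is the Gaussian plume with ground reflection, sketched below with illustrative source and dispersion parameters (all numbers are invented, and the actual ESCLOUD model may differ).

```python
import numpy as np

def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration (Bq/m^3) at
    crosswind offset y and height z, for a continuous point release of
    Q Bq/s at effective height H into a wind of speed u m/s; sigma_y
    and sigma_z are the dispersion parameters evaluated at the downwind
    distance of interest."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + H)**2 / (2 * sigma_z**2)))   # ground reflection
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# 1 GBq/s stack release; assumed neutral-stability sigmas at ~1 km downwind
chi = plume_concentration(Q=1e9, u=5.0, y=0.0, z=0.0,
                          H=30.0, sigma_y=80.0, sigma_z=45.0)
print(f"air concentration: {chi:.3e} Bq/m^3")
```

    The deposition rate then follows as the ground-level concentration multiplied by a deposition velocity, which is how such a module feeds the ingestion and external-dose calculations mentioned in the record.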

  13. Surrogate endpoints for overall survival in digestive oncology trials: which candidates? A questionnaires survey among clinicians and methodologists

    Directory of Open Access Journals (Sweden)

    Bonnetain Franck

    2010-06-01

    Background: Overall survival (OS) is the gold standard for the demonstration of a clinical benefit in cancer trials. Replacement of OS by a surrogate endpoint allows trial duration to be reduced. To date, few surrogate endpoints have been validated in digestive oncology. The aim of this study was to draw up an ordered list of potential surrogate endpoints for OS in digestive cancer trials, by way of a survey among clinicians and methodologists. A secondary objective was to obtain their opinion on surrogacy and quality of life (QoL). Methods: In 2007 and 2008, self administered sequential questionnaires were sent to a panel of French clinicians and methodologists involved in the conduct of cancer clinical trials. In the first questionnaire, panellists were asked to choose the most important characteristics defining a surrogate among six proposals, to give advantages and drawbacks of the surrogates, and to answer questions about their validation and use. Then they had to suggest potential surrogate endpoints for OS in each of the following tumour sites: oesophagus, stomach, liver, pancreas, biliary tract, lymphoma, colon, rectum, and anus. They finally gave their opinion on QoL as a surrogate endpoint. In the second questionnaire, they had to classify the previously proposed candidate surrogates from the most (position #1) to the least relevant in their opinion. The frequency at which the endpoints were chosen as first, second or third most relevant surrogates was calculated and served as the final ranking. Results: The response rate was 30% (24/80) in the first round and 20% (16/80) in the second one. Participants highlighted key points concerning surrogacy. In particular, they reminded that a surrogate endpoint is expected to predict clinical benefit in a well-defined therapeutic situation. Half of them thought it was not relevant to study QoL as a surrogate for OS. DFS, in the neoadjuvant settings or early stages, and PFS, in the non operable or metastatic settings

  14. Surrogate endpoints for overall survival in digestive oncology trials: which candidates? A questionnaires survey among clinicians and methodologists

    International Nuclear Information System (INIS)

    Methy, Nicolas; Bedenne, Laurent; Bonnetain, Franck

    2010-01-01

    Overall survival (OS) is the gold standard for the demonstration of a clinical benefit in cancer trials. Replacement of OS by a surrogate endpoint allows trial duration to be reduced. To date, few surrogate endpoints have been validated in digestive oncology. The aim of this study was to draw up an ordered list of potential surrogate endpoints for OS in digestive cancer trials, by way of a survey among clinicians and methodologists. A secondary objective was to obtain their opinion on surrogacy and quality of life (QoL). In 2007 and 2008, self administered sequential questionnaires were sent to a panel of French clinicians and methodologists involved in the conduct of cancer clinical trials. In the first questionnaire, panellists were asked to choose the most important characteristics defining a surrogate among six proposals, to give advantages and drawbacks of the surrogates, and to answer questions about their validation and use. Then they had to suggest potential surrogate endpoints for OS in each of the following tumour sites: oesophagus, stomach, liver, pancreas, biliary tract, lymphoma, colon, rectum, and anus. They finally gave their opinion on QoL as a surrogate endpoint. In the second questionnaire, they had to classify the previously proposed candidate surrogates from the most (position #1) to the least relevant in their opinion. The frequency at which the endpoints were chosen as first, second or third most relevant surrogates was calculated and served as the final ranking. The response rate was 30% (24/80) in the first round and 20% (16/80) in the second one. Participants highlighted key points concerning surrogacy. In particular, they reminded that a surrogate endpoint is expected to predict clinical benefit in a well-defined therapeutic situation. Half of them thought it was not relevant to study QoL as a surrogate for OS. DFS, in the neoadjuvant settings or early stages, and PFS, in the non operable or metastatic settings, were ranked first, with a frequency of more than

  15. Surrogate endpoints for overall survival in digestive oncology trials: which candidates? A questionnaires survey among clinicians and methodologists.

    Science.gov (United States)

    Methy, Nicolas; Bedenne, Laurent; Bonnetain, Franck

    2010-06-10

    Overall survival (OS) is the gold standard for the demonstration of a clinical benefit in cancer trials. Replacement of OS by a surrogate endpoint allows trial duration to be reduced. To date, few surrogate endpoints have been validated in digestive oncology. The aim of this study was to draw up an ordered list of potential surrogate endpoints for OS in digestive cancer trials, by way of a survey among clinicians and methodologists. A secondary objective was to obtain their opinion on surrogacy and quality of life (QoL). In 2007 and 2008, self administered sequential questionnaires were sent to a panel of French clinicians and methodologists involved in the conduct of cancer clinical trials. In the first questionnaire, panellists were asked to choose the most important characteristics defining a surrogate among six proposals, to give advantages and drawbacks of the surrogates, and to answer questions about their validation and use. Then they had to suggest potential surrogate endpoints for OS in each of the following tumour sites: oesophagus, stomach, liver, pancreas, biliary tract, lymphoma, colon, rectum, and anus. They finally gave their opinion on QoL as a surrogate endpoint. In the second questionnaire, they had to classify the previously proposed candidate surrogates from the most (position #1) to the least relevant in their opinion. The frequency at which the endpoints were chosen as first, second or third most relevant surrogates was calculated and served as the final ranking. The response rate was 30% (24/80) in the first round and 20% (16/80) in the second one. Participants highlighted key points concerning surrogacy. In particular, they reminded that a surrogate endpoint is expected to predict clinical benefit in a well-defined therapeutic situation. Half of them thought it was not relevant to study QoL as a surrogate for OS. DFS, in the neoadjuvant settings or early stages, and PFS, in the non operable or metastatic settings, were ranked first, with a frequency of more than 69

  16. The Asthma Control Questionnaire as a clinical trial endpoint

    DEFF Research Database (Denmark)

    Barnes, P J; Casale, T B; Dahl, Ronald

    2014-01-01

    these component endpoints; however, there is no consensus on the optimal instrument for use in clinical trials. The Asthma Control Questionnaire (ACQ) has been shown to be a valid, reliable instrument that allows accurate and reproducible assessment of asthma control that compares favourably with other commonly...

  17. Phase II Trials for Heterogeneous Patient Populations with a Time-to-Event Endpoint.

    Science.gov (United States)

    Jung, Sin-Ho

    2017-07-01

    In this paper, we consider a single-arm phase II trial with a time-to-event end-point. We assume that the study population has multiple subpopulations with different prognosis, but the study treatment is expected to be similarly efficacious across the subpopulations. We review a stratified one-sample log-rank test and present its sample size calculation method under some practical design settings. Our sample size method requires specification of the prevalence of subpopulations. We observe that the power of the resulting sample size is not very sensitive to misspecification of the prevalence.
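
    A minimal version of the stratified one-sample log-rank statistic can be written down directly: within each stratum, compare the observed number of events with the number expected under that stratum's reference cumulative hazard, then pool. The sketch below does this on simulated data; the reference hazards, censoring and effect size are invented, and the paper's specific sample size formula is not reproduced.

```python
import numpy as np

def one_sample_logrank(times, events, cum_haz0, strata):
    """Stratified one-sample log-rank statistic: Z = (O - E)/sqrt(E),
    where O is the observed event count and E the expected count under
    stratum-specific reference cumulative hazards cum_haz0[s](t)."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    O = E = 0.0
    for s in np.unique(strata):
        m = strata == s
        O += events[m].sum()
        E += cum_haz0[s](times[m]).sum()
    return (O - E) / np.sqrt(E)

# Toy data: two prognostic strata, exponential reference hazards, and a
# treatment that lowers the hazard similarly in both strata
rng = np.random.default_rng(6)
n = 60
strata = rng.integers(0, 2, n)
rates0 = {0: 0.10, 1: 0.25}                          # reference hazards
scale_trt = 1.0 / np.where(strata == 0, 0.06, 0.15)  # treated hazards (lower)
t = rng.exponential(scale_trt)
cens = rng.uniform(5, 15, n)
times, events = np.minimum(t, cens), (t <= cens).astype(int)
cum_haz0 = {s: (lambda x, r=r: r * x) for s, r in rates0.items()}
print("Z =", one_sample_logrank(times, events, cum_haz0, strata))
```

    A clearly negative Z (fewer events than the historical reference predicts) is the signal of efficacy in such a single-arm design.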

  18. Biomechanical constraints on the feedforward regulation of endpoint stiffness.

    Science.gov (United States)

    Hu, Xiao; Murray, Wendy M; Perreault, Eric J

    2012-10-01

    Although many daily tasks tend to destabilize arm posture, it is still possible to have stable interactions with the environment by regulating the multijoint mechanics of the arm in a task-appropriate manner. For postural tasks, this regulation involves the appropriate control of endpoint stiffness, which represents the stiffness of the arm at the hand. Although experimental studies have been used to evaluate endpoint stiffness control, including the orientation of maximal stiffness, the underlying neural strategies remain unknown. Specifically, the relative importance of feedforward and feedback mechanisms has yet to be determined due to the difficulty separately identifying the contributions of these mechanisms in human experiments. This study used a previously validated three-dimensional musculoskeletal model of the arm to quantify the degree to which the orientation of maximal endpoint stiffness could be changed using only steady-state muscle activations, used to represent feedforward motor commands. Our hypothesis was that the feedforward control of endpoint stiffness orientation would be significantly constrained by the biomechanical properties of the musculoskeletal system. Our results supported this hypothesis, demonstrating substantial biomechanical constraints on the ability to regulate endpoint stiffness throughout the workspace. The ability to regulate stiffness orientation was further constrained by additional task requirements, such as the need to support the arm against gravity or exert forces on the environment. Together, these results bound the degree to which slowly varying feedforward motor commands can be used to regulate the orientation of maximum arm stiffness and provide a context for better understanding conditions in which feedback control may be needed.
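
    The quantity being regulated, endpoint stiffness, is related to joint stiffness through the arm's Jacobian. Ignoring the posture-dependent correction term from the Jacobian derivative, a minimal sketch for a planar two-link arm follows; the link lengths and joint-stiffness matrix are illustrative, not taken from the paper's three-dimensional musculoskeletal model.

```python
import numpy as np

def endpoint_stiffness(q1, q2, K_joint, l1=0.3, l2=0.33):
    """Map a 2x2 joint-stiffness matrix to endpoint (hand) stiffness for
    a planar two-link arm: K_end = J^{-T} K_joint J^{-1}."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    J = np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                  [ l1 * c1 + l2 * c12,  l2 * c12]])   # velocity Jacobian
    Jinv = np.linalg.inv(J)
    return Jinv.T @ K_joint @ Jinv

K_joint = np.array([[15.0, 6.0], [6.0, 10.0]])   # N*m/rad, illustrative
K_end = endpoint_stiffness(np.deg2rad(45), np.deg2rad(70), K_joint)
evals, evecs = np.linalg.eigh(K_end)
angle = np.degrees(np.arctan2(evecs[1, -1], evecs[0, -1]))
print(K_end)
print(f"orientation of maximal stiffness: {angle:.1f} deg")
```

    Because the Jacobian is fixed by posture, sweeping the feasible joint-stiffness matrices at a given arm configuration shows directly how biomechanics constrains the achievable orientations of maximal endpoint stiffness, which is the question the study addresses.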

  19. Modeling hard clinical end-point data in economic analyses.

    Science.gov (United States)

    Kansal, Anuraag R; Zheng, Ying; Palencia, Roberto; Ruffolo, Antonio; Hass, Bastian; Sorensen, Sonja V

    2013-11-01

    The availability of hard clinical end-point data, such as that on cardiovascular (CV) events among patients with type 2 diabetes mellitus, is increasing, and as a result there is growing interest in using hard end-point data of this type in economic analyses. This study investigated published approaches for modeling hard end-points from clinical trials and evaluated their applicability in health economic models with different disease features. A review of cost-effectiveness models of interventions in clinically significant therapeutic areas (CV diseases, cancer, and chronic lower respiratory diseases) was conducted in PubMed and Embase using a defined search strategy. Only studies integrating hard end-point data from randomized clinical trials were considered. For each study included, clinical input characteristics and modeling approach were summarized and evaluated. A total of 33 articles (23 CV, 8 cancer, 2 respiratory) were accepted for detailed analysis. Decision trees, Markov models, discrete event simulations, and hybrids were used. Event rates were incorporated either as constant rates, time-dependent risks, or risk equations based on patient characteristics. Risks dependent on time and/or patient characteristics were used where major event rates were >1%/year in models with fewer health states. Models of infrequent events or with numerous health states generally preferred constant event rates. The detailed modeling information and terminology varied, sometimes requiring interpretation. Key considerations for cost-effectiveness models incorporating hard end-point data include the frequency and characteristics of the relevant clinical events and how the trial data is reported. When event risk is low, simplification of both the model structure and event rate modeling is recommended. When event risk is common, such as in high risk populations, more detailed modeling approaches, including individual simulations or explicitly time-dependent event rates, are recommended.

  20. Calculating the Mean Amplitude of Glycemic Excursions from Continuous Glucose Data Using an Open-Code Programmable Algorithm Based on the Integer Nonlinear Method.

    Science.gov (United States)

    Yu, Xuefei; Lin, Liangzhuo; Shen, Jie; Chen, Zhi; Jian, Jun; Li, Bin; Xin, Sherman Xuegang

    2018-01-01

    The mean amplitude of glycemic excursions (MAGE) is an essential index for glycemic variability assessment, which is treated as a key reference for blood glucose control in the clinic. However, the traditional "ruler and pencil" manual method for the calculation of MAGE is time-consuming and prone to error due to the huge data size, making the development of a robust computer-aided program an urgent requirement. Although several software products are available instead of manual calculation, poor agreement among them has been reported. Therefore, more studies are required in this field. In this paper, we developed a mathematical algorithm based on integer nonlinear programming. Following the proposed mathematical method, an open-code computer program named MAGECAA v1.0 was developed and validated. The results of the statistical analysis indicated that the developed program was robust compared to the manual method. The agreement between the developed program and currently available popular software is satisfactory, indicating that concern about disagreement among different software products is unnecessary. The open-code programmable algorithm is an extra resource for peers interested in related methodological studies in the future.
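
    For orientation, the classic definition of MAGE — the mean of excursion amplitudes exceeding one standard deviation of the trace — can be sketched in a few lines. The version below smooths the trace and takes amplitudes between successive turning points; it is a simplified illustration of the quantity itself, not the paper's integer-nonlinear-programming algorithm.

```python
import numpy as np

def mage(glucose, window=9):
    """Classic MAGE: mean amplitude of glycemic excursions larger than
    one standard deviation of the whole trace (simplified sketch)."""
    g = np.asarray(glucose, float)
    sd = g.std(ddof=1)
    # light smoothing so sensor noise does not fragment excursions
    s = np.convolve(g, np.ones(window) / window, mode="same")
    d = np.sign(np.diff(s))
    d[d == 0] = 1
    turns = np.where(np.diff(d) != 0)[0] + 1          # turning points
    pts = s[np.r_[0, turns, len(s) - 1]]
    amps = np.abs(np.diff(pts))
    amps = amps[amps > sd]                            # keep excursions > 1 SD
    return amps.mean() if len(amps) else 0.0

# Synthetic CGM trace: 5-min sampling for 24 h, meal swings plus noise
rng = np.random.default_rng(5)
t = np.arange(0, 24, 5 / 60)
g = 6 + 2.5 * np.sin(2 * np.pi * t / 6) + 0.3 * rng.standard_normal(len(t))
print(f"MAGE = {mage(g):.2f} mmol/L")
```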

  1. Polarized spectra calculation and continuous wave laser operation of Yb-doped disordered Ca3La2(BO3)4 crystal

    Science.gov (United States)

    Wang, Yeqing; Chen, Aixi; You, Zhenyu; Tu, Chaoyang

    2015-12-01

    A notable disordered Yb:Ca3La2(BO3)4 crystal with a Yb3+ ion doping concentration of 10 at.% was grown by the Czochralski method. The polarized absorption, polarized emission, and polarized gain cross sections were systematically calculated. The laser operations were investigated with Yb:Ca3La2(BO3)4 crystals cut along the a, b, and c crystallographic axes. The highest output power of 3.88 W was obtained by using the b-cut Yb:Ca3La2(BO3)4 crystal, with a slope efficiency of 62%. Additionally, it was confirmed that the output laser spectra were largely dependent on the output coupler.

  2. Polarized spectra calculation and continuous wave laser operation of Yb-doped disordered Ca3La2(BO3)4 crystal

    International Nuclear Information System (INIS)

    Wang, Yeqing; Chen, Aixi; You, Zhenyu; Tu, Chaoyang

    2015-01-01

    A notable disordered Yb:Ca3La2(BO3)4 crystal with a Yb3+ ion doping concentration of 10 at.% was grown by the Czochralski method. The polarized absorption, polarized emission, and polarized gain cross sections were systematically calculated. The laser operations were investigated with Yb:Ca3La2(BO3)4 crystals cut along the a, b, and c crystallographic axes. The highest output power of 3.88 W was obtained by using the b-cut Yb:Ca3La2(BO3)4 crystal, with a slope efficiency of 62%. Additionally, it was confirmed that the output laser spectra were largely dependent on the output coupler. (paper)

  3. Patients’ preferences for selection of endpoints in cardiovascular clinical trials

    Directory of Open Access Journals (Sweden)

    Robert D. Chow

    2014-02-01

    Full Text Available Background: To reduce the duration and overall costs of cardiovascular trials, use of combined endpoints in trial design has become commonplace. Though this methodology may serve the needs of investigators and trial sponsors, the preferences of patients or potential trial subjects in the trial design process have not been studied. Objective: To determine the preferences of patients in the design of cardiovascular trials. Design: Participants were surveyed in a pilot study regarding preferences among various single endpoints commonly used in cardiovascular trials, preference for single vs. composite endpoints, and the likelihood of compliance with a heart medication if patients similar to them participated in the trial design process. Participants: One hundred adult English-speaking patients, 38% male, from a primary care ambulatory practice located in an urban setting. Key results: Among single endpoints, participants rated heart attack as significantly more important than death from other causes (4.53 vs. 3.69, p=0.004) on a scale of 1–6. Death from heart disease was rated as significantly more important than chest pain (4.73 vs. 2.47, p<0.001), angioplasty/PCI/CABG (4.73 vs. 2.43, p<0.001), and stroke (4.73 vs. 2.43, p<0.001). Participants also expressed a slight preference for combined endpoints over a single endpoint (43% vs. 57%), incorporation of the opinions of the study patient population into the design of trials (48% vs. 41% for researchers), and a greater likelihood of medication compliance if patient preferences were considered during trial design (67% indicated a significant to major effect). Conclusions: Patients are able to make judgments and express preferences regarding trial design. They prefer that the opinions of the study population rather than the general population be incorporated into the design of the study. This novel approach to study design would not only incorporate patient preferences into medical decision making, but

  4. For Time-Continuous Optimisation

    DEFF Research Database (Denmark)

    Heinrich, Mary Katherine; Ayres, Phil

    2016-01-01

    Strategies for optimisation in design normatively assume an artefact end-point, disallowing continuous architecture that engages living systems, dynamic behaviour, and complex systems. In our Flora Robotica investigations of symbiotic plant-robot bio-hybrids, we require computational tools

  5. Trial endpoints for drug approval in oncology: Chemoprevention.

    Science.gov (United States)

    Beitz, J

    2001-04-01

    As with other drugs, new drug applications for marketing approval of chemopreventive drugs must include data from adequate and well-controlled clinical trials that demonstrate effectiveness and safety for the intended use. This article summarizes the regulatory requirements for traditional marketing approval, as well as for approval under the accelerated approval regulations. Unlike traditional approval, accelerated approval is based on a surrogate endpoint that is reasonably likely to predict clinical benefit. Discussions with the Food and Drug Administration (FDA) regarding the validity of trial endpoints that may serve as surrogates for clinical benefit for accelerated approval should take place as early as possible in drug development. Meetings with the FDA to discuss these issues may be requested throughout the clinical development of a new drug.

  6. Sperm count as a surrogate endpoint for male fertility control.

    Science.gov (United States)

    Benda, Norbert; Gerlinger, Christoph

    2007-11-30

    When assessing the effectiveness of a hormonal method of fertility control in men, the classical approach used for the assessment of hormonal contraceptives in women, estimating the pregnancy rate or using a life-table analysis for the time to pregnancy, is difficult to apply in a clinical development program. The main reasons are the dissociation of the treated unit, i.e. the man, and the observed unit, i.e. his female partner, the high variability in the frequency of male intercourse, the logistical cost, and ethical concerns related to the monitoring of the trial. A reasonable surrogate for the definitive endpoint, time to pregnancy, is sperm count. In addition to avoiding the problems mentioned, trials that compare different treatments become possible with reasonable sample sizes, and study duration can be shorter. However, current products do not suppress sperm production to 100 per cent in all men, and sperm count is only observed with measurement error. Complete azoospermia might not be necessary in order to achieve an acceptable failure rate compared with other forms of male fertility control. Therefore, the use of sperm count as a surrogate endpoint must rely on the results of a previous trial in which both the definitive- and surrogate-endpoint results were assessed. The paper discusses different estimation functions for the mean pregnancy rate (corresponding to the cumulative hazard) that are based on the results of a sperm count trial and a previous trial in which both sperm count and time to pregnancy were assessed, as well as the underlying assumptions. Sample size estimations are given for pregnancy rate estimation with a given precision.

  7. Pulmonary Endpoints in Duchenne Muscular Dystrophy. A Workshop Summary.

    Science.gov (United States)

    Finder, Jonathan; Mayer, Oscar Henry; Sheehan, Daniel; Sawnani, Hemant; Abresch, R Ted; Benditt, Joshua; Birnkrant, David J; Duong, Tina; Henricson, Erik; Kinnett, Kathi; McDonald, Craig M; Connolly, Anne M

    2017-08-15

    Development of novel therapeutics for treatment of Duchenne muscular dystrophy (DMD) has led to clinical trials that include pulmonary endpoints that allow assessment of respiratory muscle status, especially in nonambulatory subjects. Parent Project Muscular Dystrophy (PPMD) convened a workshop in Bethesda, Maryland, on April 14 and 15, 2016, to summarize published respiratory data in DMD and give guidance to clinical researchers assessing the effect of interventions on pulmonary outcomes in DMD.

  8. Exposing the cancer genome atlas as a SPARQL endpoint

    Science.gov (United States)

    Deus, Helena F.; Veiga, Diogo F.; Freire, Pablo R.; Weinstein, John N.; Mills, Gordon B.; Almeida, Jonas S.

    2011-01-01

    The Cancer Genome Atlas (TCGA) is a multidisciplinary, multi-institutional effort to characterize several types of cancer. Datasets from biomedical domains such as TCGA present a particularly challenging task for those interested in dynamically aggregating its results because the data sources are typically both heterogeneous and distributed. The Linked Data best practices offer a solution to integrate and discover data with those characteristics, namely through exposure of data as Web services supporting SPARQL, the Resource Description Framework query language. Most SPARQL endpoints, however, cannot easily be queried by data experts. Furthermore, exposing experimental data as SPARQL endpoints remains a challenging task because, in most cases, data must first be converted to Resource Description Framework triples. In line with those requirements, we have developed an infrastructure to expose clinical, demographic and molecular data elements generated by TCGA as a SPARQL endpoint by assigning elements to entities of the Simple Sloppy Semantic Database (S3DB) management model. All components of the infrastructure are available as independent Representational State Transfer (REST) Web services to encourage reusability, and a simple interface was developed to automatically assemble SPARQL queries by navigating a representation of the TCGA domain. A key feature of the proposed solution that greatly facilitates assembly of SPARQL queries is the distinction between the TCGA domain descriptors and data elements. Furthermore, the use of the S3DB management model as a mediator enables queries to both public and protected data without the need for prior submission to a single data source. PMID:20851208
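
    As a sketch of how a SPARQL endpoint of this kind is consumed, the snippet below posts a query over HTTP using only the Python standard library. The endpoint URL and the class/predicate IRIs are placeholders (the paper does not list its vocabulary, and the original service may no longer be live), but the query-parameter and JSON-results conventions shown are the standard SPARQL protocol.

```python
import json
import urllib.parse
import urllib.request

# Placeholder URL: the S3DB-backed TCGA endpoint described in the paper is
# not guaranteed to be live; any SPARQL endpoint can be substituted.
ENDPOINT = "http://example.org/tcga/sparql"

# Hypothetical IRIs -- the paper does not list its vocabulary.
QUERY = """
SELECT ?patient ?disease WHERE {
  ?patient a <http://example.org/tcga#Patient> ;
           <http://example.org/tcga#disease> ?disease .
} LIMIT 10
"""

req = urllib.request.Request(
    ENDPOINT,
    data=urllib.parse.urlencode({"query": QUERY}).encode(),
    headers={"Accept": "application/sparql-results+json"},
)
with urllib.request.urlopen(req) as resp:
    for row in json.load(resp)["results"]["bindings"]:
        print(row["patient"]["value"], row["disease"]["value"])
```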

  9. Overview of Biomarkers and Surrogate Endpoints in Drug Development

    Directory of Open Access Journals (Sweden)

    John A. Wagner

    2002-01-01

    Full Text Available There are numerous factors that recommend the use of biomarkers in drug development, including the ability to provide a rational basis for selection of lead compounds, to aid in determining or refining mechanism of action or pathophysiology, and to work towards qualification and use of a biomarker as a surrogate endpoint. Examples of biomarkers come from many different means of clinical and laboratory measurement. Total cholesterol is an example of a clinically useful biomarker that was successfully qualified for use as a surrogate endpoint. Biomarkers require validation in most circumstances. Validation of biomarker assays is a necessary component of delivering the high-quality research data required for effective use of biomarkers. Qualification is necessary for use of a biomarker as a surrogate endpoint. Putative biomarkers are typically identified because of a relationship to known or hypothetical steps in a pathophysiologic cascade. Biomarker discovery can also be effected by expression profiling experiments using a variety of array technologies and related methods. For example, expression profiling experiments enabled the discovery of adipocyte-related complement protein of 30 kDa (Acrp30, or adiponectin) as a biomarker for in vivo activation of peroxisome proliferator-activated receptor γ (PPARγ).

  10. Congenital Adrenal Hyperplasia: Classification of Studies Employing Psychological Endpoints

    Directory of Open Access Journals (Sweden)

    Sandberg DavidE

    2010-08-01

    Full Text Available Psychological outcomes in persons with congenital adrenal hyperplasia (CAH) have received substantial attention. The objectives of this paper were to (1) catalog psychological endpoints assessed in CAH outcome studies and (2) classify the conceptual/theoretical model shaping the research design and interpretation of CAH-related psychological effects. A total of 98 original research studies, published between 1955 and 2009, were categorized based on psychological endpoints examined as well as the research design and conceptual model guiding analysis and interpretation of data. The majority of studies (68%) investigated endpoints related to psychosexual differentiation. The preponderance of studies (76%) examined a direct relationship (i.e., inferring causality) between prenatal androgen exposure and psychological outcomes. Findings are discussed in relation to the observed imbalance between theoretical interest in the role of prenatal androgens in shaping psychosexual differentiation and a broader conceptual model that examines the role of other potential factors in mediating or moderating the influence of CAH pathophysiology on psychological outcomes in both affected females and males. The latter approach offers to identify factors amenable to clinical intervention that enhance both health and quality-of-life outcomes in CAH as well as other disorders of sex development.

  11. Continuous tokamaks

    International Nuclear Information System (INIS)

    Peng, Y.K.M.

    1978-04-01

    A tokamak configuration is proposed that permits the rapid replacement of a plasma discharge in a "burn" chamber by another one in a time scale much shorter than the elementary thermal time constant of the chamber first wall. With respect to the chamber, the effective duty cycle factor can thus be made arbitrarily close to unity, minimizing the cyclic thermal stress in the first wall. At least one plasma discharge always exists in the new tokamak configuration, hence, a continuous tokamak. By incorporating adiabatic toroidal compression, configurations of continuous tokamak compressors are introduced. To operate continuous tokamaks, it is necessary to introduce the concept of mixed poloidal field coils, which spatially groups all the poloidal field coils into three sets, all contributing simultaneously to inducing the plasma current and maintaining the proper plasma shape and position. Preliminary numerical calculations of axisymmetric MHD equilibria in continuous tokamaks indicate the feasibility of their continued plasma operation. Advanced concepts of continuous tokamaks to reduce the topological complexity and to allow the burn plasma aspect ratio to decrease for increased beta are then suggested.

  12. A European inter-laboratory validation of alternative endpoints of the murine local lymph node assay

    International Nuclear Information System (INIS)

    Ehling, G.; Hecht, M.; Heusener, A.; Huesler, J.; Gamer, A.O.; Loveren, H. van; Maurer, Th.; Riecke, K.; Ullmann, L.; Ulrich, P.; Vandebriel, R.; Vohr, H.-W.

    2005-01-01

    The original local lymph node assay (LLNA) is based on the use of radioactive labelling to measure cell proliferation. Other endpoints for the assessment of proliferation are also authorized by OECD Guideline 429, provided there is appropriate scientific support, including full citations and a description of the methodology (OECD, 2002. OECD Guideline for the Testing of Chemicals; Skin Sensitization: Local Lymph Node Assay, Guideline 429. Paris, adopted 24th April 2002). Here, we describe the outcome of the second round of an inter-laboratory validation of alternative endpoints in the LLNA conducted in nine laboratories in Europe. The validation study was managed and supervised by the Swiss Agency for Therapeutic Products (Swissmedic) in Bern. Ear-draining lymph node (LN) weight and cell counts were used to assess LN cell proliferation instead of [3H]TdR incorporation. In addition, the acute inflammatory skin reaction was measured by ear weight determination of circular biopsies of the ears to identify skin irritation properties of the test items. The statistical analysis was performed in the department of statistics at the University of Bern. Similar to the EC3 values defined for the radioactive method, threshold values were calculated for the endpoints measured in this modification of the LLNA. It was concluded that all parameters measured have to be taken into consideration for the categorisation of compounds according to their sensitising potencies. Therefore, an assessment scheme has been developed, which turned out to be of great importance for consistently assessing sensitisation versus irritancy based on the data of the different parameters. In contrast to the radioactive method, irritants were picked up by all the laboratories applying this assessment scheme.

  13. Choice of futility boundaries for group sequential designs with two endpoints

    Directory of Open Access Journals (Sweden)

    Svenja Schüler

    2017-08-01

    Full Text Available Abstract Background In clinical trials, the opportunity for an early stop during an interim analysis (either for efficacy or for futility) may save considerable time and financial resources. This is especially important if the planning assumptions required for power calculation are based on a low level of evidence. For example, when including two primary endpoints in the confirmatory analysis, the power of the trial depends on the effects of both endpoints and on their correlation. Assessing the feasibility of such a trial is therefore difficult, as the number of parameter assumptions to be correctly specified is large. For this reason, so-called ‘group sequential designs’ are of particular importance in this setting. Whereas the choice of adequate boundaries to stop a trial early for efficacy has been broadly discussed in the literature, the choice of optimal futility boundaries has not been investigated so far, although this may have serious consequences with respect to performance characteristics. Methods In this work, we propose a general method to construct ‘optimal’ futility boundaries according to predefined criteria. Further, we present three different group sequential designs for two endpoints applying these futility boundaries. Our methods are illustrated by a real clinical trial example and by Monte-Carlo simulations. Results By construction, the provided method of choosing futility boundaries maximizes the probability of correctly stopping in case of small or opposite effects while limiting the power loss and the probability of stopping the study ‘wrongly’. Our results clearly demonstrate the benefit of using such ‘optimal’ futility boundaries, especially compared with futility boundaries commonly applied in practice. Conclusions As the properties of futility boundaries are often not considered in practice and unfavorably chosen futility boundaries may imply bad properties of the study design, we recommend assessing the
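
    The effect of a futility boundary can be explored by simulation even without the paper's optimality criteria. The sketch below simplifies to a single normal endpoint (the article studies two) and estimates, for a given interim z-score boundary, the probability of stopping at the interim and the overall rejection rate; all parameter values are illustrative and alpha spending is ignored.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_stage(delta, n_per_stage, z_futility, z_final=1.96, reps=100_000):
    """Two-stage design with a non-binding interim futility boundary.

    Returns (P[stop for futility at interim], P[reject H0 at the end]).
    z_final is a plain 1.96 here, i.e. alpha spending is ignored.
    """
    x1 = rng.normal(delta, 1, (reps, n_per_stage)).mean(axis=1)
    z1 = x1 * np.sqrt(n_per_stage)                 # interim z-statistic
    stop = z1 < z_futility                         # futility stop rule
    x2 = rng.normal(delta, 1, (reps, n_per_stage)).mean(axis=1)
    z = (x1 + x2) / 2 * np.sqrt(2 * n_per_stage)   # final z on all data
    reject = ~stop & (z > z_final)
    return stop.mean(), reject.mean()

# Under H0 (delta = 0) a good boundary stops often; under a true effect
# (delta = 0.3) it should cost little power.
print(two_stage(0.0, 50, z_futility=0.0))
print(two_stage(0.3, 50, z_futility=0.0))
```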

  14. Exposing the cancer genome atlas as a SPARQL endpoint.

    Science.gov (United States)

    Deus, Helena F; Veiga, Diogo F; Freire, Pablo R; Weinstein, John N; Mills, Gordon B; Almeida, Jonas S

    2010-12-01

    The Cancer Genome Atlas (TCGA) is a multidisciplinary, multi-institutional effort to characterize several types of cancer. Datasets from biomedical domains such as TCGA present a particularly challenging task for those interested in dynamically aggregating its results because the data sources are typically both heterogeneous and distributed. The Linked Data best practices offer a solution to integrate and discover data with those characteristics, namely through exposure of data as Web services supporting SPARQL, the Resource Description Framework query language. Most SPARQL endpoints, however, cannot easily be queried by data experts. Furthermore, exposing experimental data as SPARQL endpoints remains a challenging task because, in most cases, data must first be converted to Resource Description Framework triples. In line with those requirements, we have developed an infrastructure to expose clinical, demographic and molecular data elements generated by TCGA as a SPARQL endpoint by assigning elements to entities of the Simple Sloppy Semantic Database (S3DB) management model. All components of the infrastructure are available as independent Representational State Transfer (REST) Web services to encourage reusability, and a simple interface was developed to automatically assemble SPARQL queries by navigating a representation of the TCGA domain. A key feature of the proposed solution that greatly facilitates assembly of SPARQL queries is the distinction between the TCGA domain descriptors and data elements. Furthermore, the use of the S3DB management model as a mediator enables queries to both public and protected data without the need for prior submission to a single data source. Copyright © 2010 Elsevier Inc. All rights reserved.

  15. Two-temperature LATE-PCR endpoint genotyping

    Directory of Open Access Journals (Sweden)

    Reis Arthur H

    2006-12-01

    Full Text Available Abstract Background In conventional PCR, total amplicon yield becomes independent of starting template number as amplification reaches plateau and varies significantly among replicate reactions. This paper describes a strategy for reconfiguring PCR so that the signal intensity of a single fluorescent detection probe after PCR thermal cycling reflects genomic composition. The resulting method corrects for product yield variations among replicate amplification reactions, permits resolution of homozygous and heterozygous genotypes based on endpoint fluorescence signal intensities, and readily identifies imbalanced allele ratios equivalent to those arising from gene/chromosomal duplications. Furthermore, the use of only a single colored probe for genotyping enhances the multiplex detection capacity of the assay. Results Two-Temperature LATE-PCR endpoint genotyping combines Linear-After-The-Exponential PCR (LATE-PCR, an advanced form of asymmetric PCR that efficiently generates single-stranded DNA) with mismatch-tolerant probes capable of detecting allele-specific targets at high temperature and total single-stranded amplicons at a lower temperature in the same reaction. The method is demonstrated here for genotyping single-nucleotide alleles of the human HEXA gene responsible for Tay-Sachs disease and for genotyping SNP alleles near the human p53 tumor suppressor gene. In each case, the final probe signals were normalized against total single-stranded DNA generated in the same reaction. Normalization reduces the coefficient of variation among replicates from 17.22% to as little as 2.78% and permits endpoint genotyping with >99.7% accuracy. These assays are robust because they are consistent over a wide range of input DNA concentrations and give the same results regardless of how many cycles of linear amplification have elapsed. The method is also sufficiently powerful to distinguish samples with a 1:1 ratio of two alleles from samples comprised of

  16. Poison 1 - a programme for calculation of reactivity transients due to fission product poisoning and its application in continuous determination of xenon and samarium poisoning in reactor KS-150

    International Nuclear Information System (INIS)

    Rana, S.B.

    1973-12-01

    The report contains a user's description of the 3-dimensional programme POISON 1 for calculating reactivity transients due to fission-product poisoning during changes of reactor power. The chapter dealing with Xe poisoning contains a description of the Xe tables, the method of operational determination of Xe poisoning, the use of Xe transients for calibrating control rods, and means of shutting down the reactor without being overridden by Xe poisoning. Sm poisoning is determined continuously on the basis of the power diagram of reactor operation. In conclusion, the possibility of using the programme in a process computer in combination with self-powered detectors as local power sensors is indicated. (author)

  17. Endpoint-based parallel data processing in a parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2014-08-12

    Endpoint-based parallel data processing in a parallel active messaging interface ('PAMI') of a parallel computer, the PAMI composed of data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes coupled for data communications through the PAMI, including establishing a data communications geometry, the geometry specifying, for tasks representing processes of execution of the parallel application, a set of endpoints that are used in collective operations of the PAMI including a plurality of endpoints for one of the tasks; receiving in endpoints of the geometry an instruction for a collective operation; and executing the instruction for a collective operation through the endpoints in dependence upon the geometry, including dividing data communications operations among the plurality of endpoints for one of the tasks.

  18. Verification of models for ballistic movement time and endpoint variability.

    Science.gov (United States)

    Lin, Ray F; Drury, Colin G

    2013-01-01

    A hand control movement is composed of several ballistic movements. The time required to perform a ballistic movement and its endpoint variability are two important properties in developing movement models. The purpose of this study was to test potential models for predicting these two properties. Twelve participants conducted ballistic movements of specific amplitudes using a drawing tablet. The measured data on movement time and endpoint variability were then used to verify the models. This study was successful, with Hoffmann and Gan's movement time model (Hoffmann, 1981; Gan and Hoffmann, 1988) predicting more than 90.7% of data variance for 84 individual measurements. A new, theoretically developed ballistic movement variability model proved to be better than Howarth, Beggs, and Bowden's (1971) model, predicting on average 84.8% of stopping-variable error and 88.3% of aiming-variable error. These two validated models will help build solid theoretical movement models and evaluate input devices. This article provides better models for predicting end accuracy and movement time of ballistic movements, which are desirable in rapid aiming tasks such as keying in numbers on a smart phone. The models allow better design of aiming tasks, for example button sizes on mobile phones for different user populations.
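
    For illustration, the ballistic movement-time model attributed to Gan and Hoffmann is commonly written MT = a + b·sqrt(A), linear in the square root of the movement amplitude A. The fit below shows how a variance-explained figure of the kind reported above is obtained; the model form is quoted from memory and the data are invented, so treat both as assumptions.

```python
import numpy as np

# Hypothetical amplitudes (mm) and movement times (ms).
A = np.array([20.0, 40.0, 80.0, 160.0, 320.0])
MT = np.array([148.0, 171.0, 203.0, 248.0, 310.0])

# Fit MT = a + b * sqrt(A): ordinary linear regression on sqrt(A).
b, a = np.polyfit(np.sqrt(A), MT, 1)
pred = a + b * np.sqrt(A)
r2 = 1 - np.sum((MT - pred) ** 2) / np.sum((MT - MT.mean()) ** 2)
print(f"MT = {a:.1f} + {b:.1f}*sqrt(A)  (R^2 = {r2:.3f})")
```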

  19. Imaging biomarkers as surrogate endpoints for drug development

    International Nuclear Information System (INIS)

    Richter, Wolf S.

    2006-01-01

    The employment of biomarkers (including imaging biomarkers, especially PET) in drug development has gained increasing attention during recent years. This has been partly stimulated by the hope that the integration of biomarkers into drug development programmes may be a means to increase the efficiency and effectiveness of the drug development process by early identification of promising drug candidates - thereby counteracting the rising costs of drug development. More importantly, however, the interest in biomarkers for drug development is the logical consequence of recent advances in biosciences and medicine which are leading to target-specific treatments in the framework of "personalised medicine". A considerable proportion of target-specific drugs will show effects in subgroups of patients only. Biomarkers are a means to identify potential responders, or patient subgroups at risk for specific side-effects. Biomarkers are used in early drug development in the context of translational medicine to gain information about the drug's potential in different patient groups and disease states. The information obtained at this stage is mainly important for designing subsequent clinical trials and to identify promising drug candidates. Biomarkers in later phases of clinical development may - if properly validated - serve as surrogate endpoints for clinical outcomes. Regulatory agencies in the EU and the USA have facilitated the use of biomarkers early in the development process. The validation of biomarkers as surrogate endpoints is part of FDA's "critical path initiative". (orig.)

  20. Time-dependent efficacy of longitudinal biomarker for clinical endpoint.

    Science.gov (United States)

    Kolamunnage-Dona, Ruwanthi; Williamson, Paula R

    2018-06-01

    Joint modelling of longitudinal biomarker and event-time processes has gained popularity in recent years because it yields more accurate and precise estimates. Within this modelling framework, a new methodology for evaluating the time-dependent efficacy of a longitudinal biomarker for a clinical endpoint is proposed in this article. In particular, the proposed model assesses how well longitudinally repeated measurements of a biomarker over various time periods (0,t) distinguish between individuals who developed the disease by time t and individuals who remain disease-free beyond time t. The receiver operating characteristic curve is used to provide the corresponding efficacy summaries at various t, based on the association between the longitudinal biomarker trajectory and the risk of the clinical endpoint prior to each time point. The model also allows detection of the time period over which a biomarker should be monitored for its best discriminatory value. The proposed approach is evaluated through simulation and illustrated on the motivating dataset from a prospective observational study of biomarkers to diagnose the onset of sepsis.

  1. Cortical plasticity as a new endpoint measurement for chronic pain

    Directory of Open Access Journals (Sweden)

    Zhuo Min

    2011-07-01

    Full Text Available Abstract Animal models of chronic pain are widely used to investigate basic mechanisms of chronic pain and to evaluate potential novel drugs for treating chronic pain. Among the different criteria used to measure chronic pain, behavioral responses are commonly used as the endpoint measurements. However, not all chronic pain conditions can be easily measured by behavioral responses, such as headache, phantom pain and pain related to spinal cord injury. Here I propose that cortical indexes, which indicate neuronal plastic changes in pain-related cortical areas, can be used as endpoint measurements for chronic pain. Such cortical indexes are not only useful for those chronic pain conditions where a suitable animal model is lacking, but also serve as additional screening methods for potential drugs to treat chronic pain in humans. These cortical indexes are activity-dependent immediate early genes, electrophysiologically identified plastic changes and biochemical assays of signaling proteins. They can be used to evaluate novel analgesic compounds that may act at peripheral or spinal sites. I hope that these new cortical endpoint measurements will facilitate our search for new, and more effective, pain medicines, and help to reduce false lead drug targets.

  2. A yeast screening system for simultaneously monitoring multiple genetic endpoints

    International Nuclear Information System (INIS)

    Dixon, M.L.; Mortimer, R.K.

    1986-01-01

    Mutation, recombination, and mitochondrial deficiencies have been proposed to have roles in the carcinogenic process. The authors describe a diploid strain of the yeast Saccharomyces cerevisiae capable of detecting this wide spectrum of genetic changes. The markers used for monitoring these events have been especially well characterized genetically. Ultraviolet light was chosen as a model carcinogenic agent to test this system. In addition to highly significant increases in the frequencies of each genetic change, increases in the absolute numbers of each change indicated induction and not selective survival. The relative amounts of each type of genetic change varied with dose. The wide spectrum of endpoints monitored in the XD83 yeast system may allow the detection of certain carcinogens and other genetically toxic agents which have escaped detection in more limited systems. Since only one strain is required to simultaneously monitor these genetic changes, this assay system should facilitate comparisons of the induced changes and be more efficient than using multiple strains to monitor the same endpoints. (Auth.)

  3. Parametric Study of Beta-Endpoint Energy in Direct Energy Converters

    Science.gov (United States)

    2007-01-01

    ...value to the endpoint energy of nickel-63 (Ni-63), whose endpoint energy is 66 keV. Only an approximation is sought. Nickel-63 is an easily... ...is known to vary from Sr-90's spectrum, where instead of peaking at approximately one third of the endpoint energy, the peak of Ni-63's output spectrum...

  4. Lethal and sublethal endpoints observed for Artemia exposed to two reference toxicants and an ecotoxicological concern organic compound.

    Science.gov (United States)

    Manfra, Loredana; Canepa, Sara; Piazza, Veronica; Faimali, Marco

    2016-01-01

    Swimming speed alteration and mortality assays with the marine crustacean Artemia franciscana were carried out. EC50 and LC50 values after 24-48 h exposures were calculated for two reference toxicants, copper sulphate pentahydrate (CuSO4·5H2O) and sodium dodecyl sulphate (SDS), and an organic compound of ecotoxicological concern, diethylene glycol (DEG). Different endpoints were evaluated in order to compare their sensitivity levels. The swimming speed alteration (SSA) was compared to mortality values and also to the hatching rate inhibition (literature data). SSA proved to be more sensitive than mortality, with a sensitivity comparable to (or even higher than) that of the hatching rate endpoint. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Optimal descriptor as a translator of eclectic data into endpoint prediction: mutagenicity of fullerene as a mathematical function of conditions.

    Science.gov (United States)

    Toropov, Andrey A; Toropova, Alla P

    2014-06-01

    The experimental data from the bacterial reverse mutation test on C60 nanoparticles (TA100) are examined as an endpoint. By means of optimal descriptors calculated with the Monte Carlo method, a mathematical model of the endpoint has been built up. The model is a mathematical function of (i) dose (g/plate); (ii) metabolic activation (i.e. with or without S9 mix); and (iii) illumination (i.e. dark or irradiation). The statistical quality of the model is the following: n=10, r²=0.7549, q²=0.5709, s=7.67, F=25 (training set); n=5, r²=0.8987, s=18.4 (calibration set); and n=5, r²=0.6968, s=10.9 (validation set). Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. QED contribution to the color-singlet J/ψ production in Υ decay near the endpoint

    International Nuclear Information System (INIS)

    Liu Xiaohui

    2010-01-01

    A recent study indicates that the QED contributions to Υ→J/ψ+X decay at order α^2 α_s^2 are comparable to the QCD contributions. However, in the endpoint region the nonrelativistic QED calculation breaks down, since the collinear degrees of freedom are missing in the framework of this effective theory. In this paper we apply soft-collinear effective theory (SCET) to study the color-singlet QED process at the kinematic limit. Within this approach we are able to resum the kinematic logarithms by running operators using the renormalization group equations of SCET, which leads to a dramatic change in the momentum distribution near the endpoint and a spectrum shape consistent with the experimental results.

  7. On the Higher Moments of Particle Multiplicity, Chemical Freeze-Out, and QCD Critical Endpoint

    Directory of Open Access Journals (Sweden)

    A. Tawfik

    2013-01-01

    Full Text Available We calculate the first six non-normalized moments of particle multiplicity within the framework of the hadron resonance gas model. In terms of the lower-order moments and corresponding correlation functions, general expressions for higher-order moments are derived. The thermal evolution of the first four normalized moments and their products (ratios) is studied at different chemical potentials, so that it is possible to evaluate them at the chemical freeze-out curve. It is found that a nonmonotonic behaviour reflecting the dynamical fluctuation and strong correlation of particles starts to appear from the normalized third-order moment. We introduce novel conditions for describing the chemical freeze-out curve. Although the hadron resonance gas model does not contain any information on the criticality related to chiral dynamics and singularities in physical observables, we are able to locate the QCD critical endpoint at μ ~ 350 MeV and temperature T ~ 162 MeV.
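
    As a generic illustration of the quantities involved (not the hadron resonance gas expressions of the paper), the snippet below estimates the normalized moments of a sampled multiplicity distribution and the volume-independent products S·σ and κ·σ² that are typically evaluated along the freeze-out curve; for the Poisson example both products are close to 1.

```python
import numpy as np

def moment_products(samples):
    """Normalized moments of a multiplicity distribution and the
    volume-independent products commonly compared with experiment."""
    x = np.asarray(samples, dtype=float)
    mu = x.mean()
    c2 = np.mean((x - mu) ** 2)                # variance
    c3 = np.mean((x - mu) ** 3)                # third central moment
    c4 = np.mean((x - mu) ** 4) - 3 * c2 ** 2  # fourth cumulant
    sigma = np.sqrt(c2)
    S = c3 / sigma ** 3                        # skewness
    kappa = c4 / c2 ** 2                       # excess kurtosis
    return S * sigma, kappa * c2               # S*sigma, kappa*sigma^2

# For a Poisson multiplicity distribution both products are ~1.
n = np.random.default_rng(1).poisson(20, 200_000)
print(moment_products(n))
```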

  8. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in analysis of very complex systems.

  9. Radiological impacts analysis with use of new endpoint as complementary safety indicators

    International Nuclear Information System (INIS)

    Peralta Vital, J.L.; Gil Castillo, R.; Fleitas Estevez, G.G.; Olivera Acosta, J.

    2015-01-01

    The paper presents new safety indicators for the risk (safety) assessment of radioactive waste management: concentrations and fluxes of naturally occurring radioactive materials (NORM). The endpoints obtained allow a better analysis of the radiological impact associated with a radioactive waste isolation system. The common safety indicators used for safety assessment, dose and risk, are strongly time dependent, increasing the uncertainties in the results of long-term assessments. The newly proposed complementary endpoints are more stable and are not affected by changes in the critical group, pathways, etc. The NORM values for the facility site were obtained from national surveys; the natural concentrations of U, Ra, Th and K have been associated with the variation of the lithologies in three geographical areas of the country (Occidental, Central and Oriental). The results obtained relate to the safety assessment topics and allowed the new complementary safety indicators to be applied through comparisons between the natural concentrations and fluxes on site and the values calculated for the conceptual repository design. In order to normalize the concentration results, the analysis was performed adopting the criterion of the Repository Equivalent Rock Volume (RERV). The preliminary comparison showed that the calculated concentrations and fluxes in the Cuban conceptual radioactive waste repository are not higher than the natural values in the host rock. According to the application of the new safety indicators, the reference disposal facility does not increase the natural activity concentrations and fluxes in the environment. In order to implement these new safety indicators, the current 226Ra inventory of the repository and the natural 226Ra concentration on the site were used. (authors)

  10. Low dose response analysis through a cytogenetic end-point

    International Nuclear Information System (INIS)

    Bojtor, I.; Koeteles, G.J.

    1998-01-01

    The effects of low doses were studied on human lymphocytes from various individuals. The frequency of micronuclei in cytokinesis-blocked cultured lymphocytes was taken as the endpoint. The probability distribution of the radiation-induced increment was statistically tested and identified as asymmetric when the blood samples had been irradiated with doses of 0.01-0.05 Gy of X-rays, similarly to that in the unirradiated control population. On the contrary, at or above 1 Gy a corresponding normal curve could be accepted, reflecting an approximately symmetrical scatter of the increments about their mean value. It was found that the slope as well as the closeness of correlation of the variables changed considerably when lower and lower dose ranges were selected. Below approximately 0.2 Gy, no relationship was found between the absorbed dose and the increment.

  11. Hausdorff dimension of exponential parameter rays and their endpoints

    International Nuclear Information System (INIS)

    Bailesteanu, Mihai; Balan, Horia Vlad; Schleicher, Dierk

    2008-01-01

    We investigate the set I of parameters κ for which the singular value of the map z ↦ e^z + κ converges to ∞. The set I consists of uncountably many parameter rays, plus landing points of some of these rays (Förster et al 2008 Proc. Am. Math. Soc. 136, at press (Preprint math.DS/0311427)). We show that the parameter rays have Hausdorff dimension 1, which implies (Qiu 1994 Acta Math. Sin. (N.S.) 10 362–8) that the ray endpoints in I alone have dimension 2. Analogous results were known for dynamical planes of exponential maps (Karpińska 1999 C. R. Acad. Sci. Paris Sér. I: Math. 328 1039–44; Schleicher and Zimmer 2003 J. Lond. Math. Soc. 67 380–400); our result shows that this also holds in parameter space.

  12. Biomarkers of intermediate endpoints in environmental and occupational health

    DEFF Research Database (Denmark)

    Knudsen, Lisbeth E; Hansen, Ase M

    2007-01-01

    The use of biomarkers in environmental and occupational health is increasing due to increasing demands for information about health risks from unfavourable exposures. Biomarkers provide information about individual loads. Biomarkers of intermediate endpoints benefit, in comparison with biomarkers of exposure, from the fact that they are closer to the adverse outcome in the pathway from exposure to health effects and may provide powerful information for intervention. Some biomarkers are specific, e.g., DNA and protein adducts, while others are unspecific, like the cytogenetic biomarkers of chromosomal aberrations (CA)... Prediction of the health effect from the result of the measurement has been performed for the cytogenetic biomarkers, showing a predictive value of high levels of CA for increased risk of cancer. The use of CA in future studies is, however, limited by the laborious and sensitive procedure of the test and lack of trained

  13. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
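
    A minimal example of the variance-reduction idea mentioned above: for a rare failure event, crude Monte Carlo wastes nearly all of its samples, whereas drawing from a density shifted into the failure region and reweighting by the likelihood ratio recovers the probability efficiently. The limit-state function is a toy assumption, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
beta = 4.0  # toy limit state: failure when a standard-normal load > 4

def fails(x):
    return x > beta

# Crude Monte Carlo: the true probability is 1 - Phi(4) ~ 3.2e-5, so a
# sample of this size sees almost no failures.
x = rng.normal(0.0, 1.0, N)
print("crude MC:       ", fails(x).mean())

# Importance sampling: sample from N(beta, 1), centred on the failure
# region, and reweight each point by the likelihood ratio f(y)/h(y).
y = rng.normal(beta, 1.0, N)
w = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - beta) ** 2)
print("importance samp:", (fails(y) * w).mean())
```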

  14. Monte Carlo simulations to calculate energy doses in a cow after continuous ingestion of Cs-137 and K-40; Monte-Carlo-Simulationen zur Berechnung der Energiedosis in einem Rind nach kontinuierlicher Aufnahme von Cs-137 und K-40

    Energy Technology Data Exchange (ETDEWEB)

    Pichl, E. [Technische Univ. Graz (Austria). Inst. fuer Medizintechnik; Rabitsch, H. [Technische Univ. Graz (Austria). Arbeitsgebiet Strahlenphysik

    2009-07-01

    Currently the ICRP (International Commission on Radiological Protection) is developing a new recommendation for estimating the natural radiation exposure of an agreed set of reference animals and plants. For estimating effective dose in humans and animals, the incorporated activities of natural and artificial radionuclides in body tissues and contents of the digestive system have to be known. The aim of this investigation was to calculate the energy doses caused by Cs-137 and K-40 in the reproductive organs (uterus, ovaries) of a cow. During its whole lifetime from 1986 to 1992, the cow continuously incorporated Cs-137 originating from the fallout following the Chernobyl accident. K-40 occurs naturally in the cow's fodder. The cow was born in a highly contaminated region of Styria, Austria, and had been infertile since 1990. The activities of Cs-137 and K-40 in the cow's fodder and in tissues, organs and contents of the digestive system of the carcass were measured simultaneously with the help of semiconductor detectors. To calculate the specific absorbed fractions by means of the Monte Carlo code MCNP, an appropriate simulation model of the reproductive organs and their surrounding tissues was developed. The contents of the rectum and urinary bladder account for the main part of the energy dose in the reproductive organs. Comparison of our results with data from other investigations showed that lifetime accumulation of Cs-137 and K-40 was too low to cause radiation-induced infertility. (orig.)

  15. Smarandache Continued Fractions

    OpenAIRE

    Ibstedt, H.

    2001-01-01

    The theory of general continued fractions is developed to the extent required to calculate Smarandache continued fractions to a given number of decimal places. Proof is given for the fact that Smarandache general continued fractions built with positive integer Smarandache sequences having only a finite number of terms equal to 1 are convergent. A few numerical results are given.
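
    A convergent general continued fraction of this kind can be evaluated with a simple backward recurrence; the helper below handles the form a0 + b1/(a1 + b2/(a2 + ...)) in exact rational arithmetic. It is a generic evaluator rather than the author's program; the sqrt(2) example is only a sanity check.

```python
from fractions import Fraction

def continued_fraction(a, b=None):
    """Evaluate a0 + b1/(a1 + b2/(a2 + ...)) by backward recurrence.
    `a` holds the partial denominators; `b` the partial numerators
    (defaults to all ones, i.e. a simple continued fraction)."""
    if b is None:
        b = [1] * (len(a) - 1)
    value = Fraction(a[-1])
    for ai, bi in zip(reversed(a[:-1]), reversed(b)):
        value = ai + bi / value
    return value

# Sanity check: [1; 2, 2, 2, ...] converges to sqrt(2).
print(float(continued_fraction([1] + [2] * 25)))  # 1.41421356...
```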

  16. Investigation of allergenicity of some cosmetic mixtures by using ex vivo local lymph node assay-BrdU endpoints.

    Science.gov (United States)

    Ulker, Ozge Cemiloglu; Kaymak, Yesim; Karakaya, Asuman

    2014-01-01

    Balsam of Peru and fragrance mix are commonly used in cosmetic products. Allergy to fragrance is the most common cause of cosmetic contact dermatitis. In the present study, the ex vivo local lymph node assay-5-bromo-2'-deoxyuridine (LLNA-BrdU) was used to evaluate the dermal sensitization potential of these cosmetic mixtures. The stimulation index values and estimated concentration (EC3) values were calculated and the potency classification was determined for each mixture. At the same time, in order to measure the irritant effect without having to use additional animals, a combination of the ex vivo LLNA-BrdU and the irritancy assay was conducted. Th1 [interleukin (IL)-2, interferon-γ] and Th2 cytokine (IL-4, IL-5) release from lymph node cell culture was investigated as a set of non-radioactive endpoints. According to the results of the ex vivo LLNA-BrdU assays, EC3 values were found to be 3.09% (moderate) for balsam of Peru and 4.44% (moderate) for fragrance mix. The cytokine analysis results indicate that both Th1 and Th2 cytokines are involved in the regulation of murine contact allergy and can be considered useful endpoints. In conclusion, according to our results, fragrance mix and balsam of Peru can be considered moderate sensitizers; however, at high concentrations both of them have irritant properties. The cytokines investigated can be considered endpoints of the ex vivo LLNA-BrdU assay. © 2014 S. Karger AG, Basel.
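
    For reference, EC3 values in LLNA-type assays are conventionally obtained by linear interpolation between the two test concentrations whose stimulation indices bracket SI = 3; a minimal sketch with invented concentration/SI pairs follows.

```python
def ec3(doses, si):
    """EC3 by linear interpolation between the two concentrations whose
    stimulation indices bracket SI = 3 (doses sorted ascending)."""
    for (c, d), (a, b) in zip(zip(doses, si), zip(doses[1:], si[1:])):
        if d < 3 <= b:
            return c + (3 - d) / (b - d) * (a - c)
    return None  # SI = 3 is never crossed

# Hypothetical concentration (%) / stimulation-index pairs.
print(ec3([1.0, 2.5, 5.0], [1.4, 2.6, 4.1]))  # ~3.17 %
```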

  17. Influence of nutrient medium composition on uranium toxicity and choice of the most sensitive growth related endpoint in Lemna minor.

    Science.gov (United States)

    Horemans, Nele; Van Hees, May; Saenen, Eline; Van Hoeck, Arne; Smolders, Valérie; Blust, Ronny; Vandenhove, Hildegarde

    2016-01-01

    Uranium (U) toxicity is known to be highly dependent on U speciation and bioavailability. To assess the impact of uranium on plants, a growth inhibition test was set up in the freshwater macrophyte Lemna minor. First, growth media with different compositions were tested in order to find a medium fit for testing U toxicity in L. minor. The following arguments were used for medium selection: the ability to sustain L. minor growth, a high solubility of U in the medium, and a high percentage of the more toxic U species, namely UO2(2+). Based on these selection criteria, a medium with a low phosphate concentration of 0.5 mg L(-1) and supplemented with 5 mM MES (2-(N-morpholino)ethanesulfonic acid) to ensure pH stability was chosen. This medium also showed the highest U toxicity of the media tested. Subsequently, a full dose-response curve for U was established by exposing L. minor plants to U concentrations ranging from 0.05 μM up to 150 μM for 7 days. Uranium was shown to adversely affect growth of L. minor in a dose-dependent manner, with EC10, EC30 and EC50 values ranging between 1.6-4.8 μM, 7.7-16.4 μM and 19.4-37.2 μM U, respectively, depending on the growth endpoint. Four different growth-related endpoints were tested: frond area, frond number, fresh weight and dry weight. Although differences in relative growth rates and associated ECx values calculated on different endpoints are small (maximal twofold difference), frond area is recommended for measuring U-induced growth effects, as it is a sensitive growth endpoint and easy to measure in vivo, allowing for measurements over time. Copyright © 2015 Elsevier Ltd. All rights reserved.
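
    ECx values of the kind reported above are typically derived by fitting a log-logistic concentration-response model and inverting it. The sketch below shows the procedure on invented data; the two-parameter model and all numbers are assumptions, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(c, ec50, slope):
    """Growth rate relative to control, two-parameter log-logistic."""
    return 1.0 / (1.0 + (c / ec50) ** slope)

# Hypothetical U concentrations (uM) and relative growth rates.
conc = np.array([0.05, 0.5, 2.0, 8.0, 30.0, 120.0])
resp = np.array([1.00, 0.97, 0.88, 0.66, 0.45, 0.18])

(ec50, slope), _ = curve_fit(log_logistic, conc, resp, p0=(20.0, 1.0))

def ecx(x):
    """Concentration causing x percent growth inhibition."""
    return ec50 * (x / (100.0 - x)) ** (1.0 / slope)

print(f"EC10={ecx(10):.1f}  EC30={ecx(30):.1f}  EC50={ec50:.1f} (uM)")
```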

  18. Assessment of sublethal endpoints after chronic exposure of the nematode Caenorhabditis elegans to palladium, platinum and rhodium.

    Science.gov (United States)

    Schertzinger, Gerhard; Zimmermann, Sonja; Grabner, Daniel; Sures, Bernd

    2017-11-01

    The aim of this study was to investigate chronic effects of the platinum-group elements (PGE) palladium (Pd), platinum (Pt) and rhodium (Rh) on the nematode Caenorhabditis elegans. Aquatic toxicity testing was carried out according to ISO 10872 by determining 96 h EC50 values for sublethal endpoints, including growth, fertility and reproduction. Single PGE standard solutions were used as the metal source. Based on the EC50 values for Pt, reproduction (96 h EC50 = 497 μg/L) was the most sensitive endpoint, followed by fertility (96 h EC50 = 726 μg/L) and growth (96 h EC50 = 808 μg/L). For Pd, no precise EC50 values could be calculated due to bell-shaped concentration-response curves, but the 96 h EC50 for reproduction ranged between 10 and 100 μg/L. Pd and Pt had effects on all endpoints. With rising element concentrations, reproduction was inhibited first. At a certain concentration, fertility was also affected, which in turn had an additional effect on reproduction. Growth inhibition can also lead to a loss of fertility if the worms do not reach an appropriate body size to become fertile. Rhodium showed no inhibition of any endpoint at concentrations between 100 and 10,000 μg Rh/L. The results of this study allow the following ranking of PGE with respect to decreasing toxicity to C. elegans: Pd > Pt » Rh. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. SpEnD: Linked Data SPARQL Endpoints Discovery Using Search Engines

    Science.gov (United States)

    Yumusak, Semih; Dogdu, Erdogan; Kodaz, Halife; Kamilaris, Andreas; Vandenbussche, Pierre-Yves

    In this study, a novel metacrawling method is proposed for discovering and monitoring linked data sources on the Web. We implemented the method in a prototype system, named SPARQL Endpoints Discovery (SpEnD). SpEnD starts with a "search keyword" discovery process for finding relevant keywords for the linked data domain and specifically SPARQL endpoints. Then, these search keywords are utilized to find linked data sources via popular search engines (Google, Bing, Yahoo, Yandex). By using this method, most of the currently listed SPARQL endpoints in existing endpoint repositories, as well as a significant number of new SPARQL endpoints, have been discovered. Finally, we have developed a new SPARQL endpoint crawler (SpEC) for crawling and link analysis.

  20. Correlates of protection for inactivated enterovirus 71 vaccine: the analysis of immunological surrogate endpoints.

    Science.gov (United States)

    Zhu, Wenbo; Jin, Pengfei; Li, Jing-Xin; Zhu, Feng-Cai; Liu, Pei

    2017-09-01

    Inactivated enterovirus 71 (EV71) vaccines have shown significant efficacy against the diseases associated with EV71, and a neutralizing antibody (NTAb) titer of 1:16-1:32 has been suggested as the correlate of vaccine protection. This paper aims to further estimate the immunological surrogate endpoints for the protection of inactivated EV71 vaccines and the factors affecting them. Pre-vaccination NTAb against EV71 at baseline (day 0), post-vaccination NTAb against EV71 at day 56, and the occurrence of laboratory-confirmed EV71-associated diseases during a 24-month follow-up period were collected from a phase 3 efficacy trial of an inactivated EV71 vaccine. We used the mixed-scaled logit model and the absolute sigmoid function, with some extensions to continuous models, to estimate the immunological surrogate endpoint for EV71 vaccine protection. For children with a negative baseline of EV71 NTAb titers, an antibody level of 26.6 U/ml (1:30) was estimated to provide at least 50% protection for 12 months, and an antibody level of 36.2 U/ml (1:42) may be needed to achieve a 50% protective level in the population for 24 months. Both the pre-vaccination NTAb level and the vaccine protective period affect the estimation of the immunological surrogate for the EV71 vaccine. A post-vaccination NTAb titer of 1:42 or more may be needed for long-term protection. NCT01508247.
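
    A stripped-down version of this threshold estimation can be illustrated with an ordinary logistic curve for protection probability versus log titer, inverted at the 50% level. This is a simplified stand-in for the paper's mixed-scaled logit model, and the group-level data below are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def protection(log2_titer, a, b):
    """Plain logistic curve for P(protected) versus log2 NTAb titer."""
    return 1.0 / (1.0 + np.exp(-(a + b * log2_titer)))

# Hypothetical per-group reciprocal titers and observed protection rates.
titer = np.array([4.0, 8.0, 16.0, 32.0, 64.0, 128.0])
rate = np.array([0.05, 0.25, 0.45, 0.60, 0.80, 0.92])

(a, b), _ = curve_fit(protection, np.log2(titer), rate, p0=(-3.0, 0.8))
titer_50 = 2.0 ** (-a / b)  # titer where the curve crosses 50%
print(f"50% protective titer ~ 1:{titer_50:.0f}")
```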

  1. Inconsistent selection and definition of local and regional endpoints in breast cancer research.

    Science.gov (United States)

    Moossdorff, M; van Roozendaal, L M; Schipper, R-J; Strobbe, L J A; Voogd, A C; Tjan-Heijnen, V C G; Smidt, M L

    2014-12-01

    Results in breast cancer research are reported using study endpoints. Most are composite endpoints (such as locoregional recurrence), consisting of several components (for example local recurrence) that are in turn composed of specific events (such as skin recurrence). Inconsistent endpoint selection and definition might lead to unjustified conclusions when comparing study outcomes. This study aimed to determine which locoregional endpoints are used in breast cancer studies, and how these endpoints and their components are defined. PubMed was searched for breast cancer studies published in nine leading journals in 2011. Articles using endpoints with a local or regional component were included and definitions were compared. Twenty-three different endpoints with a local or regional component were extracted from 44 articles. Most frequently used were disease-free survival (25 articles), recurrence-free survival (7), local control (4), locoregional recurrence-free survival (3) and event-free survival (3). Different endpoints were used for similar outcomes. Of 23 endpoints, five were not defined and 18 were defined only partially. Of these, 16 contained a local and 13 a regional component. Included events were not specified in 33 of 57 (local) and 27 of 50 (regional) cases. Definitions of local components inconsistently included carcinoma in situ and skin and chest wall recurrences. Regional components inconsistently included specific nodal sites and skin and chest wall recurrences. Breast cancer studies use many different endpoints with a locoregional component. Definitions of endpoints and events are either not provided or vary between trials. To improve transparency, facilitate trial comparison and avoid unjustified conclusions, authors should report detailed definitions of all endpoints. © 2014 BJS Society Ltd. Published by John Wiley & Sons Ltd.

  2. Sensitive endpoints in extended one-generation reproductive toxicity study versus two generation?

    DEFF Research Database (Denmark)

    Christiansen, Sofie

    The protocol includes assessment of novel endpoints of concern and developmental landmarks such as anogenital distance, nipple retention (both sensitive endpoints for anti-androgenic effects in male offspring) and mammary gland development (a sensitive endpoint for oestrogen action), and may also include...... during the critical period of development, in contrast to the parental generation. Retrospective analysis of available two-generation studies, however, indicates that the assessment included in the study of other endpoints in the male offspring, such as histopathology of reproductive organs and semen quality...

  3. 21 CFR 314.510 - Approval based on a surrogate endpoint or on an effect on a clinical endpoint other than survival...

    Science.gov (United States)

    2010-04-01

    ... Serious or Life-Threatening Illnesses § 314.510 Approval based on a surrogate endpoint or on an effect on... well-controlled. The applicant shall carry out any such studies with due diligence. ...

  4. Free-time and fixed end-point multi-target optimal control theory: Application to quantum computing

    International Nuclear Information System (INIS)

    Mishima, K.; Yamashita, K.

    2011-01-01

    Graphical abstract: The two-state Deutsch-Jozsa algorithm used to demonstrate the utility of free-time and fixed end-point multi-target optimal control theory. Research highlights: → Free-time and fixed end-point multi-target optimal control theory (FRFP-MTOCT) was constructed. → The features of our theory include optimization of the external time-dependent perturbations with high transition probabilities, that of the temporal duration, the monotonic convergence, and the ability to optimize multiple laser pulses simultaneously. → The advantage of the theory and a comparison with conventional fixed-time and fixed end-point multi-target optimal control theory (FIFP-MTOCT) are presented by comparing data calculated using the present theory with those published previously [K. Mishima, K. Yamashita, Chem. Phys. 361 (2009) 106]. → The qubit system of our interest consists of two polar NaCl molecules coupled by dipole-dipole interaction. → The calculation examples show that our theory is useful for minor adjustment of the external fields. - Abstract: An extension of free-time and fixed end-point optimal control theory (FRFP-OCT) to monotonically convergent free-time and fixed end-point multi-target optimal control theory (FRFP-MTOCT) is presented. The features of our theory include optimization of the external time-dependent perturbations with high transition probabilities, that of the temporal duration, the monotonic convergence, and the ability to optimize multiple laser pulses simultaneously. The advantage of the theory and a comparison with conventional fixed-time and fixed end-point multi-target optimal control theory (FIFP-MTOCT) are presented by comparing data calculated using the present theory with those published previously [K. Mishima, K. Yamashita, Chem. Phys. 361 (2009) 106]. The qubit system of our interest consists of two polar NaCl molecules coupled by dipole-dipole interaction. The calculation examples show that our theory is useful for minor adjustment of the external fields.

  5. CaFE: a tool for binding affinity prediction using end-point free energy methods.

    Science.gov (United States)

    Liu, Hui; Hou, Tingjun

    2016-07-15

    Accurate prediction of binding free energy is of particular importance to computational biology and structure-based drug design. Among those methods for binding affinity predictions, the end-point approaches, such as MM/PBSA and LIE, have been widely used because they can achieve a good balance between prediction accuracy and computational cost. Here we present an easy-to-use pipeline tool named Calculation of Free Energy (CaFE) to conduct MM/PBSA and LIE calculations. Powered by the VMD and NAMD programs, CaFE is able to handle numerous static coordinate and molecular dynamics trajectory file formats generated by different molecular simulation packages and supports various force field parameters. CaFE source code and documentation are freely available under the GNU General Public License via GitHub at https://github.com/huiliucode/cafe_plugin It is a VMD plugin written in Tcl and the usage is platform-independent. tingjunhou@zju.edu.cn. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
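
    For context, the MM/PBSA end-point estimate that CaFE automates is conventionally decomposed as follows (the standard textbook formulation; the notation here is generic rather than CaFE-specific):

```latex
% Standard MM/PBSA end-point decomposition of the binding free energy
\Delta G_{\mathrm{bind}} \approx
    \langle \Delta E_{\mathrm{MM}} \rangle
  + \langle \Delta G_{\mathrm{PB}} \rangle
  + \langle \Delta G_{\mathrm{SA}} \rangle
  - T\,\Delta S
```

    Here ΔE_MM collects the bonded, electrostatic, and van der Waals molecular-mechanics terms, ΔG_PB is the polar solvation free energy from the Poisson-Boltzmann equation, ΔG_SA is the nonpolar term estimated from the solvent-accessible surface area, -TΔS is an optional entropy correction, and the angle brackets denote averages over simulation snapshots of the complex, receptor, and ligand.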

  6. Ajuste de las simulaciones de flujos continuados para el cálculo del Límite de Potencia Eólica; Calculation of Wind Power Limit adjusting the Continuation Power Flow

    Directory of Open Access Journals (Sweden)

    Ariel Santos Fuentefria

    2012-07-01

    Full Text Available The integration of wind power into electric power systems can cause stability problems, linked mainly to the random variation of the wind, which is reflected in the voltage and frequency of the system. Knowing the Wind Power Limit (WPL) that can be inserted into the grid without loss of stability is therefore extremely important, and several calculation methods have been developed to find this limit, taking into account the constraints of the system in steady state, in dynamic state, or both. This paper develops a method for calculating the WPL that takes into account the steady-state constraints of the system. The proposed method is based on a continuation power flow analysis, complemented by the Minimal Active Power Production criterion described in the literature. The method is tested on the electric power system of the Isla de la Juventud, Cuba, and the free software Power System Analysis Toolbox (PSAT) is used for these studies.

  7. Declination Calculator

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Declination is calculated using the current International Geomagnetic Reference Field (IGRF) model. Declination is calculated using the current World Magnetic Model...

  8. Is Doubling of Serum Creatinine a Valid Clinical 'Hard' Endpoint in Clinical Nephrology Trials?

    NARCIS (Netherlands)

    Lambers Heerspink, H. J.; Perkovic, V.; de Zeeuw, D.

    2011-01-01

    The composite of end stage renal disease (ESRD), doubling of serum creatinine and (renal) death, is a frequently used endpoint in randomized clinical trials in nephrology. Doubling of serum creatinine is a well-accepted part of this endpoint because a doubling of serum creatinine reflects a large

  9. Right-handed currents at B→ K l+l− kinematic endpoint

    Indian Academy of Sciences (India)

    2017-10-09

    Oct 9, 2017 ... The recent LHCb measured values of these observables are used to conclude an evidence of right-handed currents at the kinematic endpoint of this decay mode. As the conclusion is drawn at the maximum dilepton invariant mass square ( q 2 ) kinematic endpoint, it relies only on heavy quark symmetries ...

  10. Restoration for the future: Setting endpoints and targets and selecting indicators of progress and success

    Science.gov (United States)

    Daniel C. Dey; Callie Jo Schweitzer; John M. Kabrick

    2014-01-01

    Setting endpoints and targets in forest restoration is a complicated task that is best accomplished in cooperative partnerships that account for the ecology of the system, production of desired ecosystem goods and services, economics and well-being of society, and future environments. Clearly written and quantitative endpoints and intermediary targets need to be...

  11. Restoration for the future: endpoints, targets, and indicators of progress and success

    Science.gov (United States)

    Daniel C. Dey; Callie Jo Schweitzer

    2014-01-01

    Setting endpoints and targets in forest restoration is a complicated task that is best accomplished in cooperative partnerships that account for the ecology of the system, production of desired ecosystem goods and services, economics and well-being of society, and future environments. Clearly described and quantitative endpoints and intermediary targets are needed to...

  12. Observations on Three Endpoint Properties and Their Relationship to Regulatory Outcomes of European Oncology Marketing Applications.

    Science.gov (United States)

    Liberti, Lawrence; Stolk, Pieter; McAuslane, James Neil; Schellens, Jan; Breckenridge, Alasdair M; Leufkens, Hubert

    2015-06-01

    Guidance and exploratory evidence indicate that the type of endpoints and the magnitude of their outcome can define a therapy's clinical activity; however, little empirical evidence relates specific endpoint properties with regulatory outcomes. We explored the relationship of 3 endpoint properties to regulatory outcomes by assessing 50 oncology marketing authorization applications (MAAs; reviewed from 2009 to 2013). Overall, 16 (32%) had a negative outcome. The most commonly used hard endpoints were overall survival (OS) and the duration of response or stable disease. OS was a component of 91% approved and 63% failed MAAs. The most commonly used surrogate endpoints were progression-free survival (PFS), response rate, and health-related quality of life assessments. There was no difference (p = .3801) between the approved and failed MAA cohorts in the proportion of hard endpoints used. A mean of slightly more than four surrogate endpoints were used per approved MAA compared with slightly more than two for failed MAAs. Longer OS and PFS duration outcomes were generally associated with approvals, often when not statistically significant. The approved cohort was associated with a preponderance of statistically significant (p < .05) improvements in primary endpoints (p < .0001 difference between the approved and failed groups). Three key endpoint properties (type of endpoint [hard/surrogate], magnitude of an endpoint outcome, and its statistical significance) are consistent with the European Medicines Agency guidance and, notwithstanding the contribution of unique disease-specific circumstances, are associated with a predictable positive outcome for oncology MAAs. Regulatory decisions made by the European Medicines Agency determine which new medicines will be available to European prescribers and for which therapeutic indications. Regulatory success or failure can be influenced by many factors. This study assessed three key properties of endpoints used in

  13. Defining the end-point of mastication: A conceptual model.

    Science.gov (United States)

    Gray-Stuart, Eli M; Jones, Jim R; Bronlund, John E

    2017-10-01

    properties define the end-point texture and enduring sensory perception of the food. © 2017 Wiley Periodicals, Inc.

  14. Kendall-Theil Robust Line (KTRLine--version 1.0)-A Visual Basic Program for Calculating and Graphing Robust Nonparametric Estimates of Linear-Regression Coefficients Between Two Continuous Variables

    Science.gov (United States)

    Granato, Gregory E.

    2006-01-01

    The Kendall-Theil Robust Line software (KTRLine-version 1.0) is a Visual Basic program that may be used with the Microsoft Windows operating system to calculate parameters for robust, nonparametric estimates of linear-regression coefficients between two continuous variables. The KTRLine software was developed by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, for use in stochastic data modeling with local, regional, and national hydrologic data sets to develop planning-level estimates of potential effects of highway runoff on the quality of receiving waters. The Kendall-Theil robust line was selected because this robust nonparametric method is resistant to the effects of outliers and nonnormality in residuals that commonly characterize hydrologic data sets. The slope of the line is calculated as the median of all possible pairwise slopes between points. The intercept is calculated so that the line will run through the median of input data. A single-line model or a multisegment model may be specified. The program was developed to provide regression equations with an error component for stochastic data generation because nonparametric multisegment regression tools are not available with the software that is commonly used to develop regression models. The Kendall-Theil robust line is a median line and, therefore, may underestimate total mass, volume, or loads unless the error component or a bias correction factor is incorporated into the estimate. Regression statistics such as the median error, the median absolute deviation, the prediction error sum of squares, the root mean square error, the confidence interval for the slope, and the bias correction factor for median estimates are calculated by use of nonparametric methods. These statistics, however, may be used to formulate estimates of mass, volume, or total loads. The program is used to read a two- or three-column tab-delimited input file with variable names in the first row and
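
    The slope and intercept rules described above are easy to reproduce. Below is a minimal, illustrative Python sketch of the single-line Kendall-Theil (Theil-Sen) estimator; the KTRLine program itself is Visual Basic and adds multisegment models and bias-correction statistics not shown here.

```python
# Minimal Kendall-Theil (Theil-Sen) robust line: slope = median of all
# pairwise slopes, intercept chosen so the line passes through the medians.
import numpy as np
from itertools import combinations

def kendall_theil_line(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)
              if x[j] != x[i]]                       # skip vertical pairs
    slope = np.median(slopes)
    intercept = np.median(y) - slope * np.median(x)  # line through the medians
    return slope, intercept

x = [1, 2, 3, 4, 5, 6]
y = [2.1, 4.3, 5.8, 8.4, 9.7, 30.0]                  # last point is an outlier
m, b = kendall_theil_line(x, y)
print(f"y = {m:.2f} x + {b:.2f}")
```

    Because the fitted line runs through the medians rather than minimizing squared error, the single outlier above shifts the fit only slightly, which is the resistance property the abstract highlights.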

  15. Endpoints and cutpoints in head and neck oncology trials: methodical background, challenges, current practice and perspectives.

    Science.gov (United States)

    Hezel, Marcus; von Usslar, Kathrin; Kurzweg, Thiemo; Lörincz, Balazs B; Knecht, Rainald

    2016-04-01

    This article reviews the methodical and statistical basics of designing a trial, with a special focus on the process of defining and choosing endpoints and cutpoints as the foundations of clinical research, and ultimately that of evidence-based medicine. There has been a significant progress in the treatment of head and neck cancer in the past few decades. Currently available treatment options can have a variety of different goals, depending e.g. on tumor stage, among other factors. The outcome of a specific treatment in clinical trials is measured using endpoints. Besides classical endpoints, such as overall survival or organ preservation, other endpoints like quality of life are becoming increasingly important in designing and conducting a trial. The present work is based on electronic research and focuses on the solid methodical and statistical basics of a clinical trial, on the structure of study designs and on the presentation of various endpoints.

  16. Considerations in choosing a primary endpoint that measures durability of virological suppression in an antiretroviral trial.

    Science.gov (United States)

    Gilbert, P B; Ribaudo, H J; Greenberg, L; Yu, G; Bosch, R J; Tierney, C; Kuritzkes, D R

    2000-09-08

    At present, many clinical trials of anti-HIV-1 therapies compare treatments by a primary endpoint that measures the durability of suppression of HIV-1 replication. Several durability endpoints are compared. Endpoints are compared by their implicit assumptions regarding surrogacy for clinical outcomes, sample size requirements, and accommodations for inter-patient differences in baseline plasma HIV-1-RNA levels and in initial treatment response. Virological failure is defined by the non-suppression of virus levels at a prespecified follow-up time T (early virological failure), or by relapse. A binary virological failure endpoint is compared with three time-to-virological failure endpoints: time from (i) randomization that assigns early failures a failure time of T weeks; (ii) randomization that extends the early failure time T for slowly responding subjects; and (iii) virological response that assigns non-responders a failure time of 0 weeks. Endpoint differences are illustrated with Agouron's trial 511. In comparing high- with low-dose nelfinavir (NFV) regimens in Agouron 511, the difference in Kaplan-Meier estimates of the proportion not failing by 24 weeks is 16.7% (P = 0.048), 6.5% (P = 0.29) and 22.9% (P = 0.0030) for endpoints (i), (ii) and (iii), respectively. The results differ because NFV suppresses virus more quickly at the higher dose, and the endpoints weigh this treatment difference differently. This illustrates that careful consideration needs to be given to choosing a primary endpoint that will detect treatment differences of interest. A time from randomization endpoint is usually recommended because of its advantages in flexibility and sample size, especially at interim analyses, and for its interpretation for patient management.
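
    To make the contrast between failure-time codings concrete, here is a minimal, self-contained product-limit (Kaplan-Meier) sketch on invented data; the patient values and the two codings below are assumptions for illustration, not Agouron 511 data.

```python
# Illustrative sketch: a product-limit (Kaplan-Meier) estimate of the
# proportion *not* failing by week 24 under two hypothetical time-to-failure
# codings of the same patients.
import numpy as np

def km_survival_at(t_eval, times, events):
    """Kaplan-Meier S(t_eval) from failure/censoring times and event flags."""
    times, events = np.asarray(times, float), np.asarray(events, bool)
    surv = 1.0
    for t in np.unique(times[events & (times <= t_eval)]):
        at_risk = np.sum(times >= t)          # subjects still under observation
        deaths = np.sum((times == t) & events)
        surv *= 1.0 - deaths / at_risk
    return surv

# Same 8 hypothetical patients; endpoint (i) assigns early virological failures
# a failure time of T = 24 weeks, endpoint (iii) clocks from virological
# response and assigns non-responders a failure time of 0 weeks.
t_i   = [24, 24, 40, 48, 48, 36, 24, 48]; e_i   = [1, 1, 1, 0, 0, 1, 1, 0]
t_iii = [0,  0, 16, 48, 48, 12,  0, 48]; e_iii = [1, 1, 1, 0, 0, 1, 1, 0]

print("Endpoint (i):   S(24) =", round(km_survival_at(24, t_i, e_i), 3))
print("Endpoint (iii): S(24) =", round(km_survival_at(24, t_iii, e_iii), 3))
```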

  17. A data-driven weighting scheme for multivariate phenotypic endpoints recapitulates zebrafish developmental cascades

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Guozhu, E-mail: gzhang6@ncsu.edu [Bioinformatics Research Center, North Carolina State University, Raleigh, NC (United States); Roell, Kyle R., E-mail: krroell@ncsu.edu [Bioinformatics Research Center, North Carolina State University, Raleigh, NC (United States); Truong, Lisa, E-mail: lisa.truong@oregonstate.edu [Department of Environmental and Molecular Toxicology, Sinnhuber Aquatic Research Laboratory, Oregon State University, Corvallis, OR (United States); Tanguay, Robert L., E-mail: robert.tanguay@oregonstate.edu [Department of Environmental and Molecular Toxicology, Sinnhuber Aquatic Research Laboratory, Oregon State University, Corvallis, OR (United States); Reif, David M., E-mail: dmreif@ncsu.edu [Bioinformatics Research Center, North Carolina State University, Raleigh, NC (United States); Department of Biological Sciences, Center for Human Health and the Environment, North Carolina State University, Raleigh, NC (United States)

    2017-01-01

    Zebrafish have become a key alternative model for studying health effects of environmental stressors, partly due to their genetic similarity to humans, fast generation time, and the efficiency of generating high-dimensional systematic data. Studies aiming to characterize adverse health effects in zebrafish typically include several phenotypic measurements (endpoints). While there is a solid biomedical basis for capturing a comprehensive set of endpoints, making summary judgments regarding health effects requires thoughtful integration across endpoints. Here, we introduce a Bayesian method to quantify the informativeness of 17 distinct zebrafish endpoints as a data-driven weighting scheme for a multi-endpoint summary measure, called weighted Aggregate Entropy (wAggE). We implement wAggE using high-throughput screening (HTS) data from zebrafish exposed to five concentrations of all 1060 ToxCast chemicals. Our results show that our empirical weighting scheme provides better performance in terms of the Receiver Operating Characteristic (ROC) curve for identifying significant morphological effects and improves robustness over traditional curve-fitting approaches. From a biological perspective, our results suggest that developmental cascade effects triggered by chemical exposure can be recapitulated by analyzing the relationships among endpoints. Thus, wAggE offers a powerful approach for analysis of multivariate phenotypes that can reveal underlying etiological processes. - Highlights: • Introduced a data-driven weighting scheme for multiple phenotypic endpoints. • Weighted Aggregate Entropy (wAggE) implies differential importance of endpoints. • Endpoint relationships reveal developmental cascade effects triggered by exposure. • wAggE is generalizable to multi-endpoint data of different shapes and scales.
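
    As a toy illustration of the idea behind such weighting (this is not the paper's Bayesian wAggE implementation), one can score each endpoint by how far its response profile departs from a maximally uninformative flat profile and use the normalized scores as weights:

```python
# Toy illustration of entropy-based endpoint weighting (not the paper's
# Bayesian wAggE method): weight each phenotypic endpoint by an
# informativeness score, then combine per-chemical responses.
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical hit rates: rows = endpoints, columns = chemicals (fraction of
# exposed embryos showing the endpoint).
hits = np.array([
    [0.02, 0.05, 0.90, 0.85],   # endpoint A: strongly discriminating
    [0.40, 0.45, 0.50, 0.55],   # endpoint B: nearly uniform, uninformative
    [0.01, 0.02, 0.60, 0.95],   # endpoint C
])

# Informativeness = distance of each endpoint's normalized profile from the
# maximum-entropy (flat) profile; weights normalized to sum to 1.
n_chem = hits.shape[1]
profiles = hits / hits.sum(axis=1, keepdims=True)
info = np.log2(n_chem) - np.array([shannon_entropy(p) for p in profiles])
weights = info / info.sum()

weighted_score = weights @ hits   # one aggregate score per chemical
print("weights:", np.round(weights, 3))
print("aggregate response per chemical:", np.round(weighted_score, 3))
```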

  18. Allergenicity evaluation of fragrance mix and its ingredients by using ex vivo local lymph node assay-BrdU endpoints.

    Science.gov (United States)

    Ulker, Ozge Cemiloglu; Kaymak, Yesim; Karakaya, Asuman

    2014-03-01

    The present studies were performed to compare the sensitization potency of fragrance mix with that of its ingredients (oak moss absolute, isoeugenol, eugenol, cinnamal, hydroxycitronellal, geraniol, cinnamic alcohol, alpha amyl cinnamal) using an ex vivo LLNA-BrdU ELISA. SI and EC3 values were calculated, and a potency classification was assigned to the mixture and to each ingredient. The release of TH1 cytokines (IL-2, IFN-γ) and TH2 cytokines (IL-4, IL-5) from lymph node cell cultures was also investigated as a contact sensitization endpoint. The EC3 values and contact sensitization potency classes for fragrance mix, oak moss absolute, isoeugenol, eugenol, cinnamal, hydroxycitronellal, geraniol, cinnamic alcohol, and alpha amyl cinnamal were, respectively: 4.4% (moderate), 3.4% (moderate), 0.88% (strong), 16.6% (weak), 1.91% (moderate), 9.77% (moderate), 13.1% (weak), 17.93% (weak), 7.74% (moderate). According to our results, exposure to fragrance mix does not constitute an evidently increased hazard compared to exposure to each of the eight fragrance ingredients separately. The cytokine analyses indicate that both TH1 and TH2 cytokines are involved in the regulation of murine contact allergy and can be considered useful endpoints. Copyright © 2014 Elsevier Ltd. All rights reserved.
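
    EC3 values like those quoted here are conventionally obtained by linear interpolation between the two test concentrations whose stimulation indices bracket SI = 3 (the standard interpolation used in the LLNA literature). A minimal sketch:

```python
# EC3 by linear interpolation between the dose/SI pairs bracketing SI = 3,
# following the standard LLNA formula: EC3 = c + [(3 - d)/(b - d)] * (a - c),
# where (a, b) lies just above SI = 3 and (c, d) just below.
def ec3(doses_si):
    """doses_si: list of (concentration %, stimulation index), ascending dose."""
    for (c, d), (a, b) in zip(doses_si, doses_si[1:]):
        if d < 3 <= b:
            return c + (3 - d) / (b - d) * (a - c)
    return None  # SI = 3 never crossed

# Hypothetical dose-response for a moderate sensitizer
print(ec3([(1.0, 1.4), (2.5, 2.1), (5.0, 3.9), (10.0, 6.2)]))  # -> 3.75 (%)
```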

  19. Calculation of Mitral Valve Area in Mitral Stenosis: Comparison of Continuity Equation and Pressure Half Time With Two-Dimensional Planimetry in Patients With and Without Associated Aortic or Mitral Regurgitation or Atrial Fibrillation

    Directory of Open Access Journals (Sweden)

    Roya Sattarzadeh

    2018-01-01

    Full Text Available Accurate measurement of Mitral Valve Area (MVA) is essential for determining the severity of Mitral Stenosis (MS) and for choosing the best management strategies for this disease. The goal of the present study is to compare MVA measurement by the Continuity Equation (CE) and Pressure Half-Time (PHT) methods with that of 2D planimetry (PL) in patients with moderate to severe MS. This comparison was also performed in subgroups of patients with significant Aortic Insufficiency (AI), Mitral Regurgitation (MR) and Atrial Fibrillation (AF). We studied 70 patients with moderate to severe MS who were referred to the echocardiography clinic. MVA was determined by the PL, CE and PHT methods. The agreement and correlations between the MVAs obtained from the various methods were determined by the kappa index, Bland-Altman analysis, and linear regression analysis. The mean MVA calculated by CE was 0.81 cm² (±0.27) and showed good correlation with that calculated by PL (0.95 cm², ±0.26) in the whole population (r=0.771, P<0.001), in the MR subgroup (r=0.763, P<0.001), and in the normal sinus rhythm and normal valve subgroups (r=0.858, P<0.001 and r=0.867, P<0.001, respectively), but the CE method did not show any correlation in the AF and AI subgroups. MVA measured by PHT had a good correlation with that measured by PL in the whole population (r=0.770, P<0.001) and also in the NSR (r=0.814, P<0.001) and normal valve (r=0.781, P<0.001) subgroups. The subgroups with significant AI and significant MR showed moderate correlation (r=0.625, P=0.017 and r=0.595, P=0.041, respectively). Bland-Altman analysis showed that CE estimates MVA smaller than PL in the whole population and all subgroups, and that PHT estimates MVA larger than PL in the whole population and all subgroups; the mean bias for CE and PHT was 0.14 cm² and -0.06 cm², respectively. In patients with moderate to severe mitral stenosis, in the absence of concomitant AF, AI or MR, the accuracy
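
    For reference, the two Doppler-derived estimates compared above follow standard echocardiographic relations (textbook formulas, with PHT in milliseconds and areas in cm²):

```latex
% Continuity equation: flow through the LVOT equals flow through the mitral valve
\mathrm{MVA}_{\mathrm{CE}} =
  \frac{A_{\mathrm{LVOT}} \cdot \mathrm{VTI}_{\mathrm{LVOT}}}{\mathrm{VTI}_{\mathrm{MV}}},
\qquad
A_{\mathrm{LVOT}} = \pi \left( \frac{D_{\mathrm{LVOT}}}{2} \right)^{2}

% Pressure half-time: Hatle's empirical relation
\mathrm{MVA}_{\mathrm{PHT}} = \frac{220}{\mathrm{PHT}}
```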

  20. Continuous auditing & continuous monitoring : Continuous value?

    NARCIS (Netherlands)

    van Hillo, Rutger; Weigand, Hans; Espana, S; Ralyte, J; Souveyet, C

    2016-01-01

    Advancements in information technology, new laws and regulations and rapidly changing business conditions have led to a need for more timely and ongoing assurance with effectively working controls. Continuous Auditing (CA) and Continuous Monitoring (CM) technologies have made this possible by

  1. Comparing sensitivity of ecotoxicological effect endpoints between laboratory and field

    DEFF Research Database (Denmark)

    Selck, H.; Riemann, B.; Christoffersen, K.

    2002-01-01

    multispecies field tests using tributyltin (TBT) and linear alkylbenzene sulfonates (LAS) were compared with published laboratory single-species test results and measured in situ concentrations. Extrapolation methods were evaluated by comparing predicted no-effect concentrations (PNECs), calculated by AF...

  2. Effects of un-ionized ammonia on histological, endocrine, and whole organism endpoints in slimy sculpin (Cottus cognatus)

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, P. [Toxicology Centre, University of Saskatchewan, 44 Campus Drive, Saskatoon, SK, S7N 5B3 (Canada)], E-mail: paula.spencer@usask.ca; Pollock, R.; Dube, M. [Toxicology Centre, University of Saskatchewan, 44 Campus Drive, Saskatoon, SK, S7N 5B3 (Canada)

    2008-12-11

    Ammonia is known to be an important toxicant in aquatic environments. Although ammonia toxicity has been well studied in many fish species, effects of chronic exposure on slimy sculpin (Cottus cognatus), a critical biomonitoring species for northern aquatic habitats, are not well known. Further, with increasing mining development in Canada's north, this information is critical to better predict potential effects of mine effluent discharges on northern fish species. Slimy sculpin were exposed to six concentrations of un-ionized ammonia (NH₃) relevant to concentrations found in northern mining effluents: control (0 ppm), 0.278 ppm, 0.556 ppm, 0.834 ppm, 1.112 ppm, and 1.668 ppm. An LC₅₀ of 1.529 ppm was calculated from mortality data. Histopathological examination of gills indicated significant tissue damage, measured as lamellar fusion and epithelial lifting, at 0.834 ppm, 1.112 ppm, and 1.668 ppm. Using gill endpoints, the NOEC and LOEC were calculated as 0.556 ppm and 0.834 ppm, respectively. An EC₅₀ of 0.775 ppm was determined for lamellar fusion and an EC₅₀ of 0.842 ppm for epithelial lifting. Hemorrhage of gills was present in mortalities, which occurred at 1.668 ppm of un-ionized ammonia. A significant decrease in liver somatic index (LSI) was seen in both male and female fish at 0.834 ppm and 1.112 ppm, respectively. Gonadosomatic index (GSI) in female fish significantly increased at 1.668 ppm un-ionized ammonia, with an associated significant increase in total whole-body testosterone concentrations. GSI in male fish also significantly increased at 1.668 ppm, but no differences were seen in testosterone concentrations. No significant differences were seen in gonad histopathological assessments or condition factor. Gill histopathology endpoints may be a more sensitive indicator for detecting effects in slimy sculpin exposed to ammonia than traditional chronic endpoints. Results from this study indicate that ammonia concentrations commonly

  3. Cutting Out Continuations

    DEFF Research Database (Denmark)

    Bahr, Patrick; Hutton, Graham

    2016-01-01

    In the field of program transformation, one often transforms programs into continuation-passing style to make their flow of control explicit, and then immediately removes the resulting continuations using defunctionalisation to make the programs first-order. In this article, we show how these two...... transformations can be fused together into a single transformation step that cuts out the need to first introduce and then eliminate continuations. Our approach is calculational, uses standard equational reasoning techniques, and is widely applicable....

  4. Environmentally acceptable endpoints for PAHs at a manufactured gas plant site

    Energy Technology Data Exchange (ETDEWEB)

    Stroo, H.F.; Jensen, R.; Loehr, R.C.; Nakles, D.V.; Fairbrother, A.; Liban, C.B. [ThermoRetec Corp., Carson, CA (USA)

    2000-09-01

    Samples from a former manufactured gas plant (MGP) site in Santa Barbara, CA were tested to evaluate the environmentally acceptable endpoints (EAE) process for setting risk-based cleanup criteria. The research was part of an ongoing effort to develop and demonstrate a protocol for assessing risk-based criteria for MGP sites that incorporates the availability of polycyclic aromatic hydrocarbons (PAHs). Six soil samples were subjected to a battery of physical and biological tests that focused on determining the 'availability' of the soil-bound contaminants to groundwater, ecological receptors, and human receptors. Results demonstrated that sorption to soil, matrix effects, aging, and treatment can significantly reduce chemical availability. Including these reduced availability results in risk assessment calculations yielded environmentally protective cleanup levels almost 3-10 times greater than levels derived using California default risk assessment assumptions. Using an EAE-based approach for MGP soils, especially those containing lampblack, could provide more realistic risk assessment. 23 refs., 6 tabs.

  5. Evaluation of early efficacy endpoints for proof-of-concept trials.

    Science.gov (United States)

    Chen, Cong; Sun, Linda; Li, Chih-Lin

    2013-03-11

    A Phase II proof-of-concept (POC) trial usually uses an early efficacy endpoint other than a clinical endpoint as the primary endpoint. Because of the advancement in bioscience and technology, which has yielded a number of new surrogate biomarkers, drug developers often have more candidate endpoints to choose from than they can handle. As a result, selection of endpoint and its effect size as well as choice of type I/II error rates are often at the center of heated debates in design of POC trials. While optimization of the trade-off between benefit and cost is the implicit objective in such a decision-making process, it is seldom explicitly accounted for in practice. In this research note, motivated by real examples from the oncology field, we provide practical measures for evaluation of early efficacy endpoints (E4) for POC trials. We further provide optimal design strategies for POC trials that include optimal Go-No Go decision criteria for initiation of Phase III and optimal resource allocation strategies for conducting multiple POC trials in a portfolio under fixed resources. Although oncology is used for illustration purpose, the same idea developed in this research note also applies to similar situations in other therapeutic areas or in early-stage drug development in that a Go-No Go decision has to rely on limited data from an early efficacy endpoint and cost-effectiveness is the main concern.

  6. A comparison of chemoembolization endpoints using angiographic versus transcatheter intraarterial perfusion/MR imaging monitoring.

    Science.gov (United States)

    Lewandowski, Robert J; Wang, Dingxin; Gehl, James; Atassi, Bassel; Ryu, Robert K; Sato, Kent; Nemcek, Albert A; Miller, Frank H; Mulcahy, Mary F; Kulik, Laura; Larson, Andrew C; Salem, Riad; Omary, Reed A

    2007-10-01

    Transcatheter arterial chemoembolization (TACE) is an established treatment for unresectable liver cancer. This study was conducted to test the hypothesis that angiographic endpoints during TACE are measurable and reproducible by comparing subjective angiographic versus objective magnetic resonance (MR) endpoints of TACE. The study included 12 consecutive patients who presented for TACE for surgically unresectable HCC or progressive hepatic metastases despite chemotherapy. All procedures were performed with a dedicated imaging system. Angiographic series before and after TACE were reviewed independently by three board-certified interventional radiologists. A subjective angiographic chemoembolization endpoint (SACE) classification scheme, modified from an established angiographic grading system in the cardiology literature, was designed to assist in reproducibly classifying angiographic endpoints. Reproducibility in SACE classification level was compared among operators, and MR imaging perfusion reduction was compared with SACE levels for each observer. Twelve patients successfully underwent 15 separate TACE sessions. SACE levels ranged from I through IV. There was moderate agreement in SACE classification (kappa = 0.46 ± 0.12). There was no correlation between SACE level and MR perfusion reduction (r = 0.16 for one operator and 0.02 for the other two). Angiographic endpoints during TACE vary widely, have moderate reproducibility among operators, and do not correlate with functional MR imaging perfusion endpoints. Future research should aim to determine ideal angiographic and functional MR imaging endpoints for TACE according to outcome measures such as imaging response, pathologic response, and survival.

  7. A step towards standardization: A method for end-point titer determination by fluorescence index of an automated microscope. End-point titer determination by fluorescence index.

    Science.gov (United States)

    Carbone, Teresa; Gilio, Michele; Padula, Maria Carmela; Tramontano, Giuseppina; D'Angelo, Salvatore; Pafundi, Vito

    2018-05-01

    Indirect Immunofluorescence (IIF) is widely considered the Gold Standard for Antinuclear Antibody (ANA) screening. However, high inter-reader variability remains the major disadvantage associated with ANA testing and the main reason for the increasing demand for computer-aided immunofluorescence microscopes. Previous studies proposed the quantification of fluorescence intensity as an alternative to the classical end-point titer evaluation. However, the different distribution of bright/dark light linked to the nature of the self-antigen and its location in the cells results in different mean fluorescence intensities. The aim of the present study was to correlate the Fluorescence Index (F.I.) with end-point titers for each well-defined ANA pattern. Routine serum samples were screened for ANA testing on HEp-2000 cells using the Immuno Concepts Image Navigator System, and positive samples were serially diluted to assign the end-point titer. A comparison between F.I. and end-point titers related to 10 different staining patterns was made. According to our analysis, good technical performance of F.I. (97% sensitivity and 94% specificity) was found. A significant correlation between quantitative reading of F.I. and end-point titer groups was observed using Spearman's test and regression analysis. A conversion scale from F.I. to end-point titers for each recognized ANA pattern was obtained. The Image Navigator offers the opportunity to improve worldwide harmonization of ANA test results. In particular, digital F.I. allows ANA titers to be quantified using just one sample dilution. It could represent a valuable support for the routine laboratory and an effective tool to reduce inter- and intra-laboratory variability. Copyright © 2018. Published by Elsevier B.V.

  8. Ovarian cancer clinical trial endpoints: Society of Gynecologic Oncology white paper

    Science.gov (United States)

    Herzog, Thomas J.; Armstrong, Deborah K.; Brady, Mark F.; Coleman, Robert L.; Einstein, Mark H.; Monk, Bradley J.; Mannel, Robert S.; Thigpen, J. Tate; Umpierre, Sharee A.; Villella, Jeannine A.; Alvarez, Ronald D.

    2015-01-01

    Objective To explore the value of multiple clinical endpoints in the unique setting of ovarian cancer. Methods A clinical trial workgroup was established by the Society of Gynecologic Oncology to develop a consensus statement via multiple conference calls, meetings and white paper drafts. Results Clinical trial endpoints have profound effects on late phase clinical trial design, result interpretation, drug development, and regulatory approval of therapeutics. Selection of the optimal clinical trial endpoint is particularly provocative in ovarian cancer where long overall survival (OS) is observed. The lack of new regulatory approvals and the lack of harmony between regulatory bodies globally for ovarian cancer therapeutics are of concern. The advantages and disadvantages of the numerous endpoints available are herein discussed within the unique context of ovarian cancer where both crossover and post-progression therapies potentially uncouple surrogacy between progression-free survival (PFS) and OS, the two most widely supported and utilized endpoints. The roles of patient reported outcomes (PRO) and health related quality of life (HRQoL) are discussed, but even these widely supported parameters are affected by the unique characteristics of ovarian cancer where a significant percentage of patients may be asymptomatic. Original data regarding the endpoint preferences of ovarian cancer advocates is presented. Conclusions Endpoint selection in ovarian cancer clinical trials should reflect the impact on disease burden and unique characteristics of the treatment cohort while reflecting true patient benefit. Both OS and PFS have led to regulatory approvals and are clinically important. OS remains the most objective and accepted endpoint because it is least vulnerable to bias; however, the feasibility of OS in ovarian cancer is compromised by the requirement for large trial size, prolonged time-line for final analysis, and potential for unintended loss of treatment effect

  9. Ovarian cancer clinical trial endpoints: Society of Gynecologic Oncology white paper.

    Science.gov (United States)

    Herzog, Thomas J; Armstrong, Deborah K; Brady, Mark F; Coleman, Robert L; Einstein, Mark H; Monk, Bradley J; Mannel, Robert S; Thigpen, J Tate; Umpierre, Sharee A; Villella, Jeannine A; Alvarez, Ronald D

    2014-01-01

    To explore the value of multiple clinical endpoints in the unique setting of ovarian cancer. A clinical trial workgroup was established by the Society of Gynecologic Oncology to develop a consensus statement via multiple conference calls, meetings and white paper drafts. Clinical trial endpoints have profound effects on late phase clinical trial design, result interpretation, drug development, and regulatory approval of therapeutics. Selection of the optimal clinical trial endpoint is particularly provocative in ovarian cancer where long overall survival (OS) is observed. The lack of new regulatory approvals and the lack of harmony between regulatory bodies globally for ovarian cancer therapeutics are of concern. The advantages and disadvantages of the numerous endpoints available are herein discussed within the unique context of ovarian cancer where both crossover and post-progression therapies potentially uncouple surrogacy between progression-free survival (PFS) and OS, the two most widely supported and utilized endpoints. The roles of patient reported outcomes (PRO) and health related quality of life (HRQoL) are discussed, but even these widely supported parameters are affected by the unique characteristics of ovarian cancer where a significant percentage of patients may be asymptomatic. Original data regarding the endpoint preferences of ovarian cancer advocates is presented. Endpoint selection in ovarian cancer clinical trials should reflect the impact on disease burden and unique characteristics of the treatment cohort while reflecting true patient benefit. Both OS and PFS have led to regulatory approvals and are clinically important. OS remains the most objective and accepted endpoint because it is least vulnerable to bias; however, the feasibility of OS in ovarian cancer is compromised by the requirement for large trial size, prolonged time-line for final analysis, and potential for unintended loss of treatment effect from active post-progression therapies

  10. Gene expression profiles in auricle skin as a possible additional endpoint for determination of sensitizers: A multi-endpoint evaluation of the local lymph node assay.

    Science.gov (United States)

    Tsuchiyama, Hiromi; Maeda, Akihisa; Nakajima, Mayumi; Kitsukawa, Mika; Takahashi, Kei; Miyoshi, Tomoya; Mutsuga, Mayu; Asaoka, Yoshiji; Miyamoto, Yohei; Oshida, Keiyu

    2017-10-05

    The murine local lymph node assay (LLNA) is widely used to test the potential of chemicals to induce skin sensitization. Exposure of mouse auricle skin to a sensitizer results in proliferation of local lymph node T cells, which has been measured by in vivo incorporation of [³H]methyl thymidine or 5-bromo-2'-deoxyuridine (BrdU). The stimulation index (SI), the ratio of the mean proliferation in each treated group to that in the concurrent vehicle control group, is frequently used as a regulatory-authorized endpoint for LLNA. However, some non-sensitizing irritants, such as sodium dodecyl sulfate (SDS) or methyl salicylate (MS), have been reported as false-positives by this endpoint. In search of a potential endpoint to enhance the specificity of existing endpoints, we evaluated 3 contact sensitizers (hexyl cinnamic aldehyde [HCA], oxazolone [OXA], and 2,4-dinitrochlorobenzene [DNCB]), 1 respiratory sensitizer (toluene 2,4-diisocyanate [TDI]), and 2 non-sensitizing irritants (MS and SDS) by several endpoints in LLNA. Each test substance was applied to both ears of female CBA/Ca mice daily for 3 consecutive days. The ears and auricle lymph node cells were analyzed on day 5 for endpoints including the SI value, lymph node cell count, cytokine release from lymph node cells, and histopathological changes and gene expression profiles in auricle skin. The SI values indicated that all the test substances induced significant proliferation of lymph node cells. The lymph node cell counts showed no significant changes by the non-sensitizers assessed. The inflammatory findings of histopathology were similar among the auricle skins treated by sensitizers and irritants. Gene expression profiles of cytokines IFN-γ, IL-4, and IL-17 in auricle skin were similar to the cytokine release profiles in draining lymph node cells. In addition, the gene expression of the chemokine CXCL1 and/or CXCL2 showed that it has the potential to discriminate sensitizers and non-sensitizing irritants. Our results

  11. An European inter-laboratory validation of alternative endpoints of the murine local lymph node assay: 2nd round.

    Science.gov (United States)

    Ehling, G; Hecht, M; Heusener, A; Huesler, J; Gamer, A O; van Loveren, H; Maurer, Th; Riecke, K; Ullmann, L; Ulrich, P; Vandebriel, R; Vohr, H-W

    2005-08-15

    The original local lymph node assay (LLNA) is based on the use of radioactive labelling to measure cell proliferation. Other endpoints for the assessment of proliferation are also authorized by the OECD Guideline 429 provided there is appropriate scientific support, including full citations and description of the methodology (OECD, 2002. OECD Guideline for the Testing of Chemicals; Skin Sensitization: Local Lymph Node Assay, Guideline 429. Paris, adopted 24th April 2002.). Here, we describe the outcome of the second round of an inter-laboratory validation of alternative endpoints in the LLNA conducted in nine laboratories in Europe. The validation study was managed and supervised by the Swiss Agency for Therapeutic Products (Swissmedic) in Bern. Ear-draining lymph node (LN) weight and cell counts were used to assess LN cell proliferation instead of [3H]TdR incorporation. In addition, the acute inflammatory skin reaction was measured by ear weight determination of circular biopsies of the ears to identify skin irritation properties of the test items. The statistical analysis was performed in the department of statistics at the university of Bern. Similar to the EC3 values defined for the radioactive method, threshold values were calculated for the endpoints measured in this modification of the LLNA. It was concluded that all parameters measured have to be taken into consideration for the categorisation of compounds due to their sensitising potencies. Therefore, an assessment scheme has been developed which turned out to be of great importance to consistently assess sensitisation versus irritancy based on the data of the different parameters. In contrast to the radioactive method, irritants have been picked up by all the laboratories applying this assessment scheme.

  12. Novel joint selection methods can reduce sample size for rheumatoid arthritis clinical trials with ultrasound endpoints.

    Science.gov (United States)

    Allen, John C; Thumboo, Julian; Lye, Weng Kit; Conaghan, Philip G; Chew, Li-Ching; Tan, York Kiat

    2018-03-01

    To determine whether novel methods of selecting joints through (i) ultrasonography (individualized-ultrasound [IUS] method), or (ii) ultrasonography and clinical examination (individualized-composite-ultrasound [ICUS] method) translate into smaller rheumatoid arthritis (RA) clinical trial sample sizes when compared to existing methods utilizing predetermined joint sites for ultrasonography. Cohen's effect size (ES) was estimated (ÊS) and a 95% CI (ÊS_L, ÊS_U) calculated on the mean change in 3-month total inflammatory score for each method. Corresponding 95% CIs [n_L(ÊS_U), n_U(ÊS_L)] were obtained on a post hoc sample size reflecting the uncertainty in ÊS. Sample size calculations were based on a one-sample t-test as the patient numbers needed to provide 80% power at α = 0.05 to reject the null hypothesis H0: ES = 0 versus the alternative hypotheses H1: ES = ÊS, ES = ÊS_L and ES = ÊS_U. We aimed to provide point and interval estimates of projected sample sizes for future studies, reflecting the uncertainty in our study ÊSs. Twenty-four treated RA patients were followed up for 3 months. Utilizing the 12-joint approach and existing methods, the post hoc sample size (95% CI) was 22 (10-245). Corresponding sample sizes using ICUS and IUS were 11 (7-40) and 11 (6-38), respectively. Utilizing a seven-joint approach, the corresponding sample sizes using ICUS and IUS methods were nine (6-24) and 11 (6-35), respectively. Our pilot study suggests that sample size for RA clinical trials with ultrasound endpoints may be reduced using the novel methods, providing justification for larger studies to confirm these observations. © 2017 Asia Pacific League of Associations for Rheumatology and John Wiley & Sons Australia, Ltd.
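
    The sample-size arithmetic described above is easy to reproduce. Below is a sketch using statsmodels' one-sample t-test power solver, with hypothetical effect-size values standing in for the study's ÊS and its CI bounds:

```python
# Sketch of the post hoc sample-size computation described above: patients
# needed for 80% power at alpha = 0.05 in a one-sample t-test, evaluated at
# the point estimate and at the CI bounds of Cohen's effect size. The ES
# values below are hypothetical, not the study's.
from statsmodels.stats.power import TTestPower

solver = TTestPower()
for label, es in [("ES-hat upper", 1.30), ("ES-hat", 0.63), ("ES-hat lower", 0.18)]:
    n = solver.solve_power(effect_size=es, alpha=0.05, power=0.80,
                           alternative="two-sided")
    print(f"{label:14s} ES = {es:.2f} -> n = {n:.1f}")
```

    Note how the sample size scales roughly with 1/ES², which is why the CI on the effect size translates into the very wide sample-size intervals reported in the abstract.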

  13. Endpoint behavior of the pion distribution amplitude in QCD sum rules with nonlocal condensates

    International Nuclear Information System (INIS)

    Mikhailov, S. V.; Pimikov, A. V.; Stefanis, N. G.

    2010-01-01

    Starting from the QCD sum rules with nonlocal condensates for the pion distribution amplitude, we derive another sum rule for its derivative and its "integral derivatives" (defined in this work). We use this new sum rule to analyze the fine details of the pion distribution amplitude in the endpoint region x ∼ 0. The results for endpoint-suppressed and flat-top (or flatlike) pion distribution amplitudes are compared with those we obtained with differential sum rules by employing two different models for the distribution of vacuum-quark virtualities. We determine the range of values of the derivatives of the pion distribution amplitude and show that endpoint-suppressed distribution amplitudes lie within this range, while those with endpoint enhancement (flat-type or Chernyak-Zhitnitsky-like) yield values outside this range.
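
    For orientation, two standard reference shapes in this literature (textbook definitions, not results of the paper) are the asymptotic amplitude and the Chernyak-Zhitnitsky model:

```latex
% Asymptotic pion DA and the Chernyak-Zhitnitsky model (both normalized to unity)
\varphi_{\mathrm{as}}(x) = 6\,x\,(1-x),
\qquad
\varphi_{\mathrm{CZ}}(x) = 30\,x\,(1-x)\,(1-2x)^{2},
\qquad
\int_{0}^{1} \varphi(x)\,dx = 1
```

    Endpoint-suppressed amplitudes vanish faster than φ_as(x) ≈ 6x as x → 0, flat-top models instead approach a nonzero constant there, and the CZ shape behaves as 30x near the endpoint, i.e., enhanced relative to the asymptotic form.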

  14. 78 FR 49530 - Gastroenterology Regulatory Endpoints and the Advancement of Therapeutics; Public Workshop

    Science.gov (United States)

    2013-08-14

    ...] Gastroenterology Regulatory Endpoints and the Advancement of Therapeutics; Public Workshop AGENCY: Food and Drug... for Drug Evaluation and Research, in cosponsorship with the American College of Gastroenterology, the... American Society for Pediatric Gastroenterology, Hepatology, and Nutrition, and the Pediatric IBD...

  15. On weighted hardy inequalities on semiaxis for functions vanishing at the endpoints

    Directory of Open Access Journals (Sweden)

    Stepanov Vladimir

    1997-01-01

    Full Text Available We study the weighted Hardy inequalities on the semiaxis of the form for functions vanishing at the endpoints together with derivatives up to the order . The case is completely characterized.

  16. 76 FR 51993 - Draft Guidance for Industry on Standards for Clinical Trial Imaging Endpoints; Availability

    Science.gov (United States)

    2011-08-19

    ... clinical trials of therapeutic drugs and biological products. The draft guidance describes standards... important imaging endpoint is used in a clinical trial of a therapeutic drug or biological product...

  17. Reporting and evaluation of HIV-related clinical endpoints in two multicenter international clinical trials

    DEFF Research Database (Denmark)

    Lifson, A; Rhame, F; Bellosa, W

    2006-01-01

    PURPOSE: The processes for reporting and review of progression of HIV disease clinical endpoints are described for two large phase III international clinical trials. METHOD: SILCAAT and ESPRIT are multicenter randomized HIV trials evaluating the impact of interleukin-2 on disease progression and death in HIV-infected patients receiving antiretroviral therapy. We report definitions used for HIV progression of disease endpoints, procedures for site reporting of such events, processes for independent review of reported events by an Endpoint Review Committee (ERC), and the procedure for adjudication between reviewers before diagnostic certainty was assigned. CONCLUSION: Important requirements for HIV trials using clinical endpoints include objective definitions of "confirmed" and "probable," a formal reporting process with adequate information and supporting source documentation, and evaluation...

  18. CONTAIN calculations

    International Nuclear Information System (INIS)

    Scholtyssek, W.

    1995-01-01

    In the first phase of a benchmark comparison, the CONTAIN code was used to calculate an assumed EPR accident 'medium-sized leak in the cold leg', especially for the first two days after initiation of the accident. The results for global characteristics compare well with those of FIPLOC, MELCOR and WAVCO calculations, if the same materials data are used as input. However, significant differences show up for local quantities such as flows through leakages. (orig.)

  19. How Can Viral Dynamics Models Inform Endpoint Measures in Clinical Trials of Therapies for Acute Viral Infections?

    Directory of Open Access Journals (Sweden)

    Carolin Vegvari

    Full Text Available Acute viral infections pose many practical challenges for the accurate assessment of the impact of novel therapies on viral growth and decay. Using the example of influenza A, we illustrate how the measurement of infection-related quantities that determine the dynamics of viral load within the human host, can inform investigators on the course and severity of infection and the efficacy of a novel treatment. We estimated the values of key infection-related quantities that determine the course of natural infection from viral load data, using Markov Chain Monte Carlo methods. The data were placebo group viral load measurements collected during volunteer challenge studies, conducted by Roche, as part of the oseltamivir trials. We calculated the values of the quantities for each patient and the correlations between the quantities, symptom severity and body temperature. The greatest variation among individuals occurred in the viral load peak and area under the viral load curve. Total symptom severity correlated positively with the basic reproductive number. The most sensitive endpoint for therapeutic trials with the goal to cure patients is the duration of infection. We suggest laboratory experiments to obtain more precise estimates of virological quantities that can supplement clinical endpoint measurements.
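
    The infection-related quantities referred to above are commonly the parameters of the target-cell-limited model of within-host viral dynamics. The sketch below integrates that standard model with made-up, order-of-magnitude parameter values; it does not reproduce the study's MCMC estimates.

```python
# Target-cell-limited viral dynamics model (standard in influenza modelling):
#   dT/dt = -beta*T*V,  dI/dt = beta*T*V - delta*I,  dV/dt = p*I - c*V
# Parameter values are illustrative only; the study estimated such
# quantities from placebo-arm viral load data by MCMC.
import numpy as np
from scipy.integrate import solve_ivp

beta, delta, p, c = 2.7e-5, 3.0, 1.2e-2, 3.0   # made-up, order-of-magnitude values
T0 = 4e8                                        # initial target cells

def tiv(t, y):
    T, I, V = y
    return [-beta * T * V, beta * T * V - delta * I, p * I - c * V]

sol = solve_ivp(tiv, (0, 10), [T0, 0.0, 10.0], dense_output=True,
                rtol=1e-8, atol=1e-8)

t = np.linspace(0, 10, 201)
V = sol.sol(t)[2]
print(f"peak viral load ~ 10^{np.log10(V.max()):.1f} at day {t[V.argmax()]:.1f}")
print(f"basic reproductive number R0 = {beta * p * T0 / (delta * c):.1f}")
```

    Quantities such as the viral load peak, the area under the viral load curve, and R0 (here β·p·T0/(δ·c)) fall directly out of such a fit, which is how they can serve as candidate trial endpoints.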

  20. SpEnD: Linked Data SPARQL Endpoints Discovery Using Search Engines

    OpenAIRE

    Yumusak, Semih; Dogdu, Erdogan; Kodaz, Halife; Kamilaris, Andreas

    2016-01-01

    In this study, a novel metacrawling method is proposed for discovering and monitoring linked data sources on the Web. We implemented the method in a prototype system, named SPARQL Endpoints Discovery (SpEnD). SpEnD starts with a "search keyword" discovery process for finding relevant keywords for the linked data domain and specifically SPARQL endpoints. Then, these search keywords are utilized to find linked data sources via popular search engines (Google, Bing, Yahoo, Yandex). By using this ...
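
    The last step of such a pipeline, deciding whether a candidate URL is actually a live SPARQL endpoint, can be approximated with a probe query over the standard SPARQL HTTP protocol. A minimal sketch follows (SpEnD's own validation is more elaborate; the endpoint URL below is just a well-known public example):

```python
# Minimal probe for a live SPARQL endpoint: send a trivial query over the
# standard SPARQL HTTP protocol and check for a parseable JSON result set.
import requests

def is_sparql_endpoint(url, timeout=10):
    try:
        r = requests.get(
            url,
            params={"query": "SELECT * WHERE { ?s ?p ?o } LIMIT 1"},
            headers={"Accept": "application/sparql-results+json"},
            timeout=timeout,
        )
        return r.ok and "results" in r.json()
    except (requests.RequestException, ValueError):
        return False

print(is_sparql_endpoint("https://dbpedia.org/sparql"))  # well-known endpoint
```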

  1. Continuous Problem of Function Continuity

    Science.gov (United States)

    Jayakody, Gaya; Zazkis, Rina

    2015-01-01

    We examine different definitions presented in textbooks and other mathematical sources for "continuity of a function at a point" and "continuous function" in the context of introductory level Calculus. We then identify problematic issues related to definitions of continuity and discontinuity: inconsistency and absence of…

  2. The use of intermediate endpoints in the design of type 1 diabetes prevention trials.

    Science.gov (United States)

    Krischer, Jeffrey P

    2013-09-01

    This paper presents a rationale for the selection of intermediate endpoints to be used in the design of type 1 diabetes prevention clinical trials. Relatives of individuals diagnosed with type 1 diabetes were enrolled on the TrialNet Natural History Study and screened for diabetes-related autoantibodies. Those with two or more such autoantibodies were analysed with respect to increased HbA1c, decreased C-peptide following an OGTT, or abnormal OGTT values as intermediate markers of disease progression. Over 2 years, a 10% increase in HbA1c, and a 20% or 30% decrease in C-peptide from baseline, or progression to abnormal OGTT, occurred with a frequency between 20% and 41%. The 3- to 5-year risk of type 1 diabetes following each intermediate endpoint was high, namely 47% to 84%. The lower the incidence of the endpoint being reached, the higher the risk of diabetes. A diabetes prevention trial using these intermediate endpoints would require a 30% to 50% smaller sample size than one using type 1 diabetes as the endpoint. The use of an intermediate endpoint in diabetes prevention is based on the generally held view of disease progression from initial occurrence of autoantibodies through successive immunological and metabolic changes to manifest type 1 diabetes. Thus, these markers are suitable for randomised phase 2 trials, which can more rapidly screen promising new therapies, allowing them to be subsequently confirmed in definitive phase 3 trials.

  3. Development of Cardiovascular and Neurodevelopmental Metrics as Sublethal Endpoints for the Fish Embryo Toxicity Test.

    Science.gov (United States)

    Krzykwa, Julie C; Olivas, Alexis; Jeffries, Marlo K Sellin

    2018-06-19

    The fathead minnow fish embryo toxicity (FET) test has been proposed as a more humane alternative to current toxicity testing methods, as younger organisms are thought to experience less distress during toxicant exposure. However, the FET test protocol does not include endpoints that allow for the prediction of sublethal adverse outcomes, limiting its utility relative to other test types. Researchers have proposed the development of sublethal endpoints for the FET test to increase its utility. The present study 1) developed methods for previously unmeasured sublethal metrics in fathead minnows (i.e., spontaneous contraction frequency and heart rate) and 2) investigated the responsiveness of several sublethal endpoints related to growth (wet weight, length, and growth-related gene expression), neurodevelopment (spontaneous contraction frequency and neurodevelopmental gene expression), and cardiovascular function and development (pericardial area, eye size and cardiovascular-related gene expression) as additional FET test metrics using the model toxicant 3,4-dichloroaniline. Of the growth, neurological and cardiovascular endpoints measured, length, eye size and pericardial area were found to be more responsive than the other endpoints, respectively. Future studies linking alterations in these endpoints to longer-term adverse impacts are needed to fully evaluate the predictive power of these metrics in chemical and whole effluent toxicity testing. This article is protected by copyright. All rights reserved.

  4. Five criteria for using a surrogate endpoint to predict treatment effect based on data from multiple previous trials.

    Science.gov (United States)

    Baker, Stuart G

    2018-02-20

    A surrogate endpoint in a randomized clinical trial is an endpoint that occurs after randomization and before the true, clinically meaningful, endpoint, and that yields conclusions about the effect of treatment on the true endpoint. A surrogate endpoint can accelerate the evaluation of new treatments but at the risk of misleading conclusions. Therefore, criteria are needed for deciding whether to use a surrogate endpoint in a new trial. For the meta-analytic setting of multiple previous trials, each with the same pair of surrogate and true endpoints, this article formulates 5 criteria for using a surrogate endpoint in a new trial to predict the effect of treatment on the true endpoint in the new trial. The first 2 criteria, which are easily computed from a zero-intercept linear random effects model, involve statistical considerations: an acceptable sample size multiplier and an acceptable prediction separation score. The remaining 3 criteria involve clinical and biological considerations: similarity of biological mechanisms of treatments between the new trial and previous trials, similarity of secondary treatments following the surrogate endpoint between the new trial and previous trials, and a negligible risk of harmful side effects arising after the observation of the surrogate endpoint in the new trial. These 5 criteria constitute an appropriately high bar for using a surrogate endpoint to make a definitive treatment recommendation. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
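
    The first two criteria are computed from a zero-intercept linear random effects model relating per-trial treatment effects on the surrogate to effects on the true endpoint. The sketch below strips away the random effects and fits a plain zero-intercept weighted regression across hypothetical previous trials, just to show the shape of the prediction step; it is not Baker's estimator, and every number is invented.

```python
import numpy as np

# Invented per-trial effect estimates from previous trials:
# x = treatment effect on the surrogate, y = effect on the true
# endpoint, w = precision weights (e.g., inverse variances).
x = np.array([0.12, 0.30, 0.25, 0.08, 0.40])
y = np.array([0.10, 0.24, 0.22, 0.05, 0.35])
w = np.array([80.0, 120.0, 95.0, 60.0, 150.0])

# Weighted least squares through the origin: y ~ beta * x
beta = np.sum(w * x * y) / np.sum(w * x ** 2)
resid = y - beta * x

# Predicted true-endpoint effect for a new trial's surrogate effect
x_new = 0.20
print(f"slope: {beta:.3f}, prediction: {beta * x_new:.3f}")
```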

  5. Evaluation of non-radioactive endpoints of ex vivo local lymph node assay-BrdU to investigate select contact sensitizers.

    Science.gov (United States)

    Ulker, Ozge Cemiloglu; Ates, Ilker; Atak, Aysegul; Karakaya, Asuman

    2013-01-01

    The present study sought to verify the utility of the non-radioactive endpoints LLNA BrdU (5-bromo-2'-deoxyuridine) ex vivo incorporation and cytokine release using auricular lymph node cells isolated from BALB/c mice topically treated with a strong sensitizer (formaldehyde or p-phenylenediamine [PPD]), a moderate sensitizer (cinnamal), or a weak sensitizer (eugenol). Stimulation index (SI) and EC3 values were calculated for each agent. Based on the results of ex vivo LLNA-BrdU assays, EC3 values were calculated to be 0.29, 0.09, 1.91, and 16.60% for formaldehyde, PPD, cinnamal, and eugenol, respectively. These results were in good agreement with data from previous standard radioactive LLNA studies. Cytokine analyses indicated Th1 and Th2 cytokine involvement in the regulation of murine contact allergy, and these could be utilized as endpoints in assessments of contact allergy in mice. In conclusion, the current study provided evidence that the non-radioactive endpoint LLNA BrdU ex vivo incorporation could be a viable alternative approach to assess the skin sensitization potential of test compounds with respect to improving animal welfare. This is of particular importance for any laboratory where it might be difficult to handle and/or readily employ radioisotopes. Further studies will be required to confirm, across test agents, the reproducibility as well as the limits of utility of this new ex vivo BrdU method.
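
    The EC3 value (the concentration giving a stimulation index of 3) is conventionally obtained by linear interpolation between the two tested concentrations that bracket SI = 3. A minimal sketch with an invented dose-response:

```python
def ec3(concs, sis, threshold=3.0):
    """Estimate EC3 (% concentration giving stimulation index 3) by linear
    interpolation between the two dose points bracketing the threshold.
    concs: test concentrations (%) in ascending order; sis: matching SIs."""
    for (c_lo, si_lo), (c_hi, si_hi) in zip(zip(concs, sis),
                                            zip(concs[1:], sis[1:])):
        if si_lo < threshold <= si_hi:
            return c_lo + (threshold - si_lo) / (si_hi - si_lo) * (c_hi - c_lo)
    return None  # SI never crosses the threshold

# Invented dose-response for a moderate sensitizer
print(ec3([0.5, 1.0, 2.5, 5.0], [1.2, 2.1, 3.6, 6.8]))  # -> 1.9 (%)
```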

  6. A web-based endpoint adjudication system for interim analyses in clinical trials.

    Science.gov (United States)

    Nolen, Tracy L; Dimmick, Bill F; Ostrosky-Zeichner, Luis; Kendrick, Amy S; Sable, Carole; Ngai, Angela; Wallace, Dennis

    2009-02-01

    A data monitoring committee (DMC) is often employed to assess trial progress and review safety data and efficacy endpoints throughout a trial. Interim analyses performed for the DMC should use data that are as complete and verified as possible. Such analyses are complicated when data verification involves subjective study endpoints or requires clinical expertise to determine each subject's status with respect to the study endpoint. Therefore, procedures are needed to obtain adjudicated data for interim analyses in an efficient manner. In the past, methods for handling such data included using locally reported results as surrogate endpoints, adjusting analysis methods for unadjudicated data, or simply performing the adjudication as rapidly as possible. These methods all have inadequacies that make their sole usage suboptimal. For a study of prophylaxis for invasive candidiasis, adjudication of both study eligibility criteria and clinical endpoints prior to two interim analyses was required. Because the study was expected to enroll at a moderate rate and the sponsor required adjudicated endpoints to be used for interim analyses, an efficient process for adjudication was required. We created a web-based endpoint adjudication system (WebEAS) that allows for expedited review by the endpoint adjudication committee (EAC). This system automatically identifies when a subject's data are complete, creates a subject profile from the study data, and assigns EAC reviewers. The reviewers use the WebEAS to review the subject profile and submit their completed review form. The WebEAS then compares the reviews, assigns an additional review as a tiebreaker if needed, and stores the adjudicated data. The study for which this system was originally built was administratively closed after 10 months with only 38 subjects enrolled. The adjudication process was finalized and the WebEAS system activated prior to study closure. Some website accessibility issues presented initially. However

  7. Health Endpoint Attributed to Sulfur Dioxide Air Pollutants

    Directory of Open Access Journals (Sweden)

    Geravandi

    2015-07-01

    Full Text Available Background Sulfur dioxide is a colorless gas released from the burning of coal, high-sulfur coals, and diesel fuel. Sulfur dioxide harms human health by reacting with the moisture in the nose, nasal cavity and throat, thereby damaging the nerves of the respiratory system. Objectives The aim of this study was to identify the health effects associated with sulfur dioxide in Ahvaz, Iran. Materials and Methods Data were collected by the Ahvaz meteorological organization and the department of environment. Sampling was performed over 24 hours at four stations. Sampling and analysis methods followed the US Environmental Protection Agency (EPA) guideline. The raw data were then processed in Excel (averaging correction, coding and filtering) and, together with the relevant meteorological parameters, converted into the input file for the AirQ model. Finally, we calculated the health effects of exposure to sulfur dioxide. Results According to the findings, the concentration of sulfur dioxide in Ahvaz had an annual average of 51 μg/m3. A total of 25 hospital admissions for respiratory diseases in 2012 were attributed to sulfur dioxide. Approximately 5% of total hospital admissions for respiratory disease and of respiratory mortality occurred when the sulfur dioxide concentration exceeded 10 μg/m3. Conclusions According to the results of this study, this increase could be due to higher fuel consumption, the use of gasoline in vehicles, the oil industry, and the steel and heavy industries in Ahvaz. Risks of mortality and morbidity were detected at the current concentrations of air pollutants.
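
    AirQ-style attribution rests on the attributable proportion AP = Σ(RR(c) − 1)p(c) / ΣRR(c)p(c), where p(c) is the fraction of person-time spent in each concentration band and RR(c) the relative risk for that band. The sketch below shows the arithmetic with invented exposure bands, risk coefficients, and baseline incidence; none of the study's actual inputs are reproduced.

```python
import numpy as np

# Hypothetical inputs: exposure distribution p(c) over SO2 concentration
# bands and relative risks RR(c) per band, derived here from an assumed
# risk increment per 10 ug/m3.
conc_bands = np.array([10, 30, 50, 70, 90])      # band midpoints, ug/m3
p = np.array([0.10, 0.25, 0.35, 0.20, 0.10])     # fraction of person-time
rr_per_10 = 1.005                                # assumed RR per 10 ug/m3
rr = rr_per_10 ** (conc_bands / 10.0)

# AirQ-style attributable proportion and excess cases
ap = np.sum((rr - 1) * p) / np.sum(rr * p)
baseline_incidence = 1.27e-2     # cases per person per year (invented)
population = 1_000_000
print(f"attributable proportion: {ap:.3%}")
print(f"excess cases/year: {ap * baseline_incidence * population:.0f}")
```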

  8. First observation of bald patches in a filament channel and at a barb endpoint

    Science.gov (United States)

    López Ariste, A.; Aulanier, G.; Schmieder, B.; Sainz Dalda, A.

    2006-09-01

    The 3D magnetic field topology of solar filaments/prominences is strongly debated, because it is not directly measurable in the corona. Among various prominence models, several are consistent with many observations, but their related topologies are very different. We conduct observations to address this debate. We measure the photospheric vector magnetic field in several small flux concentrations surrounding a filament observed far from disc center. Our objective is to test for the presence/absence of magnetic dips around/below the filament body/barb, which is a strong constraint on prominence models, and which is still untested by observations. Our observations are performed with the THEMIS/MTR instrument. The four Stokes parameters are extracted, from which the vector magnetic fields are calculated using a PCA inversion. The resulting vector fields are then deprojected onto the photospheric plane. The 180° ambiguity is then solved by selecting the only solution that matches filament chirality rules. Considering the weakness of the resulting magnetic fields, a careful analysis of the inversion procedure and its error bars was performed, to avoid over-interpretation of noisy or ambiguous Stokes profiles. Thanks to the simultaneous multi-wavelength THEMIS observations, the vector field maps are coaligned with the Hα image of the filament. By definition, photospheric dips are identifiable where the horizontal component of the magnetic field points from a negative toward a positive polarity. Among six bipolar regions analyzed in the filament channel, at least four display photospheric magnetic dips, i.e. bald patches. For barbs, the topology of the endpoint is that of a bald patch located next to a parasitic polarity, not of an arcade pointing within the polarity. The observed magnetic field topology in the photosphere tends to support models of prominence based on magnetic dips located within weakly twisted flux tubes. Their underlying and lateral extensions form

  9. Comparing three novel endpoints for developmental osteotoxicity in the embryonic stem cell test

    International Nuclear Information System (INIS)

    Nieden, Nicole I. zur; Davis, Lesley A.; Rancourt, Derrick E.

    2010-01-01

    Birth defects belong to the most serious side effects of pharmaceutical compounds or environmental chemicals. In vivo, teratogens most often affect the normal development of bones, causing growth retardation, limb defects or craniofacial malformations. The embryonic stem cell test (EST) is one of the most promising models that allow the in vitro prediction of embryotoxicity, with one of its endpoints being bone tissue development. The present study was designed to describe three novel inexpensive endpoints to assess developmental osteotoxicity using the model compounds penicillin G (non-teratogenic), 5-fluorouracil (strong teratogen) and all-trans retinoic acid (bone teratogen). These three endpoints were: quantification of matrix-incorporated calcium by (1) morphometric analysis and (2) measurement of calcium levels, as well as (3) activity of alkaline phosphatase, an enzyme involved in matrix calcification. To evaluate our data, we have compared the concentration curves and resulting ID50 values of the new endpoints with mRNA expression for osteocalcin. Osteocalcin is an exclusive marker found only in mineralized tissues, is regulated upon compound treatment and reliably predicts the potential of a chemical entity acting as a bone teratogen. By comparing the new endpoints to quantitative expression of osteocalcin, which we previously identified as suitable to detect developmental osteotoxicity, we were ultimately able to illustrate IMAGE analysis and Ca²⁺ deposition assays as two reliable novel endpoints for the EST. This is of particular importance for routine industrial assessment of novel compounds, as these two new endpoints may substitute previously used molecular read-out methods, which are often costly and time-consuming.

  10. Business continuity

    International Nuclear Information System (INIS)

    Breunhoelder, Gert

    2002-01-01

    This presentation deals with the following key points: Information Technology (IT) business continuity and recovery, essential for any business; lessons learned after the Sept. 11 event; and detailed planning, redundancy and testing as the key elements for disaster probability estimation

  11. Burnout calculation

    International Nuclear Information System (INIS)

    Li, D.

    1980-01-01

    The effect of different system parameters on critical heat flux density is reviewed in order to give an initial view of the values of several parameters. A thorough analysis of different equations for calculating burnout in steam-water flows in uniformly heated tubes, annular and rectangular channels, and rod bundles is carried out. The effects of heat flux density distribution and flux twisting on burnout, and the determination of margins according to burnout, are discussed

  12. Endpoint in plasma etch process using new modified w-multivariate charts and windowed regression

    Science.gov (United States)

    Zakour, Sihem Ben; Taleb, Hassen

    2017-09-01

    Endpoint detection is an important undertaking for understanding whether a plasma etching process has run correctly, especially when the etched area is very small (0.1%), and it is crucial for delivering repeatable results on every wafer. The endpoint is reached when the film being etched has been completely cleared. To ensure the desired device performance on the produced integrated circuit, a high-sensitivity optical emission spectroscopy (OES) sensor is employed. The large number of gathered wavelengths (profiles) is then analyzed and pre-processed using a newly proposed simple algorithm, named Spectra Peak Selection (SPS), to select the important wavelengths; wavelet analysis (WA) is then employed to enhance detection performance by suppressing noise and redundant information. The selected and treated OES wavelengths are used in modified multivariate control charts (MEWMA and Hotelling) for three statistics (mean, SD and CV) and in windowed polynomial regression for the mean. The use of the three aforementioned statistics is motivated by the need to control mean shift, variance shift, and their ratio (CV) when both mean and SD are unstable. The control charts demonstrate their performance in detecting the endpoint, the W-mean Hotelling chart in particular, while the worst result is given by the CV statistic. As the best endpoint detection is given by the W-Hotelling mean statistic, this statistic is used to construct a windowed wavelet Hotelling polynomial regression, which, however, can only identify the window containing the endpoint phenomenon.
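
    A Hotelling-type chart flags the endpoint once the joint distribution of the selected OES wavelengths drifts away from the in-control etch phase. The sketch below implements a plain Hotelling T² statistic on synthetic three-wavelength data; the paper's W-mean variant adds wavelet denoising and windowing on top of this basic mechanism.

```python
import numpy as np

def hotelling_t2(baseline: np.ndarray, stream: np.ndarray) -> np.ndarray:
    """T^2 statistic of each observation in `stream` against the mean and
    covariance estimated from an in-control `baseline` block.
    Arrays are (n_samples, n_wavelengths)."""
    mu = baseline.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(baseline, rowvar=False))
    d = stream - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

rng = np.random.default_rng(0)
# Synthetic OES intensities on 3 selected wavelengths: a stable etch
# phase, then a mean shift when the film clears (the endpoint).
stable = rng.normal(0.0, 1.0, size=(200, 3))
cleared = rng.normal(0.8, 1.0, size=(50, 3))
t2 = hotelling_t2(stable[:100], np.vstack([stable[100:], cleared]))
limit = np.percentile(hotelling_t2(stable[:100], stable[:100]), 99)
# First exceedance of the control limit (false alarms are possible,
# as with any control chart run at a finite false-alarm rate).
print("endpoint flagged at sample", np.argmax(t2 > limit))
```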

  13. Bioremediation of hydrocarbon-contaminated soils: are treatability and ecotoxicity endpoints related?

    International Nuclear Information System (INIS)

    Visser, S.

    1999-01-01

    Research was conducted to determine whether there is a relationship between biotreatability and ecotoxicity endpoints in a wide range of hydrocarbon-contaminated soils, including medium and heavy crude oil-contaminated flare pit wastes and lubrication oil-contaminated soil. Each test material was analyzed for pH, water repellency, electrical conductivity, available N and P, total extractable hydrocarbons, oil and grease, and toxicity to seedling emergence, root elongation in barley, lettuce and canola, earthworm survival and luminescent bacteria (Microtox), prior to and following three months of bioremediation in the laboratory. Progress of the bioremediation process and attainment of a treatment endpoint were assessed by monitoring soil respiration. The time required to attain a treatment endpoint under laboratory conditions can range from 30 to 100 days depending on the concentration of hydrocarbons and the degree of weathering. Most flare pits are biotreatable, losing on average 25-30% of hydrocarbons during bioremediation. Once a treatment endpoint is achieved, residual hydrocarbon content almost always exceeds Alberta Tier I criteria for mineral oil and grease. As a result of bioremediation treatments, hydrophobicity is often reduced from severe to low. Many flare pit materials are still moderately to extremely toxic after reaching a treatment endpoint. (Abstract only)

  14. Multi-Toxic Endpoints of the Foodborne Mycotoxins in Nematode Caenorhabditis elegans

    Directory of Open Access Journals (Sweden)

    Zhendong Yang

    2015-12-01

    Full Text Available Aflatoxin B1 (AFB1), deoxynivalenol (DON), fumonisin B1 (FB1), T-2 toxin (T-2), and zearalenone (ZEA) are the major foodborne mycotoxins of public health concern. In the present study, the multiple toxic endpoints of these naturally-occurring mycotoxins were evaluated in the Caenorhabditis elegans model for their lethality, toxic effects on growth and reproduction, as well as influence on lifespan. We found that the lethality endpoint was more sensitive for T-2 toxicity, with an EC50 of 1.38 mg/L; the growth endpoint was relatively sensitive for AFB1 toxic effects; and the reproduction endpoint was more sensitive for toxicities of AFB1, FB1, and ZEA. Moreover, the lifespan endpoint was sensitive to toxic effects of all five tested mycotoxins. Data obtained from this study may serve as an important contribution to knowledge on the assessment of mycotoxin toxic effects, especially developmental and reproductive toxic effects, using the C. elegans model.

  15. Meeting report: Measuring endocrine-sensitive endpoints within the first years of life

    DEFF Research Database (Denmark)

    Arbuckle, T.E.; Hauser, R.; Swan, S.H.

    2008-01-01

    An international workshop titled "Assessing Endocrine-Related Endpoints within the First Years of Life" was held 30 April-1 May 2007, in Ottawa, Ontario, Canada. Representatives from a number of pregnancy cohort studies in North America and Europe presented options for measuring various endocrine-sensitive endpoints in early life and discussed issues related to performing and using those measures. The workshop focused on measuring reproductive tract developmental endpoints [e.g., anogenital distance (AGD)], endocrine status, and infant anthropometry. To the extent possible, workshop participants strove… on the genital exam. Although a number of outcome measures recommended during the genital exam have been associated with exposure to endocrine-disrupting chemicals, little is known about how predictive these effects are of later reproductive health or other chronic health conditions.

  16. Deep inelastic scattering near the endpoint in soft-collinear effective theory

    International Nuclear Information System (INIS)

    Chay, Junegone; Kim, Chul

    2007-01-01

    We apply the soft-collinear effective theory to deep inelastic scattering near the endpoint region. The forward scattering amplitude and the structure functions are shown to factorize as a convolution of the Wilson coefficients, the jet functions, and the parton distribution functions. The behavior of the parton distribution functions near the endpoint region is considered. It turns out that it evolves with the Altarelli-Parisi kernel even in the endpoint region, and the parton distribution function can be factorized further into a collinear part and the soft Wilson line. The factorized form for the structure functions is obtained by the two-step matching, and the radiative corrections or the evolution for each factorized part can be computed in perturbation theory. We present the radiative corrections of each factorized part to leading order in α_s, including the zero-bin subtraction for the collinear part

  17. Implementation of minimally invasive and objective humane endpoints in the study of murine Plasmodium infections

    DEFF Research Database (Denmark)

    Dellavalle, B; Kirchhoff, J; Maretty, L

    2014-01-01

    SUMMARY Defining appropriate and objective endpoints for animal research can be difficult. Previously we evaluated and implemented a body temperature (BT) threshold as a humane endpoint for experimental cerebral malaria (ECM), and were interested in a similar endpoint for a model of severe malarial anaemia (SMA). Furthermore, we investigate the potential of a minimally invasive, non-contact infrared thermometer for repeated BT measurement. ECM was induced with Plasmodium berghei ANKA infection in C57Bl/6 mice. SMA was induced with Plasmodium chabaudi AS infection in A/J mice. Our previously published endpoint was applied in ECM, and 30 °C was pre-determined as the lowest permitted limit for termination in SMA according to consultation with the Danish Animal Inspectorate. The infrared thermometer was compared with the rectal probe after cervical dislocation, ECM and SMA. Linear regression analysis of rectal…

  18. Continuous Dropout.

    Science.gov (United States)

    Shen, Xu; Tian, Xinmei; Liu, Tongliang; Xu, Fang; Tao, Dacheng

    2017-10-03

    Dropout has been proven to be an effective algorithm for training robust deep networks because of its ability to prevent overfitting by avoiding the co-adaptation of feature detectors. Current explanations of dropout include bagging, naive Bayes, regularization, and sex in evolution. According to the activation patterns of neurons in the human brain, when faced with different situations, the firing rates of neurons are random and continuous, not binary as current dropout does. Inspired by this phenomenon, we extend the traditional binary dropout to continuous dropout. On the one hand, continuous dropout is considerably closer to the activation characteristics of neurons in the human brain than traditional binary dropout. On the other hand, we demonstrate that continuous dropout has the property of avoiding the co-adaptation of feature detectors, which suggests that we can extract more independent feature detectors for model averaging in the test stage. We introduce the proposed continuous dropout to a feedforward neural network and comprehensively compare it with binary dropout, adaptive dropout, and DropConnect on Modified National Institute of Standards and Technology, Canadian Institute for Advanced Research-10, Street View House Numbers, NORB, and ImageNet large scale visual recognition competition-12. Thorough experiments demonstrate that our method performs better in preventing the co-adaptation of feature detectors and improves test performance.
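
    The core move is to replace the Bernoulli 0/1 dropout mask with a continuous random mask. The sketch below uses a clipped Gaussian multiplicative mask rescaled to preserve the activation's expectation; it illustrates the idea rather than the authors' exact distribution or parameters.

```python
import numpy as np

def continuous_dropout(x: np.ndarray, mu: float = 0.5, sigma: float = 0.2,
                       train: bool = True, rng=None) -> np.ndarray:
    """Multiplicative continuous mask instead of a binary Bernoulli mask.
    The mask is Gaussian with mean mu (clipped to stay non-negative) and
    activations are rescaled by 1/mu so their expectation is unchanged.
    A sketch of the idea, not the authors' exact formulation."""
    if not train:
        return x
    rng = rng or np.random.default_rng()
    mask = np.clip(rng.normal(mu, sigma, size=x.shape), 0.0, None)
    return x * mask / mu

h = np.ones((4, 5))                      # a hidden-layer activation
print(continuous_dropout(h).round(2))    # each unit scaled by a random factor
```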

  19. Angiographic core laboratory reproducibility analyses: implications for planning clinical trials using coronary angiography and left ventriculography end-points.

    Science.gov (United States)

    Steigen, Terje K; Claudio, Cheryl; Abbott, David; Schulzer, Michael; Burton, Jeff; Tymchak, Wayne; Buller, Christopher E; John Mancini, G B

    2008-06-01

    To assess reproducibility of core laboratory performance and its impact on sample size calculations. Little information exists about the overall reproducibility of core laboratories, in contradistinction to the performance of individual technicians. Also, qualitative parameters are increasingly being adjudicated as either primary or secondary end-points. The comparative impact of using diverse indexes on sample sizes has not been previously reported. We compared initial and repeat assessments of five quantitative parameters [e.g., minimum lumen diameter (MLD), ejection fraction (EF), etc.] and six qualitative parameters [e.g., TIMI myocardial perfusion grade (TMPG) or thrombus grade (TTG), etc.], as performed by differing technicians and separated by a year or more. Sample sizes were calculated from these results. MLD and EF were the most reproducible, yielding the smallest sample size calculations, whereas percent diameter stenosis and centerline wall motion require substantially larger trials. Of the qualitative parameters, all except TIMI flow grade gave reproducibility characteristics yielding sample sizes of many hundreds of patients. Reproducibility of TMPG and TTG was only moderately good both within and between core laboratories, underscoring an intrinsic difficulty in assessing these. Core laboratories can be shown to provide reproducibility performance that is comparable to the performance commonly ascribed to individual technicians. The differences in reproducibility yield huge differences in sample size when comparing quantitative and qualitative parameters. TMPG and TTG are intrinsically difficult to assess, and conclusions based on these parameters should arise only from very large trials.
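
    Core-laboratory reproducibility enters sample-size calculations because measurement error adds in quadrature to biological variability. A minimal sketch for a continuous endpoint such as MLD, with invented numbers:

```python
from math import ceil

from scipy.stats import norm

def n_per_arm(delta: float, sd_biologic: float, sd_measurement: float,
              alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-arm n to detect a mean difference `delta` in a continuous
    angiographic endpoint. Core-laboratory measurement error adds in
    quadrature to biological variability."""
    sd_total = (sd_biologic ** 2 + sd_measurement ** 2) ** 0.5
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return ceil(2 * (z * sd_total / delta) ** 2)

# Invented numbers: an MLD difference of 0.20 mm, biological SD 0.45 mm.
print(n_per_arm(0.20, 0.45, 0.10))  # highly reproducible endpoint
print(n_per_arm(0.20, 0.45, 0.35))  # poorly reproducible endpoint
```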

  20. Comparison of RNA-seq and microarray-based models for clinical endpoint prediction.

    Science.gov (United States)

    Zhang, Wenqian; Yu, Ying; Hertwig, Falk; Thierry-Mieg, Jean; Zhang, Wenwei; Thierry-Mieg, Danielle; Wang, Jian; Furlanello, Cesare; Devanarayan, Viswanath; Cheng, Jie; Deng, Youping; Hero, Barbara; Hong, Huixiao; Jia, Meiwen; Li, Li; Lin, Simon M; Nikolsky, Yuri; Oberthuer, André; Qing, Tao; Su, Zhenqiang; Volland, Ruth; Wang, Charles; Wang, May D; Ai, Junmei; Albanese, Davide; Asgharzadeh, Shahab; Avigad, Smadar; Bao, Wenjun; Bessarabova, Marina; Brilliant, Murray H; Brors, Benedikt; Chierici, Marco; Chu, Tzu-Ming; Zhang, Jibin; Grundy, Richard G; He, Min Max; Hebbring, Scott; Kaufman, Howard L; Lababidi, Samir; Lancashire, Lee J; Li, Yan; Lu, Xin X; Luo, Heng; Ma, Xiwen; Ning, Baitang; Noguera, Rosa; Peifer, Martin; Phan, John H; Roels, Frederik; Rosswog, Carolina; Shao, Susan; Shen, Jie; Theissen, Jessica; Tonini, Gian Paolo; Vandesompele, Jo; Wu, Po-Yen; Xiao, Wenzhong; Xu, Joshua; Xu, Weihong; Xuan, Jiekun; Yang, Yong; Ye, Zhan; Dong, Zirui; Zhang, Ke K; Yin, Ye; Zhao, Chen; Zheng, Yuanting; Wolfinger, Russell D; Shi, Tieliu; Malkas, Linda H; Berthold, Frank; Wang, Jun; Tong, Weida; Shi, Leming; Peng, Zhiyu; Fischer, Matthias

    2015-06-25

    Gene expression profiling is being widely applied in cancer research to identify biomarkers for clinical endpoint prediction. Since RNA-seq provides a powerful tool for transcriptome-based applications beyond the limitations of microarrays, we sought to systematically evaluate the performance of RNA-seq-based and microarray-based classifiers in this MAQC-III/SEQC study for clinical endpoint prediction using neuroblastoma as a model. We generate gene expression profiles from 498 primary neuroblastomas using both RNA-seq and 44 k microarrays. Characterization of the neuroblastoma transcriptome by RNA-seq reveals that more than 48,000 genes and 200,000 transcripts are being expressed in this malignancy. We also find that RNA-seq provides much more detailed information on specific transcript expression patterns in clinico-genetic neuroblastoma subgroups than microarrays. To systematically compare the power of RNA-seq and microarray-based models in predicting clinical endpoints, we divide the cohort randomly into training and validation sets and develop 360 predictive models on six clinical endpoints of varying predictability. Evaluation of factors potentially affecting model performances reveals that prediction accuracies are most strongly influenced by the nature of the clinical endpoint, whereas technological platforms (RNA-seq vs. microarrays), RNA-seq data analysis pipelines, and feature levels (gene vs. transcript vs. exon-junction level) do not significantly affect performances of the models. We demonstrate that RNA-seq outperforms microarrays in determining the transcriptomic characteristics of cancer, while RNA-seq and microarray-based models perform similarly in clinical endpoint prediction. Our findings may be valuable to guide future studies on the development of gene expression-based predictive models and their implementation in clinical practice.

  1. A systematic comparison of recurrent event models for application to composite endpoints.

    Science.gov (United States)

    Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine

    2018-01-04

    Many clinical trials focus on the comparison of the treatment effect between two or more groups concerning a rarely occurring event. In this situation, showing a relevant effect with an acceptable power requires the observation of a large number of patients over a long period of time. For feasibility reasons, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints can be analyzed by models for recurrent events. A number of such models exist, e.g. regression models based on count data and Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods have already been compared in the literature, there exists no systematic investigation for the special requirements regarding composite endpoints. Within this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators that can deviate considerably under commonly met data scenarios. Based on the conducted simulation study, this paper helps to understand the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
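
    The Andersen-Gill model is an ordinary Cox model fitted to counting-process (start, stop] intervals, so each subject can contribute several component events of the composite endpoint. One way to fit it in Python is via lifelines' CoxTimeVaryingFitter, which accepts exactly this layout; the toy data below are invented.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Counting-process layout for an Andersen-Gill analysis: one row per
# at-risk interval, so a subject may appear in several rows and
# experience several events. All values below are invented.
df = pd.DataFrame({
    "id":    [1, 1, 1, 2, 2, 3],
    "start": [0, 10, 25, 0, 18, 0],
    "stop":  [10, 25, 40, 18, 30, 35],
    "event": [1, 1, 0, 1, 0, 0],     # 1 = a component event occurred
    "treat": [1, 1, 1, 0, 0, 0],
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio for `treat` under the AG model
```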

  2. Continuity theory

    CERN Document Server

    Nel, Louis

    2016-01-01

    This book presents a detailed, self-contained theory of continuous mappings. It is mainly addressed to students who have already studied these mappings in the setting of metric spaces, as well as multidimensional differential calculus. The needed background facts about sets, metric spaces and linear algebra are developed in detail, so as to provide a seamless transition between students' previous studies and new material. In view of its many novel features, this book will be of interest also to mature readers who have studied continuous mappings from the subject's classical texts and wish to become acquainted with a new approach. The theory of continuous mappings serves as infrastructure for more specialized mathematical theories like differential equations, integral equations, operator theory, dynamical systems, global analysis, topological groups, topological rings and many more. In light of the centrality of the topic, a book of this kind fits a variety of applications, especially those that contribute to ...

  3. Calculator calculus

    CERN Document Server

    McCarty, George

    1982-01-01

    How THIS BOOK DIFFERS This book is about the calculus. What distinguishes it, however, from other books is that it uses the pocket calculator to illustrate the theory. A computation that requires hours of labor when done by hand with tables is quite inappropriate as an example or exercise in a beginning calculus course. But that same computation can become a delicate illustration of the theory when the student does it in seconds on his calculator. Furthermore, the student's own personal involvement and easy accomplishment give him reassurance and encouragement. The machine is like a microscope, and its magnification is a hundred millionfold. We shall be interested in limits, and no stage of numerical approximation proves anything about the limit. However, the derivative of f(x) = 67.5^x, for instance, acquires real meaning when a student first appreciates its values as numbers, as limits. A quick example is 1.1^10, 1.01^100, 1.001^1000, .... Another example is t = 0.1, 0.01, ... in the functio...
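
    The sequence quoted above, 1.1^10, 1.01^100, 1.001^1000, ..., is (1 + 1/n)^n and creeps toward e; a few lines reproduce the calculator exercise.

```python
# The calculator exercise (1 + 1/n)^n converges to e = 2.71828...
for n in (10, 100, 1000, 10_000, 100_000):
    print(f"(1 + 1/{n})^{n} = {(1 + 1 / n) ** n:.6f}")
```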

  4. Radiometric titration of officinal radiopharmaceuticals using radioactive kryptonates as end-point indicators. I

    International Nuclear Information System (INIS)

    Toelgyessy, J.; Dillinger, P.; Harangozo, M.; Jombik, J.

    1980-01-01

    A method for the determination of salicylic, acetylsalicylic and benzoic acids in officinal pharmaceuticals, based on radiometric titration with 0.1 mol·l⁻¹ NaOH, was developed. The end-point was detected with the aid of a radioactive glass kryptonate. After the end-point, the excess titrant attacks the glass surface layers; this releases ⁸⁵Kr and consequently decreases the radioactivity of the kryptonate employed. The radioactive kryptonate used as an indicator was prepared by the bombardment of glass with accelerated ⁸⁵Kr ions. The developed method is simple, accurate and correct. (author)

  5. A duplex endpoint PCR assay for rapid detection and differentiation of Leptospira strains.

    Science.gov (United States)

    Benacer, Douadi; Zain, Siti Nursheena Mohd; Lewis, John W; Khalid, Mohd Khairul Nizam Mohd; Thong, Kwai Lin

    2017-01-01

    This study aimed to develop a duplex endpoint PCR assay for rapid detection and differentiation of Leptospira strains. Primers were designed to target the rrs (LG1/LG2) and ligB (LP1/LP2) genes to confirm the presence of the Leptospira genus and the pathogenic species, respectively. The assay showed 100% specificity against 17 Leptospira strains, with a limit of detection of 23.1 pg/µl of leptospiral DNA and a sensitivity of 10³ leptospires/ml in both spiked urine and water. Our duplex endpoint PCR assay is suitable for rapid early detection of Leptospira with high sensitivity and specificity.

  6. Comparison of methods for accurate end-point detection of potentiometric titrations

    International Nuclear Information System (INIS)

    Villela, R L A; Borges, P P; Vyskočil, L

    2015-01-01

    Detection of the end point in potentiometric titrations has wide application in experiments that demand very low measurement uncertainties, mainly for certifying reference materials. Simulations of experimental coulometric titration data and consequent error analysis of the end-point values were conducted using a programming code. These simulations revealed that the Levenberg-Marquardt method is in general more accurate than the traditional second-derivative technique currently used for end-point detection in potentiometric titrations. The performance of the methods will be compared and presented in this paper

  7. Comparison of methods for accurate end-point detection of potentiometric titrations

    Science.gov (United States)

    Villela, R. L. A.; Borges, P. P.; Vyskočil, L.

    2015-01-01

    Detection of the end point in potentiometric titrations has wide application in experiments that demand very low measurement uncertainties, mainly for certifying reference materials. Simulations of experimental coulometric titration data and consequent error analysis of the end-point values were conducted using a programming code. These simulations revealed that the Levenberg-Marquardt method is in general more accurate than the traditional second-derivative technique currently used for end-point detection in potentiometric titrations. The performance of the methods will be compared and presented in this paper.
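
    Both detection approaches compared above are easy to reproduce: fit a sigmoidal model of the titration curve with a Levenberg-Marquardt least-squares solver and read the end-point off the inflection parameter, or locate the extremum of the numerical derivative as the classical technique does. The sketch below uses an invented noisy curve; scipy's curve_fit defaults to Levenberg-Marquardt for unconstrained fits.

```python
import numpy as np
from scipy.optimize import curve_fit

def titration(v, e0, de, v_ep, s):
    """Idealized potentiometric titration curve: electrode potential (mV)
    vs titrant volume v (mL), with its inflection at the end-point v_ep."""
    return e0 + de / (1 + np.exp(-(v - v_ep) / s))

rng = np.random.default_rng(1)
v = np.linspace(0, 20, 161)
e = titration(v, 150, 270, 12.34, 0.5) + rng.normal(0, 1.5, v.size)

popt, _ = curve_fit(titration, v, e, p0=[e.min(), np.ptp(e), 10.0, 1.0])
print(f"Levenberg-Marquardt end-point: {popt[2]:.3f} mL")

# Classical alternative: the end-point is where the first derivative
# peaks (equivalently, where the second derivative changes sign).
d1 = np.gradient(e, v)
print(f"derivative-based end-point:    {v[np.argmax(d1)]:.3f} mL")
```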

  8. Verifying Elimination Programs with a Special Emphasis on Cysticercosis Endpoints and Postelimination Surveillance

    Directory of Open Access Journals (Sweden)

    Sukwan Handali

    2012-01-01

    Full Text Available Methods are needed for determining program endpoints and for post-program surveillance in any elimination program. Cysticercosis has the effective strategies and diagnostic tools necessary for establishing an elimination program; however, tools to verify program endpoints have not been determined. Using a statistical approach, the present study proposed that taeniasis and porcine cysticercosis antibody assays could be used to determine with high statistical confidence whether an area is free of disease. Confidence would be improved by using secondary tests such as the taeniasis coproantigen assay and necropsy of sentinel pigs.

  9. Model of a black hole gas submitted to background gravitational field for active galaxy nuclei with application to calculating the continuous emission spectra of massless particles (Photons: neutrinos and gravitons)

    International Nuclear Information System (INIS)

    Pinto Neto, A.

    1987-01-01

    A new theoretical model for active galaxy nuclei is presented, which describes the continuous spectrum of massless particles (photons, neutrinos and gravitons) in the frequency range from radio frequencies to gamma-ray frequencies. The model consists of a black hole gas interacting with a background gravitational field. Models previously proposed for active galaxy nuclei are reviewed. The theoretical foundations, based on Einstein's general relativity theory, for defining and studying the properties of singularities (black holes) are also presented. (M.C.K.)

  10. FENDL/MC. Library of continuous energy cross sections in ACE format for neutron-photon transport calculations with the Monte Carlo N-particle Transport Code system MCNP 4A. Version 1.1 of March 1995. Summary documentation

    International Nuclear Information System (INIS)

    Pashchenko, A.B.; Wienke, H.; Ganesan, S.

    1996-01-01

    Selected neutron reaction nuclear data evaluations for elements of interest to the IAEA's program on Fusion Evaluated Nuclear Data Library (FENDL) have been processed into ACE format using the NJOY system by R.E. MacFarlane. This document summarizes the resulting continuous energy cross-section data library FENDL/MC version 1.1. The data are available cost free, upon request from the IAEA Nuclear Data Section, online or on magnetic tape. (author). 1 tab

  11. Continuation calculus

    NARCIS (Netherlands)

    Geron, B.; Geuvers, J.H.; de'Liguoro, U.; Saurin, A.

    2013-01-01

    Programs with control are usually modeled using lambda calculus extended with control operators. Instead of modifying lambda calculus, we consider a different model of computation. We introduce continuation calculus, or CC, a deterministic model of computation that is evaluated using only head reduction

  12. Predicting the outcome of oral food challenges with hen's egg through skin test end-point titration.

    Science.gov (United States)

    Tripodi, S; Businco, A Di Rienzo; Alessandri, C; Panetta, V; Restani, P; Matricardi, P M

    2009-08-01

    Oral food challenge (OFC) is the diagnostic 'gold standard' of food allergies but it is laborious and time consuming. Attempts to predict a positive OFC through specific IgE assays or conventional skin tests have so far given suboptimal results. The aim was to test whether skin tests with titration curves predict with enough confidence the outcome of an oral food challenge. Children (n=47; mean age 6.2 +/- 4.2 years) with suspected and diagnosed allergic reactions to hen's egg (HE) were examined through clinical history, physical examination, oral food challenge, conventional and end-point titrated skin tests with HE white extract, and determination of serum specific IgE against HE white. Predictive decision points for a positive outcome of food challenges were calculated through receiver operating characteristic (ROC) analysis for HE white using IgE concentration, weal size and end-point titration (EPT). OFC was positive (Sampson's score >or=3) in 20/47 children (42.5%). The area under the ROC curve obtained with the EPT method was significantly bigger than the one obtained by measuring IgE-specific antibodies (0.99 vs. 0.83, P<0.05) or weal size (0.99 vs. 0.88, P<0.05). The extract's dilution that successfully discriminated a positive from a negative OFC (sensitivity 95%, specificity 100%) was 1 : 256, corresponding to a concentration of 5.9 microg/mL of ovotransferrin, 22.2 microg/mL of ovalbumin, and 1.4 microg/mL of lysozyme. EPT is a promising approach to optimize the use of skin prick tests and to predict the outcome of OFC with HE in children. Further studies are needed to test whether this encouraging finding can be extended to other populations and food allergens.
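
    The ROC machinery behind such decision points is straightforward to reproduce. The sketch below computes the AUC and a Youden-optimal cutoff from invented end-point titration steps and OFC outcomes; it mirrors the analysis type, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Invented data: for each child, the last skin-reactive dilution step in
# the end-point titration (a higher step means reactive at a weaker
# extract) and the oral food challenge outcome (1 = positive).
ept_step = np.array([2, 3, 4, 5, 9, 9, 10, 5, 3, 8, 10, 2])
ofc_pos  = np.array([0, 0, 0, 1, 1, 1, 1,  0, 0, 1, 1,  0])

print("AUC:", roc_auc_score(ofc_pos, ept_step))
fpr, tpr, thr = roc_curve(ofc_pos, ept_step)
best = np.argmax(tpr - fpr)                  # Youden's J statistic
print("decision threshold: dilution step >=", thr[best])
```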

  13. Characterization and evaluation of a modified local lymph node assay using ATP content as a non-radio isotopic endpoint.

    Science.gov (United States)

    Idehara, Kenji; Yamagishi, Gaku; Yamashita, Kunihiko; Ito, Michio

    2008-01-01

    The murine local lymph node assay (LLNA) is an accepted and widely used method for assessing the skin-sensitizing potential of chemicals. Here, we describe a non-radio isotopic modified LLNA in which adenosine triphosphate (ATP) content is used as an endpoint instead of radioisotope (RI); the method is termed LLNA modified by Daicel based on ATP content (LLNA-DA). Groups of female CBA/JNCrlj mice were treated topically on the dorsum of both ears with test chemicals or a vehicle control on days 1, 2, and 3; an additional fourth application was conducted on day 7. Pretreatment with 1% sodium lauryl sulfate solution was performed 1 h before each application. On day 8, the amount of ATP in the draining auricular lymph nodes was measured as an alternative endpoint by the luciferin-luciferase assay in terms of bioluminescence (relative light units, RLU). A stimulation index (SI) relative to the concurrent vehicle control was derived based on the RLU value, and an SI of 3 was set as the cut-off value. Using the LLNA-DA method, 31 chemicals were tested and the results were compared with those of other test methods. The accuracy of LLNA-DA vs LLNA, guinea pig tests, and human tests was 93% (28/30), 80% (20/25), and 79% (15/19), respectively. The estimated concentration (EC3) value was calculated and compared with that of the original LLNA. It was found that the EC3 values obtained by LLNA-DA were almost equal to those obtained by the original LLNA. The SI value based on ATP content is similar to that of the original LLNA as a result of the modifications in the chemical treatment procedure, which contribute to improving the SI value. It is concluded that LLNA-DA is a promising non-RI alternative method for evaluating the skin-sensitizing potential of chemicals.

  14. Effects of short- and long-term exposures to copper on lethal and reproductive endpoints of the harpacticoid copepod Tigriopus fulvus.

    Science.gov (United States)

    Biandolino, Francesca; Parlapiano, Isabella; Faraponova, Olga; Prato, Ermelinda

    2018-01-01

    Long-term exposure provides a realistic measurement of the effects of toxicants on aquatic organisms. The harpacticoid copepod Tigriopus fulvus has a wide geographical distribution and is considered an ideal model organism for ecotoxicological studies owing to its good sensitivity to different toxicants. In this study, acute, sub-chronic and chronic toxicity tests based on lethal and reproductive responses of Tigriopus fulvus to copper were performed. The number of moults during larval development was chosen as the endpoint for the sub-chronic test. Sex ratio, inhibitory effect on larval development, hatching time, fecundity, brood number, nauplii/brood, total newborn production, etc., were measured in the chronic test (28 d). The lethal effect of copper on nauplii gave an LC50-48h of 310 ± 72 µg Cu/L (mean ± SD). A significant inhibition of larval development was observed at sublethal copper concentrations after 4 and 7 d. After 4 d, the EC50 value obtained for the naupliar moult-reduction endpoint was 55.8 ± 2.5 µg Cu/L (mean ± SD). The EC50 for the inhibition of naupliar development into the copepodite stage was 21.7 ± 4.4 µg Cu/L (mean ± SD) after 7 days. Among the different traits tested, copper did not affect sex ratio and growth, while fecundity and total nauplii production were the most sensitive endpoints. Reproductive endpoints offer the advantage of being detectable at very low pollutant concentrations. Copyright © 2017 Elsevier Inc. All rights reserved.
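
    LC50/EC50 figures of this kind come from fitting a concentration-response model and solving for the 50% effect level. A minimal log-logistic fit with invented data:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(c, ec50, hill):
    """Two-parameter log-logistic concentration-response: fraction of
    organisms affected at concentration c."""
    return 1.0 / (1.0 + (ec50 / c) ** hill)

# Invented concentration-response data for an inhibition endpoint
conc = np.array([5, 10, 20, 40, 80], dtype=float)   # ug Cu/L
frac = np.array([0.05, 0.20, 0.48, 0.80, 0.97])     # fraction affected

popt, _ = curve_fit(log_logistic, conc, frac, p0=[20.0, 2.0])
print(f"EC50 = {popt[0]:.1f} ug Cu/L, Hill slope = {popt[1]:.2f}")
```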

  15. Continuation calculus

    Directory of Open Access Journals (Sweden)

    Bram Geron

    2013-09-01

    Full Text Available Programs with control are usually modeled using lambda calculus extended with control operators. Instead of modifying lambda calculus, we consider a different model of computation. We introduce continuation calculus, or CC, a deterministic model of computation that is evaluated using only head reduction, and argue that it is suitable for modeling programs with control. It is demonstrated how to define programs, specify them, and prove them correct. This is shown in detail by presenting in CC a list multiplication program that prematurely returns when it encounters a zero. The correctness proof includes termination of the program. In continuation calculus we can model both call-by-name and call-by-value. In addition, call-by-name functions can be applied to call-by-value results, and conversely.
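
    Continuation calculus has its own term syntax, but the flavor of its running example carries over to continuation-passing style in an ordinary language: the list-multiplication program below threads an explicit continuation and jumps straight to it when a zero is encountered, skipping all pending multiplications. This is a Python analogue for intuition, not CC notation.

```python
def product(xs, k):
    """Multiply a list in continuation-passing style, escaping straight
    to the final continuation `k` when a zero is found, so no pending
    multiplications are performed."""
    def go(rest, inner_k):
        if not rest:
            return inner_k(1)
        if rest[0] == 0:
            return k(0)          # premature return: discard pending work
        return go(rest[1:], lambda r: inner_k(rest[0] * r))
    return go(xs, k)

print(product([2, 3, 4], lambda r: r))   # 24
print(product([2, 0, 4], lambda r: r))   # 0, short-circuited
```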

  16. Surrogate endpoints for overall survival in digestive oncology trials: which candidates? A questionnaires survey among clinicians and methodologists

    OpenAIRE

    Bonnetain Franck; Bedenne Laurent; Methy Nicolas

    2010-01-01

    Abstract Background Overall survival (OS) is the gold standard for the demonstration of a clinical benefit in cancer trials. Replacement of OS by a surrogate endpoint allows trial duration to be reduced. To date, few surrogate endpoints have been validated in digestive oncology. The aim of this study was to draw up an ordered list of potential surrogate endpoints for OS in digestive cancer trials, by way of a survey among clinicians and methodologists. A secondary objective was to obtain their opin…

  17. Serum urate as surrogate endpoint for flares in people with gout

    DEFF Research Database (Denmark)

    Stamp, Lisa K; Birger Morillon, Melanie; Taylor, William J

    2018-01-01

    Objectives The primary efficacy outcome in trials of urate lowering therapy (ULT) for gout is serum urate (SU). The aim of this study was to examine the strength of the relationship between SU and patient-important outcomes to determine whether SU is an adequate surrogate endpoint for clinical tr...

  18. 78 FR 73199 - Draft Guidance for Industry on Bioequivalence Studies With Pharmacokinetic Endpoints for Drugs...

    Science.gov (United States)

    2013-12-05

    ... exposure measures is suitable for documenting BE (e.g., transdermal delivery systems and certain rectal and... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2013-D-1464] Draft Guidance for Industry on Bioequivalence Studies With Pharmacokinetic Endpoints for Drugs Submitted...

  19. Baseline characteristics in the Aliskiren Trial in Type 2 Diabetes Using Cardio-Renal Endpoints (ALTITUDE)

    DEFF Research Database (Denmark)

    Parving, Hans-Henrik; Brenner, Barry M; McMurray, John J V

    2012-01-01

    Patients with type 2 diabetes are at enhanced risk for macro- and microvascular complications. Albuminuria and/or reduced kidney function further enhances the vascular risk. We initiated the Aliskiren Trial in Type 2 Diabetes Using Cardio-Renal Endpoints (ALTITUDE). Aliskiren, a novel direct renin...

  20. Can joint sound assess soft and hard endpoints of the Lachman test?: A preliminary study.

    Science.gov (United States)

    Hattori, Koji; Ogawa, Munehiro; Tanaka, Kazunori; Matsuya, Ayako; Uematsu, Kota; Tanaka, Yasuhito

    2016-05-12

    The Lachman test is considered to be a reliable physical examination for anterior cruciate ligament (ACL) injury. Patients with a damaged ACL demonstrate a soft endpoint feeling. However, examiners judge the soft and hard endpoints subjectively. The purpose of our study was to confirm objective performance of the Lachman test using joint auscultation. Human and porcine knee joints were examined. Knee joint sound during the Lachman test (Lachman sound) was analyzed by fast Fourier transformation. As quantitative indices of Lachman sound, the peak sound as the maximum relative amplitude (acoustic pressure) and its frequency were used. The mean Lachman peak sound for healthy volunteer knees was 86.9 ± 12.9 Hz in frequency and -40 ± 2.5 dB in acoustic pressure. The mean Lachman peak sound for intact porcine knees was 84.1 ± 9.4 Hz and -40.5 ± 1.7 dB. Porcine knees with ACL deficiency had a soft endpoint feeling during the Lachman test. The Lachman peak sounds of porcine knees with ACL deficiency were dispersed into four distinct groups, with center frequencies of around 40, 160, 450, and 1600 Hz. The Lachman peak sound was capable of assessing soft and hard endpoints of the Lachman test objectively.
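
    The "Lachman peak sound" readout is the dominant bin of an FFT of the recorded joint sound. A sketch with a synthetic 87 Hz burst; the sampling rate and the dB reference (total spectral energy) are arbitrary choices here.

```python
import numpy as np

def peak_sound(signal: np.ndarray, fs: float):
    """Frequency and relative level of the dominant spectral peak after
    an FFT, mimicking the paper's peak-sound readout."""
    windowed = signal * np.hanning(signal.size)
    spec = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    i = np.argmax(spec[1:]) + 1                       # ignore the DC bin
    level_db = 20 * np.log10(spec[i] / np.linalg.norm(spec))
    return freqs[i], level_db

fs = 4000.0                                # sampling rate in Hz (invented)
t = np.arange(0, 0.5, 1 / fs)
# Synthetic joint sound: an 87 Hz component buried in noise
sig = (np.sin(2 * np.pi * 87 * t)
       + 0.3 * np.random.default_rng(2).normal(size=t.size))
f, db = peak_sound(sig, fs)
print(f"peak: {f:.1f} Hz at {db:.1f} dB")
```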

  1. Coulometric Titration of Ethylenediaminetetraacetate (EDTA) with Spectrophotometric Endpoint Detection: An Experiment for the Instrumental Analysis Laboratory

    Science.gov (United States)

    Williams, Kathryn R.; Young, Vaneica Y.; Killian, Benjamin J.

    2011-01-01

    Ethylenediaminetetraacetate (EDTA) is commonly used as an anticoagulant in blood-collection procedures. In this experiment for the instrumental analysis laboratory, students determine the quantity of EDTA in commercial collection tubes by coulometric titration with electrolytically generated Cu²⁺. The endpoint is detected…

  2. A covariant open bosonic string field theory including the endpoint and middlepoint interaction

    International Nuclear Information System (INIS)

    Liu, B.G.; Northwest Univ., Xian; Chen, Y.X.

    1988-01-01

    Extending the usual endpoint and midpoint interactions, we introduce numerous kinds of interactions, labelled by a parameter λ and obtain a non-commutative and associative string field algebra by adding up all interactions. With this algebra we develop a covariant open bosonic string field theory, which reduces to Witten's open bosonic string field theory under a special string length choice. (orig.)

  3. Using an Ecosystem Approach to complement protection schemes based on organism-level endpoints

    International Nuclear Information System (INIS)

    Bradshaw, Clare; Kapustka, Lawrence; Barnthouse, Lawrence; Brown, Justin; Ciffroy, Philippe; Forbes, Valery; Geras'kin, Stanislav; Kautsky, Ulrik; Bréchignac, François

    2014-01-01

    Radiation protection goals for ecological resources are focussed on ecological structures and functions at population-, community-, and ecosystem-levels. The current approach to radiation safety for non-human biota relies on organism-level endpoints, and as such is not aligned with the stated overarching protection goals of international agencies. Exposure to stressors can trigger non-linear changes in ecosystem structure and function that cannot be predicted from effects on individual organisms. From the ecological sciences, we know that important interactive dynamics related to such emergent properties determine the flows of goods and services in ecological systems that human societies rely upon. A previous Task Group of the IUR (International Union of Radioecology) has presented the rationale for adding an Ecosystem Approach to the suite of tools available to manage radiation safety. In this paper, we summarize the arguments for an Ecosystem Approach and identify next steps and challenges ahead pertaining to developing and implementing a practical Ecosystem Approach to complement organism-level endpoints currently used in radiation safety. - Highlights: • An Ecosystem Approach to radiation safety complements the organism-level approach. • Emergent properties in ecosystems are not captured by organism-level endpoints. • The proposed Ecosystem Approach better aligns with management goals. • Practical guidance with respect to system-level endpoints is needed. • Guidance on computational model selection would benefit an Ecosystem Approach

  4. Impact of confinement housing on study end-points in the calf model of cryptosporidiosis.

    Science.gov (United States)

    Graef, Geneva; Hurst, Natalie J; Kidder, Lance; Sy, Tracy L; Goodman, Laura B; Preston, Whitney D; Arnold, Samuel L M; Zambriski, Jennifer A

    2018-04-01

    Diarrhea is the second leading cause of death in children. In the calf model of cryptosporidiosis, fecal output can be gathered by Complete Fecal Collection (CFC), which requires confinement housing, or by Interval Collection (IC), which permits use of box stalls. CFC mimics human challenge model methodology, but it is unknown whether confinement housing impacts study end-points and whether data gathered via this method are suitable for generalization to human populations. Using a modified crossover study design, we compared CFC and IC and evaluated the impact of housing on study end-points. At birth, calves were randomly assigned to confinement (n = 14) or box stall housing (n = 9), were challenged with 5 × 10⁷ C. parvum oocysts, and were followed for 10 days. Study end-points included fecal oocyst shedding, severity of diarrhea, degree of dehydration, and plasma cortisol. Calves in confinement had no significant differences in mean log oocysts enumerated per gram of fecal dry matter between CFC and IC samples (P = 0.6), nor were there diurnal variations in oocyst shedding (P = 0.1). Confinement-housed calves shed significantly more oocysts (P = 0.05), had higher plasma cortisol (P = 0.001), and required more supportive care (P = 0.0009) than calves in box stalls. Housing method confounds study end-points in the calf model of cryptosporidiosis. Due to increased stress, data collected from calves in confinement housing may not accurately estimate the efficacy of chemotherapeutics targeting C. parvum.

  5. Deep vadose zone remediation: technical and policy challenges, opportunities, and progress in achieving cleanup endpoints

    International Nuclear Information System (INIS)

    Wellman, D.M.; Freshley, M.D.; Truex, M.J.; Lee, M.H.

    2013-01-01

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable the establishment of a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches. (authors)

  6. Alternate Endpoints for Deep Vadose Zone Environments: Challenges, Opportunities, and Progress - 13036

    International Nuclear Information System (INIS)

    Wellman, Dawn M.; Freshley, Mark D.; Truex, Michael J.; Lee, M. Hope

    2013-01-01

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical to achieve. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable establishing a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches. (authors)

  7. Alternate Endpoints for Deep Vadose Zone Environments: Challenges, Opportunities, and Progress - 13036

    Energy Technology Data Exchange (ETDEWEB)

    Wellman, Dawn M.; Freshley, Mark D.; Truex, Michael J.; Lee, M. Hope [Pacific Northwest National Laboratory, 902 Battelle Blvd, Richland, WA, 99352 (United States)

    2013-07-01

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical to achieve. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable establishing a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches. (authors)

  8. Deep vadose zone remediation: technical and policy challenges, opportunities, and progress in achieving cleanup endpoints

    Energy Technology Data Exchange (ETDEWEB)

    Wellman, D.M.; Freshley, M.D.; Truex, M.J.; Lee, M.H. [Pacific Northwest National Laboratory, Richland, Washington (United States)

    2013-07-01

    Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable the establishment of a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches. (authors)

  9. Systematic review and consensus definitions for the Standardised Endpoints in Perioperative Medicine (StEP) initiative

    DEFF Research Database (Denmark)

    Myles, P S; Boney, O; Botti, M

    2018-01-01

    The Standardised Endpoints in Perioperative Medicine (StEP) initiative was established to derive a set of standardised endpoints for use in perioperative clinical trials. METHODS: We undertook a systematic review to identify measures of patient comfort used in the anaesthetic, surgical, and other perioperative literature. A multi-round Delphi consensus...

  10. End-point construction and systematic titration error in linear titration curves-complexation reactions

    NARCIS (Netherlands)

    Coenegracht, P.M.J.; Duisenberg, A.J.M.

    The systematic titration error which is introduced by the intersection of tangents to hyperbolic titration curves is discussed. The effects of the apparent (conditional) formation constant, of the concentration of the unknown component and of the ranges used for the end-point construction are

  11. Transmission assessment surveys (TAS) to define endpoints for lymphatic filariasis mass drug administration

    DEFF Research Database (Denmark)

    Chu, Brian K.; Deming, Michael; Biritwum, Nana-Kwadwo

    2013-01-01

    Lymphatic filariasis (LF) is targeted for global elimination through treatment of entire at-risk populations with repeated annual mass drug administration (MDA). Essential for program success is defining and confirming the appropriate endpoint for MDA when transmission is presumed to have reached...

  12. Correlates of protection for rotavirus vaccines: Possible alternative trial endpoints, opportunities, and challenges.

    Science.gov (United States)

    Angel, Juana; Steele, A Duncan; Franco, Manuel A

    2014-01-01

    Rotavirus (RV) is a major vaccine-preventable killer of young children worldwide. Two RV vaccines are globally commercially available and other vaccines are in different stages of development. Due to the absence of a suitable correlate of protection (CoP), all RV vaccine efficacy trials have had clinical endpoints. These trials represent an important challenge since RV vaccines have to be introduced in many different settings, placebo-controlled studies are unethical due to the availability of licensed vaccines, and comparator assessments for new vaccines with clinical endpoints are very large, complex, and expensive to conduct. A CoP as a surrogate endpoint would allow predictions of vaccine efficacy for new RV vaccines and enable a regulatory pathway, contributing to the more rapid development of new RV vaccines. The goal of this review is to summarize experiences from RV natural infection and vaccine studies to evaluate potential CoP for use as surrogate endpoints for assessment of new RV vaccines, and to explore challenges and opportunities in the field.

  13. Population modelling to compare chronic external radiotoxicity between individual and population endpoints in four taxonomic groups.

    Science.gov (United States)

    Alonzo, Frédéric; Hertel-Aas, Turid; Real, Almudena; Lance, Emilie; Garcia-Sanchez, Laurent; Bradshaw, Clare; Vives I Batlle, Jordi; Oughton, Deborah H; Garnier-Laplace, Jacqueline

    2016-02-01

    In this study, we modelled population responses to chronic external gamma radiation in 12 laboratory species (including aquatic and soil invertebrates, fish and terrestrial mammals). Our aim was to compare radiosensitivity between individual and population endpoints and to examine how internationally proposed benchmarks for environmental radioprotection protected species against various risks at the population level. To do so, we used population matrix models, combining life history and chronic radiotoxicity data (derived from laboratory experiments and described in the literature and the FREDERICA database) to simulate changes in population endpoints (net reproductive rate R0, asymptotic population growth rate λ, equilibrium population size Neq) for a range of dose rates. Elasticity analyses of models showed that population responses differed depending on the affected individual endpoint (juvenile or adult survival, delay in maturity or reduction in fecundity), the considered population endpoint (R0, λ or Neq) and the life history of the studied species. Among population endpoints, net reproductive rate R0 showed the lowest EDR10 (effective dose rate inducing 10% effect) in all species, with values ranging from 26 μGy h(-1) in the mouse Mus musculus to 38,000 μGy h(-1) in the fish Oryzias latipes. For several species, EDR10 for population endpoints were lower than the lowest EDR10 for individual endpoints. Various population level risks, differing in severity for the population, were investigated. Population extinction (predicted when radiation effects caused population growth rate λ to decrease below 1, indicating that no population growth in the long term) was predicted for dose rates ranging from 2700 μGy h(-1) in fish to 12,000 μGy h(-1) in soil invertebrates. A milder risk, that population growth rate λ will be reduced by 10% of the reduction causing extinction, was predicted for dose rates ranging from 24 μGy h(-1) in mammals to 1800 μGy h(-1) in

  14. Population modelling to compare chronic external radiotoxicity between individual and population endpoints in four taxonomic groups

    International Nuclear Information System (INIS)

    Alonzo, Frédéric; Hertel-Aas, Turid; Real, Almudena; Lance, Emilie; Garcia-Sanchez, Laurent; Bradshaw, Clare; Vives i Batlle, Jordi; Oughton, Deborah H.; Garnier-Laplace, Jacqueline

    2016-01-01

    In this study, we modelled population responses to chronic external gamma radiation in 12 laboratory species (including aquatic and soil invertebrates, fish and terrestrial mammals). Our aim was to compare radiosensitivity between individual and population endpoints and to examine how internationally proposed benchmarks for environmental radioprotection protected species against various risks at the population level. To do so, we used population matrix models, combining life history and chronic radiotoxicity data (derived from laboratory experiments and described in the literature and the FREDERICA database) to simulate changes in population endpoints (net reproductive rate R0, asymptotic population growth rate λ, equilibrium population size Neq) for a range of dose rates. Elasticity analyses of models showed that population responses differed depending on the affected individual endpoint (juvenile or adult survival, delay in maturity or reduction in fecundity), the considered population endpoint (R0, λ or Neq) and the life history of the studied species. Among population endpoints, net reproductive rate R0 showed the lowest EDR10 (effective dose rate inducing 10% effect) in all species, with values ranging from 26 μGy h(-1) in the mouse Mus musculus to 38,000 μGy h(-1) in the fish Oryzias latipes. For several species, EDR10 for population endpoints were lower than the lowest EDR10 for individual endpoints. Various population level risks, differing in severity for the population, were investigated. Population extinction (predicted when radiation effects caused population growth rate λ to decrease below 1, indicating that no population growth in the long term) was predicted for dose rates ranging from 2700 μGy h(-1) in fish to 12,000 μGy h(-1) in soil invertebrates. A milder risk, that population growth rate λ will be reduced by 10% of the reduction causing extinction, was predicted for dose rates ranging from 24 μGy h(-1)
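
    The population-endpoint machinery described in the two records above lends itself to a compact numerical illustration. The sketch below is a rough approximation, not the authors' FREDERICA-based models: it builds a small Leslie matrix, derives λ as its dominant eigenvalue and R0 from cumulative survival, and scales the vital rates by a hypothetical dose-rate effect function. Every rate and parameter is an invented placeholder.

```python
# Minimal sketch: population endpoints lambda and R0 from a Leslie matrix
# whose vital rates are scaled by a hypothetical dose-rate effect.
import numpy as np

def leslie(fecundity, survival):
    """Assemble a Leslie matrix from age-specific fecundities and survivals."""
    n = len(fecundity)
    L = np.zeros((n, n))
    L[0, :] = fecundity
    for i, s in enumerate(survival):          # survival has n-1 entries
        L[i + 1, i] = s
    return L

def lambda_and_r0(fecundity, survival):
    L = leslie(fecundity, survival)
    lam = max(np.real(np.linalg.eigvals(L)))  # asymptotic growth rate
    # R0 = expected lifetime offspring = sum_x f(x) * survival-to-age-x
    cum_surv = np.concatenate(([1.0], np.cumprod(survival)))
    r0 = float(np.sum(np.asarray(fecundity) * cum_surv))
    return lam, r0

def effect(dose_rate, half_effect=1000.0):
    """Hypothetical concentration-response: fraction of control vital rate."""
    return 1.0 / (1.0 + dose_rate / half_effect)

# Illustrative 3-age-class life history (all numbers are assumptions)
f0 = np.array([0.0, 2.0, 4.0])
s0 = np.array([0.5, 0.8])

for dr in (0.0, 100.0, 1000.0):               # dose rates, e.g. in uGy/h
    lam, r0 = lambda_and_r0(f0 * effect(dr), s0 * effect(dr))
    print(f"dose rate {dr:7.1f}: lambda = {lam:.3f}, R0 = {r0:.3f}")
```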

  15. The Impact of Chemoembolization Endpoints on Survival in Hepatocellular Carcinoma Patients

    Science.gov (United States)

    Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Riaz, Ahsun; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.

    2010-01-01

    OBJECTIVE To investigate the relationship between angiographic embolic endpoints of transarterial chemoembolization (TACE) and survival in patients with hepatocellular carcinoma (HCC). MATERIALS AND METHODS This study retrospectively assessed 105 patients with surgically unresectable HCC who underwent TACE. Patients were classified according to a previously established subjective angiographic chemoembolization endpoint (SACE) scale. Only one patient was classified as SACE level 1 and thus excluded from all subsequent analysis. Survival was evaluated with Kaplan-Meier analysis. Multivariate analysis with the Cox proportional hazards regression model was used to determine independent prognostic risk factors of survival. RESULTS Overall median survival was 21.1 months (95% confidence interval [CI], 15.9–26.4). Patients embolized to SACE levels 2 and 3 were aggregated and had a significantly higher median survival (25.6 months; 95% CI, 16.2–35.0) than patients embolized to SACE level 4 (17.1 months; 95% CI, 13.3–20.9) (p = 0.035). Multivariate analysis indicated that SACE level 4 (Hazard ratio [HR], 2.49; 95% CI, 1.41–4.42; p = 0.002), Eastern Cooperative Oncology Group performance status > 0 (HR, 1.97; 95% CI, 1.15–3.37; p = 0.013), American Joint Committee on Cancer stage 3 or 4 (HR, 2.42; 95% CI, 1.27–4.60; p = 0.007), and Child-Pugh class B (HR, 1.94; 95% CI, 1.09–3.46; p = 0.025) were all independent negative prognostic indicators of survival. CONCLUSION Embolization to an intermediate, sub-stasis endpoint (SACE levels 2 and 3) during TACE improves survival compared to embolization to a higher, stasis endpoint (SACE level 4). Interventional oncologists should consider targeting these intermediate, sub-stasis angiographic endpoints during TACE. PMID:21427346
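
    As a sketch of the survival analyses named in this record, the snippet below runs a Kaplan-Meier fit per SACE group and a Cox proportional hazards model. The `lifelines` library is assumed available, and the toy records are invented, not the study's data.

```python
# Minimal sketch: Kaplan-Meier by group plus a Cox PH model with lifelines.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({                       # invented toy records
    "months": [25.6, 17.1, 30.2, 12.4, 21.1, 9.8, 28.0, 15.5, 19.9, 33.1],
    "died":   [1, 1, 0, 1, 1, 1, 1, 1, 1, 0],
    "sace4":  [0, 1, 0, 1, 0, 1, 0, 1, 1, 0],  # embolized to stasis (SACE 4)?
})

km = KaplanMeierFitter()
for level, grp in df.groupby("sace4"):
    km.fit(grp["months"], grp["died"], label=f"SACE4={level}")
    print(f"SACE4={level}: median survival {km.median_survival_time_} months")

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")  # hazard ratio for sace4
cph.print_summary()
```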

  16. Predicting location-specific extreme coastal floods in the future climate by introducing a probabilistic method to calculate maximum elevation of the continuous water mass caused by a combination of water level variations and wind waves

    Science.gov (United States)

    Leijala, Ulpu; Björkqvist, Jan-Victor; Johansson, Milla M.; Pellikka, Havu

    2017-04-01

    Future coastal management continuously strives for more location-exact and precise methods to investigate possible extreme sea level events and to face flooding hazards in the most appropriate way. Evaluating future flooding risks by understanding the behaviour of the joint effect of sea level variations and wind waves is one of the means to make more comprehensive flooding hazard analysis, and may at first seem like a straightforward task to solve. Nevertheless, challenges and limitations such as availability of time series of the sea level and wave height components, the quality of data, significant locational variability of coastal wave height, as well as assumptions to be made depending on the study location, make the task more complicated. In this study, we present a statistical method for combining location-specific probability distributions of water level variations (including local sea level observations and global mean sea level rise) and wave run-up (based on wave buoy measurements). The goal of our method is to obtain a more accurate way to account for the waves when making flooding hazard analysis on the coast compared to the approach of adding a separate fixed wave action height on top of sea level -based flood risk estimates. As a result of our new method, we gain maximum elevation heights with different return periods of the continuous water mass caused by a combination of both phenomena, "the green water". We also introduce a sensitivity analysis to evaluate the properties and functioning of our method. The sensitivity test is based on using theoretical wave distributions representing different alternatives of wave behaviour in relation to sea level variations. As these wave distributions are merged with the sea level distribution, we get information on how the different wave height conditions and shape of the wave height distribution influence the joint results. Our method presented here can be used as an advanced tool to minimize over- and
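
    The joint sea-level/wave idea above can be illustrated numerically: if the two contributions are treated as independent, the density of their sum ("green water" elevation) is the convolution of the two densities, from which exceedance-based levels follow. The distributions and parameters below are illustrative stand-ins, not the observation-based ones used in the study.

```python
# Minimal sketch: convolve two elevation densities and read off quantiles.
import numpy as np

dz = 0.01                                  # elevation grid step (m)
z = np.arange(0, 6, dz)

# Hypothetical densities: sea level ~ Gaussian-like, wave run-up ~ exponential
sea = np.exp(-(z - 1.0) ** 2 / 0.18); sea /= sea.sum() * dz
runup = np.exp(-z / 0.4);             runup /= runup.sum() * dz

total = np.convolve(sea, runup) * dz       # density of the sum, same step dz
z_tot = np.arange(len(total)) * dz

cdf = np.cumsum(total) * dz
for p in (0.99, 0.999):                    # exceedance-based "return levels"
    print(f"P(Z <= z) = {p}: z = {z_tot[np.searchsorted(cdf, p)]:.2f} m")
```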

  17. Characterisation of Maillard reaction products derived from LEKFD--a pentapeptide found in β-lactoglobulin sequence, glycated with glucose--by tandem mass spectrometry, molecular orbital calculations and gel filtration chromatography coupled with continuous photodiode array.

    Science.gov (United States)

    Yamaguchi, Keiko; Homma, Takeshi; Nomi, Yuri; Otsuka, Yuzuru

    2014-02-15

    Maillard reaction peptides (MRPs) contribute to taste, aroma, colour, texture and biological activity. However, peptide degradation or the cross-linking of MRPs in the Maillard reaction has not been investigated clearly. A peptide of LEKFD, a part of β-lactoglobulin, was heated at 110 °C for 24 h with glucose and the reaction products were analysed by HPLC with ODS, ESI-MS, ESI-MS/MS and HPLC with gel-filtration column and DAD detector. In the HPLC fractions, an iminium ion of LEK*FD, a pyrylium ion or a hydroxymethyl furylium ion of LEK*FD, and KFD and EK were detected by ESI-MS. Therefore, those products may be produced by the Maillard reaction. The molecular orbital of glycated LEKFD at the lysine epsilon-amino residue with Schiff base form was calculated by MOPAC. HPLC with gel-filtration column showed cross-linking and degradation of peptides.

  18. Guidelines for time-to-event end-point definitions in trials for pancreatic cancer. Results of the DATECAN initiative (Definition for the Assessment of Time-to-event End-points in CANcer trials)

    NARCIS (Netherlands)

    Bonnetain, Franck; Bonsing, Bert; Conroy, Thierry; Dousseau, Adelaide; Glimelius, Bengt; Haustermans, Karin; Lacaine, François; van Laethem, Jean Luc; Aparicio, Thomas; Aust, Daniela; Bassi, Claudio; Berger, Virginie; Chamorey, Emmanuel; Chibaudel, Benoist; Dahan, Laeticia; de Gramont, Aimery; Delpero, Jean Robert; Dervenis, Christos; Ducreux, Michel; Gal, Jocelyn; Gerber, Erich; Ghaneh, Paula; Hammel, Pascal; Hendlisz, Alain; Jooste, Valérie; Labianca, Roberto; Latouche, Aurelien; Lutz, Manfred; Macarulla, Teresa; Malka, David; Mauer, Muriel; Mitry, Emmanuel; Neoptolemos, John; Pessaux, Patrick; Sauvanet, Alain; Tabernero, Josep; Taieb, Julien; van Tienhoven, Geertjan; Gourgou-Bourgade, Sophie; Bellera, Carine; Mathoulin-Pélissier, Simone; Collette, Laurence

    2014-01-01

    Using potential surrogate end-points for overall survival (OS) such as Disease-Free- (DFS) or Progression-Free Survival (PFS) is increasingly common in randomised controlled trials (RCTs). However, end-points are too often imprecisely defined which largely contributes to a lack of homogeneity across

  19. Adaptive endpoint detection of seismic signal based on auto-correlated function

    International Nuclear Information System (INIS)

    Fan Wanchun; Shi Ren

    2000-01-01

    Endpoint detection by time-waveform envelope and/or by checking the travel table (both labelled here as artificial detection methods) has certain shortcomings. Based on an analysis of the auto-correlation function, the notion of a distance between auto-correlation functions was introduced, and the characteristics of noise and of signal-plus-noise were described using this distance. An adaptive method for endpoint detection of seismic signals based on auto-correlation similarity was then formulated. The steps of implementation and the determination of thresholds are presented in detail. Experimental results, compared against the artificial detection methods, show that this method has higher sensitivity even at a low SNR.
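
    A minimal sketch of this idea follows; the window length, lag count, threshold, and the Euclidean distance between normalized auto-correlation functions are all illustrative assumptions, since the abstract does not give the paper's actual parameters.

```python
# Minimal sketch: flag the onset where a sliding window's ACF departs from a
# noise-only reference ACF by more than a threshold distance.
import numpy as np

def acf(x, nlags=32):
    x = x - x.mean()
    c = np.correlate(x, x, mode="full")[len(x) - 1:len(x) - 1 + nlags]
    return c / c[0]                        # normalized auto-correlation

def detect_onset(signal, noise_ref, win=128, nlags=32, thresh=1.5):
    ref = acf(noise_ref, nlags)
    for start in range(len(signal) - win):
        d = np.linalg.norm(acf(signal[start:start + win], nlags) - ref)
        if d > thresh:
            return start                   # first window that departs from noise
    return None

rng = np.random.default_rng(0)
noise = rng.normal(size=2000)
sig = noise.copy()
t = np.arange(800)
sig[1000:1800] += np.sin(0.2 * t) * np.exp(-t / 400)  # arriving seismic phase
print("onset near sample:", detect_onset(sig, noise[:500]))
```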

  20. Muscle Synergies Heavily Influence the Neural Control of Arm Endpoint Stiffness and Energy Consumption.

    Science.gov (United States)

    Inouye, Joshua M; Valero-Cuevas, Francisco J

    2016-02-01

    Much debate has arisen from research on muscle synergies with respect to both limb impedance control and energy consumption. Studies of limb impedance control in the context of reaching movements and postural tasks have produced divergent findings, and this study explores whether the use of synergies by the central nervous system (CNS) can resolve these findings and also provide insights on mechanisms of energy consumption. In this study, we phrase these debates at the conceptual level of interactions between neural degrees of freedom and tasks constraints. This allows us to examine the ability of experimentally-observed synergies--correlated muscle activations--to control both energy consumption and the stiffness component of limb endpoint impedance. In our nominal 6-muscle planar arm model, muscle synergies and the desired size, shape, and orientation of endpoint stiffness ellipses, are expressed as linear constraints that define the set of feasible muscle activation patterns. Quadratic programming allows us to predict whether and how energy consumption can be minimized throughout the workspace of the limb given those linear constraints. We show that the presence of synergies drastically decreases the ability of the CNS to vary the properties of the endpoint stiffness and can even preclude the ability to minimize energy. Furthermore, the capacity to minimize energy consumption--when available--can be greatly affected by arm posture. Our computational approach helps reconcile divergent findings and conclusions about task-specific regulation of endpoint stiffness and energy consumption in the context of synergies. But more generally, these results provide further evidence that the benefits and disadvantages of muscle synergies go hand-in-hand with the structure of feasible muscle activation patterns afforded by the mechanics of the limb and task constraints. These insights will help design experiments to elucidate the interplay between synergies and the mechanisms
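
    The constrained-optimization framing in this abstract can be sketched as a small quadratic program: activations are restricted to the span of synergy vectors, task requirements enter as linear equality constraints, and a quadratic proxy for energy is minimized. All matrices below are random stand-ins for the paper's 6-muscle planar model; with too few synergies the equality constraints can become infeasible, which mirrors the paper's point that synergies limit endpoint control.

```python
# Minimal sketch: energy minimization over synergy coefficients c, where
# activations a = W c must satisfy linear task constraints G a = f.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
G = rng.normal(size=(3, 6))        # linear task map (e.g. stiffness targets)
f = np.array([0.5, -0.2, 0.3])
W = rng.uniform(size=(6, 4))       # 4 synergy vectors for 6 muscles

energy = lambda c: float((W @ c) @ (W @ c))          # quadratic energy proxy
cons = {"type": "eq", "fun": lambda c: G @ (W @ c) - f}
res = minimize(energy, x0=np.zeros(4), constraints=cons, method="SLSQP")

print("feasible:", res.success)
print("activations a = W c:", np.round(W @ res.x, 3))
```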

  1. Comparison of mammalian and fish cell line cytotoxicity: impact of endpoint and exposure duration

    International Nuclear Information System (INIS)

    Guelden, Michael; Moerchel, Sabine; Seibert, Hasso

    2005-01-01

    Comparisons of acute toxic concentrations of chemicals to fish in vivo and cytotoxic concentrations to fish cell lines in vitro reveal rather good correlations of the toxic potencies in vitro and in vivo, but a clearly lower sensitivity of the fish cells. To examine whether the low sensitivity is specific for fish cells, cytotoxic potencies of reference chemicals from the Multicenter Evaluation of In Vitro Cytotoxicity program (MEIC) reported for the fish cell lines R1 and RTG-2 were compared with those obtained with the mouse Balb/c 3T3 cell line. Cytotoxic potencies (EC50 values) for MEIC reference chemicals were determined with exponentially growing Balb/c 3T3 cells using three different test protocols. To assess both endpoints, cell proliferation and cell survival, EC50 values were measured for the decrease in final cell protein after 24 and 72 h of exposure and for the reduction of cell protein increase during 24 h of exposure. EC50 values obtained with the fish cell lines R1 and RTG-2 using cell survival as endpoint were taken from the MEIC database. The comparison of cytotoxic potencies shows that, in general, the fish cell lines and the mammalian cell line are almost equally sensitive towards the cytotoxic action of chemicals. The mammalian cell line assay, however, becomes considerably more sensitive, by factors of 3.4-8.5, than the fish cell line assays, if cell growth instead of cell survival is used as endpoint. It is concluded that cell proliferation might be a better endpoint than cell survival and that mammalian cell lines might be suited to assess fish acute toxicity
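
    EC50 values of the kind compared in this record are typically obtained by fitting a sigmoidal concentration-response model. Below is a minimal sketch with a four-parameter logistic (Hill) curve on invented data points, not values from the MEIC database.

```python
# Minimal sketch: derive an EC50 by fitting a 4-parameter logistic curve.
import numpy as np
from scipy.optimize import curve_fit

def hill(c, top, bottom, ec50, slope):
    return bottom + (top - bottom) / (1.0 + (c / ec50) ** slope)

conc = np.array([0.01, 0.1, 1, 10, 100, 1000])           # mM, illustrative
resp = np.array([98, 95, 80, 45, 15, 5], dtype=float)    # % of control protein

p0 = [100, 0, 5, 1]                                      # rough initial guess
(top, bottom, ec50, slope), _ = curve_fit(hill, conc, resp, p0=p0)
print(f"EC50 = {ec50:.2f} mM (slope {slope:.2f})")
```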

  2. Energy metabolism and biotransformation as endpoints to pre-screen hepatotoxicity using a liver spheroid model

    International Nuclear Information System (INIS)

    Xu Jinsheng; Purcell, Wendy M.

    2006-01-01

    The current study investigated liver spheroid culture as an in vitro model to evaluate the endpoints relevant to the status of energy metabolism and biotransformation after exposure to test toxicants. Mature rat liver spheroids were exposed to diclofenac, galactosamine, isoniazid, paracetamol, m-dinitrobenzene (m-DNB) and 3-nitroaniline (3-NA) for 24 h. Pyruvate uptake, galactose biotransformation, lactate release and glucose secretion were evaluated after exposure. The results showed that pyruvate uptake and lactate release by mature liver spheroids in culture were maintained at a relatively stable level. These endpoints, together with glucose secretion and galactose biotransformation, were related to and could reflect the status of energy metabolism and biotransformation in hepatocytes. After exposure, all of the test agents significantly reduced glucose secretion, which was shown to be the most sensitive endpoint of those evaluated. Diclofenac, isoniazid, paracetamol and galactosamine reduced lactate release (P < 0.01), but m-DNB increased lactate release (P < 0.01). Diclofenac, isoniazid and paracetamol also reduced pyruvate uptake (P < 0.01), while galactosamine had little discernible effect. Diclofenac, galactosamine, paracetamol and m-DNB also reduced galactose biotransformation (P < 0.01), by contrast, isoniazid did not. The metabolite of m-DNB, 3-NA, which served as a negative control, did not cause significant changes in lactate release, pyruvate uptake or galactose biotransformation. It is concluded that pyruvate uptake, galactose biotransformation, lactate release and glucose secretion can be used as endpoints for evaluating the status of energy metabolism and biotransformation after exposure to test agents using the liver spheroid model to pre-screen hepatotoxicity

  3. Kinetic titration with differential thermometric determination of the end-point.

    Science.gov (United States)

    Sajó, I

    1968-06-01

    A method has been described for the determination of concentrations below 10^(-4) M by applying catalytic reactions and using thermometric end-point determination. A reference solution, identical with the sample solution except for catalyst, is titrated with catalyst solution until the rates of reaction become the same, as shown by a null deflection on a galvanometer connected via bridge circuits to two opposed thermistors placed in the solutions.

  4. A New Test Unit for Disintegration End-Point Determination of Orodispersible Films.

    Science.gov (United States)

    Low, Ariana; Kok, Si Ling; Khong, Yuet Mei; Chan, Sui Yung; Gokhale, Rajeev

    2015-11-01

    No standard time or pharmacopoeia disintegration test method for orodispersible films (ODFs) exists. The USP disintegration test for tablets and capsules poses significant challenges for end-point determination when used for ODFs. We tested a newly developed disintegration test unit (DTU) against the USP disintegration test. The DTU is an accessory to the USP disintegration apparatus. It holds the ODF in a horizontal position, allowing top-view of the ODF during testing. A Gauge R&R study was conducted to assign relative contributions of the total variability from the operator, sample or the experimental set-up. Precision was compared using commercial ODF products in different media. Agreement between the two measurement methods was analysed. The DTU showed improved repeatability and reproducibility compared to the USP disintegration system with tighter standard deviations regardless of operator or medium. There is good agreement between the two methods, with the USP disintegration test giving generally longer disintegration times possibly due to difficulty in end-point determination. The DTU provided clear end-point determination and is suitable for quality control of ODFs during product developmental stage or manufacturing. This may facilitate the development of a standardized methodology for disintegration time determination of ODFs.

  5. End-point impedance measurements across dominant and nondominant hands and robotic assistance with directional damping.

    Science.gov (United States)

    Erden, Mustafa Suphi; Billard, Aude

    2015-06-01

    The goal of this paper is to perform end-point impedance measurements across dominant and nondominant hands while doing airbrush painting and to use the results for developing a robotic assistance scheme. We study airbrush painting because it resembles in many ways manual welding, a standard industrial task. The experiments are performed with the 7-degree-of-freedom KUKA lightweight robot arm. The robot is controlled in admittance using a force sensor attached at the end-point, so as to act as a free mass and be passively guided by the human. For impedance measurements, a set of nine subjects perform 12 repetitions of airbrush painting, drawing a straight line on a cartoon horizontally placed on a table, while passively moving the airbrush mounted on the robot's end-point. We measure hand impedance during the painting task by generating sudden and brief external forces with the robot. The results show that on average the dominant hand displays larger impedance than the nondominant in the directions perpendicular to the painting line. We find the most significant difference in the damping values in these directions. Based on this observation, we develop a "directional damping" scheme for robotic assistance and conduct a pilot study with 12 subjects to contrast airbrush painting with and without robotic assistance. Results show significant improvement in precision with both dominant and nondominant hands when using robotic assistance.
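
    Stiffness and damping values of the kind discussed above are commonly estimated by regressing measured force on displacement and velocity around a perturbation. The sketch below does this on synthetic data; the model F = Kx + Bv and all numbers are illustrative assumptions, not the study's estimation procedure.

```python
# Minimal sketch: least-squares fit of end-point stiffness K and damping B
# from (displacement, velocity, force) samples around a brief perturbation.
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 0.3, 300)
x = 0.01 * (1 - np.cos(20 * t))           # perturbation displacement (m)
v = np.gradient(x, t)
K_true, B_true = 400.0, 15.0              # N/m, N s/m (illustrative)
F = K_true * x + B_true * v + 0.2 * rng.normal(size=t.size)

A = np.column_stack([x, v])
(K_hat, B_hat), *_ = np.linalg.lstsq(A, F, rcond=None)
print(f"K = {K_hat:.0f} N/m, B = {B_hat:.1f} N s/m")
```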

  6. Using quantitative structure-activity relationships (QSAR) to predict toxic endpoints for polycyclic aromatic hydrocarbons (PAH).

    Science.gov (United States)

    Bruce, Erica D; Autenrieth, Robin L; Burghardt, Robert C; Donnelly, K C; McDonald, Thomas J

    2008-01-01

    Quantitative structure-activity relationships (QSAR) offer a reliable, cost-effective alternative to the time, money, and animal lives necessary to determine chemical toxicity by traditional methods. Additionally, humans are exposed to tens of thousands of chemicals in their lifetimes, necessitating the determination of chemical toxicity and screening for those posing the greatest risk to human health. This study developed models to predict toxic endpoints for three bioassays specific to several stages of carcinogenesis. The ethoxyresorufin O-deethylase assay (EROD), the Salmonella/microsome assay, and a gap junction intercellular communication (GJIC) assay were chosen for their ability to measure toxic endpoints specific to activation-, induction-, and promotion-related effects of polycyclic aromatic hydrocarbons (PAH). Shape-electronic, spatial, information content, and topological descriptors proved to be important descriptors in predicting the toxicity of PAH in these bioassays. Bioassay-based toxic equivalency factors (TEF(B)) were developed for several PAH using the quantitative structure-toxicity relationships (QSTR) developed. Predicting toxicity for a specific PAH compound, such as a bioassay-based potential potency (PP(B)) or a TEF(B), is possible by combining the predicted behavior from the QSTR models. These toxicity estimates may then be incorporated into a risk assessment for compounds that lack toxicity data. Accurate toxicity predictions are made by examining each type of endpoint important to the process of carcinogenicity, and a clearer understanding between composition and toxicity can be obtained.
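
    A QSAR/QSTR model of the kind described reduces, at its simplest, to regressing a bioassay endpoint on molecular descriptors and predicting the endpoint for untested compounds. The sketch below uses random placeholder descriptors and endpoints, not the study's PAH data or descriptor set.

```python
# Minimal sketch: linear QSAR-style regression with cross-validated fit.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(40, 5))       # stand-ins for shape-electronic, spatial,
                                   # information-content, topological descriptors
y = X @ np.array([0.8, -0.3, 0.5, 0.0, 0.2]) + 0.1 * rng.normal(size=40)

model = LinearRegression().fit(X, y)
print("5-fold CV R^2:", cross_val_score(model, X, y, cv=5).mean().round(3))
print("predicted endpoint for a new compound:",
      model.predict(rng.normal(size=(1, 5))).round(3))
```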

  7. Reduction of animal suffering in rabies vaccine potency testing by introduction of humane endpoints.

    Science.gov (United States)

    Takayama-Ito, Mutsuyo; Lim, Chang-Kweng; Nakamichi, Kazuo; Kakiuchi, Satsuki; Horiya, Madoka; Posadas-Herrera, Guillermo; Kurane, Ichiro; Saijo, Masayuki

    2017-03-01

    Potency controls of inactivated rabies vaccines for human use are confirmed by the National Institutes of Health challenge test in which lethal infection with severe neurological symptoms should be observed in approximately half of the mice inoculated with the rabies virus. Weight loss, decreased body temperature, and the presence of rabies-associated neurological signs have been proposed as humane endpoints. The potential for reduction of animal suffering by introducing humane endpoints in the potency test for inactivated rabies vaccine for human use was investigated. The clinical signs were scored and body weight was monitored. The average times to death following inoculation were 10.49 and 10.99 days post-inoculation (dpi) by the potency and challenge control tests, respectively, whereas the average times to showing Score-2 signs (paralysis, trembling, and coma) were 6.26 and 6.55 dpi, respectively. Body weight loss of more than 15% appeared at 5.82 and 6.42 dpi. The data provided here support the introduction of obvious neuronal signs combined with a body weight loss of ≥15% as a humane endpoint to reduce the time of animal suffering by approximately 4 days. Copyright © 2017 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  8. Monte Carlo alpha calculation

    Energy Technology Data Exchange (ETDEWEB)

    Brockway, D.; Soran, P.; Whalen, P.

    1985-01-01

    A Monte Carlo algorithm to efficiently calculate static alpha eigenvalues, N = n e^(αt), for supercritical systems has been developed and tested. A direct Monte Carlo approach to calculating a static alpha is to simply follow the buildup in time of neutrons in a supercritical system and evaluate the logarithmic derivative of the neutron population with respect to time. This procedure is expensive, and the solution is very noisy and almost useless for a system near critical. The modified approach is to convert the time-dependent problem to a static α-eigenvalue problem and regress α on solutions of a k-eigenvalue problem. In practice, this procedure is much more efficient than the direct calculation, and produces much more accurate results. Because the Monte Carlo codes are intrinsically three-dimensional and use elaborate continuous-energy cross sections, this technique is now used as a standard for evaluating other calculational techniques in odd geometries or with group cross sections.
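
    The two routes to alpha contrasted in this abstract can be illustrated with a toy point model (this is not the Los Alamos algorithm): follow a stochastic neutron population generation by generation and take the logarithmic derivative, or use the static relation α = (k − 1)/Λ for an assumed generation time Λ.

```python
# Minimal toy illustration of the two alpha estimates; k and Lambda invented.
import numpy as np

rng = np.random.default_rng(2)
k, Lambda = 1.02, 1.0e-4                   # multiplication factor, gen. time (s)

# (1) direct: stochastic generation-by-generation population growth
n, pops, times, t = 10_000, [], [], 0.0
for _ in range(60):
    n = rng.poisson(n * k)                 # each generation multiplies by ~k
    t += Lambda
    pops.append(n)
    times.append(t)
alpha_direct = np.polyfit(times, np.log(pops), 1)[0]  # log-derivative fit

# (2) static route via the k-eigenvalue
alpha_static = (k - 1.0) / Lambda

print(f"direct log-derivative: {alpha_direct:9.1f} 1/s")
print(f"(k-1)/Lambda:          {alpha_static:9.1f} 1/s")
```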

  9. Deep penetration calculations

    International Nuclear Information System (INIS)

    Thompson, W.L.; Deutsch, O.L.; Booth, T.E.

    1980-04-01

    Several Monte Carlo techniques are compared in the transport of neutrons of different source energies through two different deep-penetration problems, each with two parts. The first problem involves transmission through a 200-cm concrete slab. The second problem is a 90° bent pipe jacketed by concrete. In one case the pipe is void, and in the other it is filled with liquid sodium. Calculations are made with two different Los Alamos Monte Carlo codes: the continuous-energy code MCNP and the multigroup code MCMG

  10. Comparison between amperometric and true potentiometric end-point detection in the determination of water by the Karl Fischer method.

    Science.gov (United States)

    Cedergren, A

    1974-06-01

    A rapid and sensitive method using true potentiometric end-point detection has been developed and compared with the conventional amperometric method for Karl Fischer determination of water. The effect of the sulphur dioxide concentration on the shape of the titration curve is shown. By using kinetic data it was possible to calculate the course of titrations and make comparisons with those found experimentally. The results prove that the main reaction is the slow step, both in the amperometric and the potentiometric method. Results obtained in the standardization of the Karl Fischer reagent showed that the potentiometric method, including titration to a preselected potential, gave a standard deviation of 0.001(1) mg of water per ml, the amperometric method using extrapolation 0.002(4) mg of water per ml and the amperometric titration to a pre-selected diffusion current 0.004(7) mg of water per ml. Theories and results dealing with dilution effects are presented. The time of analysis was 1-1.5 min for the potentiometric and 4-5 min for the amperometric method using extrapolation.

  11. Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking.

    Science.gov (United States)

    Maimon-Dror, Roni O; Fernandez-Quesada, Jorge; Zito, Giuseppe A; Konnaris, Charalambos; Dziemian, Sabine; Faisal, A Aldo

    2017-07-01

    Eye-movements are the only directly observable behavioural signals that are highly correlated with actions at the task level, and proactive of body movements and thus reflect action intentions. Moreover, eye movements are preserved in many movement disorders leading to paralysis (or amputees) from stroke, spinal cord injury, Parkinson's disease, multiple sclerosis, and muscular dystrophy among others. Despite this benefit, eye tracking is not widely used as control interface for robotic interfaces in movement impaired patients due to poor human-robot interfaces. We demonstrate here how combining 3D gaze tracking using our GT3D binocular eye tracker with a custom designed 3D head tracking system and calibration method enables continuous 3D end-point control of a robotic arm support system. The users can move their own hand to any location of the workspace by simply looking at the target and winking once. This purely eye tracking based system enables the end-user to retain free head movement and yet achieves high spatial end-point accuracy, on the order of 6 cm RMSE in each dimension with a standard deviation of 4 cm. 3D calibration is achieved by moving the robot along a three-dimensional space-filling Peano curve while the user is tracking it with their eyes. This results in a fully automated calibration procedure that yields several thousand calibration points versus standard approaches using a dozen points, resulting in beyond state-of-the-art 3D accuracy and precision.
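
    The calibration step described above amounts, in its simplest form, to a least-squares fit of a map from eye-tracking features to known 3D robot positions collected along the space-filling path. The sketch below fits an affine map on synthetic features and positions; the GT3D system's actual feature set and calibration model are not specified in the abstract.

```python
# Minimal sketch: affine gaze-to-3D calibration by least squares.
import numpy as np

rng = np.random.default_rng(3)
F = rng.uniform(-1, 1, size=(5000, 4))         # stand-in binocular gaze features
A_true = rng.normal(size=(4, 3))
P = F @ A_true + 0.05 * rng.normal(size=(5000, 3))  # robot 3D positions (m)

F1 = np.hstack([F, np.ones((len(F), 1))])      # add bias column
A, *_ = np.linalg.lstsq(F1, P, rcond=None)     # affine calibration map
rmse = np.sqrt(np.mean((F1 @ A - P) ** 2, axis=0))
print("per-axis RMSE (m):", np.round(rmse, 3))
```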

  12. Energetic endpoints provide early indicators of life history effects in a freshwater gastropod exposed to the fungicide, pyraclostrobin

    International Nuclear Information System (INIS)

    Fidder, Bridgette N.; Reátegui-Zirena, Evelyn G.; Olson, Adric D.; Salice, Christopher J.

    2016-01-01

    Organismal energetics provide important insights into the effects of environmental toxicants. We aimed to determine the effects of pyraclostrobin on Lymnaea stagnalis by examining energy allocation patterns and life history traits. Juvenile snails exposed to pyraclostrobin decreased feeding rate and increased apparent avoidance behaviors at environmentally relevant concentrations. In adults, we found that sublethal concentrations of pyraclostrobin did not affect reproductive output, however, there were significant effects on developmental endpoints with longer time to hatch and decreased hatching success in pyraclostrobin-exposed egg masses. Further, there were apparent differences in developmental effects depending on whether mothers were also exposed to pyraclostrobin suggesting this chemical can exert intergenerational effects. Pyraclostrobin also affected protein and carbohydrate content of eggs in mothers that were exposed to pyraclostrobin. Significant effects on macronutrient content of eggs occurred at lower concentrations than effects on gross endpoints such as hatching success and time to hatch suggesting potential value for these endpoints as early indicators of ecologically relevant stress. These results provide important insight into the effects of a common fungicide on important endpoints for organismal energetics and life history. - Highlights: • We exposed a freshwater snail to relevant concentrations of pyraclostrobin. • We monitored energetic and life history endpoints. • Pyraclostrobin affected feeding, hatching success and egg macronutrient content. • Energetic-based endpoints may provide valuable insight to toxic effects. - The fungicide pyraclostrobin at environmentally relevant concentrations effects a range of life history and energetic endpoints in the freshwater snail, Lymnaea stagnalis.

  13. Importance of glomerular filtration rate change as surrogate endpoint for the future incidence of end-stage renal disease in general Japanese population: community-based cohort study.

    Science.gov (United States)

    Kanda, Eiichiro; Usui, Tomoko; Kashihara, Naoki; Iseki, Chiho; Iseki, Kunitoshi; Nangaku, Masaomi

    2018-04-01

    Because of the extended period and large costs required until the event occurs, surrogate endpoints are indispensable for implementation of clinical studies to improve chronic kidney disease (CKD) patients' prognosis. Subjects with serum creatinine levels available for a baseline period of 1-3 years were enrolled (n = 69,238) in this community-based prospective cohort study in Okinawa, Japan, and followed up for 15 years. The endpoint was end-stage renal disease (ESRD). The percent of estimated glomerular filtration rate (%eGFR) change was calculated on the basis of the baseline period. Subjects had a mean ± SD age of 55.59 ± 14.69 years and eGFR of 80.15 ± 21.15 ml/min/1.73 m². Among the subjects recruited, 15.81% had a low eGFR (<60 ml/min/1.73 m²) and 36.1/100,000 person-years developed ESRD. Cox proportional hazards models adjusted for baseline characteristics showed that the risk of ESRD tended to be high with high rates of decrease in %eGFR changes over 2 or 3 years in the high- and low-eGFR groups. The specificities and positive predictive values for ESRD based on a cutoff value of %eGFR change of less than -30% over 2 or 3 years were high in the high- and low-eGFR groups. %eGFR change tends to be associated with the risk of ESRD. %eGFR change of less than -30% over 2 or 3 years can be a candidate surrogate endpoint for ESRD in the general Japanese population.
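
    The surrogate measure itself is a simple computation: percent eGFR change over the baseline period, compared against the −30% cutoff. A minimal sketch with invented values:

```python
# Minimal sketch: %eGFR change flagged against the -30% surrogate cutoff.
def pct_egfr_change(egfr_baseline, egfr_followup):
    return 100.0 * (egfr_followup - egfr_baseline) / egfr_baseline

for base, follow in [(80.0, 52.0), (55.0, 45.0)]:   # ml/min/1.73 m2, invented
    change = pct_egfr_change(base, follow)
    flagged = change < -30.0          # candidate surrogate for ESRD risk
    print(f"{base:.0f} -> {follow:.0f}: {change:+.1f}%"
          f"{' (exceeds -30% cutoff)' if flagged else ''}")
```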

  14. Alternative Endpoints and Approaches for the Remediation of Contaminated Groundwater at Complex Sites - 13426

    Energy Technology Data Exchange (ETDEWEB)

    Deeb, Rula A.; Hawley, Elisabeth L. [ARCADIS, U.S., 2000 Powell St., 7th Floor, Emeryville, California 94608 (United States)

    2013-07-01

    The goal of the United States (U.S.) Department of Energy's (DOE's) environmental remediation programs is to restore groundwater to beneficial use, similar to many other Federal and state environmental cleanup programs. Based on past experience, groundwater remediation to pre-contamination conditions (i.e., drinking water standards or non-detectable concentrations) can be successfully achieved at many sites. At a subset of the most complex sites, however, complete restoration is not likely achievable within the next 50 to 100 years using today's technology. This presentation describes several approaches used at complex sites in the face of these technical challenges. Many complex sites adopted a long-term management approach, whereby contamination was contained within a specified area using active or passive remediation techniques. Consistent with the requirements of their respective environmental cleanup programs, several complex sites selected land use restrictions and used risk management approaches to accordingly adopt alternative cleanup goals (alternative endpoints). Several sites used long-term management designations and approaches in conjunction with the alternative endpoints. Examples include various state designations for groundwater management zones, technical impracticability (TI) waivers or greater risk waivers at Superfund sites, and the use of Monitored Natural Attenuation (MNA) or other passive long-term management approaches over long time frames. This presentation will focus on findings, statistics, and case studies from a recently-completed report for the Department of Defense's Environmental Security Technology Certification Program (ESTCP) (Project ER-0832) on alternative endpoints and approaches for groundwater remediation at complex sites under a variety of Federal and state cleanup programs. The primary objective of the project was to provide environmental managers and regulators with tools, metrics, and information needed

  15. Alternative Endpoints and Approaches for the Remediation of Contaminated Groundwater at Complex Sites - 13426

    International Nuclear Information System (INIS)

    Deeb, Rula A.; Hawley, Elisabeth L.

    2013-01-01

    The goal of the United States (U.S.) Department of Energy's (DOE's) environmental remediation programs is to restore groundwater to beneficial use, similar to many other Federal and state environmental cleanup programs. Based on past experience, groundwater remediation to pre-contamination conditions (i.e., drinking water standards or non-detectable concentrations) can be successfully achieved at many sites. At a subset of the most complex sites, however, complete restoration is not likely achievable within the next 50 to 100 years using today's technology. This presentation describes several approaches used at complex sites in the face of these technical challenges. Many complex sites adopted a long-term management approach, whereby contamination was contained within a specified area using active or passive remediation techniques. Consistent with the requirements of their respective environmental cleanup programs, several complex sites selected land use restrictions and used risk management approaches to accordingly adopt alternative cleanup goals (alternative endpoints). Several sites used long-term management designations and approaches in conjunction with the alternative endpoints. Examples include various state designations for groundwater management zones, technical impracticability (TI) waivers or greater risk waivers at Superfund sites, and the use of Monitored Natural Attenuation (MNA) or other passive long-term management approaches over long time frames. This presentation will focus on findings, statistics, and case studies from a recently-completed report for the Department of Defense's Environmental Security Technology Certification Program (ESTCP) (Project ER-0832) on alternative endpoints and approaches for groundwater remediation at complex sites under a variety of Federal and state cleanup programs. The primary objective of the project was to provide environmental managers and regulators with tools, metrics, and information needed to evaluate

  16. Guidelines for time-to-event end-point definitions in trials for pancreatic cancer. Results of the DATECAN initiative (Definition for the Assessment of Time-to-event End-points in CANcer trials).

    Science.gov (United States)

    Bonnetain, Franck; Bonsing, Bert; Conroy, Thierry; Dousseau, Adelaide; Glimelius, Bengt; Haustermans, Karin; Lacaine, François; Van Laethem, Jean Luc; Aparicio, Thomas; Aust, Daniela; Bassi, Claudio; Berger, Virginie; Chamorey, Emmanuel; Chibaudel, Benoist; Dahan, Laeticia; De Gramont, Aimery; Delpero, Jean Robert; Dervenis, Christos; Ducreux, Michel; Gal, Jocelyn; Gerber, Erich; Ghaneh, Paula; Hammel, Pascal; Hendlisz, Alain; Jooste, Valérie; Labianca, Roberto; Latouche, Aurelien; Lutz, Manfred; Macarulla, Teresa; Malka, David; Mauer, Muriel; Mitry, Emmanuel; Neoptolemos, John; Pessaux, Patrick; Sauvanet, Alain; Tabernero, Josep; Taieb, Julien; van Tienhoven, Geertjan; Gourgou-Bourgade, Sophie; Bellera, Carine; Mathoulin-Pélissier, Simone; Collette, Laurence

    2014-11-01

    Using potential surrogate end-points for overall survival (OS) such as Disease-Free- (DFS) or Progression-Free Survival (PFS) is increasingly common in randomised controlled trials (RCTs). However, end-points are too often imprecisely defined, which largely contributes to a lack of homogeneity across trials, hampering comparison between them. The aim of the DATECAN (Definition for the Assessment of Time-to-event End-points in CANcer trials)-Pancreas project is to provide guidelines for standardised definition of time-to-event end-points in RCTs for pancreatic cancer. Time-to-event end-points currently used were identified from a literature review of pancreatic cancer RCTs (2006-2009). Academic research groups were contacted for participation in order to select clinicians and methodologists to participate in the pilot and scoring groups (>30 experts). A consensus was built after 2 rounds of the modified Delphi formal consensus approach with the Rand scoring methodology (range: 1-9). For pancreatic cancer, 14 time-to-event end-points and 25 distinct event types applied to two settings (detectable disease and/or no detectable disease) were considered relevant and included in the questionnaire sent to 52 selected experts. Thirty experts answered both scoring rounds. A total of 204 events distributed over the 14 end-points were scored. After the first round, consensus was reached for 25 items; after the second round, for 156 items; and after the face-to-face meeting, for 203 items. The formal consensus approach yielded guidelines for standardised definitions of time-to-event end-points, allowing cross-comparison of RCTs in pancreatic cancer.

  17. The Oral HIV/AIDS Research Alliance: updated case definitions of oral disease endpoints.

    Science.gov (United States)

    Shiboski, C H; Patton, L L; Webster-Cyriaque, J Y; Greenspan, D; Traboulsi, R S; Ghannoum, M; Jurevic, R; Phelan, J A; Reznik, D; Greenspan, J S

    2009-07-01

    The Oral HIV/AIDS Research Alliance (OHARA) is part of the AIDS Clinical Trials Group (ACTG), the largest HIV clinical trials organization in the world. Its main objective is to investigate oral complications associated with HIV/AIDS as the epidemic is evolving, in particular, the effects of antiretrovirals on oral mucosal lesion development and associated fungal and viral pathogens. The OHARA infrastructure comprises: the Epidemiologic Research Unit (at the University of California San Francisco), the Medical Mycology Unit (at Case Western Reserve University) and the Virology/Specimen Banking Unit (at the University of North Carolina). The team includes dentists, physicians, virologists, mycologists, immunologists, epidemiologists and statisticians. Observational studies and clinical trials are being implemented at ACTG-affiliated sites in the US and resource-poor countries. Many studies have shared end-points, which include oral diseases known to be associated with HIV/AIDS measured by trained and calibrated ACTG study nurses. In preparation for future protocols, we have updated existing diagnostic criteria of the oral manifestations of HIV published in 1992 and 1993. The proposed case definitions are designed to be used in large-scale epidemiologic studies and clinical trials, in both US and resource-poor settings, where diagnoses may be made by non-dental healthcare providers. The objective of this article is to present updated case definitions for HIV-related oral diseases that will be used to measure standardized clinical end-points in OHARA studies, and that can be used by any investigator outside of OHARA/ACTG conducting clinical research that pertains to these end-points.

  18. Is automated kinetic measurement superior to end-point for advanced oxidation protein product?

    Science.gov (United States)

    Oguz, Osman; Inal, Berrin Bercik; Emre, Turker; Ozcan, Oguzhan; Altunoglu, Esma; Oguz, Gokce; Topkaya, Cigdem; Guvenen, Guvenc

    2014-01-01

    Advanced oxidation protein product (AOPP) was first described as an oxidative protein marker in chronic uremic patients and measured with a semi-automatic end-point method. Subsequently, the kinetic method was introduced for AOPP assay. We aimed to compare these two methods by adapting them to a chemistry analyzer and to investigate the correlation between AOPP and fibrinogen, the key molecule responsible for human plasma AOPP reactivity, microalbumin, and HbA1c in patients with type II diabetes mellitus (DM II). The effects of EDTA and citrate-anticoagulated tubes on these two methods were incorporated into the study. This study included 93 DM II patients (36 women, 57 men) with HbA1c levels ≥ 7%, who were admitted to the diabetes and nephrology clinics. The samples were collected in EDTA and in citrate-anticoagulated tubes. Both methods were adapted to a chemistry analyzer and the samples were studied in parallel. In both types of samples, we found a moderate correlation between the kinetic and the endpoint methods (r = 0.611 for citrate-anticoagulated, r = 0.636 for EDTA-anticoagulated, p = 0.0001 for both). We found a moderate correlation between fibrinogen-AOPP and microalbumin-AOPP levels only in the kinetic method (r = 0.644 and 0.520 for citrate-anticoagulated; r = 0.581 and 0.490 for EDTA-anticoagulated, p = 0.0001). We conclude that adaptation of the end-point method to automation is more difficult and it has higher between-run CV%, while application of the kinetic method is easier and it may be used in oxidative stress studies.
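
    The method comparison reported here boils down to correlating paired results from the two assay formats. Below is a minimal sketch with invented paired measurements (Pearson correlation plus a mean-bias summary); the study's actual agreement analysis may have differed.

```python
# Minimal sketch: correlate paired kinetic vs end-point AOPP results.
import numpy as np
from scipy.stats import pearsonr

kinetic  = np.array([42.0, 55.3, 61.2, 48.7, 70.1, 39.4])   # umol/L, invented
endpoint = np.array([45.1, 50.2, 66.0, 44.9, 75.3, 43.0])

r, p = pearsonr(kinetic, endpoint)
bias = np.mean(endpoint - kinetic)                           # mean difference
print(f"r = {r:.3f} (p = {p:.4f}), mean bias = {bias:+.1f} umol/L")
```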

  19. On weighted hardy inequalities on semiaxis for functions vanishing at the endpoints

    Directory of Open Access Journals (Sweden)

    Vladimir Stepanov

    1997-01-01

    We study the weighted Hardy inequalities on the semiaxis of the form ‖Fu‖_2 ≤ C‖F^(k)v‖_2 (1) for functions vanishing at the endpoints together with derivatives up to the order k−1. The case k=2 is completely characterized.
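
    Rendered in LaTeX, and assuming that F^(k) denotes the k-th derivative, that u and v are weight functions, and that "vanishing at the endpoints" refers to both ends of the semiaxis (all read off the abstract's notation, not the paper itself), inequality (1) reads:

```latex
% Weighted Hardy inequality (1), reconstructed from the abstract's notation;
% u, v are weights and F vanishes, with its derivatives up to order k-1,
% at both endpoints of the semiaxis (0, infinity).
\[
  \| F u \|_{2} \;\le\; C \, \| F^{(k)} v \|_{2},
  \qquad
  F(0^{+}) = \dots = F^{(k-1)}(0^{+}) = 0 .
\]
```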

  20. Adaptive endpoint detection of seismic signal based on auto-correlated function

    International Nuclear Information System (INIS)

    Fan Wanchun; Shi Ren

    2001-01-01

    Based on an analysis of the auto-correlation function, the notion of a distance between auto-correlation functions was introduced, and the characteristics of noise and of signal-plus-noise were described using this distance. An adaptive method for endpoint detection of seismic signals based on auto-correlation similarity was then formulated. The steps of implementation and the determination of thresholds are presented in detail. Experimental results, compared against the artificial detection methods, show that this method has higher sensitivity even at a low signal-to-noise ratio.

  1. Giant Magnetic Fluctuations at the Critical Endpoint in Insulating HoMnO3

    Science.gov (United States)

    Choi, Y. J.; Lee, N.; Sharma, P. A.; Kim, S. B.; Vajk, O. P.; Lynn, J. W.; Oh, Y. S.; Cheong, S.-W.

    2013-04-01

    Although abundant research has focused recently on the quantum criticality of itinerant magnets, critical phenomena of insulating magnets in the vicinity of critical endpoints (CEPs) have rarely been revealed. Here we observe an emergent CEP at 2.05 T and 2.2 K with a suppressed thermal conductivity and concomitant strong critical fluctuations evident via a divergent magnetic susceptibility (e.g., χ''(2.05 T, 2.2 K)/χ''(3 T, 2.2 K) ≈ 23,500%, comparable to the critical opalescence in water) in the hexagonal insulating antiferromagnet HoMnO3.

  2. Detection of Bordetella pertussis from Clinical Samples by Culture and End-Point PCR in Malaysian Patients.

    Science.gov (United States)

    Ting, Tan Xue; Hashim, Rohaidah; Ahmad, Norazah; Abdullah, Khairul Hafizi

    2013-01-01

    Pertussis or whooping cough is a highly infectious respiratory disease caused by Bordetella pertussis. In vaccinating countries, infants, adolescents, and adults are relevant patient groups. A total of 707 clinical specimens were received from major hospitals in Malaysia in 2011. These specimens were cultured on Regan-Lowe charcoal agar and subjected to end-point PCR, which amplified the repetitive insertion sequence IS481 and the pertussis toxin promoter gene. Out of these specimens, 275 were positive: 4 by culture only, 6 by both end-point PCR and culture, and 265 by end-point PCR only. The majority of the positive cases were from patients ≤3 months old (77.1%) (P < 0.05). Our study showed that the end-point PCR technique was able to pick up more positive cases compared to the culture method.

  4. Does bisphenol A induce superfeminization in Marisa cornuarietis? Part I: intra- and inter-laboratory variability in test endpoints.

    Science.gov (United States)

    Forbes, Valery E; Selck, Henriette; Palmqvist, Annemette; Aufderheide, John; Warbritton, Ryan; Pounds, Nadine; Thompson, Roy; van der Hoeven, Nelly; Caspers, Norbert

    2007-03-01

    It has been claimed that bisphenol A (BPA) induces superfeminization in the freshwater gastropod, Marisa cornuarietis. To explore the reproducibility of prior work, here we present results from a three-laboratory study, the objectives of which were to determine the mean and variability in test endpoints (i.e., adult fecundity, egg hatchability, and juvenile growth) under baseline conditions and to identify the sources of variability. A major source of variability for all of the measured endpoints was due to differences within and among individuals. With few exceptions, variability among laboratories and among replicate tanks within laboratories contributed little to the observed variability in endpoints. The results highlight the importance of obtaining basic knowledge of husbandry requirements and baseline information on life-history traits of potential test species prior to designing toxicity test protocols. Understanding of the levels and sources of endpoint variability is essential so that statistically robust and ecologically relevant tests of chemicals can be conducted.

  6. The time-dependent "cure-death" model investigating two equally important endpoints simultaneously in trials treating high-risk patients with resistant pathogens.

    Science.gov (United States)

    Sommer, Harriet; Wolkewitz, Martin; Schumacher, Martin

    2017-07-01

    A variety of primary endpoints are used in clinical trials treating patients with severe infectious diseases, and existing guidelines do not provide a consistent recommendation. We propose to study two primary endpoints, cure and death, simultaneously in a comprehensive multistate cure-death model as the starting point for a treatment comparison. This technique enables us to study the temporal dynamics of the patient-relevant probability of being cured and alive. We describe and compare traditional and innovative methods suitable for a treatment comparison based on this model. Traditional analyses using risk differences focus on one prespecified timepoint only. A restricted logrank-based test of treatment effect is sensitive to ordered categories of responses and integrates information on duration of response. The pseudo-value regression provides a direct regression model for examining the treatment effect via differences in transition probabilities. Applying these methods to a topical real-data example and to simulation scenarios, we demonstrate their advantages and limitations and provide insight into how they handle different kinds of treatment imbalances. The cure-death model provides a suitable framework for gaining a better understanding of how a new treatment influences the time-dynamic cure and death process. This might help the future planning of randomised clinical trials, sample size calculations, and data analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Definitions and validation criteria for biomarkers and surrogate endpoints: development and testing of a quantitative hierarchical levels of evidence schema.

    Science.gov (United States)

    Lassere, Marissa N; Johnson, Kent R; Boers, Maarten; Tugwell, Peter; Brooks, Peter; Simon, Lee; Strand, Vibeke; Conaghan, Philip G; Ostergaard, Mikkel; Maksymowych, Walter P; Landewe, Robert; Bresnihan, Barry; Tak, Paul-Peter; Wakefield, Richard; Mease, Philip; Bingham, Clifton O; Hughes, Michael; Altman, Doug; Buyse, Marc; Galbraith, Sally; Wells, George

    2007-03-01

    There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity, and about systematic methods to evaluate these aspects, hinder their efficient application. Our objective was to review the literature on biomarkers and surrogates to develop a hierarchical schema that systematically evaluates and ranks the surrogacy status of biomarkers and surrogates, and to obtain feedback from stakeholders. After a systematic search of Medline and Embase on biomarkers, surrogate (outcomes, endpoints, markers, indicators), intermediate endpoints, and leading indicators, a quantitative surrogate validation schema was developed and subsequently evaluated at a stakeholder workshop. The search identified several classification schema and definitions. Components of these were incorporated into a new quantitative surrogate validation level-of-evidence schema that evaluates biomarkers along 4 domains: Target, Study Design, Statistical Strength, and Penalties. Scores derived from 3 domains, namely the Target that the marker is being substituted for, the Design of the (best) evidence, and the Statistical Strength, are additive. Penalties are then applied if there is serious counterevidence. A total score (0 to 15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. It was proposed that the term "surrogate" be restricted to markers attaining Levels 1 or 2 only. Most stakeholders agreed that this operationalization of the National Institutes of Health definitions of biomarker, surrogate endpoint, and clinical endpoint was useful. Further development and application of this schema provides incentives and guidance for effective biomarker and surrogate endpoint research, and more efficient drug discovery, development, and approval.
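
    As an illustration of the additive scoring logic described above, the Python sketch below totals the three domain scores, subtracts penalties, and maps the 0-15 total onto the five evidence levels. The specific score-to-level cutpoints are assumed for illustration only; the published schema defines the actual anchors.

        def surrogacy_level(target, design, statistical, penalty=0):
            # Additive domain scores; penalties subtract when counterevidence exists.
            total = max(0, min(15, target + design + statistical - penalty))
            cutpoints = [(13, 1), (10, 2), (7, 3), (4, 4)]   # assumed thresholds
            for lo, level in cutpoints:
                if total >= lo:
                    return total, level
            return total, 5                                   # Level 5 = weakest evidence

        print(surrogacy_level(target=4, design=5, statistical=5, penalty=1))  # (13, 1)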

  8. Comparing and combining biomarkers as principal surrogates for time-to-event clinical endpoints.

    Science.gov (United States)

    Gabriel, Erin E; Sachs, Michael C; Gilbert, Peter B

    2015-02-10

    Principal surrogate endpoints are useful as targets for phase I and II trials. In many recent trials, multiple post-randomization biomarkers are measured. However, few statistical methods exist for comparison of or combination of biomarkers as principal surrogates, and none of these methods to our knowledge utilize time-to-event clinical endpoint information. We propose a Weibull model extension of the semi-parametric estimated maximum likelihood method that allows for the inclusion of multiple biomarkers in the same risk model as multivariate candidate principal surrogates. We propose several methods for comparing candidate principal surrogates and evaluating multivariate principal surrogates. These include the time-dependent and surrogate-dependent true and false positive fraction, the time-dependent and the integrated standardized total gain, and the cumulative distribution function of the risk difference. We illustrate the operating characteristics of our proposed methods in simulations and outline how these statistics can be used to evaluate and compare candidate principal surrogates. We use these methods to investigate candidate surrogates in the Diabetes Control and Complications Trial. Copyright © 2014 John Wiley & Sons, Ltd.

  9. BOP2: Bayesian optimal design for phase II clinical trials with simple and complex endpoints.

    Science.gov (United States)

    Zhou, Heng; Lee, J Jack; Yuan, Ying

    2017-09-20

    We propose a flexible Bayesian optimal phase II (BOP2) design that is capable of handling simple (e.g., binary) and complicated (e.g., ordinal, nested, and co-primary) endpoints under a unified framework. We use a Dirichlet-multinomial model to accommodate different types of endpoints. At each interim, the go/no-go decision is made by evaluating a set of posterior probabilities of the events of interest, which is optimized to maximize power or minimize the number of patients under the null hypothesis. Unlike other existing Bayesian designs, the BOP2 design explicitly controls the type I error rate, thereby bridging the gap between Bayesian designs and frequentist designs. In addition, the stopping boundary of the BOP2 design can be enumerated prior to the onset of the trial. These features make the BOP2 design accessible to a wide range of users and regulatory agencies and particularly easy to implement in practice. Simulation studies show that the BOP2 design has favorable operating characteristics with higher power and lower risk of incorrectly terminating the trial than some existing Bayesian phase II designs. The software to implement the BOP2 design is freely available at www.trialdesign.org. Copyright © 2017 John Wiley & Sons, Ltd.
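
    The go/no-go logic can be illustrated for the simplest case of a single binary endpoint, where the Dirichlet-multinomial model reduces to a Beta posterior. The Python sketch below is a minimal, hedged rendering: the Beta(0.5, 0.5) prior, the null response rate, and the probability boundary are placeholder values, whereas BOP2 itself optimizes the boundary to control the type I error rate.

        from scipy.stats import beta

        def prob_promising(x, n, p0=0.2, a=0.5, b=0.5):
            # Pr(response rate p > p0 | x responses in n patients), Beta(a, b) prior.
            return 1.0 - beta.cdf(p0, a + x, b + n - x)

        def interim_go(x, n, p0=0.2, c_n=0.85):
            # BOP2-style rule: stop for futility when Pr(p <= p0 | data) > C(n),
            # i.e. continue while Pr(p > p0 | data) >= 1 - C(n).  C(n) assumed here.
            return prob_promising(x, n, p0) >= 1.0 - c_n

        print(interim_go(x=4, n=15))   # toy interim look: 4/15 responders -> True (continue)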

  10. Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy

    International Nuclear Information System (INIS)

    Song, Ting; Zhou, Linghong; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Jiang, Steve B; Gu, Xuejun

    2015-01-01

    In intensity modulated radiotherapy (IMRT), the optimal plan for each patient is specific due to unique patient anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting each patient's unique anatomy should be defined and adopted in the treatment planning procedure for plan quality control. This study aims to develop such a personalized treatment plan quality control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated our proposed plan quality control tool. Using the developed tool, six of twenty evaluated plans were identified as suboptimal. After re-optimization, these plans achieved better OAR dose sparing without sacrificing PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model-predicted values, validating the predictive ability of the proposed tool. In conclusion, the developed tool is able to accurately predict optimally achievable DEs of multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.

  11. Mesoscale simulation of semiflexible chains. I. Endpoint distribution and chain dynamics

    Science.gov (United States)

    Groot, Robert D.

    2013-06-01

    The endpoint distribution and dynamics of semiflexible fibers are studied by numerical simulation. A brief overview is given over the analytical theory of flexible and semiflexible polymers. In particular, a closed expression is given for the relaxation spectrum of wormlike chains, which determines polymer diffusion and rheology. Next a simulation model for wormlike chains with full hydrodynamic interaction is described, and relations for the bending and torsion modulus are given. Two methods are introduced to include torsion stiffness into the model. The model is validated by simulating single chains in a heat bath, and comparing the endpoint distribution of the chains with established Monte Carlo results. It is concluded that torsion stiffness leads to a slightly shorter effective persistence length for a given bending stiffness. To further validate the simulation model, polymer diffusion is studied for fixed persistence length and varying polymer length N. The diffusion constant shows crossover from Rouse (D ∝ N-1) to reptation behaviour (D ∝ N-2). The terminal relaxation time obtained from the monomer displacement is consistent with the theory of wormlike chains. The probability for chain crossing has also been studied. This probability is so low that it does not influence the present results.
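
    For reference, the standard Kratky-Porod (wormlike chain) expression for the mean-square end-to-end distance, against which endpoint-distribution results of this kind are commonly checked, can be evaluated directly. The short Python sketch below uses illustrative contour and persistence lengths; it is a textbook formula, not code from the paper.

        import numpy as np

        def wlc_r2(L, lp):
            # Mean-square end-to-end distance of a wormlike chain:
            # <R^2> = 2*lp*L - 2*lp^2 * (1 - exp(-L/lp)).
            return 2.0 * lp * L - 2.0 * lp**2 * (1.0 - np.exp(-L / lp))

        L, lp = 100.0, 5.0
        print(wlc_r2(L, lp))             # ~2*lp*L = 1000 in the flexible limit L >> lp
        print(wlc_r2(0.1, lp), 0.1**2)   # ~L^2 in the stiff (rod) limit L << lp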

  12. Hairy black holes and the endpoint of AdS₄ charged superradiance

    Energy Technology Data Exchange (ETDEWEB)

    Dias, Óscar J.C.; Masachs, Ramon [STAG research centre and Mathematical Sciences, University of Southampton,Southampton (United Kingdom)

    2017-02-24

    We construct hairy black hole solutions that merge with the anti-de Sitter (AdS₄) Reissner-Nordström black hole at the onset of superradiance. These hairy black holes have, for a given mass and charge, higher entropy than the corresponding AdS₄-Reissner-Nordström black hole. Therefore, they are natural candidates for the endpoint of the charged superradiant instability. On the other hand, hairy black holes never dominate the canonical and grand-canonical ensembles. The zero-horizon-radius limit of the hairy black holes is a soliton (i.e. a boson star under a gauge transformation). We construct our solutions perturbatively, for small mass and charge, so that the properties of hairy black holes can be used to test against and compare with the endpoint of initial value simulations. We further discuss the near-horizon scalar condensation instability, which is also present in global AdS₄-Reissner-Nordström black holes. We highlight the different nature of the near-horizon and superradiant instabilities, and that hairy black holes ultimately exist because of the non-linear instability of AdS.

  13. Transgenerational endpoints provide increased sensitivity and insight into multigenerational responses of Lymnaea stagnalis exposed to cadmium.

    Science.gov (United States)

    Reátegui-Zirena, Evelyn G; Fidder, Bridgette N; Olson, Adric D; Dawson, Daniel E; Bilbo, Thomas R; Salice, Christopher J

    2017-05-01

    Ecotoxicology provides data to inform environmental management. Many testing protocols, however, do not consider offspring fitness and toxicant sensitivity. Cadmium (Cd) is a well-studied and ubiquitous toxicant, but little is known about the effects on offspring of exposed parents (transgenerational effects). This study had three objectives: to identify endpoints related to offspring performance; to determine whether parental effects would manifest as a change in Cd tolerance in offspring; and to determine how parental exposure duration influenced the manifestation of parental effects. Adult snails were exposed to 0, 25, 50, 100, 200 and 400 μg Cd/L for eight weeks. There were effects on adult endpoints (e.g., growth, reproduction) but only at the highest concentrations (>100 μg/L). In contrast, we observed significant transgenerational effects at all Cd concentrations. Surprisingly, we found increased Cd tolerance in hatchlings from all parental Cd exposure concentrations, even though eggs and hatchlings were kept in Cd-free conditions for 6 weeks. Explicit consideration of offspring performance adds value to current toxicity testing protocols. Parental exposure duration has important implications for offspring effects, and contaminant concentrations that are not directly toxic to parents can cause transgenerational changes in resistance, with significant implications for toxicity testing and adaptive responses. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Protocol of the Definition for the Assessment of Time-to-event Endpoints in CANcer trials (DATECAN) project: formal consensus method for the development of guidelines for standardised time-to-event endpoints' definitions in cancer clinical trials.

    Science.gov (United States)

    Bellera, Carine A; Pulido, Marina; Gourgou, Sophie; Collette, Laurence; Doussau, Adélaïde; Kramar, Andrew; Dabakuyo, Tienhan Sandrine; Ouali, Monia; Auperin, Anne; Filleron, Thomas; Fortpied, Catherine; Le Tourneau, Christophe; Paoletti, Xavier; Mauer, Murielle; Mathoulin-Pélissier, Simone; Bonnetain, Franck

    2013-03-01

    In randomised phase III cancer clinical trials, the most objectively defined and only validated time-to-event endpoint is overall survival (OS). The appearance of new types of treatments and the multiplication of lines of treatment have resulted in the use of surrogate endpoints for overall survival such as progression-free survival (PFS), or time-to-treatment failure. Their development is strongly influenced by the necessity of reducing clinical trial duration, cost and number of patients. However, while these endpoints are frequently used, they are often poorly defined and definitions can differ between trials which may limit their use as primary endpoints. Moreover, this variability of definitions can impact on the trial's results by affecting estimation of treatments' effects. The aim of the Definition for the Assessment of Time-to-event Endpoints in CANcer trials (DATECAN) project is to provide recommendations for standardised definitions of time-to-event endpoints in randomised cancer clinical trials. We will use a formal consensus methodology based on experts' opinions which will be obtained in a systematic manner. Definitions will be independently developed for several cancer sites, including pancreatic, breast, head and neck and colon cancer, as well as sarcomas and gastrointestinal stromal tumours (GISTs). The DATECAN project should lead to the elaboration of recommendations that can then be used as guidelines by researchers participating in clinical trials. This process should lead to a standardisation of the definitions of commonly used time-to-event endpoints, enabling appropriate comparisons of future trials' results. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Smile esthetics: calculated beauty?

    Science.gov (United States)

    Lecocq, Guillaume; Truong Tan Trung, Lisa

    2014-06-01

    Esthetic demand from patients continues to increase. Consequently, the treatments we offer are moving towards more discreet or invisible techniques, such as lingual brackets, in order to achieve harmonious, balanced results in line with our treatment goals. As orthodontists, we act upon the relationships between teeth and bone, and the equilibrium they create impacts the entire face via the smile. A balanced smile is essential to an esthetic outcome and is governed by rules that guide both the practitioner and the patient. A smile can be described in terms of mathematical ratios and proportions, but beauty cannot be calculated. For the smile to sit harmoniously within the face, we need to take into account facial proportions and the possibility of their being modified by our orthopedic appliances or by surgery. Copyright © 2014 CEO. Published by Elsevier Masson SAS. All rights reserved.

  16. A patient and community-centered approach selecting endpoints for a randomized trial of a novel advance care planning tool

    Directory of Open Access Journals (Sweden)

    Bridges JFP

    2018-02-01

    Full Text Available John FP Bridges,1,2 Norah L Crossnohere,2 Anne L Schuster,1 Judith A Miller,3 Carolyn Pastorini,3,† Rebecca A Aslakson2,4,5 (1Department of Health Policy and Management, The Johns Hopkins Bloomberg School of Public Health, Baltimore, MD; 2Department of Health, Behavior, and Society, The Johns Hopkins Bloomberg School of Public Health, Baltimore, MD; 3Patient-Centered Outcomes Research Institute (PCORI) Project, Baltimore, MD; 4Department of Anesthesiology and Critical Care Medicine, The Johns Hopkins School of Medicine, Baltimore, MD; 5Armstrong Institute for Patient Safety and Quality, The Johns Hopkins School of Medicine, Baltimore, MD, USA. †Carolyn Pastorini passed away on August 24, 2015.) Background: Despite a movement toward patient-centered outcomes, best practices on how to gather and refine patients' perspectives on research endpoints are limited. Advance care planning (ACP) is inherently patient centered and would benefit from patient prioritization of endpoints for ACP-related tools and studies. Objective: This investigation sought to prioritize patient-centered endpoints for the content and evaluation of an ACP video being developed for patients undergoing major surgery. We also sought to highlight an approach using complementary engagement and research strategies to document priorities and preferences of patients and other stakeholders. Materials and methods: Endpoints identified from a previously published environmental scan were operationalized following rating by a caregiver co-investigator, refinement by a patient co-investigator, review by a stakeholder committee, and validation by patients and family members. Finalized endpoints were taken to a state fair, where members of the public who indicated that they or a loved one had undergone major surgery prioritized their most relevant endpoints and provided comments. Results: Of the initial 50 ACP endpoints identified from the review, 12 endpoints were selected for public

  17. Comparative Analysis of Dynamic Cell Viability, Migration and Invasion Assessments by Novel Real-Time Technology and Classic Endpoint Assays

    Science.gov (United States)

    Limame, Ridha; Wouters, An; Pauwels, Bea; Fransen, Erik; Peeters, Marc; Lardon, Filip; De Wever, Olivier; Pauwels, Patrick

    2012-01-01

    Background Cell viability and motility comprise ubiquitous mechanisms involved in a variety of (patho)biological processes including cancer. We report a technical comparative analysis of the novel impedance-based xCELLigence Real-Time Cell Analysis detection platform, with conventional label-based endpoint methods, hereby indicating performance characteristics and correlating dynamic observations of cell proliferation, cytotoxicity, migration and invasion on cancer cells in highly standardized experimental conditions. Methodology/Principal Findings Dynamic high-resolution assessments of proliferation, cytotoxicity and migration were performed using xCELLigence technology on the MDA-MB-231 (breast cancer) and A549 (lung cancer) cell lines. Proliferation kinetics were compared with the Sulforhodamine B (SRB) assay in a series of four cell concentrations, yielding fair to good correlations (Spearman's Rho 0.688 to 0.964). Cytotoxic action by paclitaxel (0–100 nM) correlated well with SRB (Rho>0.95) with similar IC50 values. Reference cell migration experiments were performed using Transwell plates and correlated by pixel area calculation of crystal violet-stained membranes (Rho 0.90) and optical density (OD) measurement of extracted dye (Rho>0.95). Invasion was observed on MDA-MB-231 cells alone using Matrigel-coated Transwells as standard reference method and correlated by OD reading for two Matrigel densities (Rho>0.95). Variance component analysis revealed increased variances associated with impedance-based detection of migration and invasion, potentially caused by the sensitive nature of this method. Conclusions/Significance The xCELLigence RTCA technology provides an accurate platform for non-invasive detection of cell viability and motility. The strong correlations with conventional methods imply a similar observation of cell behavior and interchangeability with other systems, illustrated by the highly correlating kinetic invasion profiles on different
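
    The rank correlations quoted above can be reproduced in principle with a few lines of code. The following Python sketch computes Spearman's Rho on synthetic paired readouts standing in for SRB optical densities and real-time cell-index values; the data are fabricated solely to show the computation, not taken from the study.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(1)
        endpoint_od = np.linspace(0.1, 1.2, 12) + rng.normal(scale=0.05, size=12)  # SRB-style OD
        realtime_ci = 3.0 * endpoint_od**1.1 + rng.normal(scale=0.1, size=12)      # cell index

        rho, p = spearmanr(endpoint_od, realtime_ci)
        print(f"Spearman's Rho = {rho:.3f} (p = {p:.2e})")  # monotone agreement between methods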

  18. Associations between maternal lifestyle factors and neonatal body composition in the Screening for Pregnancy Endpoints (Cork) cohort study.

    Science.gov (United States)

    Dahly, Darren L; Li, Xia; Smith, Hazel A; Khashan, Ali S; Murray, Deirdre M; Kiely, Mairead E; O'B Hourihane, Jonathan; McCarthy, Fergus P; Kenny, Louise C; Kearney, Patricia M

    2018-02-01

    Neonatal body composition likely mediates fetal influences on life long chronic disease risk. A better understanding of how maternal lifestyle is related to newborn body composition could thus inform intervention efforts. Using Cork participant data (n = 1754) from the Screening for Pregnancy Endpoints (SCOPE) cohort study [ECM5(10)05/02/08], we estimated how pre-pregnancy body size, gestational weight gain, exercise, alcohol, smoking and diet were related to neonatal fat and fat-free mass, as well as length and gestational age at birth, using quantile regression. Maternal factors were measured by a trained research midwife at 15 gestational weeks, in addition to a 3rd trimester weight measurement used to calculate weight gain. Infant body composition was measured using air-displacement plethysmography. Healthy (versus excess) gestational weight gain was associated with lower median fat-free mass [-112 g, 95% confidence interval (CI): -47 to -176) and fat mass (-33 g, 95% CI: -1 to -65) in the offspring; and a 103 g decrease in the 95th centile of fat mass (95% CI: -33 to -174). Maternal normal weight status (versus obesity) was associated with lower median fat mass (-48 g, 95% CI: -12 to -84). At the highest centiles, fat mass was lower among infants of women who engaged in frequent moderate-intensity exercise early in the pregnancy (-92 g at the 95th centile, 95% CI: -168 to -16). Lastly, women who never smoked tended to have longer babies with more fat mass and fat-free mass. No other lifestyle factors were strongly related to infant body composition. These results suggest that supporting healthy maternal lifestyles could reduce the risk of excess fat accumulation in the offspring, without adversely affecting fat-free mass development, length or gestational age. © The Author 2017; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association

  20. Towards better environmental performance of wastewater sludge treatment using endpoint approach in LCA methodology

    Directory of Open Access Journals (Sweden)

    Isam Alyaseri

    2017-03-01

    Full Text Available The aim of this study is to use the life cycle assessment method to measure the environmental performance of the sludge incineration process in a wastewater treatment plant and to propose an alternative that can reduce the environmental impact. To show the damages caused by the treatment processes, the study used an endpoint approach to evaluate impacts on human health, ecosystem quality, and resources. A case study was conducted at the Bissell Point Wastewater Treatment Plant in Saint Louis, Missouri, U.S. Plant-specific data along with literature data from technical publications were used to build an inventory and then analyze the environmental burdens from the sludge handling unit in the year 2011. The impact assessment method chosen was ReCiPe 2008. The existing scenario (dewatering, multiple hearth incineration, ash to landfill) was evaluated, and three alternative scenarios (fluid bed incineration, and anaerobic digestion with and without land application, with energy recovery from heat or biogas) were proposed and analyzed to find the one with the least environmental impact. The existing scenario shows that the most significant impacts relate to depletion of resources and damage to human health. These impacts come mainly from the operation phase (electricity and fuel consumption) and emissions related to combustion. The alternatives showed better performance than the existing scenario. Using the ReCiPe endpoint methodology, and among the three alternatives tested, anaerobic digestion had the best overall environmental performance. It is recommended to convert to fluid bed incineration if the concerns are mainly about human health, or to anaerobic digestion if the concerns are mainly about depletion of resources. The endpoint approach may simplify the outcomes of this study as follows: if the plant is converted to fluid bed incineration, it could prevent an average of 43.2 DALYs in human life, save 0.059 species in the area

  1. Secondary efficacy endpoints of the pentavalent rotavirus vaccine against gastroenteritis in sub-Saharan Africa.

    Science.gov (United States)

    Tapia, Milagritos D; Armah, George; Breiman, Robert F; Dallas, Michael J; Lewis, Kristen D C; Sow, Samba O; Rivers, Stephen B; Levine, Myron M; Laserson, Kayla F; Feikin, Daniel R; Victor, John C; Ciarlet, Max; Neuzil, Kathleen M; Steele, A Duncan

    2012-04-27

    The efficacy of the pentavalent rotavirus vaccine (PRV), RotaTeq®, was evaluated in a double-blind, placebo-controlled, multicenter Phase III clinical trial conducted (April 2007-March 2009) in 3 low-income countries in Africa: Ghana, Kenya, and Mali. In total, 5468 infants were randomized 1:1 to receive 3 doses of PRV/placebo at approximately 6, 10, and 14 weeks of age; concomitant administration with routine EPI vaccines, including OPV, was allowed. HIV-infected infants were not excluded. The primary endpoint, vaccine efficacy (VE) against severe rotavirus gastroenteritis (RVGE), as measured by the Vesikari scoring system (VSS, score ≥11), from ≥14 days following Dose 3 through a follow-up period of nearly 2 years in the 3 African countries combined, and secondary endpoints over the total follow-up period, have been previously reported. In this study, we report post hoc subgroup analyses of secondary endpoints of public health importance. VE against RVGE of any severity was 49.2% (95%CI: 29.9, 63.5) through the first year of life and 30.5% (95%CI: 16.7, 42.2) through the complete follow-up period. VE against severe gastroenteritis of any etiology was 21.5% (95%CI: …). VE against severe RVGE caused by (i) vaccine-contained G and P types (G1-G4, P1A[8]), (ii) non-vaccine G types (G8, G9, G10), and (iii) non-vaccine P types (P1B[4], P2A[6]) was 34.0% (95%CI: 11.2, 51.2), 81.8% (95%CI: 16.5, 98.0) and 40.7% (95%CI: 8.4, 62.1), respectively. There was a trend towards higher VE with higher disease severity, although in some cases the numbers were small. In African countries with high under-5 mortality rates, PRV significantly reduced RVGE through nearly 2 years of follow-up; more modest reductions were observed against gastroenteritis of any etiology. PRV provides protection against severe RVGE caused by diverse rotavirus genotypes, including those not contained in the vaccine. Copyright © 2012 Elsevier Ltd. All rights reserved.
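
    For orientation, the vaccine efficacy point estimates quoted above follow the standard definition VE = 1 - (attack rate in vaccinees / attack rate in placebo recipients). The Python sketch below applies this formula to made-up counts; the trial's actual estimates additionally account for follow-up time and use interval estimation methods not shown here.

        def vaccine_efficacy(cases_vax, n_vax, cases_placebo, n_placebo):
            ar_vax = cases_vax / n_vax            # attack rate, vaccinated arm
            ar_pla = cases_placebo / n_placebo    # attack rate, placebo arm
            return 1.0 - ar_vax / ar_pla

        print(f"VE = {vaccine_efficacy(30, 2700, 60, 2700):.1%}")  # 50.0% with toy counts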

  2. Multiple-endpoint assay provides a detailed mechanistic view of responses to herbicide exposure in Chlamydomonas reinhardtii

    International Nuclear Information System (INIS)

    Nestler, Holger; Groh, Ksenia J.; Schönenberger, René; Behra, Renata; Schirmer, Kristin; Eggen, Rik I.L.; Suter, Marc J.-F.

    2012-01-01

    The release of herbicides into the aquatic environment raises concerns about potential detrimental effects on ecologically important non-target species, such as unicellular algae, necessitating ecotoxicological risk assessment. Algal toxicity tests based on growth, a commonly assessed endpoint, are integrative, and hence do not provide information about underlying toxic mechanisms and effects. This limitation may be overcome by measuring more specific biochemical and physiological endpoints. In the present work, we developed and applied a novel multiple-endpoint assay, and analyzed the effects of the herbicides paraquat, diuron and norflurazon, each representing a specific mechanism of toxic action, on the single-celled green alga Chlamydomonas reinhardtii. The endpoints added to assessment of growth were pigment content, maximum and effective photosystem II quantum yield, ATP content, esterase and oxidative activity. All parameters were measured at 2, 6 and 24 h of exposure, except for growth and pigment content, which were determined after 6 and 24 h only. Effective concentrations causing 50% of response (EC50s) and lowest observable effect concentrations (LOECs) were determined for all endpoints and exposure durations where possible. The assay provided a detailed picture of the concentration- and time-dependent development of effects elicited by the analyzed herbicides, thus improving the understanding of the underlying toxic mechanisms. Furthermore, the response patterns were unique to the respective herbicide and reflected the different mechanisms of toxicity. The comparison of the endpoint responses and sensitivities revealed that several physiological and biochemical parameters reacted earlier or more strongly to disturbances than growth. Overall, the presented multiple-endpoint assay constitutes a promising basis for investigating stressor and toxicant effects in green algae.

  3. Root length of aquatic plant, Lemna minor L., as an optimal toxicity endpoint for biomonitoring of mining effluents.

    Science.gov (United States)

    Gopalapillai, Yamini; Vigneault, Bernard; Hale, Beverley A

    2014-10-01

    Lemna minor, a free-floating macrophyte, is used for biomonitoring of mine effluent quality under the Metal Mining Effluent Regulations (MMER) of the Environmental Effects Monitoring (EEM) program in Canada and is known to be sensitive to trace metals commonly discharged in mine effluents, such as Ni. Environment Canada's standard toxicity testing protocol recommends frond count (FC) and dry weight (DW) as the 2 required toxicity endpoints; this is similar to other major protocols, such as those of the US Environmental Protection Agency (USEPA) and the Organisation for Economic Co-operation and Development (OECD), which both require frond growth or biomass endpoints. However, we suggest that, as for terrestrial plants, the average root length (RL) of aquatic plants is an optimal and relevant endpoint. As expected, the results demonstrate that RL is the ideal endpoint based on 3 criteria: accuracy (i.e., toxicological sensitivity to the contaminant), precision (i.e., lowest variance), and ecological relevance (to metal mining effluents). Roots are known to play a major role in nutrient uptake under low-nutrient conditions, and thus have ecological relevance to freshwater from mining regions. Root length was the most sensitive and precise endpoint in this study, in which water chemistry varied greatly (pH and varying concentrations of Ca, Mg, Na, K, dissolved organic carbon, and an anthropogenic organic contaminant, sodium isopropyl xanthates) to match mining effluent ranges. Although frond count was a close second, dry weight proved to be an unreliable endpoint. We conclude that toxicity testing for this floating macrophyte should require average RL measurement as a primary endpoint. © 2014 SETAC.

  5. Between strong continuity and almost continuity

    Directory of Open Access Journals (Sweden)

    J.K. Kohli

    2010-04-01

    Full Text Available As embodied in the title of the paper, strong and weak variants of continuity that lie strictly between strong continuity of Levine and almost continuity due to Singal and Singal are considered. Basic properties of almost completely continuous functions (≡ R-maps) and δ-continuous functions are studied. Direct and inverse transfer of topological properties under almost completely continuous functions and δ-continuous functions are investigated and their place in the hierarchy of variants of continuity that already exist in the literature is outlined. The class of almost completely continuous functions lies strictly between the class of completely continuous functions studied by Arya and Gupta (Kyungpook Math. J. 14 (1974), 131-143) and δ-continuous functions defined by Noiri (J. Korean Math. Soc. 16 (1980), 161-166). The class of almost completely continuous functions properly contains each of the classes of (1) completely continuous functions, and (2) almost perfectly continuous (≡ regular set connected) functions defined by Dontchev, Ganster and Reilly (Indian J. Math. 41 (1999), 139-146) and further studied by Singh (Quaestiones Mathematicae 33(2) (2010), 1-11), which in turn include all δ-perfectly continuous functions initiated by Kohli and Singh (Demonstratio Math. 42(1) (2009), 221-231) and so include all perfectly continuous functions introduced by Noiri (Indian J. Pure Appl. Math. 15(3) (1984), 241-250).

  6. The Use of Tools, Modelling Methods, Data Types, and Endpoints in Systems Medicine: A Survey on Projects of the German e:Med-Programme.

    Science.gov (United States)

    Gietzelt, Matthias; Höfer, Thomas; Knaup-Gregori, Petra; König, Rainer; Löpprich, Martin; Poos, Alexandra; Ganzinger, Matthias

    2016-01-01

    Systems medicine is the logical continuation of research efforts on the road to individualized medicine. It seeks to offer a holistic view of the patient by combining different data sources to highlight different perspectives on the patient's health. Our research question was to identify the main data types, modelling methods, analysis tools, and endpoints currently used and studied in systems medicine. We therefore conducted a survey of projects with a systems medicine background. Fifty participants completed this survey. The results of the survey were analyzed using histograms and cross tables, and finally compared to the results of a former literature review with the same research focus. The data types reported in this survey were widely diversified. As expected, genomic and phenotype data were used most frequently. In contrast, environmental and behavioral data were rarely used in the projects. Overall, the cross tables of the data types in the survey and the literature review showed overlapping results.

  7. Comparison endpoint study of process plasma and secondary electron beam exciter optical emission spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Stephan Thamban, P. L.; Yun, Stuart; Padron-Wells, Gabriel; Hosch, Jimmy W.; Goeckner, Matthew J. [Department of Mechanical Engineering, University of Texas at Dallas, 800W Campbell Road, Richardson, Texas 75080 (United States); Department of Electrical Engineering, University of Texas at Dallas, 800W Campbell Road, Richardson, Texas 75080 (United States); Verity Instruments, Inc., 2901 Eisenhower Street, Carrollton, Texas 75007 (United States); Department of Mathematical Sciences, University of Texas at Dallas, 800 W Campbell Road, Richardson, Texas 75080 (United States)

    2012-11-15

    Traditionally, process plasmas are studied and monitored by optical emission spectroscopy. Here, the authors compare experimental measurements from secondary electron beam excitation and from direct process plasma excitation to discuss and illustrate the distinctiveness of the former in the study of process plasmas. They present results that show excitation of etch process effluents in an SF₆ discharge and endpoint detection capabilities under dark plasma process conditions. In SF₆ discharges, a band around 300 nm, not visible in the process emission, is observed, and it can serve as a good indicator of etch product emission during polysilicon etches. Based on prior work reported in the literature, the authors believe this band is due to SiF₄ gas-phase species.
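
    A minimal sketch of how such an emission band can be used for endpoint calls is given below in Python: a synthetic intensity trace for a product band is smoothed, and the endpoint is declared at the first crossing of a plateau-relative threshold. The trace, band behaviour, smoothing window, and 50% threshold are all illustrative assumptions, not the instrument's algorithm.

        import numpy as np

        t = np.arange(0.0, 120.0, 0.5)                       # seconds
        band = np.where(t < 80, 1.0, 0.2)                    # product emission drops at endpoint
        band += np.random.default_rng(5).normal(scale=0.03, size=t.size)
        smooth = np.convolve(band, np.ones(9) / 9, mode="same")

        plateau = smooth[: smooth.size // 2].mean()          # etch-product emission plateau
        idx = 10 + np.argmax(smooth[10:] < 0.5 * plateau)    # first threshold crossing
        print(f"end-point called at t = {t[idx]:.1f} s")     # expect ~80 s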

  8. Impact of copula directional specification on multi-trial evaluation of surrogate endpoints

    Science.gov (United States)

    Renfro, Lindsay A.; Shang, Hongwei; Sargent, Daniel J.

    2014-01-01

    Evaluation of surrogate endpoints using patient-level data from multiple trials is the gold standard, where multi-trial copula models are used to quantify both patient-level and trial-level surrogacy. While limited consideration has been given in the literature to copula choice (e.g., Clayton), no prior consideration has been given to the direction of implementation (via survival versus distribution functions). We demonstrate that even with the "correct" copula family, directional misspecification leads to biased estimates of patient-level and trial-level surrogacy. We illustrate with a simulation study and a re-analysis of disease-free survival as a surrogate for overall survival in early stage colon cancer. PMID:24905465
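
    The directional issue can be made concrete with a small numerical example. In the Python sketch below, the same Clayton generator is applied once to the marginal distribution functions and once to the marginal survival functions; the two specifications imply different joint probabilities at the same point even though the copula family and margins coincide. The margins and dependence parameter are illustrative, not taken from the paper.

        def clayton(u, v, theta=2.0):
            # Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta).
            return (u**-theta + v**-theta - 1.0) ** (-1.0 / theta)

        F1, F2 = 0.8, 0.8                   # marginal CDFs evaluated at some (s, t)
        # Direction 1: copula couples the distribution functions -> joint CDF directly.
        cdf_spec = clayton(F1, F2)
        # Direction 2: copula couples the survival functions -> joint survival,
        # converted to the implied joint CDF via inclusion-exclusion.
        surv_spec = F1 + F2 - 1.0 + clayton(1.0 - F1, 1.0 - F2)
        print(cdf_spec, surv_spec)          # ~0.686 vs ~0.743: not the same joint model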

  9. Simultaneous small-sample comparisons in longitudinal or multi-endpoint trials using multiple marginal models

    DEFF Research Database (Denmark)

    Pallmann, Philip; Ritz, Christian; Hothorn, Ludwig A

    2018-01-01

    Simultaneous inference in longitudinal, repeated-measures, and multi-endpoint designs can be onerous, especially when trying to find a reasonable joint model from which the interesting effects and covariances are estimated. A novel statistical approach known as multiple marginal models greatly simplifies the modelling process: the core idea is to "marginalise" the problem and fit multiple small models to different portions of the data, and then estimate the overall covariance matrix in a subsequent, separate step. Using these estimates guarantees strong control of the family-wise error rate, however only asymptotically. In this paper, we show how to make the approach also applicable to small-sample data problems. Specifically, we discuss the computation of adjusted P values and simultaneous confidence bounds for comparisons of randomised treatment groups as well as for levels...
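
    The single-step adjustment underlying this approach can be sketched as follows: given marginal test statistics and an estimate of their correlation, the adjusted p-value of each comparison is the tail probability of the maximum absolute component of the joint reference distribution. The Python sketch below uses a normal reference for brevity; the small-sample method discussed in the paper would use a multivariate t instead, and all numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(2)
        z_obs = np.array([2.4, 1.1, 2.0])                 # marginal test statistics
        corr = np.array([[1.0, 0.5, 0.3],
                         [0.5, 1.0, 0.4],
                         [0.3, 0.4, 1.0]])                # estimated correlation of the tests

        # Monte Carlo reference distribution of max |Z| under the global null.
        draws = rng.multivariate_normal(np.zeros(3), corr, size=200_000)
        max_abs = np.abs(draws).max(axis=1)
        adj_p = [(max_abs >= abs(z)).mean() for z in z_obs]  # single-step adjusted p-values
        print(np.round(adj_p, 4))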

  10. Determination of the Acidity of Oils Using Paraformaldehyde as a Thermometric End-Point Indicator

    Directory of Open Access Journals (Sweden)

    Carneiro Mário J. D.

    2002-01-01

    Full Text Available The determination of the acidity of oils by catalytic thermometric titrimetry using paraformaldehyde as the thermometric end-point indicator was investigated. The sample solvent was a 1:1 (v/v) mixture of toluene and 2-propanol and the titrant was 0.1 mol L-1 aqueous sodium hydroxide. Paraformaldehyde, being insoluble in the sample solvent, does not present the inconvenience of other indicators that change the properties of the solvent due to composition changes. The titration can therefore be done effectively in the same medium as the standard potentiometric and visual titration methods. The results of the application of the method to both non-refined and refined oils are presented herein. The proposed method has advantages in relation to the potentiometric method in terms of speed and simplicity.
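
    On the data-analysis side of such a titration, the end-point can be located numerically as the point of maximum curvature of the temperature-volume curve, i.e. where the slope changes abruptly. The Python sketch below does this on a synthetic curve with a slope break; the curve shape, noise level, and smoothing window are assumptions, not the paper's data.

        import numpy as np

        rng = np.random.default_rng(3)
        v = np.linspace(0.0, 2.0, 401)                             # titrant volume / mL
        T = 25.0 + 0.05 * v + 0.80 * np.clip(v - 1.2, 0.0, None)   # slope break = end-point
        T += rng.normal(scale=0.0005, size=v.size)                 # instrument noise

        dT = np.convolve(np.gradient(T, v), np.ones(9) / 9, mode="same")  # smoothed slope
        d2T = np.gradient(dT, v)                                   # curvature proxy
        print(f"end-point near {v[np.argmax(d2T)]:.2f} mL")        # expect ~1.20 mL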

  11. End-Point Contact Force Control with Quantitative Feedback Theory for Mobile Robots

    Directory of Open Access Journals (Sweden)

    Shuhuan Wen

    2012-12-01

    Full Text Available Robot force control is an important issue for intelligent mobile robotics. The end-point stiffness of a robot is a key and open problem in the research community. The control strategies are mostly dependent on both the specifications of the task and the environment of the robot. Due to the limited stiffness of the end-effector, inherent torque may be used to feed back the oscillations of the controlled force. This paper proposes an effective control strategy built around a controller designed using quantitative feedback theory. The nested-loop controllers take into account the physical limitations of the system's inner variables and harmful interference. The biggest advantage of the method is its simplicity in both the design process and the implementation of the control algorithm in engineering practice. Taking a one-link manipulator as an example, numerical experiments are carried out to verify the proposed control method. The results show satisfactory performance.

  12. Elastic collisions of classical point particles on a finite frictionless linear track with perfectly reflecting endpoints

    Science.gov (United States)

    DeLuca, R.

    2006-03-01

    Repeated elastic collisions of point particles on a finite frictionless linear track with perfectly reflecting endpoints are considered. The problem is analysed by means of an elementary linear algebra approach. It is found that, starting from a state consisting of a projectile particle in motion at constant velocity and a target particle at rest in a fixed known position, the points at which collisions occur on the track, when plotted versus progressive numerals corresponding to the collisions themselves, show periodic patterns for a rather large choice of values of the initial position x(0) and of the mass ratio r. For certain values of these parameters, however, only regular behaviour over a large number of collisions is detected.
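
    A compact event-driven simulation of this system is easy to write down and makes the collision patterns visible. The Python sketch below advances two point particles on a unit track between wall reflections and elastic particle-particle collisions, recording where on the track each collision occurs; the initial state and mass ratio are illustrative choices, not values from the paper.

        def simulate(x1=0.0, x2=0.5, v1=1.0, v2=0.0, r=2.0, n_coll=10, L=1.0):
            m1, m2 = 1.0, r                    # target mass = r * projectile mass
            points = []
            while len(points) < n_coll:
                times = []
                if v1 < 0:  times.append((-x1 / v1, "w1"))        # particle 1 hits left wall
                if v2 > 0:  times.append(((L - x2) / v2, "w2"))   # particle 2 hits right wall
                if v1 > v2: times.append(((x2 - x1) / (v1 - v2), "pp"))
                dt, ev = min(times)            # next event in time
                x1, x2 = x1 + v1 * dt, x2 + v2 * dt
                if ev == "w1":   v1 = -v1                          # perfect reflection
                elif ev == "w2": v2 = -v2
                else:                                              # elastic exchange
                    v1, v2 = (((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2),
                              ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2))
                    points.append(round(x1, 4))
            return points

        print(simulate())   # positions of the first 10 collisions on the track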

  13. Patient-reported outcomes in insomnia: development of a conceptual framework and endpoint model.

    Science.gov (United States)

    Kleinman, Leah; Buysse, Daniel J; Harding, Gale; Lichstein, Kenneth; Kalsekar, Anupama; Roth, Thomas

    2013-01-01

    This article describes qualitative research conducted with patients with clinical diagnoses of insomnia and focuses on the development of a conceptual framework and endpoint model that identifies a hierarchy and interrelationships of potential outcomes in insomnia research. Focus groups were convened to discuss how patients experience insomnia and to generate items for patient-reported questionnaires on insomnia and its associated daytime consequences. The focus groups produced two conceptual frameworks: one for sleep and one for daytime impairment. Each conceptual framework consists of hypothesized domains and items in each domain based on patient language taken from the focus groups. These item pools may ultimately serve as a basis for developing new questionnaires to assess insomnia.

  14. Correlation of Hip Fracture with Other Fracture Types: Toward a Rational Composite Hip Fracture Endpoint

    Science.gov (United States)

    Colón-Emeric, Cathleen; Pieper, Carl F.; Grubber, Janet; Van Scoyoc, Lynn; Schnell, Merritt L; Van Houtven, Courtney Harold; Pearson, Megan; Lafleur, Joanne; Lyles, Kenneth W.; Adler, Robert A.

    2016-01-01

    Purpose: With ethical requirements favoring the enrollment of lower-risk subjects, osteoporosis trials are underpowered to detect reductions in hip fractures. Different skeletal sites have different levels of fracture risk and response to treatment. We sought to identify fracture sites that cluster with hip fracture at higher than expected frequency; if these sites respond to treatment similarly, then a composite fracture endpoint could provide a better estimate of hip fracture reduction. Methods: Cohort study using Veterans Affairs and Medicare administrative data. Male Veterans (n=5,036,536) aged 50-99 years receiving VA primary care between 1999 and 2009 were included. Fractures were ascertained using ICD-9 and CPT codes and classified by skeletal site. Pearson correlation coefficients, logistic regression and kappa statistics were used to describe the correlation between each fracture type and hip fracture within individuals, without regard to the timing of the events. Results: 595,579 (11.8%) men suffered 1 or more fractures and 179,597 (3.6%) suffered 2 or more fractures during the time under study. Of those with one or more fractures, rib was the most common site (29%), followed by spine (22%), hip (21%) and femur (20%). The fracture types most highly correlated with hip fracture were pelvic/acetabular (Pearson correlation coefficient 0.25, p<0.0001), femur (0.15, p<0.0001), and shoulder (0.11, p<0.0001). Conclusions: Pelvic, acetabular, femur, and shoulder fractures cluster with hip fractures within individuals at greater than expected frequency. If we observe similar treatment risk reductions within that cluster, subsequent trials could consider use of a composite endpoint to better estimate hip fracture risk. PMID:26151123
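
    The within-person correlation computation described in the Methods can be illustrated directly: for binary fracture indicators, the Pearson correlation coefficient reduces to the phi coefficient of the underlying 2x2 table. The Python sketch below computes it on synthetic indicator vectors; the prevalences are invented for illustration and are not the study's data.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 100_000
        hip = rng.random(n) < 0.03                        # 3% hip fracture prevalence
        # pelvic fracture made more likely when a hip fracture is present
        pelvic = rng.random(n) < np.where(hip, 0.15, 0.01)

        r = np.corrcoef(hip.astype(float), pelvic.astype(float))[0, 1]
        print(f"phi / Pearson correlation = {r:.3f}")     # ~0.20 with these prevalences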

  15. Clinical endpoint adjudication in a contemporary all-comers coronary stent investigation: methodology and external validation.

    Science.gov (United States)

    Vranckx, Pascal; McFadden, Eugene; Cutlip, Donald E; Mehran, Roxana; Swart, Michael; Kint, P P; Zijlstra, Felix; Silber, Sigmund; Windecker, Stephan; Serruys, Patrick W C J

    2013-01-01

    Globalisation in coronary stent research calls for harmonization of clinical endpoint definitions and event adjudication. Little has been published about the various processes used for event adjudication or their impact on outcome reporting. We performed a validation of the clinical event committee (CEC) adjudication process on 100 suspected events in the RESOLUTE All-comers trial (Resolute-AC). Two experienced Clinical Research Organisations (CROs) that already had extensive internal validation processes in place participated in the study. After initial adjudication by the primary CEC, events were cross-adjudicated by an external CEC using the same definitions. Major discrepancies affecting the primary endpoint of target-lesion failure (TLF), a composite of cardiac death, target vessel myocardial infarction (TV-MI), or clinically indicated target-lesion revascularization (CI-TLR), were analysed by an independent oversight committee, which provided recommendations for harmonization. Discordant adjudications were reconsidered by the primary CEC. Subsequently, the Resolute-AC database was interrogated for cases that, based on these recommendations, merited re-adjudication, and these cases were also re-adjudicated by the primary CEC. Final discrepancies in adjudication of individual components of TLF occurred in 7 out of 100 events in 5 patients. Discrepancies for the (hierarchical) primary endpoint occurred in 5 events (2 cardiac deaths and 3 TV-MI). After application of the harmonization recommendations to the overall Resolute-AC population (n=2292), the primary CEC adjudicated 3 additional clinically indicated TLRs and considered 1 TV-MI as no event. The harmonization process provided a high level of concordance for event adjudication and improved accuracy of final event reporting. These findings suggest it is feasible to pool clinical event outcome data across clinical trials even when different CECs are responsible for event adjudication. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Cadmium phytotoxicity: Quantitative sensitivity relationships between classical endpoints and antioxidative enzyme biomarkers

    Energy Technology Data Exchange (ETDEWEB)

    Rosa Correa, Albertina Xavier da [Centro de Ciencias Tecnologicas da Terra e do Mar, Universidade do Vale do Itajai, Rua Uruguai, 458, 88302-202 Itajai SC (Brazil); Roerig, Leonardo Rubi [Centro de Ciencias Tecnologicas da Terra e do Mar, Universidade do Vale do Itajai, Rua Uruguai, 458, 88302-202 Itajai SC (Brazil); Verdinelli, Miguel A. [Centro de Ciencias Tecnologicas da Terra e do Mar, Universidade do Vale do Itajai, Rua Uruguai, 458, 88302-202 Itajai SC (Brazil); Cotelle, Sylvie [Centre des Sciences de l' Environnement, Universite de Metz, 57000 Metz (France); Ferard, Jean-Francois [Centre des Sciences de l' Environnement, Universite de Metz, 57000 Metz (France); Radetski, Claudemir Marcos [Centro de Ciencias Tecnologicas da Terra e do Mar, Universidade do Vale do Itajai, Rua Uruguai, 458, 88302-202 Itajai SC (Brazil)]. E-mail: radetski@univali.br

    2006-03-15

    In this work, cadmium phytotoxicity and quantitative sensitivity relationships between different hierarchical endpoints in plants cultivated in a contaminated soil were studied. Thus, germination rate, biomass growth and antioxidative enzyme activity (i.e. superoxide dismutase, peroxidase, catalase and glutathione reductase) in three terrestrial plants (Avena sativa L., Brassica campestris L. cv. Chinensis, Lactuca sativa L. cv. hanson) were analyzed. Plant growth tests were carried out according to an International Organization for Standardization (ISO) method and the results were analyzed by ANOVA followed by Williams' test. The concentration of Cd²⁺ that had the smallest observed significant negative effect (LOEC) on plant biomass was 6.25, 12.5 and 50 mg Cd/kg dry soil for lettuce, oat and Chinese cabbage, respectively. Activity of all enzymes studied increased significantly compared to enzyme activity in plant controls. For lettuce, LOEC values (mg Cd/kg dry soil) for enzymic activity ranged from 0.05 (glutathione reductase) to 0.39 (catalase). For oat, LOEC values (mg Cd/kg dry soil) ranged from 0.19 (for superoxide dismutase and glutathione reductase) to 0.39 (for catalase and peroxidase). For Chinese cabbage, LOEC values (mg Cd/kg dry soil) ranged from 0.19 (peroxidase, catalase and glutathione reductase) to 0.39 (superoxide dismutase). Classical (i.e. germination and biomass) and biochemical (i.e. enzyme activity) endpoints were compared to establish a sensitivity ranking, which was: enzyme activity > biomass > germination rate. For cadmium-soil contamination, the determination of quantitative sensitivity relationships (QSR) between classical and antioxidative enzyme biomarkers showed that the most sensitive plant species have, generally, the lowest QSR values.
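
    The dose-response analysis (ANOVA followed by a many-to-one comparison to locate the LOEC) can be sketched as below. Williams' test is not available in SciPy, so Dunnett's test (SciPy ≥ 1.11) stands in as a plainly named replacement; the doses echo the abstract but all responses are synthetic.

```python
# Hedged sketch of a LOEC determination: one-way ANOVA across Cd doses, then
# many-to-one comparisons against the control group. Dunnett's test replaces
# Williams' test used in the study; all data below are synthetic.
import numpy as np
from scipy.stats import f_oneway, dunnett

rng = np.random.default_rng(1)
control = rng.normal(10.0, 1.0, 8)                 # biomass in control soil
doses = [6.25, 12.5, 25.0, 50.0]                   # mg Cd / kg dry soil
treated = [rng.normal(10.0 - 0.08 * d, 1.0, 8) for d in doses]

F, p = f_oneway(control, *treated)
print(f"ANOVA: F = {F:.2f}, p = {p:.4f}")

res = dunnett(*treated, control=control, alternative="less")
loec = next((d for d, pv in zip(doses, res.pvalue) if pv < 0.05), None)
print("LOEC:", loec, "mg Cd/kg dry soil")
```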

  17. Cadmium phytotoxicity: Quantitative sensitivity relationships between classical endpoints and antioxidative enzyme biomarkers

    International Nuclear Information System (INIS)

    Rosa Correa, Albertina Xavier da; Roerig, Leonardo Rubi; Verdinelli, Miguel A.; Cotelle, Sylvie; Ferard, Jean-Francois; Radetski, Claudemir Marcos

    2006-01-01

    In this work, cadmium phytotoxicity and quantitative sensitivity relationships between different hierarchical endpoints in plants cultivated in a contaminated soil were studied. Thus, germination rate, biomass growth and antioxidative enzyme activity (i.e. superoxide dismutase, peroxidase, catalase and glutathione reductase) in three terrestrial plants (Avena sativa L., Brassica campestris L. cv. Chinensis, Lactuca sativa L. cv. hanson) were analyzed. Plant growth tests were carried out according to an International Organization for Standardization (ISO) method and the results were analyzed by ANOVA followed by Williams' test. The concentration of Cd²⁺ that had the smallest observed significant negative effect (LOEC) on plant biomass was 6.25, 12.5 and 50 mg Cd/kg dry soil for lettuce, oat and Chinese cabbage, respectively. Activity of all enzymes studied increased significantly compared to enzyme activity in plant controls. For lettuce, LOEC values (mg Cd/kg dry soil) for enzymic activity ranged from 0.05 (glutathione reductase) to 0.39 (catalase). For oat, LOEC values (mg Cd/kg dry soil) ranged from 0.19 (for superoxide dismutase and glutathione reductase) to 0.39 (for catalase and peroxidase). For Chinese cabbage, LOEC values (mg Cd/kg dry soil) ranged from 0.19 (peroxidase, catalase and glutathione reductase) to 0.39 (superoxide dismutase). Classical (i.e. germination and biomass) and biochemical (i.e. enzyme activity) endpoints were compared to establish a sensitivity ranking, which was: enzyme activity > biomass > germination rate. For cadmium-soil contamination, the determination of quantitative sensitivity relationships (QSR) between classical and antioxidative enzyme biomarkers showed that the most sensitive plant species have, generally, the lowest QSR values.

  18. A conscious mouse model of gastric ileus using clinically relevant endpoints

    Directory of Open Access Journals (Sweden)

    Shao Yuanlin

    2005-06-01

    Full Text Available Abstract Background Gastric ileus is an unsolved clinical problem and current treatment is limited to supportive measures. Models of ileus using anesthetized animals, muscle strips or isolated smooth muscle cells do not adequately reproduce the clinical situation. Thus, previous studies using these techniques have not led to a clear understanding of the pathophysiology of ileus. The feasibility of using food intake and fecal output as simple, clinically relevant endpoints for monitoring ileus in a conscious mouse model was evaluated by assessing the severity and time course of various insults known to cause ileus. Methods Delayed food intake and fecal output associated with ileus were monitored after intraperitoneal injection of endotoxin, laparotomy with bowel manipulation, thermal injury or cerulein-induced acute pancreatitis. The correlation of decreased food intake after endotoxin injection with gastric ileus was validated by measuring gastric emptying. The effect of endotoxin on general activity level and feeding behavior was also determined. Small bowel transit was measured using a phenol red marker. Results Each insult resulted in a transient and comparable decrease in food intake and fecal output consistent with the clinical picture of ileus. The endpoints were highly sensitive to small changes in low doses of endotoxin, the extent of bowel manipulation, and cerulein dose. The delay in food intake directly correlated with delayed gastric emptying. Changes in general activity and feeding behavior were insufficient to explain decreased food intake. Intestinal transit remained unchanged at the times measured. Conclusion Food intake and fecal output are sensitive markers of gastric dysfunction in four experimental models of ileus. In the mouse, delayed gastric emptying appears to be the major cause of the anorexic effect associated with ileus. Gastric dysfunction is more important than small bowel dysfunction in this model. Recovery of

  19. Establishment of Early Endpoints in Mouse Total-Body Irradiation Model.

    Directory of Open Access Journals (Sweden)

    Amory Koch

    Full Text Available Acute radiation sickness (ARS) following exposure to ionizing irradiation is characterized by radiation-induced multiorgan dysfunction/failure that refers to progressive dysfunction of two or more organ systems, the etiological agent being radiation damage to cells and tissues over time. Radiation sensitivity data on humans and animals has made it possible to describe the signs associated with ARS. A mouse model of total-body irradiation (TBI) has previously been developed that represents the likely scenario of exposure in the human population. Herein, we present the Mouse Intervention Scoring System (MISS) developed at the Veterinary Sciences Department (VSD) of the Armed Forces Radiobiology Research Institute (AFRRI) to identify moribund mice and decrease the numbers of mice found dead, which is therefore a more humane refinement to death as the endpoint. Survival rates were compared to changes in body weights and temperatures in the mouse (CD2F1 male) TBI model (6-14 Gy, ⁶⁰Co γ-rays at 0.6 Gy min⁻¹), which informed improvements to the Scoring System. Individual tracking of animals via implanted microchips allowed for assessment of criteria based on individuals rather than by group averages. From a total of 132 mice (92 irradiated), 51 mice were euthanized versus only four mice that were found dead (7% of non-survivors). In this case, all four mice were found dead after overnight periods between observations. Weight loss alone was indicative of imminent succumbing to radiation injury; however, mice with weight loss >30% did not always become moribund within 24 hours. Only one survivor had a weight loss of greater than 30%. Temperature significantly dropped only 2-4 days before death/euthanasia in 10 and 14 Gy animals. The scoring system represents a significant refinement compared with using subjective assessment of morbidity or death as the endpoint for these survival studies.

  20. Magnetic Field Calculator

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Magnetic Field Calculator will calculate the total magnetic field, including components (declination, inclination, horizontal intensity, northerly intensity,...

  1. Dependence of QSAR models on the selection of trial descriptor sets: a demonstration using nanotoxicity endpoints of decorated nanotubes.

    Science.gov (United States)

    Shao, Chi-Yu; Chen, Sing-Zuo; Su, Bo-Han; Tseng, Yufeng J; Esposito, Emilio Xavier; Hopfinger, Anton J

    2013-01-28

    Little attention has been given to the selection of trial descriptor sets when designing a QSAR analysis, even though a great number of descriptor classes, and often a greater number of descriptors within a given class, are now available. This paper reports an effort to explore interrelationships between QSAR models and descriptor sets. Zhou and co-workers (Zhou et al., Nano Lett. 2008, 8 (3), 859-865) designed, synthesized, and tested a combinatorial library of 80 surface-modified, that is, decorated, multi-walled carbon nanotubes for their composite nanotoxicity using six endpoints, all based on a common 0 to 100 activity scale. Each of the six endpoints for the 29 most nanotoxic decorated nanotubes was incorporated as the training set for this study. The study reported here includes trial descriptor sets for all possible combinations of MOE, VolSurf, and 4D-fingerprints (FP) descriptor classes, as well as including and excluding explicit spatial contributions from the nanotube. Optimized QSAR models were constructed from these multiple trial descriptor sets. It was found that (a) both the form and quality of the best QSAR models for each of the endpoints are distinct and (b) some endpoints are quite dependent upon 4D-FP descriptors of the entire nanotube-decorator complex. However, other endpoints yielded equally good models only using decorator descriptors with and without the decorator-only 4D-FP descriptors. Lastly, and most importantly, the quality, significance, and interpretation of a QSAR model were found to be critically dependent on the trial descriptor sets used within a given QSAR endpoint study.
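
    The exhaustive descriptor-set exploration can be illustrated with a short sketch: one model per non-empty combination of descriptor classes, compared by cross-validated fit. The class names follow the abstract; the data, dimensions, and choice of ridge regression are illustrative assumptions.

```python
# Hedged sketch: evaluate a QSAR model for every combination of trial
# descriptor classes (synthetic data; ridge regression as a stand-in model).
from itertools import combinations
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 29                                    # training-set size used in the study
classes = {
    "MOE": rng.normal(size=(n, 5)),
    "VolSurf": rng.normal(size=(n, 4)),
    "4D-FP": rng.normal(size=(n, 6)),
}
y = rng.normal(size=n)                    # one nanotoxicity endpoint (synthetic)

for k in range(1, len(classes) + 1):
    for combo in combinations(classes, k):
        X = np.hstack([classes[c] for c in combo])
        score = cross_val_score(Ridge(alpha=1.0), X, y, cv=5, scoring="r2").mean()
        print(f"{'+'.join(combo):20s} mean CV r2 = {score:.3f}")
```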

  2. An evaluation of culture results during treatment for tuberculosis as surrogate endpoints for treatment failure and relapse.

    Directory of Open Access Journals (Sweden)

    Patrick P J Phillips

    Full Text Available It is widely acknowledged that new regimens are urgently needed for the treatment of tuberculosis. The primary endpoint in phase III trials is a composite outcome of failure at the end of treatment or relapse after stopping treatment. Such trials are usually both long and expensive. Valid surrogate endpoints measured during or at the end of treatment could dramatically reduce both the time and cost of assessing the effectiveness of new regimens. The objective of this study was to evaluate sputum culture results on solid media during treatment as surrogate endpoints for poor outcome. Data were obtained from twelve randomised controlled trials conducted by the British Medical Research Council in the 1970s and 80s in East Africa and East Asia, consisting of 6974 participants and 49 different treatment regimens. The month two culture result was shown to be a poor surrogate in East Africa but a good surrogate in Hong Kong. In contrast, the month three culture was a good surrogate in trials conducted in East Africa but not in Hong Kong. As well as differences in location, ethnicity and probable strain of Mycobacterium tuberculosis, Hong Kong trials more often evaluated regimens with rifampicin throughout and intermittent regimens, and patients in East African trials more often presented with extensive cavitation and were slower to convert to culture negative during treatment. An endpoint that is a summary measure of the longitudinal profile of culture results over time or that is able to detect the presence of M. tuberculosis later in treatment is more likely to be a better endpoint for a phase II trial than a culture result at a single time point and may prove to be an acceptable surrogate. More data are needed before any endpoint can be used as a surrogate in a confirmatory phase III trial.

  3. Radiometric titration of officinal radiopharmaceuticals using radioactive kryptonates as end-point indicators. I. Salicylic, acetylsalicylic, benzoic acids

    Energy Technology Data Exchange (ETDEWEB)

    Toelgyessy, J; Dillinger, P [Slovenska Vysoka Skola Technicka, Bratislava (Czechoslovakia). Chemickotechnologicka Fakulta; Harangozo, M; Jombik, J [Komenskeho Univ., Bratislava (Czechoslovakia). Farmaceuticka Fakulta

    1980-01-01

    A method for the determination of salicylic, acetylsalicylic and benzoic acids in officinal pharmaceuticals, based on radiometric titration with 0.1 mol·l⁻¹ NaOH, was developed. The end-point was detected with the aid of a radioactive glass kryptonate. After the end-point, the excess titrant attacks the glass surface layers, releasing ⁸⁵Kr and, consequently, decreasing the radioactivity of the kryptonate employed. The radioactive kryptonate used as an indicator was prepared by the bombardment of glass with accelerated ⁸⁵Kr ions. The developed method is simple, accurate and correct.
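
    As an illustration of this detection principle, the sketch below locates the end-point as the intersection of two fitted straight segments of the activity-versus-volume curve: constant before the equivalence point, falling afterwards as ⁸⁵Kr is released. The titration data are synthetic.

```python
# Hedged sketch: end-point from kryptonate activity readings, found as the
# intersection of two least-squares line segments (synthetic data).
import numpy as np

v = np.linspace(0.0, 10.0, 41)                      # titrant volume, mL
true_ep = 6.2
activity = np.where(v < true_ep, 1000.0, 1000.0 - 80.0 * (v - true_ep))
activity += np.random.default_rng(3).normal(0, 5, v.size)  # counting noise

best = None
for i in range(5, v.size - 5):                      # candidate breakpoints
    a1 = np.polyfit(v[:i], activity[:i], 1)         # pre-end-point segment
    a2 = np.polyfit(v[i:], activity[i:], 1)         # post-end-point segment
    if abs(a1[0] - a2[0]) < 1e-9:
        continue                                    # parallel fits: no crossing
    sse = (np.sum((np.polyval(a1, v[:i]) - activity[:i]) ** 2)
           + np.sum((np.polyval(a2, v[i:]) - activity[i:]) ** 2))
    if best is None or sse < best[0]:
        best = (sse, (a2[1] - a1[1]) / (a1[0] - a2[0]))  # line intersection
print(f"estimated end-point: {best[1]:.2f} mL")
```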

  4. Use of endpoint adjudication to improve the quality and validity of endpoint assessment for medical device development and post marketing evaluation: Rationale and best practices. A report from the cardiac safety research consortium.

    Science.gov (United States)

    Seltzer, Jonathan H; Heise, Ted; Carson, Peter; Canos, Daniel; Hiatt, Jo Carol; Vranckx, Pascal; Christen, Thomas; Cutlip, Donald E

    2017-08-01

    This white paper provides a summary of presentations, discussions and conclusions of a think tank entitled "The Role of Endpoint Adjudication in Medical Device Clinical Trials". The think tank was cosponsored by the Cardiac Safety Research Consortium, MDEpiNet and the US Food and Drug Administration (FDA) and was convened at the FDA's White Oak headquarters on March 11, 2016. Attention was focused on tailoring best practices for evaluation of endpoints in medical device clinical trials, practical issues in endpoint adjudication of therapeutic, diagnostic, biomarker and drug-device combinations, and the role of adjudication in regulatory and reimbursement issues throughout the device lifecycle. Attendees included representatives from medical device companies, the FDA, Centers for Medicare and Medicaid Services (CMS), endpoint adjudication specialist groups, clinical research organizations, and active, academically based adjudicators. The manuscript presents recommendations from the think tank regarding (1) the rationale for when adjudication is appropriate, (2) best practices for the establishment and operation of a medical device adjudication committee, and (3) the role of endpoint adjudication for post-market evaluation in the emerging era of real-world evidence. Copyright © 2017. Published by Elsevier Inc.

  5. Coil protection calculator for TFTR

    International Nuclear Information System (INIS)

    Marsala, R.J.; Woolley, R.D.

    1987-01-01

    A new coil protection calculator (CPC) is presented in this paper. Now being developed for TFTR's magnetic field coils, it will replace the existing coil fault detector. The existing fault detector sacrifices TFTR operating capability for simplicity. The new CPC will permit operation up to the actual coil limits by accurately and continuously computing coil parameters in real time. The improvement will allow TFTR to operate with higher plasma currents and will permit the optimization of pulse repetition rates.
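
    A minimal sketch of the kind of real-time computation such a calculator performs, reduced here to adiabatic I²t heating checked against a temperature-rise limit; every constant is an illustrative assumption, not a TFTR parameter.

```python
# Hedged sketch of a coil protection loop: integrate resistive heating from
# sampled coil current and flag when a temperature-rise limit is reached.
# All parameters are assumed values for illustration, not TFTR data.
import numpy as np

DT = 0.001          # sample period, s (assumed)
HEAT_CAP = 5.0e6    # effective heat capacity of the conductor, J/K (assumed)
R_COIL = 0.01       # coil resistance, ohm (assumed)
T_LIMIT = 80.0      # allowed temperature rise, K (assumed)

def run_protection(currents):
    """Return the sample index at which the limit is reached, else None."""
    temp_rise = 0.0
    for k, i in enumerate(currents):
        temp_rise += (i ** 2) * R_COIL * DT / HEAT_CAP  # adiabatic heating step
        if temp_rise >= T_LIMIT:
            return k                                    # request a fast shutdown
    return None

pulse = np.full(int(5.0 / DT), 100.0e3)                 # 100 kA flat-top, 5 s
print(run_protection(pulse))                            # trips ~4 s into the pulse
```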

  6. Configuration space Faddeev calculations

    International Nuclear Information System (INIS)

    Payne, G.L.; Klink, W.H.; Polyzou, W.N.

    1991-01-01

    The detailed study of few-body systems provides one of the most precise tools for studying the dynamics of nuclei. Our research program consists of a careful theoretical study of the nuclear few-body systems. During the past year we have completed several aspects of this program. We have continued our program of using the trinucleon system to investigate the validity of various realistic nucleon-nucleon potentials. Also, the effects of meson-exchange currents in nuclear systems have been studied. Initial calculations using the configuration-space Faddeev equations for nucleon-deuteron scattering have been completed. With modifications to treat relativistic systems, few-body methods can be applied to phenomena that are sensitive to the structure of the individual hadrons. We have completed a review of Relativistic Hamiltonian Dynamics in Nuclear and Particle Physics for Advances in Nuclear Physics. Although it is called a review, it is a large document that contains a significant amount of new research.

  7. Endpoint-based parallel data processing with non-blocking collective instructions in a parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Cernohous, Bob R; Ratterman, Joseph D; Smith, Brian E

    2014-11-11

    Endpoint-based parallel data processing with non-blocking collective instructions in a PAMI of a parallel computer is disclosed. The PAMI is composed of data communications endpoints, each including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task. The compute nodes are coupled for data communications through the PAMI. The parallel application establishes a data communications geometry specifying a set of endpoints that are used in collective operations of the PAMI by associating with the geometry a list of collective algorithms valid for use with the endpoints of the geometry; registering in each endpoint in the geometry a dispatch callback function for a collective operation; and executing without blocking, through a single one of the endpoints in the geometry, an instruction for the collective operation.

  8. CO2 flowrate calculator

    International Nuclear Information System (INIS)

    Carossi, Jean-Claude

    1969-02-01

    A CO₂ flowrate calculator has been designed for measuring and recording the gas flow in the loops of Pegase reactor. The analog calculator applies, at every moment, Bernoulli's formula to the values that characterize the carbon dioxide flow through a nozzle. The calculator electronics is described (it includes a sampling calculator and a two-variable function generator), with its amplifiers, triggers, interpolator, multiplier, etc. Calculator operation and setting are presented.
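
    The computation the analog unit performs can be sketched digitally: Bernoulli's relation for flow through a nozzle gives the mass flowrate from differential pressure and gas density. The discharge coefficient and nozzle diameter below are assumed values for illustration.

```python
# Hedged sketch of the Bernoulli nozzle-flow relation the calculator applies:
# m_dot = Cd * A * sqrt(2 * rho * dp) (incompressible approximation).
import math

def co2_mass_flow(dp_pa, rho_kg_m3, d_nozzle_m=0.05, cd=0.98):
    """Mass flowrate in kg/s; geometry and Cd are illustrative assumptions."""
    area = math.pi * (d_nozzle_m / 2.0) ** 2
    return cd * area * math.sqrt(2.0 * rho_kg_m3 * dp_pa)

# Example: 20 kPa across a 5 cm nozzle, CO2 density ~1.8 kg/m3
print(f"{co2_mass_flow(20e3, 1.8):.3f} kg/s")
```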

  9. Biological and chemical tests of contaminated soils to determine bioavailability and environmentally acceptable endpoints (EAE)

    International Nuclear Information System (INIS)

    Montgomery, C.R.; Menzie, C.A.; Pauwells, S.J.

    1995-01-01

    The understanding of the concept of bioavailability of soil contaminants to receptors and its use in supporting the development of EAE is growing but still incomplete. Nonetheless, there is increased awareness of the importance of such data to determine acceptable cleanup levels and achieve timely site closures. This presentation discusses a framework for biological and chemical testing of contaminated soils developed as part of a Gas Research Institute (GRI) project entitled "Environmentally Acceptable Endpoints in Soil Using a Risk Based Approach to Contaminated Site Management Based on Bioavailability of Chemicals in Soil." The presentation reviews the GRI program, and summarizes the findings of the biological and chemical testing section published in the GRI report. The three primary components of the presentation are: (1) defining the concept of bioavailability within the existing risk assessment paradigm, (2) assessing the usefulness of the existing tests to measure bioavailability and test frameworks used to interpret these measurements, and (3) suggesting how a small selection of relevant tests could be incorporated into a flexible testing scheme for soils to address this issue.

  10. Relative Biological Effectiveness of HZE Particles for Chromosomal Exchanges and Other Surrogate Cancer Risk Endpoints.

    Directory of Open Access Journals (Sweden)

    Eliedonna Cacao

    Full Text Available The biological effects of high charge and energy (HZE) particle exposures are of interest in space radiation protection of astronauts and cosmonauts, and in estimating secondary cancer risks for patients undergoing hadron therapy for primary cancers. The large number of particle types and energies that make up primary or secondary radiation in HZE particle exposures precludes tumor induction studies in animal models for all but a few particle types and energies, thus leading to the use of surrogate endpoints to investigate the details of the radiation quality dependence of relative biological effectiveness (RBE) factors. In this report we make detailed predictions of the charge number and energy dependence of RBEs using a parametric track structure model to represent experimental results for the low-dose response for chromosomal exchanges in normal human lymphocyte and fibroblast cells, with comparison to published data for neoplastic transformation and gene mutation. RBEs are evaluated against acute doses of γ-rays for doses near 1 Gy. Models that assume linear or non-targeted effects at low dose are considered. Modest values of RBE (<10) are predicted at low doses <0.1 Gy. The radiation quality dependence of RBEs against the effects of acute doses of γ-rays found for neoplastic transformation and gene mutation studies is similar to that found for simple exchanges if a linear response is assumed at low HZE particle doses. Comparisons of the resulting model parameters to those used in the NASA radiation quality factor function are discussed.
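
    At low doses, where the responses are taken as linear (E = αD), the RBE against acute γ-rays reduces to a ratio of initial slopes, as in this sketch; the α values are placeholders, not outputs of the track structure model.

```python
# Hedged sketch: low-dose RBE as a ratio of initial slopes of linear
# dose-response curves. The alpha values are illustrative placeholders.

def rbe_low_dose(alpha_ion, alpha_gamma):
    """RBE = D_gamma / D_ion at equal effect, i.e. alpha_ion / alpha_gamma."""
    return alpha_ion / alpha_gamma

alpha_gamma = 0.05   # exchanges per Gy for acute gamma rays (assumed)
alpha_hze = 0.45     # exchanges per Gy for an HZE ion (assumed)
print(f"low-dose RBE ~ {rbe_low_dose(alpha_hze, alpha_gamma):.1f}")
```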

  11. Elimination of endpoint-discontinuity artifacts in the analysis of spectra in reciprocal space

    International Nuclear Information System (INIS)

    Yoo, S. D.; Aspnes, D. E.

    2001-01-01

    Reciprocal-space analysis offers several advantages for determining critical point parameters in optical and other spectra, for example, the separation of baseline effects, information, and noise in low-, medium-, and high-index Fourier coefficients, respectively. However, endpoint-discontinuity artifacts can obscure much of the information when segments are isolated for analysis. We developed a procedure for eliminating these artifacts and recovering buried information by minimizing in the white-noise region the mean-square deviation between the Fourier coefficients of the data and those of low-order polynomials, then subtracting the resulting coefficients from the data over the entire range. We find that spectral analysis is optimized if no false data are used, i.e., when the number of points transformed equals the number of actual data points in the segment. Using fractional differentiation we develop a simple derivation of the variation of the reciprocal-space coefficients with index n for Lorentzian and Gaussian line shapes in direct space. More generally, we show that the definition of critical point energies in terms of phase coherence of the Fourier coefficients allows these energies to be determined for a broad class of line shapes even if the direct-space line shapes themselves are not known. Limitations for undersampled or highly broadened spectra are discussed, along with extensions to two- or higher-dimensional arrays of data. © 2001 American Institute of Physics.
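
    A simplified sketch of the artifact-removal idea: subtract a low-order polynomial baseline from an isolated segment before transforming, so an endpoint step no longer dominates the Fourier coefficients. (The paper matches the polynomial's Fourier coefficients to the data in the white-noise region; an ordinary least-squares fit stands in here.)

```python
# Hedged sketch: endpoint-discontinuity suppression by subtracting a
# low-order polynomial before the Fourier transform (synthetic spectrum).
import numpy as np

rng = np.random.default_rng(4)
E = np.linspace(1.0, 2.0, 256)                      # photon-energy grid
segment = 0.3 * E + 1.0 / ((E - 1.5) ** 2 + 0.01)   # baseline + critical point
segment += rng.normal(0, 0.05, E.size)              # measurement noise

coeffs = np.polyfit(E, segment, 3)                  # low-order baseline model
cleaned = segment - np.polyval(coeffs, E)           # endpoints now nearly match

raw_fft = np.fft.rfft(segment)
clean_fft = np.fft.rfft(cleaned)
# Low-index coefficients of the raw segment carry the baseline and endpoint
# step; after subtraction the critical-point information is less obscured.
print(abs(raw_fft[1]), abs(clean_fft[1]))
```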

  12. Center-Within-Trial Versus Trial-Level Evaluation of Surrogate Endpoints

    Science.gov (United States)

    Renfro, Lindsay A.; Shi, Qian; Xue, Yuan; Li, Junlong; Shang, Hongwei; Sargent, Daniel J.

    2014-01-01

    Evaluation of candidate surrogate endpoints using individual patient data from multiple clinical trials is considered the gold standard approach to validate surrogates at both patient and trial levels. However, this approach assumes the availability of patient-level data from a relatively large collection of similar trials, which may not be possible to achieve for a given disease application. One common solution to the problem of too few similar trials involves performing trial-level surrogacy analyses on trial sub-units (e.g., centers within trials), thereby artificially increasing the trial-level sample size for feasibility of the multi-trial analysis. To date, the practical impact of treating trial sub-units (centers) identically to trials in multi-trial surrogacy analyses remains unexplored, and conditions under which this ad hoc solution may in fact be reasonable have not been identified. We perform a simulation study to identify such conditions, and demonstrate practical implications using a multi-trial dataset of patients with early stage colon cancer. PMID:25061255

  13. End-Point Immobilization of Recombinant Thrombomodulin via Sortase-Mediated Ligation

    Science.gov (United States)

    Jiang, Rui; Weingart, Jacob; Zhang, Hailong; Ma, Yong; Sun, Xue-Long

    2012-01-01

    We report an enzymatic end-point modification and immobilization of recombinant human thrombomodulin (TM), a cofactor for activation of the anticoagulant protein C pathway via thrombin. First, a truncated TM mutant consisting of epidermal growth factor-like domains 4–6 (TM456) with a conserved pentapeptide LPETG motif at its C-terminus was expressed and purified in E. coli. Next, the truncated TM456 derivative was site-specifically modified with N-terminal diglycine-containing molecules such as biotin and the fluorescent probe dansyl via sortase A (SrtA) mediated ligation (SML). The successful ligations were confirmed by SDS-PAGE and fluorescence imaging. Finally, the truncated TM456 was immobilized onto an N-terminal diglycine-functionalized glass slide surface via SML directly. Alternatively, the truncated TM456 was biotinylated via SML and then immobilized onto a streptavidin-functionalized glass slide surface indirectly. The successful immobilizations were confirmed by fluorescence imaging. The bioactivity of the immobilized truncated TM456 was further confirmed by a protein C activation assay, in which enhanced activation of protein C by immobilized recombinant TM was observed. The sortase A-catalyzed surface ligation took place under mild conditions and is rapid, occurring in a single step without prior chemical modification of the target protein. This site-specific covalent modification leads to molecules being arranged in a definitively ordered fashion, facilitating the preservation of the protein's biological activity. PMID:22372933

  14. Update of the International Consensus on Palliative Radiotherapy Endpoints for Future Clinical Trials in Bone Metastases

    Energy Technology Data Exchange (ETDEWEB)

    Chow, Edward, E-mail: Edward.Chow@sunnybrook.ca [Department of Radiation Oncology, Odette Cancer Centre, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, ON (Canada); Hoskin, Peter [Mount Vernon Centre for Cancer Treatment, Mount Vernon Hospital, Northwood, Middlesex (United Kingdom); Mitera, Gunita; Zeng Liang [Department of Radiation Oncology, Odette Cancer Centre, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, ON (Canada); Lutz, Stephen [Department of Radiation Oncology, Blanchard Valley Regional Cancer Center, Findlay, OH (United States); Roos, Daniel [Department of Radiation Oncology, Royal Adelaide Hospital, Adelaide, South Australia (Australia); Hahn, Carol [Department of Radiation Oncology, Duke University Medical Center, Durham, NC (United States); Linden, Yvette van der [Radiotherapeutic Institute Friesland, Leeuwarden (Netherlands); Hartsell, William [Department of Radiation Oncology, Advocate Good Samaritan Cancer Center, Downers Grove, IL (United States); Kumar, Eshwar [Department of Oncology, Atlantic Health Sciences Cancer Centre, Saint John Regional Hospital, Saint John, NB (Canada)

    2012-04-01

    Purpose: To update the international consensus on palliative radiotherapy endpoints for future clinical trials in bone metastases by surveying international experts regarding previous uncertainties within the 2002 consensus and changes that may be necessary based on practice pattern changes and research findings since that time. Methods and Materials: A two-phase survey was used to determine revisions and new additions to the 2002 consensus. A total of 49 experts from the American Society for Radiation Oncology, the European Society for Therapeutic Radiology and Oncology, the Faculty of Radiation Oncology of the Royal Australian and New Zealand College of Radiologists, and the Canadian Association of Radiation Oncology who are directly involved in the care of patients with bone metastases participated in this survey. Results: Consensus was established in areas involving response definitions, eligibility criteria for future trials, reirradiation, changes in systemic therapy, radiation techniques, parameters at follow-up, and timing of assessments. Conclusion: An outline for trials in bone metastases was updated based on survey and consensus. Investigators leading trials in bone metastases are encouraged to adopt the revised guideline to promote consistent reporting. Areas for future research were identified. It is intended for the consensus to be re-examined in the future on a regular basis.

  15. Determining significant endpoints for ecological risk analyses. 1997 annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    Hinton, T.G.; Congdon, J.; Rowe, C.; Scott, D. [Univ. of Georgia, Aiken, SC (US). Savannah River Ecology Lab.]; Bedford, J.; Whicker, F.W. [Colorado State Univ., Fort Collins, CO (US)]

    1997-11-01

    This report summarizes the first year's progress of research funded under the Department of Energy's Environmental Management Science Program. The research was initiated to better determine ecological risks from toxic and radioactive contaminants. More precisely, the research is designed to determine the relevancy of sublethal cellular damage to the performance of individuals and to identify characteristics of non-human populations exposed to chronic, low-level radiation, as is typically found on many DOE sites. The authors propose to establish a protocol to assess risks to non-human species at higher levels of biological organization by relating molecular damage to more relevant responses that reflect population health. They think that they can achieve this by coupling changes in metabolic rates and energy allocation patterns to meaningful population response variables, and by using novel biological dosimeters in controlled, manipulative dose/effects experiments. They believe that a scientifically defensible endpoint for measuring ecological risks can only be determined once it is understood to what extent molecular damage from contaminant exposure is detrimental at the individual and population levels of biological organization.

  16. The interpolation method based on endpoint coordinate for CT three-dimensional image

    International Nuclear Information System (INIS)

    Suto, Yasuzo; Ueno, Shigeru.

    1997-01-01

    Image interpolation is frequently used to improve slice resolution so that it approaches the in-plane spatial resolution. Improved quality of reconstructed three-dimensional images can be attained with this technique as a result. Linear interpolation is a well-known and widely used method. The distance-image method, a non-linear interpolation technique, is also used to convert CT-value images to distance images. This paper describes a newly developed method that makes use of end-point coordinates: CT-value images are first converted to binary images by thresholding, and then sequences of 1-valued pixels are arranged in vertical or horizontal directions. A sequence of 1-valued pixels is defined as a line segment with a start point and an end point. For each pair of adjacent line segments, another line segment is composed by spatial interpolation of the start and end points. Binary slice images are constructed from the composed line segments. Three-dimensional images were reconstructed from clinical X-ray CT images using three different interpolation methods, and their quality and processing speed were evaluated and compared. (author)
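
    For a row containing a single run of 1-valued pixels, the end-point idea reduces to interpolating the run's start and end coordinates between adjacent slices, as in this simplified sketch (synthetic rows, one run per row).

```python
# Hedged sketch: interpolate a binary row between two slices by averaging the
# start/end coordinates of its 1-valued run (simplified to one run per row).
import numpy as np

def run_endpoints(row):
    """Start and end indices of the single run of 1s in a binary row."""
    idx = np.flatnonzero(row)
    return idx[0], idx[-1]

def interpolate_row(row_a, row_b, t=0.5):
    s_a, e_a = run_endpoints(row_a)
    s_b, e_b = run_endpoints(row_b)
    s = round((1 - t) * s_a + t * s_b)      # interpolated start point
    e = round((1 - t) * e_a + t * e_b)      # interpolated end point
    out = np.zeros_like(row_a)
    out[s:e + 1] = 1
    return out

a = np.array([0, 0, 1, 1, 1, 1, 0, 0, 0, 0])
b = np.array([0, 0, 0, 0, 1, 1, 1, 1, 0, 0])
print(interpolate_row(a, b))                # run lies halfway between a and b
```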

  17. Speech endpoint detection with non-language speech sounds for generic speech processing applications

    Science.gov (United States)

    McClain, Matthew; Romanowski, Brian

    2009-05-01

    Non-language speech sounds (NLSS) are sounds produced by humans that do not carry linguistic information. Examples of these sounds are coughs, clicks, breaths, and filled pauses such as "uh" and "um" in English. NLSS are prominent in conversational speech, but can be a significant source of errors in speech processing applications. Traditionally, these sounds are ignored by speech endpoint detection algorithms, where speech regions are identified in the audio signal prior to processing. The ability to filter NLSS as a pre-processing step can significantly enhance the performance of many speech processing applications, such as speaker identification, language identification, and automatic speech recognition. In order to be used in all such applications, NLSS detection must be performed without the use of language models that provide knowledge of the phonology and lexical structure of speech. This is especially relevant to situations where the languages used in the audio are not known a priori. We present the results of preliminary experiments using data from American and British English speakers, in which segments of audio are classified as language speech sounds (LSS) or NLSS using a set of acoustic features designed for language-agnostic NLSS detection and a hidden Markov model (HMM) to model speech generation. The results of these experiments indicate that the features and model used are capable of detecting certain types of NLSS, such as breaths and clicks, while detection of other types of NLSS, such as filled pauses, will require future research.

  18. Analysing data from patient-reported outcome and quality of life endpoints for cancer clinical trials

    DEFF Research Database (Denmark)

    Bottomley, Andrew; Pe, Madeline; Sloan, Jeff

    2016-01-01

    Measures of health-related quality of life (HRQOL) and other patient-reported outcomes generate important data in cancer randomised trials to assist in assessing the risks and benefits of cancer therapies and fostering patient-centred cancer care. However, the various ways these measures are analysed and interpreted make it difficult to compare results across trials and hinder the application of research findings to inform publications, product labelling, clinical guidelines, and health policy. To address these problems, the Setting International Standards in Analyzing Patient-Reported Outcomes and Quality of Life Endpoints Data (SISAQOL) initiative has been established. This consortium, directed by the European Organisation for Research and Treatment of Cancer (EORTC), was convened to provide recommendations on how to standardise the analysis of HRQOL and other patient-reported outcomes...

  19. Update of the International Consensus on Palliative Radiotherapy Endpoints for Future Clinical Trials in Bone Metastases

    International Nuclear Information System (INIS)

    Chow, Edward; Hoskin, Peter; Mitera, Gunita; Zeng Liang; Lutz, Stephen; Roos, Daniel; Hahn, Carol; Linden, Yvette van der; Hartsell, William; Kumar, Eshwar

    2012-01-01

    Purpose: To update the international consensus on palliative radiotherapy endpoints for future clinical trials in bone metastases by surveying international experts regarding previous uncertainties within the 2002 consensus and changes that may be necessary based on practice pattern changes and research findings since that time. Methods and Materials: A two-phase survey was used to determine revisions and new additions to the 2002 consensus. A total of 49 experts from the American Society for Radiation Oncology, the European Society for Therapeutic Radiology and Oncology, the Faculty of Radiation Oncology of the Royal Australian and New Zealand College of Radiologists, and the Canadian Association of Radiation Oncology who are directly involved in the care of patients with bone metastases participated in this survey. Results: Consensus was established in areas involving response definitions, eligibility criteria for future trials, reirradiation, changes in systemic therapy, radiation techniques, parameters at follow-up, and timing of assessments. Conclusion: An outline for trials in bone metastases was updated based on survey and consensus. Investigators leading trials in bone metastases are encouraged to adopt the revised guideline to promote consistent reporting. Areas for future research were identified. It is intended for the consensus to be re-examined in the future on a regular basis.

  20. Improved Survival Endpoints With Adjuvant Radiation Treatment in Patients With High-Risk Early-Stage Endometrial Carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Elshaikh, Mohamed A., E-mail: melshai1@hfhs.org [Department of Radiation Oncology, Henry Ford Hospital, Detroit, Michigan (United States); Vance, Sean; Suri, Jaipreet S. [Department of Radiation Oncology, Henry Ford Hospital, Detroit, Michigan (United States); Mahan, Meredith [Public Health Science, Henry Ford Hospital, Detroit, Michigan (United States); Munkarah, Adnan [Division of Gynecologic Oncology, Department of Women' s Health Services, Henry Ford Hospital, Detroit, Michigan (United States)

    2014-02-01

    Purpose/Objective(s): To determine the impact of adjuvant radiation treatment (RT) on recurrence-free survival (RFS), disease-specific survival (DSS), and overall survival (OS) in patients with high-risk 2009 International Federation of Gynecology and Obstetrics stage I-II endometrial carcinoma. Methods and Materials: We identified 382 patients with high-risk endometrial carcinoma who underwent hysterectomy. RFS, DSS, and OS were calculated from the date of hysterectomy by use of the Kaplan-Meier method. Cox regression modeling was used to explore the risks associated with various factors on survival endpoints. Results: The median follow-up time for the study cohort was 5.4 years. The median age was 71 years. All patients underwent hysterectomy and salpingo-oophorectomy, 93% had peritoneal cytology, and 85% underwent lymphadenectomy. Patients with endometrioid histology constituted 72% of the study cohort, serous in 16%, clear cell in 7%, and mixed histology in 4%. Twenty-three percent of patients had stage II disease. Adjuvant management included RT alone in 220 patients (57%), chemotherapy alone in 25 patients (7%), and chemoradiation therapy in 27 patients (7%); 110 patients (29%) were treated with close surveillance. The 5-year RFS, DSS, and OS were 76%, 88%, and 73%, respectively. On multivariate analysis, adjuvant RT was a significant predictor of RFS (P<.001), DSS (P<.001), and OS (P=.017). Lymphovascular space involvement was a significant predictor of RFS and DSS (P<.001). High tumor grade was a significant predictor for RFS (P=.038) and DSS (P=.025). Involvement of the lower uterine segment was also a predictor of RFS (P=.049). Age at diagnosis and lymphovascular space involvement were significant predictors of OS: P<.001 and P=.002, respectively. Conclusion: In the treatment of patients with high-risk features, our study suggests that adjuvant RT significantly improves recurrence-free, disease-specific, and overall survival in patients with early-stage endometrial carcinoma.
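
    The survival analysis described (Kaplan-Meier estimation and Cox regression) can be sketched with the lifelines package; the dataframe is synthetic and the covariate names are illustrative stand-ins for the study variables.

```python
# Hedged sketch: Kaplan-Meier estimate plus a Cox model on synthetic data
# shaped like the study (n = 382); covariate names are illustrative only.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(5)
n = 382
df = pd.DataFrame({
    "years": rng.exponential(8.0, n),       # time to recurrence or censoring
    "event": rng.integers(0, 2, n),         # 1 = recurrence observed
    "adjuvant_rt": rng.integers(0, 2, n),
    "lvsi": rng.integers(0, 2, n),          # lymphovascular space involvement
    "high_grade": rng.integers(0, 2, n),
})

kmf = KaplanMeierFitter()
kmf.fit(df["years"], event_observed=df["event"], label="all patients")
print(kmf.median_survival_time_)

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")
cph.print_summary()                         # hazard ratios with p-values
```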

  1. Fourier-transform infrared spectroscopy as a novel approach to providing effect-based endpoints in duckweed toxicity testing.

    Science.gov (United States)

    Hu, Li-Xin; Ying, Guang-Guo; Chen, Xiao-Wen; Huang, Guo-Yong; Liu, You-Sheng; Jiang, Yu-Xia; Pan, Chang-Gui; Tian, Fei; Martin, Francis L

    2017-02-01

    Traditional duckweed toxicity tests only measure plant growth inhibition as an endpoint, with limited effects-based data. The present study aimed to investigate whether Fourier-transform infrared (FTIR) spectroscopy could enhance the duckweed (Lemna minor L.) toxicity test. Four chemicals (Cu, Cd, atrazine, and acetochlor) and 4 metal-containing industrial wastewater samples were tested. After exposure of duckweed to the chemicals, standard toxicity endpoints (frond number and chlorophyll content) were determined; the fronds were also interrogated using FTIR spectroscopy under optimized test conditions. Biochemical alterations associated with each treatment were assessed and further analyzed by multivariate analysis. The results showed that comparable effective concentration (ECx) values could be achieved based on FTIR spectroscopy in comparison with those based on traditional toxicity endpoints. Biochemical alterations associated with different doses of toxicant were mainly attributed to lipid, protein, nucleic acid, and carbohydrate structural changes, which helped to explain toxic mechanisms. With the help of multivariate analysis, separation of clusters related to different exposure doses could be achieved. The present study is the first to show successful application of FTIR spectroscopy in standard duckweed toxicity tests with biochemical alterations as new endpoints. Environ Toxicol Chem 2017;36:346-353. © 2016 SETAC.
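
    The multivariate step can be illustrated by projecting spectra onto principal components, under which exposure groups separate into clusters; the spectra below are synthetic stand-ins for baseline-corrected FTIR measurements.

```python
# Hedged sketch: PCA projection of FTIR-like spectra so control and exposed
# groups separate into clusters (fully synthetic spectra).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
n_points = 900                                     # points per spectrum
control = rng.normal(0.0, 1.0, (20, n_points))
exposed = rng.normal(0.0, 1.0, (20, n_points))
exposed[:, 300:320] += 2.0                         # e.g. a protein-band shift

X = np.vstack([control, exposed])
scores = PCA(n_components=2).fit_transform(X)
print(scores[:20].mean(axis=0))                    # control centroid
print(scores[20:].mean(axis=0))                    # exposed centroid, separated on PC1
```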

  2. Designing quantitative structure activity relationships to predict specific toxic endpoints for polybrominated diphenyl ethers in mammalian cells.

    Science.gov (United States)

    Rawat, S; Bruce, E D

    2014-01-01

    Polybrominated diphenyl ethers (PBDEs) are known as effective flame retardants and have vast industrial application in products like plastics, building materials and textiles. They are found to be structurally similar to thyroid hormones that are responsible for regulating metabolism in the body. Structural similarity with the hormones poses a threat to human health because, once in the system, PBDEs have the potential to affect thyroid hormone transport and metabolism. This study was aimed at designing quantitative structure-activity relationship (QSAR) models for predicting toxic endpoints, namely cell viability and apoptosis, elicited by PBDEs in mammalian cells. Cell viability was evaluated quantitatively using a general cytotoxicity bioassay with Janus Green dye, and apoptosis was evaluated using a caspase assay. This study has thus modelled the overall cytotoxic influence of PBDEs at an early and a late endpoint by the Genetic Function Approximation method. This research was a twofold process: running in vitro bioassays to collect data on the toxic endpoints, and modelling the evaluated endpoints using QSARs. Cell viability and apoptosis responses for Hep G2 cells exposed to PBDEs were successfully modelled with r² values of 0.97 and 0.94, respectively.

  3. Updated standardized endpoint definitions for transcatheter aortic valve implantation: The Valve Academic Research Consortium-2 consensus document

    NARCIS (Netherlands)

    A.P. Kappetein (Arie Pieter); S.J. Head (Stuart); P. Généreux (Philippe); N. Piazza (Nicolo); N.M. van Mieghem (Nicolas); E.H. Blackstone (Eugene); T.G. Brott (Thomas); D.J. Cohen (David J.); D.E. Cutlip (Donald); G.A. van Es (Gerrit Anne); R.T. Hahn (Rebecca); A.J. Kirtane (Ajay); M. Krucoff (Mitchell); S. Kodali (Susheel); M.J. Mack (Michael); R. Mehran (Roxana); J. Rodés-Cabau (Josep); P. Vranckx (Pascal); J.G. Webb (John); S.W. Windecker (Stephan); P.W.J.C. Serruys (Patrick); M.B. Leon (Martin)

    2012-01-01

    Objectives: The aim of the current Valve Academic Research Consortium (VARC)-2 initiative was to revisit the selection and definitions of transcatheter aortic valve implantation (TAVI) clinical endpoints to make them more suitable to the present and future needs of clinical trials.

  4. Definitions and validation criteria for biomarkers and surrogate endpoints: development and testing of a quantitative hierarchical levels of evidence schema

    NARCIS (Netherlands)

    Lassere, Marissa N.; Johnson, Kent R.; Boers, Maarten; Tugwell, Peter; Brooks, Peter; Simon, Lee; Strand, Vibeke; Conaghan, Philip G.; Ostergaard, Mikkel; Maksymowych, Walter P.; Landewe, Robert; Bresnihan, Barry; Tak, Paul-Peter; Wakefield, Richard; Mease, Philip; Bingham, Clifton O.; Hughes, Michael; Altman, Doug; Buyse, Marc; Galbraith, Sally; Wells, George

    2007-01-01

    OBJECTIVE: There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity and systematic methods to evaluate these aspects hinder their efficient application. Our objective was to review the literature on biomarkers and surrogates to

  5. Traditional and new composite endpoints in heart failure clinical trials : facilitating comprehensive efficacy assessments and improving trial efficiency

    NARCIS (Netherlands)

    Anker, Stefan D.; Schroeder, Stefan; Atar, Dan; Bax, Jeroen J.; Ceconi, Claudio; Cowie, Martin R.; Crisp, Adam; Dominjon, Fabienne; Ford, Ian; Ghofrani, Hossein-Ardeschir; Gropper, Savion; Hindricks, Gerhard; Hlatky, Mark A.; Holcomb, Richard; Honarpour, Narimon; Jukema, J. Wouter; Kim, Albert M.; Kunz, Michael; Lefkowitz, Martin; Le Floch, Chantal; Landmesser, Ulf; McDonagh, Theresa A.; McMurray, John J.; Merkely, Bela; Packer, Milton; Prasad, Krishna; Revkin, James; Rosano, Giuseppe M. C.; Somaratne, Ransi; Stough, Wendy Gattis; Voors, Adriaan A.; Ruschitzka, Frank

    Composite endpoints are commonly used as the primary measure of efficacy in heart failure clinical trials to assess the overall treatment effect and to increase the efficiency of trials. Clinical trials still must enrol large numbers of patients to accrue a sufficient number of outcome events and

  6. Energetic endpoints provide early indicators of life history effects in a freshwater gastropod exposed to the fungicide, pyraclostrobin.

    Science.gov (United States)

    Fidder, Bridgette N; Reátegui-Zirena, Evelyn G; Olson, Adric D; Salice, Christopher J

    2016-04-01

    Organismal energetics provide important insights into the effects of environmental toxicants. We aimed to determine the effects of pyraclostrobin on Lymnaea stagnalis by examining energy allocation patterns and life history traits. Juvenile snails exposed to pyraclostrobin decreased feeding rate and increased apparent avoidance behaviors at environmentally relevant concentrations. In adults, we found that sublethal concentrations of pyraclostrobin did not affect reproductive output; however, there were significant effects on developmental endpoints, with longer time to hatch and decreased hatching success in pyraclostrobin-exposed egg masses. Further, there were apparent differences in developmental effects depending on whether mothers were also exposed to pyraclostrobin, suggesting this chemical can exert intergenerational effects. Pyraclostrobin also affected the protein and carbohydrate content of eggs from exposed mothers. Significant effects on macronutrient content of eggs occurred at lower concentrations than effects on gross endpoints such as hatching success and time to hatch, suggesting potential value for these endpoints as early indicators of ecologically relevant stress. These results provide important insight into the effects of a common fungicide on important endpoints for organismal energetics and life history. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. The Impact of Multiple Endpoint Dependency on Q and I² in Meta-Analysis

    Science.gov (United States)

    Thompson, Christopher Glen; Becker, Betsy Jane

    2014-01-01

    A common assumption in meta-analysis is that effect sizes are independent. When correlated effect sizes are analyzed using traditional univariate techniques, this assumption is violated. This research assesses the impact of dependence arising from treatment-control studies with multiple endpoints on the homogeneity measures Q and I².
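
    For reference, Q and I² are computed from effect sizes and within-study variances as in the worked sketch below, which treats the effects as independent (precisely the assumption whose violation the paper studies); all numbers are illustrative.

```python
# Worked sketch: Cochran's Q and Higgins' I^2 for a small set of effect
# sizes, under the usual independence assumption (illustrative numbers).
import numpy as np

y = np.array([0.30, 0.45, 0.10, 0.52, 0.25])   # study effect sizes
v = np.array([0.02, 0.03, 0.02, 0.04, 0.03])   # within-study variances

w = 1.0 / v                                    # inverse-variance weights
y_bar = np.sum(w * y) / np.sum(w)              # fixed-effect weighted mean
Q = np.sum(w * (y - y_bar) ** 2)               # Cochran's Q
df = y.size - 1
I2 = max(0.0, (Q - df) / Q) * 100.0            # % variation beyond chance
print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%")
```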

  8. Local expressions for one-loop calculations

    International Nuclear Information System (INIS)

    Wasson, D.A.; Koonin, S.E.

    1991-01-01

    We develop local expressions for the contributions of the short-wavelength vacuum modes to the one-loop vacuum energy. These expressions significantly improve the convergence properties of various "brute-force" calculational methods. They also provide a continuous series of approximations that interpolate between the brute-force calculations and the derivative expansion.

  9. Completely continuous and weakly completely continuous abstract ...

    Indian Academy of Sciences (India)

    An algebra A is called right completely continuous (right weakly completely continuous) ... Moreover, some applications of these results in group algebras are .... A linear subspace S(G) of L1(G) is said to be a Segal algebra, if it satisfies the.

  10. Acute effects of a prooxidant herbicide on the microalga Chlamydomonas reinhardtii: Screening cytotoxicity and genotoxicity endpoints

    International Nuclear Information System (INIS)

    Esperanza, Marta; Cid, Ángeles; Herrero, Concepción; Rioboo, Carmen

    2015-01-01

    Highlights: • Mitochondrial membrane potential constituted the most sensitive parameter assayed. • Several genotoxicity methods were applied for the first time in ecotoxicological studies. • Oxidative DNA base damage (8-OHdG) was induced by paraquat exposure. • Cells with DNA strand breakage and subG1-nuclei increased in treated cultures. • Typical apoptosis hallmarks were observed in microalgal cells exposed to paraquat. - Abstract: Since recent evidence has demonstrated that many types of chemicals exhibit oxidative and/or genotoxic potential on living organisms, reactive oxygen species (ROS) formation and DNA damage are currently the best accepted paradigms to assess the potential hazardous biological effects of a wide range of contaminants. The goal of this study was to evaluate the sensitivity of different cytotoxicity and genotoxicity responses on the model microalga Chlamydomonas reinhardtii exposed to the prooxidant herbicide paraquat. In addition to the growth endpoint, cell viability, mitochondrial membrane potential and presence of reactive oxygen species (ROS) were assayed as potential markers of cytotoxicity using flow cytometry (FCM). To study the effects of paraquat on C. reinhardtii DNA, several genotoxicity approaches were implemented for the first time in an ecotoxicological study on microalgae. Oxidative DNA base damage was analysed by measuring the oxidative DNA lesion 8-OHdG by FCM. DNA fragmentation was analysed by different methods: comet assay, and cell cycle analysis by FCM, with a particular focus on the presence of subG1-nuclei. Finally, effects on morphology of nuclei were monitored through DAPI staining. The evaluation of these endpoints showed that several physiological and biochemical parameters reacted to oxidative stress disturbances with greater sensitivity than integrative parameters such as growth rates or cell viability. The experiments revealed concentration-dependent cytotoxicity (ROS formation, depolarization of

  11. Acute effects of a prooxidant herbicide on the microalga Chlamydomonas reinhardtii: Screening cytotoxicity and genotoxicity endpoints

    Energy Technology Data Exchange (ETDEWEB)

    Esperanza, Marta; Cid, Ángeles; Herrero, Concepción; Rioboo, Carmen, E-mail: carmen.rioboo@udc.es

    2015-08-15

    Highlights: • Mitochondrial membrane potential constituted the most sensitive parameter assayed. • Several genotoxicity methods were applied for the first time in ecotoxicological studies. • Oxidative DNA base damage (8-OHdG) was induced by paraquat exposure. • Cells with DNA strand breakage and subG1-nuclei increased in treated cultures. • Typical apoptosis hallmarks were observed in microalgal cells exposed to paraquat. - Abstract: Since recent evidence has demonstrated that many types of chemicals exhibit oxidative and/or genotoxic potential on living organisms, reactive oxygen species (ROS) formation and DNA damage are currently the best accepted paradigms to assess the potential hazardous biological effects of a wide range of contaminants. The goal of this study was to evaluate the sensitivity of different cytotoxicity and genotoxicity responses on the model microalga Chlamydomonas reinhardtii exposed to the prooxidant herbicide paraquat. In addition to the growth endpoint, cell viability, mitochondrial membrane potential and presence of reactive oxygen species (ROS) were assayed as potential markers of cytotoxicity using flow cytometry (FCM). To study the effects of paraquat on C. reinhardtii DNA, several genotoxicity approaches were implemented for the first time in an ecotoxicological study on microalgae. Oxidative DNA base damage was analysed by measuring the oxidative DNA lesion 8-OHdG by FCM. DNA fragmentation was analysed by different methods: comet assay, and cell cycle analysis by FCM, with a particular focus on the presence of subG1-nuclei. Finally, effects on morphology of nuclei were monitored through DAPI staining. The evaluation of these endpoints showed that several physiological and biochemical parameters reacted to oxidative stress disturbances with greater sensitivity than integrative parameters such as growth rates or cell viability. The experiments revealed concentration-dependent cytotoxicity (ROS formation, depolarization of

  12. Heterogeneous Calculation of ε

    Energy Technology Data Exchange (ETDEWEB)

    Jonsson, Alf

    1961-02-15

    A heterogeneous method of calculating the fast fission factor given by Naudet has been applied to the Carlvik-Pershagen definition of ε. An exact calculation of the collision probabilities is included in the programme developed for the Ferranti-Mercury computer.

  13. Heterogeneous Calculation of ε

    International Nuclear Information System (INIS)

    Jonsson, Alf

    1961-02-01

    A heterogeneous method of calculating the fast fission factor given by Naudet has been applied to the Carlvik-Pershagen definition of ε. An exact calculation of the collision probabilities is included in the programme developed for the Ferranti-Mercury computer.

  14. Effect of dietary oils on peripheral neuropathy-related endpoints in dietary obese rats

    Directory of Open Access Journals (Sweden)

    Coppey L

    2018-04-01

    Full Text Available Lawrence Coppey, Eric Davidson, Hanna Shevalye, Michael E Torres, Mark A Yorek (Department of Internal Medicine, University of Iowa, Iowa City, IA, USA; Department of Veterans Affairs Iowa City Health Care System, Iowa City, IA, USA; Department of Veterans Affairs, Veterans Affairs Center for the Prevention and Treatment of Visual Loss, Iowa City, IA, USA; Fraternal Order of Eagles Diabetes Research Center, University of Iowa, Iowa City, IA, USA) Purpose: This study aimed to determine the effect of dietary oils (olive, safflower, evening primrose, flaxseed, or menhaden) enriched in different monounsaturated fatty acids or polyunsaturated fatty acids on peripheral neuropathies in diet-induced obese Sprague-Dawley rats. Materials and methods: Rats at 12 weeks of age were fed a high-fat diet (45% kcal) for 16 weeks. Afterward, the rats were fed diets with 50% of the kilocalories of fat derived from lard replaced by the different dietary oils. In addition, a control group fed a standard diet (4% kcal fat) and a high-fat-fed group (45% kcal) were maintained. The treatment period was 32 weeks. The endpoints evaluated included motor and sensory nerve conduction velocity, thermal sensitivity, innervation of sensory nerves in the cornea and skin, and vascular relaxation by epineurial arterioles. Results: Menhaden oil provided the greatest benefit for improving peripheral nerve damage caused by dietary obesity. Similar results were obtained when we examined acetylcholine-mediated vascular relaxation of epineurial arterioles of the sciatic nerve. Enriching the diets with fatty acids derived from the other oils provided minimal to partial improvements. Conclusion: These studies suggest that omega-3 polyunsaturated fatty acids derived from fish oil could be an effective treatment for neural and vascular complications associated with obesity. Keywords: peripheral neuropathy, fish oil, omega-3 polyunsaturated fatty acids, omega-6 polyunsaturated fatty

  15. Comparison of the sensitivity of different toxicological endpoints in Caco-2 cells after cadmium chloride treatment

    Energy Technology Data Exchange (ETDEWEB)

    Boveri, M.; Pazos, P.; Gennari, A.; Casado, J.; Hartung, T.; Prieto, P. [ECVAM, Inst. for Health and Consumer Protection, Joint Research Centre, European Commission, Ispra (Italy)

    2004-04-01

    The human colorectal adenocarcinoma cell line Caco-2 is a widely used in vitro model of the intestinal barrier. Cadmium chloride (CdCl₂) is a highly toxic metal compound, ubiquitous in the biosphere, able to enter the food chain and to reach the intestinal epithelium, causing structural and functional damage. The aim of this work was to characterise cadmium toxicity in Caco-2 cells and, in particular, to compare the sensitivity of different endpoints revealing damage both on the epithelial barrier and at the cellular or molecular level. After 24-h exposure of the cells to CdCl₂, lactate dehydrogenase (LDH) leakage showed cadmium-induced cell toxicity, significant from 25 μM CdCl₂ and above, and analysis of different cell death pathways indicated the presence of necrosis after treatment with 50 μM CdCl₂. At the molecular level, we observed an increase in the protective protein heat shock protein 70 (HSP70), starting at 10 μM CdCl₂. At the barrier level, transepithelial electrical resistance (TEER) decreased while paracellular permeability (PCP) significantly increased after the treatment, showing an EC₅₀ of 6 and 16 μM CdCl₂, respectively, and indicating the loss of barrier integrity. In conclusion, our data reveal that CdCl₂ toxicity in Caco-2 cells can be detected at the barrier level at very low concentrations; also, HSP70 was shown to be a sensitive marker for detecting in vitro cadmium-induced toxicity. (orig.)
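
    A minimal illustration of the EC₅₀ estimation underlying results such as the barrier endpoints above: fitting a four-parameter logistic (Hill) curve to concentration-response data. This is a hedged sketch, not the authors' procedure; the concentrations, responses and starting values below are invented for illustration.

        # Sketch: EC50 from concentration-response data via a 4-parameter
        # logistic fit. All data values are illustrative, not from the study.
        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(c, bottom, top, ec50, hill):
            """Four-parameter logistic response as a function of concentration c."""
            return bottom + (top - bottom) / (1.0 + (c / ec50) ** hill)

        conc = np.array([1, 2, 5, 10, 20, 50, 100.0])   # assumed uM CdCl2
        teer = np.array([98, 95, 60, 35, 20, 8, 5.0])   # assumed % of control TEER

        popt, _ = curve_fit(four_pl, conc, teer, p0=[5.0, 100.0, 10.0, 1.0])
        print(f"Fitted EC50 = {popt[2]:.1f} uM")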

  16. Is Overall Mortality the Right Composite Endpoint in Clinical Trials of Acute Respiratory Distress Syndrome?

    Science.gov (United States)

    Villar, Jesús; Martínez, Domingo; Mosteiro, Fernando; Ambrós, Alfonso; Añón, José M; Ferrando, Carlos; Soler, Juan A; Montiel, Raquel; Vidal, Anxela; Conesa-Cayuela, Luís A; Blanco, Jesús; Arrojo, Regina; Solano, Rosario; Capilla, Lucía; Del Campo, Rafael; Civantos, Belén; Fernández, María Mar; Aldecoa, César; Parra, Laura; Gutiérrez, Andrea; Martínez-Jiménez, Chanel; González-Martín, Jesús M; Fernández, Rosa L; Kacmarek, Robert M

    2018-06-01

    Overall mortality in patients with acute respiratory distress syndrome is a composite endpoint because it includes death from multiple causes. In most acute respiratory distress syndrome trials, it is unknown whether reported deaths are due to acute respiratory distress syndrome or the underlying disease, unrelated to the specific intervention tested. We investigated the causes of death after contracting acute respiratory distress syndrome in a large cohort. A secondary analysis from three prospective, multicenter, observational studies. A network of multidisciplinary ICUs. We studied 778 patients with moderate-to-severe acute respiratory distress syndrome treated with lung-protective ventilation. None. We examined death in the ICU from individual causes. Overall ICU mortality was 38.8% (95% CI, 35.4-42.3). Causes of acute respiratory distress syndrome modified the risk of death. Twenty-three percent of deaths occurred from refractory hypoxemia due to nonresolving acute respiratory distress syndrome. Most patients died from causes unrelated to acute respiratory distress syndrome: 48.7% of nonsurvivors died from multisystem organ failure, and cancer or brain injury was involved in 37.1% of deaths. When quantifying the true burden of acute respiratory distress syndrome outcome, we identified 506 patients (65.0%) with one or more exclusion criteria for enrollment into current interventional trials. Overall ICU mortality of the "trial cohort" (21.3%) was markedly lower than in the parent cohort (relative risk, 0.55; 95% CI, 0.43-0.70). Most deaths in acute respiratory distress syndrome patients are not directly related to lung damage but to extrapulmonary multisystem organ failure. It would be challenging to prove that specific lung-directed therapies have an effect on overall survival.

  17. Transmission assessment surveys (TAS) to define endpoints for lymphatic filariasis mass drug administration: a multicenter evaluation.

    Directory of Open Access Journals (Sweden)

    Brian K Chu

    Full Text Available BACKGROUND: Lymphatic filariasis (LF) is targeted for global elimination through treatment of entire at-risk populations with repeated annual mass drug administration (MDA). Essential for program success is defining and confirming the appropriate endpoint for MDA when transmission is presumed to have reached a level low enough that it cannot be sustained even in the absence of drug intervention. Guidelines advanced by WHO call for a transmission assessment survey (TAS) to determine if MDA can be stopped within an LF evaluation unit (EU) after at least five effective rounds of annual treatment. To test the value and practicality of these guidelines, a multicenter operational research trial was undertaken in 11 countries covering various geographic and epidemiological settings. METHODOLOGY: The TAS was conducted twice in each EU with TAS-1 and TAS-2 approximately 24 months apart. Lot quality assurance sampling (LQAS) formed the basis of the TAS survey design but specific EU characteristics defined the survey site (school or community), eligible population (6-7 year olds or 1st-2nd graders), survey type (systematic or cluster-sampling), target sample size, and critical cutoff (a statistically powered threshold below which transmission is expected to be no longer sustainable). The primary diagnostic tools were the immunochromatographic (ICT) test for W. bancrofti EUs and the BmR1 test (Brugia Rapid or PanLF) for Brugia spp. EUs. PRINCIPAL FINDINGS/CONCLUSIONS: In 10 of 11 EUs, the number of TAS-1 positive cases was below the critical cutoff, indicating that MDA could be stopped. The same results were found in the follow-up TAS-2, therefore, confirming the previous decision outcome. Sample sizes were highly sex and age-representative and closely matched the target value after factoring in estimates of non-participation. The TAS was determined to be a practical and effective evaluation tool for stopping MDA although its validity for longer-term post-MDA surveillance requires further investigation.

  18. Predictive validity of endpoints used in electrophysiological modelling of migraine in the trigeminovascular system.

    Science.gov (United States)

    Farkas, Bence; Kardos, Péter; Orosz, Szabolcs; Tarnawa, István; Csekő, Csongor; Lévay, György; Farkas, Sándor; Lendvai, Balázs; Kovács, Péter

    2015-11-02

    The trigeminovascular system has a pivotal role in the pathomechanism of migraine. The aim of the present study was to further develop existing models of migraine, making them more suitable for testing the effects of compounds with presumed antimigraine activity in anaesthetised rats. Simultaneous recordings of the ongoing activity of spontaneously active neurons in the trigeminocervical complex, as well as their discharges evoked by electrical stimulation of the dura mater via activation of A- and C-sensory fibres, were carried out. Effects of sumatriptan, propranolol and topiramate were evaluated prior to and after application of a mixture containing inflammatory mediators on the dura. Propranolol (10 mg/kg s.c.) and topiramate (30 mg/kg s.c.) resulted in a tendency to decrease the level of both spontaneous and evoked activity, while sumatriptan (1 mg/kg s.c.) did not exhibit any effect on the recorded parameters. Application of an inflammatory soup to the dura mater boosted spontaneous activity, which could be significantly attenuated by propranolol and topiramate but not by sumatriptan. In addition, all compounds prevented the delayed increase of spontaneous firing. In contrast to the ongoing activity, evoked responses were not augmented by inflammatory mediators. Nevertheless, the inhibitory effect of propranolol and topiramate was evident when considering A- or C-fibre responses. The findings do not support the view that electrically evoked responses are useful for the measurement of trigeminal sensitization. It is proposed, however, that inhibition of enhanced firing (immediate and/or delayed) evoked by inflammatory mediators has, as an endpoint, higher predictive validity regarding the clinical effectiveness of compounds. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Determining significant endpoints for ecological risk analyses. 1998 annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    Hinton, T.G.; Congdon, J.; Scott, D. [Univ. of Georgia, Aiken, SC (US). Savannah River Ecology Lab.; Rowe, C. [Univ. of Puerto Rico, San Juan (PR); Bedford, J.; Whicker, W. [Colorado State Univ., Fort Collins, CO (US)

    1998-06-01

    The goal of this report is to establish a protocol for assessing risks to non-human populations exposed to environmental stresses typically found on many DOE sites. The authors think that they can achieve this by using novel biological dosimeters in controlled, manipulative dose/effects experiments, and by coupling changes in metabolic rates and energy allocation patterns to meaningful population response variables (such as age-specific survivorship, reproductive output, age at maturity and longevity). This research is needed to determine the relevancy of sublethal cellular damage to the performance of individuals and populations exposed to chronic, low-level radiation, and radiation with concomitant exposure to chemicals. They believe that a scientifically defensible endpoint for measuring ecological risks can only be determined once it is understood to what extent molecular damage from contaminant exposure is detrimental at the individual and population levels of biological organization. The experimental facility will allow them to develop a credible assessment tool for appraising ecological risks, and to evaluate the effects of radionuclide/chemical synergisms on non-human species. This report summarizes work completed midway through a 3-year project that began in November 1996. Emphasis to date has centered on three areas: (1) developing a molecular probe to measure stable chromosomal aberrations known as reciprocal translocations, (2) constructing an irradiation facility where the statistical power inherent in replicated mesocosms can be used to address the response of non-human organisms to exposures from low levels of radiation and metal contaminants, and (3) quantifying responses of organisms living in contaminated mesocosms and field sites.

  20. Transmission assessment surveys (TAS) to define endpoints for lymphatic filariasis mass drug administration: a multicenter evaluation.

    Science.gov (United States)

    Chu, Brian K; Deming, Michael; Biritwum, Nana-Kwadwo; Bougma, Windtaré R; Dorkenoo, Améyo M; El-Setouhy, Maged; Fischer, Peter U; Gass, Katherine; Gonzalez de Peña, Manuel; Mercado-Hernandez, Leda; Kyelem, Dominique; Lammie, Patrick J; Flueckiger, Rebecca M; Mwingira, Upendo J; Noordin, Rahmah; Offei Owusu, Irene; Ottesen, Eric A; Pavluck, Alexandre; Pilotte, Nils; Rao, Ramakrishna U; Samarasekera, Dilhani; Schmaedick, Mark A; Settinayake, Sunil; Simonsen, Paul E; Supali, Taniawati; Taleo, Fasihah; Torres, Melissa; Weil, Gary J; Won, Kimberly Y

    2013-01-01

    Lymphatic filariasis (LF) is targeted for global elimination through treatment of entire at-risk populations with repeated annual mass drug administration (MDA). Essential for program success is defining and confirming the appropriate endpoint for MDA when transmission is presumed to have reached a level low enough that it cannot be sustained even in the absence of drug intervention. Guidelines advanced by WHO call for a transmission assessment survey (TAS) to determine if MDA can be stopped within an LF evaluation unit (EU) after at least five effective rounds of annual treatment. To test the value and practicality of these guidelines, a multicenter operational research trial was undertaken in 11 countries covering various geographic and epidemiological settings. The TAS was conducted twice in each EU with TAS-1 and TAS-2 approximately 24 months apart. Lot quality assurance sampling (LQAS) formed the basis of the TAS survey design but specific EU characteristics defined the survey site (school or community), eligible population (6-7 year olds or 1(st)-2(nd) graders), survey type (systematic or cluster-sampling), target sample size, and critical cutoff (a statistically powered threshold below which transmission is expected to be no longer sustainable). The primary diagnostic tools were the immunochromatographic (ICT) test for W. bancrofti EUs and the BmR1 test (Brugia Rapid or PanLF) for Brugia spp. EUs. In 10 of 11 EUs, the number of TAS-1 positive cases was below the critical cutoff, indicating that MDA could be stopped. The same results were found in the follow-up TAS-2, therefore, confirming the previous decision outcome. Sample sizes were highly sex and age-representative and closely matched the target value after factoring in estimates of non-participation. The TAS was determined to be a practical and effective evaluation tool for stopping MDA although its validity for longer-term post-MDA surveillance requires further investigation.
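
    The decision logic of such a survey can be sketched as a lot-quality rule: stop MDA only if the observed number of positives does not exceed a critical cutoff. The sketch below is a simplified illustration assuming a plain binomial model, a 2% threshold prevalence and a 25% pass probability at the threshold; the actual WHO TAS cutoffs come from a more detailed survey design, and the survey numbers here are invented.

        # Sketch of an LQAS-style stopping rule (illustrative assumptions only).
        from scipy.stats import binom

        def critical_cutoff(n, p_threshold=0.02, alpha=0.25):
            """Largest d such that an EU truly at the threshold prevalence
            would pass (observe <= d positives) with probability <= alpha."""
            d = -1
            while binom.cdf(d + 1, n, p_threshold) <= alpha:
                d += 1
            return d

        n_sampled, positives = 1556, 8        # invented survey numbers
        cutoff = critical_cutoff(n_sampled)
        print("stop MDA" if positives <= cutoff else "continue MDA",
              "| critical cutoff =", cutoff)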

  1. Providing Continuous Assurance

    NARCIS (Netherlands)

    Kocken, Jonne; Hulstijn, Joris

    2017-01-01

    It has been claimed that continuous assurance can be attained by combining continuous monitoring by management, with continuous auditing of data streams and the effectiveness of internal controls by an external auditor. However, we find that in existing literature the final step to continuous

  2. Microcomputer generated pipe support calculations

    International Nuclear Information System (INIS)

    Hankinson, R.F.; Czarnowski, P.; Roemer, R.E.

    1991-01-01

    The cost and complexity of pipe support design has been a continuing challenge to the construction and modification of commercial nuclear facilities. Typically, pipe support design or qualification projects have required large numbers of engineers centrally located with access to mainframe computer facilities. Much engineering time has been spent repetitively performing a sequence of tasks to address complex design criteria and consolidating the results of calculations into documentation packages in accordance with strict quality requirements. The continuing challenges of cost and quality, the need for support engineering services at operating plant sites, and the substantial recent advances in microcomputer systems suggested that a stand-alone microcomputer pipe support calculation generator was feasible and had become a necessity for providing cost-effective and high quality pipe support engineering services to the industry. This paper outlines the preparation for, and the development of, an integrated pipe support design/evaluation software system which maintains all computer programs in the same environment, minimizes manual performance of standard or repetitive tasks, and generates a high quality calculation which is consistent and easily followed.

  3. Solution of the Cauchy problem for a continuous limit of the Toda lattice and its superextension

    International Nuclear Information System (INIS)

    Saveliev, M.V.; Sorba, P.

    1991-01-01

    A supersymmetric equation associated with a continuum limit of the classical superalgebra sl(n/n+1) is constructed. This equation can be considered as a superextension of a continuous limit of the Toda lattice with fixed end-points or, in other words, as a supersymmetric version of the heavenly equation. A solution of the Cauchy problem for the continuous limit of the Toda lattice and for its superextension is given using some formal reasonings. (orig.)

  4. Calculation of toroidal fusion reactor blankets by Monte Carlo

    International Nuclear Information System (INIS)

    Macdonald, J.L.; Cashwell, E.D.; Everett, C.J.

    1977-01-01

    A brief description of the calculational method is given. The code calculates energy deposition in toroidal geometry, but is a continuous energy Monte Carlo code, treating the reaction cross sections as well as the angular scattering distributions in great detail.

  5. Modulation of cigarette smoke-related end-points in mutagenesis and carcinogenesis

    International Nuclear Information System (INIS)

    De Flora, Silvio; D'Agostini, Francesco; Balansky, Roumen; Camoirano, Anna; Bennicelli, Carlo; Bagnasco, Maria; Cartiglia, Cristina; Tampa, Elena; Longobardi, Maria Grazia; Lubet, Ronald A.; Izzotti, Alberto

    2003-01-01

    The epidemic of lung cancer and the increase of other tumours and chronic degenerative diseases associated with tobacco smoking have represented one of the most dramatic catastrophes of the 20th century. The control of this plague is one of the major challenges of preventive medicine for the next decades. The imperative goal is to refrain from smoking. However, chemoprevention by dietary and/or pharmacological agents provides a complementary strategy, which can be targeted not only to current smokers but also to former smokers and passive smokers. This article summarises the results of studies performed in our laboratories during the last 10 years, and provides new data generated in vitro, in experimental animals and in humans. We compared the ability of 63 putative chemopreventive agents to inhibit the bacterial mutagenicity of mainstream cigarette smoke. Modulation by ethanol and the mechanisms involved were also investigated both in vitro and in vivo. Several studies evaluated the effects of dietary chemopreventive agents towards smoke-related intermediate biomarkers in various cells, tissues and organs of rodents. The investigated end-points included metabolic parameters, adducts to haemoglobin, bulky adducts to nuclear DNA, oxidative DNA damage, adducts to mitochondrial DNA, apoptosis, cytogenetic damage in alveolar macrophages, bone marrow and peripheral blood erythrocytes, proliferation markers, and histopathological alterations. The agents tested in vivo included N-acetyl-L-cysteine, 1,2-dithiole-3-thione, oltipraz, phenethyl isothiocyanate, 5,6-benzoflavone, and sulindac. We started applying multigene expression analysis to chemoprevention research, and postulated that an optimal agent should not excessively alter per se the physiological background of gene expression but should be able to attenuate the alterations produced by cigarette smoke or other carcinogens. We are working to develop an animal model for the induction of lung tumours following exposure

  6. The Growing Need for Validated Biomarkers and Endpoints for Dry Eye Clinical Research.

    Science.gov (United States)

    Roy, Neeta S; Wei, Yi; Kuklinski, Eric; Asbell, Penny A

    2017-05-01

    Dry eye disease (DED) is a medical challenge with a prevalence rate ranging from 8% to 50%. Many clinicians and researchers across the globe are searching for better answers to understand the mechanisms related to the development and chronicity of DED. Though there have been many clinical trials for DED, few new treatments have emerged over the last decade. Biomarkers may provide the needed breakthrough to propel our understanding of DED to the next level and the potential to realize our goal of truly personalized medicine based on scientific evidence. Clinical trials and research on DED have suffered from the lack of validated biomarkers and less than objective and reproducible endpoints. Current work on biomarkers has provided the groundwork to move forward. This review highlights primarily ocular biomarkers that have been investigated for use in DED, discusses the methodologic outcomes in providing objective metrics for clinical research, and suggests recommendations for further work.

  7. Core calculations of JMTR

    Energy Technology Data Exchange (ETDEWEB)

    Nagao, Yoshiharu [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment

    1998-03-01

    In material testing reactors like the 50 MW JMTR (Japan Materials Testing Reactor) of the Japan Atomic Energy Research Institute, the neutron flux and neutron energy spectra of irradiated samples show complex distributions. It is necessary to assess the neutron flux and neutron energy spectra of an irradiation field by carrying out the nuclear calculation of the core for every operation cycle. In order to advance core calculation in the JMTR, the application of MCNP to the assessment of core reactivity and of neutron flux and spectra has been investigated. In this study, in order to reduce calculation time and variance, the results of calculations using the K code and a fixed source, and the use of the Weight Window variance reduction technique, were compared. As to the calculation method, the modeling of the total JMTR core, the conditions for calculation and the adopted variance reduction technique are explained, and the results of the calculations are shown. No significant difference was observed in the neutron flux results arising from the different modeling of the fuel region in the K-code and fixed-source calculations. The method of assessing the results of the neutron flux calculation is described. (K.I.)

  8. New drugs and patient-centred end-points in old age: setting the wheels in motion.

    Science.gov (United States)

    Mangoni, Arduino A; Pilotto, Alberto

    2016-01-01

    Older patients with various degrees of frailty and disability, a key population target of pharmacological interventions in acute and chronic disease states, are virtually neglected in pre-marketing studies assessing the efficacy and safety of investigational drugs. Moreover, aggressively pursuing established therapeutic targets in old age, e.g. blood pressure, serum glucose or cholesterol concentrations, is not necessarily associated with the beneficial effects, and the acceptable safety, reported in younger patient cohorts. Measures of self-reported health and functional status might represent additional, more meaningful, therapeutic end-points in the older population, particularly in patients with significant frailty and relatively short life expectancy, e.g. in the presence of cancer and/or neurodegenerative disease conditions. Strategies enhancing early knowledge about key pharmacological characteristics of investigational drugs targeting older adults are discussed, together with the rationale for incorporating non-traditional, patient-centred, end-points in this ever-increasing group.

  9. Comparison of RESRAD with hand calculations

    International Nuclear Information System (INIS)

    Rittmann, P.D.

    1995-09-01

    This report is a continuation of an earlier comparison done with two other computer programs, GENII and PATHRAE. The dose calculations by the two programs were compared with each other and with hand calculations. These hand calculations have now been compared with RESRAD Version 5.41 to examine the use of standard models and parameters in this computer program. The hand calculations disclosed a significant computational error in RESRAD. The Pu-241 ingestion doses are five orders of magnitude too small. In addition, the external doses from some nuclides differ greatly from expected values. Both of these deficiencies have been corrected in later versions of RESRAD
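
    The kind of hand calculation used for such cross-checks can be as simple as multiplying a concentration, an intake rate and a dose coefficient. The sketch below is illustrative only; the concentration, intake rate and dose coefficient are assumed values, not RESRAD defaults.

        # Sketch: ingestion dose hand calculation (all values assumed).
        concentration = 50.0    # Bq/kg in the ingested medium (assumed)
        annual_intake = 100.0   # kg/yr ingested (assumed)
        dose_coeff = 2.8e-7     # Sv/Bq ingestion dose coefficient (illustrative)

        annual_dose_sv = concentration * annual_intake * dose_coeff
        print(f"Annual ingestion dose: {annual_dose_sv * 1e3:.2f} mSv")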

  10. How "humane" is your endpoint? Refining the science-driven approach for termination of animal studies of chronic infection.

    OpenAIRE

    Nuno H Franco; Margarida Correia-Neves; I Anna S Olsson

    2012-01-01

    Public concern on issues such as animal welfare or the scientific validity and clinical value of animal research is growing, resulting in increasing regulatory demands for animal research. Abiding to the most stringent animal welfare standards, while having scientific objectives as the main priority, is often challenging. To do so, endpoints of studies involving severe, progressive diseases need to be established considering how early in the disease process the scientific objectives can be ac...

  11. Coil protection calculator for TFTR

    International Nuclear Information System (INIS)

    Marsala, R.J.; Lawson, J.E.; Persing, R.G.; Senko, T.R.; Woolley, R.D.

    1989-01-01

    A new coil protection system (CPS) is being developed to replace the existing TFTR magnetic coil fault detector. The existing fault detector sacrifices TFTR operating capability for simplicity. The new CPS, when installed in October of 1988, will permit operation up to the actual coil stress limits by evaluating the critical parameters in real-time. The computation will be done in a microprocessor-based Coil Protection Calculator (CPC) currently under construction at PPL. The new CPC will allow TFTR to operate with higher plasma currents and will permit the optimization of pulse repetition rates. The CPC will provide real-time estimates of critical coil and bus temperatures and stresses based on real-time redundant measurements of coil currents, coil cooling water inlet temperature, and plasma current. The critical parameter calculations are compared to prespecified limits. If these limits are reached or exceeded, protective action will be initiated through a hard-wired control system (HCS), which will shut down the power supplies. The CPC consists of a redundant VME-based microprocessor system which will sample all input data and compute all stress quantities every ten milliseconds. Thermal calculations will be approximated every 10 ms, with an exact solution occurring every second. The CPC features continuous cross-checking of redundant input signals, automatic detection of internal failure modes, monitoring and recording of calculated results, and a quick, functional verification of performance via an internal test system. (author)
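
    The protection loop described above can be sketched as a repeated thermal-model update plus a limit check. The lumped single-node model and every parameter value below are assumptions made for illustration; they are not the actual CPC algorithm or TFTR data.

        # Sketch of a 10 ms protection loop (model and parameters assumed).
        DT = 0.010         # s, update interval
        R = 5.0e-3         # ohm, coil resistance (assumed)
        HEAT_CAP = 2.0e5   # J/K, lumped coil heat capacity (assumed)
        TAU = 120.0        # s, cooling time constant to inlet water (assumed)
        T_LIMIT = 150.0    # deg C, protection limit (assumed)

        def step(temp, current, t_inlet):
            """One update: ohmic heating minus relaxation toward the coolant."""
            joule = current ** 2 * R / HEAT_CAP     # heating rate, K/s
            cooling = (temp - t_inlet) / TAU        # cooling rate, K/s
            return temp + DT * (joule - cooling)

        temp = 30.0
        for _ in range(100_000):                    # 1000 s of simulated operation
            temp = step(temp, current=40.0e3, t_inlet=30.0)
            if temp >= T_LIMIT:
                print("limit reached -> initiate protective action")
                break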

  12. Pharmaceutics, Drug Delivery and Pharmaceutical Technology: A New Test Unit for Disintegration End-Point Determination of Orodispersible Films.

    Science.gov (United States)

    Low, Ariana; Kok, Si Ling; Khong, Yuetmei; Chan, Sui Yung; Gokhale, Rajeev

    2015-11-01

    No standard time or pharmacopoeia disintegration test method for orodispersible films (ODFs) exists. The USP disintegration test for tablets and capsules poses significant challenges for end-point determination when used for ODFs. We tested a newly developed disintegration test unit (DTU) against the USP disintegration test. The DTU is an accessory to the USP disintegration apparatus. It holds the ODF in a horizontal position, allowing top-view of the ODF during testing. A Gauge R&R study was conducted to assign relative contributions of the total variability from the operator, sample or the experimental set-up. Precision was compared using commercial ODF products in different media. Agreement between the two measurement methods was analysed. The DTU showed improved repeatability and reproducibility compared to the USP disintegration system with tighter standard deviations regardless of operator or medium. There is good agreement between the two methods, with the USP disintegration test giving generally longer disintegration times possibly due to difficulty in end-point determination. The DTU provided clear end-point determination and is suitable for quality control of ODFs during product developmental stage or manufacturing. This may facilitate the development of a standardized methodology for disintegration time determination of ODFs. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association. J Pharm Sci 104:3893-3903, 2015.

  13. The role of categorization and scale endpoint comparisons in numerical information processing: A two-process model.

    Science.gov (United States)

    Tao, Tao; Wyer, Robert S; Zheng, Yuhuang

    2017-03-01

    We propose a two-process conceptualization of numerical information processing to describe how people form impressions of a score that is described along a bounded scale. According to the model, people spontaneously categorize a score as high or low. Furthermore, they compare the numerical discrepancy between the score and the endpoint of the scale to which it is closer, if they are not confident of their categorization, and use implications of this comparison as a basis for judgment. As a result, their evaluation of the score is less extreme when the range of numbers along the scale is large (e.g., from 0 to 100) than when it is small (from 0 to 10). Six experiments support this two-process model and demonstrate its generalizability. Specifically, the magnitude of numbers composing the scale has less impact on judgments (a) when the score being evaluated is extreme, (b) when individuals are unmotivated to engage in endpoint comparison processes (i.e., they are low in need for cognition), and (c) when they are unable to do so (i.e., they are under cognitive load). Moreover, the endpoint to which individuals compare the score can depend on their regulatory focus. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Endpoint design for future renal denervation trials - Novel implications for a new definition of treatment response to renal denervation.

    Science.gov (United States)

    Lambert, Thomas; Nahler, Alexander; Rohla, Miklos; Reiter, Christian; Grund, Michael; Kammler, Jürgen; Blessberger, Hermann; Kypta, Alexander; Kellermair, Jörg; Schwarz, Stefan; Starnawski, Jennifer A; Lichtenauer, Michael; Weiss, Thomas W; Huber, Kurt; Steinwender, Clemens

    2016-10-01

    Defining an adequate endpoint for renal denervation trials represents a major challenge. A high inter-individual and intra-individual variability of blood pressure levels, as well as partial or total non-adherence to antihypertensive drugs, hamper treatment evaluations after renal denervation. Blood pressure measurements at a single point in time, as used as the primary endpoint in most clinical trials on renal denervation, might not be sufficient to discriminate between patients who do or do not respond to renal denervation. We compared the traditional responder classification (defined as a systolic 24-hour blood pressure reduction of -5 mmHg six months after renal denervation) with a novel definition of an ideal respondership (based on 24-h blood pressure reductions at none, one, or all follow-up timepoints). We were able to re-classify almost a quarter of patients. Blood pressure variability was substantial in patients traditionally defined as responders. On the other hand, our novel classification of an ideal respondership seems to be clinically superior in discriminating sustained from pseudo-response to renal denervation. Based on our observations, we recommend that the traditional response classification should be reconsidered and possibly strengthened by using a composite endpoint of 24h-BP reductions at different follow-up visits. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
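
    The difference between the two classifications can be made concrete in a few lines of code. This is a hedged sketch: the -5 mmHg threshold is taken from the traditional definition quoted above, while the visit labels and blood pressure changes are invented.

        # Sketch: traditional (single-visit) vs "ideal" (sustained) responder.
        def traditional_responder(deltas, visit="6m", threshold=-5.0):
            """Responder if the systolic 24-h BP change at one visit is <= -5 mmHg."""
            return deltas[visit] <= threshold

        def ideal_responder(deltas, threshold=-5.0):
            """Ideal responder: the reduction is sustained at every follow-up."""
            return all(change <= threshold for change in deltas.values())

        patient = {"3m": -9.0, "6m": -7.0, "12m": -1.0}   # mmHg changes (invented)
        print(traditional_responder(patient))  # True: single-visit criterion met
        print(ideal_responder(patient))        # False: not sustained at 12 months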

  15. A Comparison of Real-Time and Endpoint Cell Viability Assays for Improved Synthetic Lethal Drug Validation.

    Science.gov (United States)

    Single, Andrew; Beetham, Henry; Telford, Bryony J; Guilford, Parry; Chen, Augustine

    2015-12-01

    Cell viability assays fulfill a central role in drug discovery studies. It is therefore important to understand the advantages and disadvantages of the wide variety of available assay methodologies. In this study, we compared the performance of three endpoint assays (resazurin reduction, CellTiter-Glo, and nuclei enumeration) and two real-time systems (IncuCyte and xCELLigence). Of the endpoint approaches, both the resazurin reduction and CellTiter-Glo assays showed higher cell viabilities when compared directly to stained nuclei counts. The IncuCyte and xCELLigence real-time systems were comparable, and both were particularly effective at tracking the effects of drug treatment on cell proliferation at sub-confluent growth. However, the real-time systems failed to evaluate contrasting cell densities between drug-treated and control-treated cells at full growth confluency. Here, we showed that using real-time systems in combination with endpoint assays alleviates the disadvantages posed by each approach alone, providing a more effective means to evaluate drug toxicity in monolayer cell cultures. Such approaches were shown to be effective in elucidating the toxicity of synthetic lethal drugs in an isogenic pair of MCF10A breast cell lines. © 2015 Society for Laboratory Automation and Screening.

  16. Electronics Environmental Benefits Calculator

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Electronics Environmental Benefits Calculator (EEBC) was developed to assist organizations in estimating the environmental benefits of greening their purchase,...

  17. Electrical installation calculations basic

    CERN Document Server

    Kitcher, Christopher

    2013-01-01

    All the essential calculations required for basic electrical installation work. The Electrical Installation Calculations series has proved an invaluable reference for over forty years, for both apprentices and professional electrical installation engineers alike. The book provides a step-by-step guide to the successful application of electrical installation calculations required in day-to-day electrical engineering practice. A step-by-step guide to everyday calculations used on the job. An essential aid to the City & Guilds certificates at Levels 2 and 3. For apprentices and electrical installation engineers.

  18. Electrical installation calculations advanced

    CERN Document Server

    Kitcher, Christopher

    2013-01-01

    All the essential calculations required for advanced electrical installation work. The Electrical Installation Calculations series has proved an invaluable reference for over forty years, for both apprentices and professional electrical installation engineers alike. The book provides a step-by-step guide to the successful application of electrical installation calculations required in day-to-day electrical engineering practice. A step-by-step guide to everyday calculations used on the job. An essential aid to the City & Guilds certificates at Levels 2 and 3. For apprentices and electrical installation engineers.

  19. Radar Signature Calculation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: The calculation, analysis, and visualization of the spatially extended radar signatures of complex objects such as ships in a sea multipath environment and...

  20. Waste Package Lifting Calculation

    International Nuclear Information System (INIS)

    H. Marr

    2000-01-01

    The objective of this calculation is to evaluate the structural response of the waste package during the horizontal and vertical lifting operations in order to support the waste package lifting feature design. The scope of this calculation includes the evaluation of the 21 PWR UCF (pressurized water reactor uncanistered fuel) waste package, naval waste package, 5 DHLW/DOE SNF (defense high-level waste/Department of Energy spent nuclear fuel)--short waste package, and 44 BWR (boiling water reactor) UCF waste package. Procedure AP-3.12Q, Revision 0, ICN 0, Calculations, is used to develop and document this calculation.

  1. FIPRED Project - Experiments and calculations

    International Nuclear Information System (INIS)

    Ohai, D.; Dumitrescu, I.; Doca, C.; Meleg, T.; Benga, D.

    2009-01-01

    Full text: The FIPRED (Fission Products Release from Debris Bed) Project was developed by INR in the framework of EC FP6 SARNET (2004-2008) and will be continued in EC FP7 SARNET2 (2009-2013). The project objective is the evaluation of fission product release from the debris bed resulting after a severe reactor accident, through the self-disintegration of natural UO₂ sintered pellets by oxidation. A large experimental program was performed covering the main parameters influencing the granulometric distribution of powders (fragments) resulting from UO₂ sintered pellet self-disintegration by air oxidation. The paper presents the experimental results obtained and the material equation derived by mathematical calculation. (authors)

  2. Symmetries applied to reactor calculations

    International Nuclear Information System (INIS)

    Makai, M.

    1982-03-01

    Three problems of a reactor-calculational model are discussed with the help of symmetry considerations. 1/ A coarse mesh method applicable to any geometry is derived. It is shown that the coarse mesh solution can be constructed from a few standard boundary value problems. 2/ A second stage homogenization method is given based on the Bloch theorem. This ensures the continuity of the current and the flux at the boundary. 3/ The validity of the micro-macro separation is shown for heterogeneous lattices. A formula for the neutron density is derived for cell homogenization. (author)

  3. Business Continuity Management Plan

    Science.gov (United States)

    2014-12-01

    Naval Postgraduate School, Monterey, California, MBA professional report, December 2014. Navy Supply Systems Command (NAVSUP) lacks a business process framework for the development of Business Continuity Management

  4. Plants under continuous light

    NARCIS (Netherlands)

    Velez Ramirez, A.I.; Ieperen, van W.; Vreugdenhill, D.; Millenaar, F.F.

    2011-01-01

    Continuous light is an essential tool for understanding the plant circadian clock. Additionally, continuous light might increase greenhouse food production. However, using continuous light in research and practice has its challenges. For instance, most of the circadian clock-oriented experiments

  5. PWR core design calculations

    International Nuclear Information System (INIS)

    Trkov, A.; Ravnik, M.; Zeleznik, N.

    1992-01-01

    Functional description of the programme package CORD-2 for PWR core design calculations is presented. The programme package is briefly described. Use of the package and calculational procedures for typical core design problems are treated. Comparison of main results with experimental values is presented as part of the verification process. (author) [sl]

  6. Uneconomical top calculation method

    International Nuclear Information System (INIS)

    De Noord, M.; Van Sambeek, E.J.W.

    2003-08-01

    The methodology used to calculate the financial gap of renewable electricity sources and technologies is described. This methodology is used for calculating the production subsidy levels (MEP subsidies) for new renewable electricity projects in 2004 and 2005 in the Netherlands. [nl]
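
    The core arithmetic of such a financial-gap ("onrendabele top") calculation is the difference between a cost-covering production price and the expected market revenue. The sketch below is illustrative; both prices are assumed numbers, not values from the report.

        # Sketch: financial gap per kWh (all figures assumed).
        cost_price = 9.7      # eurocent/kWh needed to cover production costs
        market_price = 3.5    # eurocent/kWh expected market revenue

        financial_gap = max(cost_price - market_price, 0.0)
        print(f"Required production subsidy: {financial_gap:.1f} eurocent/kWh")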

  7. Abdominal pain endpoints currently recommended by the FDA and EMA for adult patients with irritable bowel syndrome may not be reliable in children.

    Science.gov (United States)

    Saps, M; Lavigne, J V

    2015-06-01

    The Food and Drug Administration (FDA) recommended ≥30% decrease on patient-reported outcomes for pain be considered clinically significant in clinical trials for adults with irritable bowel syndrome. This percent change approach may not be appropriate for children. We compared three alternate approaches to determining clinically significant reductions in pain among children. 80 children with functional abdominal pain participated in a study of the efficacy of amitriptyline. Endpoints included patient-reported estimates of feeling better, and pain Visual Analog Scale (VAS). The minimum clinically important difference in pain report was calculated as (i) mean change in VAS score for children reporting being 'better'; (ii) percent changes in pain (≥30% and ≥50%) on the VAS; and (iii) statistically reliable changes on the VAS for 68% and 95% confidence intervals. There was poor agreement between the three approaches. 43.6% of the children who met the FDA ≥30% criterion for clinically significant change did not achieve a reliable level of improvement (95% confidence interval). Children's self-reported ratings of being better may not be statistically reliable. A combined approach in which children must report improvement as better and achieve a statistically significant change may be more appropriate for outcomes in clinical trials. © 2015 John Wiley & Sons Ltd.
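
    The three approaches can be contrasted on a single pre/post pair of VAS scores. The sketch below is illustrative, not the study's analysis: the scores, baseline standard deviation and reliability coefficient are assumed, and the reliable-change step follows the usual Jacobson-Truax form.

        # Sketch: percent-change vs reliable-change criteria (values assumed).
        import math

        pre, post = 60.0, 40.0                 # 0-100 mm VAS scores (assumed)
        sd_baseline, reliability = 25.0, 0.75  # assumed sample statistics

        pct_change = (pre - post) / pre
        meets_30pct = pct_change >= 0.30       # FDA-style >=30% criterion

        se_diff = sd_baseline * math.sqrt(2 * (1 - reliability))
        rci = (pre - post) / se_diff
        reliable_95 = rci > 1.96               # 95% confidence criterion

        print(f"{pct_change:.0%} change, >=30%: {meets_30pct}; "
              f"RCI = {rci:.2f}, reliable at 95%: {reliable_95}")

    With these assumed numbers the percent-change criterion is met while the 95% reliable-change criterion is not, which is exactly the kind of disagreement between approaches that the study reports.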

  8. The free fractions of circulating docosahexaenoic acid and eicosapentaenoic acid as optimal end-point of measure in bioavailability studies on n-3 fatty acids.

    Science.gov (United States)

    Scarsi, Claudia; Levesque, Ann; Lisi, Lucia; Navarra, Pierluigi

    2015-05-01

    The high complexity of the n-3 fatty acid absorption process, along with the huge endogenous fraction, makes bioavailability studies with these agents very challenging and deserving of special consideration. In this paper we report the results of a bioequivalence study between a new formulation of EPA+DHA ethyl esters developed by IBSA Institut Biochimique and the reference medicinal product present on the Italian market. Bioequivalence was demonstrated according to the criteria established by the EMA Guideline on the Investigation of Bioequivalence. We found that the free fractions represent a better and more sensitive end-point for bioequivalence investigations on n-3 fatty acids, since: (i) the overall and intra-subject variability of PK parameters was markedly lower compared to the same variability calculated on the total DHA and EPA fractions; (ii) the absorption process was completed within 4 h, and the whole PK profile could be drawn within 12-15 h from drug administration. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Factors Influencing Depression Endpoints Research (FINDER): baseline results of Italian patients with depression

    Directory of Open Access Journals (Sweden)

    Grassi Luigi

    2009-05-01

    Full Text Available Abstract Background Factors Influencing Depression Endpoints Research (FINDER) is a 6-month, prospective, observational study carried out in 12 European countries aimed at investigating health-related quality of life (HRQoL) in outpatients receiving pharmacological treatment for a first or new depressive episode. Baseline characteristics of patients enrolled in Italy are presented. Methods All treatment decisions were at the discretion of the investigator. Data were collected at baseline and after 3 and 6 months of treatment. Baseline evaluations included demographics, medical and psychiatric history, and medications used in the last 24 months and prescribed at enrolment. The Hospital Anxiety and Depression Scale (HADS) was adopted to evaluate depressive symptoms, while somatic and painful physical symptoms were assessed by using the Somatic Symptom Inventory (SSI) and a 0 to 100 mm visual analogue scale (VAS), HRQoL via the 36-item Short Form Health Survey (SF-36), and the European Quality of Life 5-Dimensions (EQ-5D) instrument. Results A total of 513 patients were recruited across 38 sites. The mean ± standard deviation (SD) age at first depressive episode was 38.7 ± 15.9 years, the mean duration of depression 10.6 ± 12.3 years. The most common psychiatric comorbidities in the previous 24 months were anxiety/panic (72.6%) and obsessive/compulsive disorders (13.4%), while 35.9% had functional somatic syndromes. Most patients (65.1%) reported pain from any cause. Monotherapy with selective serotonin reuptake inhibitors (SSRIs) and tricyclic antidepressants (TCAs) was prescribed at enrolment in 64.5% and 6.4% of the cases, respectively. The most commonly prescribed agents were sertraline (17.3%), escitalopram (16.2%), venlafaxine (15.6%) and paroxetine (14.8%). The mean HADS subscores for depression and anxiety were 13.3 ± 4.2 and 12.2 ± 3.9, respectively; 76.4% of patients could be defined as being 'probable cases' for depression and 66.2% for anxiety.

  10. Pressure Injury Progression and Factors Associated With Different End-Points in a Home Palliative Care Setting: A Retrospective Chart Review Study.

    Science.gov (United States)

    Artico, Marco; D'Angelo, Daniela; Piredda, Michela; Petitti, Tommasangelo; Lamarca, Luciano; De Marinis, Maria Grazia; Dante, Angelo; Lusignani, Maura; Matarese, Maria

    2018-07-01

    Patients with advanced illnesses show the highest prevalence for pressure injuries. In the palliative care setting, the ultimate goal is injury healing, but equally important is wound maintenance, wound palliation (wound-related pain and symptom management), and primary and secondary wound prevention. To describe the course of healing for pressure injuries in a home palliative care setting according to different end-points, and to explore patient and caregiver characteristics and specific care activities associated with their achievement. Four-year retrospective chart review of 669 patients cared for in a home palliative care service; of those, 124 patients (18.5%) had at least one pressure injury and a survival of six months or less. The proportion of healed pressure injuries was 24.4%. Of the injuries not healed, 34.0% were in a maintenance phase, whereas 63.6% were in a process of deterioration. Body mass index (P = 0.0014), artificial nutrition (P = 0.002), and age were associated with the different end-points; particular attention should be paid to artificial nutrition, continuous deep sedation, and the caregiver's role and gender. Copyright © 2018 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  11. Confirmatory versus explorative endpoint analysis: Decision-making on the basis of evidence available from market authorization and early benefit assessment for oncology drugs.

    Science.gov (United States)

    Niehaus, Ines; Dintsios, Charalabos-Markos

    2018-03-26

    The early benefit assessment of pharmaceuticals in Germany and their preceding market authorization pursue different objectives. This is reflected by the inclusion of varying confirmatory endpoints within the evaluation of oncology drugs in early benefit assessment versus market authorization, with both relying on the same evidence. Data from assessments up to July 2015 are used to estimate the impact of explorative in comparison to confirmatory endpoints on market authorization and early benefit assessment by contrasting the benefit-risk ratio of EMA and the benefit-harm balance of the HTA jurisdiction. Agreement between market authorization and early benefit assessment is examined by Cohen's kappa (k). 21 of 41 assessments were considered in the analysis. Market authorization is more confirmatory than early benefit assessment because it includes a higher proportion of primary endpoints. The latter implies a primary endpoint to be relevant for the benefit-harm balance in only 67% of cases (0.078). Explorative mortality endpoints reached the highest agreement regarding the mutual consideration for the risk-benefit ratio and the benefit-harm balance (0.000). For explorative morbidity endpoints (-0.600), quality of life (-0.600) and side effects (-0.949) no agreement is ascertainable. To warrant a broader confirmatory basis for decisions supported by HTA, closer inter-institutional cooperation of approval authorities and HTA jurisdictions by means of reliable joint advice for manufacturers regarding endpoint definition would be favorable. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Dose calculation for electrons

    International Nuclear Information System (INIS)

    Hirayama, Hideo

    1995-01-01

    The joint working group of ICRP/ICRU is advancing the work of reviewing ICRP Publication 51 by investigating the data related to radiation protection. In order to introduce the 1990 recommendation, calculations for neutrons, photons and electrons have been requested. As for electrons, EURADOS WG4 (Numerical Dosimetry) rearranged the data to be calculated at the meeting held in PTB Braunschweig in June, 1992, and the question and request were presented by Dr. J.L. Chartier, the responsible person, to the researchers who were likely to undertake electron transport Monte Carlo calculation. The author also carried out the requested calculation, as it was a good opportunity for mutual comparison among the various computation codes for electron transport. The WG requested calculation of the absorbed dose at depth d mm when a parallel electron beam enters at angle α into flat-plate phantoms of PMMA, water and ICRU 4-element tissue placed in vacuum. The calculation was carried out with the versatile electron-photon shower Monte Carlo code EGS4. As results, depth-dose curves and the dependence of absorbed dose on electron energy, incident angle and material are reported. The subjects to be investigated are pointed out. (K.I.)

  13. Large scale GW calculations

    International Nuclear Information System (INIS)

    Govoni, Marco; Argonne National Lab., Argonne, IL; Galli, Giulia; Argonne National Lab., Argonne, IL

    2015-01-01

    We present GW calculations of molecules, ordered and disordered solids and interfaces, which employ an efficient contour deformation technique for frequency integration and do not require the explicit evaluation of virtual electronic states nor the inversion of dielectric matrices. We also present a parallel implementation of the algorithm, which takes advantage of separable expressions of both the single particle Green's function and the screened Coulomb interaction. The method can be used starting from density functional theory calculations performed with semilocal or hybrid functionals. The newly developed technique was applied to GW calculations of systems of unprecedented size, including water/semiconductor interfaces with thousands of electrons

  14. Radioactive cloud dose calculations

    International Nuclear Information System (INIS)

    Healy, J.W.

    1984-01-01

    Radiological dosage principles, as well as methods for calculating external and internal dose rates, following dispersion and deposition of radioactive materials in the atmosphere are described. Emphasis has been placed on analytical solutions that are appropriate for hand calculations. In addition, the methods for calculating dose rates from ingestion are discussed. Brief descriptions of several computer programs are included for information on radionuclides. There has been no attempt to be comprehensive, and only a sampling of programs has been selected to illustrate the variety available.
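
    A typical hand-calculable example is a cloud-immersion dose from a ground-level centerline Gaussian plume. The sketch below is illustrative: the release rate, dispersion parameters, release height and immersion dose coefficient are all assumed values, not endorsed data.

        # Sketch: Gaussian-plume immersion dose rate (all parameters assumed).
        import math

        def chi_over_q(u, sigma_y, sigma_z, h):
            """Ground-level centerline dilution factor (s/m^3); sigma_y and
            sigma_z are taken at the downwind distance of interest."""
            return math.exp(-h**2 / (2 * sigma_z**2)) / (math.pi * sigma_y * sigma_z * u)

        Q = 1.0e8              # Bq/s release rate (assumed)
        u = 3.0                # m/s wind speed (assumed)
        sy, sz = 80.0, 40.0    # m, dispersion parameters at ~1 km (assumed)
        h = 30.0               # m, effective release height (assumed)
        dcf = 1.0e-13          # (Sv/s)/(Bq/m^3) immersion coefficient (illustrative)

        conc = Q * chi_over_q(u, sy, sz, h)    # Bq/m^3 on the plume centerline
        print(f"Dose rate: {conc * dcf * 3600 * 1e6:.2f} uSv/h")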

  15. PROSPECTS OF MANAGEMENT ACCOUNTING AND COST CALCULATION

    OpenAIRE

    Marian TAICU

    2014-01-01

    Progress in improving production technology requires appropriate measures to achieve an efficient management of costs. This raises the need for continuous improvement of management accounting and cost calculation. Accounting information in general, and management accounting information in particular, have gained importance in the current economic conditions, which are characterized by risk and uncertainty. The future development of management accounting and cost calculation is essential to me...

  16. Science in Action: National Stormwater Calculator (SWC) ...

    Science.gov (United States)

    Stormwater discharges continue to cause impairment of our Nation’s waterbodies. Regulations that require the retention and/or treatment of frequent, small storms that dominate runoff volumes and pollutant loads are becoming more common. EPA has developed the National Stormwater Calculator (SWC) to help support local, state, and national stormwater management objectives to reduce runoff through infiltration and retention using green infrastructure practices as low impact development (LID) controls. To inform the public on what the Stormwater Calculator is used for.
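
    For intuition about what such a calculator computes, a single-storm runoff depth can be sketched with the SCS curve-number method. This is a simplification assumed for illustration; the actual SWC is built on EPA's SWMM engine and does considerably more, and the storm depth and curve number below are invented.

        # Sketch: SCS curve-number runoff for one storm (illustrative only).
        def runoff_inches(precip_in, cn):
            """Runoff depth (in) for a storm depth (in) and curve number cn."""
            s = 1000.0 / cn - 10.0       # potential maximum retention
            ia = 0.2 * s                 # initial abstraction
            if precip_in <= ia:
                return 0.0
            return (precip_in - ia) ** 2 / (precip_in + 0.8 * s)

        print(f"{runoff_inches(2.0, cn=85):.2f} in of runoff from a 2.0 in storm")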

  17. PROSPECTS OF MANAGEMENT ACCOUNTING AND COST CALCULATION

    Directory of Open Access Journals (Sweden)

    Marian ŢAICU

    2014-11-01

    Full Text Available Progress in improving production technology requires appropriate measures to achieve an efficient management of costs. This raises the need for continuous improvement of management accounting and cost calculation. Accounting information in general, and management accounting information in particular, have gained importance in the current economic conditions, which are characterized by risk and uncertainty. The future development of management accounting and cost calculation is essential to meet the information needs of management.

  18. Handout on shielding calculation

    International Nuclear Information System (INIS)

    Heilbron Filho, P.F.L.

    1991-01-01

    In order to avoid the difficulties of radioprotection supervisors in tasks related to shielding calculations, the basic concepts of shielding theory are presented in this paper. It also includes exercises and examples. (author)
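
    The most basic relation such a handout covers is narrow-beam attenuation with a buildup factor, I = I0 · B · exp(-μx). The sketch below is illustrative; the unshielded dose rate, attenuation coefficient and buildup value are assumed, not tabulated data.

        # Sketch: slab-shield transmission (parameter values assumed).
        import math

        def transmitted(i0, mu_cm, x_cm, buildup=1.0):
            """Dose rate behind a slab of thickness x_cm (I = I0*B*exp(-mu*x))."""
            return i0 * buildup * math.exp(-mu_cm * x_cm)

        i0 = 200.0    # uSv/h unshielded dose rate (assumed)
        mu = 0.8      # 1/cm, roughly lead near 1 MeV (approximate)
        for x_cm in (2, 5, 10):
            print(f"{x_cm} cm: {transmitted(i0, mu, x_cm, buildup=1.5):.2f} uSv/h")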

  19. Unit Cost Compendium Calculations

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Unit Cost Compendium (UCC) Calculations raw data set was designed to provide for greater accuracy and consistency in the use of unit costs across the USEPA...

  20. PHYSICOCHEMICAL PROPERTY CALCULATIONS

    Science.gov (United States)

    Computer models have been developed to estimate a wide range of physical-chemical properties from molecular structure. The SPARC modeling system approaches calculations as site-specific reactions (pKa, hydrolysis, hydration) and 'whole molecule' properties (vapor pressure, boilin...

  1. Magnetic Field Grid Calculator

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Magnetic Field Properties Calculator will computes the estimated values of Earth's magnetic field(declination, inclination, vertical component, northerly...

  2. Intercavitary implants dosage calculation

    International Nuclear Information System (INIS)

    Rehder, B.P.

    The use of the spatial geometry peculiar to each treatment for the attainment of intercavitary and interstitial implant dosage calculations is presented. The study is made in patients with intercavitary implants by applying a modified Manchester technique. [pt]

  3. Casio Graphical Calculator Project.

    Science.gov (United States)

    Stott, Nick

    2001-01-01

    Shares experiences of a project aimed at developing and refining programs written on a Casio FX9750G graphing calculator. Describes in detail some programs used to develop mental strategies and problem solving skills. (MM)

  4. Small portable speed calculator

    Science.gov (United States)

    Burch, J. L.; Billions, J. C.

    1973-01-01

    Calculator is adapted stopwatch calibrated for fast accurate measurement of speeds. Single assembled unit is rugged, self-contained, and relatively inexpensive to manufacture. Potential market includes automobile-speed enforcement, railroads, and field-test facilities.

  5. Calculativeness and trust

    DEFF Research Database (Denmark)

    Frederiksen, Morten

    2014-01-01

    Williamson’s characterisation of calculativeness as inimical to trust contradicts most sociological trust research. However, a similar argument is found within trust phenomenology. This paper re-investigates Williamson’s argument from the perspective of Løgstrup’s phenomenological theory of trust. Contrary to Williamson, however, Løgstrup’s contention is that trust, not calculativeness, is the default attitude, and only when suspicion is awoken does trust falter. The paper argues that while Williamson’s distinction between calculativeness and trust is supported by phenomenology, the analysis needs to take actual subjective experience into consideration. It points out that, first, Løgstrup places trust alongside calculativeness as a different mode of engaging in social interaction, rather than conceiving of trust as a state or the outcome of a decision-making process. Secondly, the analysis must take

  6. Activities for Calculators.

    Science.gov (United States)

    Hiatt, Arthur A.

    1987-01-01

    Ten activities that give learners in grades 5-8 a chance to explore mathematics with calculators are provided. The activity cards involve such topics as odd addends, magic squares, strange projects, and conjecturing rules. (MNS)

  7. IRIS core criticality calculations

    International Nuclear Information System (INIS)

    Jecmenica, R.; Trontl, K.; Pevec, D.; Grgic, D.

    2003-01-01

    Three-dimensional Monte Carlo computer code KENO-VI of CSAS26 sequence of SCALE-4.4 code system was applied for pin-by-pin calculations of the effective multiplication factor for the first cycle IRIS reactor core. The effective multiplication factors obtained by the above mentioned Monte Carlo calculations using 27-group ENDF/B-IV library and 238-group ENDF/B-V library have been compared with the effective multiplication factors achieved by HELIOS/NESTLE, CASMO/SIMULATE, and modified CORD-2 nodal calculations. The results of Monte Carlo calculations are found to be in good agreement with the results obtained by the nodal codes. The discrepancies in effective multiplication factor are typically within 1%. (author)

  8. Current interruption transients calculation

    CERN Document Server

    Peelo, David F

    2014-01-01

    Provides an original, detailed and practical description of current interruption transients, their origins, the circuits involved, and how they can be calculated. Current Interruption Transients Calculation is a comprehensive resource for the understanding, calculation and analysis of the transient recovery voltages (TRVs) and related re-ignition or re-striking transients associated with fault current interruption and the switching of inductive and capacitive load currents in circuits.

  9. Source and replica calculations

    International Nuclear Information System (INIS)

    Whalen, P.P.

    1994-01-01

    The starting point of the Hiroshima-Nagasaki Dose Reevaluation Program is the energy and directional distributions of the prompt neutron and gamma-ray radiation emitted from the exploding bombs. A brief introduction to the neutron source calculations is presented. The development of our current understanding of the source problem is outlined. It is recommended that adjoint calculations be used to modify source spectra to resolve the neutron discrepancy problem

  10. Shielding calculations using FLUKA

    International Nuclear Information System (INIS)

    Yamaguchi, Chiri; Tesch, K.; Dinter, H.

    1988-06-01

    The dose equivalent on the surface of concrete shielding has been calculated using the Monte Carlo code FLUKA86 for incident proton energies from 10 to 800 GeV. The results have been compared with some simple equations. The value of the angular dependent parameter in Moyer's equation has been calculated from the locations where the values of the maximum dose equivalent occur. (author)

  11. The Factors Influencing Depression Endpoints Research (FINDER) study: final results of Italian patients with depression

    Directory of Open Access Journals (Sweden)

    Quail Deborah

    2010-07-01

    Full Text Available Abstract Background Factors Influencing Depression Endpoints Research (FINDER) is a 6-month, prospective, observational study carried out in 12 European countries aimed at investigating health-related quality of life (HRQoL) in outpatients receiving treatment for a first or new depressive episode. The Italian HRQoL data at 6 months are described in this report, and the factors associated with HRQoL changes were determined. Methods Data were collected at baseline, 3 and 6 months of treatment. HRQoL was measured using components of the 36-item Short Form Health Survey (SF-36): mental component summary (MCS) and physical component summary (PCS); and the European Quality of Life-5 Dimensions (EQ-5D): visual analogue scale (VAS) and health status index (HSI). The Hospital Anxiety and Depression Scale (HADS) was adopted to evaluate depressive symptoms, while somatic and painful physical symptoms were assessed by using the 28-item Somatic Symptom Inventory (SSI-28) and a VAS. Results Of the initial 513 patients, 472 completed the 3-month observation and 466 the 6-month observation. The SF-36 and EQ-5D mean (± SD) scores showed HRQoL improvements at 3 months and a further smaller improvement at 6 months, with the most positive effects for SF-36 MCS (baseline 22.0 ± 9.2; 3 months 34.6 ± 10.0; 6 months 39.3 ± 9.5) and EQ-5D HSI (baseline 0.4 ± 0.3; 3 months 0.7 ± 0.3; 6 months 0.7 ± 0.2). Depression and anxiety symptoms (HADS-D mean at baseline 13.3 ± 4.2; HADS-A mean at baseline 12.2 ± 3.9) consistently decreased during the first 3 months (8.7 ± 4.3; 7.5 ± 3.6) and showed a further positive change at 6 months (6.9 ± 4.3; 5.8 ± 3.4). Somatic and painful symptoms (SSI and VAS) significantly decreased, with the most positive changes in the SSI-28 somatic item (mean at baseline 2.4 ± 0.7; mean change at 3 months: -0.5; 95% CI -0.6 to -0.5; mean change at 6 months: -0.7; 95% CI -0.8 to -0.7) and in 'interference of overall pain with daily activities' (mean at baseline 45

  12. 40 CFR 98.413 - Calculating GHG emissions.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Calculating GHG emissions. 98.413 Section 98.413 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Industrial Greenhouse Gases § 98.413 Calculating...

  13. The effect of adherence to statin therapy on cardiovascular mortality: quantification of unmeasured bias using falsification end-points

    Directory of Open Access Journals (Sweden)

    Maarten J. Bijlsma

    2016-04-01

    Full Text Available Abstract Background To determine the clinical effectiveness of statins on cardiovascular mortality in practice, observational studies are needed. Control for confounding is essential in any observational study. Falsification end-points may be useful to determine if bias is present after adjustment has taken place. Methods We followed starters on statin therapy in the Netherlands aged 46 to 100 years over the period 1996 to 2012, from initiation of statin therapy until cardiovascular mortality or censoring. Within this group (n = 49,688, up to 16 years of follow-up), we estimated the effect of adherence to statin therapy (0 = completely non-adherent, 1 = fully adherent) on ischemic heart diseases and cerebrovascular disease (ICD10-codes I20-I25 and I60-I69), as well as respiratory and endocrine disease mortality (ICD10-codes J00-J99 and E00-E90) as falsification end-points, controlling for demographic factors, socio-economic factors, birth cohort, adherence to other cardiovascular medications, and diabetes, using time-varying Cox regression models. Results Falsification end-points indicated that a simpler model was less biased than a model with more controls. Adherence to statins appeared to be protective against cardiovascular mortality (HR: 0.70, 95% CI 0.61 to 0.81). Conclusions Falsification end-points helped detect overadjustment bias or bias due to competing risks, and thereby proved to be a useful technique in such a complex setting.
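    The record does not name the software used; as a hedged sketch of a time-varying Cox model of this general shape (the lifelines package and the simulated person-interval data below are assumptions for illustration, not the study's analysis):

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    rng = np.random.default_rng(0)

    # Toy long-format data: one row per person-year, adherence in [0, 1]
    # as a time-varying covariate, death flagged on the final interval.
    rows = []
    for pid in range(200):
        years = rng.integers(1, 10)
        adherence = rng.uniform(0.0, 1.0, size=years)
        for t in range(years):
            died = int(t == years - 1 and rng.uniform() < 0.3 * (1.2 - adherence[t]))
            rows.append((pid, 365 * t, 365 * (t + 1), adherence[t], died))
    df = pd.DataFrame(rows, columns=["id", "start", "stop", "adherence", "cv_death"])

    ctv = CoxTimeVaryingFitter()
    ctv.fit(df, id_col="id", event_col="cv_death", start_col="start", stop_col="stop")
    ctv.print_summary()   # hazard ratio exp(coef) for adherence, expected < 1
    ```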

  14. A comparison of height and weight velocity as a part of the composite endpoint in pediatric HIV.

    Science.gov (United States)

    Benjamin, Daniel K; Miller, William C; Benjamin, Daniel K; Ryder, Robert W; Weber, David J; Walter, Emmanuel; McKinney, Ross E

    2003-11-07

    HIV adversely affects growth in children. Pediatric AIDS Clinical Trials Group (PACTG) protocols often use weight velocity [changes in weight z-score for age (WAZ)] as a part of the composite endpoint for phase II and III clinical trials. However, WAZ and height velocity (HAZ) have not been critically compared for their utility as part of the composite endpoint. HAZ and WAZ were compared to predict laboratory and clinical progression of HIV in a retrospective cohort study of HIV-infected children with data from PACTG Protocol 300. In both bivariable and multivariable analyses, changes in HAZ were more closely linked to subsequent progression than WAZ. Children with improved HAZ were somewhat less likely to exhibit virological failure [odds ratio (OR), 0.76; 95% confidence interval (CI), 0.51-1.14] than children with improved WAZ (OR, 1.45; 95% CI, 0.99-2.11). Children who had improved HAZ were less likely to exhibit immunological failure (OR, 0.7; 95% CI, 0.49-1.00) than children with improved WAZ (OR, 1.13; 95% CI, 0.82-1.57). Children who had improved HAZ were less likely to have other forms of clinical progression of HIV (OR, 0.55; 95% CI, 0.31-0.99) than children who had improved WAZ (OR, 1.0; 95% CI, 1.58-1.94). Increases in HAZ were associated with reduced risk of subsequent clinical progression and subsequent immune reconstitution and weakly associated with declines in HIV RNA. Changes in WAZ were not associated with laboratory outcomes relevant to pediatric HIV infection. Height velocity should be considered as a component of a composite clinical endpoint in future PACTG trials.
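    WAZ and HAZ are z-scores against age-specific reference values. A minimal sketch of the computation (the reference medians/SDs below are hypothetical; real analyses use CDC/WHO growth references, typically via the LMS method rather than this plain normal approximation):

    ```python
    # Hypothetical reference values; real analyses use CDC/WHO growth tables.
    REFERENCE = {4: (16.3, 2.0), 5: (18.3, 2.3)}   # age (y) -> (median kg, SD)

    def weight_z_score(weight_kg: float, age_years: int) -> float:
        """Weight-for-age z-score (WAZ): distance from the reference median
        in units of the reference SD."""
        median, sd = REFERENCE[age_years]
        return (weight_kg - median) / sd

    waz_before = weight_z_score(14.0, 4)
    waz_after = weight_z_score(17.2, 5)
    print(f"change in WAZ: {waz_after - waz_before:+.2f}")   # 'weight velocity'
    ```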

  15. Food Safety: Recommendations for Determining Doneness in Consumer Egg Dish Recipes and Measurement of Endpoint Temperatures When Recipes Are Followed

    Directory of Open Access Journals (Sweden)

    Sandria Godwin

    2016-06-01

    Full Text Available Many consumers do not follow recommended food safety practices for cooking egg dishes, such as pies, quiches, and casseroles, potentially leading to foodborne illnesses such as Salmonellosis. The United States Department of Agriculture (USDA) recommends cooking egg mixtures until the center reaches 71 °C (160 °F). The objectives of this study were to determine what endpoint temperature information consumers receive from egg dish recipes, and if recipes would lead to safe temperatures when followed. Egg dish recipes (n = 226) from 65 websites, 50 cookbooks, and nine magazine titles (multiple issues of each) were analyzed. Time was the most frequently used indicator, given in 92% of the recipes, with 15% using only time. Other indicators included: set (89), browned (76), clean toothpick/knife (60), puffed (27), and jiggled (13). Only two recipes indicated final endpoint temperatures. Three recipes (a pie, a quiche, and an egg casserole) were chosen and prepared in triplicate to see if they would reach recommended temperatures. The pie and quiche were still liquid at 71 °C, and were well over the recommended temperature when cooked according to instructions, but the egg casserole was not consistently above 71 °C when the recipe instructions indicated it was done and the center was light brown and “jiggled.” This research indicates that consumers are not receiving information on endpoint temperatures in egg recipes, but the likelihood of foodborne illness is low, since most dishes would probably be cooked past the recommended temperature before the consumer considers them done, unless there are many inclusions that may absorb liquid and reduce the appearance of liquid in the dish.

  16. Food Safety: Recommendations for Determining Doneness in Consumer Egg Dish Recipes and Measurement of Endpoint Temperatures When Recipes Are Followed

    Science.gov (United States)

    Godwin, Sandria; Maughan, Curtis; Chambers, Edgar

    2016-01-01

    Many consumers do not follow recommended food safety practices for cooking egg dishes, such as pies, quiches, and casseroles, potentially leading to foodborne illnesses such as Salmonellosis. The United States Department of Agriculture (USDA) recommends cooking egg mixtures until the center reaches 71 °C (160 °F). The objectives of this study were to determine what endpoint temperature information consumers receive from egg dish recipes, and if recipes would lead to safe temperatures when followed. Egg dish recipes (n = 226) from 65 websites, 50 cookbooks, and nine magazine titles (multiple issues of each) were analyzed. Time was the most frequently used indicator, given in 92% of the recipes, with 15% using only time. Other indicators included: set (89), browned (76), clean toothpick/knife (60), puffed (27), and jiggled (13). Only two recipes indicated final endpoint temperatures. Three recipes (a pie, a quiche, and an egg casserole) were chosen and prepared in triplicate to see if they would reach recommended temperatures. The pie and quiche were still liquid at 71 °C, and were well over the recommended temperature when cooked according to instructions, but the egg casserole was not consistently above 71 °C when the recipe instructions indicated it was done and the center was light brown and “jiggled.” This research indicates that consumers are not receiving information on endpoint temperatures in egg recipes, but the likelihood of foodborne illness is low, since most dishes would probably be cooked past the recommended temperature before the consumer considers them done, unless there are many inclusions that may absorb liquid and reduce the appearance of liquid in the dish. PMID:28231140

  17. An European inter-laboratory validation of alternative endpoints of the murine local lymph node assay: first round.

    Science.gov (United States)

    Ehling, G; Hecht, M; Heusener, A; Huesler, J; Gamer, A O; van Loveren, H; Maurer, Th; Riecke, K; Ullmann, L; Ulrich, P; Vandebriel, R; Vohr, H-W

    2005-08-15

    The new OECD guideline 429 (skin sensitization: local lymph node assay) is based upon a protocol which utilises the incorporation of radioactivity into DNA as a measure of cell proliferation in vivo. The guideline also enables the use of alternative endpoints in order to assess draining lymph node (LN) cell proliferation. Here we describe the first round of an inter-laboratory validation of alternative endpoints in the LLNA conducted in seven laboratories. The validation study was managed and supervised by the Swiss Agency for Therapeutic Products, Swissmedic. Statistical analyses of all data were performed by an independent centre at the University of Bern, Department of Statistics. Ear-draining LN weight and cell count were used to assess proliferation instead of radioactive labeling of lymph node cells. In addition, the acute inflammatory skin reaction was measured by ear swelling and the weight of circular biopsies of the ears to identify skin-irritating properties of the test items. Hexylcinnamaldehyde (HCA) and three blinded test items were applied to female, 8-10-week-old NMRI and BALB/c mice. Results were sent via the independent study coordinator to the statistician. The results of this first round showed that the alternative endpoints of the LLNA are sensitive and robust parameters. The use of ear weights added an important parameter for assessing the skin irritation potential, which supports the differentiation of purely irritative from contact allergenic potential. There were absolutely no discrepancies between the categorisations of the three test substances A-C determined by the individual participating laboratories. The results also highlighted that many parameters have an impact on the strength of the responses. Therefore, such parameters have to be taken into consideration for the categorisation of compounds according to their relative sensitizing potencies.

  18. An European inter-laboratory validation of alternative endpoints of the murine local lymph node assay: First round

    International Nuclear Information System (INIS)

    Ehling, G.; Hecht, M.; Heusener, A.; Huesler, J.; Gamer, A.O.; Loveren, H. van; Maurer, Th.; Riecke, K.; Ullmann, L.; Ulrich, P.; Vandebriel, R.; Vohr, H.-W.

    2005-01-01

    The new OECD guideline 429 (skin sensitization: local lymph node assay) is based upon a protocol which utilises the incorporation of radioactivity into DNA as a measure of cell proliferation in vivo. The guideline also enables the use of alternative endpoints in order to assess draining lymph node (LN) cell proliferation. Here we describe the first round of an inter-laboratory validation of alternative endpoints in the LLNA conducted in seven laboratories. The validation study was managed and supervised by the Swiss Agency for Therapeutic Products, Swissmedic. Statistical analyses of all data were performed by an independent centre at the University of Bern, Department of Statistics. Ear-draining LN weight and cell count were used to assess proliferation instead of radioactive labeling of lymph node cells. In addition, the acute inflammatory skin reaction was measured by ear swelling and the weight of circular biopsies of the ears to identify skin-irritating properties of the test items. Hexylcinnamaldehyde (HCA) and three blinded test items were applied to female, 8-10-week-old NMRI and BALB/c mice. Results were sent via the independent study coordinator to the statistician. The results of this first round showed that the alternative endpoints of the LLNA are sensitive and robust parameters. The use of ear weights added an important parameter for assessing the skin irritation potential, which supports the differentiation of purely irritative from contact allergenic potential. There were absolutely no discrepancies between the categorisations of the three test substances A-C determined by the individual participating laboratories. The results also highlighted that many parameters have an impact on the strength of the responses. Therefore, such parameters have to be taken into consideration for the categorisation of compounds according to their relative sensitizing potencies

  19. Food Safety: Recommendations for Determining Doneness in Consumer Egg Dish Recipes and Measurement of Endpoint Temperatures When Recipes Are Followed.

    Science.gov (United States)

    Godwin, Sandria; Maughan, Curtis; Chambers, Edgar

    2016-06-23

    Many consumers do not follow recommended food safety practices for cooking egg dishes, such as pies, quiches, and casseroles, potentially leading to foodborne illnesses such as Salmonellosis. The United States Department of Agriculture (USDA) recommends cooking egg mixtures until the center reaches 71 °C (160 °F). The objectives of this study were to determine what endpoint temperature information consumers receive from egg dish recipes, and if recipes would lead to safe temperatures when followed. Egg dish recipes (n = 226) from 65 websites, 50 cookbooks, and nine magazine titles (multiple issues of each) were analyzed. Time was the most frequently used indicator, given in 92% of the recipes, with 15% using only time. Other indicators included: set (89), browned (76), clean toothpick/knife (60), puffed (27), and jiggled (13). Only two recipes indicated final endpoint temperatures. Three recipes (a pie, a quiche, and an egg casserole) were chosen and prepared in triplicate to see if they would reach recommended temperatures. The pie and quiche were still liquid at 71 °C, and were well over the recommended temperature when cooked according to instructions, but the egg casserole was not consistently above 71 °C when the recipe instructions indicated it was done and the center was light brown and "jiggled." This research indicates that consumers are not receiving information on endpoint temperatures in egg recipes, but the likelihood of foodborne illness is low, since most dishes would probably be cooked past the recommended temperature before the consumer considers them done, unless there are many inclusions that may absorb liquid and reduce the appearance of liquid in the dish.

  20. Chloride and sulphate toxicity to Hydropsyche exocellata (Trichoptera, Hydropsychidae): Exploring intraspecific variation and sub-lethal endpoints

    International Nuclear Information System (INIS)

    Sala, Miquel; Faria, Melissa; Sarasúa, Ignacio; Barata, Carlos; Bonada, Núria; Brucet, Sandra; Llenas, Laia; Ponsá, Sergio; Prat, Narcís; Soares, Amadeu M.V.M.

    2016-01-01

    The rivers and streams of the world are becoming saltier due to human activities. In spite of the potential damage that salt pollution can cause to freshwater ecosystems, this is an issue that is currently poorly managed. Here we explored intraspecific differences in the sensitivity of freshwater fauna to two major ions (Cl⁻ and SO₄²⁻) using the net-spinning caddisfly Hydropsyche exocellata Dufour 1841 (Trichoptera, Hydropsychidae) as a model organism. We exposed H. exocellata to saline solutions (reaching a conductivity of 2.5 mS cm⁻¹) with Cl⁻:SO₄²⁻ ratios similar to those occurring in effluents coming from the meat, mining and paper industries, which release dissolved salts to rivers and streams in Spain. We used two different populations, coming from low and high conductivity streams. To assess toxicity, we measured sub-lethal endpoints: locomotion, symmetry of the food-capturing nets and oxidative stress biomarkers. According to biomarkers and net building, the population historically exposed to lower conductivities (B10) showed higher levels of stress than the population historically exposed to higher conductivities (L102). However, the differences between populations were not strong. For example, net symmetry was lower in the B10 than in the L102 only 48 h after treatment was applied, and biomarkers showed a variety of responses, with no discernable pattern. Also, treatment effects were rather weak, i.e. only some endpoints, and in most cases only in the B10 population, showed a significant response to treatment. The lack of consistent differences between populations and treatments could be related to the high salt tolerance of H. exocellata, since both populations were collected from streams with relatively high conductivities. The sub-lethal effects tested in this study can offer an interesting and promising tool to monitor freshwater salinization by combining physiological and behavioural bioindicators. - Highlights: • We assessed Cl

  1. Chloride and sulphate toxicity to Hydropsyche exocellata (Trichoptera, Hydropsychidae): Exploring intraspecific variation and sub-lethal endpoints

    Energy Technology Data Exchange (ETDEWEB)

    Sala, Miquel [Centre Tecnològic Forestal de Catalunya - CTFC, Solsona, Catalunya (Spain); Faria, Melissa [CESAM, Departamento de Biologia, Universidade de Aveiro, 3810-193 Aveiro (Portugal); Sarasúa, Ignacio [Technische Universität München, Munich, Bayern (Germany); Barata, Carlos [Institute of Environmental Assessment and Water Research (IDAEA-CSIC), Barcelona (Spain); Bonada, Núria [Grup de Recerca Freshwater Ecology and Management (FEM), Departament d' Ecologia, Facultat de Biologia, Universitat de Barcelona (UB), Diagonal 643, 08028 Barcelona, Catalonia (Spain); Grup de Recerca Freshwater Ecology and Management (FEM), Departament d' Ecologia, Facultat de Biologia, Institut de Recerca de la Biodiversitat (IRBio), Universitat de Barcelona - UB, Diagonal 643, 08028 Barcelona, Catalonia (Spain); Brucet, Sandra [Aquatic Ecology Group, BETA Tecnio Centre, University of Vic - Central University of Catalonia, Vic, Catalonia (Spain); Catalan Institution for Research and Advanced Studies, ICREA, Barcelona 08010 (Spain); Llenas, Laia; Ponsá, Sergio [Aquatic Ecology Group, BETA Tecnio Centre, University of Vic - Central University of Catalonia, Vic, Catalonia (Spain); Prat, Narcís [Grup de Recerca Freshwater Ecology and Management (FEM), Departament d' Ecologia, Facultat de Biologia, Universitat de Barcelona (UB), Diagonal 643, 08028 Barcelona, Catalonia (Spain); Soares, Amadeu M.V.M. [CESAM, Departamento de Biologia, Universidade de Aveiro, 3810-193 Aveiro (Portugal); and others

    2016-10-01

    The rivers and streams of the world are becoming saltier due to human activities. In spite of the potential damage that salt pollution can cause to freshwater ecosystems, this is an issue that is currently poorly managed. Here we explored intraspecific differences in the sensitivity of freshwater fauna to two major ions (Cl⁻ and SO₄²⁻) using the net-spinning caddisfly Hydropsyche exocellata Dufour 1841 (Trichoptera, Hydropsychidae) as a model organism. We exposed H. exocellata to saline solutions (reaching a conductivity of 2.5 mS cm⁻¹) with Cl⁻:SO₄²⁻ ratios similar to those occurring in effluents coming from the meat, mining and paper industries, which release dissolved salts to rivers and streams in Spain. We used two different populations, coming from low and high conductivity streams. To assess toxicity, we measured sub-lethal endpoints: locomotion, symmetry of the food-capturing nets and oxidative stress biomarkers. According to biomarkers and net building, the population historically exposed to lower conductivities (B10) showed higher levels of stress than the population historically exposed to higher conductivities (L102). However, the differences between populations were not strong. For example, net symmetry was lower in the B10 than in the L102 only 48 h after treatment was applied, and biomarkers showed a variety of responses, with no discernable pattern. Also, treatment effects were rather weak, i.e. only some endpoints, and in most cases only in the B10 population, showed a significant response to treatment. The lack of consistent differences between populations and treatments could be related to the high salt tolerance of H. exocellata, since both populations were collected from streams with relatively high conductivities. The sub-lethal effects tested in this study can offer an interesting and promising tool to monitor freshwater salinization by combining physiological and behavioural bioindicators

  2. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  3. A PHYSIOLOGICALLY BASED COMPUTATIONAL MODEL OF THE BPG AXIS IN FATHEAD MINNOWS: PREDICTING EFFECTS OF ENDOCRINE DISRUPTING CHEMICAL EXPOSURE ON REPRODUCTIVE ENDPOINTS

    Science.gov (United States)

    This presentation describes development and application of a physiologically-based computational model that simulates the brain-pituitary-gonadal (BPG) axis and other endpoints important in reproduction, such as concentrations of the sex steroid hormones 17β-estradiol, testosterone, a...

  4. Hazardous waste transportation risk assessment for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement -- human health endpoints

    International Nuclear Information System (INIS)

    Hartmann, H.M.; Policastro, A.J.; Lazaro, M.A.

    1994-01-01

    In this presentation, a quantitative methodology for assessing the risk associated with the transportation of hazardous waste (HW) is proposed. The focus is on identifying air concentrations of HW that correspond to specific human health endpoints

  5. Predicting treatment effect from surrogate endpoints and historical trials: an extrapolation involving probabilities of a binary outcome or survival to a specific time.

    Science.gov (United States)

    Baker, Stuart G; Sargent, Daniel J; Buyse, Marc; Burzykowski, Tomasz

    2012-03-01

    Using multiple historical trials with surrogate and true endpoints, we consider various models to predict the effect of treatment on a true endpoint in a target trial in which only a surrogate endpoint is observed. This predicted result is computed using (1) a prediction model (mixture, linear, or principal stratification) estimated from historical trials and the surrogate endpoint of the target trial and (2) a random extrapolation error estimated from successively leaving out each trial among the historical trials. The method applies to either binary outcomes or survival to a particular time that is computed from censored survival data. We compute a 95% confidence interval for the predicted result and validate its coverage using simulation. To summarize the additional uncertainty from using a predicted instead of true result for the estimated treatment effect, we compute its multiplier of standard error. Software is available for download. © 2011, The International Biometric Society No claim to original US government works.
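    A minimal numerical sketch of the leave-one-trial-out idea, using only the simple linear variant of the prediction model (the trial effect sizes are invented; the paper's interval construction additionally accounts for estimation variance and supports mixture and principal-stratification models):

    ```python
    import numpy as np

    # Treatment effects from five historical trials (invented numbers):
    s = np.array([0.10, 0.22, 0.05, 0.18, 0.30])   # surrogate endpoint
    t = np.array([0.08, 0.20, 0.02, 0.15, 0.26])   # true endpoint

    def predict(s_train, t_train, s_new):
        slope, intercept = np.polyfit(s_train, t_train, 1)   # linear variant
        return intercept + slope * s_new

    # Extrapolation error from successively leaving out each historical trial
    errors = np.array([t[i] - predict(np.delete(s, i), np.delete(t, i), s[i])
                       for i in range(len(s))])

    s_target = 0.12                      # surrogate effect seen in the target trial
    t_hat = predict(s, t, s_target)
    se = errors.std(ddof=1)              # spread of leave-one-out errors
    print(f"predicted true effect: {t_hat:.3f} +/- {1.96 * se:.3f}")
    ```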

  6. Evaluation of Gene Expression Endpoints in the Context of a Xenopus laevis Metamorphosis-based Bioassay to Detect Thyroid Hormone Disruptors

    Science.gov (United States)

    This study accentuates the need to examine multiple tissues and provides critical information required for optimization of exposure regimens and endpoint assessments that focus on the detection of disruption in TH-regulatory systems.

  7. Online plasma calculator

    Science.gov (United States)

    Wisniewski, H.; Gourdain, P.-A.

    2017-10-01

    APOLLO is an online, Linux-based plasma calculator. Users can input variables that correspond to their specific plasma, such as ion and electron densities, temperatures, and external magnetic fields. The system is based on a webserver where a FastCGI protocol computes key plasma parameters including frequencies, lengths, velocities, and dimensionless numbers. FastCGI was chosen to overcome security problems caused by JAVA-based plugins. The FastCGI also speeds up calculations over PHP-based systems. APOLLO is built upon the Wt library, which turns any web browser into a versatile, fast graphic user interface. All values with units are expressed in SI units except temperature, which is in electron-volts. SI units were chosen over cgs units because of the gradual shift to using SI units within the plasma community. APOLLO is intended to be a fast calculator that also provides the user with the proper equations used to calculate the plasma parameters. This system is intended to be used by undergraduates taking plasma courses as well as graduate students and researchers who need a quick reference calculation.
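    As a hedged sketch of the kind of evaluation such a calculator performs (two standard parameters, with temperature in electron-volts to match the convention above; this is not APOLLO's source code):

    ```python
    import math

    E0 = 8.8541878128e-12   # vacuum permittivity, F/m
    QE = 1.602176634e-19    # elementary charge, C
    ME = 9.1093837015e-31   # electron mass, kg

    def plasma_parameters(n_e: float, T_e_eV: float):
        """Electron plasma frequency (rad/s) and Debye length (m) from
        electron density in m^-3 and temperature in eV."""
        w_pe = math.sqrt(n_e * QE**2 / (E0 * ME))
        lambda_d = math.sqrt(E0 * T_e_eV * QE / (n_e * QE**2))
        return w_pe, lambda_d

    w_pe, ld = plasma_parameters(1e19, 10.0)   # a 10 eV, 1e19 m^-3 plasma
    print(f"w_pe = {w_pe:.3e} rad/s, Debye length = {ld:.3e} m")
    ```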

  8. Archives: Continuing Medical Education

    African Journals Online (AJOL)

  9. Identification and content validation of wound therapy clinical endpoints relevant to clinical practice and patient values for FDA approval. Part 1. Survey of the wound care community.

    Science.gov (United States)

    Driver, Vickie R; Gould, Lisa J; Dotson, Peggy; Gibbons, Gary W; Li, William W; Ennis, William J; Kirsner, Robert S; Eaglstein, William H; Bolton, Laura L; Carter, Marissa J

    2017-05-01

    Wounds that exhibit delayed healing add extraordinary clinical, economic, and personal burdens to patients, as well as to increasing financial costs to health systems. New interventions designed to ease such burdens for patients with cancer, renal, or ophthalmologic conditions are often cleared for approval by the U.S. Food and Drug Administration (FDA) using multiple endpoints but the requirement of complete healing as a primary endpoint for wound products impedes FDA clearance of interventions that can provide other clinical or patient-centered benefits for persons with wounds. A multidisciplinary group of wound experts undertook an initiative, in collaboration with the FDA, to identify and content validate supporting FDA criteria for qualifying wound endpoints relevant to clinical practice (CP) and patient-centered outcomes (PCO) as primary outcomes in clinical trials. As part of the initiative, a research study was conducted involving 628 multidisciplinary expert wound clinicians and researchers from 4 different groups: the interdisciplinary core advisory team; attendees of the Spring 2015 Symposium on Advanced Wound Care (SAWC); clinicians employed by a national network of specialty clinics focused on comprehensive wound care; and Association for the Advancement of Wound Care (AAWC) and Wound Healing Society (WHS) members who had not previously completed the survey. The online survey assessed 28 literature-based wound care endpoints for their relevance and importance to clinical practice and clinical research. Fifteen of the endpoints were evaluated for their relevance to improving quality of life. Twenty-two endpoints had content validity indexes (CVI) ≥ 0.75, and 15 were selected as meriting potential inclusion as additional endpoints for FDA approval of future wound care interventions. This study represents an important first step in identifying and validating new measurable wound care endpoints for clinical research and practice and for regulatory
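    For reference, the content validity index reported above is a simple proportion of experts rating an endpoint as relevant. A sketch (the 4-point relevance scale and the cut-off of 3 are assumptions; the record only gives the 0.75 retention threshold):

    ```python
    def content_validity_index(ratings, relevant_cutoff: int = 3) -> float:
        """Item-level CVI: fraction of experts rating the endpoint as
        relevant (here, >= 3 on an assumed 4-point relevance scale)."""
        return sum(r >= relevant_cutoff for r in ratings) / len(ratings)

    ratings = [4, 3, 3, 4, 2, 4, 3, 1, 4, 3]         # ten hypothetical experts
    cvi = content_validity_index(ratings)
    print(f"CVI = {cvi:.2f} -> {'retain' if cvi >= 0.75 else 'drop'}")
    ```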

  10. Generalized analytic continuation

    CERN Document Server

    Ross, William T

    2002-01-01

    The theory of generalized analytic continuation studies continuations of meromorphic functions in situations where traditional theory says there is a natural boundary. This broader theory touches on a remarkable array of topics in classical analysis, as described in the book. This book addresses the following questions: (1) When can we say, in some reasonable way, that component functions of a meromorphic function on a disconnected domain, are "continuations" of each other? (2) What role do such "continuations" play in certain aspects of approximation theory and operator theory? The authors use the strong analogy with the summability of divergent series to motivate the subject. In this vein, for instance, theorems can be described as being "Abelian" or "Tauberian". The introductory overview carefully explains the history and context of the theory. The authors begin with a review of the works of Poincaré, Borel, Wolff, Walsh, and Gončar, on continuation properties of "Borel series" and other meromorphic func...

  11. Daylight calculations in practice

    DEFF Research Database (Denmark)

    Iversen, Anne; Roy, Nicolas; Hvass, Mette

    The aim of the project was to obtain a better understanding of what daylight calculations show and also to gain knowledge of how the different daylight simulation programs perform compared with each other. Experience has shown that results for the same room, obtained from two daylight simulation programs, can differ. This can be due to restrictions in the program itself and/or to the skills of the persons setting up the models. This is crucial, as daylight calculations are used to document that the demands and recommendations for daylight levels outlined by building authorities.... Furthermore, the aim was to provide knowledge of how to build up the 3D models that were...
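    One quantity such programs report, and a useful sanity check when comparing them, is the daylight factor. A minimal sketch (the illuminance values are illustrative):

    ```python
    def daylight_factor(e_indoor_lux: float, e_outdoor_lux: float) -> float:
        """Daylight factor (%): indoor illuminance at a point relative to the
        simultaneous unobstructed outdoor illuminance under an overcast sky."""
        return 100.0 * e_indoor_lux / e_outdoor_lux

    print(f"DF = {daylight_factor(210.0, 10000.0):.1f} %")   # 2.1 %
    ```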

  12. Calculating Quenching Weights

    CERN Document Server

    Salgado, Carlos A.; Wiedemann, Urs Achim

    2003-01-01

    We calculate the probability ("quenching weight") that a hard parton radiates an additional energy fraction due to scattering in spatially extended QCD matter. This study is based on an exact treatment of finite in-medium path length, it includes the case of a dynamically expanding medium, and it extends to the angular dependence of the medium-induced gluon radiation pattern. All calculations are done in the multiple soft scattering approximation (Baier-Dokshitzer-Mueller-Peigné-Schiff-Zakharov "BDMPS-Z" formalism) and in the single hard scattering approximation (N=1 opacity approximation). By comparison, we establish a simple relation between transport coefficient, Debye screening mass and opacity, for which both approximations lead to comparable results. Together with this paper, a CPU-inexpensive numerical subroutine for calculating quenching weights is provided electronically. To illustrate its applications, we discuss the suppression of hadronic transverse momentum spectra in nucleus-nucleus colli...
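    To make the last point concrete: folding a steeply falling spectrum with a quenching weight suppresses it. The sketch below uses a toy exponential P(ε) purely as a stand-in (real BDMPS-Z quenching weights have both discrete and continuous parts and are supplied by the authors' subroutine):

    ```python
    import numpy as np

    def quenched_ratio(p_t: float, n: float, eps_mean: float) -> float:
        """Suppression factor for a power-law spectrum f ~ p_T^(-n) when the
        parton loses energy eps drawn from a toy quenching weight P(eps)."""
        eps = np.linspace(0.0, 20.0 * eps_mean, 4000)
        d_eps = eps[1] - eps[0]
        p = np.exp(-eps / eps_mean) / eps_mean        # toy P(eps), normalised
        shifted = (1.0 + eps / p_t) ** (-n)           # f(p_T + eps) / f(p_T)
        return float(np.sum(p * shifted) * d_eps)

    # A steeply falling spectrum (n = 7) at p_T = 10 GeV with <eps> = 3 GeV
    print(f"R = {quenched_ratio(p_t=10.0, n=7.0, eps_mean=3.0):.2f}")
    ```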

  13. Three recent TDHF calculations

    International Nuclear Information System (INIS)

    Weiss, M.S.

    1981-05-01

    Three applications of TDHF are discussed. First, the vibrational spectrum of a ⁴⁰Ca nucleus after a grazing collision is examined and found to contain many high-energy components, qualitatively consistent with recent Orsay experiments. Second, the fusion cross section in energy and angular momentum is calculated for ¹⁶O + ²⁴Mg to exhibit the parameters of the low-l window for this system. A sensitivity of the fusion cross section to the effective two-body potential is discussed. Last, a preliminary analysis of ⁸⁶Kr + ¹³⁹La at E_lab = 505 MeV, calculated in the frozen approximation, is displayed, compared to experiment and discussed

  14. Fission neutron multiplicity calculations

    International Nuclear Information System (INIS)

    Maerten, H.; Ruben, A.; Seeliger, D.

    1991-01-01

    A model for calculating neutron multiplicities in nuclear fission is presented. It is based on the solution of the energy partition problem as function of mass asymmetry within a phenomenological approach including temperature-dependent microscopic energies. Nuclear structure effects on fragment de-excitation, which influence neutron multiplicities, are discussed. Temperature effects on microscopic energy play an important role in induced fission reactions. Calculated results are presented for various fission reactions induced by neutrons. Data cover the incident energy range 0-20 MeV, i.e. multiple chance fission is considered. (author). 28 refs, 13 figs

  15. Lattice cell burnup calculation

    International Nuclear Information System (INIS)

    Pop-Jordanov, J.

    1977-01-01

    Accurate burnup prediction is a key item for the design and operation of a power reactor. It should supply information on isotopic changes at each point in the reactor core and the consequences of these changes on the reactivity, power distribution, kinetic characteristics, control rod patterns, fuel cycles and operating strategy. A basic stage in burnup prediction is the lattice cell burnup calculation. This series of lectures attempts to give a review of the general principles and calculational methods developed and applied in this area of burnup physics

  16. PWR core design calculations

    Energy Technology Data Exchange (ETDEWEB)

    Trkov, A; Ravnik, M; Zeleznik, N [Inst. Jozef Stefan, Ljubljana (Slovenia)]

    1992-07-01

    A functional description of the programme package CORD-2 for PWR core design calculations is presented. The programme package is briefly described. Use of the package and calculational procedures for typical core design problems are treated. A comparison of the main results with experimental values is presented as part of the verification process. (author)

  17. Comparison of earthworm responses to petroleum hydrocarbon exposure in aged field contaminated soil using traditional ecotoxicity endpoints and 1H NMR-based metabolomics

    International Nuclear Information System (INIS)

    Whitfield Åslund, Melissa; Stephenson, Gladys L.; Simpson, André J.; Simpson, Myrna J.

    2013-01-01

    ¹H NMR metabolomics and conventional ecotoxicity endpoints were used to examine the response of earthworms exposed to petroleum hydrocarbons (PHCs) in soil samples collected from a site that was contaminated with crude oil from a pipeline failure in the mid-1990s. The conventional ecotoxicity tests showed that the soils were not acutely toxic to earthworms (average survival ≥90%), but some soil samples impaired reproduction endpoints by >50% compared to the field control soil. Additionally, metabolomics revealed significant relationships between earthworm metabolic profiles (collected after 2 or 14 days of exposure) and soil properties including soil PHC concentration. Further comparisons by partial least squares regression revealed a significant relationship between the earthworm metabolomic data (collected after only 2 or 14 days) and the reproduction endpoints (measured after 63 days). Therefore, metabolomic responses measured after short exposure periods may be predictive of chronic, ecologically relevant toxicity endpoints for earthworms exposed to soil contaminants. -- Highlights: •Earthworm response to petroleum hydrocarbon exposure in soil is examined. •Metabolomics shows significant changes to metabolic profile after 2 days. •Significant relationships observed between metabolomic and reproduction endpoints. •Metabolomics may have value as a rapid screening tool for chronic toxicity. -- Earthworm metabolomic responses measured after 2 and 14 days are compared to traditional earthworm ecotoxicity endpoints (survival and reproduction) in petroleum hydrocarbon contaminated soil
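    The key step above is a regression of early metabolomic profiles onto later reproduction endpoints. As a hedged sketch of that step only (scikit-learn's PLSRegression on invented matrices; the study's spectral preprocessing is not reproduced):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(30, 120))      # 30 worm samples x 120 NMR bins (invented)
    beta = np.zeros(120)
    beta[:5] = 1.0                      # only a few bins carry signal
    y = X @ beta + rng.normal(scale=0.5, size=30)   # reproduction endpoint

    pls = PLSRegression(n_components=2)
    pls.fit(X, y)
    print(f"R^2 on training data: {pls.score(X, y):.2f}")
    ```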

  18. Neutralization Assay for Zika and Dengue Viruses by Use of Real-Time-PCR-Based Endpoint Assessment.

    Science.gov (United States)

    Wilson, Heather L; Tran, Thomas; Druce, Julian; Dupont-Rouzeyrol, Myrielle; Catton, Michael

    2017-10-01

    The global spread and infective complications of Zika virus (ZKV) and dengue virus (DENV) have made them flaviviruses of public health concern. Serological diagnosis can be challenging due to antibody cross-reactivity, particularly in secondary flavivirus infections or when there is a history of flavivirus vaccination. The virus neutralization assay is considered to be the most specific assay for measurement of anti-flavivirus antibodies. This study describes an assay where the neutralization endpoint is measured by real-time PCR, providing results within 72 h. It demonstrated 100% sensitivity (24/24 ZKV and 15/15 DENV) and 100% specificity (11/11 specimens) when testing well-characterized sera. In addition, the assay was able to determine the correct DENV serotype in 91.7% of cases. The high sensitivity and specificity of the real-time PCR neutralization assay makes it suitable to use as a confirmatory test for sera that are reactive in commercial IgM/IgG enzyme immunoassays. Results are objective and the PCR-based measurement of the neutralization endpoint lends itself to automation so that throughput may be increased in times of high demand. Copyright © 2017 American Society for Microbiology.

  19. A Portable Automatic Endpoint Detection System for Amplicons of Loop Mediated Isothermal Amplification on Microfluidic Compact Disk Platform

    Directory of Open Access Journals (Sweden)

    Shah Mukim Uddin

    2015-03-01

    Full Text Available In recent years, many improvements have been made in foodborne pathogen detection methods to reduce the impact of food contamination. Several rapid methods have been developed with biosensor devices to improve the way of performing pathogen detection. This paper presents an automated endpoint detection system for amplicons generated by loop mediated isothermal amplification (LAMP) on a microfluidic compact disk platform. The developed detection system utilizes a monochromatic ultraviolet (UV) emitter for excitation of fluorescent-labeled LAMP amplicons and a color sensor to detect the emitted fluorescence from the target. It then processes the sensor output and displays the detection results on a liquid crystal display (LCD). The sensitivity test has been performed with a detection limit of 2.5 × 10⁻³ ng/µL across different DNA concentrations of Salmonella bacteria. This system allows rapid and automatic endpoint detection, which could lead to the development of a point-of-care diagnosis device for foodborne pathogen detection in a resource-limited environment.

  20. Testing of an End-Point Control Unit Designed to Enable Precision Control of Manipulator-Coupled Spacecraft

    Science.gov (United States)

    Montgomery, Raymond C.; Ghosh, Dave; Tobbe, Patrick A.; Weathers, John M.; Manouchehri, Davoud; Lindsay, Thomas S.

    1994-01-01

    This paper presents an end-point control concept designed to enable precision telerobotic control of manipulator-coupled spacecraft. The concept employs a hardware unit (end-point control unit, EPCU) that is positioned between the end-effector of the Space Shuttle Remote Manipulator System and the payload. Features of the unit are active compliance (control of the displacement between the end-effector and the payload), to allow precision control of payload motions, and inertial load relief, to prevent the transmission of loads between the end-effector and the payload. This paper presents the concept and studies the active compliance feature using a simulation and hardware. Results of the simulation show the effectiveness of the EPCU in smoothing the motion of the payload. Results are presented from initial, limited tests of a laboratory hardware unit on a robotic arm testbed at the Marshall Space Flight Center. Tracking performance of the arm in a constant-speed automated retraction and extension maneuver of a heavy payload, with and without the unit active, is compared for the design speed and higher speeds. Simultaneous load reduction and tracking performance are demonstrated using the EPCU.
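    Active compliance of this kind is often realized as a virtual spring-damper acting on the measured end-effector-to-payload displacement. A minimal hedged sketch (the PD form and the gains are illustrative, not the EPCU's actual control law):

    ```python
    def compliance_force(disp_m: float, vel_ms: float,
                         k: float = 500.0, c: float = 50.0) -> float:
        """Active compliance as a spring-damper law on the measured
        end-effector-to-payload displacement (gains are invented)."""
        return -k * disp_m - c * vel_ms

    # 5 mm offset, closing at 2 cm/s -> corrective force command in newtons
    print(f"{compliance_force(0.005, -0.02):.2f} N")
    ```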

  1. Importance of soil-water relation in assessment endpoint in bioremediated soils: Plant growth and soil physical properties

    International Nuclear Information System (INIS)

    Li, X.; Sawatsky, N.

    1995-01-01

    Much effort has been focused on defining the end-point of bioremediated soils by chemical analysis (Alberta Tier 1 or CCME Guideline for Contaminated Soils) or toxicity tests. However, these tests do not completely assess the soil quality, or the capability of the soil to support plant growth after bioremediation. This study compared barley (Hordeum vulgare) growth on: (1) non-contaminated agricultural topsoil, (2) oil-contaminated soil (4% total extractable hydrocarbons, or TEH), and (3) oil-contaminated soil treated by bioremediation (< 2% TEH). Soil physical properties including water retention, water uptake, and water repellence were measured. The results indicated that the growth of barley was significantly reduced by oil-contamination of agricultural topsoil. Furthermore, bioremediation did not improve the barley yield. The lack of effects from bioremediation was attributed to development of water repellence in hydrocarbon contaminated soils. There seemed to be a critical water content around 18% to 20% in contaminated soils. Above this value the water uptake by contaminated soil was near that of the agricultural topsoil. For lower water contents, there was a strong divergence in sorptivity between contaminated and agricultural topsoil. For these soils, water availability was likely the single most important parameter controlling plant growth. This parameter should be considered in assessing endpoint of bioremediation for hydrocarbon contaminated soils

  2. Surrogate endpoints for overall survival in chemotherapy and radiotherapy trials in operable and locally advanced lung cancer: a re-analysis of meta-analyses of individual patients' data

    NARCIS (Netherlands)

    Mauguen, Audrey; Pignon, Jean-Pierre; Burdett, Sarah; Domerg, Caroline; Fisher, David; Paulus, Rebecca; Mandrekar, Samithra J.; Belani, Chandra P.; Shepherd, Frances A.; Eisen, Tim; Pang, Herbert; Collette, Laurence; Sause, William T.; Dahlberg, Suzanne E.; Crawford, Jeffrey; O'Brien, Mary; Schild, Steven E.; Parmar, Mahesh; Tierney, Jayne F.; Le Pechoux, Cécile; Michiels, Stefan; Burdett, S.; Fisher, D.; Le Péchoux, C.; Mauguen, A.; Michiels, S.; Pignon, J. P.; Tierney, J. F.; Belani, C. P.; Collette, L.; Dahlberg, S.; Eisen, T.; Mandrekar, S.; O'Brien, M.; Parmar, M.; Pang, H.; Paulus, R.; Crawford, J.; Sause, W.; Schild, S. E.; Shepherd, F.; Arriagada, R.; Atagi, S.; Auperin, A.; Ball, D.; Baumann, M.; Behrendt, K.; Belderbos, J.; Koning, C. C. E.; Uitterhoeve, A.

    2013-01-01

    The gold standard endpoint in clinical trials of chemotherapy and radiotherapy for lung cancer is overall survival. Although reliable and simple to measure, this endpoint takes years to observe. Surrogate endpoints that would enable earlier assessments of treatment effects would be useful. We

  3. 46 CFR 174.360 - Calculations.

    Science.gov (United States)

    2010-10-01

    ... GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SUBDIVISION AND STABILITY SPECIAL RULES PERTAINING TO SPECIFIC VESSEL TYPES Special Rules Pertaining to Dry Cargo Ships § 174.360 Calculations. Each ship to... for that ship by the International Convention for the Safety of Life at Sea, 1974, as amended, chapter...

  4. Simple Calculation Programs for Biology Other Methods

    Indian Academy of Sciences (India)

    Simple Calculation Programs for Biology Other Methods. Hemolytic potency of drugs. Raghava et al. (1994) Biotechniques 17: 1148. FPMAP: methods for classification and identification of microorganisms 16SrRNA. Graphical display of restriction and fragment map of ...

  5. Simple Calculation Programs for Biology Immunological Methods

    Indian Academy of Sciences (India)

    Simple Calculation Programs for Biology Immunological Methods. Computation of Ab/Ag concentration from ELISA data. Graphical method; Raghava et al. (1992) J. Immunol. Methods 153: 263. Determination of affinity of monoclonal antibody. Using non-competitive ...

  6. 40 CFR 75.83 - Calculation of Hg mass emissions and heat input rate.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Calculation of Hg mass emissions and... (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Hg Mass Emission Provisions § 75.83 Calculation of Hg mass emissions and heat input rate. The owner or operator shall calculate Hg mass emissions...

  7. Calculating Student Grades.

    Science.gov (United States)

    Allswang, John M.

    1986-01-01

    This article provides two short microcomputer gradebook programs. The programs, written in BASIC for the IBM-PC and Apple II, provide statistical information about class performance and calculate grades either on a normal distribution or based on teacher-defined break points. (JDH)
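    The two grading modes described above are easy to sketch. A hedged illustration (the cut-offs and the z-score mapping are invented, not the article's BASIC listings):

    ```python
    from statistics import mean, stdev

    def letter_grade(score: float,
                     breaks=((90, "A"), (80, "B"), (70, "C"), (60, "D"))) -> str:
        """Grade from teacher-defined break points."""
        for cutoff, letter in breaks:
            if score >= cutoff:
                return letter
        return "F"

    def curved_grade(score: float, all_scores: list) -> str:
        """Grade 'on a normal distribution': letter from the score's distance
        to the class mean in SD units."""
        z = (score - mean(all_scores)) / stdev(all_scores)
        return "A" if z >= 1.0 else "B" if z >= 0.0 else "C" if z >= -1.0 else "D"

    scores = [93.0, 77.0, 58.0, 85.0, 70.0]
    print([letter_grade(s) for s in scores])          # ['A', 'C', 'F', 'B', 'C']
    print([curved_grade(s, scores) for s in scores])
    ```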

  8. Cooling tower calculations

    International Nuclear Information System (INIS)

    Simonkova, J.

    1988-01-01

    The problems of the dynamic calculation of cooling towers with forced and natural air draft are summed up. The quantities and relations are given that characterize the simultaneous exchange of momentum, heat and mass in the evaporative cooling of water by atmospheric air in the packings of cooling towers. The method of solution is clarified for the calculation of evaporation criteria and thermal characteristics of countercurrent and cross-current cooling systems. The procedure for the calculation of cooling towers is demonstrated, and the effect of the operating mode, at constant air number or constant outlet air volume flow, on the course of the correction curves in ventilator cooling towers is assessed. In cooling towers with natural air draft, the unevenness of the water and air flow is assessed with respect to its effect on the resulting cooling efficiency of the towers. The calculation of the thermal and resistance response curves and the cooling curves of hydraulically unevenly loaded towers is demonstrated for a water flow rate parameter graded radially by 20% along the cross-section of the packing. Flow rate unevenness of air due to wind impact on the outlet air flow from the tower significantly affects the temperatures of cooled water in natural air draft cooling towers of a design with lower demands on aerodynamics, at wind velocities as low as 2 m·s⁻¹, as was demonstrated on a concrete example. (author). 11 figs., 10 refs

  9. Hypervelocity impact cratering calculations

    Science.gov (United States)

    Maxwell, D. E.; Moises, H.

    1971-01-01

    A summary is presented of prediction calculations on the mechanisms involved in hypervelocity impact cratering and the response of earth media. Considered are: (1) a one-gram lithium-magnesium alloy impacting basalt normally at 6.4 km/sec, and (2) a large terrestrial impact corresponding to that of Sierra Madera.

  10. Languages for structural calculations

    International Nuclear Information System (INIS)

    Thomas, J.B.; Chambon, M.R.

    1988-01-01

    The differences between human and computing languages are recalled. It is argued that they are to some extent structured in antagonistic ways. Languages in structural calculation, in the past, present, and future, are considered. The contribution of artificial intelligence is stressed

  11. Reactor dynamics calculations

    International Nuclear Information System (INIS)

    Devooght, J.; Lefvert, T.; Stankiewiez, J.

    1981-01-01

    This chapter deals with the work done in reactor dynamics within the Coordinated Research Program on Transport Theory and Advanced Reactor Calculations by three groups in Belgium, Poland, Sweden and Italy. Discretization methods in diffusion theory, collision probability methods in time-dependent neutron transport and the singular perturbation method are presented in this paper

  12. Equilibrium fission model calculations

    International Nuclear Information System (INIS)

    Beckerman, M.; Blann, M.

    1976-01-01

    In order to aid in understanding the systematics of heavy ion fission and fission-like reactions in terms of the target-projectile system, bombarding energy and angular momentum, fission widths are calculated using an angular momentum dependent extension of the Bohr-Wheeler theory and particle emission widths using angular momentum coupling

  13. Topological Photonics for Continuous Media

    Science.gov (United States)

    Silveirinha, Mario

    Photonic crystals have revolutionized light-based technologies during the last three decades. Notably, it was recently discovered that the light propagation in photonic crystals may depend on some topological characteristics determined by the manner in which the light states are mutually entangled. The usual topological classification of photonic crystals explores the fact that these structures are periodic. The periodicity is essential to ensure that the underlying wave vector space is a closed surface with no boundary. In this talk, we prove that it is possible to calculate Chern invariants for a wide class of continuous bianisotropic electromagnetic media with no intrinsic periodicity. The nontrivial topology of the relevant continuous materials is linked with the emergence of edge states. Moreover, we will demonstrate that continuous photonic media with time-reversal symmetry can be topologically characterized by a Z2 integer. This novel classification extends for the first time the theory of electronic topological insulators to a wide range of photonic platforms, and is expected to have an impact on the design of novel photonic systems that enable a topologically protected transport of optical energy. This work is supported in part by Fundacao para a Ciencia e a Tecnologia Grant Number PTDC/EEI-TEL/4543/2014.
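    For reference, the Chern invariant in question is the integral of the Berry curvature over wave-vector space. A standard sketch of the definitions (not the talk's derivation, which must additionally regularize the unbounded k-plane of a continuous medium):

    ```latex
    \begin{align}
      \mathbf{A}_n(\mathbf{k}) &= i\,\langle u_{n\mathbf{k}} \mid \nabla_{\mathbf{k}}\, u_{n\mathbf{k}} \rangle
          && \text{(Berry connection of band } n\text{)}\\
      \mathcal{F}_n(\mathbf{k}) &= \partial_{k_x} A_{n,y} - \partial_{k_y} A_{n,x}
          && \text{(Berry curvature)}\\
      C_n &= \frac{1}{2\pi} \int \mathcal{F}_n(\mathbf{k})\, \mathrm{d}^2 k \;\in\; \mathbb{Z}
          && \text{(Chern number)}
    \end{align}
    ```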

  14. Extension of TOPAS for the simulation of proton radiation effects considering molecular and cellular endpoints

    International Nuclear Information System (INIS)

    Polster, Lisa; Schuemann, Jan; Rinaldi, Ilaria; McNamara, Aimee L; Paganetti, Harald; Burigo, Lucas; Stewart, Robert D; Attili, Andrea; Carlson, David J; Sato, Tatsuhiko; Ramos Méndez, José; Faddegon, Bruce; Perl, Joseph

    2015-01-01

    The aim of this work is to extend a widely used proton Monte Carlo tool, TOPAS, towards the modeling of relative biological effectiveness (RBE) distributions in experimental arrangements as well as patients. TOPAS provides a software core which users configure by writing parameter files to, for instance, define application-specific geometries and scoring conditions. Expert users may further extend TOPAS scoring capabilities by plugging in their own additional C++ code. This structure was utilized for the implementation of eight biophysical models suited to calculate proton RBE. As far as physics parameters are concerned, four of these models are based on the proton linear energy transfer, while the others are based on DNA double strand break induction and the frequency-mean specific energy, lineal energy, or delta-electron-generated track structure. The biological input parameters for all models are typically inferred from fits of the models to radiobiological experiments. The model structures have been implemented in a coherent way within the TOPAS architecture. Their performance was validated against measured experimental data on proton RBE in a spread-out Bragg peak using V79 Chinese Hamster cells. This work is an important step in bringing biologically optimized treatment planning for proton therapy closer to clinical practice, as it will allow researchers to refine and compare pre-defined as well as user-defined models. (paper)
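    To illustrate the simplest family of models mentioned (LET-based), a hedged sketch of an RBE-weighted dose, with a generic linear form whose slope is a placeholder rather than one of the paper's fitted parameters:

    ```python
    def rbe_weighted_dose(dose_gy: float, let_kev_um: float, k: float = 0.04) -> float:
        """Toy LET-based weighting, RBE = 1 + k * LET_d, applied voxel-wise.
        The slope k (um/keV) is a placeholder, not a fitted model parameter."""
        return (1.0 + k * let_kev_um) * dose_gy

    # 2 Gy physical dose at a dose-averaged LET of 8 keV/um
    print(f"{rbe_weighted_dose(2.0, 8.0):.2f} Gy(RBE)")   # 2.64
    ```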

  15. Feasibility study on embedded transport core calculations

    International Nuclear Information System (INIS)

    Ivanov, B.; Zikatanov, L.; Ivanov, K.

    2007-01-01

    The main objective of this study is to develop an advanced core calculation methodology based on embedded diffusion and transport calculations. The scheme proposed in this work is based on embedded diffusion or SP3 pin-by-pin local fuel assembly calculation within the framework of the Nodal Expansion Method (NEM) diffusion core calculation. The SP3 method has gained popularity in the last 10 years as an advanced method for neutronics calculation. NEM is a multi-group nodal diffusion code developed, maintained and continuously improved at the Pennsylvania State University. The developed calculation scheme is a non-linear iteration process, which involves cross-section homogenization, on-line discontinuity factor generation, and boundary conditions evaluation by the global solution passed to the local calculation. In order to accomplish the local calculation, a new code has been developed based on the Finite Elements Method (FEM), which is capable of performing both diffusion and SP3 calculations. The new code will be used in the framework of the NEM code in order to perform embedded pin-by-pin diffusion and SP3 calculations on a fuel-assembly basis. The development of the diffusion and SP3 FEM code is presented first, followed by its application to several problems. A description of the proposed embedded scheme is provided next, along with preliminary results for the C3 MOX benchmark. The results from the embedded calculations are compared with direct pin-by-pin whole core calculations in terms of accuracy and efficiency, followed by conclusions about the feasibility of the proposed embedded approach. (authors)
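
    The overall control flow of such an embedded scheme can be sketched in a few lines. In the sketch below every solver function is a trivial stand-in so the loop actually runs; none of them reproduces NEM or the FEM local solver.

      # Skeleton of the embedded non-linear iteration; the three solver
      # functions are simple stand-ins, not implementations of NEM/FEM.
      def solve_global_nodal(xs_per_assembly):
          # stand-in global pass: "flux" ~ 1/absorption, "k" from a toy balance
          flux = [1.0 / xs["abs"] for xs in xs_per_assembly]
          k = (sum(xs["nu_fis"] * f for xs, f in zip(xs_per_assembly, flux))
               / sum(xs["abs"] * f for xs, f in zip(xs_per_assembly, flux)))
          return flux, k

      def solve_local_fine(xs, boundary_flux):
          # stand-in for the pin-by-pin diffusion/SP3 FEM solve
          return [boundary_flux * w for w in (0.9, 1.0, 1.1)]

      def homogenize(xs, local_flux):
          # stand-in flux-weighted re-homogenization (identity here)
          return dict(xs)

      def embedded_iteration(xs_per_assembly, tol=1e-6, max_outer=50):
          k_old = 0.0
          for outer in range(max_outer):
              flux, k = solve_global_nodal(xs_per_assembly)   # global pass
              xs_per_assembly = [
                  homogenize(xs, solve_local_fine(xs, f))     # local pass per assembly
                  for xs, f in zip(xs_per_assembly, flux)
              ]
              if abs(k - k_old) < tol:
                  return k, outer
              k_old = k
          raise RuntimeError("embedded iteration did not converge")

      core = [{"abs": 0.012, "nu_fis": 0.013}, {"abs": 0.010, "nu_fis": 0.011}]
      print(embedded_iteration(core))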

  16. Trieste will continue

    International Nuclear Information System (INIS)

    1968-01-01

    Trieste will continue to be the home of the International Centre for Theoretical Physics for the foreseeable future. An agreement signed in Vienna during December between the Italian Government and the Agency brought this assurance. (author)

  17. Nocturnal continuous glucose monitoring

    DEFF Research Database (Denmark)

    Bay, Christiane; Kristensen, Peter Lommer; Pedersen-Bjergaard, Ulrik

    2013-01-01

    Background: A reliable method to detect biochemical nocturnal hypoglycemia is highly needed, especially in patients with recurrent severe hypoglycemia. We evaluated the reliability of nocturnal continuous glucose monitoring (CGM) in patients with type 1 diabetes at high risk of severe...

  18. Continual improvement plan

    Science.gov (United States)

    1994-01-01

    NASA's approach to continual improvement (CI) is a systems-oriented, agency-wide approach that builds on the past accomplishments of NASA Headquarters and its field installations and helps achieve NASA's vision, mission, and values. The NASA of the future will fully use the principles of continual improvement in every aspect of its operations. This NASA CI plan defines a systematic approach and a model for continual improvement throughout NASA, stressing systems integration and optimization. It demonstrates NASA's constancy of purpose for improvement - a consistent vision of NASA as a worldwide leader in top-quality science, technology, and management practices. The CI plan provides the rationale, structures, methods, and steps, and it defines NASA's short term (1-year) objectives for improvement. The CI plan presents the deployment strategies necessary for cascading the goals and objectives throughout the agency. It also provides guidance on implementing continual improvement with participation from top leadership and all levels of employees.

  19. Continuing Medical Education

    African Journals Online (AJOL)

  20. Branching trajectory continual integral

    International Nuclear Information System (INIS)

    Maslov, V.P.; Chebotarev, A.M.

    1980-01-01

    A heuristic definition of the Feynman continual integral over branching trajectories is suggested, which makes it possible to obtain in closed form the solution of the Cauchy problem for the model Hartree equation. A number of properties of the solution are derived from an integral representation. In particular, the quasiclassical asymptotics, the exact solution in the Gaussian case, and the perturbation theory series are described. The existence theorem for the simplest continual integral over branching trajectories is proved.

  1. Reaction paths and equilibrium end-points in solid-solution aqueous-solution systems

    Science.gov (United States)

    Glynn, P.D.; Reardon, E.J.; Plummer, Niel; Busenberg, E.

    1990-01-01

    Equations are presented describing equilibrium in binary solid-solution aqueous-solution (SSAS) systems after a dissolution, precipitation, or recrystallization process, as a function of the composition and relative proportion of the initial phases. Equilibrium phase diagrams incorporating the concept of stoichiometric saturation are used to interpret possible reaction paths and to demonstrate relations between stoichiometric saturation, primary saturation, and thermodynamic equilibrium states. The concept of stoichiometric saturation is found useful in interpreting and putting limits on dissolution pathways, but there currently is no basis for possible application of this concept to the prediction and/or understanding of precipitation processes. Previously published dissolution experiments for (Ba, Sr)SO4 and orthorhombic (Sr, Ca)CO3 solids are interpreted using equilibrium phase diagrams. These studies show that stoichiometric saturation can control, or at least influence, initial congruent dissolution pathways. The results for orthorhombic (Sr, Ca)CO3 solids reveal that stoichiometric saturation can also control the initial stages of incongruent dissolution, despite the intrinsic instability of some of the initial solids. In contrast, recrystallization experiments in the highly soluble KCl-KBr-H2O system demonstrate equilibrium. The excess free energy of mixing calculated for K(Cl, Br) solids is closely modeled by the relation G^E = x_KBr x_KCl RT [a0 + a1(2x_KBr - 1)], where a0 is 1.40 ± 0.02, a1 is -0.08 ± 0.03 at 25°C, and x_KBr and x_KCl are the mole fractions of KBr and KCl in the solids. The phase diagram constructed using this fit reveals an alyotropic maximum located at x_KBr = 0.676 and at a total solubility product ΣΠ = [K+]([Cl-] + [Br-]) = 15.35.
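
    The fitted subregular (two-parameter Guggenheim) expression above is straightforward to evaluate; the sketch below tabulates G^E across the K(Cl, Br) series using the quoted coefficients. Locating the alyotropic maximum itself would additionally require the end-member solubility products, which are not given here, so only the excess-energy fit is reproduced.

      import numpy as np

      R = 8.314462618   # gas constant, J/(mol K)
      T = 298.15        # 25 C in kelvin

      def excess_g(x_kbr, a0=1.40, a1=-0.08):
          """G^E = x_KBr x_KCl R T [a0 + a1 (2 x_KBr - 1)], in J/mol."""
          x_kcl = 1.0 - x_kbr
          return x_kbr * x_kcl * R * T * (a0 + a1 * (2.0 * x_kbr - 1.0))

      for x in np.linspace(0.0, 1.0, 11):
          print(f"x_KBr = {x:.1f}   G^E = {excess_g(x):7.1f} J/mol")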

  2. Stability calculations for MHD magnets

    International Nuclear Information System (INIS)

    Turner, L.R.; Wang, S.T.; Harrang, J.

    1978-01-01

    When a cryostable composite conductor carrying current experiences a heat input from a mechanical perturbation, a normal region develops which initially propagates and then either collapses or continues to propagate. A computer model has been devised to study this phenomenon. The model incorporates initial or continuing heat input from mechanical perturbations, heat conducted to the neighboring elements of the conductor and, if appropriate, heat conducted through insulation to neighboring turns. Heat is transferred to the helium coolant according to a specified heat transfer coefficient. If the element of conductor is in a normal or current-sharing state, resistive heating also occurs. The (unstable) equilibrium state of heat generation and conduction has been studied; results agree with those of a static calculation. The model has been validated against experimental measurements of response to heat pulses. The model suffers from uncertainties in transient heat transfer to the helium, but even more from uncertainties in the perturbing heat pulse which the magnet might be expected to suffer.
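
    The energy balance described above lends itself to a one-dimensional explicit finite-difference sketch: conduction along the conductor, a Joule source that switches on through the current-sharing range, and film cooling to the helium bath. All material numbers below are placeholders, not the values of the validated model.

      import numpy as np

      n, dx, dt = 200, 1e-3, 1e-5           # nodes, node size (m), time step (s)
      k, rho_c = 300.0, 2.0e6               # W/(m K); volumetric heat capacity, J/(m^3 K)
      q_joule = 5.0e7                       # fully normal Joule heating, W/m^3 (placeholder)
      h, p_over_a = 1000.0, 400.0           # film coefficient W/(m^2 K); cooled perimeter/area, 1/m
      t_bath, t_cs, t_c = 4.2, 6.0, 8.0     # bath, current-sharing, critical temperatures (K)

      T = np.full(n, t_bath)
      T[n // 2 - 5 : n // 2 + 5] = 9.0      # initial heat pulse drives a local normal zone

      for _ in range(20000):                # 0.2 s of simulated time
          lap = (np.roll(T, 1) - 2.0 * T + np.roll(T, -1)) / dx**2
          lap[0] = lap[-1] = 0.0            # adiabatic ends
          frac = np.clip((T - t_cs) / (t_c - t_cs), 0.0, 1.0)   # current-sharing ramp
          T += dt * (k * lap + frac * q_joule - h * p_over_a * (T - t_bath)) / rho_c

      print("normal zone length:", np.sum(T > t_cs) * dx, "m")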

  3. Photo-neutron reaction cross-section for 93Nb in the end-point bremsstrahlung energies of 12–16 and 45–70 MeV

    International Nuclear Information System (INIS)

    Naik, H.; Kim, G.N.; Schwengner, R.; Kim, K.; Zaman, M.; Tatari, M.; Sahid, M.; Yang, S.C.; John, R.; Massarczyk, R.; Junghans, A.; Shin, S.G.; Key, Y.; Wagner, A.; Lee, M.W.; Goswami, A.; Cho, M.-H.

    2013-01-01

    The photo-neutron cross-sections of 93Nb at the end-point bremsstrahlung energies of 12, 14 and 16 MeV as well as 45, 50, 55, 60 and 70 MeV have been determined by the activation and the off-line γ-ray spectrometric techniques using the 20 MeV electron linac (ELBE) at Helmholtz-Zentrum Dresden-Rossendorf (HZDR), Dresden, Germany, and the 100 MeV electron linac at Pohang Accelerator Laboratory (PAL), Pohang, Korea. The 93Nb(γ, xn; x=1–4) reaction cross-sections as a function of photon energy were also calculated using the computer code TALYS 1.4. The flux-weighted average values were obtained from the experimental and the theoretical (TALYS) values based on mono-energetic photons. The experimental values of the present work are in good agreement with the flux-weighted theoretical values of TALYS 1.4 but are slightly higher than the flux-weighted experimental data for mono-energetic photons. It was also found that the theoretical and the experimental values of the present work and the literature data for the 93Nb(γ, xn) reaction cross-sections increase from their threshold values up to a certain energy, where other reaction channels open. However, the increases of the 93Nb(γ, n) and 93Nb(γ, 2n) reaction cross-sections are sharper than those of the 93Nb(γ, 3n) and 93Nb(γ, 4n) reactions. The sharp increase of the 93Nb(γ, n) and 93Nb(γ, 2n) reaction cross-sections from the threshold up to 17–22 MeV is due to the Giant Dipole Resonance (GDR) effect, in addition to the role of excitation energy. Above a certain energy, the individual 93Nb(γ, xn) reaction cross-sections decrease with increasing bremsstrahlung energy due to the opening of other reaction channels.
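
    The flux weighting itself is a short quadrature once a spectral shape is assumed; in the sketch below a 1/E shape stands in for the true bremsstrahlung flux and the excitation function is an invented GDR-like bump, so only the averaging recipe reflects the text.

      import numpy as np

      def flux_weighted_sigma(e_mev, sigma_mb, e_end, e_threshold):
          """<sigma> = int(sigma * phi) / int(phi) from threshold to end-point."""
          mask = (e_mev >= e_threshold) & (e_mev <= e_end)
          e, s = e_mev[mask], sigma_mb[mask]
          phi = 1.0 / e                      # assumed bremsstrahlung-like shape
          return np.trapz(s * phi, e) / np.trapz(phi, e)

      e_grid = np.linspace(8.0, 70.0, 500)                         # MeV
      sigma = 200.0 * np.exp(-0.5 * ((e_grid - 17.0) / 3.0) ** 2)  # toy GDR peak, mb
      print(flux_weighted_sigma(e_grid, sigma, e_end=16.0, e_threshold=8.8), "mb")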

  4. Course on hybrid calculation

    International Nuclear Information System (INIS)

    Weill, J.; Tellier; Bonnemay; Craigne; Chareton; Di Falco

    1969-02-01

    After a definition of hybrid calculation (combination of analogue and digital calculation) with a distinction between series and parallel hybrid computing, and a description of a hybrid computer structure and of task sharing between computers, this course proposes a description of hybrid hardware used in Saclay and Cadarache computing centres, and of operations performed by these systems. The next part addresses issues related to programming languages and software. The fourth part describes how a problem is organised for its processing on these computers. Methods of hybrid analysis are then addressed: resolution of optimisation problems, of partial differential equations, and of integral equations by means of different methods (gradient, maximum principle, characteristics, functional approximation, time slicing, Monte Carlo, Neumann iteration, Fischer iteration)

  5. Calculation of projected ranges

    International Nuclear Information System (INIS)

    Biersack, J.P.

    1980-09-01

    The concept of multiple scattering is reconsidered for obtaining the directional spreading of ion motion as a function of energy loss. From this the mean projection of each pathlength element of the ion trajectory is derived which, upon summation or integration, leads to the desired mean projected range. In special cases, the calculation can be carried out analytically; otherwise a simple general algorithm is derived which is suitable even for the smallest programmable calculators. Necessary input for the present treatment consists only of generally accessible stopping power and straggling formulas. The procedure does not rely on scattering cross sections, e.g. power potential or f(t^(1/2)) approximations. The present approach lends itself easily to including electronic straggling, to treating composed target materials, or even to accounting for the so-called time integral. (orig.)
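
    The summation described here, the projected range as a running sum of mean path-element projections, can be phrased in a few lines. In the sketch below both the stopping power S(E) and the decay law for the mean direction cosine are invented placeholders; only the bookkeeping mirrors the text.

      import math

      def projected_range(e0_kev, de_kev=1.0, kappa=0.2):
          """R_p = sum over energy-loss steps of <cos theta> * ds.
          stopping() and the kappa spreading law are placeholders."""
          def stopping(e):                      # keV/nm, ~sqrt(E) shape
              return 0.5 * math.sqrt(max(e, 1e-6))
          e, cos_bar, rp = e0_kev, 1.0, 0.0
          while e > de_kev:
              ds = de_kev / stopping(e)         # pathlength element (nm)
              rp += cos_bar * ds                # its mean projection on the axis
              cos_bar *= math.exp(-kappa * de_kev / e)   # assumed angular spreading
              e -= de_kev
          return rp

      print(f"illustrative projected range: {projected_range(100.0):.0f} nm")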

  6. Spallation reactions: calculations

    International Nuclear Information System (INIS)

    Bertini, H.W.

    1975-01-01

    Current methods for calculating spallation reactions over various energy ranges are described and evaluated. Recent semiempirical fits to existing data will probably yield the most accurate predictions for these reactions in general. However, if the products in question have binding energies appreciably different from their isotopic neighbors and if the cross section is approximately 30 mb or larger, then the intranuclear-cascade-evaporation approach is probably better suited.

  7. Continuing versus Stopping Prestroke Antihypertensive Therapy in Acute Intracerebral Hemorrhage

    DEFF Research Database (Denmark)

    Krishnan, Kailash; Scutt, Polly; Woodhouse, Lisa

    2016-01-01

    BACKGROUND AND PURPOSE: More than 50% of patients with acute intracerebral hemorrhage (ICH) are taking antihypertensive drugs before ictus. Although antihypertensive therapy should be given long term for secondary prevention, whether to continue or stop such treatment during the acute phase of ICH...... remains unclear, a question that was addressed in the Efficacy of Nitric Oxide in Stroke (ENOS) trial. METHODS: ENOS was an international multicenter, prospective, randomized, blinded endpoint trial. Among 629 patients with ICH and systolic blood pressure between 140 and 220 mmHg, 246 patients who were...... taking antihypertensive drugs were assigned to continue (n = 119) or to stop (n = 127) taking drugs temporarily for 7 days. The primary outcome was the modified Rankin Score at 90 days. Secondary outcomes included death, length of stay in hospital, discharge destination, activities of daily living, mood...

  8. Performance assessment calculational exercises

    International Nuclear Information System (INIS)

    Barnard, R.W.; Dockery, H.A.

    1990-01-01

    The Performance Assessment Calculational Exercises (PACE) are an ongoing effort coordinated by the Yucca Mountain Project Office. The objectives of fiscal year 1990 work, termed PACE-90, as outlined in the Department of Energy Performance Assessment (PA) Implementation Plan, were to develop PA capabilities among Yucca Mountain Project (YMP) participants by calculating performance of a Yucca Mountain (YM) repository under "expected" and also "disturbed" conditions, to identify critical elements and processes necessary to assess the performance of YM, and to perform sensitivity studies on key parameters. It was expected that the PACE problems would aid in the development of conceptual models and eventual evaluation of site data. The PACE-90 participants calculated transport of a selected set of radionuclides through a portion of Yucca Mountain for a period of 100,000 years. Results include analyses of fluid-flow profiles, development of a source term for radionuclide release, and simulations of contaminant transport in the fluid-flow field. Later work included development of a problem definition for perturbations to the originally modeled conditions and for some parametric sensitivity studies.

  9. An application of the 'end-point' method to the minimum critical mass problem in two group transport theory

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2003-01-01

    A two group integral equation derived using transport theory, which describes the fuel distribution necessary for a flat thermal flux and minimum critical mass, is solved by the classical end-point method. This method has a number of advantages and in particular highlights the changing behaviour of the fissile mass distribution function in the neighbourhood of the core-reflector interface. We also show how the reflector thermal flux behaves and explain the origin of the maximum which arises when the critical size is less than that corresponding to minimum critical mass. A comparison is made with diffusion theory and the necessary and somewhat artificial presence of surface delta functions in the fuel distribution is shown to be analogous to the edge transients that arise naturally in transport theory

  10. Prediction of the relative toxicity of environmental toxins as a function of behavioral and non-behavioral endpoints

    International Nuclear Information System (INIS)

    Young, R.W.

    1979-01-01

    This study was conducted in order to examine the differential effects of behavioral and non-behavioral endpoints on the prediction of the relative toxicity of an environmental toxin. The effects of ionizing radiation were taken as the model for this evaluation. Forty rhesus monkeys were irradiated in groups of four at five different dose levels of high-energy neutron and Bremsstrahlung radiations. Measures of behavioral performance, emesis and mortality were taken for each subject in order to test the hypotheses that behavioral indices would be more sensitive to gamma radiation than would physiological indices and that the physiological indices would be more sensitive to neutron radiations than would behavioral indices. The results supported these hypotheses.

  11. Some thoughts on the nature of chromosomal aberrations and their use as a quantitative end-point for radiobiological studies

    International Nuclear Information System (INIS)

    Savage, J.R.K.

    1978-01-01

    A vital condition when chromosomal aberrations are to be used as a quantitative end-point (e.g. for constructing a dose response curve) is that a specific dose must produce a specific yield of aberrations under a given set of experimental conditions. In practice, there are very few cell systems where this condition is met. The majority show significant variations in observed yield with time between irradiation and sampling, indicative of variable radiosensitivity within the cell population. The profile of this yield time curve is determined by the cell-cycle kinetics and therefore is itself subject to modification by radiation through mitotic delay and perturbation. Thus in such heterogeneous populations, each increment of dose not only induces more aberrations, but at the same time modifies the recovered yield per cell. This has an obvious bearing upon the interpretation of the shape of any dose-response curve obtained

  12. Comparison between the radiosensitivity of human, mouse and chicken fibroblast-like cells using short-term endpoints

    International Nuclear Information System (INIS)

    Diatloff-Zito, C.; Loria, E.; Maciera-Coelho, A.; Deschavanne, P.J.; Malaise, E.P.

    1981-01-01

    A comparative study has been made of the radiosensitivity of fibroblastic cell lines from three different animal species: human, mouse and chicken. Endpoints reflecting short term responses were utilized: colony forming ability (CFA), DNA single strand break (SSB) repair and repair of potentially lethal damage (PLD). Regardless of the criterion employed, the response to radiation varies from one species to another. According to survival curves, chicken cells appear to be more radioresistant than those of human and mouse. SSB repair is apparently absent in murine cells, partial in chicken cells and complete in human cells. This lack of correlation between survival curves and SSB repair demonstrates that survival of irradiated cells does not depend only (or at all) on the repair of SSB. The repair of PLD is much more efficient in human and chicken cells than in murine cells. (author)

  13. Note on simultaneous inferences about non-inferiority and superiority for a primary and a secondary endpoint.

    Science.gov (United States)

    Guilbaud, Olivier

    2011-11-01

    In their review of challenges to multiple testing in clinical trials, Hung and Wang (2010) considered the situation where a treatment is to be compared with an active comparator and the aim is to show non-inferiority and (if possible) superiority with respect to a primary and a secondary endpoint. This note extends their discussion of this particular situation, taking the sequentially rejective procedure they used for illustration as a starting point. Some alternative multiple testing procedures (MTPs) are considered, and corresponding simultaneous confidence regions are discussed that provide additional information "for free". The choice may then be based on the properties of these MTPs and corresponding confidence regions.
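
    As a concrete illustration of the sequentially rejective idea, here is a minimal sketch of a fixed-sequence procedure: each null hypothesis is tested at the full one-sided level, and testing stops at the first non-rejection. The ordering and the p-values are invented and do not reproduce the particular MTPs compared in the note.

      def fixed_sequence_test(p_values, alpha=0.025):
          """p_values: hypothesis -> p, in the pre-specified test order."""
          rejected = []
          for hypothesis, p in p_values.items():
              if p <= alpha:
                  rejected.append(hypothesis)
              else:
                  break                  # later hypotheses remain untested
          return rejected

      p = {"non-inferiority, primary": 0.001, "non-inferiority, secondary": 0.010,
           "superiority, primary": 0.020, "superiority, secondary": 0.200}
      print(fixed_sequence_test(p))      # first three rejected, last untested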

  14. The Medical Imaging & Technology Alliance conference on research endpoints appropriate for Medicare coverage of new PET radiopharmaceuticals.

    Science.gov (United States)

    Hillman, Bruce J; Frank, Richard A; Abraham, Brian C

    2013-09-01

    The outcomes of a 2011 Medical Imaging & Technology Alliance (MITA) conference helped shape considerations about what might be the most appropriate pathways for the regulatory and payment considerations of new PET radiopharmaceuticals. As follow-up to that conference, MITA convened a second conference of stakeholders to advise payers on what might be acceptable endpoints for clinical trials to support the coverage of novel PET agents. The conference involved experts on imaging and clinical research, providers of PET services, as well as representatives of interested medical societies, the PET industry, and the regulatory and payer communities. The principal outcome of their deliberations was that it was unrealistic to expect trials of new PET radiopharmaceuticals to directly demonstrate a health benefit. Rather, intermediate outcomes, such as a positive change in patient management, would be more efficient and appropriate.

  15. Continuous Markovian Logics

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Cardelli, Luca; Larsen, Kim Guldstrand

    2012-01-01

    Continuous Markovian Logic (CML) is a multimodal logic that expresses quantitative and qualitative properties of continuous-time labelled Markov processes with arbitrary (analytic) state-spaces, henceforth called continuous Markov processes (CMPs). The modalities of CML evaluate the rates...... of the exponentially distributed random variables that characterize the duration of the labeled transitions of a CMP. In this paper we present weak and strong complete axiomatizations for CML and prove a series of metaproperties, including the finite model property and the construction of canonical models. CML...... characterizes stochastic bisimilarity and it supports the definition of a quantified extension of the satisfiability relation that measures the "compatibility" between a model and a property. In this context, the metaproperties allow us to prove two robustness theorems for the logic stating that one can...

  16. Chloride and sulphate toxicity to Hydropsyche exocellata (Trichoptera, Hydropsychidae): Exploring intraspecific variation and sub-lethal endpoints.

    Science.gov (United States)

    Sala, Miquel; Faria, Melissa; Sarasúa, Ignacio; Barata, Carlos; Bonada, Núria; Brucet, Sandra; Llenas, Laia; Ponsá, Sergio; Prat, Narcís; Soares, Amadeu M V M; Cañedo-Arguelles, Miguel

    2016-10-01

    The rivers and streams of the world are becoming saltier due to human activities. In spite of the potential damage that salt pollution can cause to freshwater ecosystems, this is an issue that is currently poorly managed. Here we explored intraspecific differences in the sensitivity of freshwater fauna to two major ions (Cl(-) and SO4(2-)) using the net-spinning caddisfly Hydropsyche exocellata Dufour 1841 (Trichoptera, Hydropsychidae) as a model organism. We exposed H. exocellata to saline solutions (reaching a conductivity of 2.5 mS cm(-1)) with Cl(-):SO4(2-) ratios similar to those occurring in effluents coming from the meat, mining and paper industries, which release dissolved salts to rivers and streams in Spain. We used two different populations, coming from low and high conductivity streams. To assess toxicity, we measured sub-lethal endpoints: locomotion, symmetry of the food-capturing nets and oxidative stress biomarkers. According to biomarkers and net building, the population historically exposed to lower conductivities (B10) showed higher levels of stress than the population historically exposed to higher conductivities (L102). However, the differences between populations were not strong. For example, net symmetry was lower in the B10 than in the L102 only 48 h after treatment was applied, and biomarkers showed a variety of responses, with no discernible pattern. Also, treatment effects were rather weak, i.e. only some endpoints, and in most cases only in the B10 population, showed a significant response to treatment. The lack of consistent differences between populations and treatments could be related to the high salt tolerance of H. exocellata, since both populations were collected from streams with relatively high conductivities. The sub-lethal effects tested in this study offer an interesting and promising tool to monitor freshwater salinization by combining physiological and behavioural bioindicators.

  17. Ecotoxicological evaluation of the additive butylated hydroxyanisole using a battery with six model systems and eighteen endpoints.

    Science.gov (United States)

    Jos, Angeles; Repetto, Guillermo; Ríos, Juan Carlos; del Peso, Ana; Salguero, Manuel; Hazen, María José; Molero, María Luisa; Fernández-Freire, Paloma; Pérez-Martín, Jose Manuel; Labrador, Verónica; Cameán, Ana

    2005-01-26

    The occurrence and fate of additives in the aquatic environment is an emerging issue in environmental chemistry. This paper describes the ecotoxicological effects of the commonly used additive butylated hydroxyanisole (BHA) using a test battery comprising several different organisms and in vitro test systems, representing a proportion of the different trophic levels. The most sensitive system to BHA was the inhibition of bioluminescence in Vibrio fischeri bacteria, which resulted in an acute lowest observed adverse effect concentration (LOAEC) of 0.28 microM. The next most sensitive system was the immobilization of the cladoceran Daphnia magna, followed by: the inhibition of the growth of the unicellular alga Chlorella vulgaris; the endpoints evaluated in Vero (mammalian) cells (total protein content, LDH activity, neutral red uptake and MTT metabolization); mitotic index and root growth inhibition in the terrestrial plant Allium cepa; and finally, the endpoints used on the RTG-2 salmonid fish cell line (neutral red uptake, total protein content, MTS metabolization, lactate dehydrogenase leakage and activity, and glucose-6-phosphate dehydrogenase activity). Morphological alterations in RTG-2 cells were also assessed; these included loss of cells, induction of cellular pleomorphism, hydropic degeneration and induction of apoptosis at high concentrations. The results from this study also indicated that micronuclei were not induced in A. cepa exposed to BHA. The differences in sensitivity among the diverse systems that were used (EC50 ranged from 1.2 to >500 microM) underline the importance of a test battery approach in the evaluation of the ecological consequences of chemicals. According to the results, the levels of BHA reported in industrial wastewater would elicit adverse effects in the environment. This, coupled with its potential to bioaccumulate, makes BHA a pollutant of concern not only for acute exposures but also in the long term.

  18. Review of titanium dioxide nanoparticle phototoxicity: Developing a phototoxicity ratio to correct the endpoint values of toxicity tests

    Science.gov (United States)

    2015-01-01

    Titanium dioxide nanoparticles are photoactive and produce reactive oxygen species under natural sunlight. Reactive oxygen species can be detrimental to many organisms, causing oxidative damage, cell injury, and death. Most studies investigating TiO2 nanoparticle toxicity did not consider photoactivation and performed tests either in dark conditions or under artificial lighting that did not simulate natural irradiation. The present study summarizes the literature and derives a phototoxicity ratio between the results of nano-titanium dioxide (nano-TiO2) experiments conducted in the absence of sunlight and those conducted under solar or simulated solar radiation (SSR) for aquatic species. The phototoxicity ratio can therefore be used to correct endpoints of toxicity tests with nano-TiO2 that were performed in the absence of sunlight. Such corrections also may be important for regulators and risk assessors when reviewing previously published data. A significant difference was observed between the phototoxicity ratios of 2 distinct groups: aquatic species belonging to the order Cladocera, and all other aquatic species. The order Cladocera appeared very sensitive and prone to nano-TiO2 phototoxicity. On average, nano-TiO2 was 20 times more toxic to non-Cladocera and 1867 times more toxic to Cladocera (median values 3.3 and 24.7, respectively) after illumination. Both the median value and the 75% quartile of the phototoxicity ratio are proposed as the most practical values for the correction of endpoints of nano-TiO2 toxicity tests that were performed in dark conditions, or in the absence of sunlight.
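
    The proposed correction is simple arithmetic: divide a dark-condition endpoint by the group's phototoxicity ratio. A minimal sketch using the median ratios quoted above (the input EC50 is hypothetical):

      PHOTOTOXICITY_RATIO = {"cladocera": 24.7, "non_cladocera": 3.3}   # median values

      def corrected_endpoint(ec50_dark_mg_l, group):
          """Approximate the endpoint under sunlight from a dark-test value."""
          return ec50_dark_mg_l / PHOTOTOXICITY_RATIO[group]

      print(corrected_endpoint(100.0, "cladocera"))       # ~4.0 mg/L
      print(corrected_endpoint(100.0, "non_cladocera"))   # ~30.3 mg/L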

  19. Continuing bonds and place.

    Science.gov (United States)

    Jonsson, Annika; Walter, Tony

    2017-08-01

    Where do people feel closest to those they have lost? This article explores how continuing bonds with a deceased person can be rooted in a particular place or places. Some conceptual resources are sketched, namely continuing bonds, place attachment, ancestral places, home, reminder theory, and loss of place. The authors use these concepts to analyze interview material with seven Swedes and five Britons who often thought warmly of the deceased as residing in a particular place and often performing characteristic actions. The destruction of such a place, by contrast, could create a troubling, haunting absence, complicating the deceased's absent-presence.

  20. Introduction to Continuous Optimization

    DEFF Research Database (Denmark)

    Andreasson, Niclas; Evgrafov, Anton; Patriksson, Michael

    optimal solutions for continuous optimization models. The main part of the mathematical material therefore concerns the analysis and linear algebra that underlie the workings of convexity and duality, and necessary/sufficient local/global optimality conditions for continuous optimization problems. Natural...... algorithms are then developed from these optimality conditions, and their most important convergence characteristics are analyzed. The book answers many more questions of the form “Why?” and “Why not?” than “How?”. We use only elementary mathematics in the development of the book, yet are rigorous throughout...

  1. Continuous Platform Development

    DEFF Research Database (Denmark)

    Nielsen, Ole Fiil

    low risks and investments but also with relatively fuzzy results. When looking for new platform projects, it is important to make sure that the company and market is ready for the introduction of platforms, and to make sure that people from marketing and sales, product development, and downstream......, but continuous product family evolution challenges this strategy. The concept of continuous platform development is based on the fact that platform development should not be a one-time experience but rather an ongoing process of developing new platforms and updating existing ones, so that product family...

  2. Measuring titratable alkalinity by single versus double endpoint titration: An evaluation in two cyprinodont species and implications for characterizing net H+ flux in aquatic organisms.

    Science.gov (United States)

    Brix, Kevin V; Wood, Chris M; Grosell, Martin

    2013-01-01

    In this study, Na(+) uptake and acid-base balance in the euryhaline pupfish Cyprinodon variegatus variegatus were characterized when fish were exposed to pH 4.5 freshwater (7 mM Na(+)). As in the related cyprinodont, Fundulus heteroclitus, Na(+) uptake was significantly inhibited when fish were exposed to low pH water. However, it initially appeared that C. v. variegatus increased apparent net acid excretion at low pH relative to circumneutral pH. This result is opposite to previous observations for F. heteroclitus under similar conditions, where fish were observed to switch from apparent net H(+) excretion at circumneutral pH to apparent net H(+) uptake at low pH. Further investigation revealed that the disparate observations between these studies were the result of using double endpoint titrations to measure titratable alkalinity fluxes in the current study, while the earlier study utilized single endpoint titrations to measure these fluxes (i.e., Cyprinodon acid-base transport is qualitatively similar to Fundulus when characterized using single endpoint titrations). This led to a comparative investigation of the two methods. We hypothesized that either the single endpoint methodology was being influenced by a change in the buffer capacity of the water (e.g., mucus being released by the fish) at low pH, or the double endpoint methodology was not properly accounting for ammonia flux by the fish. A series of follow-up experiments indicated that the buffer capacity of the water did not change significantly, that excretion of protein (a surrogate for mucus) was actually reduced at low pH, and that the double endpoint methodology does not properly account for NH(3) excretion by fish under low pH conditions. As a result, it overestimates net H(+) excretion during low pH exposure. After applying the maximum possible correction for this error (i.e., assuming that all ammonia is excreted as NH(3)), the double endpoint methodology indicates that net H(+) transport was reduced to
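
    For context, the flux bookkeeping that both titration methods feed into can be sketched as follows. The sign convention (appearance in the water positive, net acid excretion computed as ammonia flux minus titratable-alkalinity flux) is a common one but is an assumption here, since conventions differ between laboratories.

      def flux(conc_final, conc_initial, volume_l, mass_g, hours):
          """Appearance rate in the water; positive = excreted by the fish."""
          return (conc_final - conc_initial) * volume_l / (mass_g * hours)

      def net_acid_excretion(talk0, talk1, amm0, amm1, volume_l, mass_g, hours):
          """talk* in ueq/L (titratable alkalinity), amm* in umol/L (total ammonia)."""
          j_talk = flux(talk1, talk0, volume_l, mass_g, hours)
          j_amm = flux(amm1, amm0, volume_l, mass_g, hours)
          # Ammonia leaving the fish as NH3 consumes a proton in acidic water,
          # so booking all of it as NH4+ overstates net H+ excretion -- the
          # double-endpoint bias described above is bounded by j_amm.
          return j_amm - j_talk

      # hypothetical 3 h flux period: 0.5 g fish in 0.2 L of water
      print(net_acid_excretion(120.0, 80.0, 0.0, 40.0, 0.2, 0.5, 3.0), "ueq/(g h)")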

  3. Accurate quantum chemical calculations

    Science.gov (United States)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  4. Zero Temperature Hope Calculations

    International Nuclear Information System (INIS)

    Rozsnyai, B. F.

    2002-01-01

    The primary purpose of the HOPE code is to calculate opacities over a wide temperature and density range. It can also produce equation of state (EOS) data. Since experimental data in the high temperature region are scarce, comparisons of predictions with the ample zero temperature data provide a valuable physics check of the code. In this report we show a selected few examples across the periodic table. Below we give brief general information about the physics of the HOPE code. The HOPE code is an "average atom" (AA) Dirac-Slater self-consistent code. The AA label in the case of finite temperature means that the one-electron levels are populated according to Fermi statistics; at zero temperature it means that the "aufbau" principle works, i.e. no a priori electronic configuration is set, although it can be done. As such, it is a one-particle model (any Hartree-Fock model is a one-particle model). The code is an "ion-sphere" model, meaning that the atom under investigation is neutral within the ion-sphere radius. Furthermore, the boundary conditions for the bound states are also set at the ion-sphere radius, which distinguishes the code from the INFERNO, OPAL and STA codes. Once the self-consistent AA state is obtained, the code proceeds to generate many-electron configurations and to calculate photoabsorption in the "detailed configuration accounting" (DCA) scheme. However, this last feature is meaningless at zero temperature. There is one important feature of the HOPE code which should be noted: any self-consistent model is self-consistent in the space of the occupied orbitals. The unoccupied orbitals, where electrons are lifted via photoexcitation, are unphysical. The rigorous way to deal with that problem is to carry out complete self-consistent calculations both in the initial and final states connecting photoexcitations, an enormous computational task. The Amaldi correction is an attempt to address this problem by distorting the

  5. Calculation of the inventory

    International Nuclear Information System (INIS)

    Heilbron Filho, P.F.L.; Oliveira Brandao, R. de.

    1988-04-01

    Point-kernel theory applied to a source uniformly distributed in a cylindrical geometry was utilized to estimate the Cs-137 content of each package of radioactive waste collected. The Taylor equation was employed to calculate the build-up factor, and the Green function G was adjusted by means of a least squares method. The theory also takes into account factors such as additional shielding, heterogeneity and humidity of the medium, as well as the associated uncertainties of the parameters involved. (author)
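
    A minimal sketch of the point-kernel estimate for a detector outside a uniform cylindrical source, with the Taylor build-up form B(mu r) = A exp(-a1 mu r) + (1 - A) exp(-a2 mu r), is given below. The attenuation coefficient and Taylor coefficients are illustrative placeholders rather than the fitted values of the report, and for simplicity attenuation is applied along the whole source-detector ray rather than only the in-matrix chord.

      import numpy as np

      MU = 8.6                          # 1/m at 662 keV in the waste matrix (placeholder)
      A, A1, A2 = 12.0, -0.09, 0.02     # placeholder Taylor coefficients

      def buildup(mu_r):
          return A * np.exp(-A1 * mu_r) + (1.0 - A) * np.exp(-A2 * mu_r)

      def geometry_factor(radius, height, det, n=100_000, seed=0):
          """Monte Carlo average of B(mu r) exp(-mu r) / (4 pi r^2) over the
          source volume; multiply by source strength and volume for a dose rate."""
          rng = np.random.default_rng(seed)
          r = radius * np.sqrt(rng.random(n))        # uniform over the disc
          phi = 2.0 * np.pi * rng.random(n)
          z = height * rng.random(n)
          pts = np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)
          d = np.linalg.norm(pts - np.asarray(det, float), axis=1)
          return np.mean(buildup(MU * d) * np.exp(-MU * d) / (4.0 * np.pi * d**2))

      # 200 L drum-like cylinder, detector 1 m from the axis at mid-height
      print(geometry_factor(radius=0.3, height=0.9, det=(1.0, 0.0, 0.45)))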

  6. Calculations in furnace technology

    CERN Document Server

    Davies, Clive; Hopkins, DW; Owen, WS

    2013-01-01

    Calculations in Furnace Technology presents the theoretical and practical aspects of furnace technology. This book provides information pertinent to the development, application, and efficiency of furnace technology. Organized into eight chapters, this book begins with an overview of the exothermic reactions that occur when carbon, hydrogen, and sulfur are burned to release the energy available in the fuel. This text then evaluates the efficiencies to measure the quantity of fuel used, of flue gases leaving the plant, of air entering, and the heat lost to the surroundings. Other chapters consi
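
    As a small example of the combustion arithmetic such a book covers, a Dulong-type estimate of the gross calorific value released by burning the carbon, hydrogen and sulphur in a fuel can be written in a few lines; the classic approximate coefficients are used and the sample composition is invented.

      def gross_cv_mj_per_kg(c, h, o, s):
          """Dulong approximation with mass fractions:
          GCV ~ 33.8 C + 144.2 (H - O/8) + 9.4 S, in MJ/kg."""
          return 33.8 * c + 144.2 * (h - o / 8.0) + 9.4 * s

      # hypothetical coal-like fuel: 80% C, 5% H, 5% O, 1% S by mass
      print(gross_cv_mj_per_kg(0.80, 0.05, 0.05, 0.01), "MJ/kg")   # ~33.4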

  7. Studies on continuous fermentation

    Energy Technology Data Exchange (ETDEWEB)

    Ueda, K

    1958-01-01

    Continuous fermentation of molasses with a combined system of agitated vessel and flow pipe is studied. A new apparatus was designed. The rate of the fermentation was faster with this apparatus than with the former apparatus which was composed of two vessels.

  8. Continuous Reinforced Concrete Beams

    DEFF Research Database (Denmark)

    Hoang, Cao Linh; Nielsen, Mogens Peter

    1996-01-01

    This report deals with stress and stiffness estimates of continuous reinforced concrete beams with different stiffnesses for negative and positive moments e.g. corresponding to different reinforcement areas in top and bottom. Such conditions are often met in practice.The moment distribution...

  9. Continuous Adductor Canal Blocks

    DEFF Research Database (Denmark)

    Monahan, Amanda M; Sztain, Jacklynn F; Khatibi, Bahareh

    2016-01-01

    on cutaneous knee sensation in volunteers. METHODS: Bilateral adductor canal catheters were inserted in 24 volunteers followed by ropivacaine 0.2% administration for 8 hours. One limb of each subject was assigned randomly to a continuous infusion (8 mL/h) or automated hourly boluses (8 m...

  10. Continuous Personal Improvement.

    Science.gov (United States)

    Emiliani, M. L.

    1998-01-01

    Suggests that continuous improvement tools used in the workplace can be applied to self-improvement. Explains the use of such techniques as one-piece flow, kanban, visual controls, and total productive maintenance. Points out misapplications of these tools and describes the use of fishbone diagrams to diagnose problems. (SK)

  11. Continuity and Change.

    Science.gov (United States)

    Istance, David

    1985-01-01

    Examines issues related to continuity in education and educational change. Indicates that although schools must be responsive to changing social and economic conditions (and contribute to them), they must also be protected against fluctuating swings of educational fashion and safeguard their long-term mission, even when buffeted by short-term…

  12. Promoting Continuing Education Programs.

    Science.gov (United States)

    Hendrickson, Gayle A.

    This handbook is intended for use by institutions in marketing their continuing education programs. A section on "Devising Your Strategy" looks at identifying a target audience, determining the marketing approach, and developing a marketing plan and promotional techniques. A discussion of media options looks at the advantages and…

  13. Continuous quality improvement

    NARCIS (Netherlands)

    Rohlin, Madeleine; Schaub, Rob M.H.; Holbrook, Peter; Leibur, Edvitar; Lévy, Gérard; Roubalikova, Lenka; Nilner, Maria; Roger-Leroi, Valerie; Danner, Gunter; Iseri, Haluk; Feldman, Cecile

    2002-01-01

    Continuous quality improvement (CQI) can be envisaged as a circular process of goal-setting, followed by external and internal evaluations resulting in improvements that can serve as goals for a next cycle. The need for CQI is apparent, because of

  14. Continuous digital health

    NARCIS (Netherlands)

    Van Halteren, Aart; Gay, Valérie

    2015-01-01

    A transformation is underway regarding how we deal with our health, not only because mobile Internet technology has made it possible to have continuous access to personal health information, but also because breaking the trend of ever-growing healthcare costs is increasingly necessary. Connectivity,

  15. Continuous quality improvement

    International Nuclear Information System (INIS)

    Bourne, P.B.

    1985-01-01

    This paper describes the various statistical tools used at the Hanford Engineering Development Laboratory to achieve continuous quality improvement in the development of Breeder Reactor Technology and in reactor operations. The role of the quality assurance professionals in this process, including quantifiable measurements using actual examples, is provided. The commitment to quality improvement through top management involvement is dramatically illustrated.

  16. Continuous feedback fluid queues

    NARCIS (Netherlands)

    Scheinhardt, Willem R.W.; van Foreest, N.D.; Mandjes, M.R.H.

    2003-01-01

    We investigate a fluid buffer which is modulated by a stochastic background process, while the momentary behavior of the background process depends on the current buffer level in a continuous way. Loosely speaking, the feedback is such that the background process behaves 'as a Markov process' with

  17. Continuing Medical Education

    African Journals Online (AJOL)

    A review article will introduce readers to the educational subject matter, along with one-page summaries (in print) of additional articles that may be accessed in full online. We will continue to offer topical and up-to-date CME material. Readers are encouraged to register with samj.org.za to receive future notifications of new ...

  18. Weldon Spring dose calculations

    International Nuclear Information System (INIS)

    Dickson, H.W.; Hill, G.S.; Perdue, P.T.

    1978-09-01

    In response to a request by the Oak Ridge Operations (ORO) Office of the Department of Energy (DOE) for assistance to the Department of the Army (DA) on the decommissioning of the Weldon Spring Chemical Plant, the Health and Safety Research Division of the Oak Ridge National Laboratory (ORNL) performed limited dose assessment calculations for that site. Based upon radiological measurements from a number of soil samples analyzed by ORNL and from previously acquired radiological data for the Weldon Spring site, source terms were derived to calculate radiation doses for three specific site scenarios. These three hypothetical scenarios are: a wildlife refuge for hunting, fishing, and general outdoor recreation; a school with 40 hr per week occupancy by students and a custodian; and a truck farm producing fruits, vegetables, meat, and dairy products which may be consumed on site. Radiation doses are reported for each of these scenarios both for measured uranium daughter equilibrium ratios and for assumed secular equilibrium. Doses are lower for the nonequilibrium case.

  19. Configuration space Faddeev calculations

    International Nuclear Information System (INIS)

    Payne, G.L.; Klink, W.H.; Polyzou, W.N.

    1989-01-01

    The detailed study of few-body systems provides one of the most effective means for studying nuclear physics at subnucleon distance scales. For few-body systems the model equations can be solved numerically with errors less than the experimental uncertainties. We have used such systems to investigate the size of relativistic effects, the role of meson-exchange currents, and the importance of quark degrees of freedom in the nucleus. Complete calculations for momentum-dependent potentials have been performed, and the properties of the three-body bound state for these potentials have been studied. Few-body calculations of the electromagnetic form factors of the deuteron and pion have been carried out using a front-form formulation of relativistic quantum mechanics. The decomposition of the operators transforming covariantly under the Poincare group into kinematical and dynamical parts has been studied. New ways for constructing interactions between particles, as well as interactions which lead to the production of particles, have been developed in the context of a relativistic quantum mechanics. To compute scattering amplitudes in a nonperturbative way, classes of operators have been generated out of which the phase operator may be constructed. Finally, we have worked out procedures for computing Clebsch-Gordan and Racah coefficients on a computer, as well as procedures for dealing with the multiplicity problem.

  20. Buoyant plume calculations

    International Nuclear Information System (INIS)

    Penner, J.E.; Haselman, L.C.; Edwards, L.L.

    1985-01-01

    Smoke from raging fires produced in the aftermath of a major nuclear exchange has been predicted to cause large decreases in surface temperatures. However, the extent of the decrease, and even the sign of the temperature change, depend on how the smoke is distributed with altitude. We present a model capable of evaluating the initial distribution of lofted smoke above a massive fire. Calculations are shown for a two-dimensional slab version of the model and a full three-dimensional version. The model has been evaluated by simulating smoke heights for the Hamburg firestorm of 1943 and a smaller-scale oil fire which occurred in Long Beach in 1958. Our plume heights for these fires are compared to those predicted by the classical Morton-Taylor-Turner theory for weakly buoyant plumes. We consider the effect of the added buoyancy caused by condensation of water-laden ground-level air being carried to high altitude with the convection column, as well as the effects of background wind on the calculated smoke plume heights for several fire intensities. We find that the rise height of the plume depends on the assumed background atmospheric conditions as well as the fire intensity. Little smoke is injected into the stratosphere unless the fire is unusually intense, or atmospheric conditions are more unstable than we have assumed. For intense fires, significant amounts of water vapor are condensed, raising the possibility of early scavenging of smoke particles by precipitation.
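
    For scale, the Morton-Taylor-Turner comparison invoked above reduces to a final-rise scaling z ~ C F^(1/4) N^(-3/4) for a weakly buoyant plume in a calm, stably stratified atmosphere. The sketch below uses C ~ 5 (an often-quoted calm-condition constant) and an invented fire heat release, so it is an order-of-magnitude illustration only.

      import math

      def buoyancy_flux(q_heat_w, t_amb_k=288.0, rho=1.2, cp=1005.0, g=9.81):
          """F in m^4/s^3 from the convective heat release Q (W)."""
          return g * q_heat_w / (math.pi * rho * cp * t_amb_k)

      def mtt_final_rise(q_heat_w, brunt_vaisala_n=0.01, c=5.0):
          return c * buoyancy_flux(q_heat_w) ** 0.25 * brunt_vaisala_n ** -0.75

      # illustrative 100 GW firestorm-scale heat release
      print(f"{mtt_final_rise(1.0e11):.0f} m")   # roughly 5 km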