WorldWideScience

Sample records for severe deterministic effects

  1. Severe deterministic effects of external exposure and intake of radioactive material: basis for emergency response criteria

    International Nuclear Information System (INIS)

    Kutkov, V; Buglova, E; McKenna, T

    2011-01-01

    Lessons learned from responses to past events have shown that more guidance is needed for the response to radiation emergencies (in this context, a 'radiation emergency' means the same as a 'nuclear or radiological emergency') which could lead to severe deterministic effects. The International Atomic Energy Agency (IAEA) requirements for preparedness and response for a radiation emergency, inter alia, require that arrangements shall be made to prevent, to a practicable extent, severe deterministic effects and to provide the appropriate specialised treatment for these effects. These requirements apply to all exposure pathways, both internal and external, and all reasonable scenarios, to include those resulting from malicious acts (e.g. dirty bombs). This paper briefly describes the approach used to develop the basis for emergency response criteria for protective actions to prevent severe deterministic effects in the case of external exposure and intake of radioactive material.

  2. RBE for deterministic effects

    International Nuclear Information System (INIS)

    1990-01-01

    In the present report, data on RBE values for effects in tissues of experimental animals and man are analysed to assess whether, for specific tissues, the present dose limits or annual limits on intake based on Q values are adequate to prevent deterministic effects. (author)

  3. ICRP (1991) and deterministic effects

    International Nuclear Information System (INIS)

    Mole, R.H.

    1992-01-01

    A critical review of ICRP Publication 60 (1991) shows that considerable revisions are needed in both language and thinking about deterministic effects (DE). ICRP (1991) makes a welcome and clear distinction between change, caused by irradiation; damage, some degree of deleterious change, for example to cells, but not necessarily deleterious to the exposed individual; harm, clinically observable deleterious effects expressed in individuals or their descendants; and detriment, a complex concept combining the probability, severity and time of expression of harm (para. 42). (All added emphases come from the author.) Unfortunately, these distinctions are not carried through into the discussion of deterministic effects, and two important terms are left undefined. Presumably effect may refer to change, damage, harm or detriment, according to context. Clinically observable is also undefined, although its meaning is crucial to any consideration of DE, since DE are defined as causing observable harm (para. 20). (Author)

  4. Deterministic analyses of severe accident issues

    International Nuclear Information System (INIS)

    Dua, S.S.; Moody, F.J.; Muralidharan, R.; Claassen, L.B.

    2004-01-01

    Severe accidents in light water reactors involve complex physical phenomena. In the past there has been a heavy reliance on simple assumptions regarding physical phenomena, alongside probability methods, to evaluate the risks associated with severe accidents. Recently, GE has developed realistic methodologies that permit deterministic evaluations of severe accident progression and of some of the associated phenomena in the case of Boiling Water Reactors (BWRs). These deterministic analyses indicate that, with appropriate system modifications and operator actions, core damage can be prevented in most cases. Furthermore, in cases where core melt is postulated, containment failure can either be prevented or significantly delayed to allow sufficient time for recovery actions to mitigate the severe accident.

  5. Probabilistic approach in treatment of deterministic analyses results of severe accidents

    International Nuclear Information System (INIS)

    Krajnc, B.; Mavko, B.

    1996-01-01

    Severe accident sequences resulting in loss of the core's geometric integrity have been found to have a small probability of occurrence. Because of their potential consequences for public health and safety, an evaluation of the core degradation progression and the resulting effects on the containment is necessary to determine the probability of a significant release of radioactive materials. This requires assessment of many interrelated phenomena, including: steel and zircaloy oxidation, steam spikes, in-vessel debris cooling, potential vessel failure mechanisms, release of core material to the containment, containment pressurization from steam generation or from generation of non-condensable gases, hydrogen burn, and ultimately the coolability of degraded core material. To assess the answers to the containment event tree questions, in the sense of whether a certain phenomenological event would happen or not, plant-specific deterministic analyses should be performed. Because there is large uncertainty in the prediction of severe accident phenomena in Level 2 analyses (containment event trees), a combination of the probabilistic and deterministic approaches should be used. In fact, the results of the deterministic analyses of severe accidents are treated in a probabilistic manner because of the large uncertainty of the results, which is a consequence of a lack of detailed knowledge. This paper discusses an approach used in many IPEs which assures that the probability assigned to a certain question in the event tree represents the probability that the event will or will not happen, and that this probability also includes its uncertainty, which is mainly the result of lack of knowledge. (author)

  6. Deterministic effects of the ionizing radiation

    International Nuclear Information System (INIS)

    Raslawski, Elsa C.

    2001-01-01

    Full text: A deterministic effect is the somatic damage that appears when the radiation dose exceeds a minimum value, the 'threshold dose'. Above this threshold dose, the frequency and seriousness of the damage increase with the dose given. Sixteen percent of patients younger than 15 years of age with a diagnosis of cancer have the possibility of a cure. The consequences of cancer treatment in children are very serious, as they are physically and emotionally developing. The seriousness of the delayed effects of radiation therapy depends on three factors: a) the treatment (dose of radiation, schedule of treatment, time of treatment, beam energy, treatment volume, distribution of the dose, simultaneous chemotherapy, etc.); b) the patient (state of development, patient predisposition, inherent sensitivity of tissue, the presence of other alterations, etc.); c) the tumor (degree of extension or infiltration, mechanical effects, etc.). The effect of radiation on normal tissue is related to cellular activity and the maturity of the tissue irradiated. Children have a mosaic of tissues in different stages of maturity at different moments in time. On the other hand, each tissue has a different pattern of development, so that sequelae differ among the irradiated tissues of the same patient. We should keep in mind that all tissues are affected to some degree. Bone tissue evidences damage through growth delay and the degree of calcification. Damage is small at 10 Gy; between 10 and 20 Gy growth arrest is partial, whereas at doses larger than 20 Gy growth arrest is complete. The central nervous system is the most affected, because the radiation injuries produce demyelination with or without focal or diffuse areas of necrosis in the white matter, causing character alterations, lower IQ and functional level, neurocognitive impairment, etc. The skin is also affected, showing different degrees of erythema as well as ulceration and necrosis, different degrees of

  7. Deterministic effects of interventional radiology procedures

    International Nuclear Information System (INIS)

    Shope, Thomas B.

    1997-01-01

    The purpose of this paper is to describe deterministic radiation injuries reported to the Food and Drug Administration (FDA) that resulted from therapeutic, interventional procedures performed under fluoroscopic guidance, and to investigate the procedure- or equipment-related factors that may have contributed to the injury. Reports submitted to the FDA under both mandatory and voluntary reporting requirements which described radiation-induced skin injuries from fluoroscopy were investigated. Serious skin injuries, including moist desquamation and tissue necrosis, have occurred since 1992. These injuries have resulted from a variety of interventional procedures which have required extended periods of fluoroscopy compared to typical diagnostic procedures. Facilities conducting therapeutic interventional procedures need to be aware of the potential for patient radiation injury and take appropriate steps to limit the potential for injury. (author)

  8. Deterministic influences exceed dispersal effects on hydrologically-connected microbiomes: Deterministic assembly of hyporheic microbiomes

    Energy Technology Data Exchange (ETDEWEB)

    Graham, Emily B. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Crump, Alex R. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Resch, Charles T. [Geochemistry Department, Pacific Northwest National Laboratory, Richland WA USA; Fansler, Sarah [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Arntzen, Evan [Environmental Compliance and Emergency Preparation, Pacific Northwest National Laboratory, Richland WA USA; Kennedy, David W. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Fredrickson, Jim K. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Stegen, James C. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA

    2017-03-28

    Subsurface zones of groundwater and surface water mixing (hyporheic zones) are regions of enhanced rates of biogeochemical cycling, yet ecological processes governing hyporheic microbiome composition and function through space and time remain unknown. We sampled attached and planktonic microbiomes in the Columbia River hyporheic zone across seasonal hydrologic change, and employed statistical null models to infer mechanisms generating temporal changes in microbiomes within three hydrologically-connected, physicochemically-distinct geographic zones (inland, nearshore, river). We reveal that microbiomes remain dissimilar through time across all zones and habitat types (attached vs. planktonic) and that deterministic assembly processes regulate microbiome composition in all data subsets. The consistent presence of heterotrophic taxa and members of the Planctomycetes-Verrucomicrobia-Chlamydiae (PVC) superphylum nonetheless suggests common selective pressures for physiologies represented in these groups. Further, co-occurrence networks were used to provide insight into taxa most affected by deterministic assembly processes. We identified network clusters to represent groups of organisms that correlated with seasonal and physicochemical change. Extended network analyses identified keystone taxa within each cluster that we propose are central in microbiome composition and function. Finally, the abundance of one network cluster of nearshore organisms exhibited a seasonal shift from heterotrophic to autotrophic metabolisms and correlated with microbial metabolism, possibly indicating an ecological role for these organisms as foundational species in driving biogeochemical reactions within the hyporheic zone. Taken together, our research demonstrates a predominant role for deterministic assembly across highly-connected environments and provides insight into niche dynamics associated with seasonal changes in hyporheic microbiome composition and metabolism.
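    The null-model logic behind this kind of inference can be illustrated with a minimal sketch: compare an observed community dissimilarity against a randomization-based null distribution, and invoke deterministic assembly when the observation is more extreme than chance would produce. This is a conceptual toy, not the phylogenetic-turnover null models used in the study, and the abundance vectors are invented.

```python
import random

# Conceptual null-model sketch (NOT the study's specific null models):
# compare an observed Bray-Curtis dissimilarity between two communities
# against a null built by randomly re-partitioning the pooled abundances.

def bray_curtis(a, b):
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(x + y for x, y in zip(a, b))
    return num / den

def null_quantile(a, b, n_perm=2000, seed=7):
    """Return (observed dissimilarity, fraction of null draws >= observed)."""
    rng = random.Random(seed)
    observed = bray_curtis(a, b)
    pooled = a + b                       # pool abundances across communities
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)              # random re-partition of the pool
        if bray_curtis(pooled[:len(a)], pooled[len(a):]) >= observed:
            extreme += 1
    return observed, extreme / n_perm

# Invented abundance vectors for two hypothetical communities.
comm_inland = [40, 30, 20, 5, 5, 0, 0, 0]
comm_river = [0, 0, 5, 5, 20, 30, 40, 0]
obs, p = null_quantile(comm_inland, comm_river)
print(obs, p)   # a small p suggests stronger turnover than expected by chance
```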

  9. A critical evaluation of deterministic methods in size optimisation of reliable and cost effective standalone hybrid renewable energy systems

    International Nuclear Information System (INIS)

    Maheri, Alireza

    2014-01-01

    Reliability of a hybrid renewable energy system (HRES) strongly depends on various uncertainties affecting the amount of power produced by the system. In the design of systems subject to uncertainties, both deterministic and nondeterministic design approaches can be adopted. In a deterministic design approach, the designer considers the presence of uncertainties and incorporates them indirectly into the design by applying safety factors. It is assumed that, by employing suitable safety factors and considering worst-case-scenarios, reliable systems can be designed. In fact, the multi-objective optimisation problem with two objectives of reliability and cost is reduced to a single-objective optimisation problem with the objective of cost only. In this paper the competence of deterministic design methods in size optimisation of reliable standalone wind–PV–battery, wind–PV–diesel and wind–PV–battery–diesel configurations is examined. For each configuration, first, using different values of safety factors, the optimal size of the system components which minimises the system cost is found deterministically. Then, for each case, using a Monte Carlo simulation, the effect of safety factors on the reliability and the cost are investigated. In performing reliability analysis, several reliability measures, namely, unmet load, blackout durations (total, maximum and average) and mean time between failures are considered. It is shown that the traditional methods of considering the effect of uncertainties in deterministic designs such as design for an autonomy period and employing safety factors have either little or unpredictable impact on the actual reliability of the designed wind–PV–battery configuration. In the case of wind–PV–diesel and wind–PV–battery–diesel configurations it is shown that, while using a high-enough margin of safety in sizing diesel generator leads to reliable systems, the optimum value for this margin of safety leading to a
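    The study's core procedure (size the system deterministically with a safety factor, then check the actual reliability by Monte Carlo simulation) can be sketched in miniature. Everything below is a hypothetical illustration: the constant demand, the Gaussian yield model and the `unmet_load_probability` helper are assumptions, not the paper's wind-PV-battery model.

```python
import random

# Size a PV array with a deterministic safety factor, then estimate the
# probability of unmet load under uncertain daily solar yield.

def unmet_load_probability(safety_factor, n_trials=20_000, seed=1):
    rng = random.Random(seed)
    mean_demand = 10.0                        # kWh/day, assumed constant
    capacity = mean_demand * safety_factor    # deterministic sizing rule
    failures = 0
    for _ in range(n_trials):
        # Uncertain resource: daily yield as a fraction of nameplate capacity.
        yield_fraction = rng.gauss(1.0, 0.25)
        supplied = max(0.0, capacity * yield_fraction)
        if supplied < mean_demand:
            failures += 1
    return failures / n_trials

# The safety factor shifts, but does not eliminate, the failure probability.
for sf in (1.0, 1.2, 1.5):
    print(sf, unmet_load_probability(sf))
```

    The point mirrored from the abstract: the deterministic design fixes the cost, while the reliability it actually delivers only becomes visible in the Monte Carlo step.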

  10. Applicability of deterministic methods in seismic site effects modeling

    International Nuclear Information System (INIS)

    Cioflan, C.O.; Radulian, M.; Apostol, B.F.; Ciucu, C.

    2005-01-01

    The up-to-date information related to the local geological structure in the Bucharest urban area has been integrated into complex analyses of seismic ground motion simulation using deterministic procedures. The data recorded for the Vrancea intermediate-depth large earthquakes are supplemented with synthetic computations all over the city area. The hybrid method, with a double-couple seismic source approximation and relatively simple regional and local structure models, allows a satisfactory reproduction of the strong motion records in the frequency domain 0.05-1 Hz. The new geological information and a deterministic analytical method, which combines the modal summation technique, applied to model the seismic wave propagation between the seismic source and the studied sites, with the mode coupling approach, used to model the seismic wave propagation through the local sedimentary structure of the target site, allow the modelling to be extended to the higher frequencies of earthquake engineering interest. The results of these studies (synthetic time histories of the ground motion parameters, absolute and relative response spectra, etc.) for the last three Vrancea strong events (August 31, 1986, Mw = 7.1; May 30, 1990, Mw = 6.9; and October 27, 2004, Mw = 6.0) can complete the strong motion database used for microzonation purposes. Implications and integration of the deterministic results into urban planning and disaster management strategies are also discussed. (authors)

  11. Inferring Fitness Effects from Time-Resolved Sequence Data with a Delay-Deterministic Model.

    Science.gov (United States)

    Nené, Nuno R; Dunham, Alistair S; Illingworth, Christopher J R

    2018-05-01

    A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures to account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the nondeterministic properties of mutation in a finite population. We propose an alternative approach that acts to correct for this error, and which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model. Copyright © 2018 Nené et al.
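    As a toy version of deterministic inference of selection (not the authors' delay-deterministic model, which is more involved), one can propagate a variant frequency under the standard haploid selection recursion, p' = p(1+s)/(1+ps), and recover the selection coefficient from a trajectory by least squares over a grid of candidate values:

```python
# Deterministic haploid selection model and a grid-search inference of s.

def trajectory(p0, s, generations):
    """Frequencies of a variant under deterministic selection."""
    p, traj = p0, [p0]
    for _ in range(generations):
        p = p * (1.0 + s) / (1.0 + p * s)
        traj.append(p)
    return traj

def infer_s(observed, p0, grid):
    """Pick the s whose deterministic trajectory best fits the observations."""
    def sse(s):
        model = trajectory(p0, s, len(observed) - 1)
        return sum((m - o) ** 2 for m, o in zip(model, observed))
    return min(grid, key=sse)

obs = trajectory(0.05, 0.1, 20)                  # synthetic data with s = 0.1
grid = [i / 1000 for i in range(-200, 201)]      # candidate s values
print(infer_s(obs, 0.05, grid))                  # → 0.1
```

    On noiseless deterministic data this recovers s exactly; the paper's point is that on data from a finite population, where new mutations behave nondeterministically, such a purely deterministic fit can mislead.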

  12. Expansion or extinction: deterministic and stochastic two-patch models with Allee effects.

    Science.gov (United States)

    Kang, Yun; Lanchier, Nicolas

    2011-06-01

    We investigate the impact of Allee effect and dispersal on the long-term evolution of a population in a patchy environment. Our main focus is on whether a population already established in one patch either successfully invades an adjacent empty patch or undergoes a global extinction. Our study is based on the combination of analytical and numerical results for both a deterministic two-patch model and a stochastic counterpart. The deterministic model has either two, three or four attractors. The existence of a regime with exactly three attractors only appears when patches have distinct Allee thresholds. In the presence of weak dispersal, the analysis of the deterministic model shows that a high-density and a low-density populations can coexist at equilibrium in nearby patches, whereas the analysis of the stochastic model indicates that this equilibrium is metastable, thus leading after a large random time to either a global expansion or a global extinction. Up to some critical dispersal, increasing the intensity of the interactions leads to an increase of both the basin of attraction of the global extinction and the basin of attraction of the global expansion. Above this threshold, for both the deterministic and the stochastic models, the patches tend to synchronize as the intensity of the dispersal increases. This results in either a global expansion or a global extinction. For the deterministic model, there are only two attractors, while the stochastic model no longer exhibits a metastable behavior. In the presence of strong dispersal, the limiting behavior is entirely determined by the value of the Allee thresholds as the global population size in the deterministic and the stochastic models evolves as dictated by their single-patch counterparts. For all values of the dispersal parameter, Allee effects promote global extinction in terms of an expansion of the basin of attraction of the extinction equilibrium for the deterministic model and an increase of the
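    A minimal deterministic two-patch sketch of this kind of model, assuming a cubic per-patch growth law with a strong Allee threshold A and symmetric dispersal d (the authors' exact equations may differ), is dN_i/dt = r N_i (N_i/A - 1)(1 - N_i/K) + d (N_j - N_i), integrated here with a simple Euler scheme:

```python
# Two-patch model with a strong Allee effect (threshold A) and dispersal d.

def simulate(n1, n2, d, r=1.0, A=0.2, K=1.0, dt=0.01, steps=50_000):
    """Euler-integrate the two-patch dynamics and return final densities."""
    for _ in range(steps):
        f1 = r * n1 * (n1 / A - 1.0) * (1.0 - n1 / K) + d * (n2 - n1)
        f2 = r * n2 * (n2 / A - 1.0) * (1.0 - n2 / K) + d * (n1 - n2)
        n1, n2 = max(0.0, n1 + dt * f1), max(0.0, n2 + dt * f2)
    return n1, n2

# One patch established near K, the other empty.
print(simulate(1.0, 0.0, d=0.01))   # weak coupling
print(simulate(1.0, 0.0, d=0.5))    # strong coupling
```

    With weak coupling the run settles into a coexisting high-density/low-density pair of patches, as in the abstract; with strong coupling the patches synchronize, and since the average density here starts above the Allee threshold, both expand.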

  13. Response-surface models for deterministic effects of localized irradiation of the skin by discrete β/γ -emitting sources

    International Nuclear Information System (INIS)

    Scott, B.R.

    1995-01-01

    Individuals who work at nuclear reactor facilities can be at risk for deterministic effects in the skin from exposure to discrete β- and γ-emitting (βγE) sources (e.g., βγE hot particles) on the skin or clothing. Deterministic effects are non-cancer effects that have a threshold and increase in severity as dose increases (e.g., ulcer in skin). Hot βγE particles are 60Co- or nuclear fuel-derived particles with diameters > 10 μm and < 3 mm that contain at least 3.7 kBq (0.1 μCi) of radioactivity. For such βγE sources on the skin, it is the beta component of the dose that is most important. To develop exposure limitation systems that adequately control exposure of workers to discrete βγE sources, models are needed for evaluating the risk of deterministic effects of localized β irradiation of the skin. The purpose of this study was to develop dose-rate- and irradiated-area-dependent response-surface models for evaluating risks of significant deterministic effects of localized irradiation of the skin by discrete βγE sources and to use the modeling results to recommend approaches to limiting occupational exposure to such sources. The significance of the research results is as follows: (1) response-surface models are now available for evaluating the risk of specific deterministic effects of localized irradiation of the skin; (2) modeling results have been used to recommend approaches to limiting occupational exposure of workers to β radiation from βγE sources on the skin or on clothing; and (3) the generic irradiated-volume, weighting-factor approach to limiting exposure can be applied to other organs, including the eye, the ear, and organs of the respiratory or gastrointestinal tract, and can be used for both deterministic and stochastic effects
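    A common parametric form for threshold-type (deterministic) effect risk, which response-surface models of this kind extend with dose-rate and irradiated-area dependence, is a Weibull hazard in dose. The sketch below is generic; the D50 and shape values are placeholders, not Scott's fitted parameters.

```python
import math

# Sigmoid dose-response for a deterministic effect: hazard
# H = ln(2) * (D / D50)**V, risk R = 1 - exp(-H), so that R = 0.5 at D = D50.

def deterministic_risk(dose_gy, d50_gy, shape):
    if dose_gy <= 0.0:
        return 0.0
    hazard = math.log(2.0) * (dose_gy / d50_gy) ** shape
    return 1.0 - math.exp(-hazard)

# Placeholder parameters: median dose 20 Gy, steepness V = 5.
for d in (5.0, 10.0, 20.0, 40.0):
    print(d, round(deterministic_risk(d, d50_gy=20.0, shape=5.0), 3))
```

    The steep shape parameter is what gives the effect its practical threshold: risk is negligible well below D50 and near certain well above it.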

  14. Response-surface models for deterministic effects of localized irradiation of the skin by discrete β/γ-emitting sources

    Energy Technology Data Exchange (ETDEWEB)

    Scott, B.R.

    1995-12-01

    Individuals who work at nuclear reactor facilities can be at risk for deterministic effects in the skin from exposure to discrete β- and γ-emitting (βγE) sources (e.g., βγE hot particles) on the skin or clothing. Deterministic effects are non-cancer effects that have a threshold and increase in severity as dose increases (e.g., ulcer in skin). Hot βγE particles are 60Co- or nuclear fuel-derived particles with diameters > 10 μm and < 3 mm that contain at least 3.7 kBq (0.1 μCi) of radioactivity. For such βγE sources on the skin, it is the beta component of the dose that is most important. To develop exposure limitation systems that adequately control exposure of workers to discrete βγE sources, models are needed for evaluating the risk of deterministic effects of localized β irradiation of the skin. The purpose of this study was to develop dose-rate- and irradiated-area-dependent response-surface models for evaluating risks of significant deterministic effects of localized irradiation of the skin by discrete βγE sources and to use the modeling results to recommend approaches to limiting occupational exposure to such sources. The significance of the research results is as follows: (1) response-surface models are now available for evaluating the risk of specific deterministic effects of localized irradiation of the skin; (2) modeling results have been used to recommend approaches to limiting occupational exposure of workers to β radiation from βγE sources on the skin or on clothing; and (3) the generic irradiated-volume, weighting-factor approach to limiting exposure can be applied to other organs, including the eye, the ear, and organs of the respiratory or gastrointestinal tract, and can be used for both deterministic and stochastic effects.

  15. Siting criteria based on the prevention of deterministic effects from plutonium inhalation exposures

    International Nuclear Information System (INIS)

    Sorensen, S.A.; Low, J.O.

    1998-01-01

    Siting criteria are established by regulatory authorities to evaluate potential accident scenarios associated with proposed nuclear facilities. The 0.25 Sv (25 rem) siting criterion adopted in the United States has historically been based on the prevention of deterministic effects from acute, whole-body exposures. The Department of Energy has extended the applicability of this criterion to radionuclides that deliver chronic, organ-specific irradiation through the specification of a 0.25 Sv (25 rem) committed effective dose equivalent siting criterion. A methodology is developed to determine siting criteria based on the prevention of deterministic effects from inhalation intakes of radionuclides that deliver chronic, organ-specific irradiation. Revised siting criteria, expressed in terms of committed effective dose equivalent, are proposed for nuclear facilities that handle primarily plutonium compounds. The analysis determined that a siting criterion of 1.2 Sv (120 rem) committed effective dose equivalent for inhalation exposures to weapons-grade plutonium meets the historical goal of preventing deterministic effects during a facility accident scenario. The criterion also meets the Nuclear Regulatory Commission and Department of Energy nuclear safety goals, provided that the frequency of the accident is sufficiently low
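    The committed effective dose equivalent referred to here is the tissue-weighted sum of 50-year committed organ dose equivalents, CEDE = Σ_T w_T H50,T (the ICRP 26/30 formalism underlying the US regulations). A minimal sketch, using made-up organ doses rather than actual plutonium inhalation dose coefficients:

```python
# Committed effective dose equivalent from organ-specific committed doses.

TISSUE_WEIGHTS = {           # ICRP 26 tissue weighting factors
    "gonads": 0.25, "breast": 0.15, "red_marrow": 0.12, "lung": 0.12,
    "thyroid": 0.03, "bone_surface": 0.03, "remainder": 0.30,
}

def cede_sv(h50_sv_by_organ):
    """CEDE (Sv) as the weighted sum of 50-year organ dose equivalents."""
    return sum(TISSUE_WEIGHTS[t] * h for t, h in h50_sv_by_organ.items())

# Hypothetical committed dose equivalents (Sv) to the organs an inhaled
# actinide chiefly irradiates; these are illustrative numbers only.
h50 = {"lung": 1.5, "bone_surface": 4.0, "red_marrow": 0.8}
print(round(cede_sv(h50), 3))   # → 0.396
```

    This weighting is why a single effective-dose criterion behaves differently for organ-seeking radionuclides than for acute whole-body exposure: a large dose to one low-weight organ maps to a modest CEDE.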

  16. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    International Nuclear Information System (INIS)

    Elkhoraibi, T.; Hashemi, A.; Ostadan, F.

    2014-01-01

    Soil-structure interaction (SSI) is a major step in the seismic design of massive and stiff structures typical of nuclear facilities and civil infrastructures such as tunnels, underground stations, dams and lock head structures. Currently, most SSI analyses are performed deterministically, incorporating a limited range of variation in soil and structural properties and without consideration of ground motion incoherency effects. This often leads to overestimation of the seismic response, particularly the In-Structure Response Spectra (ISRS), with significant impositions of design and equipment qualification costs, especially in the case of high-frequency-sensitive equipment at stiff soil or rock sites. The reluctance to adopt a more comprehensive probabilistic approach is mainly due to the fact that the computational cost of performing probabilistic SSI analysis, even without incoherency function considerations, has been prohibitive. As such, bounding deterministic approaches have been preferred by the industry and accepted by the regulatory agencies. However, given the recently available and growing computing capabilities, the need for a probabilistic approach to SSI analysis is becoming clear with the advances in performance-based engineering and the utilization of fragility analysis in the decision-making process, whether by owners or regulatory agencies. This paper demonstrates the use of both probabilistic and deterministic SSI analysis techniques to identify important engineering demand parameters in the structure. A typical nuclear industry structure is used as an example for this study. The system is analyzed for two different site conditions: rock and deep soil. Both deterministic and probabilistic SSI analysis approaches are performed, using the program SASSI, with and without ground motion incoherency considerations. In both approaches, the analysis begins at the hard rock level using the low frequency and high frequency hard rock

  17. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    Energy Technology Data Exchange (ETDEWEB)

    Elkhoraibi, T., E-mail: telkhora@bechtel.com; Hashemi, A.; Ostadan, F.

    2014-04-01

    Soil-structure interaction (SSI) is a major step in the seismic design of massive and stiff structures typical of nuclear facilities and civil infrastructures such as tunnels, underground stations, dams and lock head structures. Currently, most SSI analyses are performed deterministically, incorporating a limited range of variation in soil and structural properties and without consideration of ground motion incoherency effects. This often leads to overestimation of the seismic response, particularly the In-Structure Response Spectra (ISRS), with significant impositions of design and equipment qualification costs, especially in the case of high-frequency-sensitive equipment at stiff soil or rock sites. The reluctance to adopt a more comprehensive probabilistic approach is mainly due to the fact that the computational cost of performing probabilistic SSI analysis, even without incoherency function considerations, has been prohibitive. As such, bounding deterministic approaches have been preferred by the industry and accepted by the regulatory agencies. However, given the recently available and growing computing capabilities, the need for a probabilistic approach to SSI analysis is becoming clear with the advances in performance-based engineering and the utilization of fragility analysis in the decision-making process, whether by owners or regulatory agencies. This paper demonstrates the use of both probabilistic and deterministic SSI analysis techniques to identify important engineering demand parameters in the structure. A typical nuclear industry structure is used as an example for this study. The system is analyzed for two different site conditions: rock and deep soil. Both deterministic and probabilistic SSI analysis approaches are performed, using the program SASSI, with and without ground motion incoherency considerations. In both approaches, the analysis begins at the hard rock level using the low frequency and high frequency hard rock

  18. Deterministic Effects of Occupational Exposures in the Mayak Nuclear Workers Cohort

    International Nuclear Information System (INIS)

    Azinova, T. V.; Okladnikova, N. D.; Sumina, M. V.; Pesternikova, V. S.; Osovets, V. S.; Druzhimina, M. B.; Seminikhina, N. G.

    2004-01-01

    The widespread utilization of nuclear energy in recent decades has led to a steady increase in the number of people exposed to ionizing radiation sources. In order to predict radiation risks, it is important to draw on all the experience in assessing the health effects of radiation exposure accumulated to date in different countries. The proposed report will present results of the long-term follow-up of a cohort of nuclear workers at the Mayak Production Association, which was the first nuclear facility in Russia. The established system of individual dosimetry of external exposure, the monitoring of internal radiation, and a special system of medical follow-up of nuclear workers over the last 50 years have allowed collection of unique primary data for studying radiation effects, their patterns, and the mechanisms specific to exposure dose. The study cohort includes 61 percent males and 39 percent females. The vital status is known for 90 percent of cases; 44 percent of the workers are still alive and undergo regular medical examination in our clinic. Unfortunately, by now 50 percent of the workers have died, and 6 percent were lost to follow-up. Total doses from chronic external gamma rays in the cohort ranged from 0.6 to 10.8 Gy (annual exposure doses were from 0.001 to 7.4 Gy); Pu body burden was from 0.3 to 72.3 kBq. The most intensive chronic exposure of workers was registered from 1948 to 1958. During this time, 19 radiation accidents occurred at the Mayak PA, and the highest incidence of deterministic effects was observed in this period. In the cohort of Mayak nuclear workers, 60 cases of acute radiation syndrome (I to IV degrees of severity), 2079 cases of chronic radiation sickness, 120 cases of plutonium pneumosclerosis, 5 cases of radiation cataracts, and over 400 cases of local radiation injuries were diagnosed. The report will present dependences of the observed effects on absorbed radiation dose and dose rate in terms of acute radiation

  19. Deterministic calculation of the effective delayed neutron fraction without using the adjoint neutron flux - 299

    International Nuclear Information System (INIS)

    Talamo, A.; Gohar, Y.; Aliberti, G.; Zhong, Z.; Bournos, V.; Fokov, Y.; Kiyavitskaya, H.; Routkovskaya, C.; Serafimovich, I.

    2010-01-01

    In 1997, Bretscher calculated the effective delayed neutron fraction by the k-ratio method. Bretscher's approach is based on calculating the multiplication factor of a nuclear reactor core with and without the contribution of delayed neutrons. The multiplication factor due to the delayed neutrons (the delayed multiplication factor) is obtained as the difference between the total and the prompt multiplication factors. Bretscher evaluated the effective delayed neutron fraction as the ratio between the delayed and total multiplication factors (hence the method is often referred to as the k-ratio method). In the present work, the k-ratio method is applied using deterministic nuclear codes. The ENDF/B nuclear data for the fuel isotopes (235U and 238U) have been processed by the NJOY code with and without the delayed neutron data to prepare multigroup WIMSD nuclear data libraries for the DRAGON code. The DRAGON code has been used for preparing the PARTISN macroscopic cross sections. This calculation methodology has been applied to the YALINA-Thermal assembly of Belarus. The assembly has been modeled and analyzed using the PARTISN code with 69 energy groups and 60 different material zones. The deterministic and Monte Carlo results for the effective delayed neutron fraction obtained by the k-ratio method agree very well. The results also agree with the values obtained by using the adjoint flux. (authors)
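The k-ratio construction described above reduces to one line of arithmetic once the two eigenvalue calculations are done. A minimal sketch, with assumed illustrative multiplication factors rather than values from the paper:

```python
# Sketch of Bretscher's k-ratio method for the effective delayed
# neutron fraction: beta_eff = (k_total - k_prompt) / k_total.
# The k values below are illustrative, not results from the paper.

def beta_eff_k_ratio(k_total: float, k_prompt: float) -> float:
    """beta_eff from two eigenvalue runs: one with delayed neutron
    data included, one with it removed."""
    k_delayed = k_total - k_prompt       # delayed multiplication factor
    return k_delayed / k_total           # k-ratio estimate of beta_eff

k_total = 1.00000    # with delayed neutrons (assumed value)
k_prompt = 0.99286   # delayed neutron data removed (assumed value)
print(f"beta_eff = {beta_eff_k_ratio(k_total, k_prompt):.5f}")  # ≈ 0.00714
```

The same two runs done with a Monte Carlo code and a deterministic code give the comparison reported in the abstract.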

  20. Deterministic extraction from weak random sources

    CERN Document Server

    Gabizon, Ariel

    2011-01-01

    In this research monograph, the author constructs deterministic extractors for several types of sources, using a methodology of recycling randomness which enables increasing the output length of deterministic extractors to near optimal length.

  1. MIRD Commentary: Proposed Name for a Dosimetry Unit Applicable to Deterministic Biological Effects-The Barendsen (Bd)

    International Nuclear Information System (INIS)

    Sgouros, George; Howell, R. W.; Bolch, Wesley E.; Fisher, Darrell R.

    2009-01-01

    The fundamental physical quantity for relating all biologic effects to radiation exposure is the absorbed dose, the energy imparted per unit mass of tissue. Absorbed dose is expressed in units of joules per kilogram (J/kg) and is given the special name gray (Gy). Exposure to ionizing radiation may cause both deterministic and stochastic biologic effects. To account for the relative effect per unit absorbed dose that has been observed for different types of radiation, the International Commission on Radiological Protection (ICRP) has established radiation weighting factors for stochastic effects. The product of absorbed dose in Gy and the radiation weighting factor is defined as the equivalent dose. Equivalent dose values are designated by a special named unit, the sievert (Sv). Unlike the situation for stochastic effects, no well-defined formalism and associated special named quantities have been widely adopted for deterministic effects. The therapeutic application of radionuclides and, specifically, α-particle emitters in nuclear medicine has brought to the forefront the need for a well-defined dosimetry formalism applicable to deterministic effects that is accompanied by corresponding special named quantities. This commentary reviews recent proposals related to this issue and concludes with a recommendation to establish a new named quantity.
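The stochastic-effect formalism that the commentary contrasts with is simple to state: the equivalent dose is H_T = Σ_R w_R · D_T,R. A minimal sketch using the radiation weighting factors of ICRP Publication 103 (the dose values are illustrative assumptions):

```python
# Equivalent dose H_T = sum over radiation types R of w_R * D_{T,R}:
# absorbed dose (Gy) weighted by the ICRP radiation weighting factor.
# w_R values follow ICRP Publication 103 (photons 1, alpha particles 20).

W_R = {"photon": 1.0, "electron": 1.0, "proton": 2.0, "alpha": 20.0}

def equivalent_dose_sv(absorbed_doses_gy: dict) -> float:
    """Combine absorbed doses (Gy) by radiation type into Sv."""
    return sum(W_R[r] * d for r, d in absorbed_doses_gy.items())

# 10 mGy of photons plus 1 mGy of alpha particles (illustrative doses):
h = equivalent_dose_sv({"photon": 0.010, "alpha": 0.001})
print(f"H_T = {h * 1000:.1f} mSv")  # 1*10 + 20*1 = 30.0 mSv
```

The point of the commentary is that no analogous weighting scheme with a named unit exists for deterministic effects, which is what the proposed barendsen (Bd) would address.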

  2. JCCRER Project 2.3 -- Deterministic effects of occupational exposure to radiation. Phase 1: Feasibility study; Final report

    Energy Technology Data Exchange (ETDEWEB)

    Okladnikova, N.; Pesternikova, V.; Sumina, M. [Inst. of Biophysics, Ozyorsk (Russian Federation)] [and others]

    1998-12-01

    Phase 1 of Project 2.3, a short-term collaborative Feasibility Study, was funded for 12 months starting on 1 February 1996. The overall aim of the study was to determine the practical feasibility of using the dosimetric and clinical data on the MAYAK worker population to study the deterministic effects of exposure to external gamma radiation and to internal alpha radiation from inhaled plutonium. Phase 1 efforts were limited to the period of greatest worker exposure (1948--1954) and focused on collaboratively: assessing the comprehensiveness, availability, quality, and suitability of the Russian clinical and dosimetric data for the study of deterministic effects; creating an electronic data base containing complete clinical and dosimetric data on a small, representative sample of MAYAK workers; developing computer software for the testing of a currently used health risk model of hematopoietic effects; and familiarizing the US team with the Russian diagnostic criteria and techniques used in the identification of Chronic Radiation Sickness.

  3. JCCRER Project 2.3 - Deterministic effects of occupational exposure to radiation. Phase 1: Feasibility study. Final report

    International Nuclear Information System (INIS)

    Okladnikova, N.; Pesternikova, V.; Sumina, M.

    1998-01-01

    Phase 1 of Project 2.3, a short-term collaborative Feasibility Study, was funded for 12 months starting on 1 February 1996. The overall aim of the study was to determine the practical feasibility of using the dosimetric and clinical data on the MAYAK worker population to study the deterministic effects of exposure to external gamma radiation and to internal alpha radiation from inhaled plutonium. Phase 1 efforts were limited to the period of greatest worker exposure (1948--1954) and focused on collaboratively: assessing the comprehensiveness, availability, quality, and suitability of the Russian clinical and dosimetric data for the study of deterministic effects; creating an electronic data base containing complete clinical and dosimetric data on a small, representative sample of MAYAK workers; developing computer software for the testing of a currently used health risk model of hematopoietic effects; and familiarizing the US team with the Russian diagnostic criteria and techniques used in the identification of Chronic Radiation Sickness

  4. On the effect of deterministic terms on the bias in stable AR models

    NARCIS (Netherlands)

    van Giersbergen, N.P.A.

    2004-01-01

    This paper compares the first-order bias approximation for the autoregressive (AR) coefficients in stable AR models in the presence of deterministic terms. It is shown that the bias due to inclusion of an intercept and trend is twice as large as the bias due to an intercept. For the AR(1) model, the
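The headline claim, that the bias under an intercept plus trend is roughly twice the bias under an intercept alone, can be checked with a small Monte Carlo sketch. Sample size, AR coefficient and replication count below are illustrative assumptions, not the paper's design:

```python
# Monte Carlo check of the finite-sample OLS bias in a stable AR(1)
# model, comparing a regression with intercept only against one with
# intercept and linear trend (illustrative parameters: a=0.5, T=50).
import numpy as np

rng = np.random.default_rng(0)
a, T, reps = 0.5, 50, 4000

def ols_ar1(y, trend):
    """OLS estimate of the AR coefficient, with optional linear trend."""
    X = [np.ones(len(y) - 1), y[:-1]]
    if trend:
        X.append(np.arange(len(y) - 1, dtype=float))
    coef, *_ = np.linalg.lstsq(np.column_stack(X), y[1:], rcond=None)
    return coef[1]

est_c, est_ct = [], []
for _ in range(reps):
    e = rng.standard_normal(T + 50)
    y = np.empty(T + 50)
    y[0] = e[0]
    for t in range(1, T + 50):
        y[t] = a * y[t - 1] + e[t]
    y = y[50:]                         # drop burn-in
    est_c.append(ols_ar1(y, trend=False))
    est_ct.append(ols_ar1(y, trend=True))

bias_c = np.mean(est_c) - a            # bias with intercept only
bias_ct = np.mean(est_ct) - a          # bias with intercept and trend
print(bias_c, bias_ct)                 # both negative; trend inflates the bias
```

Both biases are negative, and including the trend visibly inflates the downward bias, consistent with the paper's first-order approximation.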

  5. Height-Deterministic Pushdown Automata

    DEFF Research Database (Denmark)

    Nowotka, Dirk; Srba, Jiri

    2007-01-01

    We define the notion of height-deterministic pushdown automata, a model where for any given input string the stack heights during any (nondeterministic) computation on the input are a priori fixed. Different subclasses of height-deterministic pushdown automata, strictly containing the class...... of regular languages and still closed under boolean language operations, are considered. Several of such language classes have been described in the literature. Here, we suggest a natural and intuitive model that subsumes all the formalisms proposed so far by employing height-deterministic pushdown automata...

  6. Calculating the effective delayed neutron fraction in the Molten Salt Fast Reactor: Analytical, deterministic and Monte Carlo approaches

    International Nuclear Information System (INIS)

    Aufiero, Manuele; Brovchenko, Mariya; Cammi, Antonio; Clifford, Ivor; Geoffroy, Olivier; Heuer, Daniel; Laureau, Axel; Losa, Mario; Luzzi, Lelio; Merle-Lucotte, Elsa; Ricotti, Marco E.; Rouch, Hervé

    2014-01-01

    Highlights: • Calculation of effective delayed neutron fraction in circulating-fuel reactors. • Extension of the Monte Carlo SERPENT-2 code for delayed neutron precursor tracking. • Forward and adjoint multi-group diffusion eigenvalue problems in OpenFOAM. • Analytical approach for βeff calculation in simple geometries and flow conditions. • Good agreement among the three proposed approaches in the MSFR test case. - Abstract: This paper deals with the calculation of the effective delayed neutron fraction (βeff) in circulating-fuel nuclear reactors. The Molten Salt Fast Reactor is adopted as test case for the comparison of the analytical, deterministic and Monte Carlo methods presented. The Monte Carlo code SERPENT-2 has been extended to allow for delayed neutron precursor drift, according to the fuel velocity field. The forward and adjoint eigenvalue multi-group diffusion problems are implemented and solved adopting the multi-physics toolkit OpenFOAM, by taking into account the convective and turbulent diffusive terms in the precursor balance. These two approaches show good agreement in the whole range of the MSFR operating conditions. An analytical formula for the circulating-to-static conditions βeff correction factor is also derived under simple hypotheses, which explicitly takes into account the spatial dependence of the neutron importance. Its accuracy is assessed against Monte Carlo and deterministic results. The effects of in-core recirculation vortex and turbulent diffusion are finally analysed and discussed.

  7. Deterministic chaos in the pitting phenomena of passivable alloys

    International Nuclear Information System (INIS)

    Hoerle, Stephane

    1998-01-01

    It was shown that electrochemical noise recorded in stable pitting conditions exhibits deterministic (even chaotic) features. The occurrence of deterministic behaviors depends on the severity of the material/solution combination. Thus, electrolyte composition (the [Cl-]/[NO3-] ratio, pH), passive film thickness or alloy composition can change the deterministic features. A single pit is sufficient to observe deterministic behaviors. The electrochemical noise signals are non-stationary, which is a hint of a change of pit behavior with time (propagation speed or mean). Modifications of electrolyte composition reveal transitions between random and deterministic behaviors. Spontaneous transitions between deterministic behaviors of different features (bifurcations) are also evidenced. Such bifurcations highlight various routes to chaos. The routes to chaos and the features of the chaotic signals suggest models (both continuous and discontinuous models are proposed) of the electrochemical mechanisms inside a pit that describe the experimental behaviors and the effect of the various parameters quite well. The analysis of the chaotic behaviors of a pit leads to a better understanding of propagation mechanisms and provides tools for pit monitoring. (author) [fr]

  8. Deterministic indexing for packed strings

    DEFF Research Database (Denmark)

    Bille, Philip; Gørtz, Inge Li; Skjoldjensen, Frederik Rye

    2017-01-01

    Given a string S of length n, the classic string indexing problem is to preprocess S into a compact data structure that supports efficient subsequent pattern queries. In the deterministic variant the goal is to solve the string indexing problem without any randomization (at preprocessing time...... or query time). In the packed variant the strings are stored with several characters in a single word, giving us the opportunity to read multiple characters simultaneously. Our main result is a new string index in the deterministic and packed setting. Given a packed string S of length n over an alphabet σ...
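The advantage of the packed setting can be shown with a toy sketch: packing several characters per machine word lets one integer comparison stand in for several character comparisons. This is a simplified illustration of the word-packing idea only, not the paper's index structure:

```python
# Toy illustration of the "packed" setting: store up to 8 ASCII
# characters per 64-bit word so that chunks can be compared with one
# integer comparison instead of eight character comparisons.

def pack(s: bytes, w: int = 8) -> list[int]:
    """Pack a byte string into integers holding w characters each,
    padding the final chunk with NUL bytes."""
    return [int.from_bytes(s[i:i + w].ljust(w, b"\0"), "big")
            for i in range(0, len(s), w)]

def packed_equal(a: bytes, b: bytes) -> bool:
    """Word-at-a-time equality test on the packed representations."""
    return len(a) == len(b) and pack(a) == pack(b)

print(packed_equal(b"mississippi", b"mississippi"))  # True
print(packed_equal(b"mississippi", b"mississippX"))  # False
```

A real packed index exploits the same effect inside the query algorithm, where each word read inspects many characters at once.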

  9. Deterministic Graphical Games Revisited

    DEFF Research Database (Denmark)

    Andersson, Daniel; Hansen, Kristoffer Arnsfelt; Miltersen, Peter Bro

    2008-01-01

    We revisit the deterministic graphical games of Washburn. A deterministic graphical game can be described as a simple stochastic game (a notion due to Anne Condon), except that we allow arbitrary real payoffs but disallow moves of chance. We study the complexity of solving deterministic graphical...... games and obtain an almost-linear time comparison-based algorithm for computing an equilibrium of such a game. The existence of a linear time comparison-based algorithm remains an open problem....

  10. Ways of pharmacological prophylaxis of stochastic and deterministic effects of chronical radiation exposure

    International Nuclear Information System (INIS)

    Kirillova, E.N.; Muksinova, K.N.; Revina, V.; Smirnov, D.; Sokolnikov, M.; Lukyanova, T.

    1996-01-01

    The prophylaxis of late effects of exposure is a pressing medico-social problem, because there are now large groups of persons who were exposed through occupational contact or who live in territories contaminated by radionuclides. The most probable consequences of chronic external and internal radiation exposure are an increase in malignant tumour frequency, the development of secondary myelo- and immuno-depressions, the earlier formation of sclerotic and destructive processes, and accelerated senescence. The role of damage to the immune system in the pathogenesis of late radiation effects is not yet understood, but there is evidence that decreased immunologic surveillance during the period in which the consequences of radiation exposure take shape enables the carcinogenic effect of radiation to be realized. The purposes of this investigation are to decrease the frequency of, or prevent, the development of radiation consequences dangerous to health and life by modifying radiogenic damage in the hemopoietic and immune systems with pharmacological preparations that have immunomodulating effects. The investigation tasks include: the study of the modifying influence of pharmacological substances with different mechanisms of action: myelopid (immunomodulating and regulatory), β-carotene, Calendula officinalis (immunomodulating and antioxidant), lipamid (detoxicating); the separate or combined application of these substances; and the development of optimum medico-prophylactic schemes. The advantage of the indicated preparations in comparison with known ones (T-activin, thymogen, cytokines, etc.) is the absence of contraindications and the possibility of oral administration. (author)

  11. Integrating the effects of forest cover on slope stability in a deterministic landslide susceptibility model (TRIGRS 2.0)

    Science.gov (United States)

    Zieher, T.; Rutzinger, M.; Bremer, M.; Meissl, G.; Geitner, C.

    2014-12-01

    The potentially stabilizing effects of forest cover on slope stability have been the subject of many studies in the recent past. Hence, the effects of trees are also considered in many deterministic landslide susceptibility models. TRIGRS 2.0 (Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability; USGS) is a dynamic, physically based model designed to estimate shallow landslide susceptibility in space and time. In the original version, the effects of forest cover are not considered. As TRIGRS 2.0 is intended to be applied in further studies in Vorarlberg (Austria) in selected catchments that are densely forested, the effects of trees on slope stability were implemented in the model. Besides hydrological impacts such as interception or transpiration by tree canopies and stems, root cohesion directly influences the stability of slopes, especially in the case of shallow landslides, while the additional weight superimposed by trees is of minor relevance. Detailed data on tree positions and further attributes such as tree height and diameter at breast height were derived throughout the study area (52 km²) from high-resolution airborne laser scanning data. Different scenarios were computed for spruce (Picea abies) in the study area. Root cohesion was estimated area-wide based on published correlations between root reinforcement and distance to tree stems, depending on the stem diameter at breast height. In order to account for decreasing root cohesion with depth, an exponential distribution was assumed and implemented in the model. Preliminary modelling results show that forest cover can have positive effects on slope stability, yet strongly depending on tree age and stand structure. This work has been conducted within C3S-ISLS, which is funded by the Austrian Climate and Energy Fund, 5th ACRP Program.
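A common way such effects enter a deterministic stability model is as an extra cohesion term in the infinite-slope factor of safety, with root cohesion decaying with depth. The sketch below uses an assumed exponential decay and illustrative soil parameters; it is not TRIGRS code and not the study's parameter set:

```python
# Sketch of how depth-dependent root cohesion can enter the infinite-
# slope factor of safety that models such as TRIGRS evaluate per grid
# cell. The exponential decay and every parameter value below are
# illustrative assumptions, not TRIGRS defaults or the study's data.
import math

def root_cohesion(z, c_r0=5.0e3, z0=0.5):
    """Root cohesion (Pa) decaying exponentially with depth z (m)."""
    return c_r0 * math.exp(-z / z0)

def factor_of_safety(z, beta_deg, c_soil=2.0e3, phi_deg=30.0,
                     gamma=18.0e3, psi=0.0, gamma_w=9.81e3, roots=True):
    """Infinite-slope FS at depth z (m) with pressure head psi (m)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    c = c_soil + (root_cohesion(z) if roots else 0.0)   # total cohesion
    tau = gamma * z * math.sin(beta) * math.cos(beta)   # driving stress
    sigma_n = gamma * z * math.cos(beta) ** 2           # normal stress
    u = gamma_w * psi                                   # pore pressure
    return (c + (sigma_n - u) * math.tan(phi)) / tau

# Shallow failure plane on a 35-degree slope, with and without roots:
print(factor_of_safety(1.0, 35.0), factor_of_safety(1.0, 35.0, roots=False))
```

With these assumed numbers the rooted slope has the higher factor of safety, which is the qualitative effect the study quantifies with tree-level laser scanning data.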

  12. Effects of Topological Randomness on Cooperation in a Deterministic Prisoner's Dilemma Game

    International Nuclear Information System (INIS)

    Zhang Mei; Yang Junzhong

    2011-01-01

    In this work, we consider an evolutionary prisoner's dilemma game on a homogeneous random network with the richest-following strategy adoption rule. By constructing homogeneous random networks from a regular ring graph, we investigate the effects of topological randomness on cooperation. In contrast to the common view that the presence of a small number of shortcuts in ring graphs favors cooperation, we find that weak topological randomness inhibits cooperation. Explanations for these observations are presented. (general)
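The described setup can be sketched in a few lines: agents on a graph play the prisoner's dilemma with their neighbors, then each copies the strategy of its highest-payoff neighbor (itself included). The ring topology, payoff values and sizes below are illustrative assumptions, not the paper's parameterization:

```python
# Minimal evolutionary prisoner's dilemma with the richest-following
# rule on a ring graph (the paper rewires such a ring into homogeneous
# random networks). Payoffs use the weak-PD convention R=1, T=b, P=S=0.
import random

random.seed(1)
N, T_ROUNDS = 100, 50
b = 1.4                                   # temptation payoff (assumed)
payoff = {("C", "C"): 1.0, ("C", "D"): 0.0,
          ("D", "C"): b,   ("D", "D"): 0.0}

neigh = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}  # ring graph
strat = {i: random.choice("CD") for i in range(N)}

for _ in range(T_ROUNDS):
    # accumulate each agent's payoff against its neighbors
    score = {i: sum(payoff[(strat[i], strat[j])] for j in neigh[i])
             for i in range(N)}
    # richest-following: adopt the strategy of the best-scoring
    # player among the neighborhood including oneself (synchronous)
    strat = {i: strat[max(neigh[i] + [i], key=lambda j: score[j])]
             for i in range(N)}

coop = sum(s == "C" for s in strat.values()) / N
print(f"cooperator fraction after {T_ROUNDS} rounds: {coop:.2f}")
```

Rewiring `neigh` toward a homogeneous random network while keeping the degree fixed is the manipulation whose effect on `coop` the paper studies.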

  13. Pseudo-deterministic Algorithms

    OpenAIRE

    Goldwasser , Shafi

    2012-01-01

    International audience; In this talk we describe a new type of probabilistic algorithm which we call Bellagio Algorithms: a randomized algorithm which is guaranteed to run in expected polynomial time, and to produce a correct and unique solution with high probability. These algorithms are pseudo-deterministic: they can not be distinguished from deterministic algorithms in polynomial time by a probabilistic polynomial time observer with black box access to the algorithm. We show a necessary an...

  14. Deterministic chaotic dynamics of Raba River flow (Polish Carpathian Mountains)

    Science.gov (United States)

    Kędra, Mariola

    2014-02-01

    Is the underlying dynamics of river flow random or deterministic? If it is deterministic, is it deterministic chaotic? This issue is still controversial. The application of several independent methods, techniques and tools to daily river flow data gives consistent, reliable and clear-cut answers to the question. The outcomes point out that the investigated discharge dynamics is not random but deterministic. Moreover, the results completely confirm the nonlinear deterministic chaotic nature of the studied process. The research was conducted on daily discharge data from two selected gauging stations of a mountain river in southern Poland, the Raba River.
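One standard diagnostic behind such conclusions is the largest Lyapunov exponent: positive for deterministic chaos, negative for regular motion. A minimal sketch on the logistic map as a stand-in series (the Raba discharge data are not reproduced here, and real time-series estimators such as Rosenstein's are more involved):

```python
# A positive largest Lyapunov exponent signals deterministic chaos,
# a negative one regular motion. Computed here for the logistic map
# x -> r*x*(1-x), where the derivative is known in closed form.
import math

def lyapunov_logistic(r, x0=0.2, n=100_000, burn=1000):
    """Average of log|f'(x)| along an orbit of the logistic map."""
    x = x0
    for _ in range(burn):                 # discard the transient
        x = r * x * (1 - x)
    s = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        s += math.log(abs(r * (1 - 2 * x)) + 1e-300)  # epsilon guards log(0)
    return s / n

print(lyapunov_logistic(4.0))   # positive, near ln 2: chaotic
print(lyapunov_logistic(3.2))   # negative: settles on a period-2 cycle
```

For r = 4 the exact value is ln 2, which the orbit average approaches; for r = 3.2 the exponent is negative because the orbit converges to a stable 2-cycle.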

  15. Deterministic Compressed Sensing

    Science.gov (United States)

    2011-11-01

    [Fragmentary front matter only: table-of-contents entries (4.3 Digital Communications, 4.4 Group Testing), index entries on deterministic design matrices ("all bounds ignore the O() constants"), a list of algorithms including the Iterative Hard Thresholding algorithm, and a text fragment noting that compressed sensing is information-theoretically possible using any (2k, ε)-RIP sensing matrix, citing celebrated results of Candès, Romberg and Tao.]

  16. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  17. I Don’t Want to Miss a Thing – Learning Dynamics and Effects of Feedback Type and Monetary Incentive in a Paired Associate Deterministic Learning Task

    Directory of Open Access Journals (Sweden)

    Magda Gawlowska

    2017-06-01

    Full Text Available Effective functioning in a complex environment requires adjusting behavior according to changing situational demands. To do so, organisms must learn new, more adaptive behaviors by extracting the necessary information from externally provided feedback. Not surprisingly, feedback-guided learning has been extensively studied using multiple research paradigms. The purpose of the present study was to test the newly designed Paired Associate Deterministic Learning task (PADL), in which participants were presented with either positive or negative deterministic feedback. Moreover, we manipulated the level of motivation in the learning process by comparing blocks with strictly cognitive, informative feedback to blocks where participants were additionally motivated by an anticipated monetary reward or loss. Our results proved the PADL to be a useful tool, not only for studying the learning process in a deterministic environment but also, owing to the varying task conditions, for assessing differences in learning patterns. In particular, we show that the learning process itself is influenced by manipulating both the type of feedback information and the motivational significance associated with the expected monetary reward.

  18. Deterministic behavioural models for concurrency

    DEFF Research Database (Denmark)

    Sassone, Vladimiro; Nielsen, Mogens; Winskel, Glynn

    1993-01-01

    This paper offers three candidates for a deterministic, noninterleaving, behaviour model which generalizes Hoare traces to the noninterleaving situation. The three models are all proved equivalent in the rather strong sense of being equivalent as categories. The models are: deterministic labelled...... event structures, generalized trace languages in which the independence relation is context-dependent, and deterministic languages of pomsets....

  19. Effect of quantum noise on deterministic remote state preparation of an arbitrary two-particle state via various quantum entangled channels

    Science.gov (United States)

    Qu, Zhiguo; Wu, Shengyao; Wang, Mingming; Sun, Le; Wang, Xiaojun

    2017-12-01

    As one of the important research branches of quantum communication, deterministic remote state preparation (DRSP) plays a significant role in quantum networks. Quantum noises are prevalent in quantum communication and can seriously affect the safety and reliability of a quantum communication system. In this paper, we study the effect of quantum noise on deterministic remote state preparation of an arbitrary two-particle state via different quantum channels including the χ state, Brown state and GHZ state. Firstly, the output states and fidelities of three DRSP algorithms via different quantum entangled channels in four noisy environments, including amplitude-damping, phase-damping, bit-flip and depolarizing noise, are presented, respectively. Then, the effects of the noises on the three kinds of preparation algorithms in the same noisy environment are discussed. Finally, theoretical analysis proves that the effect of noise in the process of quantum state preparation is related only to the noise type and the size of the noise factor, and is independent of the choice of entangled quantum channel. Furthermore, another important conclusion is given: the effect of noise is also independent of how the intermediate particles are distributed for implementing DRSP through quantum measurement during the concrete preparation process. These conclusions will be very helpful for improving the efficiency and safety of quantum communication in a noisy environment.
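The amplitude-damping case can be illustrated on a single qubit via its Kraus operators, applying the channel to a density matrix and computing the fidelity F = ⟨ψ|ρ'|ψ⟩. This one-qubit sketch is only an illustration of the noise model; the paper treats two-particle DRSP states over entangled channels:

```python
# Apply the amplitude-damping channel (Kraus operators K0, K1 with
# noise factor g) to the |+> state and watch the fidelity degrade.
import numpy as np

def amplitude_damping(rho, g):
    """Amplitude-damping channel, noise factor g in [0, 1]."""
    K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - g)]])
    K1 = np.array([[0.0, np.sqrt(g)], [0.0, 0.0]])
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

psi = np.array([1.0, 1.0]) / np.sqrt(2)        # |+> state
rho = np.outer(psi, psi.conj())

for g in (0.0, 0.2, 0.5):
    F = np.real(psi.conj() @ amplitude_damping(rho, g) @ psi)
    print(f"g = {g:.1f}: fidelity = {F:.4f}")  # monotone decrease in g
```

For this state the fidelity works out to (1 + sqrt(1 - g))/2, illustrating the abstract's point that the degradation depends on the noise type and noise factor.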

  20. Deterministic Graphical Games Revisited

    DEFF Research Database (Denmark)

    Andersson, Klas Olof Daniel; Hansen, Kristoffer Arnsfelt; Miltersen, Peter Bro

    2012-01-01

    Starting from Zermelo’s classical formal treatment of chess, we trace through history the analysis of two-player win/lose/draw games with perfect information and potentially infinite play. Such chess-like games have appeared in many different research communities, and methods for solving them......, such as retrograde analysis, have been rediscovered independently. We then revisit Washburn’s deterministic graphical games (DGGs), a natural generalization of chess-like games to arbitrary zero-sum payoffs. We study the complexity of solving DGGs and obtain an almost-linear time comparison-based algorithm...
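Retrograde analysis, mentioned above, can be sketched for win/lose/draw games with perfect information: positions with no moves are losses for the player to move, a position is a win if some move reaches a loss, a loss if every move reaches a win, and positions never resolved (e.g. on cycles) are draws. The game graph below is a toy example, not from the paper:

```python
# Minimal retrograde-analysis sketch for two-player win/lose/draw
# games under the normal-play convention (no move available = loss).

def solve(moves):
    """moves: dict mapping position -> list of successor positions.
    Returns WIN/LOSS/DRAW for the player to move at each position."""
    value = {}
    changed = True
    while changed:                       # naive fixpoint; fine for toy sizes
        changed = False
        for p, succs in moves.items():
            if p in value:
                continue
            if not succs:
                value[p] = "LOSS"        # terminal: player to move loses
            elif any(value.get(s) == "LOSS" for s in succs):
                value[p] = "WIN"         # can move opponent into a loss
            elif all(value.get(s) == "WIN" for s in succs):
                value[p] = "LOSS"        # every move helps the opponent
            else:
                continue
            changed = True
    return {p: value.get(p, "DRAW") for p in moves}   # unresolved = draw

# a -> b -> terminal c, plus a 2-cycle d <-> e that never terminates:
g = {"a": ["b"], "b": ["c"], "c": [], "d": ["e"], "e": ["d"]}
print(solve(g))  # {'a': 'LOSS', 'b': 'WIN', 'c': 'LOSS', 'd': 'DRAW', 'e': 'DRAW'}
```

The almost-linear algorithms discussed in the paper refine this idea with worklists and payoff comparisons instead of the naive repeated sweep used here.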

  1. DETERMINISTIC METHODS USED IN FINANCIAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    MICULEAC Melania Elena

    2014-06-01

    Full Text Available The deterministic methods are quantitative methods whose goal is to quantify numerically the mechanisms by which factorial and causal relations of influence and propagation of effects arise and are expressed, in cases where the phenomenon can be expressed through a direct functional cause-effect relation. Functional, deterministic relations are causal relations in which a well-defined value of the resulting phenomenon corresponds to each value of the characteristic. They can directly express the correlation between the phenomenon and its influence factors in the form of a function-type mathematical formula.

  2. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
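The derivative-based propagation underlying such methods can be sketched with a first-order Taylor expansion, Var[y] ≈ Σᵢ (∂y/∂xᵢ)² Var[xᵢ], applied here to the standard borehole flow test function. The input means and standard deviations are illustrative choices, and finite differences stand in for the direct or adjoint sensitivities the paper assumes:

```python
# First-order (derivative-based) uncertainty propagation, in the
# spirit of DUA, on the standard borehole flow test function.
# Input means/std-devs are illustrative, not the paper's case.
import math

def borehole(rw, r, Tu, Hu, Tl, Hl, L, Kw):
    """Water flow rate (m^3/yr) through a borehole (standard form)."""
    lnr = math.log(r / rw)
    return (2 * math.pi * Tu * (Hu - Hl) /
            (lnr * (1 + 2 * L * Tu / (lnr * rw**2 * Kw) + Tu / Tl)))

mean = dict(rw=0.10, r=500.0, Tu=70000.0, Hu=1050.0,
            Tl=80.0, Hl=760.0, L=1400.0, Kw=10000.0)
std = dict(rw=0.02, r=50.0, Tu=5000.0, Hu=20.0,
           Tl=10.0, Hl=20.0, L=100.0, Kw=500.0)

def var_first_order(f, mean, std, h=1e-6):
    """Var[y] ~ sum_i (df/dx_i)^2 Var[x_i], central differences."""
    var = 0.0
    for k in mean:
        x1, x2 = dict(mean), dict(mean)
        x1[k] += h * mean[k]
        x2[k] -= h * mean[k]
        dfdx = (f(**x1) - f(**x2)) / (2 * h * mean[k])
        var += (dfdx * std[k]) ** 2
    return var

q = borehole(**mean)
sd = math.sqrt(var_first_order(borehole, mean, std))
print(f"Q = {q:.1f} +/- {sd:.1f} m^3/yr")
```

One nominal run plus one derivative sweep replaces the many model executions a sampling-based estimate would need, which is the run-count saving the abstract highlights.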

  3. Reactivity effect breakdown calculations with deterministic and stochastic perturbations analysis – JEFF-3.1.1 to JEFF3.2T1 (BRC-2009 actinides application

    Directory of Open Access Journals (Sweden)

    Morillon B.

    2013-03-01

    Full Text Available JEFF-3.1.1 is the reference nuclear data library at CEA for the design calculations of the next nuclear power plants. The validation of the new neutronics code systems is based on this library, and changes in nuclear data should be looked at closely. Some new actinide evaluation files at high energies were proposed by CEA/Bruyères-le-Châtel in 2009 and have been integrated in the JEFF-3.2T1 test release. For the new release JEFF-3.2, CEA will build new evaluation files for the actinides, which should combine the new evaluated data coming from BRC-2009 in the high-energy range with improvements or new evaluations in the resolved and unresolved resonance range from CEA-Cadarache. To prepare the building of these new files, benchmarking the BRC-2009 library against the JEFF-3.1.1 library was very important. The crucial points to evaluate were the improvements in the continuum range and the discrepancies in the resonance range. The present work presents, for a selected set of benchmarks, the discrepancies in the effective multiplication factor obtained while using the JEFF-3.1.1 or JEFF-3.2T1 library with the deterministic code package ERANOS/PARIS and the stochastic code TRIPOLI-4. Both have been used to calculate cross-section perturbations or other nuclear data perturbations when possible. This has made it possible to identify the origin of the discrepancies in reactivity calculations. In addition, this work also shows the importance of cross-section processing validation. Indeed, some fast neutron spectrum calculations have led to opposite tendencies between the deterministic code package and the stochastic code. Some particular nuclear data (MT=5 in ENDF terminology) seem to be incompatible with the current MERGE or GECCO processing codes.
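Reactivity discrepancies of this kind are conventionally quoted in pcm, with ρ = (k − 1)/k and the library-to-library effect Δρ = 1/k_ref − 1/k_new. A minimal sketch with assumed multiplication factors, not benchmark results:

```python
# Reactivity and library-to-library reactivity effect in pcm
# (1 pcm = 1e-5 in reactivity). The k values are assumed, for
# illustration only; they are not results from the benchmark suite.

def reactivity_pcm(k: float) -> float:
    """Reactivity rho = (k - 1)/k, expressed in pcm."""
    return (k - 1.0) / k * 1e5

def delta_rho_pcm(k_ref: float, k_new: float) -> float:
    """Reactivity effect of a data change: 1/k_ref - 1/k_new, in pcm."""
    return (1.0 / k_ref - 1.0 / k_new) * 1e5

k_jeff311, k_jeff32t1 = 1.00120, 1.00350   # assumed eigenvalues
print(f"{delta_rho_pcm(k_jeff311, k_jeff32t1):+.0f} pcm")
```

Summing such Δρ contributions over individually perturbed cross sections is what allows the breakdown by nuclear data that the paper performs with ERANOS/PARIS and TRIPOLI-4.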

  4. Deterministic methods in radiation transport

    International Nuclear Information System (INIS)

    Rice, A.F.; Roussin, R.W.

    1992-06-01

    The Seminar on Deterministic Methods in Radiation Transport was held February 4--5, 1992, in Oak Ridge, Tennessee. Eleven presentations were made and the full papers are published in this report, along with three that were submitted but not given orally. These papers represent a good overview of the state of the art in the deterministic solution of radiation transport problems for a variety of applications of current interest to the Radiation Shielding Information Center user community

  5. Equivalence relations between deterministic and quantum mechanical systems

    International Nuclear Information System (INIS)

    Hooft, G.

    1988-01-01

    Several quantum mechanical models are shown to be equivalent to certain deterministic systems because a basis can be found in terms of which the wave function does not spread. This suggests that apparently indeterministic behavior typical for a quantum mechanical world can be the result of locally deterministic laws of physics. We show how certain deterministic systems allow the construction of a Hilbert space and a Hamiltonian so that at long distance scales they may appear to behave as quantum field theories, including interactions but as yet no mass term. These observations are suggested to be useful for building theories at the Planck scale
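The construction can be illustrated on a toy three-state deterministic system: its one-step evolution is a permutation matrix U, and diagonalizing U yields a Hermitian operator H with U = exp(−iH), i.e. a basis in which the deterministic dynamics looks like unitary quantum evolution. The basis choice and units here are assumptions of the sketch, not 't Hooft's specific models:

```python
# A deterministic cycle 0 -> 1 -> 2 -> 0 has a permutation matrix U as
# its one-step evolution. Diagonalizing U (eigenvalues are cube roots
# of unity) gives a Hermitian "Hamiltonian" H with U = exp(-iH).
import numpy as np

U = np.array([[0., 0., 1.],
              [1., 0., 0.],
              [0., 1., 0.]])          # columns: images of basis states

lam, V = np.linalg.eig(U)             # lam_k = exp(-i*theta_k)
theta = -np.angle(lam)                # real "energies"
Vinv = np.linalg.inv(V)
H = V @ np.diag(theta) @ Vinv         # candidate Hamiltonian

U_rec = V @ np.diag(np.exp(-1j * theta)) @ Vinv   # exp(-iH) via spectrum
print(np.allclose(H, H.conj().T))     # Hermitian: True
print(np.allclose(U_rec, U))          # reproduces the deterministic step: True
```

That H comes out Hermitian is exactly the point: the strictly deterministic hopping dynamics admits a standard quantum-mechanical description in this eigenbasis.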

  6. Antiinflammatory Effect of Several Umbelliferae Species

    Directory of Open Access Journals (Sweden)

    SUWIJIYO PRAMONO

    2005-03-01

    Full Text Available A screening for antiinflammatory effects was performed on several Indonesian Umbelliferae plants based on their contents of saponins and flavonoids. They were compared with Bupleurum falcatum L. as an introduced antiinflammatory plant. Roots and grains of each plant were collected, dried, and extracted with ethanol. The ethanolic extracts were then analyzed for their saponin and flavonoid contents by gravimetric and UV-vis spectrophotometric methods. The antiinflammatory activity test was conducted on carrageenan-induced rat paw oedema. The results showed that the highest contents of saponin and flavonoid were found in the grains of Apium graveolens L., which showed an antiinflammatory effect equivalent to that of the root of B. falcatum.

  7. Porosity effects during a severe accident

    International Nuclear Information System (INIS)

    Cazares R, R. I.; Espinosa P, G.; Vazquez R, A.

    2015-09-01

    The aim of this work is to study the behaviour of porosity effects on the temporal evolution of the hydrogen concentration distribution and temperature profiles in a fuel assembly through which a stream of steam is flowing. The analysis considers the fuel element without mitigation effects. The mass transfer model considers that the hydrogen generated diffuses in the steam by convection and diffusion. Oxidation by steam of the cladding, rods and other core components constructed in zirconium-based alloy is a critical issue in LWR accidents producing severe core damage. The oxygen consumed by the zirconium is supplied by the up-flow of steam from the water pool below the uncovered core, supplemented in the case of PWRs by gas recirculation from the cooler outer regions of the core to hotter zones. Fuel rod cladding oxidation is thus one of the key phenomena influencing core behavior under high-temperature accident conditions. The chemical reaction of oxidation is highly exothermic, which determines the hydrogen generation rate and the cladding brittleness and degradation. The heat transfer process in the fuel assembly is treated with a reduced-order model. The Boussinesq approximation was applied in the momentum equations for the multicomponent flow analysis, which considers natural convection due to buoyancy forces related to thermal and hydrogen concentration effects. The numerical simulation was carried out in an averaging channel that represents a BWR core with a fuel rod, its gap and cladding, and cooling steam. (Author)

  9. Nonlinear Markov processes: Deterministic case

    International Nuclear Information System (INIS)

    Frank, T.D.

    2008-01-01

Deterministic Markov processes that exhibit nonlinear transition mechanisms for probability densities are studied. In this context, the following issues are addressed: the Markov property, conditional probability densities, propagation of probability densities, multistability in terms of multiple stationary distributions, stability analysis of stationary distributions, and basins of attraction of stationary distributions.
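A minimal two-state illustration of such nonlinear transition mechanisms (my construction, not an example from the paper): let the transition probabilities depend on the current distribution itself.

```python
# Two-state nonlinear Markov chain: transition probabilities depend on the
# current distribution p = P(state = 1),
#   P(0 -> 1) = p**2,   P(1 -> 0) = (1 - p)**2,
# so the density propagates as  p' = (1-p)*p**2 + p*(1 - (1-p)**2),
# which simplifies to the cubic map  p' = 3*p**2 - 2*p**3.
def propagate(p, steps=100):
    for _ in range(steps):
        p = 3 * p**2 - 2 * p**3
    return p

# Multistability: two stable stationary distributions (p = 0 and p = 1),
# with basins of attraction separated by the unstable stationary point 1/2.
print(propagate(0.49))   # -> essentially 0
print(propagate(0.51))   # -> essentially 1
```

A linear Markov chain can have only one mixing stationary distribution; the dependence of the transition mechanism on the density itself is what creates the multistability discussed in the abstract.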

  10. Deterministic Chaos - Complex Chance out of Simple Necessity ...

    Indian Academy of Sciences (India)

This is a very lucid and lively book on deterministic chaos. Chaos is very common in nature. However, the understanding and realisation of its potential applications are very recent. Thus this book is a timely addition to the subject. There are several books on chaos and several more are being added every day. In spite of this ...

  11. Deterministic hydrodynamics: Taking blood apart

    Science.gov (United States)

    Davis, John A.; Inglis, David W.; Morton, Keith J.; Lawrence, David A.; Huang, Lotien R.; Chou, Stephen Y.; Sturm, James C.; Austin, Robert H.

    2006-10-01

We show the fractionation of whole blood components and the isolation of blood plasma, with no dilution, by using a continuous-flow deterministic array that separates blood components by their hydrodynamic size, independent of their mass. Using the deterministic arrays we developed, white blood cells, red blood cells, and platelets are separated from blood plasma at flow velocities of 1,000 μm/sec and volume rates up to 1 μl/min. We verified by flow cytometry that an array using focused injection removed 100% of the lymphocytes and monocytes from the main red blood cell and platelet stream. Using a second design, we demonstrated the separation of blood plasma from the blood cells (white, red, and platelets) with virtually no dilution of the plasma and no cellular contamination of the plasma. Keywords: cells, plasma, separation, microfabrication.
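The size-based "bumping" in such arrays is often summarized by an empirical critical-diameter correlation from the later DLD literature (quoted here as an assumption; the abstract itself gives no formula):

```python
# Empirical critical size for a deterministic lateral displacement (DLD)
# post array: particles above D_c displace ("bump") along the post rows,
# smaller ones follow the flow streamlines.
# Correlation (assumed, from later DLD literature): D_c = 1.4 * g * eps**0.48.
def critical_diameter(gap, row_shift_fraction):
    """gap: post gap g (result has the same units); row_shift_fraction: eps,
    the fractional row-to-row shift of the post lattice."""
    return 1.4 * gap * row_shift_fraction ** 0.48

# e.g. a 10 um gap with eps = 0.1 gives D_c of roughly 4.6 um: leukocytes
# and red cells are displaced while platelets and plasma pass straight through.
print(critical_diameter(10.0, 0.1))
```

Choosing gap and row-shift per array stage is what lets a single chip route cells, platelets, and plasma to different outlets.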

  12. Integrated deterministic and probabilistic safety assessment: Concepts, challenges, research directions

    International Nuclear Information System (INIS)

    Zio, Enrico

    2014-01-01

Highlights:
• IDPSA contributes to robust risk-informed decision making in nuclear safety.
• IDPSA considers time-dependent interactions among component failures and system processes.
• IDPSA also considers time-dependent interactions among control and operator actions.
• Computational efficiency is achieved by advanced Monte Carlo and meta-modelling simulations.
• Efficient post-processing of IDPSA output by clustering and data mining.

Abstract: Integrated deterministic and probabilistic safety assessment (IDPSA) is conceived as a way to analyze the evolution of accident scenarios in complex dynamic systems, like nuclear, aerospace and process ones, accounting for the mutual interactions between the failure and recovery of system components, the evolving physical processes, the control and operator actions, and the software and firmware. In spite of the potential offered by IDPSA, several challenges need to be effectively addressed for its development and practical deployment. In this paper, we give an overview of these and discuss the related implications in terms of research perspectives.

  14. Safety margins in deterministic safety analysis

    International Nuclear Information System (INIS)

    Viktorov, A.

    2011-01-01

The concept of safety margins has acquired a certain prominence in attempts to demonstrate quantitatively the level of nuclear power plant safety by means of deterministic analysis, especially when considering the impacts of plant ageing and discovery issues. A number of international and industry publications discuss various applications and interpretations of safety margins. The objective of this presentation is to bring together and examine in some detail, from the regulatory point of view, the safety margins that relate to deterministic safety analysis. In this paper, definitions of various safety margins are presented and discussed along with the regulatory expectations for them. The interrelationships of analysis input and output parameters with their corresponding limits are explored. It is shown that the overall safety margin is composed of several components, each having different origins and potential uses; in particular, margins associated with analysis output parameters are contrasted with margins linked to the analysis input. While these are separate, it is possible to influence output margins through the analysis input and analysis method. Preserving safety margins is tantamount to maintaining safety. At the same time, efficiency of operation requires optimization of safety margins, taking into account various technical and regulatory considerations. For this, basic definitions and rules for safety margins must first be established. (author)

  15. Deterministic chaos in entangled eigenstates

    Science.gov (United States)

    Schlegel, K. G.; Förster, S.

    2008-05-01

We investigate the problem of deterministic chaos in connection with entangled states using the Bohmian formulation of quantum mechanics. We show, for a two-particle system in a harmonic oscillator potential with an entangled state involving three energy eigenvalues, that the maximum Lyapunov parameters of a representative ensemble of trajectories develop at large times into a narrow positive distribution, which indicates nearly completely chaotic dynamics. We also present, in short, results from two time-dependent systems, the anisotropic and the Rabi oscillator.
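The maximum-Lyapunov diagnostic used here can be sketched generically by evolving two nearby trajectories and averaging the log stretching rate (demonstrated on the logistic map, not the authors' Bohmian trajectories):

```python
import math

# Largest-Lyapunov-exponent estimate by two-trajectory divergence:
# evolve two nearby states, accumulate the log stretching rate, and
# renormalize their separation at every step.
def lyapunov(f, x0, eps=1e-9, steps=10000):
    x, y = x0, x0 + eps
    total = 0.0
    for _ in range(steps):
        x, y = f(x), f(y)
        d = max(abs(y - x), 1e-300)          # guard against exact coincidence
        total += math.log(d / eps)
        y = x + eps if y >= x else x - eps   # renormalize the separation
    return total / steps

logistic = lambda x: 4.0 * x * (1.0 - x)
print(lyapunov(logistic, 0.2))   # positive (~ln 2) => chaotic dynamics
```

A narrow distribution of such estimates over an ensemble of initial conditions, all positive, is what the abstract reads as a signature of near-complete chaos.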

  18. A deterministic width function model

    Directory of Open Access Journals (Sweden)

    C. E. Puente

    2003-01-01

Use of a deterministic fractal-multifractal (FM) geometric method to model width functions of natural river networks, as derived distributions of simple multifractal measures via fractal interpolating functions, is reported. It is first demonstrated that the FM procedure may be used to simulate natural width functions, preserving their most relevant features, like their overall shape and texture and the observed power-law scaling of their power spectra. It is then shown, via two natural river networks (Racoon and Brushy creeks in the United States), that the FM approach may also be used to closely approximate existing width functions.
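The fractal interpolating functions at the heart of the FM method can be sampled with a standard chaos-game iteration; the interpolation points and vertical scalings below are illustrative, not the parameters fitted to the Racoon or Brushy networks:

```python
import random

# Fractal interpolating function (FIF): an iterated function system
#   w_i(x, y) = (a_i x + e_i, c_i x + d_i y + f_i)
# whose attractor is the graph of a function through given interpolation
# points; |d_i| < 1 are the vertical scaling factors, one per interval.
def fif_points(interp, d, n_iter=20000, seed=1234):
    (x0, y0), (xN, yN) = interp[0], interp[-1]
    maps = []
    for i in range(1, len(interp)):
        (xa, ya), (xb, yb) = interp[i - 1], interp[i]
        a = (xb - xa) / (xN - x0)
        e = (xN * xa - x0 * xb) / (xN - x0)
        c = (yb - ya - d[i - 1] * (yN - y0)) / (xN - x0)
        f = (xN * ya - x0 * yb - d[i - 1] * (xN * y0 - x0 * yN)) / (xN - x0)
        maps.append((a, e, c, d[i - 1], f))
    rng = random.Random(seed)
    x, y, pts = x0, y0, []
    for _ in range(n_iter):          # chaos-game sampling of the attractor
        a, e, c, dd, f = rng.choice(maps)
        x, y = a * x + e, c * x + dd * y + f
        pts.append((x, y))
    return pts

# Illustrative points and scalings (not a fitted width function):
pts = fif_points([(0.0, 0.0), (0.4, 0.7), (1.0, 0.0)], d=[0.3, -0.3])
```

Varying the vertical scalings d_i tunes the texture of the resulting graph, which is how the FM approach matches the roughness of observed width functions.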

  19. A deterministic, dynamic systems model of cow-calf production: The effects of breeding replacement heifers before mature cows over a 10-year horizon.

    Science.gov (United States)

    Shane, D D; Larson, R L; Sanderson, M W; Miesner, M; White, B J

    2017-10-01

    Some cattle production experts believe that cow-calf producers should breed replacement heifers (nulliparous cows) before cows (primiparous and multiparous cows), sometimes referred to as providing a heifer lead time (tHL). Our objective was to model the effects different durations of tHL may have on measures of herd productivity, including the percent of the herd cycling before the end of the first 21 d of the breeding season (%C21), the percent of the herd pregnant at pregnancy diagnosis (%PPD), the distribution of pregnancy by 21-d breeding intervals, the kilograms of calf weaned per cow exposed (KPC), and the replacement percentage (%RH), using a deterministic, dynamic systems model of cow-calf production over a 10-yr horizon. We also wished to examine differences in the effect of tHL related to the primiparous duration of postpartum anestrus (dPPA). The study model examined 6 different dPPA for primiparous cows (60, 70, 80, 90, 100, or 110 d). The multiparous cow duration of postpartum anestrus was set to 60 d. The breeding season length for nulliparous cows was 63 d, as was the breeding season length for primiparous and multiparous cows. Nulliparous cows were modeled with a tHL of 0, 7, 14, 21, 28, 35, or 42 d. Results are reported for the final breeding season of the 10-yr horizon. Increasing tHL resulted in a greater %C21 for the herd and for primiparous cows. Length of tHL had minimal impact on the %PPD unless the dPPA was 80 d or greater. For a dPPA of 110 d, a 0 d tHL resulted in the herd having 88.1 %PPD. When tHL was 21 d, the %PPD increased to 93.0%. The KPC was 161.2 kg when the dPPA was 110 d and tHL was 0 d and improved to 183.2 kg when tHL was increased to 42 d. The %RH did not vary much unless the dPPA was 90 d or greater, but increasing tHL resulted in decreased %RH. 
Based on the model results, increasing tHL improves the production outcomes included in the analysis, but herds with a dPPA of 90 d or greater had the greatest degree of improvement.

  20. Towards deterministic optical quantum computation with coherently driven atomic ensembles

    International Nuclear Information System (INIS)

    Petrosyan, David

    2005-01-01

Scalable and efficient quantum computation with photonic qubits requires (i) deterministic sources of single photons, (ii) giant nonlinearities capable of entangling pairs of photons, and (iii) reliable single-photon detectors. In addition, an optical quantum computer would need a robust reversible photon storage device. Here we discuss several related techniques, based on the coherent manipulation of atomic ensembles in the regime of electromagnetically induced transparency, that are capable of implementing all of the above prerequisites for deterministic optical quantum computation with single photons.

  1. Deterministic and stochastic trends in the Lee-Carter mortality model

    DEFF Research Database (Denmark)

    Callot, Laurent; Haldrup, Niels; Kallestrup-Lamb, Malene

    2015-01-01

The Lee and Carter (1992) model assumes that the deterministic and stochastic time series dynamics load with identical weights when describing the development of age-specific mortality rates. Effectively this means that the main characteristics of the model simplify to a random walk model with age… mortality data. We find empirical evidence that this feature of the Lee–Carter model overly restricts the system dynamics, and we suggest separating the deterministic and stochastic time series components to the benefit of improved fit and forecasting performance. In fact, we find that the classical Lee–Carter model will otherwise overestimate the reduction of mortality for the younger age groups and will underestimate the reduction of mortality for the older age groups. In practice, our recommendation means that the Lee–Carter model, instead of a one-factor model, should be formulated as a two- (or several…
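The classical one-factor Lee–Carter fit that the paper generalizes can be sketched via SVD on synthetic data (my construction, not the authors' estimator separating deterministic and stochastic components):

```python
import numpy as np

# Classical Lee-Carter model:  log m(x,t) = a_x + b_x * k_t + eps.
# a_x is the average age profile; (b_x, k_t) come from the rank-1 SVD
# term of the centered log mortality surface.
rng = np.random.default_rng(0)
ages, years = 10, 40
true_b = np.linspace(0.05, 0.15, ages)
true_k = -0.1 * np.arange(years)                # downward mortality trend
log_m = -3.0 + np.outer(true_b, true_k) + 0.01 * rng.standard_normal((ages, years))

a = log_m.mean(axis=1)                           # age profile a_x
U, s, Vt = np.linalg.svd(log_m - a[:, None])     # rank-1 term of centered rates
b, k = U[:, 0], s[0] * Vt[0]
b, k = b / b.sum(), k * b.sum()                  # identification: sum(b) = 1
# (row-centering already makes sum over t of k_t approximately zero)

drift = (k[-1] - k[0]) / (years - 1)             # random walk with drift
k_forecast = k[-1] + drift * np.arange(1, 11)    # 10-year index forecast
```

Here the single index k_t carries both the deterministic trend and the stochastic fluctuations with the same age loadings b_x; the paper's point is precisely that these two components should be allowed separate loadings.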

  2. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address the respective sources of uncertainty, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequence) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  3. The dialectical thinking about deterministic and probabilistic safety analysis

    International Nuclear Information System (INIS)

    Qian Yongbai; Tong Jiejuan; Zhang Zuoyi; He Xuhong

    2005-01-01

There are two methods for designing and analysing the safety performance of a nuclear power plant: the traditional deterministic method and the probabilistic method. To date, the design of nuclear power plants has been based on the deterministic method, and it has been proved in practice that the deterministic method is effective for current nuclear power plants. However, the probabilistic method (Probabilistic Safety Assessment, PSA) considers a much wider range of faults, takes an integrated look at the plant as a whole, and uses realistic criteria for the performance of the plant's systems and structures. PSA can be seen, in principle, to provide a broader and more realistic perspective on safety issues than the deterministic approaches. In this paper, the historical origins and development trends of the two methods are reviewed and summarized in brief. Based on the discussion of two application cases - one the changes to specific design provisions of the general design criteria (GDC), the other the risk-informed categorization of structures, systems and components - it can be concluded that the deterministic and probabilistic methods are dialectical and unified, that they are gradually merging into each other, and that they are being used in coordination. (authors)

  4. Effect of weight loss on the severity of psoriasis

    DEFF Research Database (Denmark)

    Jensen, P; Zachariae, Claus; Christensen, R

    2013-01-01

Psoriasis is associated with adiposity, and weight gain increases both the severity of psoriasis and the risk of incident psoriasis. Therefore, we aimed to measure the effect of weight reduction on the severity of psoriasis in obese patients with psoriasis.

  5. Empirical and deterministic accuracies of across-population genomic prediction

    NARCIS (Netherlands)

    Wientjes, Y.C.J.; Veerkamp, R.F.; Bijma, P.; Bovenhuis, H.; Schrooten, C.; Calus, M.P.L.

    2015-01-01

    Background: Differences in linkage disequilibrium and in allele substitution effects of QTL (quantitative trait loci) may hinder genomic prediction across populations. Our objective was to develop a deterministic formula to estimate the accuracy of across-population genomic prediction, for which
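The flavour of deterministic formula being extended can be illustrated with the classical single-population prediction-accuracy expression (the Daetwyler-style formula, given here as hedged background, since the abstract is truncated; the across-population version additionally involves the genetic correlation between populations):

```python
import math

# Classical deterministic accuracy of genomic prediction (single population):
#   r = sqrt( N * h2 / (N * h2 + M_e) )
# N: reference-population size, h2: heritability, M_e: effective number of
# independent chromosome segments. Inputs below are illustrative.
def prediction_accuracy(n_ref, h2, m_e):
    return math.sqrt(n_ref * h2 / (n_ref * h2 + m_e))

# Accuracy rises with reference size and heritability, falls with M_e:
print(prediction_accuracy(10000, 0.3, 1000))   # ~0.866
```

Such closed-form expressions let breeders size a reference population before genotyping, which is the practical appeal of deterministic (rather than simulation-based) accuracy predictions.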

  6. Deterministic and unambiguous dense coding

    International Nuclear Information System (INIS)

    Wu Shengjun; Cohen, Scott M.; Sun Yuqing; Griffiths, Robert B.

    2006-01-01

Optimal dense coding using a partially-entangled pure state of Schmidt rank D̄ and a noiseless quantum channel of dimension D is studied both in the deterministic case, where at most L_d messages can be transmitted with perfect fidelity, and in the unambiguous case, where when the protocol succeeds (probability τ_x) Bob knows for sure that Alice sent message x, and when it fails (probability 1-τ_x) he knows it has failed. Alice is allowed any single-shot (one use) encoding procedure, and Bob any single-shot measurement. For D̄≤D a bound is obtained for L_d in terms of the largest Schmidt coefficient of the entangled state, and is compared with published results by Mozes et al. [Phys. Rev. A 71, 012311 (2005)]. For D̄>D it is shown that L_d is strictly less than D² unless D̄ is an integer multiple of D, in which case uniform (maximal) entanglement is not needed to achieve the optimal protocol. The unambiguous case is studied for D̄≤D, assuming τ_x>0 for a set of D̄D messages, and a bound is obtained for the average ⟨1/τ⟩. A bound on the average ⟨τ⟩ requires an additional assumption of encoding by isometries (unitaries when D̄=D) that are orthogonal for different messages. Both bounds are saturated when τ_x is a constant independent of x, by a protocol based on one-shot entanglement concentration. For D̄>D it is shown that (at least) D² messages can be sent unambiguously. Whether unitary (isometric) encoding suffices for optimal protocols remains a major unanswered question, both for our work and for previous studies of dense coding using partially-entangled states, including noisy (mixed) states.

  7. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction, with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point-value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies, as required. (author)
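The calibration idea, mapping semi-quantitative index scores onto a failure-frequency scale anchored to peer-system failure-rate data, can be sketched as follows (the anchor values are illustrative assumptions, not PII's calibration):

```python
# Map a semi-quantitative index score onto a log-linear failure-frequency
# scale whose end points are fixed from the peer-system failure-rate
# distribution (anchor values below are illustrative).
def score_to_failure_rate(score, score_lo=0.0, score_hi=100.0,
                          rate_lo=1e-6, rate_hi=1e-2):
    """Failures per km-year (say): linear interpolation in log-frequency,
    so score_lo maps to rate_lo and score_hi maps to rate_hi."""
    frac = (score - score_lo) / (score_hi - score_lo)
    return rate_lo * (rate_hi / rate_lo) ** frac

# A mid-range score lands at the geometric mean of the two anchors (~1e-4):
print(score_to_failure_rate(50.0))
```

Applying point-value probabilities per segment in this way turns an index-based ranking into absolute expected failure frequencies that can be combined with consequence estimates.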

  8. Deterministic computation of functional integrals

    International Nuclear Information System (INIS)

    Lobanov, Yu.Yu.

    1995-09-01

A new method of numerical integration in functional spaces is described. The method is based on the rigorous definition of a functional integral in a complete separable metric space and on the use of approximation formulas constructed for this kind of integral. It is applicable to the solution of some partial differential equations and to the calculation of various characteristics in quantum physics. No preliminary discretization of space and time is required, and no simplifying assumptions such as semi-classical or mean-field approximations, collective excitations, or the introduction of ''short-time'' propagators are necessary in this approach. The constructed approximation formulas satisfy the condition of being exact on a given class of functionals, namely polynomial functionals of a given degree. The employment of these formulas replaces the evaluation of a functional integral by the computation of an ''ordinary'' (Riemannian) integral of low dimension, thus allowing the use of preferable deterministic algorithms (normally Gaussian quadratures) instead of the traditional stochastic (Monte Carlo) methods commonly used for this problem. Results of applying the method to the computation of the Green function of the Schroedinger equation in imaginary time, as well as to the study of some models of Euclidean quantum mechanics, are presented. A comparison with the results of other authors shows that our method gives a significant (order of magnitude) economy of computer time and memory versus other known methods, while providing results with the same or better accuracy. A functional measure of Gaussian type is considered and some of its particular cases, namely the conditional Wiener measure in quantum statistical mechanics and a functional measure on a Schwartz distribution space in two-dimensional quantum field theory, are studied in detail.
Numerical examples demonstrating the
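As a toy analogue of replacing an integral over a Gaussian measure by a deterministic low-dimensional quadrature (my illustration, not the paper's approximation formulas), compare Gauss-Hermite quadrature with Monte Carlo for a Gaussian expectation:

```python
import math
import numpy as np

# E[cos X] = exp(-1/2) for X ~ N(0,1); evaluate it two ways.
f = lambda x: math.cos(x)

# Deterministic: 10-point Gauss-Hermite, change of variable x = sqrt(2)*t,
# since hermgauss targets the weight exp(-t**2).
nodes, weights = np.polynomial.hermite.hermgauss(10)
gh = sum(w * f(math.sqrt(2.0) * t) for t, w in zip(nodes, weights)) / math.sqrt(math.pi)

# Stochastic: plain Monte Carlo with 10^5 samples.
rng = np.random.default_rng(0)
mc = float(np.mean(np.cos(rng.standard_normal(100000))))

exact = math.exp(-0.5)
print(abs(gh - exact), abs(mc - exact))   # the 10-point rule wins by far
```

Ten deterministic evaluations beat a hundred thousand random ones here, which is the kind of economy the abstract claims for Gaussian quadratures over Monte Carlo.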

  9. Deterministic calculation of grey Dancoff factors in cluster cells with cylindrical outer boundaries

    International Nuclear Information System (INIS)

    Jenisch Rodrigues, L.; Tullio de Vilhena, M.

    2008-01-01

In the present work, the WIMSD code routine PIJM is modified to compute deterministic Dancoff factors via the collision-probability definition for general arrangements of partially absorbing fuel rods. Collision probabilities are calculated by an efficient integration scheme for the third-order Bickley functions, which considers each cell region separately. The effectiveness of the method is assessed by comparing grey Dancoff factors calculated by PIJM with those available in the literature from the Monte Carlo method, for the irregular geometries of the Canadian CANDU and CANFLEX assemblies. Dancoff factors at several different fuel-pin positions are found to be in very good agreement with the literature results. (orig.)
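The third-order Bickley function appearing in the collision-probability kernels can be evaluated by straightforward quadrature; a plain sketch (not the efficient integration scheme implemented in PIJM):

```python
import math

# Bickley-Naylor function:
#   Ki_n(x) = integral_0^{pi/2} exp(-x / cos(theta)) * cos(theta)**(n-1) dtheta,
# evaluated with composite Simpson quadrature.
def ki(n, x, panels=2000):
    h = (math.pi / 2.0) / panels
    def g(theta):
        c = math.cos(theta)
        # integrand vanishes as cos(theta) -> 0 for x > 0 (and for n >= 2)
        return 0.0 if c < 1e-12 else math.exp(-x / c) * c ** (n - 1)
    s = g(0.0) + g(math.pi / 2.0)
    for i in range(1, panels):
        s += (4.0 if i % 2 else 2.0) * g(i * h)
    return s * h / 3.0

# Ki3(0) = pi/4; the kernel decays monotonically with the optical path x:
print(ki(3, 0.0), ki(3, 0.5), ki(3, 2.0))
```

In transport codes, Ki3 arises from integrating the 2D line-of-sight attenuation over the axial direction, which is why it is the natural kernel for rod-to-rod collision probabilities.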

  10. Aspects of cell calculations in deterministic reactor core analysis

    International Nuclear Information System (INIS)

    Varvayanni, M.; Savva, P.; Catsaros, N.

    2011-01-01

The capability of achieving optimum utilization of deterministic neutronic codes is very important since, although elaborate tools, they are still widely used for nuclear reactor core analyses, due to specific advantages they present compared to Monte Carlo codes. The user of a deterministic neutronic code system has to make some significant physical assumptions if correct results are to be obtained. A decisive first step at which such assumptions are required is the one-dimensional cell calculation, which provides the neutronic properties of the homogenized core cells and collapses the cross sections into user-defined energy groups. One of the most crucial determinations required at this stage, and one which significantly influences the subsequent three-dimensional reactivity calculations, concerns the transverse leakages associated with each one-dimensional, user-defined core cell. For the appropriate definition of the transverse leakages, several parameters concerning the core configuration must be taken into account. Moreover, the suitability of the assumptions made for the transverse cell leakages depends on earlier user decisions, such as those made for the partition of the core into homogeneous cells. In the present work, the sensitivity of the calculated core reactivity to the assumed leakages of the individual cells constituting the core is studied, and appropriate assumptions concerning the transverse leakages in the one-dimensional cell calculations are sought. The study also examines the influence of the core size and the presence of a reflector, the effect of the decisions made for the partition of the core into homogeneous cells, and the effect of broadened moderator channels formed within the core (e.g. by removing fuel plates to create space for control rod hosting).
Since the study required a large number of conceptual core configurations, experimental data could not be available for

  11. The State of Deterministic Thinking among Mothers of Autistic Children

    Directory of Open Access Journals (Sweden)

    Mehrnoush Esbati

    2011-10-01

Objectives: The purpose of the present study was to investigate the effectiveness of cognitive-behavior education in decreasing deterministic thinking in mothers of children with autism spectrum disorders. Methods: Participants were 24 mothers of autistic children who were referred to counseling centers in Tehran and whose children's disorder had been diagnosed by at least a psychiatrist and a counselor. They were randomly selected and assigned to control and experimental groups. The measurement tool was the Deterministic Thinking Questionnaire; both groups answered it before and after the education, and the answers were analyzed by analysis of covariance. Results: The results indicated that cognitive-behavior education decreased deterministic thinking among mothers of autistic children; it decreased four subscales of deterministic thinking as well: interaction with others, absolute thinking, prediction of the future, and negative events (P<0.05). Discussion: By learning cognitive and behavioral techniques, parents of children with autism can reach a higher level of psychological well-being, and it is likely that these cognitive-behavioral skills would have a positive impact on the general life satisfaction of mothers of children with autism.

  12. Deterministic secure communication protocol without using entanglement

    OpenAIRE

    Cai, Qing-yu

    2003-01-01

We show a deterministic secure direct communication protocol using a single qubit in a mixed state. The security of this protocol is based on the security proof of the BB84 protocol. It can be realized with current technologies.

  13. Deterministic chaos in the processor load

    International Nuclear Information System (INIS)

    Halbiniak, Zbigniew; Jozwiak, Ireneusz J.

    2007-01-01

    In this article we present the results of research whose purpose was to identify the phenomenon of deterministic chaos in the processor load. We analysed the time series of the processor load during efficiency tests of database software. Our research was done on a Sparc Alpha processor working on the UNIX Sun Solaris 5.7 operating system. The conducted analyses proved the presence of the deterministic chaos phenomenon in the processor load in this particular case

  14. Risk-based and deterministic regulation

    International Nuclear Information System (INIS)

    Fischer, L.E.; Brown, N.W.

    1995-07-01

Both risk-based and deterministic methods are used for regulating the nuclear industry to protect public safety and health from undue risk. The deterministic method is one where performance standards are specified for each kind of nuclear system or facility. The deterministic performance standards address normal operations and design basis events, which include transient and accident conditions. The risk-based method uses probabilistic risk assessment methods to supplement the deterministic one by (1) addressing all possible events (including those beyond the design basis events), (2) using a systematic, logical process for identifying and evaluating accidents, and (3) considering alternative means to reduce accident frequency and/or consequences. Although both deterministic and risk-based methods have been successfully applied, there is a need for a better understanding of their applications and supportive roles. This paper describes the relationship between the two methods and how they are used to develop and assess regulations in the nuclear industry. Preliminary guidance is suggested for determining the need for using risk-based methods to supplement deterministic ones. However, it is recommended that more detailed guidance and criteria be developed for this purpose.

  15. Deterministic Brownian motion generated from differential delay equations.

    Science.gov (United States)

    Lei, Jinzhi; Mackey, Michael C

    2011-10-01

This paper addresses the question of how Brownian-like motion can arise from the solution of a deterministic differential delay equation. To study this, we analytically examine the bifurcation properties of an apparently simple differential delay equation and then numerically investigate the probabilistic properties of chaotic solutions of the same equation. Our results show that solutions of the deterministic equation with randomly selected initial conditions display a Gaussian-like density at long times, but the densities are supported on an interval of finite measure. Using these chaotic solutions as velocities, we are able to produce Brownian-like motions, which show statistical properties akin to those of a classical Brownian motion over both short and long time scales. Several conjectures are formulated for the probabilistic properties of the solution of the differential delay equation. Numerical studies suggest that these conjectures could be "universal" for similar types of "chaotic" dynamics, but we have been unable to prove this.
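Chaotic solutions of differential delay equations are typically obtained numerically. A minimal Euler scheme for a well-known chaotic delay equation of the same family, the Mackey-Glass equation with its conventional parameter values (illustrative only; not the specific equation studied in the paper):

```python
def mackey_glass(beta=0.2, gamma=0.1, n=10, tau=17.0, dt=0.1, steps=5000, x0=1.2):
    """Euler integration of x'(t) = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t)."""
    lag = int(tau / dt)               # delay expressed in time steps
    history = [x0] * (lag + 1)        # constant initial function on [-tau, 0]
    for _ in range(steps):
        x, x_tau = history[-1], history[-1 - lag]
        dx = beta * x_tau / (1.0 + x_tau ** n) - gamma * x
        history.append(x + dx * dt)
    return history

series = mackey_glass()   # a bounded, aperiodic-looking trajectory for tau = 17
```

The delayed state is handled by keeping the whole trajectory and indexing `lag` steps back, which is the standard trick for turning a delay equation into an iterable difference scheme.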

  16. Deterministic and stochastic models for middle east respiratory syndrome (MERS)

    Science.gov (United States)

    Suryani, Dessy Rizki; Zevika, Mona; Nuraini, Nuning

    2018-03-01

World Health Organization (WHO) data state that since September 2012 there have been 1,733 cases of Middle East Respiratory Syndrome (MERS), with 628 deaths, across 27 countries. MERS was first identified in Saudi Arabia in 2012, and the largest MERS outbreak outside Saudi Arabia occurred in South Korea in 2015. MERS is a disease that attacks the respiratory system and is caused by infection with MERS-CoV. Transmission occurs either directly, through contact between an infected and a non-infected individual, or indirectly, through objects contaminated by the free virus. It is suspected that MERS can spread quickly because of free virus in the environment. Mathematical modeling is used to illustrate the transmission of MERS using a deterministic model and a stochastic model. The deterministic model is used to investigate the temporal dynamics of the system and to analyse the steady-state conditions. The stochastic approach, a Continuous Time Markov Chain (CTMC), is used to predict future states using random variables. From the models that were built, the threshold values for the deterministic and stochastic models are obtained in the same form, and the probability of disease extinction can be computed from the stochastic model. Simulations of both models using several different parameter sets are shown, and the probability of disease extinction is compared across several initial conditions.

  17. The Effect of Lavender Aromatherapy on the Pain Severity of ...

    African Journals Online (AJOL)

    investigated the effect of lavender aromatherapy on pain severity in primary dysmenorrhea. ... message of olfaction to limbic system and cause releasing ... using a formula sample size was estimated 200 people. The ..... Rational Phytotherapy:.

  18. Depressive realism: effects of depression severity and interpretation time.

    Science.gov (United States)

    McKendree-Smith, N; Scogin, F

    2000-12-01

This study examined the theory of depressive realism, which posits that depressed people are often more accurate in perceptions and judgments than nondepressed people. Two possible qualifications to this theory were examined: (1) severity of depression moderates the effect, and (2) length of processing time affects the presence of bias in depressed people, that is, negative bias develops over time. College students were presented with a bogus personality profile that actually consisted of items previously rated as neutral in desirability. Participants rated these profiles for desirability initially and then again three days later. Results indicated a significant effect of depression severity on desirability rating. Nondepressed and mildly depressed students found their profiles to be more positive than the moderately/severely depressed students did, with both groups having scores in the positive range. However, those participants who were moderately/severely depressed showed a negative bias in their ratings. No support was found for an effect of interpretation time.

  19. Deterministic blade row interactions in a centrifugal compressor stage

    Science.gov (United States)

    Kirtley, K. R.; Beach, T. A.

    1991-01-01

    The three-dimensional viscous flow in a low speed centrifugal compressor stage is simulated using an average passage Navier-Stokes analysis. The impeller discharge flow is of the jet/wake type with low momentum fluid in the shroud-pressure side corner coincident with the tip leakage vortex. This nonuniformity introduces periodic unsteadiness in the vane frame of reference. The effect of such deterministic unsteadiness on the time-mean is included in the analysis through the average passage stress, which allows the analysis of blade row interactions. The magnitude of the divergence of the deterministic unsteady stress is of the order of the divergence of the Reynolds stress over most of the span, from the impeller trailing edge to the vane throat. Although the potential effects on the blade trailing edge from the diffuser vane are small, strong secondary flows generated by the impeller degrade the performance of the diffuser vanes.

  20. Design of deterministic interleaver for turbo codes

    International Nuclear Information System (INIS)

    Arif, M.A.; Sheikh, N.M.; Sheikh, A.U.H.

    2008-01-01

The choice of a suitable interleaver for turbo codes can improve performance considerably. For long block lengths, random interleavers perform well, but for some applications it is desirable to keep the block length shorter to avoid latency. For such applications deterministic interleavers perform better. The performance and design of a deterministic interleaver for short-frame turbo codes are considered in this paper. The main characteristic of this class of deterministic interleaver is that its algebraic design selects the best permutation generator such that the points in smaller subsets of the interleaved output are uniformly spread over the entire range of the information data frame. It is observed that an interleaver designed in this manner improves the minimum distance or reduces the multiplicity of the first few spectral lines of the minimum distance spectrum. Finally, we introduce a circular shift in the permutation function to reduce the correlation between the parity bits corresponding to the original and interleaved data frames, improving the decoding capability of the MAP (Maximum A Posteriori) probability decoder. Our deterministic interleaver design outperforms the semi-random interleavers and the deterministic interleavers reported in the literature. (author)
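As a toy illustration of the algebraic idea, not the authors' design, here is a quadratic permutation polynomial (QPP) interleaver, the deterministic family standardized for LTE turbo codes, shown with the coefficients used for the shortest LTE block:

```python
def qpp_interleaver(n, f1, f2):
    """Quadratic permutation polynomial interleaver: pi(i) = (f1*i + f2*i*i) mod n.
    With f1 coprime to n and f2 sharing every prime factor of n, pi is a permutation."""
    return [(f1 * i + f2 * i * i) % n for i in range(n)]

def interleave(frame, perm):
    """Reorder a data frame according to the permutation."""
    return [frame[p] for p in perm]

perm = qpp_interleaver(40, 3, 10)   # n=40, f1=3, f2=10: the LTE choice for this size
```

Because the permutation is generated by a closed-form polynomial, both ends of the link can reconstruct it from (n, f1, f2) alone, with no stored interleaver table.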

  1. Deterministic ion beam material adding technology for high-precision optical surfaces.

    Science.gov (United States)

    Liao, Wenlin; Dai, Yifan; Xie, Xuhui; Zhou, Lin

    2013-02-20

    Although ion beam figuring (IBF) provides a highly deterministic method for the precision figuring of optical components, several problems still need to be addressed, such as the limited correcting capability for mid-to-high spatial frequency surface errors and low machining efficiency for pit defects on surfaces. We propose a figuring method named deterministic ion beam material adding (IBA) technology to solve those problems in IBF. The current deterministic optical figuring mechanism, which is dedicated to removing local protuberances on optical surfaces, is enriched and developed by the IBA technology. Compared with IBF, this method can realize the uniform convergence of surface errors, where the particle transferring effect generated in the IBA process can effectively correct the mid-to-high spatial frequency errors. In addition, IBA can rapidly correct the pit defects on the surface and greatly improve the machining efficiency of the figuring process. The verification experiments are accomplished on our experimental installation to validate the feasibility of the IBA method. First, a fused silica sample with a rectangular pit defect is figured by using IBA. Through two iterations within only 47.5 min, this highly steep pit is effectively corrected, and the surface error is improved from the original 24.69 nm root mean square (RMS) to the final 3.68 nm RMS. Then another experiment is carried out to demonstrate the correcting capability of IBA for mid-to-high spatial frequency surface errors, and the final results indicate that the surface accuracy and surface quality can be simultaneously improved.

  2. Proving Non-Deterministic Computations in Agda

    Directory of Open Access Journals (Sweden)

    Sergio Antoy

    2017-01-01

We investigate proving properties of Curry programs using Agda. First, we address the functional correctness of Curry functions that, apart from some syntactic and semantic differences, are in the intersection of the two languages. Second, we use Agda to model non-deterministic functions with two distinct and competitive approaches incorporating the non-determinism. The first approach eliminates non-determinism by considering the set of all non-deterministic values produced by an application. The second approach encodes every non-deterministic choice that the application could perform. We consider our initial experiment a success. Although proving properties of programs is a notoriously difficult task, the functional logic paradigm does not seem to add any significant layer of difficulty or complexity to the task.

  3. Deterministic dense coding with partially entangled states

    Science.gov (United States)

    Mozes, Shay; Oppenheim, Jonathan; Reznik, Benni

    2005-01-01

The utilization of a d-level partially entangled state, shared by two parties wishing to communicate classical information without errors over a noiseless quantum channel, is discussed. We analytically construct deterministic dense coding schemes for certain classes of nonmaximally entangled states, and numerically obtain schemes in the general case. We study the dependency of the maximal alphabet size of such schemes on the partially entangled state shared by the two parties. Surprisingly, for d > 2 it is possible to have deterministic dense coding with less than one ebit. In this case the number of alphabet letters that can be communicated by a single particle is between d and 2d. In general, we numerically find that the maximal alphabet size is any integer in the range [d, d²] with the possible exception of d² − 1. We also find that states with less entanglement can have a greater deterministic communication capacity than other, more entangled states.

  4. Quantum deterministic key distribution protocols based on the authenticated entanglement channel

    International Nuclear Information System (INIS)

    Zhou Nanrun; Wang Lijun; Ding Jie; Gong Lihua

    2010-01-01

    Based on the quantum entanglement channel, two secure quantum deterministic key distribution (QDKD) protocols are proposed. Unlike quantum random key distribution (QRKD) protocols, the proposed QDKD protocols can distribute the deterministic key securely, which is of significant importance in the field of key management. The security of the proposed QDKD protocols is analyzed in detail using information theory. It is shown that the proposed QDKD protocols can safely and effectively hand over the deterministic key to the specific receiver and their physical implementation is feasible with current technology.

  6. ZERODUR: deterministic approach for strength design

    Science.gov (United States)

    Hartmann, Peter

    2012-12-01

There is an increasing request for zero-expansion glass ceramic ZERODUR substrates capable of enduring higher operational static loads or accelerations. The integrity of structures such as optical or mechanical elements for satellites surviving rocket launches, filigree lightweight mirrors, wobbling mirrors, and reticle and wafer stages in microlithography must be guaranteed with low failure probability. Their design requires statistically relevant strength data. The traditional approach using the statistical two-parameter Weibull distribution suffered from two problems: the data sets were too small to obtain distribution parameters with sufficient accuracy, and also too small to decide on the validity of the model. This holds especially for the low failure probability levels that are required for reliable applications. Extrapolation to 0.1% failure probability and below led to design strengths so low that higher-load applications seemed infeasible. New data have been collected with numbers per set large enough to enable tests of the applicability of the three-parameter Weibull distribution. This distribution proved to provide a much better fit to the data. Moreover, it delivers a lower threshold value, i.e. a minimum breakage stress, which removes statistical uncertainty by introducing a deterministic method to calculate design strength. Considerations from the theory of fracture mechanics, which have proven reliable in proof-test qualifications of delicate structures made from brittle materials, make it possible to include fatigue due to stress corrosion in a straightforward way. With the formulae derived, either lifetime can be calculated from a given stress, or allowable stress from a minimum required lifetime. The data, distributions, and design strength calculations for several practically relevant surface conditions of ZERODUR are given. The values obtained are significantly higher than those resulting from the two-parameter Weibull distribution.
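The practical difference between the two models is the threshold: the three-parameter Weibull failure probability is exactly zero below a minimum breakage stress. A sketch with hypothetical parameters (illustrative units only, not measured ZERODUR data):

```python
import math

def weibull3_failure_prob(stress, threshold, scale, shape):
    """Three-parameter Weibull CDF: F = 1 - exp(-((s - s_th)/s0)^m) for s > s_th, else 0."""
    if stress <= threshold:
        return 0.0                       # deterministic: no breakage below the threshold
    return 1.0 - math.exp(-(((stress - threshold) / scale) ** shape))

# Hypothetical parameters (MPa) for illustration only.
p_low  = weibull3_failure_prob(40.0, threshold=50.0, scale=30.0, shape=5.0)   # below s_th
p_high = weibull3_failure_prob(90.0, threshold=50.0, scale=30.0, shape=5.0)   # well above
```

The two-parameter model is the special case `threshold = 0`, which is why it always predicts some nonzero failure probability even at very small stresses.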

  7. Introducing Synchronisation in Deterministic Network Models

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens Frederik D.

    2006-01-01

The paper addresses performance analysis for distributed real-time systems through deterministic network modelling. Its main contribution is the introduction and analysis of models for synchronisation between tasks and/or network elements. Typical patterns of synchronisation are presented, leading to the suggestion of suitable network models. An existing model for flow control is presented, and an inherent weakness is revealed and remedied. Examples are given and numerically analysed through deterministic network modelling. Results are presented to highlight the properties of the suggested models.

  8. Optimal Deterministic Investment Strategies for Insurers

    Directory of Open Access Journals (Sweden)

    Ulrich Rieder

    2013-11-01

We consider an insurance company whose risk reserve is given by a Brownian motion with drift and which is able to invest the money into a Black–Scholes financial market. As optimization criteria, we treat mean-variance problems, problems with other risk measures, exponential utility and the probability of ruin. Following recent research, we assume that investment strategies have to be deterministic. This leads to deterministic control problems, which are quite easy to solve. Moreover, it turns out that there are some interesting links between the optimal investment strategies of these problems. Finally, we also show that this approach works in the Lévy process framework.

  9. Recent achievements of the neo-deterministic seismic hazard assessment in the CEI region

    International Nuclear Information System (INIS)

    Panza, G.F.; Vaccari, F.; Kouteva, M.

    2008-03-01

A review of the recent achievements of the innovative neo-deterministic approach for seismic hazard assessment through realistic earthquake scenarios has been performed. The procedure provides strong ground motion parameters for the purpose of earthquake engineering, based on deterministic seismic wave propagation modelling at different scales: regional, national and metropolitan. The main advantage of this neo-deterministic procedure is the simultaneous treatment of the contributions of the earthquake source and of the seismic wave propagation media to the strong motion at the target site/region, as required by basic physical principles. The neo-deterministic seismic microzonation procedure has been successfully applied to numerous metropolitan areas all over the world in the framework of several international projects. In this study some examples focused on the CEI region, concerning both regional seismic hazard assessment and seismic microzonation of selected metropolitan areas, are shown. (author)

  10. Considerations of several real effects in pneumatic pellet injection processes

    International Nuclear Information System (INIS)

    Ming-Lun Xue.

    1987-10-01

    Several real effects that take place in a pneumatic pellet injector are examined. These are the heat transfer between a high-temperature propellent gas and the metal wall of the injector, and the frictional loss between the propellent and wall. (author)

  11. Capecitabine induced hypertriglyceridaemia: An underreported and potentially severe side effect

    OpenAIRE

    Tabchi S; Joseph K

    2014-01-01

A 57-year-old woman, with no previous history of dyslipidemia, developed severe hypertriglyceridemia while being treated with capecitabine for metastatic breast cancer. Capecitabine was not discontinued, and serum triglyceride levels normalized after 4 weeks of treatment with fenofibrate. Capecitabine-induced hypertriglyceridemia, a rare drug-related side effect, seems to be often overlooked by clinicians.

  13. A Theory of Deterministic Event Structures

    NARCIS (Netherlands)

    Lee, I.; Rensink, Arend; Smolka, S.A.

    1995-01-01

We present an ω-complete algebra of a class of deterministic event structures, which are labelled prime event structures where the labelling function satisfies a certain distinctness condition. The operators of the algebra are summation, sequential composition and join. Each of these gives rise to a

  14. A Numerical Simulation for a Deterministic Compartmental ...

    African Journals Online (AJOL)

In this work, an earlier deterministic mathematical model of HIV/AIDS is revisited and numerical solutions obtained using Euler's numerical method. Using hypothetical values for the parameters, a program was written in the VISUAL BASIC programming language to generate series for the system of difference equations from the ...
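Euler's method turns a differential system into exactly the kind of difference-equation series described here. A minimal sketch on a single decay equation (a hypothetical rate for illustration, not the model's actual HIV/AIDS parameters), compared against the analytic solution:

```python
import math

def euler_series(rate=0.5, x0=1.0, dt=0.01, steps=200):
    """Difference-equation series x_{k+1} = x_k + dt * (-rate * x_k) for x' = -rate * x."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * (-rate * xs[-1]))
    return xs

xs = euler_series()
exact = math.exp(-0.5 * 0.01 * 200)   # analytic solution x(t) = exp(-rate*t) at t = 2
```

For a compartmental model the same update is applied to each compartment simultaneously, with the right-hand sides evaluated at the current step before any compartment is advanced.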

  15. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    Science.gov (United States)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code redaction, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with log-normal distribution of PGA or response spectrum. The main positive aspect of this approach is that it is presently a standard for the majority of countries, but there are weak points, in particular regarding the physical description of the earthquake phenomenon. Factors like site effects and source characteristics, such as the duration of the strong motion and directivity, that could significantly influence the expected motion at the site are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches in selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for magnitudes less than 6. We focus on sites liquefiable because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions. The looser the soil and the higher the liquefaction potential, the more suitable the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billions
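The Poissonian occurrence model underlying PSHA links exceedance probability and return period in closed form; for example, the standard design level of 10% exceedance probability in 50 years corresponds to a return period of roughly 475 years:

```python
import math

def return_period(p_exceed, t_years):
    """Poisson model: P(at least one exceedance in t) = 1 - exp(-t/T)  =>  T = -t / ln(1 - P)."""
    return -t_years / math.log(1.0 - p_exceed)

def exceedance_prob(t_years, period):
    """Inverse relation, useful for checking a design level."""
    return 1.0 - math.exp(-t_years / period)

t_475 = return_period(0.10, 50.0)   # the "10% in 50 years" level, ~475 years
```

This algebra is generic to the Poissonian assumption and says nothing about site effects or source duration, which is precisely the limitation the abstract raises.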

  16. Convergence studies of deterministic methods for LWR explicit reflector methodology

    International Nuclear Information System (INIS)

    Canepa, S.; Hursin, M.; Ferroukhi, H.; Pautz, A.

    2013-01-01

The standard approach in modern 3-D core simulators, employed either for steady-state or transient simulations, is to use albedo coefficients or explicit reflectors at the core axial and radial boundaries. In the latter approach, few-group homogenized nuclear data are produced a priori with lattice transport codes using 2-D reflector models. Recently, the explicit reflector methodology of the deterministic CASMO-4/SIMULATE-3 code system was identified as potentially constituting one of the main sources of error for core analyses of the Swiss operating LWRs, which all belong to the GII design. Considering that some of the new GIII designs will rely on very different reflector concepts, a review and assessment of the reflector methodology for various LWR designs appeared relevant. Therefore, the purpose of this paper is to first recall the concepts of the explicit reflector modelling approach as employed by CASMO/SIMULATE. Then, for selected reflector configurations representative of both GII and GIII designs, a benchmarking of the few-group nuclear data produced with the deterministic lattice code CASMO-4 and its successor, CASMO-5, is conducted. On this basis, a convergence study with regard to geometrical requirements when using deterministic methods with 2-D homogeneous models is conducted, and the effect on the downstream 3-D core analysis accuracy is evaluated for a typical GII reflector design in order to assess the results against available plant measurements. (authors)

  17. Methods and models in mathematical biology deterministic and stochastic approaches

    CERN Document Server

    Müller, Johannes

    2015-01-01

This book developed from classes in mathematical biology taught by the authors over several years at the Technische Universität München. The main themes are modeling principles, mathematical principles for the analysis of these models, and model-based analysis of data. The key topics of modern biomathematics are covered: ecology, epidemiology, biochemistry, regulatory networks, neuronal networks, and population genetics. A variety of mathematical methods are introduced, ranging from ordinary and partial differential equations to stochastic graph theory and branching processes. A special emphasis is placed on the interplay between stochastic and deterministic models.

  18. Effectiveness and cost-effectiveness of a cardiovascular risk prediction algorithm for people with severe mental illness (PRIMROSE).

    Science.gov (United States)

    Zomer, Ella; Osborn, David; Nazareth, Irwin; Blackburn, Ruth; Burton, Alexandra; Hardoon, Sarah; Holt, Richard Ian Gregory; King, Michael; Marston, Louise; Morris, Stephen; Omar, Rumana; Petersen, Irene; Walters, Kate; Hunter, Rachael Maree

    2017-09-05

To determine the cost-effectiveness of two bespoke severe mental illness (SMI)-specific risk algorithms compared with standard risk algorithms for primary cardiovascular disease (CVD) prevention in those with SMI. Primary care setting in the UK. The analysis was from the National Health Service perspective. 1000 individuals with SMI from The Health Improvement Network Database, aged 30-74 years and without existing CVD, populated the model. Four cardiovascular risk algorithms were assessed: (1) general population lipid, (2) general population body mass index (BMI), (3) SMI-specific lipid and (4) SMI-specific BMI, compared against no algorithm. At baseline, each cardiovascular risk algorithm was applied and those considered high risk (>10%) were assumed to be prescribed statin therapy while others received usual care. Quality-adjusted life years (QALYs) and costs were accrued for each algorithm including no algorithm, and cost-effectiveness was calculated using the net monetary benefit (NMB) approach. Deterministic and probabilistic sensitivity analyses were performed to test assumptions made and uncertainty around parameter estimates. The SMI-specific BMI algorithm had the highest NMB, resulting in 15 additional QALYs and a cost saving of approximately £53 000 per 1000 patients with SMI over 10 years, followed by the general population lipid algorithm (13 additional QALYs and a cost saving of £46 000). The general population lipid and SMI-specific BMI algorithms performed equally well. The ease and acceptability of use of an SMI-specific BMI algorithm (blood tests not required) makes it an attractive algorithm to implement in clinical settings.
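Net monetary benefit converts health gains and costs into a single money figure at a chosen willingness-to-pay threshold. Using the abstract's figures for the two best algorithms and an assumed threshold of £20,000 per QALY (a commonly used UK value; the abstract does not state which threshold was applied):

```python
def net_monetary_benefit(delta_qalys, delta_cost, wtp=20_000):
    """NMB = wtp * incremental QALYs - incremental cost (a cost saving is a negative delta_cost)."""
    return wtp * delta_qalys - delta_cost

# From the abstract, per 1000 patients over 10 years:
nmb_smi_bmi = net_monetary_benefit(15, -53_000)   # SMI-specific BMI: +15 QALYs, £53,000 saved
nmb_gp_lipid = net_monetary_benefit(13, -46_000)  # general population lipid: +13 QALYs, £46,000 saved
```

Because both QALY gains and cost savings enter with the same sign here, either algorithm dominates "no algorithm"; the ranking between the two depends on the threshold chosen.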

  19. Estimation of several political action effects of energy prices

    Directory of Open Access Journals (Sweden)

    Andrew B. Whitford

    2016-01-01

One important effect of price shocks in the United States has been increased political attention paid to the structure and performance of oil and natural gas markets, along with some governmental support for energy conservation. This article describes how price changes helped lead to the emergence of a political agenda accompanied by several interventions, as revealed through Granger causality tests on change in the legislative agenda.

  20. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  1. Deterministic nonlinear systems a short course

    CERN Document Server

    Anishchenko, Vadim S; Strelkova, Galina I

    2014-01-01

This text is a short yet complete course on the nonlinear dynamics of deterministic systems. Conceived as a modular set of 15 concise lectures, it reflects the many years of teaching experience of the authors. The lectures treat in turn the fundamental aspects of the theory of dynamical systems, aspects of stability and bifurcations, the theory of deterministic chaos and attractor dimensions, as well as the elements of the theory of Poincaré recurrences. Particular attention is paid to the analysis of the generation of periodic, quasiperiodic and chaotic self-sustained oscillations and to the issue of synchronization in such systems. This book is aimed at graduate students and non-specialist researchers with a background in physics, applied mathematics and engineering wishing to enter this exciting field of research.

  2. Deterministic nanoparticle assemblies: from substrate to solution

    International Nuclear Information System (INIS)

    Barcelo, Steven J; Gibson, Gary A; Yamakawa, Mineo; Li, Zhiyong; Kim, Ansoon; Norris, Kate J

    2014-01-01

    The deterministic assembly of metallic nanoparticles is an exciting field with many potential benefits. Many promising techniques have been developed, but challenges remain, particularly for the assembly of larger nanoparticles which often have more interesting plasmonic properties. Here we present a scalable process combining the strengths of top down and bottom up fabrication to generate deterministic 2D assemblies of metallic nanoparticles and demonstrate their stable transfer to solution. Scanning electron and high-resolution transmission electron microscopy studies of these assemblies suggested the formation of nanobridges between touching nanoparticles that hold them together so as to maintain the integrity of the assembly throughout the transfer process. The application of these nanoparticle assemblies as solution-based surface-enhanced Raman scattering (SERS) materials is demonstrated by trapping analyte molecules in the nanoparticle gaps during assembly, yielding uniformly high enhancement factors at all stages of the fabrication process. (paper)

  3. Deterministic dynamics of plasma focus discharges

    International Nuclear Information System (INIS)

    Gratton, J.; Alabraba, M.A.; Warmate, A.G.; Giudice, G.

    1992-04-01

The performance (neutron yield, X-ray production, etc.) of plasma focus discharges fluctuates strongly in series performed with fixed experimental conditions. Previous work suggests that these fluctuations are due to a deterministic "internal" dynamics involving degrees of freedom not controlled by the operator, possibly related to adsorption and desorption of impurities from the electrodes. According to these dynamics, the yield of a discharge depends on the outcome of the previous ones. We study 8 series of discharges in three different facilities, with various electrode materials and operating conditions. More evidence of a deterministic internal dynamics is found. The fluctuation pattern depends on the electrode materials and other characteristics of the experiment. A heuristic mathematical model that describes adsorption and desorption of impurities from the electrodes and their consequences on the yield is presented. The model predicts steady yield or periodic and chaotic fluctuations, depending on parameters related to the experimental conditions. (author). 27 refs, 7 figs, 4 tabs

  4. Advances in stochastic and deterministic global optimization

    CERN Document Server

    Zhigljavsky, Anatoly; Žilinskas, Julius

    2016-01-01

    Current research results in stochastic and deterministic global optimization including single and multiple objectives are explored and presented in this book by leading specialists from various fields. Contributions include applications to multidimensional data visualization, regression, survey calibration, inventory management, timetabling, chemical engineering, energy systems, and competitive facility location. Graduate students, researchers, and scientists in computer science, numerical analysis, optimization, and applied mathematics will be fascinated by the theoretical, computational, and application-oriented aspects of stochastic and deterministic global optimization explored in this book. This volume is dedicated to the 70th birthday of Antanas Žilinskas who is a leading world expert in global optimization. Professor Žilinskas's research has concentrated on studying models for the objective function, the development and implementation of efficient algorithms for global optimization with single and mu...

  5. Understanding deterministic diffusion by correlated random walks

    International Nuclear Information System (INIS)

    Klages, R.; Korabel, N.

    2002-01-01

    Low-dimensional periodic arrays of scatterers with a moving point particle are ideal models for studying deterministic diffusion. For such systems the diffusion coefficient is typically an irregular function under variation of a control parameter. Here we propose a systematic scheme of how to approximate deterministic diffusion coefficients of this kind in terms of correlated random walks. We apply this approach to two simple examples which are a one-dimensional map on the line and the periodic Lorentz gas. Starting from suitable Green-Kubo formulae we evaluate hierarchies of approximations for their parameter-dependent diffusion coefficients. These approximations converge exactly yielding a straightforward interpretation of the structure of these irregular diffusion coefficients in terms of dynamical correlations. (author)
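    The parameter-dependent diffusion coefficient of such a map can be estimated numerically from the ensemble mean-square displacement. The sketch below uses the classic "climbing sine" lift (an illustrative choice; the paper's specific map and its Green-Kubo hierarchy are not reproduced here):

```python
import numpy as np

# Illustrative only: estimate the parameter-dependent diffusion coefficient
# of the "climbing sine" map x_{n+1} = x_n + a*sin(2*pi*x_n) from the
# ensemble mean-square displacement, D(a) ~ <(x_n - x_0)^2> / (2n).
# This is a stand-in example, not the specific map studied in the paper.

def diffusion_coefficient(a, n_particles=4000, n_steps=400, seed=0):
    rng = np.random.default_rng(seed)
    x0 = rng.random(n_particles)            # uniform initial cell positions
    x = x0.copy()
    for _ in range(n_steps):
        x = x + a * np.sin(2.0 * np.pi * x)
    msd = np.mean((x - x0) ** 2)
    return msd / (2.0 * n_steps)

if __name__ == "__main__":
    for a in (0.9, 1.2):
        print(f"a = {a}: D ~ {diffusion_coefficient(a):.3f}")
```

Sweeping `a` finely reproduces the irregular, parameter-dependent structure of D that the paper explains via dynamical correlations.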

  6. Dynamic optimization deterministic and stochastic models

    CERN Document Server

    Hinderer, Karl; Stieglitz, Michael

    2016-01-01

    This book explores discrete-time dynamic optimization and provides a detailed introduction to both deterministic and stochastic models. Covering problems with finite and infinite horizon, as well as Markov renewal programs, Bayesian control models and partially observable processes, the book focuses on the precise modelling of applications in a variety of areas, including operations research, computer science, mathematics, statistics, engineering, economics and finance. Dynamic Optimization is a carefully presented textbook which starts with discrete-time deterministic dynamic optimization problems, providing readers with the tools for sequential decision-making, before proceeding to the more complicated stochastic models. The authors present complete and simple proofs and illustrate the main results with numerous examples and exercises (without solutions). With relevant material covered in four appendices, this book is completely self-contained.
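    The discrete-time deterministic case that the book starts from can be sketched as backward induction on a toy finite-horizon problem (the transition, cost and horizon below are our own illustration, not from the text):

```python
from itertools import product

# Minimal sketch of finite-horizon deterministic dynamic optimization by
# backward induction. The toy inventory problem (transition f, stage cost c)
# is our own illustration.

T = 4                      # horizon
STATES = range(5)          # inventory level 0..4
ACTIONS = range(3)         # order 0..2 units

def f(s, a):               # deterministic transition: order a, sell 1 if possible
    return min(s + a, 4) - (1 if s + a >= 1 else 0)

def c(s, a):               # ordering cost + holding cost - sales revenue
    sold = 1 if s + a >= 1 else 0
    return 2 * a + 0.5 * min(s + a, 4) - 3 * sold

def backward_induction():
    V = {s: 0.0 for s in STATES}              # terminal value V_T = 0
    policy = []
    for t in reversed(range(T)):
        Vt, pt = {}, {}
        for s in STATES:
            best = min(ACTIONS, key=lambda a: c(s, a) + V[f(s, a)])
            pt[s] = best
            Vt[s] = c(s, best) + V[f(s, best)]
        V, policy = Vt, [pt] + policy
    return V, policy

def sum_cost(s, plan):     # cost of a fixed action sequence, for checking
    total = 0.0
    for a in plan:
        total += c(s, a)
        s = f(s, a)
    return total

def brute_force(s0):       # exhaustive minimum over all T-step plans
    return min(sum_cost(s0, plan) for plan in product(ACTIONS, repeat=T))

if __name__ == "__main__":
    V, policy = backward_induction()
    print("V_0:", V)
```

The stochastic models in the later chapters replace the deterministic transition `f` by an expectation over a transition law; the backward recursion itself is unchanged.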

  7. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1992-01-01

    This paper reports that recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. Because the geologic processes responsible for the formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling

  8. Deterministic automata for extended regular expressions

    Directory of Open Access Journals (Sweden)

    Syzdykov Mirzakhmet

    2017-12-01

    Full Text Available In this work we present algorithms to produce a deterministic finite automaton (DFA) for extended operators in regular expressions, such as intersection, subtraction and complement. A method of "overriding" the source NFA (an NFA not defined with subset-construction rules) is used. Past work described only the algorithm for the AND-operator (the intersection of regular languages); in this paper the construction for the MINUS-operator (and the complement) is shown.
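    The AND-operator rests on the classical product construction: the intersection automaton simulates both input DFAs in lockstep. A minimal sketch (the two example DFAs are our own, not from the paper):

```python
from collections import deque

# Product construction for the intersection of two regular languages:
# a product state (p, q) steps componentwise and accepts iff both
# components accept. Example DFAs below are our own illustration.

def accepts(dfa, word):
    start, accepting, delta = dfa
    state = start
    for ch in word:
        state = delta[(state, ch)]
    return state in accepting

def intersect(d1, d2):
    (s1, a1, t1), (s2, a2, t2) = d1, d2
    alphabet = {ch for (_, ch) in t1}
    start = (s1, s2)
    delta, seen, queue = {}, {start}, deque([start])
    while queue:                      # build reachable product states only
        p, q = queue.popleft()
        for ch in alphabet:
            nxt = (t1[(p, ch)], t2[(q, ch)])
            delta[((p, q), ch)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    accepting = {(p, q) for (p, q) in seen if p in a1 and q in a2}
    return start, accepting, delta

# DFA 1: even number of 'a' over {a, b}
D1 = ("E", {"E"}, {("E", "a"): "O", ("O", "a"): "E",
                   ("E", "b"): "E", ("O", "b"): "O"})
# DFA 2: word ends with 'b'
D2 = (0, {1}, {(0, "a"): 0, (1, "a"): 0, (0, "b"): 1, (1, "b"): 1})

if __name__ == "__main__":
    P = intersect(D1, D2)
    print(accepts(P, "aab"), accepts(P, "ab"))
```

Subtraction and complement follow the same pattern with the acceptance condition changed (e.g. `p in a1 and q not in a2` for MINUS).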

  9. Cost effectiveness of transcatheter aortic valve replacement compared to medical management in inoperable patients with severe aortic stenosis: Canadian analysis based on the PARTNER Trial Cohort B findings.

    Science.gov (United States)

    Hancock-Howard, Rebecca L; Feindel, Christopher M; Rodes-Cabau, Josep; Webb, John G; Thompson, Ann K; Banz, Kurt

    2013-01-01

    The only effective treatment for severe aortic stenosis (AS) is valve replacement. However, many patients with co-existing conditions are ineligible for surgical valve replacement, historically leaving medical management (MM), which has a poor prognosis, as the only option. Transcatheter Aortic Valve Replacement (TAVR) is a less invasive replacement method. The objective was to estimate the cost-effectiveness of TAVR via transfemoral access vs MM in surgically inoperable patients with severe AS from the Canadian public healthcare system perspective. A cost-effectiveness analysis of TAVR vs MM was conducted using a deterministic decision-analytic model over a 3-year time horizon. The PARTNER randomized controlled trial results were used to estimate survival, utilities, and some resource utilization. Costs included the valve replacement procedure, complications, hospitalization, outpatient visits/tests, and home/nursing care. Resources were valued (2009 Canadian dollars) using costs from the Ontario Case Costing Initiative (OCCI), the Ontario Ministry of Health and Long-Term Care and the Ontario Drug Benefits Formulary, or were estimated using relative costs from a French economic evaluation or clinical experts. Costs and outcomes were discounted 5% annually. The effect of uncertainty in model parameters was explored in deterministic and probabilistic sensitivity analyses. The incremental cost-effectiveness ratio (ICER) was $32,170 per quality-adjusted life year (QALY) gained for TAVR vs MM. When the time horizon was shortened to 24 and 12 months, the ICER increased to $52,848 and $157,429, respectively. All other sensitivity analyses returned an ICER of less than $50,000/QALY gained. A limitation was the lack of Canadian-specific resource and cost data for all resources, leaving one to rely on clinical experts and data from France to inform certain parameters.
Based on the results of this analysis, it can be concluded that TAVR is cost-effective compared to MM for the treatment of surgically inoperable patients with severe AS.
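    The ICER arithmetic used in such analyses is the ratio of incremental discounted cost to incremental discounted QALYs. A minimal sketch with 5% annual discounting; all numbers below are hypothetical illustration values, not the PARTNER trial data:

```python
# Hypothetical illustration (not the PARTNER trial data): ICER computed as
# incremental discounted cost divided by incremental discounted QALYs,
# with 5% annual discounting as described in the analysis.

DISCOUNT = 0.05

def present_value(per_year):
    """Discount a list of yearly values, year 0 undiscounted."""
    return sum(v / (1.0 + DISCOUNT) ** t for t, v in enumerate(per_year))

def icer(cost_new, qaly_new, cost_old, qaly_old):
    d_cost = present_value(cost_new) - present_value(cost_old)
    d_qaly = present_value(qaly_new) - present_value(qaly_old)
    return d_cost / d_qaly

if __name__ == "__main__":
    # 3-year horizon; year 0 includes a hypothetical procedure cost
    tavr_cost, tavr_qaly = [45000, 5000, 5000], [0.60, 0.60, 0.60]
    mm_cost, mm_qaly = [15000, 8000, 8000], [0.40, 0.35, 0.30]
    print(f"ICER: ${icer(tavr_cost, tavr_qaly, mm_cost, mm_qaly):,.0f} per QALY")
```

Shortening the horizon (fewer list entries) raises the ICER because the up-front procedure cost is spread over fewer QALY gains, which is the pattern the abstract reports.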

  10. Effectiveness of yoga training program on the severity of autism.

    Science.gov (United States)

    Sotoodeh, Mohammad Saber; Arabameri, Elahe; Panahibakhsh, Maryam; Kheiroddin, Fatemeh; Mirdoozandeh, Hatef; Ghanizadeh, Ahmad

    2017-08-01

    This study examines the effect of yoga training program (YTP) on the severity of autism in children with High Function Autism (HFA). Twenty-nine children aged 7 to 15 (mean = 11.22, SD = 2.91) years were randomly allocated to either yoga or control group. The participants in the yoga group received an 8-week (24-session) Yoga Training Program (YTP). Parents or caregivers of participants completed autism treatment evaluation checklist (ATEC) at baseline and the end of the intervention. The results of the analysis showed that there were significant differences between the two groups with regards to all ATEC sub-scores except ATEC I (speech/language/communication). This study provides support for the implementation of a yoga training program and identifies specific procedural enhancements to reduce the severity of symptoms in children with autism. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Deterministic Mean-Field Ensemble Kalman Filtering

    KAUST Repository

    Law, Kody

    2016-05-03

    The proof of convergence of the standard ensemble Kalman filter (EnKF) from Le Gland, Monbet, and Tran [Large sample asymptotics for the ensemble Kalman filter, in The Oxford Handbook of Nonlinear Filtering, Oxford University Press, Oxford, UK, 2011, pp. 598--631] is extended to non-Gaussian state-space models. A density-based deterministic approximation of the mean-field limit EnKF (DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given a certain minimal order of convergence k between the two, this extends to the deterministic filter approximation, which is therefore asymptotically superior to standard EnKF for dimension d<2k. The fidelity of approximation of the true distribution is also established using an extension of the total variation metric to random measures. This is limited by a Gaussian bias term arising from nonlinearity/non-Gaussianity of the model, which arises in both deterministic and standard EnKF. Numerical results support and extend the theory.
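    The standard (stochastic, perturbed-observation) EnKF analysis step that the deterministic mean-field approximation is compared against can be sketched in the scalar linear-Gaussian case, where the exact Kalman posterior is available for checking (a minimal sketch, not the paper's DMFEnKF PDE/quadrature scheme):

```python
import numpy as np

# Minimal scalar sketch of the standard perturbed-observation EnKF
# analysis step (H = 1). In the linear-Gaussian case the exact Kalman
# posterior is known, so the ensemble estimate can be verified directly.

def enkf_update(ensemble, y, obs_var, rng):
    c = np.var(ensemble, ddof=1)            # forecast ensemble covariance
    k = c / (c + obs_var)                   # Kalman gain
    perturbed = y + rng.normal(0.0, np.sqrt(obs_var), size=ensemble.shape)
    return ensemble + k * (perturbed - ensemble)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    prior_mean, prior_var, obs, obs_var = 0.0, 4.0, 2.0, 1.0
    ens = rng.normal(prior_mean, np.sqrt(prior_var), size=20000)
    post = enkf_update(ens, obs, obs_var, rng)
    exact = prior_mean + prior_var / (prior_var + obs_var) * (obs - prior_mean)
    print(np.mean(post), exact)
```

The paper's deterministic variant replaces this Monte Carlo ensemble by a density evolved with a PDE solver and a quadrature rule, removing the sampling noise visible here.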

  12. Deterministic Mean-Field Ensemble Kalman Filtering

    KAUST Repository

    Law, Kody; Tembine, Hamidou; Tempone, Raul

    2016-01-01

    The proof of convergence of the standard ensemble Kalman filter (EnKF) from Le Gland, Monbet, and Tran [Large sample asymptotics for the ensemble Kalman filter, in The Oxford Handbook of Nonlinear Filtering, Oxford University Press, Oxford, UK, 2011, pp. 598--631] is extended to non-Gaussian state-space models. A density-based deterministic approximation of the mean-field limit EnKF (DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given a certain minimal order of convergence k between the two, this extends to the deterministic filter approximation, which is therefore asymptotically superior to standard EnKF for dimension d<2k. The fidelity of approximation of the true distribution is also established using an extension of the total variation metric to random measures. This is limited by a Gaussian bias term arising from nonlinearity/non-Gaussianity of the model, which arises in both deterministic and standard EnKF. Numerical results support and extend the theory.

  13. Deterministic LOCA avoidance by gravity effect

    Energy Technology Data Exchange (ETDEWEB)

    Harms, A A [McMaster Univ., Hamilton, ON (Canada)

    1996-12-31

    A novel concept for an intrinsically safe reactor, called the Pellet Suspension Reactor (PSR), has vertical fuel tubes in which fuel, in the form of micro-pellets, is suspended by an upwardly flowing liquid or (preferably) gas coolant. Then, in the event of a primary pump failure, the fuel pellets fall down into a divergent conical annulus which is surrounded by ordinary water; the divergent geometry of this catchment ensures nuclear subcriticality, and the annulus will ensure decay-heat removal by packed-bed conduction. Thus melting of the fuel is avoided, and no emergency shut-down or emergency cooling provisions are required. 7 refs., 1 tab., 1 fig.

  14. The protective effect of Shenfu injection against elderly severe pneumonia.

    Science.gov (United States)

    Lv, S J; Lai, D P; Wei, X; Yan, Q; Xia, J M

    2017-10-01

    The purpose of this study is to investigate the effect of Shenfu injection (SFI) on tumor necrosis factor-α (TNF-α) and the interleukins (IL-6, IL-8, IL-10) of elderly patients suffering from severe pneumonia. From June 2012 to September 2014, 89 elderly patients with severe pneumonia were treated in our department. These patients were randomly divided into two groups: the treatment group (45 cases) and the control group (44 cases). The control group received anti-infective, expectorant, and supportive therapy, while the treatment group additionally received SFI intravenously. TNF-α and the interleukins were detected by enzyme-linked immunosorbent assay (ELISA). Meanwhile, the changes in the inflammatory response indicators, the blood gas analysis, and the vital-sign parameters were measured and compared before and after therapy. Prior to treatment, there was no significant difference between the treatment group and the control group (p > 0.05); after 7 days of treatment, the levels of TNF-α, IL-6, and IL-8 were significantly decreased, while the level of IL-10 was obviously increased. The APACHE II score was significantly decreased in comparison to that before treatment (p < 0.05), and the time of mechanical ventilation, the duration of the ICU stay, and the application time of vasoactive drugs were notably shortened. The application of Shenfu injection had a positive and effective role in clearing inflammatory mediators during the treatment of elderly severe pneumonia.

  15. Deterministic quantum state transfer and remote entanglement using microwave photons.

    Science.gov (United States)

    Kurpiers, P; Magnard, P; Walter, T; Royer, B; Pechal, M; Heinsoo, J; Salathé, Y; Akin, A; Storz, S; Besse, J-C; Gasparinetti, S; Blais, A; Wallraff, A

    2018-06-01

    Sharing information coherently between nodes of a quantum network is fundamental to distributed quantum information processing. In this scheme, the computation is divided into subroutines and performed on several smaller quantum registers that are connected by classical and quantum channels [1]. A direct quantum channel, which connects nodes deterministically rather than probabilistically, achieves larger entanglement rates between nodes and is advantageous for distributed fault-tolerant quantum computation [2]. Here we implement deterministic state-transfer and entanglement protocols between two superconducting qubits fabricated on separate chips. Superconducting circuits [3] constitute a universal quantum node [4] that is capable of sending, receiving, storing and processing quantum information [5-8]. Our implementation is based on an all-microwave cavity-assisted Raman process [9], which entangles or transfers the qubit state of a transmon-type artificial atom [10] with a time-symmetric itinerant single photon. We transfer qubit states by absorbing these itinerant photons at the receiving node, with a probability of 98.1 ± 0.1 per cent, achieving a transfer-process fidelity of 80.02 ± 0.07 per cent for a protocol duration of only 180 nanoseconds. We also prepare remote entanglement on demand with a fidelity as high as 78.9 ± 0.1 per cent at a rate of 50 kilohertz. Our results are in excellent agreement with numerical simulations based on a master-equation description of the system. This deterministic protocol has the potential to be used for quantum computing distributed across different nodes of a cryogenic network.

  16. Effects of vernal and allergic conjunctivitis on severity of keratoconus.

    Science.gov (United States)

    Cingu, Abdullah Kursat; Cinar, Yasin; Turkcu, Fatih Mehmet; Sahin, Alparslan; Ari, Seyhmus; Yuksel, Harun; Sahin, Muhammed; Caca, Ihsan

    2013-01-01

    To demonstrate the effects of two different types of allergic conjunctivitis on severity of keratoconus (KC). We retrospectively reviewed the medical records of 171 KC patients referred between June 2010 and June 2011. The KC patients were divided into 3 groups as KC (group A), KC with vernal keratoconjunctivitis (VKC) (group B) and KC with allergic conjunctivitis (AC) (group C). Main outcome measures were demographic and ocular clinical features including age at presentation, gender, spherical equivalent (SE), best spectacle corrected visual acuity (BCVA), mean keratometric measurement (Km), central corneal thickness (CCT), and intraocular pressure (IOP). Groups were compared in terms of study variables. The median age at presentation was significantly lower in group B (P<0.001). According to the median SE (P=0.003), BCVA (P=0.022), Km (P<0.001), CCT (P=0.015) and Amsler-Krumeich classification (P<0.001), KC was more severe in group B. There was no significant difference in terms of IOP and corrected IOP among the groups (P=0.44); however, 4 patients in group B developed increased corrected IOP after topical corticosteroid use. The differences among the groups persisted even after controlling for age and gender. Our findings demonstrated more severe KC in VKC patients despite their younger age, which suggests evaluating VKC patients as a separate group in keratoconus disease.

  17. Effects of vernal and allergic conjunctivitis on severity of keratoconus

    Directory of Open Access Journals (Sweden)

    Muhammed Sahin

    2013-06-01

    Full Text Available AIM: To demonstrate the effects of two different types of allergic conjunctivitis on severity of keratoconus (KC). METHODS: We retrospectively reviewed the medical records of 171 KC patients referred between June 2010 and June 2011. The KC patients were divided into 3 groups as KC (group A), KC with vernal keratoconjunctivitis (VKC) (group B) and KC with allergic conjunctivitis (AC) (group C). Main outcome measures were demographic and ocular clinical features including age at presentation, gender, spherical equivalent (SE), best spectacle corrected visual acuity (BCVA), mean keratometric measurement (Km), central corneal thickness (CCT), and intraocular pressure (IOP). Groups were compared in terms of study variables. RESULTS: The median age at presentation was significantly lower in group B (P<0.001). According to the median SE (P=0.003), BCVA (P=0.022), Km (P<0.001), CCT (P=0.015) and Amsler-Krumeich classification (P<0.001), KC was more severe in group B. There was no significant difference in terms of IOP and corrected IOP among the groups (P=0.44); however, 4 patients in group B developed increased corrected IOP after topical corticosteroid use. The differences among the groups persisted even after controlling for age and gender. CONCLUSION: Our findings demonstrated more severe KC in VKC patients despite their younger age, which suggests evaluating VKC patients as a separate group in keratoconus disease.

  18. Fire severity effects on ash extractable Total Phosphorous

    Science.gov (United States)

    Pereira, Paulo; Úbeda, Xavier; Martin, Deborah

    2010-05-01

    Phosphorus (P) is a crucial element for plant nutrition and limits plant production. The amounts of P in soil are low, and a great part of this nutrient is adsorbed or precipitated. It is well known that fire has important implications for the P cycle: P can be lost through volatilization and evacuated with the smoke, but it can also become more available for transport after the mineralization of organic matter imposed by the fire. The release of P depends on ash pH and on its chemical and physical characteristics. Fire temperatures impose different severities, according to the species affected and the contact time. Fire severity is often evaluated by ash colour, a low-cost and excellent methodology for assessing fire effects on ecosystems. The aim of this work is to study the effects of the physical and chemical properties of ash on ash-extractable Total Phosphorus (TP). Samples were collected from three wildfires in Portugal (named (1) Quinta do Conde, (2) Quinta da Areia and (3) Casal do Sapo) in stands composed mainly of Quercus suber and Pinus pinaster trees. Ash colour was assessed using the Munsell colour chart. From all three plots we analysed a total of 102 ash samples and identified 5 different ash colours, ordered by increasing severity: Very Dark Brown, Black, Dark Grey, Very Dark Grey and Light Grey. To test for significant differences in extractable TP among ash colours, we applied a one-way ANOVA, considering differences significant at p<0.05. The results showed significant differences in extractable TP among the different ash colours. Hence, to identify specific differences between each pair of ash colours, we applied a post-hoc Fisher LSD test, significant at p<0.05. The results showed significant differences between the extractable TP of Very Dark Brown and Black ash, produced at lower severities, and that of Dark Grey, Very Dark Grey and Light Grey ash, generated at higher severities. The means of the first group were higher
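    The one-way ANOVA used here compares between-group to within-group variance. A minimal pure-Python sketch of the F statistic; the TP values below are made-up illustration numbers, not the study's measurements:

```python
# Minimal sketch of the one-way ANOVA F statistic used to compare
# extractable TP across ash-colour classes. The data are hypothetical
# illustration values, not the study's measurements.

def one_way_anova_f(groups):
    k = len(groups)                               # number of groups
    n = sum(len(g) for g in groups)               # total sample size
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

if __name__ == "__main__":
    # hypothetical TP values (arbitrary units) for three ash-colour classes
    low_severity = [1.0, 2.0, 3.0]
    mid_severity = [2.0, 3.0, 4.0]
    high_severity = [6.0, 7.0, 8.0]
    print(one_way_anova_f([low_severity, mid_severity, high_severity]))
```

A large F relative to the F(k-1, n-k) distribution is what justifies the subsequent pairwise Fisher LSD comparisons reported in the abstract.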

  19. Cannabis Use Has Negligible Effects Following Severe Traumatic Injury.

    Science.gov (United States)

    AbdelFattah, Kareem R; Edwards, Courtney R; Cripps, Michael W; Minshall, Christian T; Phelan, Herb A; Minei, Joseph P; Eastman, Alexander L

    Nearly half of all states have legalized medical marijuana or recreational-use marijuana. As more states move toward legalization, the effects on injured patients must be evaluated. This study sought to determine effects of cannabis positivity at the time of severe injury on hospital outcomes compared with individuals negative for illicit substances and those who were users of other illicit substances. A Level I trauma center performed a retrospective chart review covering subjects over a 2-year period with toxicology performed and an Injury Severity Score (ISS) of more than 16. These individuals were divided into the negative and positive toxicology groups, further divided into the marijuana-only, other drugs-only, and mixed-use groups. Differences in presenting characteristics, hospital length of stay, intensive care unit (ICU) stays, ventilator days, and death were compared. A total of 8,441 subjects presented during the study period; 2,134 (25%) of these had toxicology performed; 843 (40%) had an ISS of more than 16, with 347 having negative tests (NEG); 70 (8.3%) substance users tested positive only for marijuana (MO), 323 (38.3%) for other drugs-only, excluding marijuana (OD), and 103 (12.2%) subjects showed positivity for mixed-use (MU). The ISS was similar for all groups. No differences were identified in Glasgow Coma Scale (GCS), ventilator days, blood administration, or ICU/hospital length of stay when comparing the MO group with the NEG group. Significant differences occurred between the OD group and the NEG/MO/MU groups for GCS, ICU length of stay, and hospital charges. Cannabis users suffering from severe injury demonstrated no detrimental outcomes in this study compared with nondrug users.

  20. Individually dosed omalizumab: an effective treatment for severe peanut allergy.

    Science.gov (United States)

    Brandström, J; Vetander, M; Lilja, G; Johansson, S G O; Sundqvist, A-C; Kalm, F; Nilsson, C; Nopp, A

    2017-04-01

    Treatment with omalizumab has shown a positive effect on food allergies, but no dosages are established. Basophil allergen threshold sensitivity (CD-sens) can be used to objectively measure omalizumab treatment efficacy and correlates with the outcome of double-blind placebo-controlled food challenge to peanut. To evaluate whether individualized omalizumab treatment monitored by CD-sens could be an effective intervention for suppression of allergic reactions to peanut. Severely peanut-allergic adolescents (n = 23) were treated with omalizumab for 8 weeks, and CD-sens was analysed before and after. Based on whether CD-sens was suppressed after 8 weeks, the patients either were subject to a peanut challenge or received eight more weeks with an increased dose of omalizumab, followed by a peanut challenge or another 8-week cycle of omalizumab. IgE and IgE antibodies to peanut and its components were analysed before treatment. After individualized omalizumab treatment (8-24 weeks), all patients continued with an open peanut challenge with no (n = 18) or mild (n = 5) objective allergic symptoms. Patients (n = 15) needing an elevated omalizumab dose (ED) to suppress CD-sens had significantly higher CD-sens values at baseline, 1.49 (0.44-20.5), compared to those (n = 8) who managed with the normal dose (ND), 0.32 (0.24-5.5). Individually dosed omalizumab, monitored by CD-sens, is an effective and safe treatment for severe peanut allergy. The ratio of IgE-ab to the storage protein Ara h 2/IgE, as well as CD-sens to peanut, may predict the need for a higher omalizumab dose. Clinical trials numbers: EudraCT: 2012-005625-78; ClinicalTrials.gov: NCT02402231. © 2016 John Wiley & Sons Ltd.

  1. Effects of alcoholism severity and smoking on executive neurocognitive function.

    Science.gov (United States)

    Glass, Jennifer M; Buu, Anne; Adams, Kenneth M; Nigg, Joel T; Puttler, Leon I; Jester, Jennifer M; Zucker, Robert A

    2009-01-01

    Neurocognitive deficits in chronic alcoholic men are well documented. Impairments include memory, visual-spatial processing, problem solving and executive function. The cause of impairment could include direct effects of alcohol toxicity, pre-existing cognitive deficits that predispose towards substance abuse, comorbid psychiatric disorders and abuse of substances other than alcohol. Cigarette smoking occurs at higher rates in alcoholism and has been linked to poor cognitive performance, yet the effects of smoking on cognitive function in alcoholism are often ignored. We examined whether chronic alcoholism and chronic smoking have effects on executive function. Alcoholism and smoking were examined in a community-recruited sample of alcoholic and non-alcoholic men (n = 240) using standard neuropsychological and reaction-time measures of executive function. Alcoholism was measured as the average level of alcoholism diagnoses across the study duration (12 years). Smoking was measured in pack-years. Both alcoholism and smoking were correlated negatively with a composite executive function score. For component measures, alcoholism was correlated negatively with a broad range of measures, whereas smoking was correlated negatively with measures that emphasize response speed. In regression analyses, both smoking and alcoholism were significant predictors of executive function composite. However, when IQ is included in the regression analyses, alcoholism severity is no longer significant. Both smoking and alcoholism were related to executive function. However, the effect of alcoholism was not independent of IQ, suggesting a generalized effect, perhaps affecting a wide range of cognitive abilities of which executive function is a component. On the other hand, the effect of smoking on measures relying on response speed were independent of IQ, suggesting a more specific processing speed deficit associated with chronic smoking.

  2. A deterministic method for transient, three-dimensional neutron transport

    International Nuclear Information System (INIS)

    Goluoglu, S.; Bentley, C.; DeMeglio, R.; Dunn, M.; Norton, K.; Pevey, R.; Suslov, I.; Dodds, H.L.

    1998-01-01

    A deterministic method for solving the time-dependent, three-dimensional Boltzmann transport equation with explicit representation of delayed neutrons has been developed and evaluated. The methodology used in this study for the time variable of the neutron flux is known as the improved quasi-static (IQS) method. The position-, energy-, and angle-dependent neutron flux is computed deterministically using the three-dimensional discrete ordinates code TORT. This paper briefly describes the methodology and selected results. The code developed at the University of Tennessee based on this methodology is called TDTORT. TDTORT can be used to model transients involving voided and/or strongly absorbing regions that require transport theory for accuracy. This code can also be used to model either small high-leakage systems, such as space reactors, or asymmetric control rod movements. TDTORT can model step, ramp, step-followed-by-step, and step-followed-by-ramp perturbations. Columnwise rod movement can also be modeled; a special case of columnwise rod movement in a three-dimensional model of a boiling water reactor (BWR) with simple adiabatic feedback is also included. TDTORT is verified through several transient one-dimensional, two-dimensional, and three-dimensional benchmark problems. The results show that the transport methodology and corresponding code developed in this work have sufficient accuracy and speed for computing the dynamic behavior of complex multidimensional neutronic systems
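    The quasi-static family of methods factorizes the flux into a slowly varying shape and a fast scalar amplitude governed by the point-kinetics equations. A minimal one-delayed-group amplitude integration (the parameter values are illustrative, not from the paper):

```python
# The quasi-static factorization reduces the amplitude dynamics to the
# point-kinetics equations; a minimal one-delayed-group sketch with
# illustrative parameters (BETA, LAMBDA, LAM are not from the paper):
#   dn/dt = ((rho - beta) / Lambda) * n + lam * C
#   dC/dt = (beta / Lambda) * n - lam * C

BETA, LAMBDA, LAM = 0.0065, 1.0e-5, 0.08   # illustrative kinetics parameters

def point_kinetics(rho, t_end, dt=1.0e-4):
    n = 1.0                                # normalized amplitude
    c = BETA * n / (LAMBDA * LAM)          # equilibrium precursor level
    for _ in range(int(round(t_end / dt))):
        dn = ((rho - BETA) / LAMBDA) * n + LAM * c
        dc = (BETA / LAMBDA) * n - LAM * c
        n += dt * dn                       # explicit Euler step
        c += dt * dc
    return n

if __name__ == "__main__":
    print(point_kinetics(rho=0.0, t_end=1.0))        # stays ~1 at criticality
    print(point_kinetics(rho=0.1 * BETA, t_end=1.0)) # slow supercritical rise
```

In the IQS method proper, the shape is recomputed infrequently with the transport solver (here TORT) while the amplitude equation above is integrated on a fine time grid.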

  3. Deterministic global optimization an introduction to the diagonal approach

    CERN Document Server

    Sergeyev, Yaroslav D

    2017-01-01

    This book begins with a concentrated introduction into deterministic global optimization and moves forward to present new original results from the authors who are well known experts in the field. Multiextremal continuous problems that have an unknown structure with Lipschitz objective functions and functions having the first Lipschitz derivatives defined over hyperintervals are examined. A class of algorithms using several Lipschitz constants is introduced which has its origins in the DIRECT (DIviding RECTangles) method. This new class is based on an efficient strategy that is applied for the search domain partitioning. In addition a survey on derivative free methods and methods using the first derivatives is given for both one-dimensional and multi-dimensional cases. Non-smooth and smooth minorants and acceleration techniques that can speed up several classes of global optimization methods with examples of applications and problems arising in numerical testing of global optimization algorithms are discussed...
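    The core idea behind Lipschitz-based global optimization that the book builds on can be sketched in one dimension with a classic Piyavskii-Shubert-type sawtooth lower bound (this is not the diagonal method itself, just its univariate ancestor; the test function is a common benchmark choice):

```python
import math

# Univariate Lipschitz global optimization sketch (Piyavskii-Shubert type),
# illustrating the sawtooth lower bound that Lipschitz methods such as
# DIRECT and the diagonal approach generalize. Not the diagonal method itself.

def piyavskii(f, a, b, lipschitz, n_iter=200):
    pts = sorted([(a, f(a)), (b, f(b))])
    for _ in range(n_iter):
        # For each adjacent pair, the sawtooth lower bound attains its
        # minimum inside the interval; evaluate f at the most promising one.
        best_bound, best_x = math.inf, None
        for (x1, f1), (x2, f2) in zip(pts, pts[1:]):
            bound = 0.5 * (f1 + f2) - 0.5 * lipschitz * (x2 - x1)
            if bound < best_bound:
                best_bound = bound
                best_x = 0.5 * (x1 + x2) + (f1 - f2) / (2.0 * lipschitz)
        pts.append((best_x, f(best_x)))
        pts.sort()
    return min(fv for _, fv in pts)

if __name__ == "__main__":
    f = lambda x: math.sin(x) + math.sin(10.0 * x / 3.0)  # common test function
    print(piyavskii(f, 2.7, 7.5, lipschitz=4.5))
```

Using several Lipschitz constants at once, as in the DIRECT lineage the book surveys, removes the need to know a single valid constant in advance.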

  4. Comparison of probabilistic and deterministic fiber tracking of cranial nerves.

    Science.gov (United States)

    Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H

    2017-09-01

    PICo threshold increase is more effective for this task than the previously described deterministic tracking with a gradual FA threshold increase and might represent a method that is useful for depicting cranial nerves with DTI since it eliminates the erroneous fibers without manual intervention.

  5. Deterministic and probabilistic approach to safety analysis

    International Nuclear Information System (INIS)

    Heuser, F.W.

    1980-01-01

    The examples discussed in this paper show that reliability analysis methods can be applied fairly well to interpret deterministic safety criteria in quantitative terms. It has turned out that, for a further improved extension of applied reliability analysis, the influence of operational and control systems and of component protection devices should be considered in detail with the aid of reliability analysis methods. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)

  6. Diffusion in Deterministic Interacting Lattice Systems

    Science.gov (United States)

    Medenjak, Marko; Klobas, Katja; Prosen, Tomaž

    2017-09-01

    We study reversible deterministic dynamics of classical charged particles on a lattice with hard-core interaction. It is rigorously shown that the system exhibits three types of transport phenomena, ranging from ballistic, through diffusive, to insulating. By obtaining an exact expression for the current time-autocorrelation function we are able to calculate the linear response transport coefficients, such as the diffusion constant and the Drude weight. Additionally, we calculate the long-time charge profile after an inhomogeneous quench and obtain a diffusive profile with the Green-Kubo diffusion constant. Exact analytical results are corroborated by Monte Carlo simulations.
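    The discrete-time Green-Kubo relation used here sums the current (or velocity) autocorrelation function. A minimal numerical sketch on a simple stochastic hop process, where the answer is known exactly (illustrative only; not the paper's reversible cellular automaton):

```python
import numpy as np

# Illustrative only (a stochastic walk, not the paper's deterministic
# lattice dynamics): the discrete-time Green-Kubo formula recovers the
# diffusion constant from the velocity autocorrelation,
#   D = C(0)/2 + sum_{t>=1} C(t),   C(t) = <v(0) v(t)>,
# which for uncorrelated +/-1 hops gives D = 1/2.

def green_kubo_diffusion(n_walkers=20000, n_steps=200, n_lags=50, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.choice([-1.0, 1.0], size=(n_walkers, n_steps))  # hop "velocities"
    corr = [np.mean(v[:, :n_steps - t] * v[:, t:]) for t in range(n_lags)]
    return 0.5 * corr[0] + sum(corr[1:])

if __name__ == "__main__":
    print(green_kubo_diffusion())
```

In the interacting deterministic model, the exact autocorrelation replaces the empirical `corr`, and a nonvanishing long-time tail (Drude weight) signals the ballistic regime instead.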

  7. Effect of solar radiation on severity of soybean rust.

    Science.gov (United States)

    Young, Heather M; George, Sheeja; Narváez, Dario F; Srivastava, Pratibha; Schuerger, Andrew C; Wright, David L; Marois, James J

    2012-08-01

    Soybean rust (SBR), caused by Phakopsora pachyrhizi, is a damaging fungal disease of soybean (Glycine max). Although solar radiation can reduce SBR urediniospore survival, limited information is available on how solar radiation affects SBR progress within soybean canopies. Such information can aid in developing accurate SBR prediction models. To manipulate light penetration into soybean canopies, structures of shade cloth attenuating 30, 40, and 60% sunlight were constructed over soybean plots. In each plot, weekly evaluations of severity in the lower, middle, and upper canopies, and daily temperature and relative humidity were recorded. Final plant height and leaf area index were also recorded for each plot. The correlation between the amount of epicuticular wax and susceptibility of leaves in the lower, middle, and upper canopies was assessed with a detached leaf assay. Final disease severity was 46 to 150% greater in the lower canopy of all plots and in the middle canopy of 40 and 60% shaded plots. While daytime temperature within the canopy of nonshaded soybean was greater than in shaded soybean by 2 to 3°C, temperatures recorded throughout typical evenings and mornings of the growing season in all treatments were within the range (10 to 28.5°C) for SBR development, as was relative humidity. This indicates temperature and relative humidity were not limiting factors in this experiment. Epicuticular wax and disease severity in detached leaf assays from the upper canopy had a significant negative correlation (P = 0.009, R = -0.84) regardless of shade treatment. In laboratory experiments, increasing simulated total solar radiation (UVA, UVB, and PAR) from 0.15 to 11.66 MJ m⁻² increased mortality of urediniospores from 2 to 91%. Variability in disease development across canopy heights in early planted soybean may be attributed to the effects of solar radiation not only on urediniospore viability, but also on plant height, leaf area index, and epicuticular wax, which influence

  8. The effects of severe mixed environmental pollution on human chromosomes.

    Science.gov (United States)

    Katsantoni, A; Nakou, S; Antoniadou-Koumatou, I; Côté, G B

    1986-01-01

    Cytogenetic studies were conducted on healthy young mothers, shortly after childbirth, in two residential areas each with an approximate population of 20,000, situated about 25 km from Athens, Greece. One of the areas, Elefsis, is subject to severe mixed industrial pollution, and the other, Koropi, is relatively free of pollution. Chromosomal aberrations were investigated in 16 women from each area in 72-hour lymphocyte cultures treated with gentian violet to enhance any chromosomal instability induced by the pollution. The women were of a comparable socioeconomic level, aged between 20 and 31 years, and with no history of factors associated with mutagenesis. Venous blood samples were taken from the two groups and processed concurrently. The slides were coded and examined independently by two observers, who were unaware of the source of the samples. A total of 100 cells was examined in each sample. The two observers obtained highly comparable results. Women from Elefsis had an average of 0.42 anomalies per cell and those from Koropi had 0.39. The absence of a statistically significant difference between the two groups clearly shows that the severe mixed environmental pollution of Elefsis has no significant visible effect on human chromosomes in most residents. However, two Elefsis women had abnormal results and could be at risk. Their presence is not sufficient to raise their group's average significantly, but the induction by pollution of an increased rate of chromosomal anomalies in only a few people at risk could account for the known association between urban residence and cancer mortality. PMID:3783622

  9. Streamflow disaggregation: a nonlinear deterministic approach

    Directory of Open Access Journals (Sweden)

    B. Sivakumar

    2004-01-01

    Full Text Available This study introduces a nonlinear deterministic approach for streamflow disaggregation. According to this approach, the streamflow transformation process from one scale to another is treated as a nonlinear deterministic process, rather than a stochastic process as generally assumed. The approach follows two important steps: (1) reconstruction of the scalar (streamflow) series in a multi-dimensional phase space for representing the transformation dynamics; and (2) use of a local approximation (nearest neighbor) method for disaggregation. The approach is employed for streamflow disaggregation in the Mississippi River basin, USA. Data of successively doubled resolutions between daily and 16 days (i.e. daily, 2-day, 4-day, 8-day, and 16-day) are studied, and disaggregations are attempted only between successive resolutions (i.e. 2-day to daily, 4-day to 2-day, 8-day to 4-day, and 16-day to 8-day). Comparisons between the disaggregated values and the actual values reveal excellent agreements for all the cases studied, indicating the suitability of the approach for streamflow disaggregation. A further insight into the results reveals that the best results are, in general, achieved for low embedding dimensions (2 or 3) and small numbers of neighbors (less than 50), suggesting the possible presence of nonlinear determinism in the underlying transformation process. A decrease in accuracy with increasing disaggregation scale is also observed, a possible implication of the existence of a scaling regime in streamflow.
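
    The two steps of the approach — delay-coordinate phase-space reconstruction and local (nearest-neighbor) approximation — can be sketched as follows. This is a minimal illustration on a synthetic chaotic series (a logistic map), not the authors' code; the embedding dimension, neighbor count, and test series are illustrative assumptions.

```python
import numpy as np

def delay_embed(x, m):
    # Step 1: reconstruct the phase space with m-dimensional
    # delay vectors (x[i], x[i+1], ..., x[i+m-1])
    return np.array([x[i:i + m] for i in range(len(x) - m + 1)])

def local_predict(train, history, m=2, k=4):
    # Step 2: local approximation -- find the k nearest delay
    # vectors in the training series and average their successors
    vecs = delay_embed(train, m)[:-1]   # vectors with a known successor
    succ = train[m:]                    # successor of each vector
    dist = np.linalg.norm(vecs - history, axis=1)
    nearest = np.argsort(dist)[:k]
    return succ[nearest].mean()

# Synthetic chaotic series: logistic map x -> 4 x (1 - x)
x = np.empty(2000)
x[0] = 0.3
for i in range(1999):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])

train = x[:1990]
pred = local_predict(train, x[1988:1990], m=2, k=4)
err = abs(pred - x[1990])   # small, since the dynamics are deterministic
```

    The same machinery applies to disaggregation by letting the "successor" be the fine-scale values associated with each coarse-scale delay vector.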

  10. A mathematical theory for deterministic quantum mechanics

    Energy Technology Data Exchange (ETDEWEB)

    't Hooft, Gerard [Institute for Theoretical Physics, Utrecht University (Netherlands); Spinoza Institute, Postbox 80.195, 3508 TD Utrecht (Netherlands)

    2007-05-15

    Classical, i.e. deterministic theories underlying quantum mechanics are considered, and it is shown how an apparent quantum mechanical Hamiltonian can be defined in such theories, being the operator that generates evolution in time. It includes various types of interactions. An explanation must be found for the fact that, in the real world, this Hamiltonian is bounded from below. The mechanism that can produce exactly such a constraint is identified in this paper. It is the fact that not all classical data are registered in the quantum description. Large sets of values of these data are assumed to be indistinguishable, forming equivalence classes. It is argued that this should be attributed to information loss, such as what one might suspect to happen during the formation and annihilation of virtual black holes. The nature of the equivalence classes follows from the positivity of the Hamiltonian. Our world is assumed to consist of a very large number of subsystems that may be regarded as approximately independent, or weakly interacting with one another. As long as two (or more) sectors of our world are treated as being independent, they all must be demanded to be restricted to positive energy states only. What follows from these considerations is a unique definition of energy in the quantum system in terms of the periodicity of the limit cycles of the deterministic model.

  11. Design of deterministic OS for SPLC

    International Nuclear Information System (INIS)

    Son, Choul Woong; Kim, Dong Hoon; Son, Gwang Seop

    2012-01-01

    Existing safety PLCs for use in nuclear power plants operate on priority-based scheduling, in which the highest-priority task runs first. This type of scheduling scheme determines processing priorities when there are multiple requests for processing or a lack of resources available for processing, guaranteeing execution of higher-priority tasks. Such scheduling is prone to exhaustion of resources and continuous preemption by devices with high priorities, and therefore there is uncertainty in every period as to the smooth running of the overall system. Hence, it is difficult to apply this type of scheme where deterministic operation is required, such as in a nuclear power plant. Also, existing PLCs either have no output logic for the redundant selection of devices or have it set in a fixed way, and as a result they are extremely inefficient for redundant systems such as that of a nuclear power plant and their use is limited. Therefore, functional modules that can manage and control all devices need to be developed by improving the way priorities are assigned among the devices, making it more flexible. A management module should be able to schedule all devices of the system, manage resources, analyze the states of the devices, give warnings in abnormal situations such as device failure or resource scarcity, and decide how to handle them. Also, the management module should have output logic for device redundancy, as well as deterministic processing capabilities, such as with regard to device interrupt events.
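
    The contrast drawn here — priority-based preemption that can starve low-priority tasks versus a deterministic schedule that guarantees every task a slot — can be illustrated with a toy simulation. The task set and tick counts are invented for illustration; this is not the SPLC design itself.

```python
def run_priority(tasks, ticks):
    # Fixed-priority scheduling: each tick, the highest-priority
    # ready task runs (tasks are ordered highest priority first)
    counts = {name: 0 for name, _ in tasks}
    for t in range(ticks):
        for name, is_ready in tasks:
            if is_ready(t):
                counts[name] += 1
                break
    return counts

def run_cyclic(tasks, ticks):
    # Deterministic cyclic executive: each task owns a fixed slot
    # in the frame, so CPU time is guaranteed regardless of load
    counts = {name: 0 for name, _ in tasks}
    for t in range(ticks):
        name, is_ready = tasks[t % len(tasks)]
        if is_ready(t):
            counts[name] += 1
    return counts

always = lambda t: True
tasks = [("high", always), ("low", always)]  # "high" is never idle
starved = run_priority(tasks, 100)  # low-priority task never runs
fair = run_cyclic(tasks, 100)       # both tasks get half the ticks
```

    Under continuous load the priority scheduler gives the low-priority task zero ticks, while the cyclic executive splits them evenly — the determinism the abstract argues for.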

  12. Deterministic prediction of surface wind speed variations

    Directory of Open Access Journals (Sweden)

    G. V. Drisya

    2014-11-01

    Full Text Available Accurate prediction of wind speed is an important aspect of various tasks related to wind energy management such as wind turbine predictive control and wind power scheduling. The most typical characteristic of wind speed data is its persistent temporal variations. Most of the techniques reported in the literature for prediction of wind speed and power are based on statistical methods or probabilistic distribution of wind speed data. In this paper we demonstrate that deterministic forecasting methods can make accurate short-term predictions of wind speed using past data, at locations where the wind dynamics exhibit chaotic behaviour. The predictions are remarkably accurate up to 1 h with a normalised RMSE (root mean square error of less than 0.02 and reasonably accurate up to 3 h with an error of less than 0.06. Repeated application of these methods at 234 different geographical locations for predicting wind speeds at 30-day intervals for 3 years reveals that the accuracy of prediction is more or less the same across all locations and time periods. Comparison of the results with f-ARIMA model predictions shows that the deterministic models with suitable parameters are capable of returning improved prediction accuracy and capturing the dynamical variations of the actual time series more faithfully. These methods are simple and computationally efficient and require only records of past data for making short-term wind speed forecasts within practically tolerable margin of errors.
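
    For reference, the normalised RMSE quoted above can be computed as follows. Normalising by the range of the observed series is one common convention; the abstract does not state which normalisation the authors used, so this is an assumption.

```python
import numpy as np

def nrmse(actual, predicted):
    # RMSE normalised by the range of the observed series
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return rmse / (actual.max() - actual.min())

# A forecast within about 1% of a 0-10 m/s series
# gives an nRMSE of order 0.01
print(nrmse([0.0, 5.0, 10.0], [0.1, 5.0, 9.9]))
```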

  13. Use of deterministic methods in survey calculations for criticality problems

    International Nuclear Information System (INIS)

    Hutton, J.L.; Phenix, J.; Course, A.F.

    1991-01-01

    A code package using deterministic methods for solving the Boltzmann transport equation is the WIMS suite. This has been very successful for a range of situations. In particular, it has been used with great success to analyse trends in reactivity with a range of changes in state. The WIMS suite of codes offers a range of methods and is very flexible in the way they can be combined. A wide variety of situations can be modelled, ranging through all the current thermal reactor variants to storage systems and items of chemical plant. These methods have recently been enhanced by the introduction of the CACTUS method. This is based on a characteristics technique for solving the transport equation and has the advantage that complex geometrical situations can be treated. In this paper the basis of the method is outlined and examples of its use are illustrated. In parallel with these developments, the validation for out-of-pile situations has been extended to include experiments with relevance to criticality situations. The paper summarises this evidence and shows how these results point to a partial re-adoption of deterministic methods for some areas of criticality. The paper also presents results to illustrate the use of WIMS in criticality situations and, in particular, shows how it can complement codes such as MONK when used for surveying the reactivity effect due to changes in geometry or materials. (Author)

  14. Correction of the deterministic part of space–charge interaction in momentum microscopy of charged particles

    Energy Technology Data Exchange (ETDEWEB)

    Schönhense, G., E-mail: schoenhense@uni-mainz.de [Institut für Physik, Johannes Gutenberg-Universität, 55128 Mainz (Germany); Medjanik, K. [Institut für Physik, Johannes Gutenberg-Universität, 55128 Mainz (Germany); Tusche, C. [Max-Planck-Institut für Mikrostrukturphysik, 06120 Halle (Germany); Loos, M. de; Geer, B. van der [Pulsar Physics, Burghstraat 47, 5614 BC Eindhoven (Netherlands); Scholz, M.; Hieke, F.; Gerken, N. [Physics Department and Center for Free-Electron Laser Science, Univ. Hamburg, 22761 Hamburg (Germany); Kirschner, J. [Max-Planck-Institut für Mikrostrukturphysik, 06120 Halle (Germany); Wurth, W. [Physics Department and Center for Free-Electron Laser Science, Univ. Hamburg, 22761 Hamburg (Germany); DESY Photon Science, 22607 Hamburg (Germany)

    2015-12-15

    Ultrahigh spectral brightness femtosecond XUV and X-ray sources like free electron lasers (FEL) and table-top high harmonics sources (HHG) offer fascinating experimental possibilities for analysis of transient states and ultrafast electron dynamics. For electron spectroscopy experiments using illumination from such sources, the ultrashort high-charge electron bunches experience strong space–charge interactions. The Coulomb interactions between emitted electrons result in large energy shifts and severe broadening of photoemission signals. We propose a method for a substantial reduction of the effect by exploiting the deterministic nature of space–charge interaction. The interaction of a given electron with the average charge density of all surrounding electrons leads to a rotation of the electron distribution in 6D phase space. Momentum microscopy gives direct access to the three momentum coordinates, opening a path for a correction of an essential part of space–charge interaction. In a first experiment with a time-of-flight momentum microscope using synchrotron radiation at BESSY, the rotation in phase space became directly visible. In a separate experiment conducted at FLASH (DESY), the energy shift and broadening of the photoemission signals were quantified. Finally, simulations of a realistic photoemission experiment including space–charge interaction reveal that a gain of an order of magnitude in resolution is possible using the correction technique presented here. - Highlights: • Photoemission spectromicroscopy with high-brightness pulsed sources is examined. • Deterministic interaction of an electron with the average charge density can be corrected. • Requires a cathode-lens type microscope optimized for best k-resolution in the reciprocal plane. • Extractor field effectively separates the pencil beam of secondary electrons from the true signal. • Simulations reveal one order of magnitude gain in resolution.

  15. Inferring hierarchical clustering structures by deterministic annealing

    International Nuclear Information System (INIS)

    Hofmann, T.; Buhmann, J.M.

    1996-01-01

    The unsupervised detection of hierarchical structures is a major topic in unsupervised learning and one of the key questions in data analysis and representation. We propose a novel algorithm for the problem of learning decision trees for data clustering and related problems. In contrast to many other methods based on successive tree growing and pruning, we propose an objective function for tree evaluation and derive a non-greedy technique for tree growing. Applying the principles of maximum entropy and minimum cross entropy, a deterministic annealing algorithm is derived in a mean-field approximation. This technique allows us to canonically superimpose tree structures and to fit parameters to averaged or 'fuzzified' trees.
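
    A minimal sketch of the mean-field deterministic annealing idea for flat (non-hierarchical) clustering, in the spirit of the maximum-entropy formulation described above: soft Gibbs assignments at temperature T, centroids updated as probability-weighted means, and gradual cooling. All parameter values and the test data are illustrative assumptions, not the authors' tree-growing algorithm.

```python
import numpy as np

def deterministic_annealing(X, k, T0=10.0, Tmin=0.05, cool=0.8, sweeps=25, seed=0):
    # Mean-field annealing: at temperature T, point i is assigned to
    # cluster c with Gibbs probability p(c|i) ~ exp(-||x_i - y_c||^2 / T);
    # centroids y_c are probability-weighted means. Cooling T gradually
    # hardens the initially uniform soft assignments.
    rng = np.random.default_rng(seed)
    Y = X.mean(axis=0) + 1e-3 * rng.standard_normal((k, X.shape[1]))
    T = T0
    while T > Tmin:
        for _ in range(sweeps):
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
            P = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / T)
            P /= P.sum(axis=1, keepdims=True)
            Y = (P[:, :, None] * X[:, None, :]).sum(axis=0) / P.sum(axis=0)[:, None]
        T *= cool
    return Y, P.argmax(axis=1)

# Two well-separated 2-D blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)),
               rng.normal(10.0, 0.5, (50, 2))])
centroids, labels = deterministic_annealing(X, k=2)
```

    At high T both centroids sit near the global mean; as T drops below the critical temperature the symmetry breaks and they separate onto the two blobs, which is the annealing effect exploited for non-greedy structure search.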

  16. Mechanics from Newton's laws to deterministic chaos

    CERN Document Server

    Scheck, Florian

    2018-01-01

    This book covers all topics in mechanics from elementary Newtonian mechanics, the principles of canonical mechanics and rigid body mechanics to relativistic mechanics and nonlinear dynamics. It was among the first textbooks to include dynamical systems and deterministic chaos in due detail. As compared to the previous editions the present 6th edition is updated and revised with more explanations, additional examples and problems with solutions, together with new sections on applications in science.   Symmetries and invariance principles, the basic geometric aspects of mechanics as well as elements of continuum mechanics also play an important role. The book will enable the reader to develop general principles from which equations of motion follow, to understand the importance of canonical mechanics and of symmetries as a basis for quantum mechanics, and to get practice in using general theoretical concepts and tools that are essential for all branches of physics.   The book contains more than 150 problems ...

  17. Deterministic Diffusion in Delayed Coupled Maps

    International Nuclear Information System (INIS)

    Sozanski, M.

    2005-01-01

    Coupled Map Lattices (CML) are discrete time and discrete space dynamical systems used for modeling phenomena arising in nonlinear systems with many degrees of freedom. In this work, the dynamical and statistical properties of a modified version of the CML with global coupling are considered. The main modification of the model is the extension of the coupling over a set of local map states corresponding to different time iterations. The model with both stochastic and chaotic one-dimensional local maps is studied. Deterministic diffusion in the CML under variation of a control parameter is analyzed for unimodal maps. As a main result, simple relations between statistical and dynamical measures are found for the model and the cases where substituting nonlinear lattices with simpler processes is possible are presented. (author)

  18. Deterministic Chaos in Radon Time Variation

    International Nuclear Information System (INIS)

    Planinic, J.; Vukovic, B.; Radolic, V.; Faj, Z.; Stanic, D.

    2003-01-01

    Radon concentrations were continuously measured outdoors, in the living room and in the basement at 10-minute intervals for a month. The radon time series were analyzed by comparing algorithms to extract phase-space dynamical information. The application of fractal methods enabled exploration of the chaotic nature of radon in the atmosphere. The computed fractal dimensions, such as the Hurst exponent (H) from the rescaled range analysis, the Lyapunov exponent (λ) and the attractor dimension, provided estimates of the degree of chaotic behavior. The obtained low values of the Hurst exponent (0 < H < 0.5) indicated anti-persistent behavior (non-random changes) of the time series, but the positive values of λ pointed out the great sensitivity to initial conditions and the deterministic chaos appearing in radon time variations. The calculated fractal dimensions of attractors indicated more influencing (meteorological) parameters on radon in the atmosphere. (author)
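
    The rescaled range (R/S) analysis used to estimate the Hurst exponent can be sketched as follows: for each window size, take the range of the mean-adjusted cumulative sum divided by the standard deviation, then fit the slope of log(R/S) against log(window size). This is a generic textbook version on synthetic data, not the authors' implementation.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    # Rescaled-range (R/S) estimate of the Hurst exponent H:
    # slope of log(mean R/S) versus log(window size)
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            w = x[start:start + size]
            dev = np.cumsum(w - w.mean())      # mean-adjusted cumulative sum
            r = dev.max() - dev.min()          # range R
            s = w.std()                        # standard deviation S
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_vals.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)
h_noise = hurst_rs(noise)            # near 0.5 (small samples run a bit high)
h_walk = hurst_rs(np.cumsum(noise))  # near 1.0 for a persistent random walk
```

    Values below 0.5, as reported for the radon series, indicate anti-persistence; values above 0.5 indicate persistence.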

  19. Radon time variations and deterministic chaos

    Energy Technology Data Exchange (ETDEWEB)

    Planinic, J. E-mail: planinic@pedos.hr; Vukovic, B.; Radolic, V

    2004-07-01

    Radon concentrations were continuously measured outdoors, in the living room and in the basement at 10 min intervals for a month. Radon time series were analyzed by comparing algorithms to extract phase space dynamical information. The application of fractal methods enabled exploration of the chaotic nature of radon in atmosphere. The computed fractal dimensions, such as the Hurst exponent (H) from the rescaled range analysis, Lyapunov exponent ({lambda}) and attractor dimension, provided estimates of the degree of chaotic behavior. The obtained low values of the Hurst exponent (0 < H < 0.5) indicated anti-persistent behavior (non-random changes) of the time series, but the positive values of {lambda} pointed out the great sensitivity to initial conditions and the deterministic chaos that appeared due to radon time variations. The calculated fractal dimensions of attractors indicated more influencing (meteorological) parameters on radon in the atmosphere.

  20. Radon time variations and deterministic chaos

    International Nuclear Information System (INIS)

    Planinic, J.; Vukovic, B.; Radolic, V.

    2004-01-01

    Radon concentrations were continuously measured outdoors, in the living room and in the basement at 10 min intervals for a month. Radon time series were analyzed by comparing algorithms to extract phase space dynamical information. The application of fractal methods enabled exploration of the chaotic nature of radon in atmosphere. The computed fractal dimensions, such as the Hurst exponent (H) from the rescaled range analysis, Lyapunov exponent (λ) and attractor dimension, provided estimates of the degree of chaotic behavior. The obtained low values of the Hurst exponent (0 < H < 0.5) indicated anti-persistent behavior (non-random changes) of the time series, but the positive values of λ pointed out the great sensitivity to initial conditions and the deterministic chaos that appeared due to radon time variations. The calculated fractal dimensions of attractors indicated more influencing (meteorological) parameters on radon in the atmosphere.

  1. Deterministic SLIR model for tuberculosis disease mapping

    Science.gov (United States)

    Aziz, Nazrina; Diah, Ijlal Mohd; Ahmad, Nazihah; Kasim, Maznah Mat

    2017-11-01

    Tuberculosis (TB) occurs worldwide. It can be transmitted directly to others through the air when persons with active TB sneeze, cough or spit. In Malaysia, TB has been recognized as one of the most infectious diseases that lead to death. Disease mapping is one of the methods that can be used in prevention strategies, since it displays a clear picture of high- and low-risk areas. An important consideration when studying disease occurrence is relative risk estimation. The transmission of TB is studied through a mathematical model. Therefore, in this study, deterministic SLIR models are used to estimate the relative risk of TB transmission.

  2. Primality deterministic and primality probabilistic tests

    Directory of Open Access Journals (Sweden)

    Alfredo Rizzi

    2007-10-01

    Full Text Available In this paper the author comments on the importance of prime numbers in mathematics and in cryptography. He recalls the very important research of Euler, Fermat, Legendre, Riemann and other scholars. There are many expressions that yield prime numbers; among them, Mersenne primes have interesting properties. There are also many conjectures that still have to be proved or rejected. Deterministic primality tests are algorithms that establish whether a number is prime or not. They are not applicable in many practical situations, for instance in public-key cryptography, because the computing time would be very long. Probabilistic primality tests allow one to test the null hypothesis that the number is prime. The paper comments on the most important statistical tests.
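
    The distinction drawn here — exact but slow deterministic tests versus fast probabilistic ones — can be illustrated with trial division and the Miller-Rabin test. Miller-Rabin is a standard probabilistic primality test used as an example; the specific tests discussed in the paper may differ.

```python
import random

def is_prime_trial(n):
    # Deterministic test: trial division by every candidate divisor up
    # to sqrt(n). Always correct, but far too slow for the large numbers
    # used in public-key cryptography.
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def is_probable_prime(n, rounds=20, rng=random):
    # Probabilistic Miller-Rabin test: a composite n is exposed by a
    # random base with probability >= 3/4, so the error probability
    # after `rounds` independent bases is at most 4**-rounds.
    if n < 2:
        return False
    for p in (2, 3, 5, 7):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

mersenne = 2**31 - 1        # 2147483647, a Mersenne prime
square = 1000003 * 1000003  # composite: square of the prime 1000003
```

    Trial division would need about 2**15 divisions to confirm the Mersenne prime above; Miller-Rabin needs only a handful of modular exponentiations, which is why probabilistic tests dominate in cryptographic practice.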

  3. A continuous variable quantum deterministic key distribution based on two-mode squeezed states

    International Nuclear Information System (INIS)

    Gong, Li-Hua; Song, Han-Chong; Liu, Ye; Zhou, Nan-Run; He, Chao-Sheng

    2014-01-01

    The distribution of deterministic keys is of significance in personal communications, but the existing continuous variable quantum key distribution protocols can only generate random keys. By exploiting the entanglement properties of two-mode squeezed states, a continuous variable quantum deterministic key distribution (CVQDKD) scheme is presented for handing over the pre-determined key to the intended receiver. The security of the CVQDKD scheme is analyzed in detail from the perspective of information theory. It shows that the scheme can securely and effectively transfer pre-determined keys under ideal conditions. The proposed scheme can resist both the entanglement and beam splitter attacks under a relatively high channel transmission efficiency. (paper)

  4. Bactericidal effect of accelerated electrons on several pathogens

    International Nuclear Information System (INIS)

    Butaev, M.K.; Bulkhanov, R.U.; Ryasnyanskii, I.V.; Mirzaev, B.Sh.; Safarov, A.N.; Suleymanov, R.D.

    2006-01-01

    Full text: Applied radiobiology assumes the use of results of fundamental studies of radiobiological phenomena, which can serve as the basis for the development of certain technological processes. Radiation biotechnology (RBT) arises from applied radiobiology and is its natural sequel. RBT includes the development of methods and techniques to use the energy of ionizing radiation in various areas of human activity, including scientific research aimed at the production of biological medications useful for veterinary science [1]. It is known that in order to obtain vaccines for veterinary use it is necessary to apply physical (in particular thermal) and chemical treatments. It was noted that in these processes the antigenic structure of bacterial cells, which is responsible for the creation of immunity, is destroyed. The use of ionizing radiation as one of the factors of influence on bacteria makes it possible to reduce virulence to a minimum while preserving the immunogenic properties of the microorganism and not destroying its antigenic structure. The issue of simultaneous vaccination of the young of farm animals against several infectious diseases is important not only in theory but also in practice, since it determines the choice of strategy to increase the efficiency of preventive measures. Taking into account frequent cases of infection of young animals by pasteurellosis, salmonellosis and colibacillosis in mono- and mixed forms, the radiation biotechnology to construct radio vaccines against infectious diseases was developed in the laboratory of radiobiology of the Research Institute of Veterinary of Uzbekistan (RIVU) [2]. Radiation biotechnology allows the production of highly effective mono-associated and polyvalent radio vaccines against the most widespread diseases of farm animals, especially the young. Using the mentioned radiation biotechnology, an 'Associated radio vaccine against colibacillosis and salmonellosis of small cattle' and an 'Associated radio vaccine against

  5. Effects of muscle injury severity on localized bioimpedance measurements

    International Nuclear Information System (INIS)

    Nescolarde, L; Rosell-Ferrer, J; Yanguas, J; Lukaski, H; Alomar, X; Rodas, G

    2015-01-01

    Muscle injuries in the lower limb are common among professional football players. Classification is made according to severity and is diagnosed with radiological assessment as: grade I (minor strain or minor injury), grade II (partial rupture, moderate injury) and grade III (complete rupture, severe injury). Tetrapolar localized bioimpedance analysis (BIA) at 50 kHz made with a phase-sensitive analyzer was used to assess damage to the integrity of muscle structures and the fluid accumulation 24 h after injury in 21 injuries in the quadriceps, hamstring and calf, diagnosed with magnetic resonance imaging (MRI). The aim of this study was to identify the pattern of change in BIA variables as indicators of fluid [resistance (R)] and cell structure integrity [reactance (Xc) and phase angle (PA)] according to the severity of the MRI-defined injury. The % differences of R, Xc and PA compared to the non-injured contralateral muscle, also measured 24 h after injury, were respectively: grade I (n = 11; −10.4, −17.5 and −9.0%), grade II (n = 8; −18.4, −32.9 and −16.6%) and grade III (n = 2; −14.1, −52.9 and −43.1%), showing a greater significant decrease in Xc (p < 0.001). The greatest relative changes were in grade III injuries. However, decreases in R, which indicate fluid distribution, were not proportional to the severity of the injury. Disruption of the muscle structure, demonstrated by the localized determination of Xc, increased with the severity of muscle injury. The most significant change 24 h after injury was the sizeable decrease in Xc, indicating a pattern of disrupted soft tissue structure proportional to the severity of the injury. (paper)
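
    As a worked example of the reported % differences, assuming the usual convention of expressing the injured side relative to the contralateral side (the abstract does not spell the formula out), with raw impedance values invented to reproduce two of the grade I figures:

```python
def pct_diff(injured, contralateral):
    # % change of the injured side relative to the non-injured
    # contralateral muscle (assumed convention)
    return 100.0 * (injured - contralateral) / contralateral

# Hypothetical raw values chosen to match the grade I pattern
r_change = pct_diff(44.8, 50.0)    # resistance R: fluid accumulation
xc_change = pct_diff(4.125, 5.0)   # reactance Xc: structural disruption
```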

  6. Effective Literacy Instruction for Students with Moderate or Severe Disabilities

    Science.gov (United States)

    Copeland, Susan R.; Keefe, Elizabeth B.

    2007-01-01

    For students with moderate or severe disabilities, developing literacy skills is a critical component of successful communication, employment, and community participation. Finally, educators have a practical, concise guidebook for helping these students meet NCLB's academic standards for literacy. Appropriate for use in all settings, including…

  7. Effect of asthma severity on symptom perception in childhood asthma

    Directory of Open Access Journals (Sweden)

    A.L.B. Cabral

    2002-03-01

    Full Text Available Individual ability to perceive airway obstruction varies substantially. The factors influencing the perception of asthma are probably numerous and not well established in children. The present study was designed to examine the influence of asthma severity, use of preventive medication, age and gender on the association between respiratory symptoms (RS) and peak expiratory flow (PEF) rates in asthmatic children. We followed 92 asthmatic children, aged 6 to 16 years, for five months. Symptom scores were recorded daily and PEF was measured twice a day. The correlations among variables at the within-person level over time were analyzed for each child and for the pooled data by multivariate analysis. After pooling the data, there was a significant (P<0.05) correlation between each symptom and PEF; 60% of the children were accurate perceivers (defined by a statistically significant correlation between symptoms and PEF across time) for diurnal symptoms and 37% for nocturnal symptoms. The accuracy of perception was independent of asthma severity, age, gender or the use of preventive medication. Symptom perception is inaccurate in a substantial number of asthmatic children, independently of clinical severity, age, gender or use of preventive medication. It is not clear why some asthmatic patients are capable of accurately perceiving the severity of airway obstruction while others are not.

  8. CSL model checking of deterministic and stochastic Petri nets

    NARCIS (Netherlands)

    Martinez Verdugo, J.M.; Haverkort, Boudewijn R.H.M.; German, R.; Heindl, A.

    2006-01-01

    Deterministic and Stochastic Petri Nets (DSPNs) are a widely used high-level formalism for modeling discrete-event systems where events may occur either without consuming time, after a deterministic time, or after an exponentially distributed time. The underlying process defined by DSPNs, under

  9. Recognition of deterministic ETOL languages in logarithmic space

    DEFF Research Database (Denmark)

    Jones, Neil D.; Skyum, Sven

    1977-01-01

    It is shown that if G is a deterministic ETOL system, there is a nondeterministic log space algorithm to determine membership in L(G). Consequently, every deterministic ETOL language is recognizable in polynomial time. As a corollary, all context-free languages of finite index, and all Indian...

  10. Effect of Posttraumatic Serum Thyroid Hormone Levels on Severity and Mortality of Patients with Severe Traumatic Brain Injury

    Directory of Open Access Journals (Sweden)

    Forough Saki

    2012-02-01

    Full Text Available Traumatic brain injury (TBI) is an important cause of death and disability in young adults, and may lead to physical disabilities and long-term cognitive, behavioral, psychological and social defects. There is no definite result about the effect of thyroid hormones on the severity of traumatic brain injury, and no data about their effect on mortality. The aim of this study is to evaluate the effect of thyroid hormone levels after traumatic brain injury on severity and mortality, and to gain a clue to brain injury prognosis. In a longitudinal prospective study from February 2010 until February 2011, we measured serum levels of T3, T4, TSH and TBG in severely brain-injured patients and compared them with the primary Glasgow Coma Scale (GCS) score and mortality of patients. Statistical analysis used SPSS 11.5 software with the chi-square and Fisher exact tests. Serum levels of T3 and T4 were decreased after brain trauma, but not TSH and TBG. Mortality rates were higher in patients with lower T4 serum levels. Head injury was more severe in those with low T3 and T4. Following a severe brain injury, secondary hypothyroidism occurs due to pituitary dysfunction. Also, serum levels of T3 and T4 on the first day of admission affect the primary GCS score of patients, which is an indicator of the severity of brain injury. In addition, mortality rates of severely brain-injured patients correlate highly with the serum level of T4 on the first day of admission.

  11. Experimental aspects of deterministic secure quantum key distribution

    Energy Technology Data Exchange (ETDEWEB)

    Walenta, Nino; Korn, Dietmar; Puhlmann, Dirk; Felbinger, Timo; Hoffmann, Holger; Ostermeyer, Martin [Universitaet Potsdam (Germany). Institut fuer Physik; Bostroem, Kim [Universitaet Muenster (Germany)

    2008-07-01

    Most common protocols for quantum key distribution (QKD) use non-deterministic algorithms to establish a shared key. But deterministic implementations can allow for higher net key transfer rates and eavesdropping detection rates. The Ping-Pong coding scheme by Bostroem and Felbinger [1] employs deterministic information encoding in entangled states, with its characteristic quantum channel from Bob to Alice and back to Bob. Based on a table-top implementation of this protocol with polarization-entangled photons, fundamental advantages as well as practical issues like transmission losses, photon storage and requirements for progress towards longer transmission distances are discussed and compared to non-deterministic protocols. Modifications of common protocols towards deterministic quantum key distribution are addressed.

  12. Location deterministic biosensing from quantum-dot-nanowire assemblies

    International Nuclear Information System (INIS)

    Liu, Chao; Kim, Kwanoh; Fan, D. L.

    2014-01-01

    Semiconductor quantum dots (QDs), with high fluorescent brightness, stability, and tunable sizes, have received considerable interest for imaging, sensing, and delivery of biomolecules. In this research, we demonstrate location-deterministic biochemical detection from arrays of QD-nanowire hybrid assemblies. QDs with diameters less than 10 nm are manipulated and precisely positioned on the tips of the assembled gold (Au) nanowires. The manipulation mechanisms are quantitatively understood as the synergetic effects of dielectrophoresis (DEP) and alternating current electroosmosis (ACEO) due to AC electric fields. The QD-nanowire hybrid sensors operate uniquely by concentrating bioanalytes at QDs on the tips of nanowires before detection, offering much enhanced efficiency and sensitivity in addition to predictable positioning. This research could result in advances in QD-based biomedical detection and inspire innovative approaches for fabricating various QD-based nanodevices.

  13. Development of a model for unsteady deterministic stresses adapted to the multi-stages turbomachines simulation; Developpement d'un modele de tensions deterministes instationnaires adapte a la simulation de turbomachines multi-etagees

    Energy Technology Data Exchange (ETDEWEB)

    Charbonnier, D.

    2004-12-15

    The physical phenomena observed in turbomachines are generally three-dimensional and unsteady. A recent study revealed that a three-dimensional steady simulation can reproduce the time-averaged unsteady phenomena, since the steady flow field equations integrate deterministic stresses. The objective of this work is thus to develop an unsteady deterministic stresses model. The analogy with turbulence makes it possible to write transport equations for these stresses. The equations are implemented in a steady flow solver, and a model for the deterministic energy fluxes is also developed and implemented. Finally, this work shows that a three-dimensional steady simulation, by taking into account unsteady effects with transport equations for deterministic stresses, increases the computing time by only approximately 30%, which remains very attractive compared to an unsteady simulation. (author)

  14. Suppression Pools: paradigm of the thermalhydraulic effect on severe accidents

    International Nuclear Information System (INIS)

    Herranz, L. E.; Lopez del Pra, C.

    2016-01-01

    The influence of thermal-hydraulic phenomena on severe accident unfolding is beyond question. The present paper supports this statement on two key aspects of a severe accident: preservation of containment integrity and transport of fission products once released from fuel. To illustrate them, attention is focused on suppression pool performance and, particularly, on some recent findings stemming from the authors' research of Fukushima scenarios. Gas behavior at the injection point and its later evolution, potential axial and/or azimuthal stratification of the aqueous body, and water saturation state are some of the processes that most strongly affect the role of pools as a mass and energy sink. They are described and discussed in detail. (Author)

  15. The effect of bacterial sepsis severity on triglyceride value

    Science.gov (United States)

    Fahila, R.; Kembaren, T.; Rahimi, A.

    2018-03-01

    Sepsis can increase the amount of triglyceride as well as change the functional and structural components of lipoproteins. The triglyceride level is directly proportional to the severity of sepsis and associated with a systemic inflammatory response. The study aims to determine the correlation between the severity of bacterial sepsis and triglyceride value. An observational study with a case-control design was conducted from January 2017 to March 2017 in 30 sepsis and 30 non-sepsis patients at H. Adam Malik General Hospital Medan. We examined procalcitonin (PCT) and triglyceride levels on the 1st, 3rd and 5th days and analyzed them using the Mann-Whitney test to assess their correlation. The triglyceride value in the sepsis group was 120 ± 5.1 mg/dl on day 1, versus 117.53 ± 36.37 mg/dl in the non-sepsis group. On the fifth day, the sepsis group's triglyceride value was 124.2 ± 50.29 mg/dl and the non-sepsis group's 134.03 ± 68.12 mg/dl. There was no significant association between the severity of sepsis and triglyceride value in patients with sepsis.
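
    The group comparison in the abstract relies on the Mann-Whitney test. A minimal sketch of the underlying U statistic, computed on toy triglyceride values (not the study's data), is:

```python
# Illustrative sketch (toy data, not the study's measurements): the
# Mann-Whitney U statistic counts, over all cross-group pairs, how often
# a value from group a exceeds a value from group b (ties count 0.5).

def mann_whitney_u(a, b):
    """U statistic for group a versus group b."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

sepsis = [120, 118, 125, 122]       # hypothetical mg/dl values
non_sepsis = [117, 130, 134, 110]   # hypothetical mg/dl values
u = mann_whitney_u(sepsis, non_sepsis)
print(u)  # -> 8.0
```

    In practice the U statistic is converted to a p-value via its null distribution (or a normal approximation for larger samples), which is what a statistics package does internally.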

  16. The antiproliferative effect of coumarins on several cancer cell lines.

    Science.gov (United States)

    Kawaii, S; Tomono, Y; Ogawa, K; Sugiura, M; Yano, M; Yoshizawa, Y

    2001-01-01

    Twenty-one coumarins were examined for their antiproliferative activity towards several cancer cell lines, namely lung carcinoma (A549), melanin-pigment-producing mouse melanoma (B16 melanoma 4A5), human T-cell leukemia (CCRF-HSB-2), and human gastric cancer, lymph node metastasized (TGBC11TKB). The structure-activity relationship established from the results revealed that the 6,7-dihydroxy moiety plays an important role in their antiproliferative activity. Analysis of cell cycle distribution indicated that esculetin-treated cells accumulated in the G1 phase (at 400 microM) or in the S phase (at 100 microM).

  17. Antiproliferative effect of isopentenylated coumarins on several cancer cell lines.

    Science.gov (United States)

    Kawaii, S; Tomono, Y; Ogawa, K; Sugiura, M; Yano, M; Yoshizawa, Y; Ito, C; Furukawa, H

    2001-01-01

    Thirty-three coumarins, mainly simple isopentenylated coumarins and the derived pyrano- and furanocoumarins, were examined for their antiproliferative activity towards several cancer and normal human cell lines. The pyrano- and furanocoumarins showed strong activity against the cancer cell lines, whereas they had weak antiproliferative activity against the normal human cell lines. The decreasing rank order of potency was osthenone (10), clausarin (25), clausenidin (26), dentatin (24), nordentatin (23), imperatorin (29), seselin (27), xanthyletin (21), suberosin (17), phebalosin (8) and osthol (12). The structure-activity relationship established from the results revealed that the 1,1-dimethylallyl and isopentenyl groups play an important role in antiproliferative activity.

  18. Deterministic models for energy-loss straggling

    International Nuclear Information System (INIS)

    Prinja, A.K.; Gleicher, F.; Dunham, G.; Morel, J.E.

    1999-01-01

    Inelastic ion interactions with target electrons are dominated by extremely small energy transfers that are difficult to resolve numerically. The continuous-slowing-down (CSD) approximation is then commonly employed, which, however, only preserves the mean energy loss per collision through the stopping power, S(E) = ∫₀^∞ dE′ (E − E′) σ_s(E → E′). To accommodate energy-loss straggling, a Gaussian distribution with the correct mean-squared energy loss (akin to a Fokker-Planck approximation in energy) is commonly used in continuous-energy Monte Carlo codes. Although this model has the unphysical feature that ions can be upscattered, it nevertheless yields accurate results. A multigroup model for energy-loss straggling was recently presented for use in multigroup Monte Carlo codes or in deterministic codes that use multigroup data. The method has the advantage that the mean and mean-squared energy loss are preserved without unphysical upscatter and hence is computationally efficient. Results for energy spectra compared extremely well with Gaussian distributions under the idealized conditions for which the Gaussian may be considered to be exact. Here, the authors present more consistent comparisons by extending the method to accommodate upscatter and, further, compare both methods with exact solutions obtained from an analog Monte Carlo simulation, for a straight-ahead transport problem.
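
    The Gaussian straggling model described in the abstract can be sketched as follows; the stopping power S(E) and the straggling coefficient are hypothetical toy values, not the paper's data:

```python
import numpy as np

# Illustrative sketch (not the paper's code): sampling a path-step energy
# loss with the Gaussian straggling model. The draw has mean S(E)*step and
# variance omega2*step; negative draws are the unphysical 'upscatter'
# feature mentioned in the abstract.

def sample_energy_loss(E, step, stopping_power, omega2, rng):
    """Gaussian energy loss over one path step."""
    mean = stopping_power(E) * step
    sigma = np.sqrt(omega2 * step)
    return rng.normal(mean, sigma)

rng = np.random.default_rng(0)
S = lambda E: 0.1 * E  # toy stopping power, energy loss per unit path
losses = np.array([sample_energy_loss(10.0, 1.0, S, 1.0, rng)
                   for _ in range(100_000)])

print(losses.mean())        # close to the CSD value S(10) * 1 = 1.0
print((losses < 0).mean())  # the model's unphysical upscatter fraction
```

    With these toy numbers the mean loss matches the CSD prediction, while a visible fraction of samples is negative, which is exactly the upscatter artifact the multigroup model avoids.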

  19. A Deterministic Approach to Earthquake Prediction

    Directory of Open Access Journals (Sweden)

    Vittorio Sgrigna

    2012-01-01

    Full Text Available The paper aims at giving suggestions for a deterministic approach to investigate possible earthquake prediction and warning. A fundamental contribution can come from observations and physical modeling of earthquake precursors, aiming at placing the earthquake phenomenon within the framework of a unified theory able to explain the causes of its genesis, and the dynamics, rheology, and microphysics of its preparation, occurrence, postseismic relaxation, and interseismic phases. Studies based on combined ground and space observations of earthquake precursors are essential to address the issue. Unfortunately, up to now, what is lacking is the demonstration of a causal relationship (with explained physical processes) between data gathered simultaneously and continuously by space observations and ground-based measurements. In doing this, modern and/or new methods and technologies have to be adopted to try to solve the problem. Coordinated space- and ground-based observations imply available test sites on the Earth's surface to correlate ground data, collected by appropriate networks of instruments, with space data detected on board Low-Earth-Orbit (LEO) satellites. Moreover, a strong new theoretical scientific effort is necessary to try to understand the physics of the earthquake.

  20. Deterministic Approach to Detect Heart Sound Irregularities

    Directory of Open Access Journals (Sweden)

    Richard Mengko

    2017-07-01

    Full Text Available A new method to detect heart sounds that does not require machine learning is proposed. The heart sound is a time-series event generated by the heart's mechanical system. From the analysis of the heart sound S-transform and an understanding of how the heart works, it can be deduced that each heart sound component has unique properties in terms of timing, frequency, and amplitude. Based on these facts, a deterministic method can be designed to identify each heart sound component. The recorded heart sound can then be printed with each component correctly labeled. This greatly helps the physician to diagnose the heart problem. The results show that most known heart sounds were successfully detected. There are some murmur cases where the detection failed. This can be improved by adding more heuristics, including setting some initial parameters such as the noise threshold accurately, and taking into account the recording equipment as well as the environmental conditions. It is expected that this method can be integrated into an electronic stethoscope biomedical system.
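
    The deterministic idea, identifying components by fixed timing and amplitude rules rather than a trained model, can be illustrated with a toy sketch; the synthetic envelope and the 0.5 s timing rule below are hypothetical, not the paper's S-transform pipeline:

```python
# Toy sketch (assumed signal and thresholds): deterministic labelling of
# S1/S2-like events from an amplitude envelope, in the spirit of the
# abstract's rule-based (non machine learning) approach.

def detect_events(envelope, threshold):
    """Return sample indices where the envelope first crosses above threshold."""
    events, above = [], False
    for i, v in enumerate(envelope):
        if v >= threshold and not above:
            events.append(i)
            above = True
        elif v < threshold:
            above = False
    return events

def label_s1_s2(events, fs, max_s1_s2_gap=0.5):
    """Label events: an event within max_s1_s2_gap seconds of an S1 is S2."""
    labels = []
    for j, e in enumerate(events):
        if j == 0:
            labels.append("S1")
        else:
            gap = (events[j] - events[j - 1]) / fs
            labels.append("S2" if labels[-1] == "S1" and gap < max_s1_s2_gap
                          else "S1")
    return labels

fs = 100  # Hz (hypothetical sampling rate)
# Synthetic envelope with bursts at 0.0 s, 0.3 s, 1.0 s, 1.3 s:
env = [0.0] * 200
for start in (0, 30, 100, 130):
    for i in range(start, start + 5):
        env[i] = 1.0

events = detect_events(env, 0.5)
print(events)                  # -> [0, 30, 100, 130]
print(label_s1_s2(events, fs))  # -> ['S1', 'S2', 'S1', 'S2']
```

    Real heart sounds would of course need the frequency-domain properties the abstract mentions (via the S-transform) in addition to timing, but the deterministic, rule-based structure is the same.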

  1. Deterministic dense coding and entanglement entropy

    International Nuclear Information System (INIS)

    Bourdon, P. S.; Gerjuoy, E.; McDonald, J. P.; Williams, H. T.

    2008-01-01

    We present an analytical study of the standard two-party deterministic dense-coding protocol, under which communication of perfectly distinguishable messages takes place via a qudit from a pair of nonmaximally entangled qudits in a pure state |ψ>. Our results include the following: (i) We prove that it is possible for a state |ψ> with lower entanglement entropy to support the sending of a greater number of perfectly distinguishable messages than one with higher entanglement entropy, confirming a result suggested via numerical analysis in Mozes et al. [Phys. Rev. A 71, 012311 (2005)]. (ii) By explicit construction of families of local unitary operators, we verify, for dimensions d=3 and d=4, a conjecture of Mozes et al. about the minimum entanglement entropy that supports the sending of d+j messages, 2 ≤ j ≤ d-1; moreover, we show that the j=2 and j=d-1 cases of the conjecture are valid in all dimensions. (iii) Given that |ψ> allows the sending of K messages and has √λ_0 as its largest Schmidt coefficient, we show that the inequality λ_0 ≤ d/K, established by Wu et al. [Phys. Rev. A 73, 042311 (2006)], must actually take the form λ_0 < d/K if K=d+1, while our constructions of local unitaries show that equality can be realized if K=d+2 or K=2d-1

  2. Analysis of pinching in deterministic particle separation

    Science.gov (United States)

    Risbud, Sumedh; Luo, Mingxiang; Frechette, Joelle; Drazer, German

    2011-11-01

    We investigate the problem of spherical particles settling vertically under gravity, parallel to the Y-axis, through a pinching gap created by an obstacle (spherical or cylindrical, centered at the origin) and a wall (normal to the X-axis), to uncover the physics governing microfluidic separation techniques such as deterministic lateral displacement and pinched flow fractionation: (1) theoretically, by linearly superimposing the resistances offered by the wall and the obstacle separately, (2) computationally, using the lattice Boltzmann method for particulate systems, and (3) experimentally, by conducting macroscopic experiments. Both theory and simulations show that, for a given initial separation between the particle centre and the Y-axis, the presence of a wall pushes the particles closer to the obstacle than its absence. Experimentally, this is expected to result in an early onset of the short-range repulsive forces caused by solid-solid contact. We indeed observe such an early onset, which we quantify by measuring the asymmetry in the trajectories of the spherical particles around the obstacle. This work is partially supported by the National Science Foundation Grant Nos. CBET-0731032, CMMI-0748094, and CBET-0954840.

  3. Effect of severe kwashiorkor on intellectual development among Nigerian children.

    Science.gov (United States)

    Nwuga, V C

    1977-09-01

    A study was conducted to investigate the intellectual sequelae of severe kwashiorkor among Nigerian children of school age. The design for the study had an experimental urban kwashiorkor (index) group and four control groups, namely a sibling group, a lower-class group, an upper-class group, and a rural kwashiorkor group. Various psychological tests measuring specific intellectual abilities were administered to all of the subjects taking part in the study. The findings showed that the index group had lower levels of certain types of intellectual skill, specifically the higher cognitive skills, at school age than their siblings, and more so than the other controls except their rural counterparts; males showed a tendency to be more affected by severe kwashiorkor with regard to mental development than their female counterparts; there was no relationship between scores on the psychological tests and the ages at which the index cases were admitted to hospital; the upper-class group was clearly superior in performance on the tests and also in measures of weight and head circumference when compared to all of the other groups; there was no relationship between head circumference and test scores among subjects in the five groups.

  4. The effective potential in the presence of several mass scales

    International Nuclear Information System (INIS)

    Casas, J.A.; Di Clemente, V.; Quiros, M.

    1999-01-01

    We consider the problem of improving the effective potential in mass-independent schemes, such as the MS-bar or DR-bar renormalization scheme, in the presence of an arbitrary number of fields with Φ_c-dependent masses M_i(Φ_c). We use the decoupling theorem at the scales μ_i ∼ M_i(Φ_c) such that the matching between the effective (low-energy) and complete (high-energy) one-loop theories contains no thresholds. We find that for any value of Φ_c there is a convenient scale μ* ≡ min_i M_i(Φ_c) at which the loop expansion has the best behaviour and the effective potential has the least μ-dependence. Furthermore, at this scale the effective potential coincides with the (improved) tree-level one in the effective field theory. The decoupling method is explicitly illustrated with a simple Higgs-Yukawa model, along with its relationship with other decoupling prescriptions and with proposed multi-scale renormalization approaches. The procedure leads to a nice suppression of potentially large logarithms and can easily be adapted to include higher-loop effects, which is explicitly shown at the two-loop level

  5. Realization of deterministic quantum teleportation with solid state qubits

    International Nuclear Information System (INIS)

    Andreas Wallfraff

    2014-01-01

    Using modern micro- and nano-fabrication techniques combined with superconducting materials, we realize electronic circuits whose dynamics are governed by the laws of quantum mechanics. Making use of the strong interaction of photons with superconducting quantum two-level systems realized in these circuits, we investigate both fundamental quantum effects of light and applications in quantum information processing. In this talk I will discuss the deterministic teleportation of a quantum state in a macroscopic quantum system. Teleportation may be used for distributing entanglement between distant qubits in a quantum network and for realizing universal and fault-tolerant quantum computation. Previously, we have demonstrated the implementation of a teleportation protocol, up to the single-shot measurement step, with three superconducting qubits coupled to a single microwave resonator. Using full quantum state tomography and calculating the projection of the measured density matrix onto the basis of two qubits has allowed us to reconstruct the teleported state with an average output-state fidelity of 86%. Now we have realized a new device in which four qubits are coupled pair-wise to three resonators. Making use of parametric amplifiers coupled to the output of two of the resonators, we are able to perform high-fidelity single-shot read-out. This has allowed us to demonstrate teleportation by individually post-selecting on any Bell state and by deterministically distinguishing between all four Bell states measured by the sender. In addition, we have recently implemented fast feed-forward to complete the teleportation process. In all instances, we demonstrate that the fidelities of the teleported states are above the threshold imposed by classical physics. The presented experiments are expected to contribute towards realizing quantum communication with microwave photons in the foreseeable future. (author)

  6. Deterministic calculations of radiation doses from brachytherapy seeds

    International Nuclear Information System (INIS)

    Reis, Sergio Carneiro dos; Vasconcelos, Vanderley de; Santos, Ana Maria Matildes dos

    2009-01-01

    Brachytherapy is used for treating certain types of cancer by inserting radioactive sources into tumours. CDTN/CNEN is developing brachytherapy seeds to be used mainly in prostate cancer treatment. Dose calculations play a very significant role in the characterization of the developed seeds. The current state of the art in computational dosimetry relies on Monte Carlo methods using, for instance, MCNP codes. However, deterministic calculations have some advantages, such as short computer times to find solutions. This paper presents software developed to calculate doses in a two-dimensional space surrounding the seed, using a deterministic algorithm. The analysed seeds consist of capsules similar to IMC6711 (OncoSeed), which are commercially available. The exposure rates and absorbed doses are computed using the Sievert integral and the Meisberger third-order polynomial, respectively. The software also allows isodose visualization in the surface plane. The user can choose between four different radionuclides (192Ir, 198Au, 137Cs and 60Co). The user must also enter as input data: the exposure rate constant; the source activity; the active length of the source; the number of segments into which the source will be divided; the total source length; the source diameter; and the actual and effective source thickness. The computed results were benchmarked against results from the literature, and the developed software will be used to support the characterization process of the source being developed at CDTN. The software was implemented using Borland Delphi in a Windows environment and is an alternative to Monte Carlo based codes. (author)
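
    A minimal sketch of the Sievert integral for a filtered line source, evaluated with a simple midpoint rule; the geometry convention and all parameter values are illustrative placeholders, not the CDTN software:

```python
import math

# Sketch under stated assumptions: exposure rate at a point (x, y) from a
# filtered line source lying on the x-axis from -L/2 to +L/2, via the
# Sievert integral of exp(-mu_t / cos(theta)). Gamma (exposure rate
# constant), activity, mu_t and the dimensions are placeholder values.

def sievert_integral(theta1, theta2, mu_t, n=1000):
    """Midpoint-rule integral of exp(-mu_t / cos(theta)) over [theta1, theta2]."""
    h = (theta2 - theta1) / n
    return sum(math.exp(-mu_t / math.cos(theta1 + (i + 0.5) * h)) * h
               for i in range(n))

def exposure_rate(gamma, activity, length, mu_t, x, y):
    """Exposure rate at (x, y), with y the perpendicular distance to the source."""
    theta1 = math.atan((x - length / 2) / y)
    theta2 = math.atan((x + length / 2) / y)
    return gamma * activity / (length * y) * sievert_integral(theta1, theta2, mu_t)

# Sanity check: with no filtration (mu_t = 0) the Sievert integral reduces
# to the subtended angle, i.e. the bare line-source formula.
rate = exposure_rate(gamma=1.0, activity=1.0, length=0.3, mu_t=0.0, x=0.0, y=1.0)
bare = (math.atan(0.15) - math.atan(-0.15)) / 0.3
print(abs(rate - bare) < 1e-9)  # -> True
```

    The Meisberger polynomial mentioned in the abstract would then multiply the result by a cubic in the radial distance to account for tissue attenuation and scatter; its published coefficients are radionuclide-specific and are not reproduced here.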

  7. The European Flood Alert System EFAS – Part 2: Statistical skill assessment of probabilistic and deterministic operational forecasts

    Directory of Open Access Journals (Sweden)

    J. C. Bartholmes

    2009-02-01

    Full Text Available Since 2005 the European Flood Alert System (EFAS) has been producing probabilistic hydrological forecasts in pre-operational mode at the Joint Research Centre (JRC) of the European Commission. EFAS aims at increasing preparedness for floods in trans-national European river basins by providing medium-range deterministic and probabilistic flood forecasting information, from 3 to 10 days in advance, to national hydro-meteorological services.

    This paper is Part 2 of a study presenting the development and skill assessment of EFAS. In Part 1, the scientific approach adopted in the development of the system has been presented, as well as its basic principles and forecast products. In the present article, two years of existing operational EFAS forecasts are statistically assessed and the skill of EFAS forecasts is analysed with several skill scores. The analysis is based on the comparison of threshold exceedances between proxy-observed and forecasted discharges. Skill is assessed both with and without taking into account the persistence of the forecasted signal during consecutive forecasts.

    Skill assessment approaches are mostly adopted from meteorology and the analysis also compares probabilistic and deterministic aspects of EFAS. Furthermore, the utility of different skill scores is discussed and their strengths and shortcomings illustrated. The analysis shows the benefit of incorporating past forecasts in the probability analysis, for medium-range forecasts, which effectively increases the skill of the forecasts.
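
    Skill assessment based on threshold exceedances can be illustrated with two common measures borrowed from meteorology; the forecast and observation data below are invented, not EFAS output:

```python
# Illustrative sketch (assumed data): hit rate and false-alarm rate for a
# deterministic threshold-exceedance forecast, and the Brier score for a
# probabilistic one. Lower Brier score means better probabilistic skill.

def hit_and_false_alarm(forecast_exceed, observed_exceed):
    pairs = list(zip(forecast_exceed, observed_exceed))
    hits = sum(f and o for f, o in pairs)
    misses = sum((not f) and o for f, o in pairs)
    false_alarms = sum(f and (not o) for f, o in pairs)
    correct_negatives = sum((not f) and (not o) for f, o in pairs)
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate, false_alarm_rate

def brier_score(probabilities, observed_exceed):
    """Mean squared difference between forecast probability and outcome."""
    return sum((p - float(o)) ** 2
               for p, o in zip(probabilities, observed_exceed)) / len(probabilities)

obs = [True, False, True, False, False]    # proxy-observed exceedances
det = [True, True, False, False, False]    # deterministic forecast
prob = [0.9, 0.4, 0.3, 0.1, 0.0]           # probabilistic forecast

print(hit_and_false_alarm(det, obs))  # hit rate 0.5, false-alarm rate 1/3
print(brier_score(prob, obs))         # -> 0.134
```

    Persistence-aware variants, as analysed in the paper, additionally require the exceedance signal to be confirmed across consecutive forecasts before counting a hit or a false alarm.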

  8. Quantal health effects for a combination of several toxic agents

    Energy Technology Data Exchange (ETDEWEB)

    Seiler, F A

    1988-12-01

    Quantal health effects caused by the combined action of a number of toxic agents are modeled using the information available for each toxicant acting in isolation. Two basic models are used; one assumes no interaction, the other postulates a separable kind of interaction in which each agent contributes an enhancement factor independent of all other agents. These two models provide yardsticks by which to measure synergisms and antagonisms in the interaction between the effects of toxic agents. Equations are given in approximations for small and large values of the risk. (author)

  9. Quantal health effects for a combination of several toxic agents

    International Nuclear Information System (INIS)

    Seiler, F.A.

    1988-01-01

    Quantal health effects caused by the combined action of a number of toxic agents are modeled using the information available for each toxicant acting in isolation. Two basic models are used; one assumes no interaction, the other postulates a separable kind of interaction in which each agent contributes an enhancement factor independent of all other agents. These two models provide yardsticks by which to measure synergisms and antagonisms in the interaction between the effects of toxic agents. Equations are given in approximations for small and large values of the risk. (author)
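
    The two yardstick models in the abstracts above can be sketched generically; the functional forms below are standard small-risk combination formulas used for illustration, not necessarily Seiler's exact equations:

```python
# Sketch under stated assumptions: combining quantal risks p_i from agents
# acting in isolation. 'No interaction' treats the agents as independent;
# the separable model lets each agent carry its own enhancement factor e_i.

def no_interaction(risks):
    """Independent action: probability of at least one effect."""
    prod = 1.0
    for p in risks:
        prod *= (1.0 - p)
    return 1.0 - prod

def separable_interaction(risks, enhancements):
    """Each agent's risk is scaled by an agent-specific enhancement factor."""
    prod = 1.0
    for p, e in zip(risks, enhancements):
        prod *= (1.0 - e * p)
    return 1.0 - prod

risks = [0.01, 0.02, 0.005]  # hypothetical per-agent quantal risks
combined = no_interaction(risks)
print(combined)  # -> 0.034651 (approximately)

# For small risks the combined risk approaches the simple sum, the usual
# small-value approximation mentioned in the abstract:
print(abs(combined - sum(risks)) < 1e-3)  # -> True
```

    A measured combined risk above the separable-model prediction would then indicate synergism, and one below it antagonism, which is how the models serve as yardsticks.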

  10. Rootstock effects on almond leaf scorch disease incidence and severity

    Science.gov (United States)

    A five-year field study was conducted to evaluate effects of duration and exclusion of Xylella fastidiosa infections on young almond tree performance and their links to tree vigor. ‘Nemaguard’, ‘Okinawa’, ‘Nonpareil’, and Y119 were used as rootstocks for almond scion ‘Sonora’. Among X.fastidiosa-inf...

  11. Effects of several physiochemical factors on cell growth and gallic ...

    African Journals Online (AJOL)

    The production of gallic acid in cell suspension cultures of Acer ginnala Maxim was studied. The effects of some physicochemical factors and chemical substances on cell growth and the production of gallic acid were investigated. Cells harvested from plant tissue culture were extracted and applied to high performance liquid ...

  12. The Economic Crisis and Several Effects on Global Economy

    Directory of Open Access Journals (Sweden)

    Florina BRAN

    2011-01-01

    Full Text Available According to the outcomes of several analyses of the current economic crisis, the main mechanism of profit making is not production but circulation and exchange. Starting with this observation, the paper goes through a number of aspects regarding the relation between crisis and economy at the global level. These aspects include the recent financial turmoil; who pays for the crisis; stabilizing the financial sector; recession and the financial crisis; the internationalization of the crisis; commodities and the ecological crisis; an end to neo-liberalism; and what socialists should demand. We comment on how important current developments in the wake of the banking crisis are for the transmission of that crisis to the rest of the economy and its interaction with the more general economic crisis now emerging. It is concluded that there is a good chance that the current economic order will be broken. The future shape of the order will depend more on the vision of managers than on the influence of so-called objective factors.

  13. Stochastic Modeling and Deterministic Limit of Catalytic Surface Processes

    DEFF Research Database (Denmark)

    Starke, Jens; Reichert, Christian; Eiswirth, Markus

    2007-01-01

    Three levels of modeling (microscopic, mesoscopic and macroscopic) are discussed for the CO oxidation on low-index platinum single-crystal surfaces. The introduced models on the microscopic and mesoscopic levels are stochastic, while the model on the macroscopic level is deterministic. It can ..., such that, in contrast to the microscopic model, the spatial resolution is reduced. The derivation of deterministic limit equations is in correspondence with the successful description of experiments under low-pressure conditions by deterministic reaction-diffusion equations, while for intermediate pressures phenomena...

  14. Operational State Complexity of Deterministic Unranked Tree Automata

    Directory of Open Access Journals (Sweden)

    Xiaoxue Piao

    2010-08-01

    Full Text Available We consider the state complexity of basic operations on tree languages recognized by deterministic unranked tree automata. For the operations of union and intersection, the upper and lower bounds of both weakly and strongly deterministic tree automata are obtained. For tree concatenation we establish a tight upper bound that is of a different order than the known state complexity of concatenation of regular string languages. We show that (n+1)((m+1)2^n - 2^(n-1)) - 1 vertical states are sufficient, and necessary in the worst case, to recognize the concatenation of tree languages recognized by (strongly or weakly) deterministic automata with, respectively, m and n vertical states.

  15. Exploring the stochastic and deterministic aspects of cyclic emission variability on a high speed spark-ignition engine

    International Nuclear Information System (INIS)

    Karvountzis-Kontakiotis, A.; Dimaratos, A.; Ntziachristos, L.; Samaras, Z.

    2017-01-01

    This study contributes to the understanding of cycle-to-cycle emissions variability (CEV) in premixed spark-ignition combustion engines. A number of experimental investigations of cycle-to-cycle combustion variability (CCV) exist in the published literature; however, only a handful of studies deal with CEV. This study experimentally investigates the impact of CCV on the CEV of NO and CO, utilizing experimental results from a high-speed spark-ignition engine. Both CEV and CCV are shown to comprise a deterministic and a stochastic component. Results show that at maximum brake torque (MBT) operation, the indicated mean effective pressure (IMEP) maximizes and its coefficient of variation (COV_IMEP) minimizes, leading to minimum variation of NO. NO variability, and hence mean NO levels, can be reduced by more than 50% and 30%, respectively, at advanced ignition timing, by controlling the deterministic CCV using cycle-resolved combustion control. The deterministic component of CEV increases at lean combustion (lambda = 1.12) and this overall increases NO variability. CEV was also found to decrease with engine load. At steady speed, increasing throttle position from 20% to 80% decreased COV_IMEP, COV_NO and COV_CO by 59%, 46%, and 6%, respectively. Highly resolved engine control, by means of cycle-to-cycle combustion control, appears key to limiting the deterministic feature of cyclic variability and thereby reducing overall emission levels. - Highlights: • Engine emissions variability comprises both stochastic and deterministic components. • Lean and diluted combustion conditions increase emissions variability. • Advanced ignition timing enhances the deterministic component of variability. • Load increase decreases the deterministic component of variability. • The deterministic component can be reduced by highly resolved combustion control.

  16. Observation of magnetooptical effects in several high Tc superconductors

    International Nuclear Information System (INIS)

    Dillon, J.F. Jr; Lyons, K.B.

    1992-01-01

Recent so-called 'anyon' theories of high-temperature superconductivity in layer-structure materials suggested that at some temperature T_TP ≥ T_c there is a symmetry-breaking transition below which these materials may be in either of two distinct states related to each other by time reversal. The study of magneto-optical effects in superconductors reviewed here was undertaken to explore the time-reversal symmetry of these materials. Using a novel technique with a rotating λ/2 plate at 525 nm, 'circular dichroism' was observed on reflection from epitaxial films and single crystals of cuprate superconductors with layer structures. The onset of dichroism was at temperatures of ∼180 K to ∼300 K. These results appear to support the 'anyon' theories. However, circular dichroism was also seen in films and single crystals of bismuthate superconductors with cubic structure, to which the theories seem inapplicable. In sharp contrast, Spielman et al. at Stanford, in a very sensitive experiment at 1060 nm, have seen no evidence of non-reciprocal circular birefringence in epitaxial cuprate superconducting films. Weber et al. at Dortmund have recently reported the observation at 633 nm of non-reciprocal magneto-optical effects on single crystals of cuprate superconductors, but none on films. (author). 15 refs., 5 figs

  17. Deterministic factor analysis: methods of integro-differentiation of non-integral order

    Directory of Open Access Journals (Sweden)

    Valentina V. Tarasova

    2016-12-01

Full Text Available Objective: to summarize the methods of deterministic factor economic analysis, namely the differential calculus and the integral method. Methods: mathematical methods for integro-differentiation of non-integral order; the theory of derivatives and integrals of fractional (non-integral) order. Results: the basic concepts are formulated and new methods are developed that take into account the memory and non-locality effects in the quantitative description of the influence of individual factors on the change in the effective economic indicator. Two methods are proposed for integro-differentiation of non-integral order for the deterministic factor analysis of economic processes with memory and non-locality. It is shown that the method of integro-differentiation of non-integral order can give more accurate results compared with standard methods (the method of differentiation using first-order derivatives and the integral method using first-order integration) for a wide class of functions describing effective economic indicators. Scientific novelty: new methods of deterministic factor analysis are proposed: the method of differential calculus of non-integral order and the integral method of non-integral order. Practical significance: the basic concepts and formulas of the article can be used in scientific and analytical activity for factor analysis of economic processes. The proposed method for integro-differentiation of non-integral order extends the capabilities of deterministic factor economic analysis. The new quantitative method of deterministic factor analysis may become the beginning of quantitative studies of the behavior of economic agents with memory (hereditarity) and spatial non-locality. The proposed methods of deterministic factor analysis can be used in the study of economic processes which follow the exponential law, in which the indicators (endogenous variables) are power functions of the factors (exogenous variables), including the processes

  18. Effect of pancreatic kallikrein combined with magnesium sulfate therapy on disease severity of patients with severe preeclampsia

    Directory of Open Access Journals (Sweden)

    Yi Yang

    2016-10-01

Full Text Available Objective: To study the clinical effect of pancreatic kallikrein combined with magnesium sulfate therapy on severe preeclampsia. Methods: A total of 47 patients with severe preeclampsia treated in our hospital from May 2013 to October 2015 were retrospectively analyzed: patients who received pancreatic kallikrein combined with magnesium sulfate therapy formed the observation group, and patients who received magnesium sulfate therapy alone formed the control group. Blood pressure, renal function, uterine artery and umbilical artery blood flow state, and hypoxia indexes in serum and placenta were then compared between the two groups. Results: After treatment, systolic and diastolic pressure levels as well as serum CysC level and 24 h urine protein level of the observation group were significantly lower than those of the control group, the uterine artery and umbilical artery S/D and PI were significantly lower than those of the control group, PIGF and NO levels in serum as well as Xiap and Survivin levels in placenta were significantly higher than those of the control group, and the sVEGFR1 level in serum as well as Caspase-3 and Caspase-7 levels in placenta were significantly lower than those of the control group. Conclusions: Pancreatic kallikrein combined with magnesium sulfate therapy can alleviate the disease severity, lower blood pressure and vascular resistance, and improve renal function and the apoptosis caused by placental hypoxia in patients with severe preeclampsia.

  19. Cost-effectiveness of roflumilast in combination with bronchodilator therapies in patients with severe and very severe COPD in Switzerland

    Directory of Open Access Journals (Sweden)

    Samyshkin Y

    2013-01-01

Full Text Available Yevgeniy Samyshkin,1 Michael Schlunegger,2 Susan Haefliger,3 Sabine Ledderhose,3 Matthew Radford1 (1IMS Health, Health Economics and Outcomes Research, London, United Kingdom; 2Marketing Specialty Care, 3Medical Department, Takeda Pharma AG, Pfäffikon, Switzerland). Objective: Chronic obstructive pulmonary disease (COPD) represents a burden on patients and health systems. Roflumilast, an oral, selective phosphodiesterase-4 inhibitor, reduces exacerbations and improves lung function in severe/very severe COPD patients with a history of exacerbations. This study aimed to estimate the lifetime cost and outcomes of roflumilast added on to commonly used COPD regimens in Switzerland. Methods: A Markov cohort model was developed to simulate COPD progression in patients with disease states of severe COPD, very severe COPD, and death. The exacerbation rate was assumed to be two per year in severe COPD. COPD progression rates were drawn from the published literature. Efficacy was expressed as relative ratios of exacerbation rates associated with roflumilast, derived from a mixed-treatment comparison. A cost-effectiveness analysis was conducted for roflumilast added to long-acting muscarinic antagonists (LAMA), long-acting β2-agonist/inhaled corticosteroids (LABA/ICS), and LAMA + LABA/ICS. The analysis was conducted from the Swiss payer perspective, with costs and outcomes discounted at 2.5% annually. Parameter uncertainties were explored in one-way and probabilistic sensitivity analyses. Results: In each of the comparator regimens mean life expectancy was 9.28 years and quality-adjusted life years (QALYs) gained were 6.19. Mean estimated lifetime costs per patient in the comparator arms were CHF 83,364 (LAMA), CHF 88,161 (LABA/ICS), and CHF 95,564 (LAMA + LABA/ICS), respectively. Adding roflumilast resulted in a mean cost per patient per lifetime of CHF 86,754 (LAMA + roflumilast), CHF 91,470 (LABA/ICS + roflumilast), and CHF 99,364 (LAMA + LABA/ICS + roflumilast)

  20. Steam inhalation therapy: severe scalds as an adverse side effect

    Science.gov (United States)

    Baartmans, Martin; Kerkhof, Evelien; Vloemans, Jos; Dokter, Jan; Nijman, Susanne; Tibboel, Dick; Nieuwenhuis, Marianne

    2012-01-01

Background Steam inhalation therapy is often recommended in the treatment of the common cold. However, it has no proven benefit and may in fact have serious adverse side effects in terms of burn injuries. Aim To quantify the human and economic costs of steam inhalation therapy in terms of burn injury. Design and setting A prospective database study of all patients admitted to the burn centres (Beverwijk, Groningen, Rotterdam) and the hospital emergency departments in the Netherlands. Method The number and extent of burn injuries resulting from steam inhalation therapy were analysed, and an approximation was made of the direct costs of their medical treatment. Results Annually, on average three people are admitted to one of the Dutch burn centres for burns resulting from steam inhalation therapy. Most victims were children, and they needed skin grafting more often than adults. The total direct medical costs for burn centre and emergency department treatment were €115 500 (£93 000); emotional costs are not reflected. Conclusion As steam inhalation therapy has no proven benefit and the number and extent of complications of this therapy in terms of burn injury are significant, especially in children, steam inhalation therapy should be considered a dangerous procedure and no longer be recommended in professional guidelines and patient brochures. PMID:22781995

  1. Deterministic Echo State Networks Based Stock Price Forecasting

    Directory of Open Access Journals (Sweden)

    Jingpei Dan

    2014-01-01

Full Text Available Echo state networks (ESNs), as efficient and powerful computational models for approximating nonlinear dynamical systems, have been successfully applied in financial time series forecasting. Reservoir construction in standard ESNs relies on trial and error in real applications due to a series of randomized model-building stages. A novel form of ESN with a deterministically constructed reservoir is competitive with the standard ESN, offering minimal complexity and the possibility of optimizing ESN specifications. In this paper, the forecasting performance of deterministic ESNs is investigated in stock price prediction applications. The experimental results on two benchmark datasets (Shanghai Composite Index and S&P500) demonstrate that deterministic ESNs outperform the standard ESN in both accuracy and efficiency, which indicates the promise of deterministic ESNs for financial prediction.
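A deterministically constructed reservoir replaces the randomized build stages with a fixed topology. The sketch below is a minimal example of one such minimal-complexity scheme, a simple cycle reservoir with a fixed alternating input-sign pattern; the sine series stands in for a price series, and all sizes and weights are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def simple_cycle_reservoir(n_res, r=0.9, v=0.5):
    """Deterministic reservoir: a single cycle of uniform weight r, with
    input weights of magnitude v whose signs follow a fixed alternating
    pattern instead of random draws."""
    W = np.zeros((n_res, n_res))
    W[np.arange(1, n_res), np.arange(n_res - 1)] = r  # ring topology
    W[0, n_res - 1] = r                               # close the cycle
    w_in = v * np.where(np.arange(n_res) % 2 == 0, 1.0, -1.0)
    return W, w_in

def run_esn(u, W, w_in):
    """Drive the reservoir with input sequence u and collect states."""
    x = np.zeros(W.shape[0])
    states = []
    for ut in u:
        x = np.tanh(W @ x + w_in * ut)
        states.append(x.copy())
    return np.array(states)

# Toy one-step-ahead prediction of a sine series (price stand-in).
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t)
W, w_in = simple_cycle_reservoir(50)
X = run_esn(series[:-1], W, w_in)
y = series[1:]

# Ridge-regression readout, discarding a 100-step washout.
ridge = 1e-6
A = X[100:]
w_out = np.linalg.solve(A.T @ A + ridge * np.eye(A.shape[1]), A.T @ y[100:])
pred = X @ w_out
rmse = np.sqrt(np.mean((pred[100:] - y[100:]) ** 2))
```

Because the reservoir is fully specified by (n_res, r, v), two runs are bit-identical and the few scalars can be tuned directly, which is the reproducibility argument made above.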

  2. The cointegrated vector autoregressive model with general deterministic terms

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    2017-01-01

In the cointegrated vector autoregression (CVAR) literature, deterministic terms have until now been analyzed on a case-by-case, or as-needed, basis. We give a comprehensive unified treatment of deterministic terms in the additive model X(t) = Z(t) + Y(t), where Z(t) belongs to a large class of deterministic regressors and Y(t) is a zero-mean CVAR. We suggest an extended model that can be estimated by reduced rank regression and give a condition for when the additive and extended models are asymptotically equivalent, as well as an algorithm for deriving the additive model parameters from the extended model parameters. We derive asymptotic properties of the maximum likelihood estimators and discuss tests for rank and tests on the deterministic terms. In particular, we give conditions under which the estimators are asymptotically (mixed) Gaussian, such that associated tests are χ²-distributed.

  3. The cointegrated vector autoregressive model with general deterministic terms

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

In the cointegrated vector autoregression (CVAR) literature, deterministic terms have until now been analyzed on a case-by-case, or as-needed, basis. We give a comprehensive unified treatment of deterministic terms in the additive model X(t) = Z(t) + Y(t), where Z(t) belongs to a large class of deterministic regressors and Y(t) is a zero-mean CVAR. We suggest an extended model that can be estimated by reduced rank regression and give a condition for when the additive and extended models are asymptotically equivalent, as well as an algorithm for deriving the additive model parameters from the extended model parameters. We derive asymptotic properties of the maximum likelihood estimators and discuss tests for rank and tests on the deterministic terms. In particular, we give conditions under which the estimators are asymptotically (mixed) Gaussian, such that associated tests are χ²-distributed.

  4. Method to deterministically study photonic nanostructures in different experimental instruments

    NARCIS (Netherlands)

    Husken, B.H.; Woldering, L.A.; Blum, Christian; Tjerkstra, R.W.; Vos, Willem L.

    2009-01-01

We describe an experimental method to recover a single, deterministically fabricated nanostructure in various experimental instruments without the use of artificially fabricated markers, with the aim of studying photonic structures. Therefore, a detailed map of the spatial surroundings of the

  5. Pseudo-random number generator based on asymptotic deterministic randomness

    Science.gov (United States)

    Wang, Kai; Pei, Wenjiang; Xia, Haishan; Cheung, Yiu-ming

    2008-06-01

    A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks.
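The general recipe behind such chaos-based PRBGs, iterate a piecewise linear map and threshold the orbit into bits, can be sketched in a few lines. This is a generic skew tent map illustration, not the specific asymptotic-deterministic-randomness construction of the record above (which adds a noninvertible nonlinearity transform); the seed and parameter are arbitrary:

```python
def tent_map(x, a=0.49999):
    """Skew tent map on [0, 1): a piecewise linear chaotic map.
    a is kept slightly off 0.5 so the floating-point orbit does not
    collapse the way the exact binary-shift map would."""
    return x / a if x < a else (1.0 - x) / (1.0 - a)

def prbg(seed, n_bits, discard=100):
    """Threshold the orbit at 0.5 to emit bits; discard a transient first."""
    x = seed
    for _ in range(discard):
        x = tent_map(x)
    bits = []
    for _ in range(n_bits):
        x = tent_map(x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

stream = prbg(0.123456789, 10000)
ones = sum(stream) / len(stream)  # balance of the bit stream
```

A single-map generator like this is exactly what symbolic-dynamics attacks target; the multi-value correspondence described in the abstract is meant to destroy the invertible map-to-symbol relationship this toy version still has.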

  6. Pseudo-random number generator based on asymptotic deterministic randomness

    International Nuclear Information System (INIS)

    Wang Kai; Pei Wenjiang; Xia Haishan; Cheung Yiuming

    2008-01-01

    A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks

  7. Non deterministic finite automata for power systems fault diagnostics

    Directory of Open Access Journals (Sweden)

    LINDEN, R.

    2009-06-01

    Full Text Available This paper introduces an application based on finite non-deterministic automata for power systems diagnosis. Automata for the simpler faults are presented and the proposed system is compared with an established expert system.
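A non-deterministic automaton suits diagnosis because one alarm symbol may legally lead to several candidate states; the diagnoser tracks the whole reachable set. A minimal sketch with a hypothetical alarm alphabet and transition table (the event and state names are illustrative, not from the paper):

```python
# Hypothetical transitions: the same relay-trip event may indicate either a
# line fault or a bus fault, which is exactly where non-determinism enters.
TRANSITIONS = {
    ("start", "relay_trip"): {"line_fault?", "bus_fault?"},
    ("line_fault?", "recloser_ok"): {"line_fault"},
    ("bus_fault?", "diff_relay"): {"bus_fault"},
}
ACCEPTING = {"line_fault", "bus_fault"}

def diagnose(events):
    """Simulate the NFA by tracking the set of states reachable so far."""
    current = {"start"}
    for ev in events:
        nxt = set()
        for state in current:
            nxt |= TRANSITIONS.get((state, ev), set())
        current = nxt
    return current & ACCEPTING
```

After `["relay_trip"]` alone both fault hypotheses are alive; the follow-up `"recloser_ok"` event narrows the diagnosis to the line fault.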

  8. Transmission power control in WSNs : from deterministic to cognitive methods

    NARCIS (Netherlands)

    Chincoli, M.; Liotta, A.; Gravina, R.; Palau, C.E.; Manso, M.; Liotta, A.; Fortino, G.

    2018-01-01

    Communications in Wireless Sensor Networks (WSNs) are affected by dynamic environments, variable signal fluctuations and interference. Thus, prompt actions are necessary to achieve dependable communications and meet Quality of Service (QoS) requirements. To this end, the deterministic algorithms

  9. The probabilistic approach and the deterministic licensing procedure

    International Nuclear Information System (INIS)

    Fabian, H.; Feigel, A.; Gremm, O.

    1984-01-01

If safety goals are given, the creativity of the engineers is necessary to transform the goals into actual safety measures. That is, safety goals are not sufficient for the derivation of a safety concept; the licensing process asks 'What does a safe plant look like?' The answer cannot be given by a probabilistic procedure, but needs definite deterministic statements; the conclusion is that the licensing process needs a deterministic approach. The probabilistic approach should be used in a complementary role in cases where deterministic criteria are not complete, not detailed enough or not consistent, and additional arguments for decision making in connection with the adequacy of a specific measure are necessary. But also in these cases the probabilistic answer has to be transformed into a clear deterministic statement. (orig.)

  10. Effect(s) of Language Tasks on Severity of Disfluencies in Preschool Children with Stuttering

    Science.gov (United States)

    Zamani, Peyman; Ravanbakhsh, Majid; Weisi, Farzad; Rashedi, Vahid; Naderi, Sara; Hosseinzadeh, Ayub; Rezaei, Mohammad

    2017-01-01

    Speech disfluency in children can be increased or decreased depending on the type of linguistic task presented to them. In this study, the effect of sentence imitation and sentence modeling on severity of speech disfluencies in preschool children with stuttering is investigated. In this cross-sectional descriptive analytical study, 58 children…

  11. Local deterministic theory surviving the violation of Bell's inequalities

    International Nuclear Information System (INIS)

    Cormier-Delanoue, C.

    1984-01-01

Bell's theorem, which asserts that no deterministic theory with hidden variables can give the same predictions as quantum theory, is questioned. Such a deterministic theory is presented and carefully applied to real experiments performed on pairs of correlated photons, derived from the EPR thought experiment. The ensuing predictions violate Bell's inequalities just as quantum mechanics does, and it is further shown that this discrepancy originates in the very nature of radiation. Complete locality is therefore restored while separability remains more limited [fr

  12. Deterministic operations research models and methods in linear optimization

    CERN Document Server

    Rader, David J

    2013-01-01

Uniquely blends mathematical theory and algorithm design for understanding and modeling real-world problems. Optimization modeling and algorithms are key components of problem-solving across various fields of research, from operations research and mathematics to computer science and engineering. Addressing the importance of the algorithm design process, Deterministic Operations Research focuses on the design of solution methods for both continuous and discrete linear optimization problems. The result is a clear-cut resource for understanding three cornerstones of deterministic operations research

  13. Exercise and severe major depression: effect on symptom severity and quality of life at discharge in an inpatient cohort.

    Science.gov (United States)

    Schuch, F B; Vasconcelos-Moreno, M P; Borowsky, C; Zimmermann, A B; Rocha, N S; Fleck, M P

    2015-02-01

Exercise is a potential treatment for depression. However, few studies have evaluated the role of adjunct exercise in the treatment of severely depressed inpatients. The goal of this study was to evaluate the effects of add-on exercise on the usual treatment of severely depressed inpatients. Fifty participants were randomized to an exercise (exercise + usual treatment) or a control (usual treatment) group, with 25 patients randomly allocated to each group. The participants in the exercise group performed three sessions per week throughout the hospitalization period, with a goal dose of 16.5 kcal/kg/week, plus the usual pharmacological treatment. Depressive symptoms and the quality of life (QoL) of the participants were assessed at baseline, the second week, and discharge. A significant group × time interaction was found for depressive symptoms and the physical and psychological domains of QoL. Differences between groups occurred at the second week and discharge with respect to depressive symptoms and the physical and psychological domains of QoL. There was no difference in the remission rate at discharge (48% and 32% for the exercise and control groups, respectively). An NNT of 6.25 was found. No baseline characteristics significantly predicted remission at discharge. Add-on exercise is an efficacious treatment for severely depressed inpatients, improving their depressive symptoms and QoL. Initial acceptance of exercise remains a challenge. Copyright © 2014 Elsevier Ltd. All rights reserved.
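The reported NNT follows directly from the remission rates quoted above: the absolute risk reduction is 0.48 − 0.32 = 0.16, and NNT = 1/0.16 = 6.25. A quick check:

```python
# Number needed to treat, from the remission rates reported above.
remission_exercise = 0.48
remission_control = 0.32
arr = remission_exercise - remission_control  # absolute risk reduction
nnt = round(1 / arr, 2)                       # rounding absorbs float error
```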

  14. Efficiency of transport in periodic potentials: dichotomous noise contra deterministic force

    Science.gov (United States)

    Spiechowicz, J.; Łuczka, J.; Machura, L.

    2016-05-01

We study the transport of an inertial Brownian particle moving in a symmetric and periodic one-dimensional potential, and subjected to both a symmetric, unbiased external harmonic force and biased dichotomic noise η (t), also known as a random telegraph signal or a two-state continuous-time Markov process. In doing so, we concentrate on the previously reported regime (Spiechowicz et al 2014 Phys. Rev. E 90 032104) for which non-negative biased noise η (t) in the form of generalized white Poissonian noise can induce anomalous transport processes similar to those generated by a deterministic constant force F = ⟨η (t)⟩ but significantly more effective than F, i.e. the particle moves much faster, the velocity fluctuations are noticeably reduced and the transport efficiency is enhanced several times. Here, we confirm this result for the case of dichotomous fluctuations which, in contrast to white Poissonian noise, can assume positive as well as negative values and examine the role of thermal noise in the observed phenomenon. We focus our attention on the impact of bidirectionality of dichotomous fluctuations and reveal that the effect of nonequilibrium noise enhanced efficiency is still detectable. This result may explain transport phenomena occurring in strongly fluctuating environments of both physical and biological origin. Our predictions can be corroborated experimentally by use of a setup that consists of a resistively and capacitively shunted Josephson junction.

  15. Deterministic bound for avionics switched networks according to networking features using network calculus

    Directory of Open Access Journals (Sweden)

    Feng HE

    2017-12-01

Full Text Available State-of-the-art avionics systems adopt switched networks for airborne communications. A major concern in the design of such networks is the ability to guarantee end-to-end delays. Analytic methods such as network calculus and the trajectory approach have been developed to compute worst-case delays from the detailed configurations of flows and networks in the avionics context. What is still lacking is a method for rapid performance estimation from typical switched-networking features, such as network scale, bandwidth utilization and average flow rate. The goal of this paper is to establish a deterministic upper-bound analysis method using these networking features instead of the complete network configuration. Two deterministic upper bounds are proposed from a network calculus perspective: one for a basic estimation, and another that shows the benefits of a grouping strategy. In addition, a mathematical expression for grouping ability is established based on the concept of network connecting degree, which indicates the minimal possible grouping benefit. For a fully connected network with 4 switches and 12 end systems, the grouping ability from the grouping strategy is 15–20%, which coincides with the statistical data (18–22%) from the actual grouping advantage. Compared with the complete network calculus analysis for individual flows, the effectiveness of the two deterministic upper bounds is no less than 38%, even with remarkably varied packet lengths. Finally, the paper illustrates the design process for an industrial Avionics Full DupleX switched Ethernet (AFDX) networking case according to the two deterministic upper bounds and shows that better control of network connecting, when designing a switched network, can improve the worst-case delays dramatically. Keywords: Deterministic bound, Grouping ability, Network calculus, Networking features, Switched networks
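The flavor of such a deterministic bound can be shown with the most basic network-calculus result: a flow constrained by a token-bucket arrival curve α(t) = b + rt, served by a rate-latency node β(t) = R·(t − T)⁺, has worst-case delay T + b/R (for r ≤ R). A minimal sketch with hypothetical AFDX-like numbers, not taken from the paper's case study:

```python
def delay_bound(b, r, R, T):
    """Worst-case delay for a token-bucket flow (burst b bits, rate r bit/s)
    crossing a rate-latency server (rate R bit/s, latency T s).
    Valid when r <= R; otherwise the backlog grows without bound."""
    assert r <= R, "flow rate must not exceed service rate"
    return T + b / R

# Hypothetical virtual link: 4000-bit burst, 0.5 Mbit/s mean rate, through a
# 100 Mbit/s output port with 16 us switching latency.
d = delay_bound(b=4000, r=0.5e6, R=100e6, T=16e-6)
```

Here d = 16 µs + 4000/10⁸ s = 56 µs; per-hop bounds of this shape are what the paper's feature-based estimates approximate without needing the full flow configuration.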

  16. Separation of red blood cells in deep deterministic lateral displacement devices

    Science.gov (United States)

    Kabacaoglu, Gokberk; Biros, George

    2017-11-01

Microfluidic cell separation techniques are of great interest since they enable rapid medical diagnoses and tests. Deterministic lateral displacement (DLD) is one such technique. A DLD device consists of arrays of pillars. The main flow and the alignment of the pillars define two different directions. Size-based separation of rigid spherical particles is possible as they follow one of these directions depending on their size. However, the separation of non-spherical deformable particles such as red blood cells (RBCs) is more complicated due to their intricate dynamics. We study the separation of RBCs in DLD using an in-house integral equation solver. We systematically investigate the effects of the interior fluid viscosity and the membrane elasticity of an RBC on its behavior. These mechanical properties of a cell determine its deformability, which can be altered by several diseases. We particularly consider deep devices, in which an RBC can show rich dynamics such as tank-treading and tumbling. It turns out that a strong hydrodynamic lift force moves the tank-treading cells along the pillars, while a downward force leads the tumbling ones to move with the flow. Thereby, deformability-based separation of RBCs is possible.

  17. Entrepreneurs, Chance, and the Deterministic Concentration of Wealth

    Science.gov (United States)

    Fargione, Joseph E.; Lehman, Clarence; Polasky, Stephen

    2011-01-01

In many economies, wealth is strikingly concentrated. Entrepreneurs, individuals with ownership in for-profit enterprises, comprise a large portion of the wealthiest individuals, and their behavior may help explain patterns in the national distribution of wealth. Entrepreneurs are less diversified and more heavily invested in their own companies than is commonly assumed in economic models. We present an intentionally simplified individual-based model of wealth generation among entrepreneurs to assess the role of chance and determinism in the distribution of wealth. We demonstrate that chance alone, combined with the deterministic effects of compounding returns, can lead to unlimited concentration of wealth, such that the percentage of all wealth owned by a few entrepreneurs eventually approaches 100%. Specifically, concentration of wealth results when the rate of return on investment varies by entrepreneur and by time. This result is robust to the inclusion of realities such as differing skill among entrepreneurs. The most likely overall growth rate of the economy decreases as businesses become less diverse, suggesting that high concentrations of wealth may adversely affect a country's economic growth. We show that a tax on large inherited fortunes, applied to a small portion of the most fortunate in the population, can efficiently arrest the concentration of wealth at intermediate levels. PMID:21814540
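The core mechanism, chance plus compounding, reproduces in a few lines: give identical entrepreneurs i.i.d. random yearly returns that vary by individual and by time, and the top share grows. The population size, horizon, and return volatility below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(42)
n_entrepreneurs, years = 1000, 200
wealth = np.ones(n_entrepreneurs)  # identical start: only chance differentiates

for _ in range(years):
    # Rate of return varies by entrepreneur and by time -- the condition the
    # paper identifies for concentration; compounding does the rest.
    wealth *= rng.lognormal(mean=0.0, sigma=0.3, size=n_entrepreneurs)

# Share of total wealth held by the top 1% of entrepreneurs.
top_1pct_share = np.sort(wealth)[-n_entrepreneurs // 100:].sum() / wealth.sum()
```

With these settings the top 1% end up holding the bulk of all wealth despite every entrepreneur starting equal and drawing returns from the same distribution.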

  18. Fisher-Wright model with deterministic seed bank and selection.

    Science.gov (United States)

    Koopmann, Bendix; Müller, Johannes; Tellier, Aurélien; Živković, Daniel

    2017-04-01

Seed banks are common characteristics of many plant species, allowing storage of genetic diversity in the soil as dormant seeds for various periods of time. We investigate an above-ground population following a Fisher-Wright model with selection, coupled with a deterministic seed bank, assuming that the length of the seed bank is kept constant and the number of seeds is large. To assess the combined impact of seed banks and selection on genetic diversity, we derive a general diffusion model. The applied techniques outline a path of approximating a stochastic delay differential equation by an appropriately rescaled stochastic differential equation. We compute the equilibrium solution of the site-frequency spectrum and derive the times to fixation of an allele with and without selection. Finally, it is demonstrated that seed banks enhance the effect of selection on the site-frequency spectrum while slowing down the time until the mutation-selection equilibrium is reached. Copyright © 2016 Elsevier Inc. All rights reserved.
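A toy version of such a model is easy to simulate. The paper's diffusion analysis is far more involved, but the sketch below shows the basic coupling of binomial drift, viability selection, and a deterministic seed bank; the uniform seed-bank kernel, population size, and selection coefficient are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def wf_seedbank(N=500, s=0.02, m=5, generations=2000, p0=0.1):
    """Wright-Fisher sampling with selection, where reproduction draws from
    a deterministic seed bank: here, the uniform average of the allele
    frequencies of the last m generations (an illustrative kernel).
    Returns the final allele frequency of the above-ground population."""
    freqs = [p0] * m
    for _ in range(generations):
        p = np.mean(freqs[-m:])                        # seed-bank frequency
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))  # viability selection
        freqs.append(rng.binomial(N, p_sel) / N)       # binomial drift
    return freqs[-1]

final_freq = wf_seedbank()
```

Averaging over m past generations damps the per-generation frequency jumps, which is the simulated counterpart of the slowed approach to equilibrium described above.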

  19. Rapid detection of small oscillation faults via deterministic learning.

    Science.gov (United States)

    Wang, Cong; Chen, Tianrui

    2011-08-01

Detection of small faults is one of the most important and challenging tasks in the area of fault diagnosis. In this paper, we present an approach for the rapid detection of small oscillation faults based on a recently proposed deterministic learning (DL) theory. The approach consists of two phases: the training phase and the test phase. In the training phase, the system dynamics underlying normal and fault oscillations are locally accurately approximated through DL. The obtained knowledge of system dynamics is stored in constant radial basis function (RBF) networks. In the test phase, rapid detection is implemented. Specifically, a bank of estimators is constructed using the constant RBF neural networks to represent the trained normal and fault modes. By comparing the set of estimators with the monitored system under test, a set of residuals is generated, and the average L1 norms of the residuals are taken as the measure of the differences between the dynamics of the monitored system and the dynamics of the trained normal mode and oscillation faults. The occurrence of a test oscillation fault can be rapidly detected according to the smallest-residual principle. A rigorous analysis of the performance of the detection scheme is also given. The novelty of the paper lies in that the modeling uncertainty and nonlinear fault functions are accurately approximated and then this knowledge is utilized to achieve rapid detection of small oscillation faults. Simulation studies are included to demonstrate the effectiveness of the approach.
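The smallest-residual principle itself is simple to illustrate. In place of the constant-RBF-network approximations of the system dynamics, the sketch below compares a monitored signal against fixed reference oscillations; the signals, fault frequencies, and noise level are all hypothetical:

```python
import numpy as np

t = np.linspace(0, 10, 1000)

# Stored training modes: a normal oscillation and two hypothetical fault
# oscillations (stand-ins for the constant RBF network approximations).
modes = {
    "normal": np.sin(2 * np.pi * 1.0 * t),
    "fault_A": np.sin(2 * np.pi * 1.2 * t),       # small frequency shift
    "fault_B": 0.8 * np.sin(2 * np.pi * 1.0 * t), # small amplitude drop
}

# Monitored system: an oscillation matching fault_A, plus measurement noise.
rng = np.random.default_rng(3)
monitored = np.sin(2 * np.pi * 1.2 * t) + rng.normal(0.0, 0.05, t.size)

# Average L1 norm of each residual; the smallest residual names the mode.
residuals = {name: np.mean(np.abs(monitored - ref)) for name, ref in modes.items()}
detected = min(residuals, key=residuals.get)
```

The residual against the matching mode collapses to roughly the noise floor while the others stay large, which is why even small oscillation faults separate cleanly under this measure.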

  20. The effect of head size/shape, miscentering, and bowtie filter on peak patient tissue doses from modern brain perfusion 256-slice CT: How can we minimize the risk for deterministic effects?

    International Nuclear Information System (INIS)

    Perisinakis, Kostas; Seimenis, Ioannis; Tzedakis, Antonis; Papadakis, Antonios E.; Damilakis, John

    2013-01-01

Purpose: To determine patient-specific absorbed peak doses to the skin, eye lens, brain parenchyma, and cranial red bone marrow (RBM) of adult individuals subjected to low-dose brain perfusion CT studies on a 256-slice CT scanner, and to investigate the effect of patient head size/shape, head position during the examination, and bowtie filter used on peak tissue doses. Methods: The peak doses to eye lens, skin, brain, and RBM were measured in 106 individual-specific adult head phantoms subjected to the standard low-dose brain perfusion CT on a 256-slice CT scanner using novel Monte Carlo simulation software dedicated to patient CT dosimetry. Peak tissue doses were compared to the corresponding thresholds for induction of cataract, erythema, cerebrovascular disease, and depression of hematopoiesis, respectively. The effects of patient head size/shape, head position during acquisition, and bowtie filter used on the resulting peak patient tissue doses were investigated. The effect of eye-lens position in the scanned head region was also investigated. The effect of miscentering and the use of a narrow bowtie filter on image quality was assessed. Results: The mean peak doses to eye lens, skin, brain, and RBM were found to be 124, 120, 95, and 163 mGy, respectively. The effect of patient head size and shape on peak tissue doses was found to be minimal, since maximum differences were less than 7%. Patient head miscentering and bowtie filter selection were found to have a considerable effect on peak tissue doses. The peak eye-lens dose saving achieved by elevating the head by 4 cm with respect to the isocenter and using a narrow wedge filter was found to approach 50%. When the eye lies outside of the primarily irradiated head region, the dose to the eye lens was found to drop to less than 20% of the corresponding dose measured when the eye lens was located in the middle of the x-ray beam. 
Positioning head phantom off-isocenter by 4 cm and employing a narrow wedge filter results in a moderate reduction of

  1. Using Reputation Systems and Non-Deterministic Routing to Secure Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Juan-Mariano de Goyeneche

    2009-05-01

    Full Text Available Security in wireless sensor networks is difficult to achieve because of the resource limitations of the sensor nodes. We propose a trust-based decision framework for wireless sensor networks coupled with a non-deterministic routing protocol. Both provide a mechanism to effectively detect and confine common attacks, and, unlike previous approaches, allow bad reputation feedback to the network. This approach has been extensively simulated, obtaining good results, even for unrealistically complex attack scenarios.

  2. The philosophy of severe accident management in the US

    International Nuclear Information System (INIS)

    Baratta, A.J.

    1990-01-01

The US NRC has put forth the initial steps in what is viewed as the resolution of the severe accident issue. Underlying this process is a fundamental philosophy that, if followed, will likely lead to an order-of-magnitude reduction in the risk of severe accidents. Thus far, this philosophy has proven cost effective through improved performance. This paper briefly examines this philosophy and the next step in closure of the severe accident issue, the IPE. An example of the author's experience with deterministic ... (author)

  3. The effect of several adjuvants on glufosinate effectiveness in Conyza species.

    Science.gov (United States)

    Fernandez-Cerejido, M C; Bastida, F; Menendez, J

    2009-01-01

The effect of several adjuvants on the effectiveness of glufosinate, and the role of the adherence and contact angle modifications due to the presence of these adjuvants in the spraying solution in the increase in efficacy observed on the broadleaved weeds Conyza albida and Conyza bonariensis, has been determined under controlled laboratory conditions. The adjuvants used in the experiment were a mixture of methyl oleate and palmitate (MO/MP), a dodecylbenzene ammonium sulphonate (DBAS), a paraffinic oil (PO), an alkylglycol ester (AGE), and a lecithin + propionic acid + non-ionic surfactant based product (LI-700). Dose-response experiments showed that C. albida displayed higher susceptibility to glufosinate than C. bonariensis, regardless of the adjuvant tested. However, none of the mixtures increased the herbicide effectiveness on C. albida, with LI-700 and PO showing an antagonistic effect on herbicide efficacy. On C. bonariensis, MO/MP and DBAS showed significantly better results than non-amended glufosinate controls, with LI-700 again showing an antagonistic effect. Both the adherence and contact angle studies were inconclusive, since the adjuvants with the best adherence and contact angle values were not the most effective ones. Therefore, other parameters putatively modified by adjuvants, such as herbicide penetration, should be investigated.

  4. Learning to Act: Qualitative Learning of Deterministic Action Models

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2017-01-01

In this article we study learnability of fully observable, universally applicable action models of dynamic epistemic logic. We introduce a framework for actions seen as sets of transitions between propositional states and we relate them to their dynamic epistemic logic representations as action...... in the limit (inconclusive convergence to the right action model). We show that deterministic actions are finitely identifiable, while arbitrary (non-deterministic) actions require more learning power—they are identifiable in the limit. We then move on to a particular learning method, i.e. learning via update......, which proceeds via restriction of a space of events within a learning-specific action model. We show how this method can be adapted to learn conditional and unconditional deterministic action models. We propose update learning mechanisms for the aforementioned classes of actions and analyse...

  5. Deterministic and stochastic CTMC models from Zika disease transmission

    Science.gov (United States)

    Zevika, Mona; Soewono, Edy

    2018-03-01

    Zika infection is one of the most important mosquito-borne diseases in the world. Zika virus (ZIKV) is transmitted by many Aedes-type mosquitoes including Aedes aegypti. Pregnant women with the Zika virus are at risk of having a fetus or infant with a congenital defect and suffering from microcephaly. Here, we formulate a Zika disease transmission model using two approaches, a deterministic model and a continuous-time Markov chain stochastic model. The basic reproduction ratio is constructed from a deterministic model. Meanwhile, the CTMC stochastic model yields an estimate of the probability of extinction and outbreaks of Zika disease. Dynamical simulations and analysis of the disease transmission are shown for the deterministic and stochastic models.
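The deterministic half of such a transmission model can be sketched as a small vector-host ODE system. The parameter values and compartment structure below are illustrative assumptions, not those of the paper; the basic reproduction ratio uses the usual square-root next-generation form for a human-to-vector-to-human cycle.

```python
import numpy as np

# Hypothetical rates (per day), not the paper's fitted values.
beta_hv, beta_vh = 0.4, 0.3   # vector-to-human and human-to-vector transmission
gamma = 0.1                   # human recovery rate
mu_v = 0.05                   # vector mortality (and birth) rate

def r0():
    # Basic reproduction ratio: one full human -> vector -> human cycle.
    return np.sqrt((beta_hv / gamma) * (beta_vh / mu_v))

def simulate(days=200, dt=0.1):
    # Deterministic SIR (humans) / SI (vectors) dynamics, forward-Euler.
    S_h, I_h, R_h = 0.99, 0.01, 0.0   # human population fractions
    S_v, I_v = 0.995, 0.005           # vector population fractions
    for _ in range(int(days / dt)):
        new_h = beta_hv * S_h * I_v           # humans infected by vectors
        new_v = beta_vh * S_v * I_h           # vectors infected by humans
        dS_h = -new_h
        dI_h = new_h - gamma * I_h
        dR_h = gamma * I_h
        dS_v = mu_v * (1.0 - S_v) - new_v     # births replace dead vectors
        dI_v = new_v - mu_v * I_v
        S_h += dt * dS_h; I_h += dt * dI_h; R_h += dt * dR_h
        S_v += dt * dS_v; I_v += dt * dI_v
    return I_h, R_h
```

With these sample rates r0() is about 4.9, so the deterministic model predicts a large outbreak; the CTMC counterpart would instead give a nonzero extinction probability from the same rates.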

  6. SCALE6 Hybrid Deterministic-Stochastic Shielding Methodology for PWR Containment Calculations

    International Nuclear Information System (INIS)

    Matijevic, Mario; Pevec, Dubravko; Trontl, Kresimir

    2014-01-01

The capabilities and limitations of the SCALE6/MAVRIC hybrid deterministic-stochastic shielding methodology (CADIS and FW-CADIS) are demonstrated when applied to a realistic deep-penetration Monte Carlo (MC) shielding problem of a full-scale PWR containment model. The ultimate goal of such automatic variance reduction (VR) techniques is to achieve acceptable precision for the MC simulation in reasonable time by preparation of phase-space VR parameters via deterministic transport theory methods (discrete ordinates SN), generating a space-energy mesh-based adjoint function distribution. The hybrid methodology generates VR parameters that work in tandem (biased source distribution and importance map) in automated fashion, which is a paramount step for MC simulation of complex models with fairly uniform mesh tally uncertainties. The aim in this paper was determination of the neutron-gamma dose rate distribution (radiation field) over large portions of the PWR containment phase-space with uniform MC uncertainties. The sources of ionizing radiation included fission neutrons and gammas (reactor core) and gammas from the activated two-loop coolant. Special attention was given to focused adjoint source definition, which gave improved MC statistics in selected materials and/or regions of the complex model. We investigated benefits and differences of FW-CADIS over CADIS and manual (i.e. analog) MC simulation of particle transport. Computer memory consumption by the deterministic part of the hybrid methodology represents the main obstacle when using meshes with millions of cells together with high SN/PN parameters, so optimization of the control and numerical parameters of the deterministic module plays an important role in computer memory management. We investigated the possibility of using the deterministic module (memory intense) with the broad-group library v7-27n19g, as opposed to the fine-group library v7-200n47g used with the MC module, to fully capture low-energy particle transport and secondary gamma emission. Compared with
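The core CADIS idea can be illustrated with a toy importance-sampling example (this is a generic sketch, not SCALE6/MAVRIC itself): the source is biased proportionally to an adjoint-based importance map, and particle weights are set so the estimator stays unbiased. When the detector response happens to be exactly proportional to the adjoint flux, the biased estimator has zero variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D source: a discrete source pdf q over four cells and a hypothetical
# adjoint flux phi_adj giving each cell's importance to the tally.
q = np.array([0.5, 0.3, 0.15, 0.05])        # true source distribution
phi_adj = np.array([0.01, 0.1, 1.0, 10.0])  # adjoint-based importance map

# CADIS-style biased source: q_hat proportional to q * phi_adj, with
# statistical weights w = q / q_hat so the mean is preserved.
q_hat = q * phi_adj
q_hat /= q_hat.sum()
w = q / q_hat

def estimate(contrib, n=100_000, biased=True):
    # Monte Carlo estimate of sum_i q_i * contrib_i by sampling source cells.
    if biased:
        cells = rng.choice(len(q), size=n, p=q_hat)
        return float(np.mean(w[cells] * contrib[cells]))
    cells = rng.choice(len(q), size=n, p=q)
    return float(np.mean(contrib[cells]))

contrib = phi_adj  # toy detector response per source cell
exact = float(np.dot(q, contrib))
```

Here `estimate(contrib)` reproduces `exact` with zero variance because every sampled term `w * contrib` is the same constant, while the analog estimator fluctuates; real problems only approach this ideal, which is why FW-CADIS targets uniform uncertainty over many tallies instead.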

  7. Inherent Conservatism in Deterministic Quasi-Static Structural Analysis

    Science.gov (United States)

    Verderaime, V.

    1997-01-01

    The cause of the long-suspected excessive conservatism in the prevailing structural deterministic safety factor has been identified as an inherent violation of the error propagation laws when reducing statistical data to deterministic values and then combining them algebraically through successive structural computational processes. These errors are restricted to the applied stress computations, and because mean and variations of the tolerance limit format are added, the errors are positive, serially cumulative, and excessively conservative. Reliability methods circumvent these errors and provide more efficient and uniform safe structures. The document is a tutorial on the deficiencies and nature of the current safety factor and of its improvement and transition to absolute reliability.
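The error-accumulation mechanism the abstract describes can be shown with a toy calculation (the numbers are assumed for illustration): adding a tolerance-limit margin to every term before summing inflates the combined margin relative to the root-sum-square combination required by the error propagation laws for independent variations.

```python
import math

# Hypothetical independent stress contributions: mean and 1-sigma variation
# (arbitrary units).
means = [100.0, 60.0, 40.0]
sigmas = [5.0, 4.0, 3.0]
k = 3.0  # tolerance-limit factor, e.g. a ~3-sigma design margin

# Deterministic practice criticized above: margins added serially,
# term by term, through successive computations.
serial = sum(m + k * s for m, s in zip(means, sigmas))

# Error-propagation law: combine independent sigmas in root-sum-square,
# then apply the margin once to the combined uncertainty.
rss = sum(means) + k * math.sqrt(sum(s * s for s in sigmas))
```

With these numbers the serial sum is 236 while the RSS combination is about 221.2, so the serial procedure carries an extra, cumulative margin that grows with the number of combined terms.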

  8. Deterministic and efficient quantum cryptography based on Bell's theorem

    International Nuclear Information System (INIS)

    Chen Zengbing; Pan Jianwei; Zhang Qiang; Bao Xiaohui; Schmiedmayer, Joerg

    2006-01-01

We propose a double-entanglement-based quantum cryptography protocol that is both efficient and deterministic. The proposal uses photon pairs entangled both in polarization and in time degrees of freedom; each measurement in which both of the two communicating parties register a photon can establish one and only one perfect correlation, and thus deterministically create a key bit. Eavesdropping can be detected by violation of local realism. A variation of the protocol shows higher security, similar to the six-state protocol, under individual attacks. Our scheme allows a robust implementation with current technology

  9. Seismic hazard in Romania associated to Vrancea subcrustal source Deterministic evaluation

    CERN Document Server

    Radulian, M; Moldoveanu, C L; Panza, G F; Vaccari, F

    2002-01-01

Our study presents an application of the deterministic approach to the particular case of Vrancea intermediate-depth earthquakes to show how efficient the numerical synthesis is in predicting realistic ground motion, and how some striking peculiarities of the observed intensity maps are properly reproduced. The deterministic approach proposed by Costa et al. (1993) is particularly useful to compute seismic hazard in Romania, where the most destructive effects are caused by the intermediate-depth earthquakes generated in the Vrancea region. Vrancea is unique among the seismic sources of the world because of its striking peculiarities: the extreme concentration of seismicity with a remarkable invariance of the foci distribution, the unusually high rate of strong shocks (an average frequency of 3 events with magnitude greater than 7 per century) inside an exceptionally narrow focal volume, the predominance of a reverse faulting mechanism with the T-axis almost vertical and the P-axis almost horizontal and the mo...

  10. Deterministic and stochastic trends in the Lee-Carter mortality model

    DEFF Research Database (Denmark)

    Callot, Laurent; Haldrup, Niels; Kallestrup-Lamb, Malene

The Lee and Carter (1992) model assumes that the deterministic and stochastic time series dynamics load with identical weights when describing the development of age-specific mortality rates. Effectively this means that the main characteristics of the model simplify to a random walk model...... that characterizes mortality data. We find empirical evidence that this feature of the Lee-Carter model overly restricts the system dynamics, and we suggest separating the deterministic and stochastic time series components to the benefit of improved fit and forecasting performance. In fact, we find...... that the classical Lee-Carter model will otherwise overestimate the reduction of mortality for the younger age groups and underestimate the reduction of mortality for the older age groups. In practice, our recommendation means that the Lee-Carter model, instead of a one-factor model, should be formulated...
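The classical one-factor Lee-Carter fit the abstract refers to can be sketched in a few lines: log mortality rates are decomposed as log m(x,t) = a(x) + b(x)k(t) via a rank-1 SVD, and the mortality index k(t) is then extrapolated as a random walk with drift. The synthetic data below are illustrative, not the paper's mortality surface.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic log mortality surface (ages x years) with a declining time trend.
n_age, n_year = 10, 40
true_a = np.linspace(-6.0, -2.0, n_age)          # age profile a(x)
true_b = np.full(n_age, 1.0 / n_age)             # age loadings b(x), sum = 1
true_k = -0.5 * np.arange(n_year)                # declining mortality index k(t)
log_m = (true_a[:, None] + np.outer(true_b, true_k)
         + 0.01 * rng.standard_normal((n_age, n_year)))

# Classical Lee-Carter fit: a(x) = row means; b(x), k(t) from the leading
# singular triplet of the residuals, normalized so that sum(b) = 1.
a = log_m.mean(axis=1)
U, S, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
b = U[:, 0] / U[:, 0].sum()
k = S[0] * Vt[0] * U[:, 0].sum()

# Random-walk-with-drift forecast of the index, as in Lee-Carter practice.
drift = (k[-1] - k[0]) / (len(k) - 1)
k_forecast = k[-1] + drift * np.arange(1, 11)
```

The single drift term is exactly the restriction the abstract criticizes: one stochastic trend drives all ages through the fixed loadings b(x), so age groups cannot have separately evolving deterministic and stochastic components.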

  11. Analysis of deterministic swapping of photonic and atomic states through single-photon Raman interaction

    Science.gov (United States)

    Rosenblum, Serge; Borne, Adrien; Dayan, Barak

    2017-03-01

    The long-standing goal of deterministic quantum interactions between single photons and single atoms was recently realized in various experiments. Among these, an appealing demonstration relied on single-photon Raman interaction (SPRINT) in a three-level atom coupled to a single-mode waveguide. In essence, the interference-based process of SPRINT deterministically swaps the qubits encoded in a single photon and a single atom, without the need for additional control pulses. It can also be harnessed to construct passive entangling quantum gates, and can therefore form the basis for scalable quantum networks in which communication between the nodes is carried out only by single-photon pulses. Here we present an analytical and numerical study of SPRINT, characterizing its limitations and defining parameters for its optimal operation. Specifically, we study the effect of losses, imperfect polarization, and the presence of multiple excited states. In all cases we discuss strategies for restoring the operation of SPRINT.

  12. Stochastic Simulation of Integrated Circuits with Nonlinear Black-Box Components via Augmented Deterministic Equivalents

    Directory of Open Access Journals (Sweden)

    MANFREDI, P.

    2014-11-01

Full Text Available This paper extends recent literature results concerning the statistical simulation of circuits affected by random electrical parameters by means of the polynomial chaos framework. With respect to previous implementations, based on the generation and simulation of augmented and deterministic circuit equivalents, the modeling is extended to generic “black-box” multi-terminal nonlinear subcircuits describing complex devices, like those found in integrated circuits. Moreover, based on recently-published works in this field, a more effective approach to generate the deterministic circuit equivalents is implemented, thus yielding more compact and efficient models for nonlinear components. The approach is fully compatible with commercial (e.g., SPICE-type) circuit simulators and is thoroughly validated through the statistical analysis of a realistic interconnect structure with a 16-bit memory chip. The accuracy and the comparison against previous approaches are also carefully established.
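A minimal polynomial-chaos sketch shows the underlying idea (generic Hermite PC of a scalar response, not the paper's augmented-circuit formulation): a nonlinear response of a Gaussian parameter is projected onto Hermite polynomials, and the mean and variance are read directly off the coefficients instead of running a Monte Carlo sweep.

```python
import numpy as np

def hermite(n, x):
    # Probabilists' Hermite polynomials He_n via the standard recurrence.
    h = [np.ones_like(x), x]
    for k in range(1, n):
        h.append(x * h[k] - k * h[k - 1])
    return h[n]

def pc_coeffs(f, order=4, quad=40):
    # Project f onto He_n using Gauss-Hermite quadrature (weight e^{-t^2}),
    # with the change of variable x = sqrt(2) t for a standard normal input.
    t, w = np.polynomial.hermite.hermgauss(quad)
    x = np.sqrt(2.0) * t
    coeffs, fact = [], 1.0
    for n in range(order + 1):
        if n > 0:
            fact *= n                     # <He_n, He_n> = n!
        proj = np.sum(w * f(x) * hermite(n, x)) / np.sqrt(np.pi)
        coeffs.append(proj / fact)
    return np.array(coeffs)

# Example nonlinear response of one random parameter p ~ N(0,1)
# (a hypothetical stand-in for a black-box component characteristic).
c = pc_coeffs(lambda p: np.exp(0.3 * p))

mean_pc = c[0]                            # mean is the zeroth coefficient
var_pc, fact = 0.0, 1.0
for n in range(1, len(c)):
    fact *= n
    var_pc += fact * c[n] ** 2            # variance from higher coefficients
```

In the circuit setting, each PC coefficient becomes one branch of the augmented deterministic equivalent, which is what lets a standard SPICE-type solver compute the statistics in a single run.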

  13. The cost effectiveness of pandemic influenza interventions: a pandemic severity based analysis.

    Directory of Open Access Journals (Sweden)

    George J Milne

    Full Text Available BACKGROUND: The impact of a newly emerged influenza pandemic will depend on its transmissibility and severity. Understanding how these pandemic features impact on the effectiveness and cost effectiveness of alternative intervention strategies is important for pandemic planning. METHODS: A cost effectiveness analysis of a comprehensive range of social distancing and antiviral drug strategies intended to mitigate a future pandemic was conducted using a simulation model of a community of ∼30,000 in Australia. Six pandemic severity categories were defined based on case fatality ratio (CFR, using data from the 2009/2010 pandemic to relate hospitalisation rates to CFR. RESULTS: Intervention strategies combining school closure with antiviral treatment and prophylaxis are the most cost effective strategies in terms of cost per life year saved (LYS for all severity categories. The cost component in the cost per LYS ratio varies depending on pandemic severity: for a severe pandemic (CFR of 2.5% the cost is ∼$9 k per LYS; for a low severity pandemic (CFR of 0.1% this strategy costs ∼$58 k per LYS; for a pandemic with very low severity similar to the 2009 pandemic (CFR of 0.03% the cost is ∼$155 per LYS. With high severity pandemics (CFR >0.75% the most effective attack rate reduction strategies are also the most cost effective. During low severity pandemics costs are dominated by productivity losses due to illness and social distancing interventions, while for high severity pandemics costs are dominated by hospitalisation costs and productivity losses due to death. CONCLUSIONS: The most cost effective strategies for mitigating an influenza pandemic involve combining sustained social distancing with the use of antiviral agents. For low severity pandemics the most cost effective strategies involve antiviral treatment, prophylaxis and short durations of school closure; while these are cost effective they are less effective than other strategies in
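The cost-per-LYS metric used above is simple to compute once the model outputs are in hand; the sketch below uses invented numbers purely to show how pandemic severity (CFR) drives the ratio for one and the same intervention.

```python
# Illustrative cost-per-life-year-saved calculation (toy inputs, not the
# paper's data): compare an intervention against a no-intervention baseline.
def cost_per_lys(cases_averted, cfr, cost_intervention, cost_per_case,
                 cost_per_death, life_years_per_death=30.0):
    deaths_averted = cases_averted * cfr
    lys = deaths_averted * life_years_per_death
    # Net cost: intervention spending minus averted illness and death costs.
    net_cost = (cost_intervention
                - cases_averted * cost_per_case
                - deaths_averted * cost_per_death)
    return net_cost / lys

# Same intervention, different severity: a higher CFR means more life years
# saved per case averted, hence a lower cost per LYS.
severe = cost_per_lys(5000, 0.025, 2_000_000, 100, 5000)
mild = cost_per_lys(5000, 0.001, 2_000_000, 100, 5000)
```

This is the pattern reported in the abstract: the dollar figure per LYS falls by orders of magnitude as the CFR rises, even though the intervention and its cost are unchanged.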

  14. The Cost Effectiveness of Pandemic Influenza Interventions: A Pandemic Severity Based Analysis

    Science.gov (United States)

    Milne, George J.; Halder, Nilimesh; Kelso, Joel K.

    2013-01-01

    Background The impact of a newly emerged influenza pandemic will depend on its transmissibility and severity. Understanding how these pandemic features impact on the effectiveness and cost effectiveness of alternative intervention strategies is important for pandemic planning. Methods A cost effectiveness analysis of a comprehensive range of social distancing and antiviral drug strategies intended to mitigate a future pandemic was conducted using a simulation model of a community of ∼30,000 in Australia. Six pandemic severity categories were defined based on case fatality ratio (CFR), using data from the 2009/2010 pandemic to relate hospitalisation rates to CFR. Results Intervention strategies combining school closure with antiviral treatment and prophylaxis are the most cost effective strategies in terms of cost per life year saved (LYS) for all severity categories. The cost component in the cost per LYS ratio varies depending on pandemic severity: for a severe pandemic (CFR of 2.5%) the cost is ∼$9 k per LYS; for a low severity pandemic (CFR of 0.1%) this strategy costs ∼$58 k per LYS; for a pandemic with very low severity similar to the 2009 pandemic (CFR of 0.03%) the cost is ∼$155 per LYS. With high severity pandemics (CFR >0.75%) the most effective attack rate reduction strategies are also the most cost effective. During low severity pandemics costs are dominated by productivity losses due to illness and social distancing interventions, while for high severity pandemics costs are dominated by hospitalisation costs and productivity losses due to death. Conclusions The most cost effective strategies for mitigating an influenza pandemic involve combining sustained social distancing with the use of antiviral agents. For low severity pandemics the most cost effective strategies involve antiviral treatment, prophylaxis and short durations of school closure; while these are cost effective they are less effective than other strategies in reducing the

  15. A Comparison of Monte Carlo and Deterministic Solvers for keff and Sensitivity Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Haeck, Wim [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parsons, Donald Kent [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); White, Morgan Curtis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Saller, Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Favorite, Jeffrey A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-12-12

Verification and validation of our solutions for calculating the neutron reactivity for nuclear materials is a key issue to address for many applications, including criticality safety, research reactors, power reactors, and nuclear security. Neutronics codes solve variations of the Boltzmann transport equation. The two main variants are Monte Carlo and deterministic solutions, e.g. the MCNP [1] and PARTISN [2] codes, respectively. There have been many studies over the decades that examined the accuracy of such solvers, and the general conclusion is that when the problems are well-posed, either solver can produce accurate results. However, the devil is always in the details. The current study examines the issue of self-shielding and the stress it puts on deterministic solvers. Most Monte Carlo neutronics codes use continuous-energy descriptions of the neutron interaction data that are not subject to this effect. The issue of self-shielding occurs because of the discretisation of data used by the deterministic solutions. Multigroup data used in these solvers are the average cross section and scattering parameters over an energy range. Resonances in cross sections can occur that change the likelihood of interaction by one to three orders of magnitude over a small energy range. Self-shielding is the numerical effect whereby the average cross section in groups with strong resonances can be strongly affected, as neutrons within that material are preferentially absorbed or scattered out of the resonance energies. This affects both the average cross section and the scattering matrix.
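The self-shielding effect can be shown numerically in a few lines (a toy illustration in arbitrary units, not MCNP or PARTISN): inside one energy group containing a resonance, the flux dips where the cross section peaks, so the flux-weighted group average is far below the unweighted (infinite-dilution) average.

```python
import numpy as np

# Energy grid spanning a single multigroup bin (arbitrary units), with a
# Lorentzian resonance sitting on a flat background cross section.
E = np.linspace(1.0, 2.0, 10_000)
sigma = 1.0 + 500.0 / (1.0 + ((E - 1.5) / 0.005) ** 2)

# Infinite-dilution group average: plain (unweighted) mean over the group.
flat_avg = sigma.mean()

# Narrow-resonance approximation: the flux is depressed as ~1/sigma_t where
# the cross section is large, so sigma * phi is nearly flat.
phi = 1.0 / sigma
shielded_avg = (sigma * phi).sum() / phi.sum()
```

With these numbers the unweighted average is several times the flux-weighted one; a deterministic solver that skipped the self-shielding correction would use the inflated value, while a continuous-energy Monte Carlo code sees the true pointwise data and needs no correction.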

  16. Appearance of deterministic mixing behavior from ensembles of fluctuating hydrodynamics simulations of the Richtmyer-Meshkov instability

    KAUST Repository

    Narayanan, Kiran

    2018-04-19

We obtain numerical solutions of the two-fluid fluctuating compressible Navier-Stokes (FCNS) equations, which consistently account for thermal fluctuations from meso- to macroscales, in order to study the effect of such fluctuations on the mixing behavior in the Richtmyer-Meshkov instability (RMI). The numerical method used was successfully verified in two stages: for the deterministic fluxes by comparison against air-SF6 RMI experiment, and for the stochastic terms by comparison against the direct simulation Monte Carlo results for He-Ar RMI. We present results from fluctuating hydrodynamic RMI simulations for three He-Ar systems having length scales with decreasing order of magnitude that span from macroscopic to mesoscopic, with different levels of thermal fluctuations characterized by a nondimensional Boltzmann number (Bo). For a multidimensional FCNS system on a regular Cartesian grid, when using a discretization of a space-time stochastic flux Z(x,t) of the form Z(x,t) → (1/√(h³Δt)) N(ih, nΔt) for spatial interval h, time interval Δt, and Gaussian noise N, the spatial interval h should be greater than h0, with h0 corresponding to a cell volume that contains a sufficient number of molecules of the fluid such that the fluctuations are physically meaningful and produce the right equilibrium spectrum. For the mesoscale RMI systems simulated, it was desirable to use a cell size smaller than this limit in order to resolve the viscous shock. This was achieved by using a modified regularization of the noise term via Z(x,t) → (1/√(max(h³, h0³)Δt)) N(ih, nΔt), with h0 = ξh. When Bo ≪ 1, deterministic mixing behavior emerges as the ensemble-averaged behavior of several fluctuating instances, whereas when Bo ≈ 1, a deviation from deterministic behavior is observed. For all cases, the FCNS solution provides bounds on the growth rate of the amplitude of the mixing layer.
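The cell-volume scaling of the discretized noise, and its clipping below the mesoscale cutoff, can be sketched as follows (a schematic in arbitrary units, not the paper's FCNS solver):

```python
import numpy as np

rng = np.random.default_rng(2)

def stochastic_flux(h, dt, h0, shape):
    # Discretized space-time white noise: Z -> N(ih, n*dt) / sqrt(V * dt),
    # where V is the cell volume. Below the cutoff h0 the volume is clipped
    # at h0**3 (the modified regularization described in the abstract), which
    # prevents unphysically large fluctuations in under-resolved cells.
    V = max(h ** 3, h0 ** 3)
    return rng.standard_normal(shape) / np.sqrt(V * dt)

# The sample variance follows the 1/(V*dt) scaling of discretized white noise.
z_coarse = stochastic_flux(h=2e-2, dt=1e-3, h0=1e-2, shape=10**6)  # h > h0
z_fine = stochastic_flux(h=1e-3, dt=1e-3, h0=1e-2, shape=10**6)    # clipped
```

Without the clip, the fine cell (h = 1e-3) would see its noise variance grow by a factor of (h0/h)³ = 1000 relative to the regularized value, which is exactly the blow-up the modified term suppresses.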

  17. Appearance of deterministic mixing behavior from ensembles of fluctuating hydrodynamics simulations of the Richtmyer-Meshkov instability

    KAUST Repository

    Narayanan, Kiran; Samtaney, Ravi

    2018-01-01

We obtain numerical solutions of the two-fluid fluctuating compressible Navier-Stokes (FCNS) equations, which consistently account for thermal fluctuations from meso- to macroscales, in order to study the effect of such fluctuations on the mixing behavior in the Richtmyer-Meshkov instability (RMI). The numerical method used was successfully verified in two stages: for the deterministic fluxes by comparison against air-SF6 RMI experiment, and for the stochastic terms by comparison against the direct simulation Monte Carlo results for He-Ar RMI. We present results from fluctuating hydrodynamic RMI simulations for three He-Ar systems having length scales with decreasing order of magnitude that span from macroscopic to mesoscopic, with different levels of thermal fluctuations characterized by a nondimensional Boltzmann number (Bo). For a multidimensional FCNS system on a regular Cartesian grid, when using a discretization of a space-time stochastic flux Z(x,t) of the form Z(x,t) → (1/√(h³Δt)) N(ih, nΔt) for spatial interval h, time interval Δt, and Gaussian noise N, the spatial interval h should be greater than h0, with h0 corresponding to a cell volume that contains a sufficient number of molecules of the fluid such that the fluctuations are physically meaningful and produce the right equilibrium spectrum. For the mesoscale RMI systems simulated, it was desirable to use a cell size smaller than this limit in order to resolve the viscous shock. This was achieved by using a modified regularization of the noise term via Z(x,t) → (1/√(max(h³, h0³)Δt)) N(ih, nΔt), with h0 = ξh. When Bo ≪ 1, deterministic mixing behavior emerges as the ensemble-averaged behavior of several fluctuating instances, whereas when Bo ≈ 1, a deviation from deterministic behavior is observed. For all cases, the FCNS solution provides bounds on the growth rate of the amplitude of the mixing layer.

  18. The Effect of Colour Psychodynamic Environment on the Psychophysiological and Behavioural Reactions of Severely Handicapped Children. Effects of Colour/Light Changes on Severely Handicapped Children.

    Science.gov (United States)

    Wohlfarth, H.; Sam, C.

The effects of varied lighting and coloring in the classroom environment on the behavior of seven severely handicapped 8- to 11-year-olds with behavior problems were examined. Analysis of changes in systolic blood pressure indicated that Ss were more comfortable and relaxed in the experimental room (in which the fluorescent lights were replaced by…

  19. Fault Detection for Nonlinear Process With Deterministic Disturbances: A Just-In-Time Learning Based Data Driven Method.

    Science.gov (United States)

    Yin, Shen; Gao, Huijun; Qiu, Jianbin; Kaynak, Okyay

    2017-11-01

Data-driven fault detection plays an important role in industrial systems due to its applicability in cases where physical models are unknown. In fault detection, disturbances must be taken into account as an inherent characteristic of processes. Nevertheless, fault detection for nonlinear processes with deterministic disturbances still receives little attention, especially in the data-driven field. To solve this problem, a just-in-time learning-based data-driven (JITL-DD) fault detection method for nonlinear processes with deterministic disturbances is proposed in this paper. JITL-DD employs the JITL scheme for process description with local model structures to cope with process dynamics and nonlinearity. The proposed method provides a data-driven fault detection solution for nonlinear processes with deterministic disturbances, and offers inherent online adaptation and high fault detection accuracy. Two nonlinear systems, i.e., a numerical example and a sewage treatment process benchmark, are employed to show the effectiveness of the proposed method.
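The just-in-time-learning idea can be sketched with a generic local-model detector (this is an illustration of the JITL scheme in general, not the paper's exact JITL-DD algorithm): for each query, a local linear model is fitted on the k nearest historical samples, and a large prediction residual flags a fault.

```python
import numpy as np

rng = np.random.default_rng(3)

# Historical fault-free process data: two inputs, one nonlinear output.
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.01 * rng.standard_normal(500)

def jitl_residual(x_q, y_q, k=25):
    # Just-in-time step: select the k nearest stored samples to the query.
    d = np.linalg.norm(X - x_q, axis=1)
    idx = np.argsort(d)[:k]
    # Fit a local linear model (with intercept) on that neighborhood only.
    A = np.hstack([X[idx], np.ones((k, 1))])
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    y_hat = np.append(x_q, 1.0) @ coef
    return abs(y_q - y_hat)

x_new = np.array([0.2, -0.3])
y_normal = np.sin(0.2) + 0.5 * 0.09    # fault-free measurement
y_faulty = y_normal + 1.0              # e.g. a sensor bias fault

r_ok = jitl_residual(x_new, y_normal)
r_bad = jitl_residual(x_new, y_faulty)  # should exceed any sensible threshold
```

Because a fresh local model is built per query, the detector adapts online to operating-point changes without retraining a global nonlinear model, which is the property the abstract highlights.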

  20. A systems approach to college drinking: development of a deterministic model for testing alcohol control policies.

    Science.gov (United States)

    Scribner, Richard; Ackleh, Azmy S; Fitzpatrick, Ben G; Jacquez, Geoffrey; Thibodeaux, Jeremy J; Rommel, Robert; Simonsen, Neal

    2009-09-01

    The misuse and abuse of alcohol among college students remain persistent problems. Using a systems approach to understand the dynamics of student drinking behavior and thus forecasting the impact of campus policy to address the problem represents a novel approach. Toward this end, the successful development of a predictive mathematical model of college drinking would represent a significant advance for prevention efforts. A deterministic, compartmental model of college drinking was developed, incorporating three processes: (1) individual factors, (2) social interactions, and (3) social norms. The model quantifies these processes in terms of the movement of students between drinking compartments characterized by five styles of college drinking: abstainers, light drinkers, moderate drinkers, problem drinkers, and heavy episodic drinkers. Predictions from the model were first compared with actual campus-level data and then used to predict the effects of several simulated interventions to address heavy episodic drinking. First, the model provides a reasonable fit of actual drinking styles of students attending Social Norms Marketing Research Project campuses varying by "wetness" and by drinking styles of matriculating students. Second, the model predicts that a combination of simulated interventions targeting heavy episodic drinkers at a moderately "dry" campus would extinguish heavy episodic drinkers, replacing them with light and moderate drinkers. Instituting the same combination of simulated interventions at a moderately "wet" campus would result in only a moderate reduction in heavy episodic drinkers (i.e., 50% to 35%). A simple, five-state compartmental model adequately predicted the actual drinking patterns of students from a variety of campuses surveyed in the Social Norms Marketing Research Project study. The model predicted the impact on drinking patterns of several simulated interventions to address heavy episodic drinking on various types of campuses.
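A compartmental model of this shape can be sketched as a five-state ODE system; the compartments match the abstract, but the transition structure and all rates below are illustrative assumptions, not the fitted Social Norms Marketing Research Project parameters.

```python
import numpy as np

# Schematic five-compartment drinking-style model: students move between
# abstainers (A), light (L), moderate (M), problem (P) and heavy episodic (H)
# drinkers; campus "wetness" scales the upward, social-interaction-driven flow.
def simulate(wetness=0.5, years=4.0, dt=0.01):
    y = np.array([0.2, 0.3, 0.3, 0.1, 0.1])   # fractions in A, L, M, P, H
    up, down = 0.8, 0.6                       # baseline annual transition rates
    for _ in range(int(years / dt)):
        heavy = y[3] + y[4]                   # social-norms driver
        flow_up = up * wetness * (0.2 + heavy) * y[:-1]   # A->L, ..., P->H
        flow_down = down * y[1:]                          # L->A, ..., H->P
        dy = np.zeros(5)
        dy[:-1] += flow_down - flow_up
        dy[1:] += flow_up - flow_down
        y += dt * dy
    return y
```

Because the upward flow depends on both wetness and the current heavy-drinking fraction, the same simulated intervention (e.g. lowering the interaction term) shifts the equilibrium much further on a dry campus than a wet one, qualitatively matching the abstract's results.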

  1. A Systems Approach to College Drinking: Development of a Deterministic Model for Testing Alcohol Control Policies*

    Science.gov (United States)

    Scribner, Richard; Ackleh, Azmy S.; Fitzpatrick, Ben G.; Jacquez, Geoffrey; Thibodeaux, Jeremy J.; Rommel, Robert; Simonsen, Neal

    2009-01-01

    Objective: The misuse and abuse of alcohol among college students remain persistent problems. Using a systems approach to understand the dynamics of student drinking behavior and thus forecasting the impact of campus policy to address the problem represents a novel approach. Toward this end, the successful development of a predictive mathematical model of college drinking would represent a significant advance for prevention efforts. Method: A deterministic, compartmental model of college drinking was developed, incorporating three processes: (1) individual factors, (2) social interactions, and (3) social norms. The model quantifies these processes in terms of the movement of students between drinking compartments characterized by five styles of college drinking: abstainers, light drinkers, moderate drinkers, problem drinkers, and heavy episodic drinkers. Predictions from the model were first compared with actual campus-level data and then used to predict the effects of several simulated interventions to address heavy episodic drinking. Results: First, the model provides a reasonable fit of actual drinking styles of students attending Social Norms Marketing Research Project campuses varying by “wetness” and by drinking styles of matriculating students. Second, the model predicts that a combination of simulated interventions targeting heavy episodic drinkers at a moderately “dry” campus would extinguish heavy episodic drinkers, replacing them with light and moderate drinkers. Instituting the same combination of simulated interventions at a moderately “wet” campus would result in only a moderate reduction in heavy episodic drinkers (i.e., 50% to 35%). Conclusions: A simple, five-state compartmental model adequately predicted the actual drinking patterns of students from a variety of campuses surveyed in the Social Norms Marketing Research Project study. The model predicted the impact on drinking patterns of several simulated interventions to address heavy

  2. Appearance of deterministic mixing behavior from ensembles of fluctuating hydrodynamics simulations of the Richtmyer-Meshkov instability

    Science.gov (United States)

    Narayanan, Kiran; Samtaney, Ravi

    2018-04-01

We obtain numerical solutions of the two-fluid fluctuating compressible Navier-Stokes (FCNS) equations, which consistently account for thermal fluctuations from meso- to macroscales, in order to study the effect of such fluctuations on the mixing behavior in the Richtmyer-Meshkov instability (RMI). The numerical method used was successfully verified in two stages: for the deterministic fluxes by comparison against air-SF6 RMI experiment, and for the stochastic terms by comparison against the direct simulation Monte Carlo results for He-Ar RMI. We present results from fluctuating hydrodynamic RMI simulations for three He-Ar systems having length scales with decreasing order of magnitude that span from macroscopic to mesoscopic, with different levels of thermal fluctuations characterized by a nondimensional Boltzmann number (Bo). For a multidimensional FCNS system on a regular Cartesian grid, when using a discretization of a space-time stochastic flux Z(x,t) of the form Z(x,t) → (1/√(h³Δt)) N(ih, nΔt) for spatial interval h, time interval Δt, and Gaussian noise N, the spatial interval h should be greater than h0, with h0 corresponding to a cell volume that contains a sufficient number of molecules of the fluid such that the fluctuations are physically meaningful and produce the right equilibrium spectrum. For the mesoscale RMI systems simulated, it was desirable to use a cell size smaller than this limit in order to resolve the viscous shock. This was achieved by using a modified regularization of the noise term via Z(x,t) → (1/√(max(h³, h0³)Δt)) N(ih, nΔt), with h0 = ξh. When Bo ≪ 1, deterministic mixing behavior emerges as the ensemble-averaged behavior of several fluctuating instances, whereas when Bo ≈ 1, a deviation from deterministic behavior is observed. For all cases, the FCNS solution provides bounds on the growth rate of the amplitude of the mixing layer.

  3. Deterministic Predictions of Vessel Responses Based on Past Measurements

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam; Jensen, Jørgen Juncher

    2017-01-01

    The paper deals with a prediction procedure from which global wave-induced responses can be deterministically predicted a short time, 10-50 s, ahead of current time. The procedure relies on the autocorrelation function and takes into account prior measurements only; i.e. knowledge about wave...
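
    A minimal sketch of an autocorrelation-based predictor in the spirit of the procedure above: a plain Yule-Walker AR fit on a synthetic narrow-banded signal. The paper's actual conditional-expectation machinery for wave-induced responses is more elaborate; this only illustrates prediction from prior measurements via the sample autocorrelation.

```python
import numpy as np

def sample_autocorr(x, maxlag):
    # Sample autocorrelation r_0..r_maxlag of a zero-meaned series.
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[: len(x) - k], x[k:]) / denom
                     for k in range(maxlag + 1)])

def predict_ahead(x, order, steps):
    # Fit AR coefficients from the autocorrelation (Yule-Walker equations)
    # and iterate the recursion to predict `steps` samples ahead, using
    # past measurements only.
    r = sample_autocorr(x, order)
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1 : order + 1])
    hist = list(x[-order:])
    out = []
    for _ in range(steps):
        nxt = float(np.dot(a, hist[::-1][:order]))
        out.append(nxt)
        hist.append(nxt)
    return np.array(out)

t = np.arange(400)
x = np.sin(2 * np.pi * t / 25.0)   # a synthetic narrow-banded "response"
pred = predict_ahead(x, order=2, steps=25)
```

    On a narrow-banded signal a low-order AR model captures the dominant oscillation, which is what makes a 10-50 s deterministic horizon plausible for wave-induced responses.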

  4. About the Possibility of Creation of a Deterministic Unified Mechanics

    International Nuclear Information System (INIS)

    Khomyakov, G.K.

    2005-01-01

    The possibility of creating a unified deterministic scheme of classical and quantum mechanics that preserves the achievements of both is discussed. It is shown that the canonical system of ordinary differential equations of Hamiltonian classical mechanics can be supplemented with a vector system of ordinary differential equations for the variables of the equations. The interpretational problems of quantum mechanics are also considered.

  5. Deterministic Versus Stochastic Interpretation of Continuously Monitored Sewer Systems

    DEFF Research Database (Denmark)

    Harremoës, Poul; Carstensen, Niels Jacob

    1994-01-01

    An analysis has been made of the uncertainty of input parameters to deterministic models for sewer systems. The analysis reveals a very significant uncertainty, which can be decreased, but not eliminated and has to be considered for engineering application. Stochastic models have a potential for ...

  6. Deterministic multimode photonic device for quantum-information processing

    DEFF Research Database (Denmark)

    Nielsen, Anne E. B.; Mølmer, Klaus

    2010-01-01

    We propose the implementation of a light source that can deterministically generate a rich variety of multimode quantum states. The desired states are encoded in the collective population of different ground hyperfine states of an atomic ensemble and converted to multimode photonic states by exci...

  7. Nonlinear deterministic structures and the randomness of protein sequences

    CERN Document Server

    Huang Yan Zhao

    2003-01-01

    To clarify the randomness of protein sequences, we make a detailed analysis of a set of typical protein sequences representing each structural class using a nonlinear prediction method. No deterministic structures are found in these protein sequences, which implies that they behave as random sequences. We also give an explanation for the controversial results obtained in previous investigations.

  8. Line and lattice networks under deterministic interference models

    NARCIS (Netherlands)

    Goseling, Jasper; Gastpar, Michael; Weber, Jos H.

    Capacity bounds are compared for four different deterministic models of wireless networks, representing four different ways of handling broadcast and superposition in the physical layer. In particular, the transport capacity under a multiple unicast traffic pattern is studied for a 1-D network of

  9. Comparison of deterministic and Monte Carlo methods in shielding design.

    Science.gov (United States)

    Oliveira, A D; Oliveira, C

    2005-01-01

    In shielding calculations, deterministic methods have some advantages and also some disadvantages relative to other kinds of codes, such as Monte Carlo. The main advantage is the short computer time needed to find solutions, while the disadvantages are related to the often-used build-up factor, which is extrapolated from high to low energies or applied under unknown geometrical conditions, which can lead to significant errors in shielding results. The aim of this work is to investigate how well some deterministic methods calculate low-energy shielding, using attenuation coefficients and build-up factor corrections. The commercial software MicroShield 5.05 has been used as the deterministic code, while MCNP has been used as the Monte Carlo code. Point and cylindrical sources with a slab shield have been defined, allowing comparison between the capabilities of both Monte Carlo and deterministic methods in day-by-day shielding calculations using sensitivity analysis of significant parameters, such as energy and geometrical conditions.
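
    In its simplest form, the deterministic side of such a comparison reduces to the textbook point-kernel formula, flux = B·S·e^(−μt)/(4πr²); the following is a sketch of that formula only, not MicroShield's implementation, and the buildup factor value is hypothetical:

```python
import math

def shielded_flux(S, mu, t, r, B=1.0):
    # Uncollided flux from an isotropic point source of strength S behind a
    # slab of thickness t with attenuation coefficient mu, at distance r,
    # scaled by a buildup factor B that restores the scattered contribution.
    return B * S * math.exp(-mu * t) / (4.0 * math.pi * r ** 2)

phi_bare = shielded_flux(1e6, 0.0, 0.0, 100.0)          # no shield
phi_shield = shielded_flux(1e6, 0.5, 2.0, 100.0)        # mu*t = 1 mean free path
phi_buildup = shielded_flux(1e6, 0.5, 2.0, 100.0, B=2.1)  # hypothetical B
```

    The energy and geometry dependence of B is exactly where the extrapolation errors discussed in the abstract enter.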

  10. Comparison of deterministic and Monte Carlo methods in shielding design

    International Nuclear Information System (INIS)

    Oliveira, A. D.; Oliveira, C.

    2005-01-01

    In shielding calculations, deterministic methods have some advantages and also some disadvantages relative to other kinds of codes, such as Monte Carlo. The main advantage is the short computer time needed to find solutions, while the disadvantages are related to the often-used build-up factor, which is extrapolated from high to low energies or applied under unknown geometrical conditions, which can lead to significant errors in shielding results. The aim of this work is to investigate how well some deterministic methods calculate low-energy shielding, using attenuation coefficients and build-up factor corrections. The commercial software MicroShield 5.05 has been used as the deterministic code, while MCNP has been used as the Monte Carlo code. Point and cylindrical sources with a slab shield have been defined, allowing comparison between the capabilities of both Monte Carlo and deterministic methods in day-by-day shielding calculations using sensitivity analysis of significant parameters, such as energy and geometrical conditions. (authors)

  11. Deterministic teleportation using single-photon entanglement as a resource

    DEFF Research Database (Denmark)

    Björk, Gunnar; Laghaout, Amine; Andersen, Ulrik L.

    2012-01-01

    We outline a proof that teleportation with a single particle is, in principle, just as reliable as with two particles. We thereby hope to dispel the skepticism surrounding single-photon entanglement as a valid resource in quantum information. A deterministic Bell-state analyzer is proposed which...

  12. A Deterministic Approach to the Synchronization of Cellular Automata

    OpenAIRE

    Garcia, J.; Garcia, P.

    2011-01-01

    In this work we introduce a deterministic scheme of synchronization of linear and nonlinear cellular automata (CA) with complex behavior, connected through a master-slave coupling. Using a definition of the Boolean derivative, we use the linear approximation of the automata to determine a coupling function that promotes synchronization without perturbing all the sites of the slave system.
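
    For a linear rule the idea can be illustrated directly: with elementary rule 90 (x_i ← x_{i−1} ⊕ x_{i+1}), pinning the slave to the master on every other site drives the synchronization error to zero without touching the remaining sites. A toy sketch of master-slave coupling, not the authors' exact scheme:

```python
import numpy as np

def rule90_step(state):
    # One step of the linear elementary CA rule 90:
    # x_i <- x_{i-1} XOR x_{i+1}, with periodic boundaries.
    return np.roll(state, 1) ^ np.roll(state, -1)

def synchronize(master, slave, steps, coupled_sites):
    # After each update the slave copies the master on `coupled_sites` only.
    # For rule 90, error at an uncoupled (odd) site depends only on its even
    # neighbors, so coupling every other site kills the error in two cycles.
    for _ in range(steps):
        master = rule90_step(master)
        slave = rule90_step(slave)
        slave[coupled_sites] = master[coupled_sites]
    return master, slave

rng = np.random.default_rng(1)
n = 64
m0 = rng.integers(0, 2, n).astype(np.uint8)
s0 = rng.integers(0, 2, n).astype(np.uint8)   # independent initial state
even = np.arange(0, n, 2)
master, slave = synchronize(m0, s0, steps=4, coupled_sites=even)
```

    The linearity of the rule (its Boolean derivative is state-independent) is what makes the error dynamics analyzable, which is the point exploited in the paper.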

  13. Deterministic and Stochastic Study of Wind Farm Harmonic Currents

    DEFF Research Database (Denmark)

    Sainz, Luis; Mesas, Juan Jose; Teodorescu, Remus

    2010-01-01

    Wind farm harmonic emissions are a well-known power quality problem, but little data based on actual wind farm measurements are available in literature. In this paper, harmonic emissions of an 18 MW wind farm are investigated using extensive measurements, and the deterministic and stochastic char...

  14. Mixed motion in deterministic ratchets due to anisotropic permeability

    NARCIS (Netherlands)

    Kulrattanarak, T.; Sman, van der R.G.M.; Lubbersen, Y.S.; Schroën, C.G.P.H.; Pham, H.T.M.; Sarro, P.M.; Boom, R.M.

    2011-01-01

    Nowadays microfluidic devices are becoming popular for cell/DNA sorting and fractionation. One class of these devices, namely deterministic ratchets, seems most promising for continuous fractionation applications of suspensions (Kulrattanarak et al., 2008 [1]). Next to the two main types of particle

  15. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  16. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  17. Using a satisfiability solver to identify deterministic finite state automata

    NARCIS (Netherlands)

    Heule, M.J.H.; Verwer, S.

    2009-01-01

    We present an exact algorithm for identification of deterministic finite automata (DFA) which is based on satisfiability (SAT) solvers. Despite the size of the low level SAT representation, our approach seems to be competitive with alternative techniques. Our contributions are threefold: First, we

  18. Deterministic mean-variance-optimal consumption and investment

    DEFF Research Database (Denmark)

    Christiansen, Marcus; Steffensen, Mogens

    2013-01-01

    In dynamic optimal consumption–investment problems one typically aims to find an optimal control from the set of adapted processes. This is also the natural starting point in case of a mean-variance objective. In contrast, we solve the optimization problem with the special feature that the consumption rate and the investment proportion are constrained to be deterministic processes. As a result we get rid of a series of unwanted features of the stochastic solution, including diffusive consumption, satisfaction points and consistency problems. Deterministic strategies typically appear in unit-linked life insurance contracts, where the life-cycle investment strategy is age dependent but wealth independent. We explain how optimal deterministic strategies can be found numerically and present an example from life insurance where we compare the optimal solution with suboptimal deterministic strategies...

  19. Simulation of photonic waveguides with deterministic aperiodic nanostructures for biosensing

    DEFF Research Database (Denmark)

    Neustock, Lars Thorben; Paulsen, Moritz; Jahns, Sabrina

    2016-01-01

    Photonic waveguides with deterministic aperiodic corrugations offer rich spectral characteristics under surface-normal illumination. The finite-element method (FEM), the finite-difference time-domain (FDTD) method and a rigorous coupled wave algorithm (RCWA) are compared for computing the near...

  20. Langevin equation with the deterministic algebraically correlated noise

    International Nuclear Information System (INIS)

    Ploszajczak, M.; Srokowski, T.

    1995-01-01

    Stochastic differential equations with the deterministic, algebraically correlated noise are solved for a few model problems. The chaotic force with both exponential and algebraic temporal correlations is generated by the adjoined extended Sinai billiard with periodic boundary conditions. The correspondence between the autocorrelation function for the chaotic force and both the survival probability and the asymptotic energy distribution of escaping particles is found. (author)

  1. Deterministic dense coding and faithful teleportation with multipartite graph states

    International Nuclear Information System (INIS)

    Huang, C.-Y.; Yu, I-C.; Lin, F.-L.; Hsu, L.-Y.

    2009-01-01

    We propose schemes to perform deterministic dense coding and faithful teleportation with multipartite graph states. We also find the necessary and sufficient condition for a viable graph state for the proposed schemes. That is, for the associated graph, the reduced adjacency matrix of the Tanner-type subgraph between senders and receivers should be invertible.
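
    The invertibility condition is an algebraic check over GF(2) and can be tested by Gaussian elimination; a sketch with hypothetical 2×2 reduced adjacency matrices (the check only, not the coding scheme):

```python
import numpy as np

def gf2_rank(M):
    # Rank of a binary matrix over GF(2) via Gaussian elimination with XOR.
    M = np.array(M) % 2
    rank = 0
    rows, cols = M.shape
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r, c]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]
        for r in range(rows):
            if r != rank and M[r, c]:
                M[r] ^= M[rank]
        rank += 1
    return rank

def is_viable(adj_sender_receiver):
    # The paper's criterion: the reduced adjacency matrix between senders
    # and receivers must be square and invertible over GF(2).
    M = np.asarray(adj_sender_receiver) % 2
    return M.shape[0] == M.shape[1] and gf2_rank(M) == M.shape[0]

good = np.array([[1, 1], [0, 1]])   # invertible over GF(2): viable
bad = np.array([[1, 1], [1, 1]])    # singular over GF(2): not viable
```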

  2. Deterministic algorithms for multi-criteria Max-TSP

    NARCIS (Netherlands)

    Manthey, Bodo

    2012-01-01

    We present deterministic approximation algorithms for the multi-criteria maximum traveling salesman problem (Max-TSP). Our algorithms are faster and simpler than the existing randomized algorithms. We devise algorithms for the symmetric and asymmetric multi-criteria Max-TSP that achieve ratios of

  3. A Deterministic Annealing Approach to Clustering AIRS Data

    Science.gov (United States)

    Guillaume, Alexandre; Braverman, Amy; Ruzmaikin, Alexander

    2012-01-01

    We will examine the validity of means and standard deviations as a basis for climate data products. We will explore the conditions under which these two simple statistics are inadequate summaries of the underlying empirical probability distributions by contrasting them with a nonparametric method called Deterministic Annealing.

  4. The concerted calculation of the BN-600 reactor for the deterministic and stochastic codes

    Science.gov (United States)

    Bogdanova, E. V.; Kuznetsov, A. N.

    2017-01-01

    The solution of the problem of increasing the safety of nuclear power plants implies the existence of complete and reliable information about the processes occurring in the core of a working reactor. Nowadays the Monte Carlo method is the most general-purpose method used to calculate the neutron-physical characteristics of a reactor, but it requires long computation times. Therefore, it may be useful to carry out coupled calculations with stochastic and deterministic codes. This article presents the results of research into the possibility of combining stochastic and deterministic algorithms in calculations of the BN-600 reactor. This is only one part of the work, which was carried out in the framework of the graduation project at the NRC “Kurchatov Institute” in cooperation with S. S. Gorodkov and M. A. Kalugin. It considers the 2-D layer of the BN-600 reactor core from the international benchmark test published in the report IAEA-TECDOC-1623. Calculations of the reactor were performed with the MCU code and then with a standard operative diffusion algorithm with constants taken from the Monte Carlo computation. Macro cross-sections, diffusion coefficients, the effective multiplication factor and the distributions of neutron flux and power were obtained in 15 energy groups. Reasonable agreement between the stochastic and deterministic calculations of the BN-600 is observed.
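
    The hand-off from the Monte Carlo code to the diffusion solver rests on flux-weighted condensation of cross sections into broad groups, σ_G = Σ_g σ_g φ_g / Σ_g φ_g. A sketch of that standard formula with made-up numbers, not the MCU interface:

```python
import numpy as np

def collapse(sigma_fine, flux_fine, group_bounds):
    # Flux-weighted condensation: for each broad group G spanning fine
    # groups [lo, hi), sigma_G = sum(sigma_g * phi_g) / sum(phi_g).
    sigma_broad = []
    for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
        phi = flux_fine[lo:hi]
        sigma_broad.append(np.dot(sigma_fine[lo:hi], phi) / phi.sum())
    return np.array(sigma_broad)

# Four fine groups collapsed into two broad groups (illustrative values).
sigma = np.array([2.0, 2.0, 4.0, 4.0])
flux = np.array([1.0, 1.0, 1.0, 3.0])
broad = collapse(sigma, flux, [0, 2, 4])
```

    The broad-group constants preserve reaction rates for the weighting flux, which is why the Monte Carlo flux makes a good weighting function for the subsequent diffusion calculation.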

  5. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K of PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, generally two kinds of uncertainties are required to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best estimate LOCA code, such as RELAP5 and TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Generally, the DRHM methodology can generate about 80-100 K of margin on PCT as compared to Appendix K bounding-state LOCA analysis.
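
    The statistical treatment of plant status uncertainty in such methodologies is commonly implemented with nonparametric (Wilks) tolerance limits: run the conservative model 59 times with sampled plant parameters and take the maximum PCT as a 95/95 bound. A sketch under that assumption; the toy PCT model and parameter distributions below are hypothetical, standing in for a conservative thermal-hydraulics code:

```python
import numpy as np

def wilks_sample_size(gamma=0.95, beta=0.95):
    # Smallest n with 1 - gamma**n >= beta: first-order Wilks criterion.
    # For gamma = beta = 0.95 this gives the familiar n = 59.
    n = 1
    while 1.0 - gamma**n < beta:
        n += 1
    return n

def toy_pct_model(power, gap_conductance):
    # Stand-in for an Appendix K evaluation-model PCT calculation
    # (purely illustrative coefficients, in kelvin).
    return 1000.0 + 300.0 * power - 50.0 * gap_conductance

rng = np.random.default_rng(2)
n = wilks_sample_size()
power = rng.normal(1.0, 0.02, n)   # sampled plant-status uncertainties
gap = rng.normal(1.0, 0.1, n)      # (hypothetical distributions)
pct_95_95 = max(toy_pct_model(p, g) for p, g in zip(power, gap))
```

    Keeping the physics models conservative while sampling only plant status is exactly the split between the "deterministic" and "realistic" halves of the hybrid approach.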

  6. Pattern and process of prescribed fires influence effectiveness at reducing wildfire severity in dry coniferous forests

    Science.gov (United States)

    Arkle, Robert S.; Pilliod, David S.; Welty, Justin L.

    2012-01-01

    We examined the effects of three early season (spring) prescribed fires on burn severity patterns of summer wildfires that occurred 1–3 years post-treatment in a mixed conifer forest in central Idaho. Wildfire and prescribed fire burn severities were estimated as the difference in normalized burn ratio (dNBR) using Landsat imagery. We used GIS derived vegetation, topography, and treatment variables to generate models predicting the wildfire burn severity of 1286–5500 30-m pixels within and around treated areas. We found that wildfire severity was significantly lower in treated areas than in untreated areas and significantly lower than the potential wildfire severity of the treated areas had treatments not been implemented. At the pixel level, wildfire severity was best predicted by an interaction between prescribed fire severity, topographic moisture, heat load, and pre-fire vegetation volume. Prescribed fire severity and vegetation volume were the most influential predictors. Prescribed fire severity, and its influence on wildfire severity, was highest in relatively warm and dry locations, which were able to burn under spring conditions. In contrast, wildfire severity peaked in cooler, more mesic locations that dried later in the summer and supported greater vegetation volume. We found considerable evidence that prescribed fires have landscape-level influences within treatment boundaries; most notable was an interaction between distance from the prescribed fire perimeter and distance from treated patch edges, which explained up to 66% of the variation in wildfire severity. Early season prescribed fires may not directly target the locations most at risk of high severity wildfire, but proximity of these areas to treated patches and the discontinuity of fuels following treatment may influence wildfire severity and explain how even low severity treatments can be effective management tools in fire-prone landscapes.

  7. Emergency Department Use by Nursing Home Residents: Effect of Severity of Cognitive Impairment

    Science.gov (United States)

    Stephens, Caroline E.; Newcomer, Robert; Blegen, Mary; Miller, Bruce; Harrington, Charlene

    2012-01-01

    Purpose: To examine the 1-year prevalence and risk of emergency department (ED) use and ambulatory care-sensitive (ACS) ED use by nursing home (NH) residents with different levels of severity of cognitive impairment (CI). Design and Methods: We used multinomial logistic regression to estimate the effect of CI severity on the odds of any ED visit…

  8. Curative effect of ganglioside sodium for adjuvant therapy on acute severe craniocerebral injury

    Directory of Open Access Journals (Sweden)

    Yun-Liang Deng

    2017-01-01

    Conclusions: The adjuvant therapy of ganglioside sodium in patients with severe craniocerebral injury can effectively reduce ICP, improve PbtO2 and alleviate the injuries of neurons and glial cells caused by oxidative stress.

  9. Safety and effect of high dose allopurinol in patients with severe left ventricular systolic dysfunction

    Directory of Open Access Journals (Sweden)

    Mohammad Mostafa Ansari-Ramandi

    2017-06-01

    Conclusion: Allopurinol could be of benefit in non-hyperuricemic patients with severe LV systolic dysfunction without significant adverse effects. Randomized clinical trials are needed in future to confirm the results.

  10. Deterministic diffusion in flower-shaped billiards.

    Science.gov (United States)

    Harayama, Takahisa; Klages, Rainer; Gaspard, Pierre

    2002-08-01

    We propose a flower-shaped billiard in order to study the irregular parameter dependence of chaotic normal diffusion. Our model is an open system consisting of periodically distributed obstacles in the shape of a flower, and it is strongly chaotic for almost all parameter values. We compute the parameter-dependent diffusion coefficient of this model from computer simulations and analyze its functional form using different schemes, all generalizing the simple random walk approximation of Machta and Zwanzig. The improved methods we use are based either on heuristic higher-order corrections to the simple random walk model, on lattice gas simulation methods, or they start from a suitable Green-Kubo formula for diffusion. We show that dynamical correlations, or memory effects, are of crucial importance in reproducing the precise parameter dependence of the diffusion coefficient.
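
    The Machta-Zwanzig baseline that the schemes above generalize treats transport as one uncorrelated hop of trap spacing ℓ per residence time τ, giving D = ℓ²/(2dτ) in d dimensions. A sketch recovering that value from a simulated 2-D lattice walk (our illustration; the memory effects the paper studies are precisely what this baseline omits):

```python
import numpy as np

def random_walk_diffusion(n_walkers, n_steps, ell, rng):
    # Uncorrelated nearest-neighbor hops of length ell on a 2-D lattice,
    # one hop per unit residence time; D is recovered from the mean
    # squared displacement via D = MSD / (4 t) in two dimensions.
    steps = rng.integers(0, 4, size=(n_walkers, n_steps))
    moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]]) * ell
    disp = moves[steps].sum(axis=1)          # final displacement per walker
    msd = (disp ** 2).sum(axis=1).mean()
    return msd / (4.0 * n_steps)

rng = np.random.default_rng(3)
ell = 0.5
D_sim = random_walk_diffusion(10000, 200, ell, rng)
D_mz = ell ** 2 / 4.0    # Machta-Zwanzig value with tau = 1, d = 2
```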

  11. A new deterministic model of strange stars

    Energy Technology Data Exchange (ETDEWEB)

    Rahaman, Farook; Shit, G.C. [Jadavpur University, Department of Mathematics, Kolkata, West Bengal (India); Chakraborty, Koushik [Government Training College, Department of Physics, Hooghly, West Bengal (India); Kuhfittig, P.K.F. [Milwaukee School of Engineering, Department of Mathematics, Milwaukee, WI (United States); Rahman, Mosiur [Meghnad Saha Institute of Technology, Department of Mathematics, Kolkata (India)

    2014-10-15

    The observed evidence for the existence of strange stars and the concomitant observed masses and radii are used to derive an interpolation formula for the mass as a function of the radial coordinate. The resulting general mass function becomes an effective model for a strange star. The analysis is based on the MIT bag model and yields the energy density, as well as the radial and transverse pressures. Using the interpolation function for the mass, it is shown that a mass-radius relation due to Buchdahl is satisfied in our model. We find the surface redshift (Z) corresponding to the compactness of the stars. Finally, from our results, we predict some characteristics of a strange star of radius 9.9 km. (orig.)

  12. Condensation and homogenization of cross sections for the deterministic transport codes with Monte Carlo method: Application to the GEN IV fast neutron reactors

    International Nuclear Information System (INIS)

    Cai, Li

    2014-01-01

    In the framework of Generation IV reactor neutronics research, new core calculation tools are being implemented in the code system APOLLO3 for the deterministic part. These calculation methods are based on the discretization concept of nuclear energy data (called multi-group, and generally produced by deterministic codes) and should be validated and qualified against Monte Carlo reference calculations. This thesis aims to develop an alternative technique for producing multi-group nuclear properties with a Monte Carlo code (TRIPOLI-4). First, after testing the existing homogenization and condensation functionalities with the improved precision available nowadays, some inconsistencies are revealed. Several new multi-group parameter estimators are developed and validated for the TRIPOLI-4 code with the aid of the code itself, since it can use multi-group constants in a core calculation. Secondly, the scattering anisotropy effect, which is necessary for handling the neutron leakage case, is studied. A correction technique concerning the diagonal of the first-order moment of the scattering matrix is proposed. It is named the IGSC technique and is based on the use of an approximate current introduced by Todorova. An improvement of the IGSC technique is then presented for strongly heterogeneous geometries. This improvement uses a more accurate current quantity, its projection on the abscissa X. The latter current represents the real situation better but is limited to 1-D geometries. Finally, a B1 leakage model is implemented in the TRIPOLI-4 code for generating multi-group cross sections with a fundamental-mode-based critical spectrum. This leakage model is analyzed and validated rigorously by comparison with other codes, Serpent and ECCO, as well as with an analytical case. The whole development work introduced in the TRIPOLI-4 code allows producing multi-group constants which can then be used in the core

  13. SWAT4.0 - The integrated burnup code system driving continuous energy Monte Carlo codes MVP, MCNP and deterministic calculation code SRAC

    International Nuclear Information System (INIS)

    Kashima, Takao; Suyama, Kenya; Takada, Tomoyuki

    2015-03-01

    There have been two versions of SWAT, depending on the details of its development history: the revised SWAT, which uses the deterministic calculation code SRAC as a neutron transport solver, and SWAT3.1, which uses the continuous energy Monte Carlo code MVP or MCNP5 for the same purpose. It takes several hours, however, to execute one calculation with a continuous energy Monte Carlo code, even on the supercomputer of the Japan Atomic Energy Agency. Moreover, two-dimensional burnup calculation is not practical with the revised SWAT because of problems in producing effective cross section data and applying them to arbitrary fuel geometries when a calculation model has multiple burnup zones. Therefore, SWAT4.0 has been developed by adding to SWAT3.1 a function that utilizes the deterministic code SRAC2006, which has a shorter calculation time, as an outer neutron transport module for burnup calculation. SWAT4.0 can execute two-dimensional burnup calculations by providing an input data template of SRAC2006 in the SWAT4.0 input data and updating the atomic number densities of burnup zones in each burnup step. This report describes the outline, input data instructions, and calculation examples of SWAT4.0. (author)

  14. A deterministic-probabilistic model for contaminant transport. User manual

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, F W; Crowe, A

    1980-08-01

    This manual describes a deterministic-probabilistic contaminant transport (DPCT) computer model designed to simulate mass transfer by ground-water movement in a vertical section of the earth's crust. The model can account for convection, dispersion, radioactive decay, and cation exchange for a single component. A velocity is calculated from the convective transport of the ground water for each reference particle in the modeled region; dispersion is accounted for in the particle motion by adding a random component to the deterministic motion. The model is sufficiently general to enable the user to specify virtually any type of water table or geologic configuration, and a variety of boundary conditions. A major emphasis in the model development has been placed on making the model simple to use, and the information provided in the User Manual will permit changes to the computer code to be made relatively easily, as might be required for specific applications. (author)
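
    The particle scheme described above, a deterministic convective step plus a random dispersive increment, is the standard random-walk analogue of the advection-dispersion equation. A 1-D sketch of that idea (not the DPCT code itself; v and D values are illustrative):

```python
import numpy as np

def track_particles(x0, v, D, dt, n_steps, rng):
    # Each reference particle moves with the convective velocity v and
    # receives a random dispersive increment sqrt(2*D*dt)*xi per step,
    # so the ensemble reproduces mean v*t and variance 2*D*t.
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x += v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(x.shape)
    return x

rng = np.random.default_rng(4)
x = track_particles(np.zeros(20000), v=1.0, D=0.1, dt=0.1, n_steps=100, rng=rng)
# After t = 10: plume center near v*t = 10, spread (variance) near 2*D*t = 2.
```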

  15. Deterministic chaos at the ocean surface: applications and interpretations

    Directory of Open Access Journals (Sweden)

    A. J. Palmer

    1998-01-01

    Ocean surface, grazing-angle radar backscatter data from two separate experiments, one of which provided coincident time series of measured surface winds, were found to exhibit signatures of deterministic chaos. Evidence is presented that the lowest dimensional underlying dynamical system responsible for the radar backscatter chaos is that which governs the surface wind turbulence. Block-averaging time was found to be an important parameter for determining the degree of determinism in the data as measured by the correlation dimension, and by the performance of an artificial neural network in retrieving wind and stress from the radar returns, and in radar detection of an ocean internal wave. The correlation dimensions are lowered and the performance of the deterministic retrieval and detection algorithms is improved by averaging out the higher dimensional surface wave variability in the radar returns.
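
    The correlation dimension used above can be estimated with the Grassberger-Procaccia correlation sum, C(r) ∝ r^D for small r. A sketch on synthetic point sets, using two radii in place of a full log-log fit (our illustration, not the paper's analysis pipeline):

```python
import numpy as np

def correlation_dimension(points, r1, r2):
    # Correlation sum C(r): fraction of point pairs closer than r.
    # The correlation dimension is the slope of log C(r) vs log r,
    # here estimated from two radii only.
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=-1))
    d = d[np.triu_indices(len(points), k=1)]
    c1 = (d < r1).mean()
    c2 = (d < r2).mean()
    return (np.log(c2) - np.log(c1)) / (np.log(r2) - np.log(r1))

rng = np.random.default_rng(5)
plane = rng.random((1500, 2))                            # fills a 2-D region
line = rng.random((1500, 1)) * np.array([[1.0, 0.0]])    # lies on a line
dim_plane = correlation_dimension(plane, 0.05, 0.1)
dim_line = correlation_dimension(line, 0.05, 0.1)
```

    A low estimated dimension signals low-dimensional (deterministic) structure; a high one is consistent with noise, which is how the estimator separates determinism from stochasticity in the radar data.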

  16. Deterministic Properties of Serially Connected Distributed Lag Models

    Directory of Open Access Journals (Sweden)

    Piotr Nowak

    2013-01-01

    Distributed lag models are an important tool in modeling dynamic systems in economics. In the analysis of composite forms of such models, the component models are ordered in parallel (with the same independent variable) and/or in series (where the independent variable is the dependent variable of the preceding model). This paper presents an analysis of certain deterministic properties of composite distributed lag models composed of component distributed lag models arranged in sequence, and of their asymptotic properties in particular. The models considered are in discrete form. Even though the paper focuses on deterministic properties of distributed lag models, the derivations are based on analytical tools commonly used in probability theory, such as probability distributions and the central limit theorem.
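
    Serial composition of distributed lag models amounts to convolving their lag-weight distributions, which is why long chains tend toward a bell-shaped lag profile by the central limit theorem. A sketch with a hypothetical three-period lag distribution:

```python
import numpy as np

def series_composition(w1, w2):
    # Composite lag weights of two distributed-lag models in series:
    # the output of the first feeds the second, so the weights convolve.
    return np.convolve(w1, w2)

w = np.array([0.5, 0.3, 0.2])   # one component model's lag weights (sum to 1)
chain = w.copy()
for _ in range(9):               # ten identical models connected in series
    chain = series_composition(chain, w)

# Mean lags add under convolution: 10 * 0.7 = 7 periods for the chain.
mean_lag = float(np.dot(np.arange(len(chain)), chain))
```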

  17. Progress in nuclear well logging modeling using deterministic transport codes

    International Nuclear Information System (INIS)

    Kodeli, I.; Aldama, D.L.; Maucec, M.; Trkov, A.

    2002-01-01

    Further studies, in continuation of the work presented in 2001 in Portoroz, were performed in order to study and improve the performance, precision and domain of application of deterministic transport codes for oil well logging analysis. These codes are in particular expected to complement Monte Carlo solutions, since they can provide a detailed particle flux distribution over the whole geometry in a very reasonable CPU time; real-time calculation can be envisaged. The performance of deterministic transport methods was compared to that of the Monte Carlo method. The IRTMBA generic benchmark was analysed using the codes MCNP-4C and DORT/TORT. Concentric as well as eccentric casings were considered using a 14 MeV point neutron source and NaI scintillation detectors. Neutron and gamma spectra were compared at two detector positions. (author)

  18. One-step deterministic multipartite entanglement purification with linear optics

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, Yu-Bo [Department of Physics, Tsinghua University, Beijing 100084 (China); Long, Gui Lu, E-mail: gllong@tsinghua.edu.cn [Department of Physics, Tsinghua University, Beijing 100084 (China); Center for Atomic and Molecular NanoSciences, Tsinghua University, Beijing 100084 (China); Key Laboratory for Quantum Information and Measurements, Beijing 100084 (China); Deng, Fu-Guo [Department of Physics, Applied Optics Beijing Area Major Laboratory, Beijing Normal University, Beijing 100875 (China)

    2012-01-09

    We present a one-step deterministic multipartite entanglement purification scheme for an N-photon system in a Greenberger–Horne–Zeilinger state with linear optical elements. The parties in quantum communication can in principle obtain a maximally entangled state from each N-photon system with a success probability of 100%. That is, it does not largely consume the less-entangled photon systems, which is far different from other multipartite entanglement purification schemes. This feature may make the scheme more feasible in practical applications. -- Highlights: ► We propose a deterministic entanglement purification scheme for GHZ states. ► The scheme uses only linear optical elements and has a success probability of 100%. ► The scheme gives a purified GHZ state in just one step.

  19. Relationship of Deterministic Thinking With Loneliness and Depression in the Elderly

    Directory of Open Access Journals (Sweden)

    Mehdi Sharifi

    2017-12-01

    Conclusion According to the results, deterministic thinking has a significant relationship with depression and sense of loneliness in older adults, and thus acts as a predictor of both. Therefore, psychological interventions that challenge the cognitive distortion of deterministic thinking, and attention to mental health in older adults, are very important.

  20. Deterministic Modeling of the High Temperature Test Reactor

    International Nuclear Information System (INIS)

    Ortensi, J.; Cogliati, J.J.; Pope, M.A.; Ferrer, R.M.; Ougouag, A.M.

    2010-01-01

    Idaho National Laboratory (INL) is tasked with the development of reactor physics analysis capability for the Next Generation Nuclear Plant (NGNP) project. In order to examine INL's current prismatic reactor deterministic analysis tools, the project is conducting a benchmark exercise based on modeling the High Temperature Test Reactor (HTTR). This exercise entails the development of a model for the initial criticality, a 19-column thin annular core, and the fully loaded core critical condition with 30 columns. Special emphasis is devoted to the annular core modeling, which shares more characteristics with the NGNP base design. The DRAGON code is used in this study because it offers significant ease and versatility in modeling prismatic designs. Despite some geometric limitations, the code performs quite well compared to other lattice physics codes. DRAGON can generate transport solutions via collision probability (CP), method of characteristics (MOC), and discrete ordinates (Sn) methods. A fine-group cross section library based on the SHEM 281 energy structure is used in the DRAGON calculations. HEXPEDITE is the hexagonal-z full-core solver used in this study and is based on the Green's function solution of the transverse integrated equations. In addition, two Monte Carlo (MC) based codes, MCNP5 and PSG2/SERPENT, provide benchmarking capability for the DRAGON and nodal diffusion solver codes. The results from this study show a consistent bias of 2-3% for the core multiplication factor. This systematic error has also been observed in other HTTR benchmark efforts and is well documented in the literature. The ENDF/B-VII graphite and U-235 cross sections appear to be the main source of the error. The isothermal temperature coefficients calculated with the fully loaded core configuration agree well with other benchmark participants but are 40% higher than the experimental values.
This discrepancy with the measurement stems from the fact that during the experiments the control

  1. Ordinal optimization and its application to complex deterministic problems

    Science.gov (United States)

    Yang, Mike Shang-Yu

    1998-10-01

    We present in this thesis a new perspective for approaching a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve high levels of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study of the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in computing cost.
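
The core selection step of Ordinal Optimization is simple enough to sketch: rank candidate designs with a cheap, noisy evaluator and keep a softened-goal subset, relying on the fact that order converges faster than value. The design space, noise level and subset size below are invented for illustration and are not taken from the thesis.

```python
import random

def ordinal_select(designs, crude_eval, k):
    """Rank designs by a cheap noisy evaluation (lower is better) and
    keep the observed top k -- the Ordinal Comparison step."""
    return sorted(designs, key=crude_eval)[:k]

random.seed(1)
# Hypothetical problem: the true cost of design i is simply i, but the
# crude model only sees it through heavy additive noise.
designs = list(range(200))
crude = lambda i: i + random.gauss(0, 20)
selected = ordinal_select(designs, crude, 12)
# Goal Softening: we only require the selected set to intersect the true
# top 12, which it does despite the noise.
overlap = len(set(selected) & set(range(12)))
print(overlap)
```

Alignment between the observed and true top sets is what makes a short, cheap screening pass useful before any expensive fine-tuning.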

  2. Evaluation of Deterministic and Stochastic Components of Traffic Counts

    Directory of Open Access Journals (Sweden)

    Ivan Bošnjak

    2012-10-01

    Full Text Available Traffic counts or statistical evidence of the traffic process are often a characteristic of time-series data. In this paper the fundamental problem of estimating deterministic and stochastic components of a traffic process is considered, in the context of "generalised traffic modelling". Different methods for identification and/or elimination of the trend and seasonal components are applied to concrete traffic counts. Further investigations and applications of ARIMA models, Hilbert space formulations and state-space representations are suggested.
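
As a concrete companion to the trend and seasonal elimination discussed above, here is a minimal classical additive decomposition of a synthetic daily count series; the weekly pattern and the linear trend are invented for the example.

```python
def decompose(counts, period):
    """Classical additive decomposition of a count series: a centred
    moving average gives the trend, per-position averages of the
    detrended series give the seasonal component (period must be odd)."""
    n, half = len(counts), period // 2
    trend = [None] * n
    for t in range(half, n - half):
        trend[t] = sum(counts[t - half:t + half + 1]) / period
    buckets = [[] for _ in range(period)]
    for t in range(n):
        if trend[t] is not None:
            buckets[t % period].append(counts[t] - trend[t])
    seasonal = [sum(b) / len(b) if b else 0.0 for b in buckets]
    mean_s = sum(seasonal) / period
    seasonal = [s - mean_s for s in seasonal]      # zero-mean seasonality
    residual = [counts[t] - trend[t] - seasonal[t % period]
                if trend[t] is not None else None for t in range(n)]
    return trend, seasonal, residual

# Synthetic daily traffic counts: linear growth plus a weekly pattern.
pattern = [10, 0, -5, 0, 0, 0, -5]                 # sums to zero
counts = [100 + 2 * t + pattern[t % 7] for t in range(28)]
trend, seasonal, residual = decompose(counts, 7)
print(trend[10], round(seasonal[0], 2))  # → 120.0 10.0
```

On this noise-free series the residual vanishes; on real counts it is the stochastic component left for ARIMA-style modelling.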

  3. Nano transfer and nanoreplication using deterministically grown sacrificial nanotemplates

    Science.gov (United States)

    Melechko, Anatoli V [Oak Ridge, TN; McKnight, Timothy E [Greenback, TN; Guillorn, Michael A [Ithaca, NY; Ilic, Bojan [Ithaca, NY; Merkulov, Vladimir I [Knoxville, TX; Doktycz, Mitchel J [Knoxville, TN; Lowndes, Douglas H [Knoxville, TN; Simpson, Michael L [Knoxville, TN

    2012-03-27

    Methods, manufactures, machines and compositions are described for nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates. An apparatus, includes a substrate and a nanoconduit material coupled to a surface of the substrate. The substrate defines an aperture and the nanoconduit material defines a nanoconduit that is i) contiguous with the aperture and ii) aligned substantially non-parallel to a plane defined by the surface of the substrate.

  4. Langevin equation with the deterministic algebraically correlated noise

    Energy Technology Data Exchange (ETDEWEB)

    Ploszajczak, M. [Grand Accelerateur National d'Ions Lourds (GANIL), 14 - Caen (France); Srokowski, T. [Grand Accelerateur National d'Ions Lourds (GANIL), 14 - Caen (France)]|[Institute of Nuclear Physics, Cracow (Poland)

    1995-12-31

    Stochastic differential equations with the deterministic, algebraically correlated noise are solved for a few model problems. The chaotic force with both exponential and algebraic temporal correlations is generated by the adjoined extended Sinai billiard with periodic boundary conditions. The correspondence between the autocorrelation function for the chaotic force and both the survival probability and the asymptotic energy distribution of escaping particles is found. (author). 58 refs.
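
For illustration only, the sketch below integrates a Langevin equation driven by an exponentially correlated force. An Ornstein-Uhlenbeck process stands in for the deterministic chaotic billiard force of the paper, since only the shape of the temporal correlation is being demonstrated; all parameter values are arbitrary.

```python
import math
import random

random.seed(0)

def langevin_colored(n, dt=0.01, gamma=0.5, tau=1.0, sigma=1.0):
    """Integrate dv/dt = -gamma*v + f(t), where f is an exponentially
    correlated (Ornstein-Uhlenbeck) force substituting for the chaotic
    force; its autocorrelation decays as exp(-t/tau)."""
    v, f = 0.0, 0.0
    a = math.exp(-dt / tau)              # OU decay factor per step
    s = sigma * math.sqrt(1 - a * a)     # keeps Var(f) equal to sigma^2
    traj = []
    for _ in range(n):
        f = a * f + s * random.gauss(0, 1)
        v += (-gamma * v + f) * dt
        traj.append(v)
    return traj

traj = langevin_colored(50_000)
# After a transient, v settles into a stationary distribution whose
# spread is set jointly by gamma, tau and sigma.
print(round(sum(traj[10_000:]) / len(traj[10_000:]), 3))
```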

  5. Beeping a Deterministic Time-Optimal Leader Election

    OpenAIRE

    Dufoulon , Fabien; Burman , Janna; Beauquier , Joffroy

    2018-01-01

    The beeping model is an extremely restrictive broadcast communication model that relies only on carrier sensing. In this model, we solve the leader election problem with an asymptotically optimal round complexity of O(D + log n), for a network of unknown size n and unknown diameter D (but with unique identifiers). Contrary to the best previously known algorithms in the same setting, the proposed one is deterministic. The techniques we introduce give a new insight as to how local constraints o...

  6. Complex Relationships of the Effects of Topographic Characteristics and Susceptible Tree Cover on Burn Severity

    Directory of Open Access Journals (Sweden)

    Hyun-Joo Lee

    2018-01-01

    Full Text Available Forest fires and burn severity mosaics have profound impacts on the post-fire dynamics and complexity of forest ecosystems. Numerous studies have investigated the relationship between topographic variables and susceptible tree covers with regard to burn severity. However, these relationships have not been fully elucidated, because most studies have assumed linearity in these relationships. Therefore, we examined the linearity and the nonlinearity in the relationships between topographic variables and susceptible tree covers with burn severity by comparing linear and nonlinear models. The site of the Samcheok fire, the largest recorded forest fire in Korea, was used as the study area. We generated 802 grid cells with a 500-m resolution that encompassed the entire study area and collected a dataset that included the topographic variables and percentage of red pine trees, which are the most susceptible tree cover types in Korea. We used conventional linear models and generalized additive models to estimate the linear and the nonlinear models based on topographic variables and Japanese red pine trees. The results revealed that the percentage of red pine trees had linear effects on burn severity, reinforcing the importance of silviculture and forest management to lower burn severity. Meanwhile, the topographic variables had nonlinear effects on burn severity. Among the topographic variables, elevation had the strongest nonlinear effect on burn severity, possibly by overriding the effects of susceptible fuels over elevation effects or due to the nonlinear effects of topographic characteristics on pre-fire fuel conditions, including the spatial distribution and availability of susceptible tree cover. To validate and generalize the nonlinear effects of elevation and other topographic variables, additional research is required at different fire sites with different tree cover types in different geographic locations.

  7. Are deterministic methods suitable for short term reserve planning?

    International Nuclear Information System (INIS)

    Voorspools, Kris R.; D'haeseleer, William D.

    2005-01-01

    Although deterministic methods for establishing minutes reserve (such as the N-1 reserve or the percentage reserve) ignore the stochastic nature of reliability issues, they are commonly used in energy modelling as well as in practical applications. In order to check the validity of such methods, two test procedures are developed. The first checks whether the N-1 reserve is a logical fixed value for minutes reserve. The second investigates whether deterministic methods can realise a stable reliability that is independent of demand. In both evaluations, the loss-of-load expectation is used as the objective stochastic criterion. The first test shows no particular reason to choose the largest unit as minutes reserve. The expected jump in reliability, with low reliability for reserve margins below the largest unit and high reliability above it, is not observed. The second test shows that neither the N-1 reserve nor the percentage reserve method provides a stable reliability level that is independent of power demand. For the N-1 reserve, the reliability increases with decreasing maximum demand; for the percentage reserve, the reliability decreases with decreasing demand. The answer to the question raised in the title, therefore, has to be that probability-based methods are to be preferred over deterministic methods
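
The loss-of-load criterion used in both test procedures can be computed exactly for a toy system by enumerating unit-outage states; the unit sizes and the 2% forced-outage rate below are invented for the example.

```python
import itertools

def lole(units, demand):
    """One-period loss-of-load probability: enumerate every unit
    up/down state and sum the probability of states whose surviving
    capacity falls short of demand."""
    p_short = 0.0
    for state in itertools.product([0, 1], repeat=len(units)):
        p, cap = 1.0, 0.0
        for up, (capacity, outage_rate) in zip(state, units):
            p *= (1 - outage_rate) if up else outage_rate
            cap += capacity if up else 0.0
        if cap < demand:
            p_short += p
    return p_short

# Hypothetical system: four 100 MW units plus one 300 MW unit,
# each with a 2% forced-outage rate (700 MW installed).
units = [(100, 0.02)] * 4 + [(300, 0.02)]
# N-1 reserve holds back capacity equal to the largest unit (300 MW),
# i.e. it would serve at most 400 MW of demand here.
for demand in (300, 400, 500):
    # The shortfall probability changes with demand even at fixed reserve.
    print(demand, round(lole(units, demand), 6))
```

The printed values rise steeply with demand, mirroring the paper's finding that a fixed deterministic reserve does not deliver a demand-independent reliability level.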

  8. Deterministic hazard quotients (HQs): Heading down the wrong road

    International Nuclear Information System (INIS)

    Wilde, L.; Hunter, C.; Simpson, J.

    1995-01-01

    The use of deterministic hazard quotients (HQs) in ecological risk assessment is common as a screening method in the remediation of brownfield sites dominated by total petroleum hydrocarbon (TPH) contamination. An HQ ≥ 1 indicates that further risk evaluation is needed, while an HQ < 1 generally excludes a site from further evaluation. Is the predicted hazard known with such certainty that differences of 10% (0.1) do not affect the ability to exclude or include a site from further evaluation? Current screening methods do not quantify the uncertainty associated with HQs. To account for uncertainty in the HQ, exposure point concentrations (EPCs) or ecological benchmark values (EBVs) are conservatively biased. To increase understanding of the uncertainty associated with HQs, EPCs (measured and modeled) and toxicity EBVs were evaluated using a conservative deterministic HQ method. The evaluation was then repeated using a probabilistic (stochastic) method, which used data distributions for EPCs and EBVs to generate HQs with measurements of associated uncertainty. Sensitivity analyses were used to identify the factors most significantly influencing risk determination. Understanding the uncertainty associated with HQ methods gives risk managers a more powerful tool than deterministic approaches.
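
The contrast between the deterministic and the probabilistic HQ can be sketched with hypothetical lognormal distributions for the EPC and the EBV (all parameters invented): the point estimate screens the site out at HQ = 0.5, while the Monte Carlo version reports how often HQ still exceeds 1.

```python
import math
import random

random.seed(42)

def probabilistic_hq(epc_mu, epc_sigma, ebv_mu, ebv_sigma, n=100_000):
    """Monte Carlo hazard quotient: sample the exposure point
    concentration (EPC) and ecological benchmark value (EBV) from
    lognormal distributions and return the simulated HQ = EPC / EBV."""
    return [random.lognormvariate(epc_mu, epc_sigma) /
            random.lognormvariate(ebv_mu, ebv_sigma) for _ in range(n)]

# Hypothetical TPH site: median EPC 50 mg/kg, median benchmark 100 mg/kg.
hqs = probabilistic_hq(math.log(50), 0.5, math.log(100), 0.3)
point_hq = 50 / 100            # deterministic screen: 0.5, site excluded
p_exceed = sum(hq >= 1 for hq in hqs) / len(hqs)
print(point_hq, round(p_exceed, 3))
```

A nontrivial fraction of samples still exceeds the threshold, which is exactly the uncertainty a single point estimate hides.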

  9. Distinguishing deterministic and noise components in ELM time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N

    2004-01-01

    Full text: One of the main problems in preliminary data analysis is distinguishing the deterministic and noise components in experimental signals. For example, in plasma physics the question arises when analyzing edge localized modes (ELMs): is the observed ELM behavior governed by complicated deterministic chaos or just by random processes? We have developed a methodology based on financial engineering principles which allows us to distinguish deterministic and noise components. We extended the linear autoregression (AR) method by including nonlinearity (the NAR method). As a starting point we have chosen nonlinearity in polynomial form; however, the NAR method can be extended to any other type of nonlinear function. The best polynomial model describing the experimental ELM time series was selected using the Bayesian Information Criterion (BIC). With this method we have analyzed type I ELM behavior in a subset of ASDEX Upgrade shots. The results obtained indicate that a linear AR model can describe the ELM behavior. In turn, this means that type I ELM behavior is of a relaxation or random type
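
The model-selection idea (linear AR versus polynomial NAR, ranked by BIC) can be sketched as follows; the synthetic series and noise levels are invented, and numpy's polyfit stands in for whatever fitting machinery the authors used.

```python
import math
import numpy as np

def fit_nar(x, max_degree=3):
    """Fit x[t] = poly(x[t-1]) + noise for polynomial degrees
    0..max_degree and return (best_degree, coeffs) selected by the
    Bayesian Information Criterion. Degree 1 is the ordinary linear
    AR(1) model; higher degrees are the nonlinear NAR variants."""
    prev, cur = np.asarray(x[:-1]), np.asarray(x[1:])
    n = len(cur)
    best = None
    for d in range(max_degree + 1):
        coeffs = np.polyfit(prev, cur, d)
        rss = float(np.sum((cur - np.polyval(coeffs, prev)) ** 2))
        bic = n * math.log(rss / n) + (d + 1) * math.log(n)
        if best is None or bic < best[0]:
            best = (bic, d, coeffs)
    return best[1], best[2]

# Synthetic check on a purely linear AR(1) series: BIC should find no
# evidence of deterministic nonlinearity (cf. the type I ELM result).
rng = np.random.default_rng(0)
x = [0.0]
for _ in range(500):
    x.append(0.8 * x[-1] + rng.normal(0, 1.0))
degree, _ = fit_nar(x)
print(degree)
```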

  10. Iterative acceleration methods for Monte Carlo and deterministic criticality calculations

    International Nuclear Information System (INIS)

    Urbatsch, T.J.

    1995-11-01

    If you have ever given up on a nuclear criticality calculation and terminated it because it took so long to converge, you might find this thesis of interest. The author develops three methods for improving the fission source convergence in nuclear criticality calculations for physical systems with high dominance ratios for which convergence is slow. The Fission Matrix Acceleration Method and the Fission Diffusion Synthetic Acceleration (FDSA) Method are acceleration methods that speed fission source convergence for both Monte Carlo and deterministic methods. The third method is a hybrid Monte Carlo method that also converges for difficult problems where the unaccelerated Monte Carlo method fails. The author tested the feasibility of all three methods in a test bed consisting of idealized problems. He has successfully accelerated fission source convergence in both deterministic and Monte Carlo criticality calculations. By filtering statistical noise, he has incorporated deterministic attributes into the Monte Carlo calculations in order to speed their source convergence. He has used both the fission matrix and a diffusion approximation to perform unbiased accelerations. The Fission Matrix Acceleration method has been implemented in the production code MCNP and successfully applied to a real problem. When the unaccelerated calculations are unable to converge to the correct solution, they cannot be accelerated in an unbiased fashion. A Hybrid Monte Carlo method weds Monte Carlo and a modified diffusion calculation to overcome these deficiencies. The Hybrid method additionally possesses reduced statistical errors

  12. Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow

    Science.gov (United States)

    Gupta, Atma Ram; Kumar, Ashwani

    2017-12-01

    Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and temporal factors such as the season of the year, the day of the week or the time of day. For deterministic radial distribution load flow studies the load is taken as constant, but load varies continually and with a high degree of uncertainty, so there is a need to model a probable realistic load. Monte Carlo simulation is used to model the probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load, and a deterministic radial load flow is solved for each set of values. The probabilistic solution is then reconstructed from the deterministic data obtained for each simulation. The main contributions of the work are: finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; finding the impact of probable realistic ZIP load modeling on unbalanced radial distribution load flow; and comparing the voltage profile and losses under probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
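
The sampling loop the abstract describes (draw loads from their mean and standard deviation, solve a deterministic load flow per draw, then summarise) can be sketched on a single line section. The voltage-drop approximation and all numbers are illustrative, not a full backward/forward-sweep solver.

```python
import random
import statistics

def feeder_voltage(p_load, q_load, v_source=1.0, r=0.02, x=0.01):
    """Approximate receiving-end voltage (p.u.) of one line section via
    the common distribution-feeder drop formula
    V_recv ≈ V_src - (r*P + x*Q) / V_src."""
    return v_source - (r * p_load + x * q_load) / v_source

random.seed(7)
# Deterministic load flow: nominal load only.
v_det = feeder_voltage(0.8, 0.4)
# Probabilistic load flow: sample the load from its mean and standard
# deviation, run the deterministic solver per sample, then summarise.
samples = [feeder_voltage(random.gauss(0.8, 0.1), random.gauss(0.4, 0.05))
           for _ in range(20_000)]
print(round(v_det, 4), round(statistics.mean(samples), 4),
      round(statistics.stdev(samples), 4))
```

The mean voltage matches the deterministic answer here because the drop is linear in the load, but the spread, which the deterministic study cannot produce, is exactly what the probabilistic reconstruction adds.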

  13. Precision production: enabling deterministic throughput for precision aspheres with MRF

    Science.gov (United States)

    Maloney, Chris; Entezarian, Navid; Dumas, Paul

    2017-10-01

    Aspherical lenses offer advantages over spherical optics by improving image quality or reducing the number of elements necessary in an optical system. Aspheres are no longer being used exclusively by high-end optical systems but are now replacing spherical optics in many applications. The need for a method of production-manufacturing of precision aspheres has emerged and is part of the reason that the optics industry is shifting away from artisan-based techniques towards more deterministic methods. Not only does Magnetorheological Finishing (MRF) empower deterministic figure correction for the most demanding aspheres but it also enables deterministic and efficient throughput for series production of aspheres. The Q-flex MRF platform is designed to support batch production in a simple and user friendly manner. Thorlabs routinely utilizes the advancements of this platform and has provided results from using MRF to finish a batch of aspheres as a case study. We have developed an analysis notebook to evaluate necessary specifications for implementing quality control metrics. MRF brings confidence to optical manufacturing by ensuring high throughput for batch processing of aspheres.

  14. The effect of a severe disaster on the mental health of adolescents: A controlled study

    NARCIS (Netherlands)

    Reijneveld, S.A.; Crone, M.R.; Verhulst, F.C.; Verloove-Vanhorick, S.P.

    2003-01-01

    Background: Disasters greatly affect the mental health of children and adolescents, but quantification of such effects is difficult. Using prospective predisaster and postdisaster data for affected and control populations, we aimed to assess the effects of a severe disaster on the mental health and

  15. Minimal effectiveness of native and non-native seeding following three high-severity wildfires

    Science.gov (United States)

    Ken A. Stella; Carolyn H. Sieg; Pete Z. Fule

    2010-01-01

    The rationale for seeding following high-severity wildfires is to enhance plant cover and reduce bare ground, thus decreasing the potential for soil erosion and non-native plant invasion. However, experimental tests of the effectiveness of seeding in meeting these objectives in forests are lacking. We conducted three experimental studies of the effectiveness of seeding...

  16. Quantifying soil burn severity for hydrologic modeling to assess post-fire effects on sediment delivery

    Science.gov (United States)

    Dobre, Mariana; Brooks, Erin; Lew, Roger; Kolden, Crystal; Quinn, Dylan; Elliot, William; Robichaud, Pete

    2017-04-01

    Soil erosion is a secondary fire effect with great implications for many ecosystem resources. Depending on the burn severity, topography, and the weather immediately after the fire, soil erosion can impact municipal water supplies, degrade water quality, and reduce reservoirs' storage capacity. Scientists and managers use field and remotely sensed data to quickly assess post-fire burn severity in ecologically sensitive areas. From these assessments, mitigation activities are implemented to minimize post-fire flooding and soil erosion and to facilitate post-fire vegetation recovery. Alternatively, land managers can use fire behavior and spread models (e.g. FlamMap, FARSITE, FOFEM, or CONSUME) to identify sensitive areas a priori and apply strategies such as fuel reduction treatments to proactively minimize the risk of wildfire spread and increased burn severity. There is a growing interest in linking fire behavior and spread models with hydrology-based soil erosion models to provide site-specific assessment of mitigation treatments on post-fire runoff and erosion. The challenge remains, however, that many burn severity mapping and modeling products quantify vegetation loss rather than measuring soil burn severity. Wildfire burn severity is spatially heterogeneous and depends on the pre-fire vegetation cover, fuel load, topography, and weather. Severities also differ depending on the variable of interest (e.g. soil, vegetation). In the United States, Burned Area Reflectance Classification (BARC) maps, derived from Landsat satellite images, are used as an initial burn severity assessment. BARC maps are classified from either a Normalized Burn Ratio (NBR) or differenced Normalized Burn Ratio (dNBR) scene into four classes (Unburned, Low, Moderate, and High severity). The development of soil burn severity maps requires further manual field validation efforts to transform the BARC maps into a product more applicable for post-fire soil rehabilitation activities.
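
The NBR and dNBR quantities behind a BARC map are straightforward to compute; the classification thresholds below follow commonly cited dNBR break-points but are illustrative rather than the operational, per-fire calibrated BARC cut-offs.

```python
def nbr(nir, swir):
    """Normalized Burn Ratio from near-infrared and shortwave-infrared
    reflectance (e.g. Landsat bands 5 and 7)."""
    return (nir - swir) / (nir + swir)

def dnbr_class(pre_nbr, post_nbr):
    """Classify differenced NBR into four BARC-style classes using
    illustrative thresholds."""
    dnbr = pre_nbr - post_nbr
    if dnbr < 0.1:
        return "unburned"
    if dnbr < 0.27:
        return "low"
    if dnbr < 0.66:
        return "moderate"
    return "high"

# Hypothetical pixel: healthy vegetation before the fire, char after.
pre = nbr(0.45, 0.15)    # 0.5
post = nbr(0.20, 0.30)   # -0.2
print(dnbr_class(pre, post))  # → high
```

The abstract's caveat applies directly: these indices track vegetation change, so a "high" pixel still needs field validation before it can be read as high soil burn severity.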

  17. Modelling Variable Fire Severity in Boreal Forests: Effects of Fire Intensity and Stand Structure.

    Science.gov (United States)

    Miquelajauregui, Yosune; Cumming, Steven G; Gauthier, Sylvie

    2016-01-01

    It is becoming clear that fires in boreal forests are not uniformly stand-replacing. On the contrary, marked variation in fire severity, measured as tree mortality, has been found both within and among individual fires. It is important to understand the conditions under which this variation can arise. We integrated forest sample plot data, tree allometries and historical forest fire records within a diameter class-structured model of 1.0 ha patches of mono-specific black spruce and jack pine stands in northern Québec, Canada. The model accounts for crown fire initiation and vertical spread into the canopy. It uses empirical relations between fire intensity, scorch height, the percent of crown scorched and tree mortality to simulate fire severity, specifically the percent reduction in patch basal area due to fire-caused mortality. A random forest and a regression tree analysis of a large random sample of simulated fires were used to test for an effect of fireline intensity, stand structure, species composition and pyrogeographic regions on resultant severity. Severity increased with intensity and was lower for jack pine stands. The proportion of simulated fires that burned at high severity (e.g. >75% reduction in patch basal area) was 0.80 for black spruce and 0.11 for jack pine. We identified thresholds in intensity below which there was a marked sensitivity of simulated fire severity to stand structure, and to interactions between intensity and structure. We found no evidence for a residual effect of pyrogeographic region on simulated severity, after the effects of stand structure and species composition were accounted for. The model presented here was able to produce variation in fire severity under a range of fire intensity conditions. This suggests that variation in stand structure is one of the factors causing the observed variation in boreal fire severity.

  18. The effect of hospital volume on mortality in patients admitted with severe sepsis.

    Directory of Open Access Journals (Sweden)

    Sajid Shahul

    Full Text Available IMPORTANCE: The association between hospital volume and inpatient mortality for severe sepsis is unclear. OBJECTIVE: To assess the effect of severe sepsis case volume on inpatient mortality. DESIGN, SETTING AND PARTICIPANTS: Retrospective cohort study of 646,988 patient discharges with severe sepsis from 3,487 hospitals in the Nationwide Inpatient Sample from 2002 to 2011. EXPOSURES: The exposure of interest was the mean yearly sepsis case volume per hospital, divided into tertiles. MAIN OUTCOMES AND MEASURES: Inpatient mortality. RESULTS: Compared with the highest tertile of severe sepsis volume (>60 cases per year), the odds ratio for inpatient mortality among persons admitted to hospitals in the lowest tertile (≤10 severe sepsis cases per year) was 1.188 (95% CI: 1.074-1.315), while the odds ratio was 1.090 (95% CI: 1.031-1.152) for patients admitted to hospitals in the middle tertile. Similarly, improved survival was seen across the tertiles, with an adjusted inpatient mortality incidence of 35.81 (95% CI: 33.64-38.03) for hospitals with the lowest volume of severe sepsis cases, dropping to 32.07 (95% CI: 31.51-32.64) for hospitals with the highest volume. CONCLUSIONS AND RELEVANCE: We demonstrate an association between a higher severe sepsis case volume and decreased mortality. The need for a systems-based approach for improved outcomes may require a high volume of severely septic patients.

  19. Eating disorder severity and functional impairment: moderating effects of illness duration in a clinical sample.

    Science.gov (United States)

    Davidsen, Annika Helgadóttir; Hoyt, William T; Poulsen, Stig; Waaddegaard, Mette; Lau, Marianne

    2017-09-01

    The aim was to examine duration of illness and body mass index as possible moderators of the relationship between eating disorder severity and functional impairment, as well as psychological distress as a possible mediator of this relationship. The study included 159 patients diagnosed with bulimia nervosa, binge eating disorder or eating disorder not otherwise specified. Regression analysis was applied to assess the effect of the hypothesized moderators and mediators. Eating disorder severity was measured with the Eating Disorder Examination Questionnaire, functional impairment was measured with the Sheehan Disability Scale, and psychological distress was measured with the Symptom Check List-90-R. Duration of illness and body mass index were assessed clinically. Duration of illness significantly moderated the relationship between eating disorder severity and functional impairment; the relationship was strongest for patients with a shorter duration of illness. Psychological distress partly mediated the relationship between eating disorder severity and functional impairment. Duration of illness also significantly moderated the relationship between psychological distress and functional impairment; again, the strongest relationship was seen for patients with a shorter duration of illness. Body mass index was not a significant moderator of the relationship between eating disorder severity and functional impairment. Overall, this study established a link between eating disorder severity, psychological distress and functional impairment, indicating that both eating disorder severity and psychological distress are more strongly related to impaired role functioning for patients with more recent onset of an eating disorder. More research into the complex relationship between eating disorder severity and functional impairment is needed.

  20. Theory and application of deterministic multidimensional pointwise energy lattice physics methods

    International Nuclear Information System (INIS)

    Zerkle, M.L.

    1999-01-01

    The theory and application of deterministic, multidimensional, pointwise energy lattice physics methods are discussed. These methods may be used to solve the neutron transport equation in multidimensional geometries using near-continuous energy detail to calculate equivalent few-group diffusion theory constants that rigorously account for spatial and spectral self-shielding effects. A dual energy resolution slowing down algorithm is described which reduces the computer memory and disk storage requirements for the slowing down calculation. Results are presented for a 2D BWR pin cell depletion benchmark problem

  1. A plateau–valley separation method for textured surfaces with a deterministic pattern

    DEFF Research Database (Denmark)

    Godi, Alessandro; Kühle, Anders; De Chiffre, Leonardo

    2014-01-01

    The effective characterization of textured surfaces presenting a deterministic pattern of lubricant reservoirs is an issue with which many researchers are nowadays struggling. Existing standards are not suitable for the characterization of such surfaces, providing at times values without physical meaning. A new method based on the separation between the plateau and valley regions is hereby presented, allowing independent functional analyses of the detected features. The determination of a proper threshold between plateaus and valleys is the first step of a procedure resulting in an efficient...
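
A minimal sketch of the plateau-valley separation idea, assuming heights in micrometres and a hand-picked threshold (the paper's contribution is precisely a principled way of choosing that threshold); once split, each region can be analysed independently, e.g. a plateau-only roughness.

```python
import statistics

def split_plateau_valley(heights, threshold=None):
    """Separate a textured-surface profile into plateau and valley
    points. The default threshold (median height) is a crude stand-in
    for the proper threshold the method derives."""
    if threshold is None:
        threshold = statistics.median(heights)
    plateaus = [h for h in heights if h >= threshold]
    valleys = [h for h in heights if h < threshold]
    return plateaus, valleys

# Synthetic profile: flat plateau near 0 um with deep lubricant pockets.
profile = [0.01, 0.00, -0.02, -5.0, -4.8, 0.02, 0.01, -5.1, 0.00, 0.03]
plateaus, valleys = split_plateau_valley(profile, threshold=-1.0)
# Plateau-only mean absolute roughness, undistorted by the deep valleys.
mean_p = sum(plateaus) / len(plateaus)
plateau_ra = sum(abs(h - mean_p) for h in plateaus) / len(plateaus)
print(len(valleys), round(plateau_ra, 4))
```

Computing roughness over the whole profile would be dominated by the reservoirs, which is the kind of physically meaningless value the abstract criticises.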

  2. A systematic framework for effective uncertainty assessment of severe accident calculations; Hybrid qualitative and quantitative methodology

    International Nuclear Information System (INIS)

    Hoseyni, Seyed Mohsen; Pourgol-Mohammad, Mohammad; Tehranifard, Ali Abbaspour; Yousefpour, Faramarz

    2014-01-01

    This paper describes a systematic framework for characterizing important phenomena and quantifying the degree of contribution of each parameter to the output in severe accident uncertainty assessment. The proposed methodology comprises a qualitative as well as a quantitative phase. The qualitative part, the so-called Modified PIRT, is a more precise two-step PIRT process that identifies severe accident phenomena and ranks them by uncertainty importance. In this process, identified severe accident phenomena are ranked according to their effect on the figure of merit and their level of knowledge. The Analytic Hierarchy Process (AHP) serves here as a systematic approach for severe accident phenomena ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the severe accident model(s) used to represent the important phenomena. The methodology uses subjective justification for this step, evaluating available information and data from experiments and code predictions. The quantitative part utilizes uncertainty importance measures to quantify the effect of each input parameter on the output uncertainty. A response surface fitting approach is proposed for estimating the associated uncertainties at lower calculation cost. The quantitative results are used to plan the reduction of epistemic uncertainty in the output variable(s). The application of the proposed methodology is demonstrated for the ACRR MP-2 severe accident test facility. - Highlights: • A two-stage framework for severe accident uncertainty analysis is proposed. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • Uncertainty importance measures quantitatively calculate the effect of each uncertainty source. • The methodology is applied successfully to the ACRR MP-2 severe accident test facility
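
The response-surface step can be sketched as follows: fit a cheap polynomial surrogate to a handful of expensive code runs, then Monte Carlo sample the surrogate. The placeholder "code" function and every coefficient below are invented; a real application would substitute severe accident code runs.

```python
import numpy as np

rng = np.random.default_rng(3)

def expensive_code(x1, x2):
    # Placeholder for an expensive severe-accident code run.
    return 2.0 + 1.5 * x1 - 0.8 * x2 + 0.3 * x1 * x2

# Small design-of-experiments sample of "code runs".
X = rng.uniform(-1, 1, size=(30, 2))
y = np.array([expensive_code(a, b) for a, b in X])

# Fit the response surface y ≈ c0 + c1*x1 + c2*x2 + c3*x1*x2
# by least squares.
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Propagate input uncertainty through the surrogate at negligible cost.
inputs = rng.normal(0, 0.2, size=(50_000, 2))
surrogate = (coef[0] + coef[1] * inputs[:, 0] + coef[2] * inputs[:, 1]
             + coef[3] * inputs[:, 0] * inputs[:, 1])
print(round(surrogate.mean(), 3), round(surrogate.std(), 3))
```

Fifty thousand surrogate evaluations cost essentially nothing, whereas the same uncertainty propagation through the code itself would be prohibitive, which is the point of the fitting approach.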

  3. Severe side effects with the application of Mesalazine (5-aminosalicylic acid) during radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Freund, U.; Siems, H.; Wannenmacher, M.; Schoelmerich, J.; Kluge, F.; Schaefer, H.E.

    1987-10-01

    In a prospective, randomized, placebo-controlled, double-blind study, the prophylactic effect of Mesalazine (5-aminosalicylic acid, 5-ASA) suppositories (3×250 mg/day) on radiation-induced proctitis during radiotherapy for prostate carcinoma was studied. The study was terminated after 16 patients had been included (eight 5-ASA, eight placebo) because of severe side effects in the 5-ASA group: 75% of patients treated with 5-ASA reported symptoms of severe proctitis, while only one patient in the placebo group had similar complaints. The application of Mesalazine suppositories is not useful in preventing radiation-induced proctitis during radiotherapy of prostate carcinoma.

  4. Severe side effects with the application of Mesalazine (5-aminosalicylic acid) during radiotherapy

    International Nuclear Information System (INIS)

    Freund, U.; Siems, H.; Wannenmacher, M.; Schoelmerich, J.; Kluge, F.; Schaefer, H.E.

    1987-01-01

    In a prospective, randomized, placebo-controlled, double-blind study, the prophylactic effect of Mesalazine (5-aminosalicylic acid, 5-ASA) suppositories (3×250 mg/day) on radiation-induced proctitis during radiotherapy for prostate carcinoma was studied. The study was terminated after 16 patients had been included (eight 5-ASA, eight placebo) because of severe side effects in the 5-ASA group: 75% of patients treated with 5-ASA reported symptoms of severe proctitis, while only one patient in the placebo group had similar complaints. The application of Mesalazine suppositories is not useful in preventing radiation-induced proctitis during radiotherapy of prostate carcinoma. (orig.) [de

  5. Field-free deterministic ultrafast creation of magnetic skyrmions by spin-orbit torques

    Science.gov (United States)

    Büttner, Felix; Lemesh, Ivan; Schneider, Michael; Pfau, Bastian; Günther, Christian M.; Hessing, Piet; Geilhufe, Jan; Caretta, Lucas; Engel, Dieter; Krüger, Benjamin; Viefhaus, Jens; Eisebitt, Stefan; Beach, Geoffrey S. D.

    2017-11-01

    Magnetic skyrmions are stabilized by a combination of external magnetic fields, stray field energies, higher-order exchange interactions and the Dzyaloshinskii-Moriya interaction (DMI). The last favours homochiral skyrmions, whose motion is driven by spin-orbit torques and is deterministic, which makes systems with a large DMI relevant for applications. Asymmetric multilayers of non-magnetic heavy metals with strong spin-orbit interactions and transition-metal ferromagnetic layers provide a large and tunable DMI. Also, the non-magnetic heavy metal layer can inject a vertical spin current with transverse spin polarization into the ferromagnetic layer via the spin Hall effect. This leads to torques that can be used to switch the magnetization completely in out-of-plane magnetized ferromagnetic elements, but the switching is deterministic only in the presence of a symmetry-breaking in-plane field. Although spin-orbit torques led to domain nucleation in continuous films and to stochastic nucleation of skyrmions in magnetic tracks, no practical means to create individual skyrmions controllably in an integrated device design at a selected position has been reported yet. Here we demonstrate that sub-nanosecond spin-orbit torque pulses can generate single skyrmions at custom-defined positions in a magnetic racetrack deterministically using the same current path as used for the shifting operation. The effect of the DMI implies that no external in-plane magnetic fields are needed for this aim. This implementation exploits a defect, such as a constriction in the magnetic track, that can serve as a skyrmion generator. The concept is applicable to any track geometry, including three-dimensional designs.

  6. The Role of Auxiliary Variables in Deterministic and Deterministic-Stochastic Spatial Models of Air Temperature in Poland

    Science.gov (United States)

    Szymanowski, Mariusz; Kryza, Maciej

    2017-02-01

    Our study examines the role of auxiliary variables in the process of spatial modelling and mapping of climatological elements, with air temperature in Poland used as an example. Multivariable algorithms are the most frequently applied for the spatialization of air temperature, and many studies have shown their results to be better than those obtained by various one-dimensional techniques. In most previous studies, two main strategies were used to perform multidimensional spatial interpolation of air temperature. First, it was accepted that all variables significantly correlated with air temperature should be incorporated into the model. Second, it was assumed that the more spatial variation of air temperature was deterministically explained, the better the quality of the spatial interpolation. The main goal of this paper was to examine both of the above-mentioned assumptions. The analysis was performed using data from 250 meteorological stations and for 69 air temperature cases aggregated at different levels: from daily means to the 10-year annual mean. Two cases were considered for detailed analysis. The set of potential auxiliary variables covered 11 environmental predictors of air temperature. Another purpose of the study was to compare the results of interpolation given by various multivariable methods using the same set of explanatory variables. Two regression models, multiple linear regression (MLR) and geographically weighted regression (GWR), as well as their extensions to the regression-kriging form, MLRK and GWRK, respectively, were examined. Stepwise regression was used to select variables for the individual models, and cross-validation was used to validate the results, with special attention paid to statistically significant improvement of the model using the mean absolute error (MAE) criterion. The main results of this study led to the rejection of both assumptions considered. Usually, including more than two or three of the most significantly
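    The validation step described above, comparing candidate predictor sets by cross-validated MAE, can be sketched as follows. This is a schematic illustration on synthetic data, not the authors' Polish data set or their GWR/kriging models; the lapse-rate coefficient and all station values are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "stations": temperature driven by elevation plus noise;
    # longitude is an irrelevant candidate predictor (all values hypothetical).
    n = 250
    elev = rng.uniform(0, 1500, n)                       # m a.s.l.
    lon = rng.uniform(14, 24, n)                         # degrees E
    temp = 8.5 - 0.006 * elev + rng.normal(0, 0.4, n)    # lapse-rate-like model

    def loocv_mae(X, y):
        """Leave-one-out cross-validated mean absolute error of an OLS model."""
        errs = []
        for i in range(len(y)):
            mask = np.arange(len(y)) != i
            A = np.column_stack([np.ones(mask.sum()), X[mask]])
            beta, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
            pred = np.concatenate([[1.0], X[i]]) @ beta
            errs.append(abs(pred - y[i]))
        return float(np.mean(errs))

    mae_1 = loocv_mae(elev[:, None], temp)                   # elevation only
    mae_2 = loocv_mae(np.column_stack([elev, lon]), temp)    # add longitude
    print(f"MAE elevation only: {mae_1:.3f}  +longitude: {mae_2:.3f}")
    ```

    Adding the irrelevant predictor gives no statistically meaningful MAE improvement, which is exactly the kind of evidence the study uses to reject the "more predictors is better" assumption.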

  7. Studying the Effects of Fasting during Ramadan on Pulmonary Functioning Test and Asthma Severity

    Directory of Open Access Journals (Sweden)

    Seyyed Hassan Adeli

    2015-03-01

    Background and Objectives: Studies have shown that fasting can affect the course and severity of chronic diseases. There are few studies on the association between fasting and asthma. Therefore, this study was conducted to examine the effects of fasting on asthma severity and pulmonary function tests. Methods: Thirty patients with asthma who attended a pulmonology clinic in Qom were enrolled in this study. The severity of the patients' asthma was assessed by questionnaire, and pulmonary function by spirometry, in the months of Shaban, Ramadan, and Shawwal. The results of the Asthma Control Questionnaire and the pulmonary function tests in the three months were compared. Results: The average age of the patients was 43.42 years, and 43.3% of the patients were male. The average scores on the asthma severity questionnaire in the three months were 20.4, 21, and 20.17, respectively. There were no statistically significant differences between the results of the pulmonary function tests and asthma severity before Ramadan (Shaban), during Ramadan, and afterwards (Shawwal). Conclusion: The findings of this study showed that fasting has no effect on pulmonary function or asthma severity in patients with asthma.

  8. Export of solids and nutrients from burnt areas: effects of fire severity and forest type

    Science.gov (United States)

    Abrantes, Nelson; Morais, Inês; Silva, Vera; Malvar, Mauxa C.; Prats, Sérgio; Coelho, Celeste; Keizer, Jan J.

    2015-04-01

    In the last few decades, the number of wildfires has markedly increased in Mediterranean Europe, including Portugal. Besides a range of direct impacts, wildfires can significantly alter geomorphological and hydrological processes during a period commonly referred to as the "window-of-disturbance". It is now increasingly recognized that these indirect wildfire effects depend strongly on fire severity, i.e. the heating-induced changes in vegetation and litter cover as well as in topsoil properties such as infiltration capacity, aggregate stability and soil water repellency. Nonetheless, the exact role of fire severity in post-fire hydrological and erosion processes is still poorly quantified in many parts of the world, including Portugal. Another important gap in fire-related research is still the impact of wildfire on soil fertility losses, in particular through erosion by runoff. Both research gaps were addressed in this study, following a wildfire that took place in July 2013 in Talhadas (Sever do Vouga, Aveiro) and burnt circa 815 ha. In the burnt area and the surrounding unburnt areas, six study sites were selected and, immediately after the fire, instrumented with slope-scale runoff plots. Two of the sites were long-unburnt, two were burnt at low severity and the other two were burnt at high severity; at each severity level, one site was covered by a Eucalyptus globulus plantation and the other by a Pinus pinaster plantation. Following the instrumentation of the sites, runoff was measured at 1- to 2-weekly intervals and, whenever possible, runoff samples were collected for subsequent laboratory analysis of total suspended sediment content and total nitrogen and total phosphorus concentrations. The results obtained in this study showed that fire severity played a more important role in the loss of nutrients and solids than the type of vegetation. While the occurrence of fire markedly increased soil (fertility) losses, this effect

  9. Detrimental effects of environmental tobacco smoke in relation to asthma severity.

    Directory of Open Access Journals (Sweden)

    Suzy A A Comhair

    2011-05-01

    Environmental tobacco smoke (ETS) has adverse effects on the health of asthmatics; however, the harmful consequences of ETS in relation to asthma severity are unknown. In a multicenter study of severe asthma, we assessed the impact of ETS exposure on morbidity, health care utilization and lung function, and on the activity of systemic superoxide dismutase (SOD), a potential oxidative target of ETS that is negatively associated with asthma severity. From 2002-2006, 654 asthmatics (366 non-severe, 288 severe) were enrolled, among whom 109 non-severe and 67 severe asthmatics were routinely exposed to ETS, as ascertained by history and validated by urine cotinine levels. ETS exposure was associated with lower quality of life scores; greater rescue inhaler use; lower lung function; greater bronchodilator responsiveness; and greater risk for emergency room visits, hospitalization and intensive care unit admission. ETS exposure was associated with lower levels of serum SOD activity, particularly in asthmatic women of African heritage. ETS exposure of asthmatic individuals is associated with worse lung function, higher acuity of exacerbations, more health care utilization, and greater bronchial hyperreactivity. The association of diminished systemic SOD activity with ETS exposure provides for the first time a specific oxidant mechanism by which ETS may adversely affect patients with asthma.

  10. Maternal effects alter the severity of inbreeding depression in the offspring.

    Science.gov (United States)

    Pilakouta, Natalie; Smiseth, Per T

    2016-09-14

    A maternal effect is a causal influence of the maternal phenotype on the offspring phenotype over and above any direct effects of genes. There is abundant evidence that maternal effects can have a major impact on offspring fitness. Yet, no previous study has investigated the potential role of maternal effects in influencing the severity of inbreeding depression in the offspring. Inbreeding depression is a reduction in the fitness of inbred offspring relative to outbred offspring. Here, we tested whether maternal effects due to body size alter the magnitude of inbreeding depression in the burying beetle Nicrophorus vespilloides. We found that inbreeding depression in larval survival was more severe for offspring of large females than offspring of small females. This might be due to differences in how small and large females invest in an inbred brood because of their different prospects for future breeding opportunities. To our knowledge, this is the first evidence for a causal effect of the maternal phenotype on the severity of inbreeding depression in the offspring. In natural populations that are subject to inbreeding, maternal effects may drive variation in inbreeding depression and therefore contribute to variation in the strength and direction of selection for inbreeding avoidance. © 2016 The Author(s).

  11. The evaluator effect in usability studies: Problem detection and severity judgments

    DEFF Research Database (Denmark)

    Jacobsen, Niels Ebbe; Hertzum, Morten; John, Bonnie E.

    1998-01-01

    Usability studies are commonly used in industry and applied in research as a yardstick for other usability evaluation methods. Though usability studies have been studied extensively, one potential threat to their reliability has been left virtually untouched: the evaluator effect. In this study......, four evaluators individually analyzed four videotaped usability test sessions. Only 20% of the 93 detected problems were detected by all evaluators, and 46% were detected by only a single evaluator. From the total set of 93 problems the evaluators individually selected the ten problems they considered...... most severe. None of the selected severe problems appeared on all four evaluators’ top-10 lists, and 4 of the 11 problems that were considered severe by more than one evaluator were only detected by one or two evaluators. Thus, both detection of usability problems and selection of the most severe...

  12. The effectiveness of the treatment of severe exercise-induced asthma in schoolchildren

    Directory of Open Access Journals (Sweden)

    M.N. Garas

    2017-03-01

    Background. Bronchial asthma is one of the most common chronic multifactorial diseases of the lungs. At least 10–12 % of patients with bronchial asthma suffer from a severe form of the disease. One aspect of inadequate control of severe asthma is its phenotypic heterogeneity, and expert interest in the problem of exercise-induced asthma is growing. The purpose of the study was to increase the efficiency of treatment for severe exercise-induced asthma in schoolchildren, based on an analysis of attack dynamics, and to achieve disease control according to the main inflammatometric and spirometric indices. Materials and methods. We examined 46 children with severe persistent bronchial asthma: 15 schoolchildren suffering from severe exercise-induced asthma, and a second (comparison) clinical group of 31 children suffering from the severe type of the disease with no signs of exercise-induced bronchoconstriction. Basic therapy effectiveness was determined prospectively by assessing disease control using the AST-test at 3-month intervals. The severity of bronchial obstruction syndrome in patients on admission to hospital during exacerbation was assessed on a score scale. Airway hyperresponsiveness was evaluated from the results of bronchoprovocation with histamine. Results. Children of the first clinical group had more significant manifestations of bronchial obstruction during the week of inpatient treatment than the comparison group; in particular, significantly more severe manifestations of bronchial obstruction were verified on the 1st and 7th days of hospitalization. Analysis of basic therapy effectiveness showed that only a quarter of the first clinical group and a larger share of schoolchildren in the comparison group achieved partial control after a 3-month course of anti-inflammatory treatment. Eosinophilic inflammation was observed in most children with severe exercise-induced asthma (60.1 %) and in 47.2 % of

  13. Contribution of the deterministic approach to the characterization of seismic input

    International Nuclear Information System (INIS)

    Panza, G.F.; Romanelli, F.; Vaccari, F.; Decanini, L.; Mollaioli, F.

    1999-10-01

    Traditional methods use either a deterministic or a probabilistic approach, based on empirically derived laws for ground motion attenuation. A realistic definition of seismic input can instead be performed by means of advanced modelling codes based on the modal summation technique. These codes, and their extension to laterally heterogeneous structures, allow us to accurately calculate synthetic signals, complete with body waves and surface waves, corresponding to different source and anelastic structural models, taking into account the effect of local geological conditions. This deterministic approach is able to address several aspects largely overlooked in the probabilistic approach: (a) the effect of crustal properties on attenuation is not neglected; (b) the ground motion parameters are derived from synthetic time histories, and not from overly simplified attenuation functions; (c) the resulting maps are directly in terms of design parameters, and do not require the adaptation of probabilistic maps to design ground motions; and (d) such maps address the issue of the deterministic definition of ground motion in a way which permits the generalization of design parameters to locations where there is little seismic history. The methodology has been applied to a large part of south-eastern Europe, in the framework of the EU-COPERNICUS project 'Quantitative Seismic Zoning of the Circum Pannonian Region'. Maps of various numerically modelled seismic hazard parameters, tested against observations whenever possible, such as peak ground displacement, velocity and acceleration, of practical use for the design of earthquake-safe structures, have been produced. The results of a standard probabilistic approach are compared with the findings based on the deterministic approach. A good agreement is obtained, except for the Vrancea (Romania) zone, where the attenuation relations used in the probabilistic approach seem to underestimate the seismic hazard, mainly at large distances

  14. The frequency of occurrence and severity of side-effects of immersion virtual reality.

    Science.gov (United States)

    Regan, E C; Price, K R

    1994-06-01

    Virtual reality (VR) has become increasingly well-known over the last few years. However, little is known about the side-effects of prolonged immersion in VR. This study set out to investigate the frequency of occurrence and severity of side-effects of using an immersion VR system. Out of 146 subjects, 61% reported symptoms of malaise at some point during a 20-min immersion and 10-min post-immersion period. These ranged from symptoms such as dizziness, stomach awareness, headaches, eyestrain and lightheadedness to severe nausea. These symptoms caused 5% of the subjects to withdraw from the experiment before completing their 20-min immersion period. Further research needs to be conducted that attempts to identify those factors that play a causative role in the side-effects of the VR system, and that looks for methods of reducing these side-effects.

  15. The therapeutic effects and experience of tracheal stent implantation in managing severe tracheal stenosis

    International Nuclear Information System (INIS)

    Lv Weifu; Zhang Xingming; Zhang Xuebing; Wang Weiyu; Hou Changlong

    2006-01-01

    Objective: To evaluate the therapeutic effects of, and experience with, tracheal stent implantation for the management of severe tracheal stenosis. Materials and methods: Thirteen patients with severe tracheal stenosis of various causes underwent high-kilovoltage radiography and computed tomography to evaluate the site, form and extent of the stenosis: 10 stenoses were located in the trachea, 1 in the right main bronchus and 2 in the left main bronchus. Under fluoroscopy, a C2 catheter with a hydrophilic guidewire was inserted into the trachea; the guidewire was then exchanged for a stiff supporting guidewire, over which the stent was advanced through the stenotic segment and deployed. Results: All stents were implanted successfully (success rate 100%), with improvement of dyspnea in all patients. The mean survival time was 6.2 months for patients with malignant neoplasms. One patient with benign tracheal stenosis has been followed up for 5 years without restenosis. Conclusions: Tracheal stent implantation is an effective means of managing severe tracheal stenosis. (authors)

  16. Effect of early vitrectomy combined with silicone oil tamponade for severe infectious traumatized endophthalmitis

    Directory of Open Access Journals (Sweden)

    Xiao Zheng

    2013-08-01

    AIM: To explore the relation between clinical efficacy and surgical timing of vitrectomy combined with silicone oil tamponade for severe infectious traumatic endophthalmitis. METHODS: A total of 59 patients (59 eyes) with severe infectious traumatic endophthalmitis underwent vitrectomy combined with silicone oil tamponade. Patients were divided into two groups by surgical timing: group A underwent surgery within 24 hours of injury, and group B more than 24 hours after injury. Retinal status during the operation, clinical efficacy and best-corrected visual acuity were observed and recorded. RESULTS: Patients in the early-operation group had less retinal injury, higher efficacy and better best-corrected visual acuity. CONCLUSION: Vitrectomy combined with silicone oil tamponade is an effective way to treat severe infectious traumatic endophthalmitis. Early surgical treatment is the key to achieving a better outcome.

  17. A toolkit for integrated deterministic and probabilistic assessment for hydrogen infrastructure.

    Energy Technology Data Exchange (ETDEWEB)

    Groth, Katrina M.; Tchouvelev, Andrei V.

    2014-03-01

    There has been increasing interest in using Quantitative Risk Assessment [QRA] to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g. fueling fuel cell vehicles) or stationary (e.g. back-up power) applications is a relatively new area for application of QRA vs. traditional industrial production and use, and as a result there are few tools designed to enable QRA for this emerging sector. There are few existing QRA tools containing models that have been developed and validated for use in small-scale hydrogen applications. However, in the past several years, there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.
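    The kind of integration the toolkit aims at can be sketched as a chain in which probabilistic inputs (leak frequencies, ignition probabilities) feed deterministic consequence models (dispersion, flame behavior). The sketch below is a toy illustration under stated assumptions, not the toolkit's actual models; every frequency, correlation and function here is hypothetical.

    ```python
    # Hypothetical QRA integration sketch: probabilistic event frequencies
    # combined with a deterministic consequence model per release scenario.

    LEAK_FREQ = {"small": 1e-2, "medium": 1e-3, "large": 1e-4}   # per year (assumed)
    RELEASE_RATE = {"small": 0.1, "medium": 1.0, "large": 10.0}  # kg/s (assumed)

    def ignition_prob(rate_kg_s):
        """Assumed correlation: ignition probability grows with release rate."""
        return min(1.0, 0.05 * rate_kg_s ** 0.5)

    def harm_distance_m(rate_kg_s):
        """Deterministic stand-in for a validated dispersion/flame model."""
        return 4.0 * rate_kg_s ** 0.4

    # Risk integration: leak frequency x ignition probability gives the
    # frequency of ignited releases; the deterministic model gives their reach.
    risks = {}
    for size, freq in LEAK_FREQ.items():
        rate = RELEASE_RATE[size]
        risks[size] = freq * ignition_prob(rate)   # ignited releases per year
        print(f"{size:>6}: hazard distance {harm_distance_m(rate):5.1f} m, "
              f"ignited-release frequency {risks[size]:.2e}/yr")
    ```

    The point of an integrated toolkit is that the two halves stay consistent: changing a deterministic sub-model (e.g. the dispersion correlation) automatically propagates into the risk figures.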

  18. Generalized outcome-based strategy classification: comparing deterministic and probabilistic choice models.

    Science.gov (United States)

    Hilbig, Benjamin E; Moshagen, Morten

    2014-12-01

    Model comparisons are a vital tool for disentangling which of several strategies a decision maker may have used--that is, which cognitive processes may have governed observable choice behavior. However, previous methodological approaches have been limited to models (i.e., decision strategies) with deterministic choice rules. As such, psychologically plausible choice models--such as evidence-accumulation and connectionist models--that entail probabilistic choice predictions could not be considered appropriately. To overcome this limitation, we propose a generalization of Bröder and Schiffer's (Journal of Behavioral Decision Making, 19, 361-380, 2003) choice-based classification method, relying on (1) parametric order constraints in the multinomial processing tree framework to implement probabilistic models and (2) minimum description length for model comparison. The advantages of the generalized approach are demonstrated through recovery simulations and an experiment. In explaining previous methods and our generalization, we maintain a nontechnical focus--so as to provide a practical guide for comparing both deterministic and probabilistic choice models.
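    The core idea, embedding a deterministic strategy in a probabilistic framework via an error-rate parameter and penalizing model flexibility, can be sketched with a minimum-description-length comparison on simulated choices. This is a simplified BIC-style two-part-code approximation, not the authors' multinomial-processing-tree machinery; the simulation setup is invented for illustration.

    ```python
    import math
    import random

    random.seed(2)

    # Simulate 100 trials where a deterministic strategy predicts one option;
    # the decision maker follows it but errs with probability 0.1 (assumed setup).
    N = 100
    follows = [random.random() > 0.1 for _ in range(N)]
    k_hits = sum(follows)

    # Model 1: strategy-plus-error, one free parameter (the error rate).
    eps_hat = 1 - k_hits / N
    eps_hat = min(max(eps_hat, 1e-6), 1 - 1e-6)   # keep logs finite
    ll_strategy = (k_hits * math.log(1 - eps_hat)
                   + (N - k_hits) * math.log(eps_hat))

    # Model 2: pure guessing, no free parameters (p = 0.5 fixed).
    ll_guess = N * math.log(0.5)

    # Two-part MDL approximation (BIC-style: -logL + k/2 * log N per parameter).
    mdl_strategy = -ll_strategy + 0.5 * math.log(N)
    mdl_guess = -ll_guess
    print(f"MDL strategy model: {mdl_strategy:.1f}  guessing: {mdl_guess:.1f}")
    ```

    The strategy model pays a complexity penalty for its free parameter but fits the data far better, so its total description length is shorter, which is the decision rule the generalized classification method relies on.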

  19. The effect of lavender aromatherapy on the pain severity of primary ...

    African Journals Online (AJOL)

    Background: Primary dysmenorrhea is the most common complaint in adolescents and adult young women that disturbs their daily life performance. Aim: The current study investigated the effect of lavender aromatherapy on pain severity in primary dysmenorrhea. Subjects and Methods: This triple‑blind randomized clinical ...

  20. A small effect of adding antiviral agents in treating patients with severe Bell palsy.

    NARCIS (Netherlands)

    Veen, E.L. van der; Rovers, M.M.; Ru, J.A. de; Heijden, G.J. van der

    2012-01-01

    In this evidence-based case report, the authors studied the following clinical question: What is the effect of adding antiviral agents to corticosteroids in the treatment of patients with severe or complete Bell palsy? The search yielded 250 original research articles. The 6 randomized trials of

  1. Effect of probiotics on diarrhea in children with severe acute malnutrition

    DEFF Research Database (Denmark)

    Grenov, Benedikte; Namusoke, Hanifa; Lanyero, Betty

    2017-01-01

    OBJECTIVES: To assess the effect of probiotics on diarrhea during in- and outpatient treatment of children with severe acute malnutrition (SAM). METHODS: A randomized, double-blind, placebo-controlled study was conducted involving 400 children admitted with SAM. Patients received one daily dose...

  2. DISTRIBUTION OF EXOGENOUS SURFACTANT IN RABBITS WITH SEVERE RESPIRATORY-FAILURE - THE EFFECT OF VOLUME

    NARCIS (Netherlands)

    VANDERBLEEK, J; PLOTZ, FB; VANOVERBEEK, FM; HEIKAMP, A; BEEKHUIS, H; WILDEVUUR, CRH; OKKEN, A; OETOMO, SB

    The transient effect of surfactant therapy that is observed in some patients might, at least in part, be explained by a nonhomogeneous distribution. Therefore, we investigated the distribution of a surfactant preparation (Alvofact, 45 g/L) that is used clinically. Rabbits with severe respiratory

  3. A NOVEL EFFECT OF DIOXIN: EXPOSURE DURING PREGNANCY SEVERELY IMPAIRS MAMMARY GLAND DIFFERENTIATION

    Science.gov (United States)

    A novel effect of dioxin: Exposure during pregnancy severely impairs mammary gland differentiation. Beth A. Vorderstrasse1, Suzanne E. Fenton2, Andrea A. Bohn3, Jennifer A. Cundiff1, and B. Paige Lawrence1,3,4. 1Department of Pharmaceutical Sciences, Washington State Universi...

  4. Cost-effectiveness of bariatric surgery in adolescents with severe obesity in the UK.

    Science.gov (United States)

    Panca, M; Viner, R M; White, B; Pandya, T; Melo, H; Adamo, M; Batterham, R; Christie, D; Kinra, S; Morris, S

    2018-04-01

    Evidence shows that surgery for severe obesity in adults improves health and psychological functioning, and is cost-effective. Data on bariatric surgery for adolescents with severe obesity are extremely limited, with no evidence on cost-effectiveness. We evaluated the lifetime cost-effectiveness of bariatric surgery compared with no surgery in adolescents with severe obesity from the UK's National Health Service perspective. Eighteen adolescents with body mass index ≥40 kg m⁻² who underwent bariatric surgery (laparoscopic Roux en Y Gastric Bypass [RYGB] [N = 9], and laparoscopic Sleeve Gastrectomy [SG] [N = 9]) at University College London Hospitals between January 2008 and December 2013 were included. We used a Markov cohort model to compare the lifetime expected costs and quality-adjusted life years (QALYs) between bariatric surgery and no surgery. Mean costs of the RYGB and SG procedures were £7100 and £7312, respectively. For RYGB vs. no surgery, the incremental cost/QALY was £2018 (95% CI £1942 - £2042) for males and £2005 (95% CI £1974 - £2031) for females. For SG vs. no surgery, the incremental cost/QALY was £1978 (95% CI £1954 - £2002) for males and £1941 (95% CI £1915 - £1969) for females. Bariatric surgery in adolescents with severe obesity is cost-effective; although it is more costly than no surgery, it markedly improves quality of life. © 2017 World Obesity Federation.
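    A Markov cohort model of this kind reduces to discounting state-dependent costs and utilities over a lifetime horizon and comparing the two arms by their incremental cost-effectiveness ratio (ICER). The sketch below is a deliberately minimal one-state-per-arm caricature with invented utilities, annual costs and horizon; only the £7100 upfront RYGB cost is taken from the abstract.

    ```python
    # Minimal Markov-cohort-style sketch of a surgery vs. no-surgery comparison.
    # All utilities, annual costs and the time horizon are illustrative
    # assumptions, not the parameters of the cited UK model.

    DISCOUNT = 0.035   # annual discount rate (UK convention)
    YEARS = 60         # lifetime horizon from adolescence (assumed)

    def lifetime(cost_year, utility, upfront=0.0):
        """Discounted lifetime cost and QALYs for one arm of the model."""
        cost, qalys = upfront, 0.0
        for t in range(YEARS):
            d = 1.0 / (1 + DISCOUNT) ** t
            cost += cost_year * d
            qalys += utility * d
        return cost, qalys

    # Surgery: £7100 upfront (from the abstract), then lower annual costs and
    # higher quality of life (both assumed); no surgery: the reverse.
    c_surg, q_surg = lifetime(cost_year=250.0, utility=0.85, upfront=7100.0)
    c_none, q_none = lifetime(cost_year=300.0, utility=0.70)

    icer = (c_surg - c_none) / (q_surg - q_none)   # incremental cost per QALY
    print(f"ICER: £{icer:,.0f} per QALY gained")
    ```

    With these assumed inputs, surgery is more costly overall but buys QALYs cheaply enough to fall well below typical UK willingness-to-pay thresholds, mirroring the abstract's qualitative conclusion.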

  5. Using the symptom monitor in a randomized controlled trial: the effect on symptom prevalence and severity

    NARCIS (Netherlands)

    Hoekstra, Johanna; de Vos, Rien; van Duijn, Nico P.; Schadé, Egbert; Bindels, Patrick J. E.

    2006-01-01

    This randomized controlled trial investigated the effect of reporting physical symptoms by using a systematic symptom monitoring instrument, the Symptom Monitor, on symptom prevalence and severity among patients with cancer in the palliative phase. The overall objective was to achieve symptom relief

  6. Control of deterministic and stochastic systems with several small parameters - A survey

    Directory of Open Access Journals (Sweden)

    Vasile Dragan

    2009-07-01

    The past three decades of research on multiparametric singularly perturbed systems are reviewed, including recent results. Particular attention is paid to stability analysis, control, filtering problems and dynamic games. First, a parameter-independent design methodology is summarized, which employs a two-time-scale and descriptor-system approach without information on the small parameters. Further, variational computational algorithms are included to avoid ill-conditioned systems: the exact slow-fast decomposition method, the recursive algorithm and Newton's method are considered in particular. Convergence results are presented, and the existence and uniqueness of the solutions are discussed. Second, new results obtained via the stochastic approach are presented. Finally, the results of a simulation of a practical power system are presented to validate the efficiency of the considered design methods.
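    The two-time-scale idea at the heart of such designs can be illustrated with the simplest singularly perturbed system: set the small parameter to zero, solve the fast equation algebraically, and substitute the result into the slow one. The system below is an invented scalar example, not one from the survey.

    ```python
    import math

    # Singularly perturbed system:  x' = -x + z,  eps*z' = x - 2z.
    # Setting eps -> 0 gives the quasi-steady state z = x/2 and the
    # reduced slow model x' = -x/2.  Illustrative sketch only.

    eps, dt, T = 0.01, 1e-4, 2.0
    x, z = 1.0, 0.0
    for _ in range(int(T / dt)):           # forward Euler on the full system
        x, z = x + dt * (-x + z), z + dt * (x - 2 * z) / eps
    x_full = x

    x_slow = math.exp(-0.5 * T)            # exact solution of the reduced model
    print(f"full model x(T) = {x_full:.4f}, reduced slow model = {x_slow:.4f}")
    ```

    After the fast transient dies out, the full and reduced trajectories agree to O(eps), which is why the reduced model can be used for parameter-independent controller design.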

  7. [Risk of deterministic effects after exposure to low doses of ionizing radiation: retrospective study among health workers in view of a new publication of International Commission on Radiological Protection].

    Science.gov (United States)

    Negrone, Mario; Di Lascio, Doriana

    2016-01-01

    The new recommended equivalent dose limit for occupational exposure of the lens of the eye (publication no. 118 of the International Commission on Radiological Protection) is based on the prevention of radiogenic cataracts, with the underlying assumption of a nominal threshold, which has been lowered from 2.5 Gy to 0.5 Gy for acute or protracted exposure. The study aim was to determine the prevalence of ocular lens opacity among healthcare workers (radiologic technologists, physicians, physician assistants) with respect to occupational exposure to ionizing radiation. We therefore conducted a retrospective study to explore the relationship between occupational exposure to radiation and increased lens opacity. Healthcare data (current occupational dosimetry, occupational history) were used to investigate the risk of increased opacity of the lens of the eye. The study sample consisted of 148 health workers (64 men and 84 women) aged from 28 to 66 years, coming from different hospitals of the ASL of Potenza (clinics, a hospital and an institute with scientific character). On the basis of the evaluation of the dosimetric history of the workers (global and effective dose), the exposed subjects were assigned to category A (equivalent dose > 2 mSv) and the non-exposed subjects to category B (workers with an annual absorbed dose near 0 mSv). The analysis was conducted using SPSS 15.0 (Statistical Package for the Social Sciences). A trend of increased ocular lens opacity was found with increasing exposure, i.e. for workers in the highest category of exposure (cat. A, Yates' chi-squared test = 13.7, p = 0.0002); the variable significantly related to lens opacity was job: nurse (Χ(2)Y = 14.3, p = 0.0002), physician (Χ(2)Y = 2.2, p = 0.1360) and radiologic technologist (Χ(2)Y = 0.1, p = 0.6691). In conclusion, our study provides evidence that exposure to relatively low doses of ionizing radiation may be harmful to the lens of the eye and may increase the long-term risk of cataract formation; similarly
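    The Yates-corrected chi-squared statistic used in the study is straightforward to compute from a 2×2 exposure-by-opacity table. The sketch below shows the computation on invented cell counts (the abstract reports only the sample size and the resulting statistic, not the table itself).

    ```python
    # Yates'-corrected chi-squared for a 2x2 exposure-by-opacity table.
    # The cell counts below are illustrative, not the study's data.

    def yates_chi2(a, b, c, d):
        """Chi-squared with continuity correction for the table [[a, b], [c, d]]."""
        n = a + b + c + d
        num = n * (abs(a * d - b * c) - n / 2) ** 2
        den = (a + b) * (c + d) * (a + c) * (b + d)
        return num / den

    # Hypothetical counts: rows = exposed (cat. A) / non-exposed (cat. B),
    # columns = opacity present / absent, totalling 148 workers.
    chi2 = yates_chi2(30, 40, 10, 68)
    print(f"Yates' chi-squared = {chi2:.1f}")
    ```

    Any value above the 5% critical value of 3.84 (1 degree of freedom) would, as in the study, indicate a significant association between exposure category and lens opacity.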

  8. Statistical methods of parameter estimation for deterministically chaotic time series

    Science.gov (United States)

    Pisarenko, V. F.; Sornette, D.

    2004-03-01

    We discuss the possibility of applying some standard statistical methods (the least-square method, the maximum likelihood method, and the method of statistical moments for estimation of parameters) to a deterministically chaotic low-dimensional dynamic system (the logistic map) containing an observational noise. A “segmentation fitting” maximum likelihood (ML) method is suggested to estimate the structural parameter of the logistic map along with the initial value x1, considered as an additional unknown parameter. The segmentation fitting method, called “piece-wise” ML, is similar in spirit to, but simpler than and with smaller bias than, the “multiple shooting” method previously proposed. Comparisons with different previously proposed techniques on simulated numerical examples give favorable results (at least, for the investigated combinations of sample size N and noise level). Moreover, unlike some suggested techniques, our method does not require a priori knowledge of the noise variance. We also clarify the nature of the inherent difficulties in the statistical analysis of deterministically chaotic time series and the status of previously proposed Bayesian approaches. We note the trade-off between the need to use a large number of data points in the ML analysis to decrease the bias (to guarantee consistency of the estimation) and the unstable nature of dynamical trajectories, with exponentially fast loss of memory of the initial condition. The method of statistical moments for the estimation of the parameter of the logistic map is discussed. This method seems to be the only method whose consistency for deterministically chaotic time series has so far been proved theoretically (not only numerically).
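    The joint-fit idea can be illustrated with a minimal sketch (not the authors' code; the grid ranges, noise level, and segment length are invented): the logistic-map parameter a and the unknown initial value x1 are fitted jointly to a short noisy segment by least squares, with a grid search standing in for the maximum-likelihood optimiser.

```python
import numpy as np

def logistic(a, x0, n):
    """Iterate the logistic map x_{k+1} = a * x_k * (1 - x_k)."""
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        x[k] = a * x[k - 1] * (1 - x[k - 1])
    return x

rng = np.random.default_rng(0)
a_true, x1_true, n = 3.9, 0.3, 12
y = logistic(a_true, x1_true, n) + rng.normal(0.0, 0.01, n)  # observational noise

# Joint fit of (a, x1) over a short segment; the initial value is treated
# as an additional unknown parameter. A grid search stands in for the
# maximum-likelihood optimiser of the paper.
a_grid = np.linspace(3.5, 4.0, 201)
x_grid = np.linspace(0.05, 0.95, 181)
sse, a_hat, x1_hat = min(
    (float(np.sum((y - logistic(a, x0, n)) ** 2)), a, x0)
    for a in a_grid
    for x0 in x_grid
)
```

    Over a short segment the true parameter pair wins cleanly; on long segments the exponentially fast loss of memory of the initial condition makes a single global fit ill-conditioned, which is exactly why a piece-wise fit is attractive.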

  9. Deterministic nonlinear phase gates induced by a single qubit

    Science.gov (United States)

    Park, Kimin; Marek, Petr; Filip, Radim

    2018-05-01

    We propose deterministic realizations of nonlinear phase gates by repeating a finite sequence of non-commuting Rabi interactions between a harmonic oscillator and only a single two-level ancillary qubit. We show explicitly that the key nonclassical features of the ideal cubic phase gate and the quartic phase gate are generated in the harmonic oscillator faithfully by our method. We numerically analyzed the performance of our scheme under realistic imperfections of the oscillator and the two-level system. The methodology is extended further to higher-order nonlinear phase gates. This theoretical proposal completes the set of operations required for continuous-variable quantum computation.

  10. CALTRANS: A parallel, deterministic, 3D neutronics code

    Energy Technology Data Exchange (ETDEWEB)

    Carson, L.; Ferguson, J.; Rogers, J.

    1994-04-01

    Our efforts to parallelize the deterministic solution of the neutron transport equation have culminated in a new neutronics code, CALTRANS, which has full 3D capability. In this article, we describe the layout and algorithms of CALTRANS and present performance measurements of the code on a variety of platforms. Explicit implementations of the parallel algorithms of CALTRANS, using both the function calls of the Parallel Virtual Machine software package (PVM 3.2) and the Meiko CS-2 tagged message passing library (based on the Intel NX/2 interface), are provided in appendices.

  11. MIMO capacity for deterministic channel models: sublinear growth

    DEFF Research Database (Denmark)

    Bentosela, Francois; Cornean, Horia; Marchetti, Nicola

    2013-01-01

    … In the current paper, we apply those results in order to study the (Shannon-Foschini) capacity behavior of a MIMO system as a function of the deterministic spread function of the environment and the number of transmitting and receiving antennas. The antennas are assumed to fill in a given fixed volume. Under some generic assumptions, we prove that the capacity grows much more slowly than linearly with the number of antennas. These results reinforce previous heuristic results obtained from statistical models of the transfer matrix, which also predict a sublinear behavior.
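    The sublinear-growth claim can be checked numerically with a toy deterministic environment (an illustrative sketch, not the paper's model): if the spread function couples the arrays through only a fixed number of scatterers, the channel matrix has bounded rank and the Shannon-Foschini capacity per antenna falls as antennas are added.

```python
import numpy as np

rng = np.random.default_rng(3)

def capacity_bits(H, snr):
    """Shannon-Foschini capacity: log2 det(I + (snr/nT) * H H^*)."""
    nr, nt = H.shape
    gram = H @ H.conj().T
    return float(np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * gram)).real)

def low_rank_channel(n, rank=2):
    """Toy deterministic environment: only `rank` scatterers couple the
    n transmit antennas to the n receive antennas, so rank(H) <= rank."""
    rx = rng.standard_normal((n, rank)) + 1j * rng.standard_normal((n, rank))
    tx = rng.standard_normal((rank, n)) + 1j * rng.standard_normal((rank, n))
    return rx @ tx

snr = 10.0
caps = {n: capacity_bits(low_rank_channel(n), snr) for n in (4, 8, 16, 32)}
per_antenna = {n: c / n for n, c in caps.items()}  # decays as n grows
```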

  12. Deterministic Single-Photon Source for Distributed Quantum Networking

    International Nuclear Information System (INIS)

    Kuhn, Axel; Hennrich, Markus; Rempe, Gerhard

    2002-01-01

    A sequence of single photons is emitted on demand from a single three-level atom strongly coupled to a high-finesse optical cavity. The photons are generated by an adiabatically driven stimulated Raman transition between two atomic ground states, with the vacuum field of the cavity stimulating one branch of the transition, and laser pulses deterministically driving the other branch. This process is unitary and therefore intrinsically reversible, which is essential for quantum communication and networking, and the photons should be appropriate for all-optical quantum information processing

  13. On the progress towards probabilistic basis for deterministic codes

    International Nuclear Information System (INIS)

    Ellyin, F.

    1975-01-01

    Fundamental arguments for a probabilistic basis of codes are presented. A class of code formats is outlined in which explicit statistical measures of uncertainty of design variables are incorporated. The format looks very much like that of present (deterministic) codes, except for having a probabilistic background. An example is provided whereby the design factors are plotted against the safety index, the probability of failure, and the risk of mortality. The safety level of the present codes is also indicated. A decision regarding the new probabilistically based code parameters could thus be made with full knowledge of the implied consequences.
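    The example described above relates design factors to the safety index and the probability of failure; for a standard normal safety margin the two are linked by Pf = Φ(−β), which a few lines make concrete (illustrative only, not the paper's computation):

```python
import math

def failure_probability(beta):
    """Probability of failure for safety (reliability) index beta:
    Pf = Phi(-beta), with Phi the standard normal CDF."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Safety index vs. probability of failure, as plotted against design factors.
table = {beta: failure_probability(beta) for beta in (1.0, 2.0, 3.0, 4.0)}
```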

  14. The deterministic optical alignment of the HERMES spectrograph

    Science.gov (United States)

    Gers, Luke; Staszak, Nicholas

    2014-07-01

    The High Efficiency and Resolution Multi Element Spectrograph (HERMES) is a four channel, VPH-grating spectrograph fed by two 400 fiber slit assemblies whose construction and commissioning has now been completed at the Anglo Australian Telescope (AAT). The size, weight, complexity, and scheduling constraints of the system necessitated that a fully integrated, deterministic, opto-mechanical alignment system be designed into the spectrograph before it was manufactured. This paper presents the principles about which the system was assembled and aligned, including the equipment and the metrology methods employed to complete the spectrograph integration.

  15. Enhanced deterministic phase retrieval using a partially developed speckle field

    DEFF Research Database (Denmark)

    Almoro, Percival F.; Waller, Laura; Agour, Mostafa

    2012-01-01

    A technique for enhanced deterministic phase retrieval using a partially developed speckle field (PDSF) and a spatial light modulator (SLM) is demonstrated experimentally. A smooth test wavefront impinges on a phase diffuser, forming a PDSF that is directed to a 4f setup. Two defocused speckle intensity measurements are recorded at the output plane, corresponding to axially-propagated representations of the PDSF in the input plane. The speckle intensity measurements are then used in a conventional transport of intensity equation (TIE) to reconstruct directly the test wavefront. The PDSF in our …

  16. Deterministic and efficient quantum cryptography based on Bell's theorem

    International Nuclear Information System (INIS)

    Chen, Z.-B.; Zhang, Q.; Bao, X.-H.; Schmiedmayer, J.; Pan, J.-W.

    2005-01-01

    Full text: We propose a novel double-entanglement-based quantum cryptography protocol that is both efficient and deterministic. The proposal uses photon pairs with entanglement both in polarization and in time degrees of freedom; each measurement in which both of the two communicating parties register a photon can establish a key bit with the help of classical communications. Eavesdropping can be detected by checking the violation of local realism for the detected events. We also show that our protocol allows a robust implementation under current technology. (author)

  17. Quantifying diffusion MRI tractography of the corticospinal tract in brain tumors with deterministic and probabilistic methods.

    Science.gov (United States)

    Bucci, Monica; Mandelli, Maria Luisa; Berman, Jeffrey I; Amirbekian, Bagrat; Nguyen, Christopher; Berger, Mitchel S; Henry, Roland G

    2013-01-01

    sensitivity (79%) as determined from cortical IES compared to deterministic q-ball (50%), probabilistic DTI (36%), and deterministic DTI (10%). The sensitivity using the q-ball algorithm (65%) was significantly higher than using DTI (23%); probabilistic algorithms (58%) were more sensitive than deterministic approaches (30%) (p = 0.003). Probabilistic q-ball fiber tracks had the smallest offset to the subcortical stimulation sites. The offsets between diffusion fiber tracks and subcortical IES sites increased significantly in those cases where the diffusion fiber tracks were visibly thinner than expected. There was perfect concordance between the subcortical IES function (e.g. hand stimulation) and the cortical connection of the nearest diffusion fiber track (e.g. upper extremity cortex). This study highlights the tremendous utility of intraoperative stimulation sites in providing a gold standard from which to evaluate diffusion MRI fiber tracking methods, and has provided an objective standard for the evaluation of different diffusion models and approaches to fiber tracking. Probabilistic q-ball fiber tractography was significantly better than DTI methods in terms of sensitivity and accuracy of the course through the white matter. The commonly used DTI fiber tracking approach was shown to have very poor sensitivity (as low as 10% for deterministic DTI fiber tracking) for delineation of the lateral aspects of the corticospinal tract in our study. Effects of the tumor/edema resulted in significantly larger offsets between the subcortical IES sites and the preoperative fiber tracks. The data show that probabilistic HARDI tractography is the most objective and reproducible analysis, but given the small sample and number of stimulation points, generalization of our results should be made with caution. Indeed, our results inform the capabilities of preoperative diffusion fiber tracking and indicate that such data should be used carefully when making pre-surgical and

  18. [Effectiveness of an integrated treatment for severe personality disorders. A 36-month pragmatic follow-up].

    Science.gov (United States)

    Lana, Fernando; Sánchez-Gil, Carmen; Ferrer, Laia; López-Patón, Nuria; Litvan, Lia; Marcos, Susana; Sierra, Ana C; Soldevilla, Joan M; Feixas, Guillem; Pérez, Víctor

    2015-01-01

    Over the past 25 years, several studies have shown the efficacy of a number of psychological interventions for severe personality disorders. However, the generalizability of these positive results from traditional research settings to more ordinary ones has been questioned, requiring replication in pragmatic studies. This pragmatic study compares hospitalizations and Emergency Room visits before and during a 6-month therapeutic program for severe personality disorders, and at 36 months after starting it. The therapeutic program, which integrates several specific interventions within a coherent framework, was carried out in an ordinary clinical setting. Fifty-one patients, evaluated according to DSM-IV criteria using the Spanish version of the Structured Clinical Interview for Personality Disorders (SCID-II), were included. The clinical characteristics showed a group of severely disturbed patients, of whom 78.4% met criteria for borderline personality disorder. The percentage of patients hospitalized and visiting the Emergency Room, as well as the number of days of hospitalization and of Emergency Room visits, was significantly reduced during the treatment, and this improvement was maintained throughout. An integrated treatment for severe personality disorders could be effective in preventing reliance on readmissions or prolonged hospital stays when implemented by clinicians in ordinary clinical settings. Copyright © 2014 SEP y SEPB. Published by Elsevier España. All rights reserved.

  19. The psychosocial effects of severe caries in 4-year-old children in Recife, Pernambuco, Brazil

    Directory of Open Access Journals (Sweden)

    Sandra Feitosa

    2005-10-01

    Full Text Available The aim of this study was to analyze the psychosocial effects of severe caries in 4-year-old children in Recife, Pernambuco, Brazil. The clinical examination was conducted by a single examiner in order to select children with severe caries and caries-free children (kappa = 1). Of the 861 children examined, 77 (8.1%) had severe caries and 225 (23.6%) were caries-free. Data were collected by applying validated questionnaires answered by the parents or guardians. Most of the parents or guardians of children with severe caries reported that their children complained of toothache (72.7%), and a significant portion stated that their children had problems eating certain kinds of food (49.4%) and missed school (26.0%) because of their teeth. Most of the parents or guardians of children with severe caries (68.8%) stated that oral health affects their children's life, while the same was stated by 9.8% of the parents or guardians of the caries-free children. Severe caries was found to have a negative impact on children's oral health-related quality of life.

  20. The effects of exercise on oxidative stress (TBARS) and BDNF in severely depressed inpatients.

    Science.gov (United States)

    Schuch, Felipe Barreto; Vasconcelos-Moreno, Mirela Paiva; Borowsky, Carolina; Zimmermann, Ana Beatriz; Wollenhaupt-Aguiar, Bianca; Ferrari, Pamela; de Almeida Fleck, Marcelo Pio

    2014-10-01

    Exercise can be an effective treatment for depression. Although the efficacy of exercise is well established, little is known concerning the biological changes associated with the antidepressant effects of exercise. A randomized, controlled trial was conducted to evaluate the effects of adding exercise to the usual treatment on the thiobarbituric acid-reactive substances (TBARS) and brain-derived neurotrophic factor (BDNF) serum levels of severely depressed inpatients. Twenty-six participants were randomized to an exercise group (n=15, exercise+treatment as usual) or a control group (n=11, treatment as usual). The participants in the exercise group completed a targeted dose of 16.5 kcal/kg/week of aerobic exercise, three times per week, throughout their hospitalizations. The control group did not exercise during their hospitalizations. The mean hospitalization length was 21.63 (4.5) vs. 23.82 (5.7) days for the exercise and control groups, respectively. The exercise group performed a median of nine sessions. After adjusting for previous tobacco use, a significant group×time interaction was found for TBARS serum levels (p=0.02). A post hoc Bonferroni test revealed differences between the exercise and control groups at discharge. A significant time effect (p < …) was found. Adding exercise to the usual treatment of severely depressed inpatients decreased TBARS serum levels after 3 weeks. Adding exercise had no additional effects on BDNF serum levels.

  1. Cytoprotective effect of cytoflavinum in the treatment of thermal injuries of various severity levels

    Directory of Open Access Journals (Sweden)

    Alexey J. Bozhedomov

    2012-12-01

    Full Text Available The research aimed to study the cytoprotective effect of cytoflavinum in thermal traumas of various severity levels. Material and methods – the study included 169 patients with thermal burns, a favorable outcome, and a thermal injury severity of 30 to 170 points on the Frank index. 28 patients received cytoflavinum as part of complex therapy at a standard dosage. Results – with the use of cytoflavinum in patients with a thermal injury severity of more than 60 Frank points, the following were observed: a decrease in the systemic inflammatory response syndrome (SIRS), a reduction in stab neutrophil content, a slower decrease of erythrocytes, less activation of thrombopoiesis, and a decrease in the concentration of vascular endothelial growth factor. In the group of patients with thermal injuries of less than 60 points who received cytoflavinum, no positive effects were observed. Conclusion – cytoflavinum is most effective when the severity of the thermal trauma is more than 60 Frank points.

  2. Efficacy and effectiveness of recombinant human activated protein C in severe sepsis of adults

    Directory of Open Access Journals (Sweden)

    Greiner, Wolfgang

    2007-07-01

    Full Text Available Introduction: Sepsis is defined as an invasion of microorganisms and/or their toxins into the blood, together with the reaction of the organism to this invasion. Severe sepsis is a major cost driver in intensive care medicine. In Germany, prevalence data were assessed in the context of the German Prevalence Study; severe sepsis has a prevalence of 35% in German intensive care units. Research questions: The following questions were analysed: Is Drotrecogin alfa (activated) (DAA) effective in the treatment of patients with severe sepsis and a mixed risk of death, both in all patients and in different subgroups? Is DAA effective in the treatment of patients with severe sepsis and a low risk of death? Is DAA cost-effective in the treatment of patients with severe sepsis compared to placebo? Methods: Only studies with adult patients are included; there are no other exclusion criteria. A systematic literature search was performed by the German Institute of Medical Documentation and Information (DIMDI). The literature search yielded a total of 847 hits. After screening of the abstracts, 165 medical and 101 economic publications were chosen for full-text appraisal. Results: Therapy with DAA appears to be effective in reducing 28-day mortality in patients with severe sepsis and a high risk of death. A high risk of death is indicated by the presence of multiorgan failure (≥2 organs) and/or an APACHE-II score ≥25. Therapy with DAA is not associated with a long-term reduction of mortality at later follow-up assessments. Therapy with DAA is cost-effective in patients with multiorgan failure and/or an APACHE-II score ≥25; in patients with a lower risk of death, DAA is not cost-effective. Costs associated with bleeding events have rarely been included in cost calculations. Discussion: DAA appears to reduce mortality in patients with severe sepsis and a high

  3. Strongly Deterministic Population Dynamics in Closed Microbial Communities

    Directory of Open Access Journals (Sweden)

    Zak Frentz

    2015-10-01

    Full Text Available Biological systems are influenced by random processes at all scales, including molecular, demographic, and behavioral fluctuations, as well as by their interactions with a fluctuating environment. We previously established microbial closed ecosystems (CES) as model systems for studying the role of random events and the emergent statistical laws governing population dynamics. Here, we present long-term measurements of population dynamics using replicate digital holographic microscopes that maintain CES under precisely controlled external conditions while automatically measuring the abundances of three microbial species via single-cell imaging. With this system, we measure spatiotemporal population dynamics in more than 60 replicate CES over periods of months. In contrast to previous studies, we observe strongly deterministic population dynamics in replicate systems. Furthermore, we show that previously discovered statistical structure in abundance fluctuations across replicate CES is driven by variation in external conditions, such as illumination. In particular, we confirm the existence of stable ecomodes governing the correlations in the population abundances of the three species. The observation of strongly deterministic dynamics, together with the stable structure of correlations in response to external perturbations, points towards the possibility of simple macroscopic laws governing microbial systems despite the numerous stochastic events present at microscopic levels.

  4. Bayesian analysis of deterministic and stochastic prisoner's dilemma games

    Directory of Open Access Journals (Sweden)

    Howard Kunreuther

    2009-08-01

    Full Text Available This paper compares the behavior of individuals playing a classic two-person deterministic prisoner's dilemma (PD game with choice data obtained from repeated interdependent security prisoner's dilemma games with varying probabilities of loss and the ability to learn (or not learn about the actions of one's counterpart, an area of recent interest in experimental economics. This novel data set, from a series of controlled laboratory experiments, is analyzed using Bayesian hierarchical methods, the first application of such methods in this research domain. We find that individuals are much more likely to be cooperative when payoffs are deterministic than when the outcomes are probabilistic. A key factor explaining this difference is that subjects in a stochastic PD game respond not just to what their counterparts did but also to whether or not they suffered a loss. These findings are interpreted in the context of behavioral theories of commitment, altruism and reciprocity. The work provides a linkage between Bayesian statistics, experimental economics, and consumer psychology.

  5. Forced Translocation of Polymer through Nanopore: Deterministic Model and Simulations

    Science.gov (United States)

    Wang, Yanqian; Panyukov, Sergey; Liao, Qi; Rubinstein, Michael

    2012-02-01

    We propose a new theoretical model of forced translocation of a polymer chain through a nanopore. We assume that DNA translocation at high fields proceeds too fast for the chain to relax, and thus the chain unravels loop by loop in an almost deterministic way. So the distribution of translocation times of a given monomer is controlled by the initial conformation of the chain (the distribution of its loops). Our model predicts the translocation time of each monomer as an explicit function of the initial polymer conformation. We refer to this concept as ``fingerprinting''. The width of the translocation time distribution is determined by the loop distribution in the initial conformation as well as by the thermal fluctuations of the polymer chain during the translocation process. We show that the conformational broadening δt of the translocation times of the m-th monomer, δt ∼ m^1.5, is stronger than the thermal broadening, δt ∼ m^1.25. The predictions of our deterministic model were verified by extensive molecular dynamics simulations.

  6. Stochastic and deterministic causes of streamer branching in liquid dielectrics

    International Nuclear Information System (INIS)

    Jadidian, Jouya; Zahn, Markus; Lavesson, Nils; Widlund, Ola; Borg, Karl

    2013-01-01

    Streamer branching in liquid dielectrics is driven by stochastic and deterministic factors. The presence of stochastic causes of streamer branching such as inhomogeneities inherited from noisy initial states, impurities, or charge carrier density fluctuations is inevitable in any dielectric. A fully three-dimensional streamer model presented in this paper indicates that deterministic origins of branching are intrinsic attributes of streamers, which in some cases make the branching inevitable depending on shape and velocity of the volume charge at the streamer frontier. Specifically, any given inhomogeneous perturbation can result in streamer branching if the volume charge layer at the original streamer head is relatively thin and slow enough. Furthermore, discrete nature of electrons at the leading edge of an ionization front always guarantees the existence of a non-zero inhomogeneous perturbation ahead of the streamer head propagating even in perfectly homogeneous dielectric. Based on the modeling results for streamers propagating in a liquid dielectric, a gauge on the streamer head geometry is introduced that determines whether the branching occurs under particular inhomogeneous circumstances. Estimated number, diameter, and velocity of the born branches agree qualitatively with experimental images of the streamer branching

  7. Deterministic sensitivity analysis for the numerical simulation of contaminants transport

    International Nuclear Information System (INIS)

    Marchand, E.

    2007-12-01

    The questions of safety and uncertainty are central to feasibility studies for an underground nuclear waste storage site, in particular the evaluation of uncertainties about safety indicators which are due to uncertainties concerning properties of the subsoil or of the contaminants. The global approach through probabilistic Monte Carlo methods gives good results, but it requires a large number of simulations. The deterministic method investigated here is complementary. Based on the Singular Value Decomposition of the derivative of the model, it gives only local information, but it is much less demanding in computing time. The flow model follows Darcy's law and the transport of radionuclides around the storage site follows a linear convection-diffusion equation. Manual and automatic differentiation are compared for these models using direct and adjoint modes. A comparative study of both probabilistic and deterministic approaches for the sensitivity analysis of fluxes of contaminants through outlet channels with respect to variations of input parameters is carried out with realistic data provided by ANDRA. Generic tools for sensitivity analysis and code coupling are developed in the Caml language. The user of these generic platforms has only to provide the specific part of the application in any language of his choice. We also present a study about two-phase air/water partially saturated flows in hydrogeology concerning the limitations of the Richards approximation and of the global pressure formulation used in petroleum engineering. (author)
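    The deterministic approach described here, based on the SVD of the model derivative, can be sketched on a toy model (the model, parameter values, and step size are invented for illustration and stand in for the flow/transport code; automatic or adjoint differentiation would replace the finite differences in practice):

```python
import numpy as np

def model(p):
    """Toy stand-in for the flow/transport model: maps input parameters
    (e.g. permeability, diffusivity, sorption) to three outlet fluxes."""
    k, d, s = p
    return np.array([k * np.exp(-d), k * d / (1.0 + s), np.sqrt(k) * s])

p0 = np.array([2.0, 0.5, 1.0])  # nominal parameter values (invented)

# Central finite-difference Jacobian of the model at the nominal point.
eps = 1e-6
J = np.column_stack([
    (model(p0 + eps * e) - model(p0 - eps * e)) / (2.0 * eps)
    for e in np.eye(len(p0))
])

# SVD of the derivative: singular values rank the locally most influential
# combinations of inputs (rows of Vt) and outputs (columns of U).
U, sv, Vt = np.linalg.svd(J)
most_influential = Vt[0]  # input direction the outlet fluxes are most sensitive to
```

    Like the method in the abstract, this gives only local information around the nominal point, but at a tiny fraction of the cost of a Monte Carlo sweep.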

  8. Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.

    Science.gov (United States)

    Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O

    2006-03-01

    The analysis of complex biochemical networks is conducted in two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables, generating counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although there are numerous tools available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourage experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
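    The two frameworks contrasted above can be illustrated side by side (a Python sketch rather than the paper's MATLAB functions; the reaction, rate constant, and molecule counts are invented): for irreversible decay A → ∅, the reaction-rate ODE and Gillespie realisations of the CME agree on the mean behaviour.

```python
import numpy as np

# Irreversible decay A -> 0 with rate constant c.
c, a0, t_end = 0.5, 200, 4.0

# Deterministic framework: reaction-rate ODE dA/dt = -c*A, explicit Euler.
dt, a = 1e-3, float(a0)
for _ in range(int(t_end / dt)):
    a += -c * a * dt
det_final = a  # close to a0 * exp(-c * t_end)

# Stochastic framework: Gillespie's direct method sampling the CME.
rng = np.random.default_rng(1)
finals = []
for _ in range(300):
    n, t = a0, 0.0
    while n > 0:
        t += rng.exponential(1.0 / (c * n))  # waiting time to the next decay
        if t > t_end:
            break
        n -= 1
    finals.append(n)
mean_final = float(np.mean(finals))
```

    The stochastic realisations fluctuate around the deterministic trajectory; for small copy numbers those fluctuations, invisible to the ODE, become the interesting part.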

  9. A study of deterministic models for quantum mechanics

    International Nuclear Information System (INIS)

    Sutherland, R.

    1980-01-01

    A theoretical investigation is made into the difficulties encountered in constructing a deterministic model for quantum mechanics and into the restrictions that can be placed on the form of such a model. The various implications of the known impossibility proofs are examined. A possible explanation for the non-locality required by Bell's proof is suggested in terms of backward-in-time causality. The efficacy of the Kochen and Specker proof is brought into doubt by showing that there is a possible way of avoiding its implications in the only known physically realizable situation to which it applies. A new thought experiment is put forward to show that a particle's predetermined momentum and energy values cannot satisfy the laws of momentum and energy conservation without conflicting with the predictions of quantum mechanics. Attention is paid to a class of deterministic models for which the individual outcomes of measurements are not dependent on hidden variables associated with the measuring apparatus and for which the hidden variables of a particle do not need to be randomized after each measurement

  10. Deterministic direct reprogramming of somatic cells to pluripotency.

    Science.gov (United States)

    Rais, Yoach; Zviran, Asaf; Geula, Shay; Gafni, Ohad; Chomsky, Elad; Viukov, Sergey; Mansour, Abed AlFatah; Caspi, Inbal; Krupalnik, Vladislav; Zerbib, Mirie; Maza, Itay; Mor, Nofar; Baran, Dror; Weinberger, Leehee; Jaitin, Diego A; Lara-Astiaso, David; Blecher-Gonen, Ronnie; Shipony, Zohar; Mukamel, Zohar; Hagai, Tzachi; Gilad, Shlomit; Amann-Zalcenstein, Daniela; Tanay, Amos; Amit, Ido; Novershtern, Noa; Hanna, Jacob H

    2013-10-03

    Somatic cells can be inefficiently and stochastically reprogrammed into induced pluripotent stem (iPS) cells by exogenous expression of Oct4 (also called Pou5f1), Sox2, Klf4 and Myc (hereafter referred to as OSKM). The nature of the predominant rate-limiting barrier(s) preventing the majority of cells to successfully and synchronously reprogram remains to be defined. Here we show that depleting Mbd3, a core member of the Mbd3/NuRD (nucleosome remodelling and deacetylation) repressor complex, together with OSKM transduction and reprogramming in naive pluripotency promoting conditions, result in deterministic and synchronized iPS cell reprogramming (near 100% efficiency within seven days from mouse and human cells). Our findings uncover a dichotomous molecular function for the reprogramming factors, serving to reactivate endogenous pluripotency networks while simultaneously directly recruiting the Mbd3/NuRD repressor complex that potently restrains the reactivation of OSKM downstream target genes. Subsequently, the latter interactions, which are largely depleted during early pre-implantation development in vivo, lead to a stochastic and protracted reprogramming trajectory towards pluripotency in vitro. The deterministic reprogramming approach devised here offers a novel platform for the dissection of molecular dynamics leading to establishing pluripotency at unprecedented flexibility and resolution.

  11. Using MCBEND for neutron or gamma-ray deterministic calculations

    Directory of Open Access Journals (Sweden)

    Geoff Dobson

    2017-01-01

    Full Text Available MCBEND 11 is the latest version of the general radiation transport Monte Carlo code from AMEC Foster Wheeler’s ANSWERS® Software Service. MCBEND is well established in the UK shielding community for radiation shielding and dosimetry assessments. MCBEND supports a number of acceleration techniques, for example the use of an importance map in conjunction with Splitting/Russian Roulette. MCBEND has a well established automated tool to generate this importance map, commonly referred to as the MAGIC module using a diffusion adjoint solution. This method is fully integrated with the MCBEND geometry and material specification, and can easily be run as part of a normal MCBEND calculation. An often overlooked feature of MCBEND is the ability to use this method for forward scoping calculations, which can be run as a very quick deterministic method. Additionally, the development of the Visual Workshop environment for results display provides new capabilities for the use of the forward calculation as a productivity tool. In this paper, we illustrate the use of the combination of the old and new in order to provide an enhanced analysis capability. We also explore the use of more advanced deterministic methods for scoping calculations used in conjunction with MCBEND, with a view to providing a suite of methods to accompany the main Monte Carlo solver.

  13. On the deterministic and stochastic use of hydrologic models

    Science.gov (United States)

    Farmer, William H.; Vogel, Richard M.

    2016-01-01

    Environmental simulation models, such as precipitation-runoff watershed models, are increasingly used in a deterministic manner for environmental and water resources design, planning, and management. In operational hydrology, simulated responses are now routinely used to plan, design, and manage a very wide class of water resource systems. However, all such models are calibrated to existing data sets and retain some residual error. This residual, typically unknown in practice, is often ignored, implicitly trusting simulated responses as if they are deterministic quantities. In general, ignoring the residuals will result in simulated responses with distributional properties that do not mimic those of the observed responses. This discrepancy has major implications for the operational use of environmental simulation models as is shown here. Both a simple linear model and a distributed-parameter precipitation-runoff model are used to document the expected bias in the distributional properties of simulated responses when the residuals are ignored. The systematic reintroduction of residuals into simulated responses in a manner that produces stochastic output is shown to improve the distributional properties of the simulated responses. Every effort should be made to understand the distributional behavior of simulation residuals and to use environmental simulation models in a stochastic manner.
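    The residual-reintroduction idea described above can be sketched as follows (a minimal illustration with a synthetic linear model and invented parameters, not the paper's watershed models): resampling calibration residuals and adding them back to the deterministic simulation restores the spread that deterministic output understates.

```python
import random
import statistics

rng = random.Random(42)

# Synthetic "observations": a weak linear response plus substantial noise.
x = [float(i) for i in range(200)]
obs = [0.1 * xi + 1.0 + rng.gauss(0.0, 20.0) for xi in x]

# A calibrated deterministic model (here, the noise-free line for simplicity).
sim = [0.1 * xi + 1.0 for xi in x]

residuals = [o - s for o, s in zip(obs, sim)]

# Stochastic use: reintroduce resampled residuals into the simulation.
stochastic = [s + rng.choice(residuals) for s in sim]

# The deterministic output understates the observed variance;
# reintroducing residuals recovers it.
var_obs = statistics.variance(obs)
var_det = statistics.variance(sim)
var_sto = statistics.variance(stochastic)
```

A bootstrap draw from the residual pool is the simplest scheme; the paper's point is the general one that some such stochastic treatment of residuals is needed, not this particular resampling rule.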

  14. Shock-induced explosive chemistry in a deterministic sample configuration.

    Energy Technology Data Exchange (ETDEWEB)

    Stuecker, John Nicholas; Castaneda, Jaime N.; Cesarano, Joseph, III; Trott, Wayne Merle; Baer, Melvin R.; Tappan, Alexander Smith

    2005-10-01

    Explosive initiation and energy release have been studied in two sample geometries designed to minimize stochastic behavior in shock-loading experiments. These sample concepts include a design with explosive material occupying the hole locations of a close-packed bed of inert spheres and a design that utilizes infiltration of a liquid explosive into a well-defined inert matrix. Wave profiles transmitted by these samples in gas-gun impact experiments have been characterized by both velocity interferometry diagnostics and three-dimensional numerical simulations. Highly organized wave structures associated with the characteristic length scales of the deterministic samples have been observed. Initiation and reaction growth in an inert matrix filled with sensitized nitromethane (a homogeneous explosive material) result in wave profiles similar to those observed with heterogeneous explosives. Comparison of experimental and numerical results indicates that energetic material studies in deterministic sample geometries can provide an important new tool for validation of models of energy release in numerical simulations of explosive initiation and performance.

  15. Effects of severe obstetric complications on women's health and infant mortality in Benin.

    Science.gov (United States)

    Filippi, Véronique; Goufodji, Sourou; Sismanidis, Charalambos; Kanhonou, Lydie; Fottrell, Edward; Ronsmans, Carine; Alihonou, Eusèbe; Patel, Vikram

    2010-06-01

    To document the impact of severe obstetric complications on post-partum health in mothers and mortality in babies over 12 months in Benin, and to assess whether severe complications associated with perinatal death are particularly likely to lead to adverse health consequences. Cohort study which followed women and their babies after a severe complication or an uncomplicated childbirth. Women were selected in hospitals and interviewed at home at discharge, and at 6 and 12 months post-partum. Women were invited for a medical check-up at 6 and 12 months. The cohort includes 205 women with severe complications and a live birth, 64 women with severe complications and perinatal death and 440 women with uncomplicated delivery. Women with severe complications and a live birth were not dissimilar to women with a normal delivery in terms of post-partum health, except for hypertension [adjusted OR = 5.8 (1.9-17.0)], fever [adjusted OR = 1.71 (1.1-2.8)] and infant mortality [adjusted OR = 11.0 (0.8-158.2)]. Women with complications and perinatal death were at increased risk of depression [adjusted OR = 3.4 (1.3-9.0)] and urine leakages [adjusted OR = 2.7 (1.2-5.8)], and more likely to report poor health [adjusted OR = 5.27 (2.2-12.4)] and the pregnancy's negative effects on their lives [adjusted OR = 4.11 (1.9-9.0)]. Uptake of post-natal services was poor in all groups. Women in developing countries face a high risk of severe complications during pregnancy and delivery. These can lead to adverse consequences for their own health and that of their offspring. Resources are needed to ensure that pregnant women receive adequate care before, during and after discharge from hospital. Near-miss women with a perinatal death appear to be a particularly high-risk group.
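    The adjusted odds ratios quoted above come from regression models fitted to the cohort; the unadjusted version of the same quantity can be computed directly from a 2×2 table (a generic illustration with made-up counts, not the study's data), with a Woolf confidence interval on the log scale.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf's SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20/205 exposed cases vs 10/440 unexposed.
or_, lo, hi = odds_ratio_ci(20, 185, 10, 430)
```

An interval whose lower bound stays above 1 (as here) is what licenses statements like "at increased risk"; the study's adjusted ORs additionally control for covariates via logistic regression.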

  16. The estimation of effective doses using measurement of several relevant physical parameters from radon exposures

    International Nuclear Information System (INIS)

    Ridzikova, A.; Fronka, A.; Maly, B.; Moucka, L.

    2003-01-01

    In the present investigation, dose-relevant factors obtained from continuous monitoring in real homes are taken into account to obtain a more accurate estimate of the effective dose from 222Rn. The dose-relevant parameters include the radon concentration, the equilibrium factor (f), the unattached fraction (fp) of radon decay products, and the actual occupancy time of people in the home. The measurements yield time courses of radon concentration, on which the estimation of effective doses is based, together with an assessment of the actual indoor occupancy. Our analysis shows that the annual effective dose is lower than the effective dose estimated according to the ICRP recommendation from the integral measurement, which uses only the average radon concentration. This estimation of effective doses from several measured physical parameters was carried out in only one case; for better characterization it is important to measure in houses with different actual occupancy patterns. (authors)

  17. A new reliability allocation weight for reducing the occurrence of severe failure effects

    International Nuclear Information System (INIS)

    Kim, Kyungmee O.; Yang, Yoonjung; Zuo, Ming J.

    2013-01-01

    A reliability allocation weight is used during the early design stage of a system to apportion the system reliability requirement to its individual subsystems. Since some failures have serious effects on public safety, cost and environmental issues, especially in a mission-critical system, the failure effect must be considered as one of the important factors in determining the allocation weight. Previously, the risk priority number or the criticality number was used to account for the failure effect in the allocation weight. In this paper, we identify the limitations of the previous approach and propose a new allocation weight based on the subsystem failure severity and its relative frequency. An example is given to illustrate that the proposed method is more effective than the previous method for reducing the occurrence of unacceptable failure effects in a newly designed system.
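    The mechanics of severity-aware allocation can be sketched as follows (a generic weighting scheme for illustration; the paper's actual weight definition differs): each subsystem gets a weight from its failure frequency and severity, weights are normalized, and the system reliability target is apportioned as R_i = R_sys^(w_i), so the product of subsystem reliabilities recovers R_sys. As an assumed heuristic here, weight = frequency / severity, so subsystems with severe failure effects receive a smaller weight and hence a stricter (higher) reliability allocation.

```python
def allocate_reliability(R_sys, subsystems):
    """Apportion a system reliability target to subsystems.

    subsystems: {name: (relative_frequency, severity)}.
    Assumed heuristic: weight = frequency / severity, so severe-failure
    subsystems get a smaller exponent and a higher allocated reliability.
    With normalized weights, the product of all R_i equals R_sys.
    """
    raw = {name: f / s for name, (f, s) in subsystems.items()}
    total = sum(raw.values())
    return {name: R_sys ** (w / total) for name, w in raw.items()}

alloc = allocate_reliability(
    0.95,
    {"pump": (0.5, 1.0),      # frequent, benign failures
     "valve": (0.3, 2.0),     # moderate
     "control": (0.2, 10.0)}  # rare but severe failures
)
```

Under this scheme the "control" subsystem, whose failures are severe, is allocated the highest reliability requirement even though it fails least often.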

  18. Persistent Effects of Fire Severity on Early Successional Forests in Interior Alaska

    Science.gov (United States)

    Shenoy, Aditi; Johnstone, Jill F.; Kasischke, Eric S.; Kielland, Knut

    2011-01-01

    There has been a recent increase in the frequency and extent of wildfires in interior Alaska, and this trend is predicted to continue under a warming climate. Although less well documented, corresponding increases in fire severity are expected. Previous research from boreal forests in Alaska and western Canada indicates that severe fire promotes the recruitment of deciduous tree species and decreases the relative abundance of black spruce (Picea mariana) immediately after fire. Here we extend these observations by (1) examining changes in patterns of aspen and spruce density and biomass that occurred during the first two decades of post-fire succession, and (2) comparing patterns of tree composition in relation to variations in post-fire organic layer depth in four burned black spruce forests in interior Alaska after 10-20 years of succession. We found that initial effects of fire severity on recruitment and establishment of aspen and black spruce were maintained by subsequent effects of organic layer depth and initial plant biomass on plant growth during post-fire succession. The proportional contribution of aspen (Populus tremuloides) to total stand biomass remained above 90% during the first and second decades of succession in severely burned sites, while in lightly burned sites the proportional contribution of aspen was reduced due to a 40-fold increase in spruce biomass in these sites. Relationships between organic layer depth and stem density and biomass were consistently negative for aspen, and positive or neutral for black spruce in all four burns. Our results suggest that initial effects of post-fire organic layer depths on deciduous recruitment are likely to translate into a prolonged phase of deciduous dominance during post-fire succession in severely burned stands. This shift in vegetation distribution has important implications for climate-albedo feedbacks, future fire regime, wildlife habitat quality and natural resources for indigenous subsistence.

  19. Effect of CPAP on arterial stiffness in severely obese patients with obstructive sleep apnoea.

    Science.gov (United States)

    Seetho, Ian W; Asher, Rebecca; Parker, Robert J; Craig, Sonya; Duffy, Nick; Hardy, Kevin J; Wilding, John P H

    2015-12-01

    Obstructive sleep apnoea (OSA) may independently increase cardiovascular risk in obesity. Although there is evidence that arterial stiffness is altered in OSA, knowledge of these effects with continuous positive airway pressure (CPAP) in severe obesity (body mass index (BMI) ≥ 35 kg/m²) is limited. This study aimed to explore how arterial stiffness, as measured by the augmentation index (Aix), changed in severely obese patients with OSA who were treated with CPAP and in patients without OSA. Forty-two patients with severe obesity (22 with OSA, 20 without OSA) were recruited at baseline and followed up after a median of 13.5 months. Pulse wave analysis (PWA) was performed using applanation tonometry at the radial artery to measure the augmentation index (Aix), augmentation pressure (AP) and subendocardial viability ratio (SEVR). Cardiovascular parameters and body composition were also measured. There were significant improvements in Aix and AP with CPAP compared with subjects without OSA. Epworth scores also improved with CPAP. Regression showed that CPAP was significantly associated with change in arterial stiffness from baseline. However, patients with OSA on CPAP continued to have greater arterial stiffness (Aix) than subjects without OSA. Despite CPAP in severe obesity, CPAP alone is not sufficient to modify PWA measures to levels comparable with non-OSA patients. This supports the need for a multifaceted approach when managing cardiovascular risk in patients with severe obesity and obstructive sleep apnoea receiving CPAP therapy.

  20. The cost-effectiveness of an intensive treatment protocol for severe dyslexia in children.

    Science.gov (United States)

    Hakkaart-van Roijen, Leona; Goettsch, Wim G; Ekkebus, Michel; Gerretsen, Patty; Stolk, Elly A

    2011-08-01

    Studies of interventions for dyslexia have focused entirely on outcomes related to literacy. In this study, we considered a broader picture, assessing improved quality of life compared with costs. A model served as a tool to compare the costs and effects of treatment according to a new protocol and of care as usual. Quality of life was measured and valued by proxies using a general quality-of-life instrument (EQ-5D). We considered medical costs and non-medical costs (e.g. remedial teaching). The model computed cost per successful treatment and cost per quality-adjusted life year (QALY) over time. About 75% of the total costs were related to diagnostic tests to distinguish between children with severe dyslexia and children who have reading difficulties for other reasons. The cost per successful treatment of severe dyslexia was €36 366. Successful treatment showed a quality-of-life gain of about 11%. At primary school, the average cost per QALY for severe dyslexia amounted to €58 647. In the long term, the cost per QALY decreased to €26 386 at secondary school and €17 663 thereafter. The results of this study provide evidence that treatment of severe dyslexia is cost-effective when the investigated protocol is followed. Copyright © 2011 John Wiley & Sons, Ltd.

  1. Maximizing cost-effectiveness by adjusting treatment strategy according to glaucoma severity

    Science.gov (United States)

    Guedes, Ricardo Augusto Paletta; Guedes, Vanessa Maria Paletta; Gomes, Carlos Eduardo de Mello; Chaoubah, Alfredo

    2016-01-01

    Background: The aim of this study is to determine the most cost-effective strategy for the treatment of primary open-angle glaucoma (POAG) in Brazil, from the payer's perspective (Brazilian Public Health System) in the setting of the Glaucoma Referral Centers. Methods: The study design was a cost-effectiveness analysis of different treatment strategies for POAG. We developed 3 Markov models (one for each glaucoma stage: early, moderate and advanced), using a hypothetical cohort of POAG patients, from the perspective of the Brazilian Public Health System (SUS) and a horizon of the average life expectancy of the Brazilian population. Different strategies were tested according to disease severity. For early glaucoma, we compared observation, laser and medications. For moderate glaucoma: medications, laser and surgery. For advanced glaucoma: medications and surgery. Main outcome measures were the ICER (incremental cost-effectiveness ratio), direct medical costs and QALYs (quality-adjusted life years). Results: In early glaucoma, both laser and medical treatment were cost-effective (the ICERs of initial laser and initial medical treatment over observation only were R$ 2,811.39/QALY and R$ 3,450.47/QALY). Compared to the observation strategy, the two alternatives provided significant gains in quality of life. In the moderate glaucoma population, medical treatment presented the highest costs among the treatment strategies. Both laser and surgery were highly cost-effective in this group. For advanced glaucoma, both tested strategies were cost-effective. Starting age had a great impact on results in all studied groups: initiating glaucoma therapy with laser or surgery was more cost-effective the younger the patient. Conclusion: All tested treatment strategies for glaucoma provided real gains in quality of life and were cost-effective. However, according to the disease severity, not all strategies provided the same cost-effectiveness profile. 
Based on our findings, there should be a
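    The ICERs reported above follow from the standard definition (a generic computation with hypothetical cost and QALY figures, not the study's Markov model): the incremental cost of a strategy over its comparator divided by the incremental effectiveness.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    # Incremental cost-effectiveness ratio: extra cost per extra QALY.
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical figures: the new strategy adds R$ 1,500 in cost and
# 0.5 QALYs over its comparator, giving R$ 3,000 per QALY gained.
example = icer(4000.0, 10.5, 2500.0, 10.0)
```

A strategy is then judged cost-effective by comparing this ratio against a willingness-to-pay threshold per QALY; in a Markov model the cost and QALY inputs are themselves expectations accumulated over the cohort's state transitions.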

  2. The effect of COPD severity and study duration on exacerbation outcome in randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Eriksson G

    2017-05-01

    Göran Eriksson,1 Peter M Calverley,2 Christine R Jenkins,3,4 Antonio R Anzueto,5 Barry J Make,6 Magnus Lindberg,7 Malin Fagerås,7 Dirkje S Postma8 1Department of Respiratory Medicine and Allergology, University Hospital, Lund, Sweden; 2Pulmonary and Rehabilitation Research Group, University Hospital Aintree, Liverpool, UK; 3Concord Clinical School, University of Sydney, 4The George Institute for Global Health, Sydney, Australia; 5Department of Pulmonary Medicine and Critical Care, University of Texas Health Sciences Center and South Texas Veterans’ Health Care System, San Antonio, Texas, 6Division of Pulmonary Sciences and Critical Care Medicine, National Jewish Health, University of Colorado, Denver, Colorado, USA; 7AstraZeneca R&D, Mölndal, Sweden; 8Department of Pulmonary Medicine and Tuberculosis, University Medical Center Groningen, GRIAC Research Institute, University of Groningen, Groningen, the Netherlands Background: When discontinuation in COPD randomized controlled trials (RCTs) is unevenly distributed between treatments (differential dropout), the capacity to demonstrate treatment effects may be reduced. We investigated the impact of the time of differential dropout on exacerbation outcomes in RCTs, in relation to study duration and COPD severity. Methods: A post hoc analysis of 2,345 patients from three RCTs of 6- and 12-month duration was performed to compare budesonide/formoterol and formoterol in moderate, severe, and very severe COPD. Outcomes were exacerbation rate, time-to-first exacerbation, or discontinuation; patients were stratified by disease severity. Outcomes were studied by censoring data monthly from 1 to 12 months. Results: In patients treated with budesonide/formoterol, annualized exacerbation rates (AERs) were comparable for each study duration (rate ratio [RR] = 0.6). With formoterol, the AER decreased with study duration (RR = 1.20 at 1 month to RR = 0.86 at 12 months). There was a treatment-related difference in

  3. Effect of inhaled corticosteroid use on weight (BMI) in pediatric patients with moderate-severe asthma.

    Science.gov (United States)

    Han, Jennifer; Nguyen, John; Kim, Yuna; Geng, Bob; Romanowski, Gale; Alejandro, Lawrence; Proudfoot, James; Xu, Ronghui; Leibel, Sydney

    2018-04-19

    Assess the relationship between inhaled corticosteroid (ICS) use and weight (BMI) in pediatric patients with moderate-severe asthma. Assess whether the number of emergency department (ED) visits correlates with overall BMI trajectory. Assess the trend of prescribing biologic therapy in pediatric patients with moderate-severe asthma and determine its relationship with weight (BMI). A retrospective chart review was performed on 93 pediatric patients with moderate-severe asthma to determine the relationship between ICS use and weight (BMI), biologic therapy and BMI, and number of ED visits and BMI trajectory. A mixed effects model was employed, with the correlation between repeated measures accounted for through the random effects. There was a statistically significant increase of 0.369 kg/m² per year in the BMI trajectory of subjects on high-dose steroids, compared to an increase of 0.195 kg/m² per year in the low-dose group. Subjects initiated on biologic therapy (omalizumab or mepolizumab) had a statistically significant decrease in BMI trajectory of 0.818 kg/m² per year. In summary, ICS use was significantly associated with BMI trajectory; the higher the dose, the greater the projected BMI increase per year. Initiation of biologic therapy decreased BMI trajectory over time. Lastly, those with frequent ED visits had a higher BMI trend. Future prospective studies are warranted that further evaluate the potential metabolic impacts of ICS and assess the effects of biologic therapy on BMI.

  4. Immune globulins are effective in severe pediatric Guillain-Barré syndrome.

    Science.gov (United States)

    Shahar, E; Shorer, Z; Roifman, C M; Levi, Y; Brand, N; Ravid, S; Murphy, E G

    1997-01-01

    The effect of high-dose intravenous immune globulins was evaluated in an open prospective multicenter study of 26 children with severe Guillain-Barré syndrome. They presented with mild to moderate flaccid weakness of the extremities, with cranial nerve involvement (20) and sensory impairment (22). All children rapidly deteriorated over 2-16 days (mean 6) to become bedridden, and 2 children also developed respiratory failure requiring artificial ventilation (Disability Grading Scale 4-5). Immune globulins were then administered at a total dose of 2 g/kg over 2 consecutive days, without adverse effects requiring discontinuation of therapy. Marked and rapid improvement was noted in 25 children, who improved by 1 to 2 grades on the Disability Grading Scale. Eighteen children recovered by 2 weeks. The rest recuperated within four months, including a child who was artificially ventilated for 4 weeks. The uniform rapid improvement and recovery associated with immune globulins contrast with the slow recovery course in severe natural cases. We conclude that immune globulins are effective and safe in severe childhood-onset Guillain-Barré syndrome and may therefore serve as the initial treatment of choice.

  5. Effects of three types of potentially biasing information on symptom severity judgments for major depressive episode.

    Science.gov (United States)

    Mumma, Gregory H

    2002-10-01

    Two experiments examined the effects of potentially biasing information on judgments of symptom severity pertaining to the diagnosis of major depressive episode (MDE). In both experiments, clinicians viewed videotapes of two actor-simulated patients responding to questions from a standardized diagnostic interview. In Study 1, an expectancy effect was found for both patients such that prior information about a clear-cut history of depression resulted in lower rated severity of current symptoms. In addition, a halo effect was observed for one patient in Study 1 and both patients in Study 2: Clear-cut depressive nonverbal behavior (DNVB) resulted in greater rated severity for symptoms that should not have been affected (e.g., appetite/weight change, suicidal ideation). Clear-cut versus near-threshold information for the two essential criteria for MDE did not affect subsequent judgments in either study. Implications for diagnostic interviewing are discussed. Copyright 2002 Wiley Periodicals, Inc. J Clin Psychol 58: 1327-1345, 2002.

  6. Buildup factors for multilayer shieldings in deterministic methods and their comparison with Monte Carlo

    International Nuclear Information System (INIS)

    Listjak, M.; Slavik, O.; Kubovcova, D.; Vermeersch, F.

    2008-01-01

    In general, there are two ways to calculate effective doses. The first is to use deterministic methods such as the point kernel method implemented in Visiplan or Microshield. These calculations are very fast, but in terms of result precision they are not well suited to complex geometries with shielding composed of more than one material. Nevertheless, such programs are sufficient for ALARA optimisation calculations. On the other side are Monte Carlo methods. This kind of calculation is quite precise in comparison with reality, but the calculation time is usually very long. Deterministic programs have one disadvantage: usually there is an option to choose the buildup factor (BUF) for only one material, even in multilayer stratified-slab shielding problems where the shielding is composed of different materials. In the literature, different formulas have been proposed for multilayer BUF approximation. The aim of this paper was to examine these formulas and compare them with MCNP calculations. First, the results of Visiplan and Microshield were compared. A simple geometry was modelled: a point source behind single- and double-slab shielding. For buildup calculations, the Geometric Progression method (a feature of the newest version of Visiplan) was chosen because it shows smaller deviations than Taylor fitting. (authors)
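    A point-kernel dose calculation with a buildup factor can be sketched as follows (a simplified illustration: the Geometric Progression fit is reduced here to a constant K, and the coefficients b and K are invented for illustration rather than taken from the ANSI/ANS-6.4.3 tables): the uncollided flux S·e^(−μr)/(4πr²) is multiplied by B(μr) to account for scattered photons.

```python
import math

def gp_buildup(x, b, K):
    # Simplified Geometric Progression form with constant K:
    # B(x) = 1 + (b - 1) * (K**x - 1) / (K - 1), so that B(0) = 1.
    if abs(K - 1.0) < 1e-12:
        return 1.0 + (b - 1.0) * x
    return 1.0 + (b - 1.0) * (K**x - 1.0) / (K - 1.0)

def point_kernel_flux(S, mu, r, b, K):
    # Uncollided flux times buildup: B(mu*r) * S * exp(-mu*r) / (4*pi*r**2).
    x = mu * r  # shield thickness in mean free paths
    return gp_buildup(x, b, K) * S * math.exp(-x) / (4.0 * math.pi * r**2)

# Invented source strength and attenuation coefficient, for illustration only.
flux = point_kernel_flux(S=1e6, mu=0.06, r=50.0, b=2.0, K=1.4)
```

The multilayer question examined in the paper is precisely which single B(x) to use when the slab is a stack of different materials, since each material has its own fitted b and K.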

  7. Procedures with deterministic risk in interventionist radiology; Procedimientos con riesgo deterministico en radiologia intervencionista

    Energy Technology Data Exchange (ETDEWEB)

    Tort Ausina, Isabel; Ruiz-Cruces, Rafael; Perez Martinez, Manuel; Carrera Magarino, Francisco; Diez de los Rios, Antonio [Universidad de Malaga (Spain). Facultad de Medicina. Grupo de Investigacion en Proteccion Radiologica]. E-mail: rrcmf@uma.es

    2001-07-01

    Deterministic effects in interventional radiology arise from skin irradiation. The objective of this work is to estimate the deterministic risk to the organs exposed in IR procedures. Four procedures were selected: covered stent in the abdominal aorta; shunt carry-digs; embolization of varicocele; and mesenteric arteriography with venous return. They presented maximum values of the dose-area product (DAP), and organ doses were calculated by means of computer programs (Eff-Dose, PCXMC and OSD). The DAP was measured with a flat ionization chamber (PTW Diamentor M2). Although few cases are available so far, the high dose values in kidney and testicles imply that recommendations must be applied to avoid high exposures, motivating the personnel to change the irradiation fields, to use collimation and to use low dose rates in X-ray fluoroscopy. Dispersion between the dose values from the different programs is observed, which raises the question of which of them should be considered, although it seems that Eff-Dose could be recommended, based on Report 262 of the NRPB.

  9. Deterministic Safety Analysis for Nuclear Power Plants. Specific Safety Guide (Russian Edition)

    International Nuclear Information System (INIS)

    2014-01-01

    The objective of this Safety Guide is to provide harmonized guidance to designers, operators, regulators and providers of technical support on deterministic safety analysis for nuclear power plants. It provides information on the utilization of the results of such analysis for safety and reliability improvements. The Safety Guide addresses conservative, best estimate and uncertainty evaluation approaches to deterministic safety analysis and is applicable to current and future designs. Contents: 1. Introduction; 2. Grouping of initiating events and associated transients relating to plant states; 3. Deterministic safety analysis and acceptance criteria; 4. Conservative deterministic safety analysis; 5. Best estimate plus uncertainty analysis; 6. Verification and validation of computer codes; 7. Relation of deterministic safety analysis to engineering aspects of safety and probabilistic safety analysis; 8. Application of deterministic safety analysis; 9. Source term evaluation for operational states and accident conditions; References

  10. Examination of several pre-oxidation procedures and their effect as hydrogen permeation-barrier

    International Nuclear Information System (INIS)

    Heimes, E.

    1986-03-01

    Several pre-oxidation procedures were tested with respect to their effect as a hydrogen permeation barrier on the high-temperature alloys Hastelloy X and Inconel 617. For Hastelloy X samples coated on the outside with alumina, the measured impeding effects were very low. Surface aluminium enrichment was therefore accomplished by different procedures before selective oxidation. The aluminium hot-dipping method generated oxide layers with a four- to fivefold higher impeding effect compared to specimens fabricated by the standard procedure. Metallographic follow-up examination showed that the higher impeding effects are due to improved adhesion between the oxide layer and the high-temperature material, whereby less oxide cracking occurs during the cooling period after manufacturing. (orig./PW) [de]

  11. Colostomy is a simple and effective procedure for severe chronic radiation proctitis.

    Science.gov (United States)

    Yuan, Zi-Xu; Ma, Teng-Hui; Wang, Huai-Ming; Zhong, Qing-Hua; Yu, Xi-Hu; Qin, Qi-Yuan; Wang, Jian-Ping; Wang, Lei

    2016-06-28

    To assess the efficacy and safety of diverting colostomy in treating severe hemorrhagic chronic radiation proctitis (CRP). Patients with severe hemorrhagic CRP who were admitted from 2008 to 2014 were enrolled into this study. All CRP patients were diagnosed by a combination of pelvic radiation history, clinical rectal bleeding, and endoscopic findings. Inclusion criteria were CRP patients with refractory bleeding and moderate to severe anemia. The colostomy group comprised patients who underwent diverting colostomy, while the control group included patients who received conservative treatment. Remission of bleeding was defined as complete cessation or only occasional bleeding that needed no further treatment. The primary outcome was bleeding remission at 6 mo after treatment. Quality of life before treatment and at follow-up was evaluated according to the EORTC QLQ-C30. Severe CRP complications were recorded during follow-up. Forty-seven consecutive patients were enrolled, including 22 in the colostomy group and 27 in the conservative treatment group. Compared to conservative treatment, colostomy achieved a higher rate of bleeding remission (94% vs 12%), especially in control of transfusion-dependent bleeding (100% vs 0%), offered better control of refractory perianal pain (100% vs 0%), and yielded lower bleeding scores. Colostomy achieved better remission of both moderate bleeding (100% vs 21.5%, P = 0.002) and severe bleeding (100% vs 0%). Quality of life improved after colostomy across global health, function, and symptom scales, but did not improve in the control group. Pathological evaluation after colostomy found diffuse chronic inflammatory cells and massive fibrous collagen deposition in the rectal wall, revealing ongoing fibrosis formation. Diverting colostomy is a simple, effective and safe procedure for severe hemorrhagic CRP. Colostomy can improve quality of life and reduce serious complications secondary to radiotherapy.

  12. Does recall period have an effect on cancer patients' ratings of the severity of multiple symptoms?

    Science.gov (United States)

    Shi, Qiuling; Trask, Peter C; Wang, Xin Shelley; Mendoza, Tito R; Apraku, Winifred A; Malekifar, Maggie; Cleeland, Charles S

    2010-08-01

    Choosing an appropriate recall period for symptom assessment in a clinical trial is dependent on the design and purpose of the trial. To examine the effects of recall on symptom severity ratings by comparing ratings made using 24-hour and seven-day recall periods of the MD Anderson Symptom Inventory (MDASI). Forty-two patients in their third to eighth week of chemoradiation rated their symptoms using the MDASI on two separate occasions (T1 and T2), one week apart. At T1, patients were randomly assigned to rate symptoms using either a 24-hour or a seven-day recall. At T2, patients rated symptoms using the recall period not used at their first visit. Comparing the 24-hour and seven-day recall periods, the correlation coefficient for total symptom severity was 0.888. All correlation coefficients for symptom severity items were >0.7 except for distress (r=0.67). The percentages of moderate to severe symptoms (rated ≥5) were consistent for both recall periods, with no significant difference between recall periods in the prevalence of moderate to severe symptoms. Cronbach alpha coefficients for both 24-hour and seven-day recalls were >0.8. Symptoms from both recall periods were more severe for patients with poorer performance status. Twenty patients were cognitively debriefed; 70% thought that the seven-day recall was "more appropriate" for the MDASI, but 85% did not think that recall period would influence their answers. This study demonstrated that the MDASI in a seven-day recall format has psychometric properties consistent with the 24-hour recall version, which may promote its use in future cancer clinical trials and may inform the choice of recall period when symptoms are outcome measures. Copyright (c) 2010 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
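
The reliability statistics this record reports (a Pearson correlation between the two recall formats, and Cronbach's alpha per format) are straightforward to compute. A minimal pure-Python sketch with invented ratings (the study's raw data are not given in the abstract):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one inner list per item):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    def var(v):
        m = sum(v) / len(v)
        return sum((a - m) ** 2 for a in v) / (len(v) - 1)
    k = len(items)
    n = len(items[0])
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical 0-10 severity ratings for the same six patients under the
# 24-hour and seven-day recall formats (illustrative numbers only).
recall_24h = [2, 5, 7, 3, 8, 4]
recall_7d = [3, 5, 6, 3, 9, 4]
print(round(pearson_r(recall_24h, recall_7d), 3))  # high agreement, as in the study
```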

  13. Effect of Enamel Caries Lesion Baseline Severity on Fluoride Dose-Response

    Directory of Open Access Journals (Sweden)

    Frank Lippert

    2017-01-01

    This study aimed to investigate the effect of enamel caries lesion baseline severity on fluoride dose-response under pH cycling conditions. Early caries lesions were created in human enamel specimens at four different severities (8, 16, 24, and 36 h). Lesions were allocated to treatment groups (0, 83, and 367 ppm fluoride as sodium fluoride) based on Vickers surface microhardness (VHN) and pH cycled for 5 d. The cycling model comprised 3 × 1 min fluoride treatments sandwiched between 2 × 60 min demineralization challenges with specimens stored in artificial saliva in between. VHN was measured again and changes versus lesion baseline were calculated (ΔVHN). Data were analyzed using two-way ANOVA (p<0.05). Increased demineralization times led to increased surface softening. The lesion severity × fluoride concentration interaction was significant (p<0.001). Fluoride dose-response was observed in all groups. Lesions initially demineralized for 16 and 8 h showed similar overall rehardening (ΔVHN) and more than the 24 and 36 h lesions, which were similar. The 8 h lesions showed the greatest fluoride response differential (367 versus 0 ppm F), which diminished with increasing lesion baseline severity. The extent of rehardening as a result of the 0 ppm F treatment increased with increasing lesion baseline severity, whereas it decreased for the fluoride treatments. In conclusion, lesion baseline severity impacts the extent of the fluoride dose-response.

  14. Assessing the value of mepolizumab for severe eosinophilic asthma: a cost-effectiveness analysis.

    Science.gov (United States)

    Whittington, Melanie D; McQueen, R Brett; Ollendorf, Daniel A; Tice, Jeffrey A; Chapman, Richard H; Pearson, Steven D; Campbell, Jonathan D

    2017-02-01

    Adding mepolizumab to standard treatment with inhaled corticosteroids and controller medications could decrease asthma exacerbations and use of long-term oral steroids in patients with severe disease and increased eosinophils; however, mepolizumab is costly and its cost effectiveness is unknown. To estimate the cost effectiveness of mepolizumab. A Markov model was used to determine the incremental cost per quality-adjusted life year (QALY) gained for mepolizumab plus standard of care (SoC) and for SoC alone. The population, adults with severe eosinophilic asthma, was modeled for a lifetime time horizon. A responder scenario analysis was conducted to determine the cost effectiveness for a cohort able to achieve and maintain asthma control. Over a lifetime treatment horizon, 23.96 exacerbations were averted per patient receiving mepolizumab plus SoC. Avoidance of exacerbations and decrease in long-term oral steroid use resulted in more than $18,000 in cost offsets among those receiving mepolizumab, but treatment costs increased by more than $600,000. Treatment with mepolizumab plus SoC vs SoC alone resulted in a cost-effectiveness estimate of $386,000 per QALY. To achieve cost effectiveness of approximately $150,000 per QALY, mepolizumab would require a more than 60% price discount. At current pricing, treating a responder cohort yielded cost-effectiveness estimates near $160,000 per QALY. The estimated cost effectiveness of mepolizumab exceeds value thresholds. Achieving these thresholds would require significant discounts from the current list price. Alternatively, treatment limited to responders improves the cost effectiveness toward these thresholds, but it remains slightly above them. Payers interested in improving the efficiency of health care resources should consider negotiations of the mepolizumab price and ways to predict and assess the response to mepolizumab. Copyright © 2016 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
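
The headline figure in this record is an incremental cost-effectiveness ratio (ICER): the extra cost of the new strategy divided by the extra QALYs it yields versus the comparator. A minimal sketch with hypothetical per-patient totals chosen only to reproduce the reported $386,000/QALY magnitude (they are not the study's model inputs):

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical lifetime discounted cost and QALY totals per patient
# (illustrative; chosen to land on the reported magnitude).
cost_soc, qaly_soc = 100_000.0, 10.0
cost_mepo, qaly_mepo = 679_000.0, 11.5
print(icer(cost_mepo, qaly_mepo, cost_soc, qaly_soc))  # -> 386000.0
```

A strategy is deemed cost effective when its ICER falls below a willingness-to-pay threshold (the study uses roughly $150,000 per QALY).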

  15. Deterministic and stochastic algorithms for resolving the flow fields in ducts and networks using energy minimization

    Science.gov (United States)

    Sochi, Taha

    2016-09-01

    Several deterministic and stochastic multi-variable global optimization algorithms (Conjugate Gradient, Nelder-Mead, Quasi-Newton and global) are investigated in conjunction with the energy minimization principle to resolve the pressure and volumetric flow rate fields in single ducts and networks of interconnected ducts. The algorithms are tested with seven types of fluid: Newtonian, power law, Bingham, Herschel-Bulkley, Ellis, Ree-Eyring and Casson. The results obtained from all these algorithms for all these types of fluid agree very well with the analytically derived solutions obtained from the traditional methods, which are based on the conservation principles and fluid constitutive relations. The results confirm and generalize the findings of our previous investigations that the energy minimization principle lies at the heart of the dynamics of flow systems. The investigation also enriches the methods of computational fluid dynamics for solving the flow fields in tubes and networks for various types of Newtonian and non-Newtonian fluids.
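
As a toy illustration of the energy-minimization idea this record describes (not the paper's own algorithms), consider two parallel Poiseuille ducts with hydraulic resistances R1 and R2 carrying a fixed total flow Q. Minimizing the viscous dissipation E(q1) = R1·q1² + R2·(Q − q1)² by gradient descent recovers the classical flow split q1 = Q·R2/(R1 + R2):

```python
def minimize_dissipation(R1, R2, Q, lr=0.01, steps=5000):
    """Gradient descent on E(q1) = R1*q1**2 + R2*(Q - q1)**2 (q2 = Q - q1)."""
    q1 = 0.5 * Q  # start from an even split between the two ducts
    for _ in range(steps):
        grad = 2 * R1 * q1 - 2 * R2 * (Q - q1)  # dE/dq1
        q1 -= lr * grad
    return q1

R1, R2, Q = 1.0, 3.0, 8.0
q1 = minimize_dissipation(R1, R2, Q)
print(round(q1, 4))  # analytic optimum: Q*R2/(R1+R2) = 6.0
```

The same stationarity condition (equal pressure drop across parallel branches) is what the conservation-based traditional methods impose directly, which is why the two approaches agree.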

  16. Comparison of deterministic and stochastic methods for time-dependent Wigner simulations

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Sihong, E-mail: sihong@math.pku.edu.cn [LMAM and School of Mathematical Sciences, Peking University, Beijing 100871 (China); Sellier, Jean Michel, E-mail: jeanmichel.sellier@parallel.bas.bg [IICT, Bulgarian Academy of Sciences, Acad. G. Bonchev str. 25A, 1113 Sofia (Bulgaria)

    2015-11-01

    Recently a Monte Carlo method based on signed particles for time-dependent simulations of the Wigner equation has been proposed. While it has been thoroughly validated against physical benchmarks, no technical study about its numerical accuracy has been performed. To this end, this paper presents the first step towards the construction of firm mathematical foundations for the signed particle Wigner Monte Carlo method. An initial investigation is performed by means of comparisons with a cell average spectral element method, which is a highly accurate deterministic method used to provide reference solutions. Several different numerical tests involving the time-dependent evolution of a quantum wave-packet are performed and discussed in depth. In particular, this allows us to identify a set of crucial criteria for the signed particle Wigner Monte Carlo method to achieve a satisfactory accuracy.

  17. Pattern recognition methodologies and deterministic evaluation of seismic hazard: A strategy to increase earthquake preparedness

    International Nuclear Information System (INIS)

    Peresan, Antonella; Panza, Giuliano F.; Gorshkov, Alexander I.; Aoudia, Abdelkrim

    2001-05-01

    Several algorithms, structured according to a general pattern-recognition scheme, have been developed for the space-time identification of strong events. Currently, two such algorithms are applied to the Italian territory: one for the recognition of earthquake-prone areas and the other, namely the CN algorithm, for earthquake prediction purposes. These procedures can be viewed as independent experts, hence they can be combined to better constrain the alerted seismogenic area. We examine here the possibility of integrating CN intermediate-term medium-range earthquake predictions, pattern recognition of earthquake-prone areas and deterministic hazard maps, in order to associate CN Times of Increased Probability (TIPs) with a set of appropriate scenarios of ground motion. The advantage of this procedure mainly consists in the time information provided by predictions, which is useful for increasing the preparedness of safety measures and for prioritizing detailed seismic risk studies at a local scale. (author)

  18. Sexual orientation beliefs: their relationship to anti-gay attitudes and biological determinist arguments.

    Science.gov (United States)

    Hegarty, P; Pratto, F

    2001-01-01

    Previous studies which have measured beliefs about sexual orientation with either a single item, or a one-dimensional scale are discussed. In the present study beliefs were observed to vary along two dimensions: the "immutability" of sexual orientation and the "fundamentality" of a categorization of persons as heterosexuals and homosexuals. While conceptually related, these two dimensions were empirically distinct on several counts. They were negatively correlated with each other. Condemning attitudes toward lesbians and gay men were correlated positively with fundamentality but negatively with immutability. Immutability, but not fundamentality, affected the assimilation of a biological determinist argument. The relationship between sexual orientation beliefs and anti-gay prejudice is discussed and suggestions for empirical studies of sexual orientation beliefs are presented.

  19. Broken flow symmetry explains the dynamics of small particles in deterministic lateral displacement arrays.

    Science.gov (United States)

    Kim, Sung-Cheol; Wunsch, Benjamin H; Hu, Huan; Smith, Joshua T; Austin, Robert H; Stolovitzky, Gustavo

    2017-06-27

    Deterministic lateral displacement (DLD) is a technique for size fractionation of particles in continuous flow that has shown great potential for biological applications. Several theoretical models have been proposed, but experimental evidence has demonstrated that a rich class of intermediate migration behavior exists, which is not predicted. We present a unified theoretical framework to infer the path of particles in the whole array on the basis of trajectories in a unit cell. This framework explains many of the unexpected particle trajectories reported and can be used to design arrays for even nanoscale particle fractionation. We performed experiments that verify these predictions and used our model to develop a condenser array that achieves full particle separation with a single fluidic input.

  20. Faithful deterministic secure quantum communication and authentication protocol based on hyperentanglement against collective noise

    International Nuclear Information System (INIS)

    Chang Yan; Zhang Shi-Bin; Yan Li-Li; Han Gui-Hua

    2015-01-01

    Higher channel capacity and security are difficult to reach in a noisy channel. The loss of photons and the distortion of the qubit state are caused by noise. To solve these problems, in our study, a hyperentangled Bell state is used to design a faithful deterministic secure quantum communication and authentication protocol over collective-rotation and collective-dephasing noisy channels, which doubles the channel capacity compared with using an ordinary Bell state as a carrier; a logical hyperentangled Bell state immune to collective-rotation and collective-dephasing noise is constructed. The secret message is divided into several parts for transmission; however, the identity strings of Alice and Bob are reused. Unitary operations are not used. (paper)

  1. On the implementation of a deterministic secure coding protocol using polarization entangled photons

    OpenAIRE

    Ostermeyer, Martin; Walenta, Nino

    2007-01-01

    We demonstrate a prototype-implementation of deterministic information encoding for quantum key distribution (QKD) following the ping-pong coding protocol [K. Bostroem, T. Felbinger, Phys. Rev. Lett. 89 (2002) 187902-1]. Due to the deterministic nature of this protocol the need for post-processing the key is distinctly reduced compared to non-deterministic protocols. In the course of our implementation we analyze the practicability of the protocol and discuss some security aspects of informat...

  2. Effect of pre-stroke use of ACE inhibitors on ischemic stroke severity

    Directory of Open Access Journals (Sweden)

    Caplan Louis

    2005-06-01

    Background Recent trials suggest that angiotensin-converting enzyme inhibitors (ACEI) are effective in prevention of ischemic stroke, as measured by reduced stroke incidence. We aimed to compare stroke severity between stroke patients who were taking ACEI before their stroke onset and those who were not, to examine the effects of pretreatment with ACEI on ischemic stroke severity. Methods We retrospectively studied 126 consecutive patients presenting within 24 hours of ischemic stroke onset, as confirmed by diffusion-weighted magnetic resonance imaging (DWI). We calculated the NIHSS score at presentation, as the primary measure of clinical stroke severity, and categorized stroke severity as mild (NIHSS ≤ 7), moderate (NIHSS 8–13) or severe (NIHSS ≥ 14). We analyzed demographic data, risk-factor profile, blood pressure (BP) and medications on admission, and determined stroke mechanism according to TOAST criteria. We also measured the volumes of admission diffusion- and perfusion-weighted (DWI/PWI) magnetic resonance imaging lesions, as a secondary measure of ischemic tissue volume. We compared these variables among patients on ACEI and those who were not. Results Thirty-three patients (26%) were on ACE inhibitors. The overall median baseline NIHSS score was 5.5 (range 2–21) among ACEI-treated patients vs. 9 (range 1–36) in non-ACEI patients (p = 0.036). Patients on ACEI prior to their stroke had more mild and less severe strokes, and smaller DWI and PWI lesion volumes compared to non-ACEI treated patients. However, none of these differences were significant. Predictably, a higher percentage of patients on ACEI had a history of heart failure (p = 0.03). Age, time-to-imaging or neurological evaluation, risk-factor profile, concomitant therapy with lipid lowering, other antihypertensives or antithrombotic agents, or admission BP were comparable between the two groups. Conclusion Our results
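
The severity bands used in this study (mild NIHSS ≤ 7, moderate 8–13, severe ≥ 14) are a simple threshold rule; a minimal sketch:

```python
def nihss_category(score):
    """Severity bands used in the study: mild <= 7, moderate 8-13, severe >= 14."""
    if not 0 <= score <= 42:  # NIHSS total scores range from 0 to 42
        raise ValueError("NIHSS score out of range")
    if score <= 7:
        return "mild"
    if score <= 13:
        return "moderate"
    return "severe"

print(nihss_category(5.5))  # -> mild (median baseline NIHSS, ACEI-treated group)
print(nihss_category(9))    # -> moderate (median baseline NIHSS, non-ACEI group)
```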

  3. Effect of CPAP Therapy in Improving Daytime Sleepiness in Indian Patients with Moderate and Severe OSA.

    Science.gov (United States)

    Battan, Gulshan; Kumar, Sanjeev; Panwar, Ajay; Atam, Virendra; Kumar, Pradeep; Gangwar, Anil; Roy, Ujjawal

    2016-11-01

    Obstructive Sleep Apnoea (OSA) is a highly prevalent disease and a major public health issue in India. Excessive daytime sleepiness is an almost ubiquitous symptom of OSA. The Epworth Sleepiness Scale (ESS) score is a validated objective score to measure the degree of daytime sleepiness. Continuous Positive Airway Pressure (CPAP) therapy has been established as the gold standard treatment modality for OSA patients. A few Indian studies have reported the effectiveness of CPAP therapy in improving ESS scores after the 1st month of CPAP use. To observe both short-term (one month) and long-term (three month) effects of CPAP therapy on ESS scores in moderate to severe OSA patients. The patients complaining of excessive daytime sleepiness, snoring and choking episodes during sleep, consecutively presenting to the medicine OPD over a period of 2 years, were subjected to polysomnography (PSG). Seventy-three patients with an apnoea-hypopnea index (AHI) ≥15 were categorised as having moderate to severe forms of OSA (moderate OSA with AHI=15-30 and severe OSA with AHI >30), and were scheduled for an initial trial of CPAP therapy. Forty-seven patients reported good tolerance to CPAP therapy after a trial period of 2 weeks and comprised the final study group. ESS scores in these patients were recorded at baseline, and after the 1st and 3rd months of CPAP therapy, and statistically analysed for significance. Mean ESS scores at baseline among moderate and severe OSA patients were 13.67±2.29 and 16.56±1.87, respectively. ESS scores in both these subgroups improved significantly to 11.63±3.79 (p=0.022, CI: 0.3293-4.0106) and 14.13±3.74, p CPAP therapy. Likewise, mean ESS scores among moderate and severe OSA patients improved significantly to 9.84±2.97 (p=0.022, CI: 0.3293-4.0106) and 12.29±3.97, p CPAP therapy. The results of the present study show that CPAP therapy is significantly effective in improving ESS scores in Indian patients having moderate to severe OSA. Benefits
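
Before/after comparisons of ESS scores like those above are typically analysed with a paired t-test (the abstract reports p-values and confidence intervals but does not name the test). A minimal pure-Python sketch of the paired t statistic with invented scores:

```python
import math

def paired_t(before, after):
    """Paired t statistic for before/after scores; positive means improvement
    when lower scores are better, as with the ESS."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical ESS scores for six patients at baseline and after CPAP therapy.
before = [14, 16, 13, 15, 17, 14]
after = [11, 12, 10, 13, 14, 12]
t = paired_t(before, after)
print(round(t, 2))  # compare against the t distribution with n-1 df for a p-value
```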

  4. Effects of B4C control rod degradation under severe accident

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Si-Won; Park, Sang-Gil; Han, Sang-Ku [Atomic Creative Technology Co., Daejeon (Korea, Republic of)

    2016-10-15

    Boron carbide (B4C) is widely used as an absorber material in western boiling water reactors (BWR), some PWRs, the EPR, and Russian RBMK and VVER reactors. B4C oxidation is one of the important in-vessel phenomena. In the present paper, the main results and knowledge gained regarding B4C control rod degradation from the above-mentioned experiments are reviewed and organized to clarify their significance for severe accident consequences. In this paper, the role of B4C control rod oxidation and the subsequent degradation in severe accident consequences is reviewed with the available literature and reports of previous experimental programs regarding B4C oxidation. From this review, it seems that the contribution of B4C oxidation to the progression toward a more severe accident situation is not negligible. For future work, extensive experimental data interpretation will be performed to quantitatively assess the effect of B4C oxidation and degradation under the various postulated severe accident conditions.

  5. Punica granatum juice effects on oxidative stress in severe physical activity.

    Science.gov (United States)

    Naghizadeh-Baghi, Abbas; Mazani, Mohammad; Shadman-Fard, Ali; Nemati, Ali

    2015-02-01

    The aim of this study was to investigate the effects of Punica granatum juice on oxidative stress in young healthy males during severe physical activity. Our subjects were selected from healthy males aged 18-24 years. They were enrolled and randomly distributed into control and supplemented groups. 240 ml of Punica granatum juice and tap water were given to the supplement and control groups daily for two weeks, respectively. Fasting blood samples were taken at the start and the end of the two weeks of intervention. Subjects then performed a single bout of severe physical activity, after which fasting blood samples were taken. Fasting blood samples were used for testing of oxidative and antioxidative factors. Data were analyzed using descriptive statistical tests, paired samples t-test, and independent samples t-test. The levels of arylesterase, superoxide dismutase, glutathione peroxidase and total antioxidant capacity after severe physical activity in the supplement group were significantly increased (pPunica granatum juice significantly modulates oxidative stress and thus protects against severe physical activity oxidative injury in young healthy males.

  6. Effects of Burn Severity and Environmental Conditions on Post-Fire Regeneration in Siberian Larch Forest

    Directory of Open Access Journals (Sweden)

    Thuan Chu

    2017-03-01

    Post-fire forest regeneration is strongly influenced by abiotic and biotic heterogeneity in the pre- and post-fire environments, including fire regimes, species characteristics, landforms, hydrology, regional climate, and soil properties. Assessing these drivers is key to understanding the long-term effects of fire disturbances on forest succession. We evaluated multiple factors influencing patterns of variability in a post-fire boreal Larch (Larix sibirica) forest in Siberia. A time-series of remote sensing images was analyzed to estimate post-fire recovery as a response variable across the area burned in 1996. Our results suggested that burn severity and water content were primary controllers of both Larch forest recruitment and green vegetation cover, as defined by the forest recovery index (FRI) and the fractional vegetation cover (FVC), respectively. We found a high rate of Larch forest recruitment in sites of moderate burn severity, while a more severe burn was the preferable condition for quick occupation by vegetation that included early seral communities of shrubs, grasses, conifers and broadleaf trees. Sites close to water and that received higher solar energy during the summer months showed a higher rate of both recovery types, defined by the FRI and FVC, dependent on burn severity. In addition to these factors, topographic variables and pre-fire condition were important predictors of post-fire forest patterns. These results have direct implications for post-fire forest management in the Siberian boreal Larch region.

  7. The effect of Orobanche crenata infection severity in faba bean, field pea, and grass pea productivity.

    Directory of Open Access Journals (Sweden)

    Monica Fernandez-Aparicio

    2016-09-01

    Broomrape weeds (Orobanche and Phelipanche spp.) are root holoparasites that feed off a wide range of important crops. Among them, Orobanche crenata attacks legumes, complicating their inclusion in cropping systems along the Mediterranean area and West Asia. The detrimental effect of broomrape parasitism on crop yield can reach up to 100% depending on infection severity and the broomrape-crop association. This work provides field data on the consequences of O. crenata infection severity in three legume crops, i.e. faba bean, field pea and grass pea. Regression functions modelled productivity losses and revealed trends in dry matter allocation in relation to infection severity. The host species differentially limits parasitic sink strength, indicating different levels of broomrape tolerance at equivalent infection severities. Reductions in host aboveground biomass were observed starting at low infection severity, and half maximal inhibitory performance was predicted as 4.5, 8.2 and 1.5 parasites per faba bean, field pea and grass pea plant, respectively. Reductions in host biomass occurred in both vegetative and reproductive organs, the latter being more affected. The proportion of resources allocated within the parasite was concomitant with the reduction of host seed yield, indicating that parasite growth and host reproduction compete directly for resources within a host plant. However, the parasitic sink activity does not fully explain the total host biomass reduction, because the combined biomass of the host-parasite complex was lower than the biomass of uninfected plants. In grass pea, the seed yield was negligible at severities higher than 4 parasites per plant. In contrast, faba bean and field pea sustained low but significant seed production at the highest infection severity. Data on seed yield and seed number indicated that the sensitivity of field pea to O. crenata limited the production of grain yield by reducing seed number but maintaining seed size.

  8. The Effect of Orobanche crenata Infection Severity in Faba Bean, Field Pea, and Grass Pea Productivity.

    Science.gov (United States)

    Fernández-Aparicio, Mónica; Flores, Fernando; Rubiales, Diego

    2016-01-01

    Broomrape weeds (Orobanche and Phelipanche spp.) are root holoparasites that feed off a wide range of important crops. Among them, Orobanche crenata attacks legumes, complicating their inclusion in cropping systems along the Mediterranean area and West Asia. The detrimental effect of broomrape parasitism on crop yield can reach up to 100% depending on infection severity and the broomrape-crop association. This work provides field data on the consequences of O. crenata infection severity in three legume crops, i.e., faba bean, field pea, and grass pea. Regression functions modeled productivity losses and revealed trends in dry matter allocation in relation to infection severity. The host species differentially limits parasitic sink strength, indicating different levels of broomrape tolerance at equivalent infection severities. Reductions in host aboveground biomass were observed starting at low infection severity, and half maximal inhibitory performance was predicted as 4.5, 8.2, and 1.5 parasites per faba bean, field pea, and grass pea plant, respectively. Reductions in host biomass occurred in both vegetative and reproductive organs, the latter being more affected. The increase of resources allocated within the parasite was concomitant with the reduction of host seed yield, indicating that parasite growth and host reproduction compete directly for resources within a host plant. However, the parasitic sink activity does not fully explain the total host biomass reduction, because the combined biomass of the host-parasite complex was lower than the biomass of uninfected plants. In grass pea, the seed yield was negligible at severities higher than four parasites per plant. In contrast, faba bean and field pea sustained low but significant seed production at the highest infection severity. Data on seed yield and seed number indicated that the sensitivity of field pea to O. crenata limited the production of grain yield by reducing seed number but maintaining seed size. In contrast
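
The half-maximal inhibitory severities quoted in these two records (4.5, 8.2, and 1.5 parasites per plant) come from regressing host performance against infection severity. A minimal sketch, assuming a simple hyperbolic decay model B(n) = B0/(1 + n/I50) as a hypothetical stand-in for the study's unspecified regression functions, recovers I50 by least-squares grid search:

```python
def fit_i50(counts, biomass, b0):
    """Least-squares grid search for I50 in the model B(n) = b0 / (1 + n / i50)."""
    grid = [0.1 * k for k in range(1, 201)]  # candidate I50 values 0.1 .. 20.0
    def sse(i50):
        return sum((b - b0 / (1 + n / i50)) ** 2 for n, b in zip(counts, biomass))
    return min(grid, key=sse)

# Synthetic biomass data generated from I50 = 4.5 (the faba bean estimate)
# with an uninfected-host biomass b0 = 100 (arbitrary units).
counts = [0, 1, 2, 4, 8, 16]
biomass = [100 / (1 + n / 4.5) for n in counts]
print(fit_i50(counts, biomass, 100))  # recovers 4.5
```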

  9. Health effects of technologies for power generation: Contributions from normal operation, severe accidents and terrorist threat

    International Nuclear Information System (INIS)

    Hirschberg, Stefan; Bauer, Christian; Burgherr, Peter; Cazzoli, Eric; Heck, Thomas; Spada, Matteo; Treyer, Karin

    2016-01-01

    As part of a comprehensive analysis of current and future energy systems, we carried out numerous analyses of the health effects of a wide spectrum of electricity supply technologies, including advanced ones, operating in various countries under different conditions. The scope of the analysis covers full energy chains, i.e. fossil, nuclear and renewable power plants and the various stages of fuel cycles. State-of-the-art methods are used for the estimation of health effects. This paper addresses health effects in terms of reduced life expectancy in the context of normal operation, as well as fatalities resulting from severe accidents and potential terrorist attacks. Based on the numerical results and identified patterns, a comparative perspective on health effects associated with various electricity generation technologies and fuel cycles is provided. In particular, the estimates of health risks from normal operation can be compared with those resulting from severe accidents and hypothetical terrorist attacks. A novel approach to the analysis of terrorist threat against energy infrastructure was developed, implemented and applied to selected energy facilities in various locations. Finally, major limitations of the current approach are identified and recommendations for further work are given. - Highlights: • We provide state-of-the-art comparative assessment of energy health risks. • The scope of the analysis should to the extent possible cover full energy chains. • Health impacts from normal operation dominate the risks. • We present a novel approach to the analysis of terrorist threat. • Limitations include technology choices, geographical coverage and terrorist issues.

  10. Effects of fluids on microvascular perfusion in patients with severe sepsis.

    Science.gov (United States)

    Ospina-Tascon, Gustavo; Neves, Ana Paula; Occhipinti, Giovanna; Donadello, Katia; Büchele, Gustavo; Simion, Davide; Chierego, Maria-Luisa; Silva, Tatiana Oliveira; Fonseca, Adriana; Vincent, Jean-Louis; De Backer, Daniel

    2010-06-01

    To evaluate the effects of fluid administration on microcirculatory alterations in sepsis. With a Sidestream Dark Field device, we evaluated the effects of fluids on the sublingual microcirculation in 60 patients with severe sepsis. These patients were investigated either within 24 h (early, n = 37) or more than 48 h (late, n = 23) after a diagnosis of severe sepsis. Hemodynamic and microcirculatory measurements were obtained before and 30 min after administration of 1,000 ml Ringer's lactate (n = 29) or 400 ml 4% albumin (n = 31) solutions. Fluid administration increased perfused small vessel density from 3.5 (2.9-4.3) to 4.4 (3.7-4.9) n/mm (p density from 5.3 (4.4-5.9) to 5.6 (4.8-6.3) n/mm (p fluids were not related to changes in cardiac index (R(2) = 0.05, p = ns) or mean arterial pressure (R(2) = 0.04, p = ns). In this non-randomized trial, fluid administration improved microvascular perfusion in the early but not late phase of sepsis. This effect is independent of global hemodynamic effects and of the type of solution.

  11. Predictability effect on N400 reflects the severity of reading comprehension deficits in aphasia.

    Science.gov (United States)

    Chang, Chih-Ting; Lee, Chia-Ying; Chou, Chia-Ju; Fuh, Jong-Ling; Wu, Hsin-Chi

    2016-01-29

    Predictability effect on N400, in which low predictability words elicited a larger N400 than high predictability words did over central to posterior electrodes, has been used to index difficulty of lexical retrieval and semantic integration of words in sentence comprehension. This study examined predictability effect on N400 in aphasic patients to determine if the properties of N400 are suited to indexing the severity of reading comprehension deficits. Patients with aphasia were divided into high and low ability groups based on scores on the reading comprehension subtest in the Chinese Concise Aphasia Test (CCAT). The two aphasia groups, a group of healthy elders who were age-matched to the aphasic participants, and a group of young adults, were requested to read sentences that either ended with highly predictable words or unexpected but plausible words, while undergoing electroencephalography (EEG). The young adult and healthy elderly groups exhibited the typical centro-parietal distributed effect of predictability on N400; however, healthy elders exhibited a reduced N400 effect in a delayed time window compared to the young adults. Compared with the elderly control, the high ability aphasia group exhibited a comparable N400 effect in a more restricted time window; by contrast, the low ability aphasia group exhibited a frontal distributed N400 in a much later time window (400-700 ms). These data suggest that the severity of reading comprehension deficits affects predictability effect on a set of N400 characteristics (i.e., amplitude, time window, and topographic distribution), which may be effective as ERP signatures in the evaluation of language recovery in aphasia. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Effect of Renin-Angiotensin Blockers on Left Ventricular Remodeling in Severe Aortic Stenosis.

    Science.gov (United States)

    Goh, Serene Si-Ning; Sia, Ching-Hui; Ngiam, Nicholas Jinghao; Tan, Benjamin Yong-Qiang; Lee, Poay Sian; Tay, Edgar Lik-Wui; Kong, William Kok-Fai; Yeo, Tiong Cheng; Poh, Kian-Keong

    2017-06-01

    Studies have shown that medical therapy with renin-angiotensin blockers (RABs) may benefit patients with aortic stenosis (AS). However, their use and efficacy remain controversial, including in patients with low flow (LF) with preserved left ventricular ejection fraction (LVEF). We examined the effects of RAB use on LV remodeling in patients with severe AS with preserved LVEF, analyzing the differential effects in patients with LF compared with normal flow (NF). This is a retrospective study of 428 consecutive subjects from 2005 to 2014 with echocardiographic diagnosis of severe AS and preserved LVEF. Clinical and echocardiographic parameters were systematically collected and analyzed. Two hundred forty-two (57%) patients had LF. Sixty-four LF patients (26%) were treated with RAB. Patients on RAB treatment had a higher incidence of hyperlipidemia (69% vs 44%) and diabetes mellitus (53% vs 34%). Severity of AS in terms of valve area, transvalvular mean pressure gradient, and aortic valve resistance was similar between the two groups, as was the degree of LV diastolic function. The RAB group demonstrated significantly lower LV mass index with a correspondingly lower incidence of concentric LV hypertrophy. Regardless of the duration of RAB therapy, patients had increased odds of having a preserved LV mass index compared with those without RAB therapy. In conclusion, RAB therapy may be associated with less LV pathological remodeling and have a role in delaying patients from developing cardiovascular complications of AS. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Effect of several variables in the polymer toys additive migration to saliva.

    Science.gov (United States)

    Noguerol-Cal, R; López-Vilariño, J M; González-Rodríguez, M V; Barral-Losada, L

    2011-09-30

    The capacity of a representative group of polymeric additives (dyes, antioxidants, hindered amine light stabilizers (HALS) and antistatics) to migrate from plastic toys to saliva was analyzed, with the aim of protecting children who suck and bite toys. Most of the target additives are not covered by toy regulations, although adverse effects of some of them on human health have been demonstrated and they are regulated in legislation covering other commercial articles. In order to offer an effective and easy tool to perform these controls, migration tests by dynamic and static contact, followed by a preconcentration step by liquid-liquid extraction (LLE) and ultra performance liquid chromatographic analysis with ultraviolet-visible and evaporative light scattering detection (UPLC-UV/Vis-ELSD), have been optimized to evaluate the migrated amounts of the additives in saliva simulant. The detection limits of the migration methodologies ranged from 1.30 × 10(-3) to 8.68 × 10(-2) mg migrated (L simulant)(-1). The influence of several variables on this mass transport, such as time, temperature and friction, was also analyzed in order to establish the most aggressive (worst-case) methodology for protecting consumers. Migration was observed for several of the studied additives, whose presence was demonstrated in several purchased commercial toys. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. [Effectiveness of individual supported employment for people with severe mental disorders].

    Science.gov (United States)

    Rodríguez Pulido, Francisco; Caballero Estebaranz, Nayra; Tallo Aldana, Elena; Méndez Abad, Manuel E; Hernández Álvarez-Sotomayor, M Carmen; López Reig, Susana; Vílchez de León, Patricia Inés; González-Dávila, Enrique

    2017-07-13

    To assess the effectiveness of an individual placement and support (IPS) strategy in people with severe mental disorders on the island of Tenerife (Spain). Patients of Community Mental Health Services with severe mental disorders were randomly assigned to two groups. One received IPS (n=124), and the control group (n=75) received standard job-search advice. Patients were followed up for an average of 3.4 years, and an analysis was made of how many patients worked at least one day, as well as their working hours, wages, number of contracts and number of hospital admissions. Non-parametric methods were used to compare the results (Mann-Whitney U test). The percentage of patients who worked at least one day was 99% in the IPS group compared with 75% in the control group; they worked on average 30.1 weeks per year vs 7.4; the monthly salary was € 777.9 vs € 599.9; the number of contracts per person was 3.89 vs 4.85, and hospital admissions were 0.19 vs 2.1. The IPS strategy is effective for the labour integration of people with severe mental illness, enabling them to work longer, earn higher wages and have fewer hospital admissions. Copyright © 2017 SESPAS. All rights reserved.

  15. Effects of the canine rattlesnake vaccine in moderate to severe cases of canine crotalid envenomation

    Directory of Open Access Journals (Sweden)

    Leonard MJ

    2014-10-01

    McGee J Leonard, Catherine Bresee, Andrew Cruikshank (Animal Specialty and Emergency Center, Los Angeles, CA, USA; The Biostatistics and Bioinformatics Research Center, Cedars-Sinai Medical Center, Los Angeles, CA, USA) Abstract: This is a retrospective multicenter study (2006–2012) examining a population of dogs with moderate to severe crotalid envenomation for protective effects of the canine rattlesnake vaccine. Five nonacademic emergency and referral veterinary hospitals in Southern California were involved in the study and contributed records on a total of 82 client-owned dogs treated for naturally occurring rattlesnake envenomation. All dogs received antivenin (Crotalidae polyvalent), with dosages ranging from one to three vials (mean: 1.3±0.6). Fourteen dogs (17%) had a history of prior vaccination against crotalid venom. In univariate logistic regression modeling, cases with lower body weight (P=0.0001) or higher snakebite severity scores (P<0.0001) were associated with greater morbidity. No statistically significant difference in morbidity or mortality between vaccinated and unvaccinated dogs was found. The findings of this study did not identify a significant protective effect of previous vaccination in cases of moderate to severe rattlesnake envenomation that require treatment with antivenin. Keywords: rattlesnake envenomation, vaccine, antivenin, canine

  16. A deterministic partial differential equation model for dose calculation in electron radiotherapy.

    Science.gov (United States)

    Duclous, R; Dubroca, B; Frank, M

    2010-07-07

    High-energy ionizing radiation is a prominent modality for the treatment of many cancers. The approaches to electron dose calculation can be categorized into semi-empirical models (e.g. Fermi-Eyges, convolution-superposition) and probabilistic methods (e.g. Monte Carlo). A third approach to dose calculation has only recently attracted attention in the medical physics community. This approach is based on the deterministic kinetic equations of radiative transfer. We derive a macroscopic partial differential equation model for electron transport in tissue. This model involves an angular closure in the phase space. It is exact for the free streaming and the isotropic regime. We solve it numerically by a newly developed HLLC scheme based on Berthon et al (2007 J. Sci. Comput. 31 347-89) that exactly preserves the key properties of the analytical solution on the discrete level. We discuss several test cases taken from the medical physics literature. A test case with an academic Henyey-Greenstein scattering kernel is considered. We compare our model to a benchmark discrete ordinate solution. A simplified model of electron interactions with tissue is employed to compute the dose of an electron beam in a water phantom, and a case of irradiation of the vertebral column. Here our model is compared to the PENELOPE Monte Carlo code. In the academic example, the fluences computed with the new model and a benchmark result differ by less than 1%. The depths at half maximum differ by less than 0.6%. In the two comparisons with Monte Carlo, our model gives qualitatively reasonable dose distributions. Due to the crude interaction model, these so far do not have the accuracy needed in clinical practice. However, the new model has a computational cost that is less than one-tenth of the cost of a Monte Carlo simulation. In addition, simulations can be set up in a similar way as a Monte Carlo simulation. More detailed effects, such as coupled electron-photon transport and bremsstrahlung, remain to be incorporated.

  17. A deterministic partial differential equation model for dose calculation in electron radiotherapy

    Science.gov (United States)

    Duclous, R.; Dubroca, B.; Frank, M.

    2010-07-01

    High-energy ionizing radiation is a prominent modality for the treatment of many cancers. The approaches to electron dose calculation can be categorized into semi-empirical models (e.g. Fermi-Eyges, convolution-superposition) and probabilistic methods (e.g. Monte Carlo). A third approach to dose calculation has only recently attracted attention in the medical physics community. This approach is based on the deterministic kinetic equations of radiative transfer. We derive a macroscopic partial differential equation model for electron transport in tissue. This model involves an angular closure in the phase space. It is exact for the free streaming and the isotropic regime. We solve it numerically by a newly developed HLLC scheme based on Berthon et al (2007 J. Sci. Comput. 31 347-89) that exactly preserves the key properties of the analytical solution on the discrete level. We discuss several test cases taken from the medical physics literature. A test case with an academic Henyey-Greenstein scattering kernel is considered. We compare our model to a benchmark discrete ordinate solution. A simplified model of electron interactions with tissue is employed to compute the dose of an electron beam in a water phantom, and a case of irradiation of the vertebral column. Here our model is compared to the PENELOPE Monte Carlo code. In the academic example, the fluences computed with the new model and a benchmark result differ by less than 1%. The depths at half maximum differ by less than 0.6%. In the two comparisons with Monte Carlo, our model gives qualitatively reasonable dose distributions. Due to the crude interaction model, these so far do not have the accuracy needed in clinical practice. However, the new model has a computational cost that is less than one-tenth of the cost of a Monte Carlo simulation. In addition, simulations can be set up in a similar way as a Monte Carlo simulation. 
More detailed effects, such as coupled electron-photon transport and bremsstrahlung, remain to be incorporated.

  18. Comparative effectiveness of mepolizumab and omalizumab in severe asthma: An indirect treatment comparison.

    Science.gov (United States)

    Cockle, Sarah M; Stynes, Gillian; Gunsoy, Necdet B; Parks, Daniel; Alfonso-Cristancho, Rafael; Wex, Jaro; Bradford, Eric S; Albers, Frank C; Willson, Jenny

    2017-02-01

    Severe asthma is a heterogeneous disease. Patients with both eosinophilic and allergic asthma phenotypes may be eligible for treatment with mepolizumab and omalizumab. Evidence on the relative effectiveness of these treatments in this 'overlap' population would be informative for clinical and payer decision making. A systematic literature review and indirect treatment comparison (Bayesian framework) were performed to assess the comparative effectiveness and tolerability of mepolizumab and omalizumab, as add-ons to standard of care. Studies included in the primary analysis were double-blind, randomized controlled trials, ≥12 weeks in duration, enrolling patients with severe asthma with a documented exacerbation history and receiving high-dose inhaled corticosteroids plus ≥1 additional controller. Two populations were examined: patients potentially eligible for 1) both treatments (Overlap population) and 2) either treatment (Trial population). In the Overlap population, no differences between treatments in clinically significant exacerbations and exacerbations requiring hospitalization were found, although trends favored mepolizumab (rate ratio [RR]: 0.66 [95% credible interval (CrI): 0.37, 1.19]; 0.19 [0.02, 2.32], respectively). In the Trial population, mepolizumab treatment produced greater reductions in clinically significant exacerbations (RR: 0.63 [95% CrI: 0.45, 0.89]) but not exacerbations requiring hospitalization compared with omalizumab (RR: 0.58 [95% CrI: 0.16, 2.13]), although the trend favored mepolizumab. Both treatments had broadly comparable effects on lung function, and similar tolerability profiles. Whilst this analysis has limitations due to a restricted evidence base and residual heterogeneity, it showed that in patients with severe asthma, mepolizumab seems to be at least as effective as omalizumab and that the tolerability profiles of the two treatments did not meaningfully differ. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Acute myositis: an unusual and severe side effect of docetaxel: a case report and literature review.

    Science.gov (United States)

    Rochigneux, Philippe; Schleinitz, Nicolas; Ebbo, Mikael; Aymonier, Marie; Pourroy, Bertrand; Boissier, Romain; Salas, Sébastien; Deville, Jean-Laurent

    2018-06-01

    Docetaxel is an antimicrotubule cytotoxic agent prescribed widely by medical oncologists in multiple tumor types (breast, lung, prostate, stomach, and head and neck). However, the side effects of docetaxel are numerous (cytopenia, peripheral edema, myalgia, arthralgia, alopecia, and sensory neuropathy), and concerns have recently been raised about neutropenic enterocolitis in France. Here, we report the case of a 57-year-old patient with metastatic prostate cancer who developed severe grade IV myositis and fasciitis 1 week after his second docetaxel infusion. We reviewed the five cases of docetaxel-related myositis described in the literature and found that they occurred in patients with diabetes (n=5/5) or hypertension (n=4/5). A vascular toxicity may explain this severe complication, and patients with diabetes or hypertension should be monitored closely during docetaxel chemotherapy.

  20. Ketamine infusion was effective for severe pain of Non-Hodgkin lymphoma

    Directory of Open Access Journals (Sweden)

    Tomoki Nishiyama

    2017-10-01

    A 52-year-old man with a Non-Hodgkin lymphoma had severe pain in the right buttock and lower leg. A sustained-release morphine tablet 90 mg/day, intravenous morphine 40 mg/day, granisetron 9 mg/day, metoclopramide 30 mg/day, domperidone suppository 60 mg/day, intravenous hydroxyzine 25 mg/day, and haloperidol 20 mg/day did not decrease the pain, and side effects occurred. Intravenous ketamine 10 mg over 15 min was quite effective for analgesia. An infusion of ketamine was then started at 7 mg/h and increased to 10 mg/h with morphine 20 mg/day, which controlled the pain well with no side effects until his death. Keywords: Ketamine, Morphine, Cancer pain, Terminal

  1. Evaluation of several methods for assessing the effects of occupational exposure to radiation

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1980-05-01

    The evaluation of health effects in populations occupationally exposed to low-level ionizing radiation is a matter of considerable current controversy. The analysis of data on such exposures presents a variety of problems resulting from the time-dependent nature of the exposure data, certain selective biases found in working populations, and particularly the limits imposed by the size of the populations and the magnitudes of the exposures received. In this paper, several methods of analysis are presented and evaluated using data from the Hanford plant for illustration. Questions of interest include whether or not to utilize an external control, and how to handle the highly skewed exposure data most effectively. Expressions for the power of various procedures are used not only to compare methods but also to evaluate the potential for detecting effects in occupationally exposed populations.

  2. The effects of fire severity on ectomycorrhizal colonization and morphometric features in Pinus pinaster Ait. seedlings

    Energy Technology Data Exchange (ETDEWEB)

    Vásquez-Gassibe, P.; Oria-de-Rueda, J.A.; Santos-del-Blanco, L.; Martín-Pinto, P.

    2016-07-01

    Aim of study: Mycorrhizal fungi in Mediterranean forests play a key role in the complex process of recovery after wildfires. A broader understanding of an important pyrophytic species such as Pinus pinaster and its fungal symbionts is thus necessary for forest restoration purposes. This study aims to assess the effects of ectomycorrhizal symbiosis on maritime pine seedlings and how fire severity affects fungal colonization ability. Area of study: Central Spain, in a Mediterranean region typically affected by wildfires and dominated by Pinus pinaster, a species adapted to fire disturbance. Material and Methods: We studied P. pinaster root apexes from seedlings grown in soils collected one year after fire in undisturbed sites, sites moderately affected by fire and sites highly affected by fire. Natural ectomycorrhization was observed at the whole root system level as well as at two root vertical sections (0-10 cm and 10-20 cm). We also measured several morphometric traits (tap root length, shoot length, dry biomass of shoots and root/shoot ratio), which were used to test the influence of fire severity and soil chemistry upon them. Main results: Ectomycorrhizal colonization in undisturbed soils, for the total root system and for the separate root vertical sections, was higher than in soils that had been affected by fire to some degree. Conversely, seedling vegetative size increased with fire severity. Research highlights: Fire severity affected soil properties and mycorrhizal colonization one year after occurrence, thus affecting plant development. These findings can contribute to a better knowledge of the factors mediating successful establishment of P. pinaster in Mediterranean forests after wildfires. (Author)

  3. Effect of severe anaemia on renal function: a case-control study

    International Nuclear Information System (INIS)

    Kumar, A.; Hentok, P.; Chandrashekar, N.; Thomas, E.J.; Tripathi, M.; Bal, C.S.; Ghosh, A.; Jailkhani, B.L.; Malhotra, O.P.

    2002-01-01

    Aim: Anaemia, if severe, causes multi-systemic functional changes. We investigated the effect of severe anaemia on renal function. Materials and Methods: A total of 66 patients with severe anaemia and 10 healthy controls were recruited into this study. The cases were divided into the following groups: group A, patients with Hb≤3 g/dl (n=33); group B, patients with Hb≤6 but >3 g/dl (n=33); group C, healthy controls with normal renal function and Hb>12 g/dl. Of the 66 anaemic patients, 36 had nutritional anaemia (mainly iron deficiency; group A=20, group B=16), 24 were suffering from aplastic anaemia (group A=11, group B=13) and the remaining 6 had megaloblastic anaemia (group A=2, group B=4). No subject had hypertension, diabetes, primary renal dysfunction or any other systemic illness affecting the kidney. Various renal function test parameters and diagnostic renal failure indices were obtained for all subjects. GFR was calculated with the 2-sample method after injection of 99mTc-DTPA, and ERPF with the single-sample method after injection of 131I-OIH. Results: Fourteen patients had mild to moderate pedal oedema (10 in group A and 4 in group B). Of these, 8 had a palpable liver and signs of systemic congestion. Signs of raised systemic venous pressure (raised JVP) were found in 7 patients of group A. In about 55% of patients, chest X-ray showed a mildly to moderately enlarged heart with a disturbed cardiophrenic angle. Urine output was >600 ml/day in all cases. All renal functional parameters and indices were significantly reduced in anaemic patients and were suggestive of pre-renal failure. The reduction correlated well with the severity of anaemia. Conclusion: Severe anaemia leads to renal dysfunction with alteration of minor and major renal failure indices, which can be characterized as sub-clinical, pre-biochemical non-oliguric pre-renal failure.

  4. The effect of heartburn and acid reflux on the severity of nausea and vomiting of pregnancy

    Science.gov (United States)

    Gill, Simerpal Kaur; Maltepe, Caroline; Koren, Gideon

    2009-01-01

    BACKGROUND: Heartburn (HB) and acid reflux (RF) in the non-pregnant population can cause nausea and vomiting; therefore, it is plausible that in women with nausea and vomiting of pregnancy (NVP), HB/RF may increase the severity of symptoms. OBJECTIVE: To determine whether HB/RF during pregnancy contribute to increased severity of NVP. METHODS: A prospectively collected cohort of women who were experiencing NVP and HB, RF or both (n=194) was studied. The Pregnancy-Unique Quantification of Emesis and Nausea (PUQE) scale and its Well-being scale were used to compare the severity of the study cohort’s symptoms. This cohort was compared with a group of women experiencing NVP but no HB/RF (n=188). Multiple linear regression was used to control for the effects of confounding factors. RESULTS: Women with HB/RF reported higher PUQE scores (9.6±2.6) compared with controls (8.9±2.6) (P=0.02). Similarly, Well-being scores for women experiencing HB/RF were lower (4.3±2.1) compared with controls (4.9±2.0) (P=0.01). Multiple linear regression analysis demonstrated that increased PUQE scores (P=0.003) and decreased Well-being scores (P=0.005) were due to the presence of HB/RF as opposed to confounding factors such as pre-existing gastrointestinal conditions/symptoms, hyperemesis gravidarum in previous pregnancies and comorbidities. CONCLUSION: The present cohort study is the first to demonstrate that HB/RF are associated with increased severity of NVP. Managing HB/RF may improve the severity of NVP. PMID:19373420

  5. Classification and unification of the microscopic deterministic traffic models.

    Science.gov (United States)

    Yang, Bo; Monterola, Christopher

    2015-10-01

    We identify a universal mathematical structure in microscopic deterministic traffic models (with identical drivers), and thus we show that all such existing models in the literature, including both the two-phase and three-phase models, can be understood as special cases of a master model by expansion around a set of well-defined ground states. This allows any two traffic models to be properly compared and identified. The three-phase models are characterized by the vanishing of leading orders of expansion within a certain density range, and as an example the popular intelligent driver model is shown to be equivalent to a generalized optimal velocity (OV) model. We also explore the diverse solutions of the generalized OV model that can be important both for understanding human driving behaviors and algorithms for autonomous driverless vehicles.
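To make the optimal velocity (OV) family concrete, here is a minimal car-following sketch on a ring road. The tanh-shaped OV function, all parameter values, and the function names are illustrative assumptions of this summary, not taken from the paper: each vehicle simply relaxes toward an optimal velocity set by its headway.

```python
import numpy as np

def optimal_velocity(headway, v_max=30.0, h_c=25.0):
    # Illustrative sigmoidal OV function of the headway (m -> m/s).
    return v_max * (np.tanh((headway - h_c) / 10.0) + np.tanh(h_c / 10.0)) / 2.0

def step(x, v, dt=0.1, a=1.0, road_length=1000.0):
    # Headway to the car ahead on a periodic (ring) road.
    headway = (np.roll(x, -1) - x) % road_length
    dv = a * (optimal_velocity(headway) - v)  # relax toward the OV
    return x + v * dt, v + dv * dt

# 20 equally spaced cars starting at rest on a 1 km ring.
n = 20
x = np.linspace(0.0, 1000.0, n, endpoint=False)
v = np.zeros(n)
for _ in range(2000):
    x, v = step(x, v)
# The uniform configuration relaxes to the flow speed optimal_velocity(50.0).
```

Expanding such a model around the uniform-flow ground state, as the paper describes, is what exposes the coefficients that distinguish two-phase from three-phase behavior.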

  6. Mixed deterministic statistical modelling of regional ozone air pollution

    KAUST Repository

    Kalenderski, Stoitchko

    2011-03-17

    We develop a physically motivated statistical model for regional ozone air pollution by separating the ground-level pollutant concentration field into three components, namely: transport, local production and large-scale mean trend mostly dominated by emission rates. The model is novel in the field of environmental spatial statistics in that it is a combined deterministic-statistical model, which gives a new perspective to the modelling of air pollution. The model is presented in a Bayesian hierarchical formalism, and explicitly accounts for advection of pollutants, using the advection equation. We apply the model to a specific case of regional ozone pollution: the Lower Fraser Valley of British Columbia, Canada. As a predictive tool, we demonstrate that the model vastly outperforms existing, simpler modelling approaches. Our study highlights the importance of simultaneously considering different aspects of an air pollution problem as well as taking into account the physical bases that govern the processes of interest. © 2011 John Wiley & Sons, Ltd.
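The deterministic backbone of such a combined model is the advection equation. As a minimal illustration of how a pollutant field can be transported numerically (the first-order upwind discretization, the Gaussian initial blob, and all parameter values are assumptions for this sketch, not the paper's actual scheme):

```python
import numpy as np

def upwind_advection(c, u, dx, dt, steps):
    # First-order upwind scheme for dc/dt + u*dc/dx = 0 (u > 0), periodic domain.
    lam = u * dt / dx
    assert lam <= 1.0  # CFL stability condition
    for _ in range(steps):
        c = c - lam * (c - np.roll(c, 1))
    return c

# A pollutant blob advected once around a periodic unit domain.
n = 200
x = np.linspace(0.0, 1.0, n, endpoint=False)
c0 = np.exp(-200.0 * (x - 0.3) ** 2)
c = upwind_advection(c0, u=1.0, dx=1.0 / n, dt=0.5 / n, steps=2 * n)
# Mass is conserved exactly; the blob returns near x = 0.3, slightly smeared
# by the scheme's numerical diffusion.
```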

  7. Minaret, a deterministic neutron transport solver for nuclear core calculations

    International Nuclear Information System (INIS)

    Moller, J-Y.; Lautard, J-J.

    2011-01-01

    We present here MINARET, a deterministic transport solver for nuclear core calculations that solves the steady-state Boltzmann equation. The code follows the multi-group formalism to discretize the energy variable. It uses the discrete ordinates method to deal with the angular variable and a DGFEM to solve the Boltzmann equation spatially. The mesh is unstructured in 2D and semi-unstructured in 3D (cylindrical). Curved triangles can be used to fit the exact geometry. For the curved elements, two different sets of basis functions can be used. The transport solver is accelerated with a DSA method. Diffusion and SPN calculations are made possible by skipping the transport sweep in the source iteration. The transport calculations are parallelized with respect to the angular directions. Numerical results are presented for simple geometries and for the C5G7 benchmark, the JHR reactor and the ESFR (in 2D and 3D). Straight and curved finite element results are compared. (author)

  8. Analysis of deterministic cyclic gene regulatory network models with delays

    CERN Document Server

    Ahsen, Mehmet Eren; Niculescu, Silviu-Iulian

    2015-01-01

    This brief examines a deterministic, ODE-based model for gene regulatory networks (GRN) that incorporates nonlinearities and time-delayed feedback. An introductory chapter provides some insights into molecular biology and GRNs. The mathematical tools necessary for studying the GRN model are then reviewed, in particular Hill functions and Schwarzian derivatives. One chapter is devoted to the analysis of GRNs under negative feedback with time delays and a special case of a homogenous GRN is considered. Asymptotic stability analysis of GRNs under positive feedback is then considered in a separate chapter, in which conditions leading to bi-stability are derived. Graduate and advanced undergraduate students and researchers in control engineering, applied mathematics, systems biology and synthetic biology will find this brief to be a clear and concise introduction to the modeling and analysis of GRNs.
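The building blocks named above (a cyclic network, negative feedback, Hill nonlinearities) can be sketched in a few lines. This toy delay-free three-gene ring with Euler integration is an illustrative assumption of ours, not the book's exact model, which additionally includes time-delayed feedback:

```python
import numpy as np

def hill_repression(x, theta=1.0, m=4):
    # Decreasing Hill function: near 1 for x << theta, near 0 for x >> theta.
    return 1.0 / (1.0 + (x / theta) ** m)

def simulate(steps=20000, dt=0.001, beta=5.0):
    # Cyclic GRN: protein i is repressed by protein i-1 (a negative feedback ring,
    # since the loop contains an odd number of repressions).
    x = np.array([1.0, 1.2, 0.8])
    for _ in range(steps):
        production = beta * hill_repression(np.roll(x, 1))
        x = x + dt * (production - x)  # synthesis minus linear degradation
    return x

x_final = simulate()
# Concentrations stay positive and bounded by the maximal synthesis rate beta.
```

With a sufficiently steep Hill function, such odd-length repression rings can oscillate rather than settle at the symmetric fixed point, which is the kind of behavior the stability analysis in the brief characterizes.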

  9. Distributed Design of a Central Service to Ensure Deterministic Behavior

    Directory of Open Access Journals (Sweden)

    Imran Ali Jokhio

    2012-10-01

    A central authentication service for the EPC (Electronic Product Code) system architecture was proposed in our previous work. A challenge that always arises for a central service is how to ensure a bounded delay while processing emergent data. The growing volume of data in the EPC system architecture is tag data. Therefore, authenticating an increasing number of tags with a deterministic response time in the central authentication service is investigated, and a distributed authentication service is designed using a layered approach. A distributed design of tag searching services in the SOA (Service Oriented Architecture) style is also presented. Using the SOA architectural style, a self-adaptive authentication service over the Cloud is also proposed for the central authentication service, which may be extended to other applications.

  10. Deterministic Evolutionary Trajectories Influence Primary Tumor Growth: TRACERx Renal

    DEFF Research Database (Denmark)

    Turajlic, Samra; Xu, Hang; Litchfield, Kevin

    2018-01-01

    The evolutionary features of clear-cell renal cell carcinoma (ccRCC) have not been systematically studied to date. We analyzed 1,206 primary tumor regions from 101 patients recruited into the multi-center prospective study, TRACERx Renal. We observe up to 30 driver events per tumor and show that subclonal diversification is associated with known prognostic parameters. By resolving the patterns of driver event ordering, co-occurrence, and mutual exclusivity at clone level, we show the deterministic nature of clonal evolution. ccRCC can be grouped into seven evolutionary subtypes, ranging from tumors ... outcome. Our insights reconcile the variable clinical behavior of ccRCC and suggest evolutionary potential as a biomarker for both intervention and surveillance ...

  11. Minaret, a deterministic neutron transport solver for nuclear core calculations

    Energy Technology Data Exchange (ETDEWEB)

    Moller, J-Y.; Lautard, J-J., E-mail: jean-yves.moller@cea.fr, E-mail: jean-jacques.lautard@cea.fr [CEA - Centre de Saclay , Gif sur Yvette (France)

    2011-07-01

    We present here MINARET, a deterministic transport solver for nuclear core calculations that solves the steady-state Boltzmann equation. The code follows the multi-group formalism to discretize the energy variable. It uses the discrete ordinates method to deal with the angular variable and a DGFEM to solve the Boltzmann equation spatially. The mesh is unstructured in 2D and semi-unstructured in 3D (cylindrical). Curved triangles can be used to fit the exact geometry. For the curved elements, two different sets of basis functions can be used. The transport solver is accelerated with a DSA method. Diffusion and SPN calculations are made possible by skipping the transport sweep in the source iteration. The transport calculations are parallelized with respect to the angular directions. Numerical results are presented for simple geometries and for the C5G7 benchmark, the JHR reactor and the ESFR (in 2D and 3D). Straight and curved finite element results are compared. (author)

  12. Molecular dynamics with deterministic and stochastic numerical methods

    CERN Document Server

    Leimkuhler, Ben

    2015-01-01

    This book describes the mathematical underpinnings of algorithms used for molecular dynamics simulation, including both deterministic and stochastic numerical methods. Molecular dynamics is one of the most versatile and powerful methods of modern computational science and engineering and is used widely in chemistry, physics, materials science and biology. Understanding the foundations of numerical methods means knowing how to select the best one for a given problem (from the wide range of techniques on offer) and how to create new, efficient methods to address particular challenges as they arise in complex applications.  Aimed at a broad audience, this book presents the basic theory of Hamiltonian mechanics and stochastic differential equations, as well as topics including symplectic numerical methods, the handling of constraints and rigid bodies, the efficient treatment of Langevin dynamics, thermostats to control the molecular ensemble, multiple time-stepping, and the dissipative particle dynamics method...
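As one concrete example of the symplectic methods mentioned, velocity Verlet can be sketched as follows (the harmonic test force and all parameter values are illustrative assumptions, not taken from the book):

```python
def velocity_verlet(q, p, force, dt, steps, m=1.0):
    # Symplectic velocity-Verlet integrator: half kick, drift, half kick.
    f = force(q)
    for _ in range(steps):
        p += 0.5 * dt * f  # half kick
        q += dt * p / m    # drift
        f = force(q)
        p += 0.5 * dt * f  # half kick
    return q, p

# Harmonic oscillator F(q) = -k*q with k = m = 1. The exact dynamics conserve
# H = p**2/2 + k*q**2/2; a symplectic scheme keeps the energy error bounded
# over long times rather than drifting.
k = 1.0
q, p = velocity_verlet(1.0, 0.0, lambda q: -k * q, dt=0.01, steps=1000)
energy = 0.5 * p ** 2 + 0.5 * k * q ** 2
```

The bounded energy error of this integrator, as opposed to the secular drift of a naive Euler step, is exactly the kind of structural property the book's treatment of symplectic methods explains.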

  13. HSimulator: Hybrid Stochastic/Deterministic Simulation of Biochemical Reaction Networks

    Directory of Open Access Journals (Sweden)

    Luca Marchetti

    2017-01-01

    HSimulator is a multithreaded simulator for mass-action biochemical reaction systems placed in a well-mixed environment. HSimulator provides optimized implementations of a set of widespread state-of-the-art stochastic, deterministic, and hybrid simulation strategies, including the first publicly available implementation of the Hybrid Rejection-based Stochastic Simulation Algorithm (HRSSA). HRSSA, the fastest hybrid algorithm to date, allows for an efficient simulation of the models while ensuring the exact simulation of a subset of the reaction network modeling slow reactions. Benchmarks show that HSimulator is often considerably faster than the other considered simulators. The software, running on Java v6.0 or higher, offers a simulation GUI for modeling and visually exploring biological processes and a Javadoc-documented Java library to support the development of custom applications. HSimulator is released under the COSBI Shared Source license agreement (COSBI-SSLA).
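The exact stochastic component of such hybrid strategies is typically a Gillespie-style SSA. Here is a minimal sketch for a single degradation reaction; this is illustrative only, the function names and parameters are our assumptions, and it does not use HSimulator's API:

```python
import math
import random

def ssa_degradation(n0, k, t_end, rng):
    # Exact stochastic simulation of X -> 0 with propensity a = k*n.
    t, n = 0.0, n0
    while n > 0:
        a = k * n
        tau = -math.log(1.0 - rng.random()) / a  # exponential waiting time
        if t + tau > t_end:
            break
        t += tau
        n -= 1  # fire one degradation event
    return n

# Averaged over many runs, the SSA agrees with the deterministic ODE solution
# n(t) = n0 * exp(-k*t); here n0=1000, k=0.5, t=2.0 gives a mean of about 368.
runs = [ssa_degradation(1000, 0.5, 2.0, random.Random(seed)) for seed in range(200)]
mean_n = sum(runs) / len(runs)
```

A hybrid scheme in the spirit of HRSSA would treat abundant, fast species with such an ODE limit and reserve the exact SSA loop for the slow reactions.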

  14. Deterministic secure communications using two-mode squeezed states

    International Nuclear Information System (INIS)

    Marino, Alberto M.; Stroud, C. R. Jr.

    2006-01-01

    We propose a scheme for quantum cryptography that uses the squeezing phase of a two-mode squeezed state to transmit information securely between two parties. The basic principle behind this scheme is the fact that each mode of the squeezed field by itself does not contain any information regarding the squeezing phase. The squeezing phase can only be obtained through a joint measurement of the two modes. This, combined with the fact that it is possible to perform remote squeezing measurements, makes it possible to implement a secure quantum communication scheme in which a deterministic signal can be transmitted directly between two parties while the encryption is done automatically by the quantum correlations present in the two-mode squeezed state.

  15. Deterministically entangling multiple remote quantum memories inside an optical cavity

    Science.gov (United States)

    Yan, Zhihui; Liu, Yanhong; Yan, Jieli; Jia, Xiaojun

    2018-01-01

    Quantum memory for the nonclassical state of light and entanglement among multiple remote quantum nodes hold promise for a large-scale quantum network; however, continuous-variable (CV) memory efficiency and the achievable degree of entanglement are limited by imperfect implementations. Here we propose a scheme to deterministically entangle multiple distant atomic ensembles based on CV cavity-enhanced quantum memory. The memory efficiency can be improved with the help of cavity-enhanced electromagnetically induced transparency dynamics. A high degree of entanglement among multiple atomic ensembles can be obtained by mapping the quantum state from multiple entangled optical modes into a collection of atomic spin waves inside optical cavities. Besides being of interest in terms of unconditional entanglement among multiple macroscopic objects, our scheme paves the way towards the practical application of quantum networks.

  16. A deterministic model of nettle caterpillar life cycle

    Science.gov (United States)

    Syukriyah, Y.; Nuraini, N.; Handayani, D.

    2018-03-01

    Palm oil is a flagship product of the plantation sector in Indonesia, and palm oil productivity has considerable potential to increase every year. However, actual productivity remains below this potential. Pests and diseases are the main factors that can reduce production levels by up to 40%. Pest infestations can arise from various factors, so measures to control pest attacks should be prepared as early as possible. Caterpillars are the main pests in oil palm; nettle caterpillars are leaf eaters that can significantly decrease palm productivity. We construct a deterministic model that describes the life cycle of the caterpillar and its mitigation using a caterpillar predator. The equilibrium points of the model are analyzed, and numerical simulations are constructed to illustrate how the predator, as a natural enemy, affects the nettle caterpillar life cycle.
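Deterministic pest life-cycle models of this kind typically reduce to coupled ODEs for pest and predator densities. A minimal Lotka-Volterra-style sketch with forward-Euler integration (the parameters and function name are hypothetical, not the authors' model):

```python
def simulate_pest_predator(c0, p0, r=0.8, a=0.02, b=0.01, m=0.4,
                           dt=0.01, steps=2000):
    """Forward-Euler integration of
        dC/dt = r*C - a*C*P   (C: caterpillar density)
        dP/dt = b*C*P - m*P   (P: predator density)
    r: pest growth rate, a: predation rate,
    b: predator conversion rate, m: predator mortality."""
    c, p = c0, p0
    history = [(c, p)]
    for _ in range(steps):
        dc = r * c - a * c * p
        dp = b * c * p - m * p
        c, p = c + dt * dc, p + dt * dp
        history.append((c, p))
    return history

# start with 50 caterpillars and 10 predators (arbitrary units)
hist = simulate_pest_predator(c0=50.0, p0=10.0)
```

With these illustrative parameters the two populations cycle around the coexistence equilibrium (C* = m/b, P* = r/a), which is the qualitative behavior such equilibrium analyses examine.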

  17. Blood harmane, blood lead, and severity of hand tremor: evidence of additive effects.

    Science.gov (United States)

    Louis, Elan D; Factor-Litvak, Pam; Gerbin, Marina; Slavkovich, Vesna; Graziano, Joseph H; Jiang, Wendy; Zheng, Wei

    2011-03-01

    Tremor is a widespread phenomenon in human populations. Environmental factors are likely to play an etiological role. Harmane (1-methyl-9H-pyrido[3,4-β]indole) is a potent tremor-producing β-carboline alkaloid. Lead is another tremor-producing neurotoxicant. The effects of harmane and lead with respect to tremor have been studied in isolation. We tested the hypothesis that tremor would be particularly severe among individuals who had high blood concentrations of both of these toxicants. Blood concentrations of harmane and lead were each quantified in 257 individuals (106 essential tremor cases and 151 controls) enrolled in an environmental epidemiological study. Total tremor score (range = 0-36) was a clinical measure of tremor severity. The total tremor score ranged from 0 to 36, indicating that a full spectrum of tremor severities was captured in our sample. Blood harmane concentration correlated with total tremor score (p = 0.007), as did blood lead concentration (p = 0.045). The total tremor score was lowest in participants with both low blood harmane and lead concentrations (8.4 ± 8.2), intermediate in participants with high concentrations of either toxicant (10.5 ± 9.8), and highest in participants with high concentrations of both toxicants (13.7 ± 10.4) (p=0.01). Blood harmane and lead concentrations separately correlated with total tremor scores. Participants with high blood concentrations of both toxicants had the highest tremor scores, suggesting an additive effect of these toxicants on tremor severity. Given the very high population prevalence of tremor disorders, identifying environmental determinants is important for primary disease prevention. Copyright © 2010 Elsevier Inc. All rights reserved.

  18. Absorbing phase transitions in deterministic fixed-energy sandpile models

    Science.gov (United States)

    Park, Su-Chan

    2018-03-01

    We investigate the origin of the difference, which was noticed by Fey et al. [Phys. Rev. Lett. 104, 145703 (2010), 10.1103/PhysRevLett.104.145703], between the steady state density of an Abelian sandpile model (ASM) and the transition point of its corresponding deterministic fixed-energy sandpile model (DFES). Since a DFES is deterministic, its configuration space can be divided into two disjoint classes such that every configuration in one class evolves into an absorbing state, whereas no configuration in the other class can reach one. Since the two classes are separated in terms of toppling dynamics, the system can be made to exhibit an absorbing phase transition (APT) at various points that depend on the initial probability distribution of the configurations. Furthermore, we show that in general the transition point also depends on whether the infinite-size limit is taken before or after the infinite-time limit. To demonstrate this, we numerically study the two-dimensional DFES with the Bak-Tang-Wiesenfeld toppling rule (BTW-FES). We confirm that there are indeed many thresholds. Nonetheless, the critical phenomena at the various transition points are found to be universal. We furthermore discuss a microscopic absorbing phase transition, or so-called spreading dynamics, of the BTW-FES, finding that the phase transition in this setting is related to the dynamical isotropic percolation process rather than to self-organized criticality. In particular, we argue that choosing recurrent configurations of the corresponding ASM as initial configurations does not allow for a nontrivial APT in the DFES.
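The BTW toppling rule is simple to state concretely: any site holding at least four grains gives one grain to each of its four neighbours, and in the fixed-energy version grains are conserved on a periodic lattice, so the dynamics either reaches an absorbing configuration (all sites below threshold) or topples forever. A minimal sketch of one deterministic parallel update (illustrative only, not the paper's code):

```python
def step_btw_fes(grid):
    """One parallel BTW update on a periodic L x L lattice:
    every site with >= 4 grains topples simultaneously, sending
    one grain to each of its four neighbours (grains conserved).
    Returns (new_grid, number_of_topplings)."""
    n = len(grid)
    new = [row[:] for row in grid]
    topplings = 0
    for i in range(n):
        for j in range(n):
            if grid[i][j] >= 4:
                topplings += 1
                new[i][j] -= 4
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    new[(i + di) % n][(j + dj) % n] += 1
    return new, topplings

def relax(grid, max_steps=10000):
    """Iterate until an absorbing configuration (all sites < 4) or give up."""
    for t in range(max_steps):
        grid, n_top = step_btw_fes(grid)
        if n_top == 0:
            return grid, t  # absorbing state reached
    return grid, max_steps  # still active: above the transition

# density below threshold: a single overloaded site relaxes quickly
g = [[2, 2, 2], [2, 5, 2], [2, 2, 2]]
final, steps = relax(g)
```

Whether `relax` terminates depends only on the initial configuration, which is exactly the class structure (absorbing-reachable versus forever-active) the abstract describes.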

  19. Measures of thermodynamic irreversibility in deterministic and stochastic dynamics

    International Nuclear Information System (INIS)

    Ford, Ian J

    2015-01-01

    It is generally observed that if a dynamical system is sufficiently complex, then as time progresses it will share out energy and other properties amongst its component parts to eliminate any initial imbalances, retaining only fluctuations. This is known as energy dissipation and it is closely associated with the concept of thermodynamic irreversibility, measured by the increase in entropy according to the second law. It is of interest to quantify such behaviour from a dynamical rather than a thermodynamic perspective and to this end stochastic entropy production and the time-integrated dissipation function have been introduced as analogous measures of irreversibility, principally for stochastic and deterministic dynamics, respectively. We seek to compare these measures. First we modify the dissipation function to allow it to measure irreversibility in situations where the initial probability density function (pdf) of the system is asymmetric as well as symmetric in velocity. We propose that it tests for failure of what we call the obversibility of the system, to be contrasted with reversibility, the failure of which is assessed by stochastic entropy production. We note that the essential difference between stochastic entropy production and the time-integrated modified dissipation function lies in the sequence of procedures undertaken in the associated tests of irreversibility. We argue that an assumed symmetry of the initial pdf with respect to velocity inversion (within a framework of deterministic dynamics) can be incompatible with the Past Hypothesis, according to which there should be a statistical distinction between the behaviour of certain properties of an isolated system as it evolves into the far future and the remote past. Imposing symmetry on a velocity distribution is acceptable for many applications of statistical physics, but can introduce difficulties when discussing irreversible behaviour. (paper)

  20. Deterministic Earthquake Hazard Assessment by Public Agencies in California

    Science.gov (United States)

    Mualchin, L.

    2005-12-01

    Even in its short recorded history, California has experienced a number of damaging earthquakes that have resulted in new codes and other legislation for public safety. In particular, the 1971 San Fernando earthquake produced some of the most lasting results, such as the Hospital Safety Act, the Strong Motion Instrumentation Program, the Alquist-Priolo Special Studies Zone Act, and the California Department of Transportation (Caltrans) fault-based deterministic seismic hazard (DSH) map. The latter product provides values for earthquake ground motions based on Maximum Credible Earthquakes (MCEs), defined as the largest earthquakes that can reasonably be expected on faults in the current tectonic regime. For surface fault rupture displacement hazards, detailed study of the same faults applies. Originally, hospital, dam, and other critical facilities used seismic design criteria based on deterministic seismic hazard analyses (DSHA). However, probabilistic methods grew and took hold by introducing earthquake design criteria based on time factors and quantifying "uncertainties" by procedures such as logic trees. These probabilistic seismic hazard analyses (PSHA) ignored the DSH approach, and some agencies were influenced to adopt only the PSHA method. However, deficiencies in the PSHA method are becoming recognized, and its use is now the focus of strong debate. Caltrans is in the process of producing the fourth edition of its DSH map. Caltrans prefers the DSH method because it believes it is more realistic than the probabilistic method for assessing earthquake hazards that may affect critical facilities, and is the best available method for ensuring public safety. Its time-invariant values help to produce robust design criteria that are soundly based on physical evidence, and it is the method offering the least opportunity for unwelcome surprises.

  1. Effects of severe obstetric complications on women’s health and infant mortality in Benin

    Science.gov (United States)

    Filippi, Véronique; Goufodji, Sourou; Sismanidis, Charalambos; Kanhonou, Lydie; Fottrell, Edward; Ronsmans, Carine; Alihonou, Eusèbe; Patel, Vikram

    2010-01-01

    Summary Objective To document the impact of severe obstetric complications on post-partum health in mothers and mortality in babies over 12 months in Benin and to assess whether severe complications associated with perinatal death are particularly likely to lead to adverse health consequences. Methods Cohort study which followed women and their babies after a severe complication or an uncomplicated childbirth. Women were selected in hospitals and interviewed at home at discharge, and at 6 and 12 months post-partum. Women were invited for a medical check-up at 6 months and 12 months. Results The cohort includes 205 women with severe complications and a live birth, 64 women with severe complications and perinatal death and 440 women with uncomplicated delivery. Women with severe complications and a live birth were not dissimilar to women with a normal delivery in terms of post-partum health, except for hypertension [adjusted OR = 5.8 (1.9–17.0)], fever [adjusted OR = 1.71 (1.1–2.8)] and infant mortality [adjusted OR = 11.0 (0.8–158.2)]. Women with complications and perinatal death were at increased risk of depression [adjusted OR = 3.4 (1.3–9.0)], urine leakages [adjusted OR = 2.7 (1.2–5.8)], and to report poor health [adjusted OR = 5.27 (2.2–12.4)] and pregnancy’s negative effects on their life [adjusted OR = 4.11 (1.9–9.0)]. Uptake of post-natal services was poor in all groups. Conclusion Women in developing countries face a high risk of severe complications during pregnancy and delivery. These can lead to adverse consequences for their own health and that of their offspring. Resources are needed to ensure that pregnant women receive adequate care before, during and after discharge from hospital. Near-miss women with a perinatal death appear a particularly high-risk group. PMID:20406426

  2. Cryotherapy effect on oral mucositis severity among recipients of bone marrow transplantation: a literature review.

    Science.gov (United States)

    Tayyem, Abdel-Qader Mahmoud

    2014-08-01

    Oral mucositis is a distressing toxic effect of cancer therapy and one of the major side effects of the myeloablative conditioning used to prepare patients for bone marrow transplantation (BMT). Oral cryotherapy is one of the recent modalities used to prevent and manage oral mucositis. The purpose of this review is to clarify the cryotherapy effect on oral mucositis severity among patients receiving myeloablative conditioning followed by BMT. A literature search was performed using six different electronic databases: CINAHL®, MEDLINE®, Nursing Ovid, PubMed, Springer, and Science Direct. Six articles were deemed relevant and included in this review. Oral mucositis increases mortality rate, length of hospital stay, opioid use, and the need for parenteral nutrition usage. It also decreases patient's quality of life and his or her desire to complete treatment. However, oral cryotherapy significantly minimizes the incidence and severity of oral mucositis and decreases secondary oral mucositis complications. Using oral cryotherapy concurrently with a regular oral care protocol can improve its efficacy for preventing and managing oral mucositis. Additional studies should be conducted to create standard oral cryotherapy protocols.

  3. [The clinical effect observation for surgery of nose and pharyngeal auxiliary oral appliance in severe OSAHS].

    Science.gov (United States)

    Hui, Peilin; Xie, Yuping; Wei, Xiaoquan; Zhao, Lijun; Ma, Wei; Wang, Jinfeng; Ning, Jing; Xu, Chao; Yang, Qian; Kang, Hong

    2015-03-01

    To investigate the therapeutic effect of an oral modified device combined with nasal or pharyngeal enlargement surgery, and to evaluate the device's value as adjuvant therapy for patients with severe OSAHS after surgical treatment. 46 cases of severe OSAHS were diagnosed by PSG according to AHI and the lowest arterial oxygen saturation (LSaO2). Nasal or pharyngeal cavity expansion surgery was performed according to the site of the pathological change. All subjects were then divided into a combined group (n=26) and a surgery-alone group (n=20) according to personal preference. PSG was repeated for all subjects after 2 weeks and 3 months, and between-group and within-group differences were calculated on the basis of AHI, LAT, LSaO2, mean arterial oxygen saturation (MSaO2) and the sleep structures recorded by PSG. Subjective outcomes were collected by questionnaire. The AHI and LAT in the combined group were significantly lower, and LSaO2 significantly higher, than in the surgery-alone group (P<0.05). In both groups the N1 proportion was shorter and the N2 and N3 proportions longer after nasal or pharyngeal surgery compared with the preoperative state (P<0.05). The PSG data also showed a smaller proportion of shallow sleep and a larger proportion of slow-wave sleep in the combined group than in the surgery-alone group. Subjective outcomes, such as mental state, daytime sleepiness and physical strength, were also significantly improved in the combined group. The efficiency ratio of treatment was 85.0% in the surgery-alone group and 92.3% in the combined group. Nasal or pharyngeal cavity enlargement surgery combined with an oral modified device is a more effective treatment for patients with severe OSAHS, helping to prevent relapse and improve the long-term curative effect of surgery.

  4. Vertical flow constructed wetlands for domestic wastewater treatment on tropical conditions: effect of several design parameters

    DEFF Research Database (Denmark)

    Bohorquez, Eliana; Paredes, Diego; Arias, Carlos Alberto

    Vertical flow constructed wetland (VFCW) design and operation take into account several variables which affect performance. These aspects have been evaluated and documented in countries such as the USA, Denmark and Austria. In contrast, VFCW had not been studied in tropical countries and, specifically in Colombia, design and operation parameters are not yet defined. The objective of this study was to evaluate the effects of the filter medium, the feeding frequency and the presence of Heliconia psittacorum, a typical local plant, on domestic wastewater treatment under tropical conditions.

  5. On the effectiveness of surface severe plastic deformation by shot peening at cryogenic temperature

    Science.gov (United States)

    Novelli, M.; Fundenberger, J.-J.; Bocher, P.; Grosdidier, T.

    2016-12-01

    The effect of cryogenic temperature (CT) on the graded microstructures obtained by severe shot peening using surface mechanical attrition treatment (SMAT) was investigated for two austenitic steels that rely on different mechanisms for assisting plastic deformation. For the metastable 304L steel, the depth of the hardened region increases because CT promotes the formation of strain-induced martensite. Comparatively, for the 310S steel that remained austenitic, the size of the subsurface affected region decreases because of the improved strength of the material at CT, but the fine twinned nanostructures result in significant top-surface hardening.

  6. Treatment of acquired arteriovenous fistula with severe hemodynamic effects: therapeutic challenge

    Directory of Open Access Journals (Sweden)

    Bruna Ferreira Pilan

    2014-03-01

    Full Text Available A 34-year-old female patient with severe heart failure and pulmonary hypertension was diagnosed late with a high-output acquired arteriovenous fistula between the right common iliac vein and artery. The most probable cause was an iatrogenic vascular injury inflicted during a prior laparoscopic cholecystectomy. Treatment was conducted by placement of an endoprosthesis in the common iliac artery, achieving total exclusion of the fistula and complete remission of symptoms. Considering the options available for treating this type of lesion, endovascular techniques are becoming ever more effective and are now the first-choice option for management of this pathology.

  7. Effect of sclerostin antibody treatment in a mouse model of severe osteogenesis imperfecta.

    Science.gov (United States)

    Roschger, Andreas; Roschger, Paul; Keplingter, Petra; Klaushofer, Klaus; Abdullah, Sami; Kneissel, Michaela; Rauch, Frank

    2014-09-01

    Osteogenesis imperfecta (OI) is a heritable bone fragility disorder that is usually caused by mutations affecting collagen type I production in osteoblasts. Stimulation of bone formation through sclerostin antibody treatment (Sost-ab) has shown promising results in mouse models of relatively mild OI. We assessed the effect of once-weekly intravenous Sost-ab injections for 4 weeks in male Col1a1(Jrt)/+ mice, a model of severe dominant OI, starting either at 4 weeks (growing mice) or at 20 weeks (adult mice) of age. Sost-ab had no effect on weight or femur length. In OI mice, no significant treatment-associated differences in serum markers of bone formation (alkaline phosphatase activity, procollagen type I N-propeptide) or resorption (C-telopeptide of collagen type I) were found. Micro-CT analyses at the femur showed that Sost-ab treatment was associated with higher trabecular bone volume and higher cortical thickness in wild type mice at both ages and in growing OI mice, but not in adult OI mice. Three-point bending tests of the femur showed that in wild type but not in OI mice, Sost-ab was associated with higher ultimate load and work to failure. Quantitative backscattered electron imaging of the femur did not show any effect of Sost-ab on CaPeak (the most frequently occurring calcium concentration in the bone mineral density distribution), regardless of genotype, age or measurement location. Thus, Sost-ab had a larger effect in wild type than in Col1a1(Jrt)/+ mice. Previous studies had found marked improvements with Sost-ab in bone mass and strength in an OI mouse model with a milder phenotype. Our data therefore suggest that Sost-ab is less effective in a more severely affected OI mouse model. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. The effect of helmet use on injury severity and crash circumstances in skiers and snowboarders.

    Science.gov (United States)

    Hagel, Brent; Pless, I Barry; Goulet, Claude; Platt, Robert; Robitaille, Yvonne

    2005-01-01

    The aim of this study was to examine the effect of helmet use on non-head-neck injury severity and crash circumstances in skiers and snowboarders. We used a matched case-control study over the November 2001 to April 2002 winter season. 3295 of 4667 injured skiers and snowboarders reporting to the ski patrol at 19 areas in Quebec with non-head, non-neck injuries agreed to participate. Cases included those evacuated by ambulance, admitted to hospital, with restriction of normal daily activities (NDAs) >6 days, with non-helmet equipment damage, fast self-reported speed, participating on a more difficult run than usual, and jumping-related injury. Controls were injured participants without severe injuries or high-energy crash circumstances and were matched to cases on ski area, activity, day, age, and sex. Conditional logistic regression was used to relate each outcome to helmet use. There was no evidence that helmet use increased the risk of severe injury or high-energy crash circumstances. The results suggest that helmet use in skiing and snowboarding is not associated with riskier activities that lead to non-head-neck injuries.

  9. Modeling of in-vessel fission product release including fuel morphology effects for severe accident analyses

    International Nuclear Information System (INIS)

    Suh, K.Y.

    1989-10-01

    A new in-vessel fission product release model has been developed and implemented to perform best-estimate calculations of realistic source terms including fuel morphology effects. The proposed bulk mass transfer correlation determines the product of fission product release and equiaxed grain size as a function of the inverse fuel temperature. The model accounts for the fuel-cladding interaction over the temperature range between 770 K and 3000 K in the steam environment. A separate driver has been developed for the in-vessel thermal hydraulic and fission product behavior models that were developed by the Department of Energy for the Modular Accident Analysis Package (MAAP). Calculational results of these models have been compared to the results of the Power Burst Facility Severe Fuel Damage tests. The code predictions utilizing the mass transfer correlation agreed with the experimentally determined fractional release rates during the course of the heatup, power hold, and cooldown phases of the high temperature transients. Compared to such conventional literature correlations as the steam oxidation model and the NUREG-0956 correlation, the mass transfer correlation resulted in lower and less rapid releases in closer agreement with the on-line and grab sample data from the Severe Fuel Damage tests. The proposed mass transfer correlation can be applied for best-estimate calculations of fission product release from UO2 fuel in both nominal and severe accident conditions. 15 refs., 10 figs., 2 tabs

  10. A small effect of adding antiviral agents in treating patients with severe Bell palsy.

    Science.gov (United States)

    van der Veen, Erwin L; Rovers, Maroeska M; de Ru, J Alexander; van der Heijden, Geert J

    2012-03-01

    In this evidence-based case report, the authors studied the following clinical question: What is the effect of adding antiviral agents to corticosteroids in the treatment of patients with severe or complete Bell palsy? The search yielded 250 original research articles. All 6 randomized trials that could be used reported low-quality data for answering the clinical question; apart from apparent flaws, they did not primarily include patients with severe or complete Bell palsy. Complete functional facial nerve recovery was seen in 75% of the patients receiving prednisolone only and in 83% with additional antiviral treatment. The pooled risk difference of 7% (95% confidence interval, -1% to 15%) results in a number needed to treat of 14 (i.e., slightly favors adding an antiviral agent). The authors conclude that although a strong recommendation for adding antiviral agents to corticosteroids to further improve the recovery of patients with severe Bell palsy is precluded by the lack of robust evidence, it should be discussed with the patient.
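The number needed to treat quoted above follows directly from the pooled risk difference (NNT = 1/ARD); a quick check of the arithmetic (the helper name is illustrative):

```python
def nnt_from_risk_difference(ard):
    """Number needed to treat from an absolute risk difference (as a fraction).
    An ARD of zero or less means no benefit, so no finite NNT exists."""
    if ard <= 0:
        return float("inf")
    return 1.0 / ard

# pooled risk difference of 7% reported in the review
nnt = nnt_from_risk_difference(0.07)  # about 14 patients
```

So roughly 14 patients with severe Bell palsy would need antiviral agents added to corticosteroids for one additional complete recovery, consistent with the figure in the abstract.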

  11. Effectiveness of a Low-Calorie Weight Loss Program in Moderately and Severely Obese Patients

    Directory of Open Access Journals (Sweden)

    Julia K. Winkler

    2013-10-01

    Full Text Available Aims: To compare the effectiveness of a 1-year weight loss program in moderately and severely obese patients. Methods: The study sample included 311 obese patients participating in a weight loss program, which comprised a 12-week weight reduction phase (low-calorie formula diet) and a 40-week weight maintenance phase. Body weight and glucose and lipid values were determined at the beginning of the program as well as after the weight reduction and the weight maintenance phases. Participants were analyzed according to their BMI class at baseline (30-34.9 kg/m2; 35-39.9 kg/m2; 40-44.9 kg/m2; 45-49.9 kg/m2; ≥50 kg/m2). Furthermore, moderately obese patients (BMI < 40 kg/m2) were compared to severely obese participants (BMI ≥ 40 kg/m2). Results: Out of 311 participants, 217 individuals completed the program. Their mean baseline BMI was 41.8 ± 0.5 kg/m2. Average weight loss was 17.9 ± 0.6%, resulting in a BMI of 34.3 ± 0.4 kg/m2 after 1 year. Conclusion: A 1-year weight loss intervention improves body weight as well as lipid and glucose metabolism not only in moderately, but also in severely obese individuals.
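The study's BMI strata and the reported relative weight loss are straightforward to reproduce; note that percent weight loss can be computed from BMI values alone, because height cancels. A small sketch (function names are illustrative; the classes follow the strata listed in the abstract):

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m2."""
    return weight_kg / height_m ** 2

def obesity_class(b):
    """Obesity stratum per the study's baseline classes (kg/m2)."""
    if b < 30:
        return "not obese"
    for lo, hi, label in ((30, 35, "30-34.9"), (35, 40, "35-39.9"),
                          (40, 45, "40-44.9"), (45, 50, "45-49.9")):
        if lo <= b < hi:
            return label
    return ">=50"

def relative_weight_loss(bmi_before, bmi_after):
    """Percent weight loss; height cancels, so BMI ratios suffice."""
    return 100.0 * (bmi_before - bmi_after) / bmi_before

# mean values from the abstract: 41.8 kg/m2 at baseline, 34.3 kg/m2 at 1 year
loss = relative_weight_loss(41.8, 34.3)
```

Evaluating this with the abstract's mean BMIs reproduces the reported average weight loss of roughly 17.9%.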

  12. The Effect of Septoplasty on Voice Performance in Patients With Severe and Mild Nasal Septal Deviation.

    Science.gov (United States)

    Atan, Doğan; Özcan, Kürşat Murat; Gürbüz, Ayşe Betül Topak; Dere, Hüseyin

    2016-07-01

    The authors aimed to analyze the effect of septoplasty, performed in 2 groups with different grades of nasal septal deviation (NSD), on voice performance. A total of 43 patients who had septoplasty due to NSD were included in the study. The patients were divided into 2 groups, A and B. The patients in group A had severe NSD, with 1 nasal cavity obstructed totally or near totally. In group B, the NSD narrowed the nasal passage but was not severe. Voice performance was analyzed preoperatively and 1 month after surgery with both objective and subjective methods. Objective analysis included acoustic voice analysis and measurement of F0, jitter %, and shimmer %. Preoperative and postoperative F0, jitter %, shimmer %, and Voice Handicap Index-30 (VHI-30) were compared in groups A and B. F0 showed a statistically significant improvement after surgery in group A. Septoplasty performed for severe NSD obstructing the nasal lumen totally or near totally results in significant improvements in voice performance.

  13. 'Real-life' effectiveness studies of omalizumab in adult patients with severe allergic asthma: systematic review.

    Science.gov (United States)

    Abraham, I; Alhossan, A; Lee, C S; Kutbi, H; MacDonald, K

    2016-05-01

    We reviewed 24 'real-life' effectiveness studies of omalizumab in the treatment of severe allergic asthma that included 4117 unique patients from 32 countries, with significant heterogeneity in patients, clinicians and settings. The evidence underscores the short- and long-term benefit of anti-IgE therapy in terms of the following: improving lung function; achieving asthma control and reducing symptomatology, severe exacerbations and associated work/school days lost; reducing healthcare resource utilization, in particular hospitalizations, hospital lengths of stay and specialist or accident and emergency department visits; reducing or discontinuing other asthma medications; and improving quality of life - thus confirming, complementing and extending evidence from randomized trials. Thus, omalizumab therapy is associated with signal improvements across the full objective and subjective burden of illness chain of severe allergic asthma. Benefits of omalizumab may extend up to 2-4 years, and the majority of omalizumab-treated patients may benefit for many years. Omalizumab has positive short- and long-term safety profiles similar to what is known from randomized clinical trials. Initiated patients should be monitored for treatment response at 16 weeks. Those showing positive response at that time are highly likely to show sustained treatment response and benefit in terms of clinical, quality of life and health resource utilization outcomes. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Effect of Early Foliar Disease Control on Wheat Scab Severity (Fusarium graminearum) in Argentina

    Directory of Open Access Journals (Sweden)

    Jorge David Mantecón

    2013-01-01

    Full Text Available Wheat scab is common in Argentina, mainly in durum wheat and some bread wheat varieties; epidemics occur every 5 to 7 years. During the 2007, 2008, and 2009 growing seasons, three trials were conducted at the INTA Balcarce Experimental Station. Each plot had six rows 5 m long, spaced 0.15 m apart, and was set up in a randomized complete block design with four replications. Trifloxystrobin plus cyproconazole was sprayed at the Z3.1 stage. Treatments were sprayed at the Z6.1 stage with tebuconazole, prochloraz, and metconazole to improve scab control. Artificial inoculations were made at Z6.1. Severity of Septoria leaf blotch and leaf rust was assessed at the boot stage (Z3.9). Scab severity was rated at the early dough stage (Z8.3). Yields were recorded each year. Fungicide applied only at the Z3.1 stage did not reduce field scab severity but reduced seed infection and increased yields. Early fungicide spraying produced a yield increase of about 22% and a decrease in seed infection of up to 40%. Yields increased by 55.3% and 19.6% when compared with the inoculated and the non-inoculated check, respectively. The purpose of this study was to evaluate the effect of foliar disease control on scab, crop yield, and seed health.

  15. Effect of dextran-70 on outcome in severe sepsis; a propensity-score matching study.

    Science.gov (United States)

    Bentzer, Peter; Broman, Marcus; Kander, Thomas

    2017-07-06

    Albumin may be beneficial in patients with septic shock but availability is limited and cost is high. The objective of the present study was to investigate if the use of dextran-70 in addition to albumin and crystalloids influences organ failure or mortality in patients with severe sepsis or septic shock. Patients with severe sepsis or septic shock (n = 778) admitted to a university hospital intensive care unit (ICU) between 2007 and 2015 who received dextran-70 during resuscitation were propensity score matched to controls at a 1 to 1 ratio. Outcomes were highest acute kidney injury network (AKIN) score the first 10 days in the ICU, use of renal replacement therapy, days alive and free of organ support the first 28 days after admission to ICU, mortality and events of severe bleeding. Outcomes were assessed using paired hypothesis testing. Propensity score matching resulted in two groups of patients with 245 patients in each group. The dextran group received a median volume of 1483 ml (interquartile range, 1000-2000 ml) of dextran-70 during the ICU stay. Highest AKIN score did not differ between the control- and dextran groups (1 (0-3) versus 2 (0-3), p = 0.06). Incidence of renal replacement therapy in the control- and dextran groups was similar (19% versus 22%, p = 0.42, absolute risk reduction -2.9% [95% CI: -9.9 to 4.2]). Days alive and free of renal replacement, vasopressors and mechanical ventilation did not differ between the control- and dextran groups. The 180-day mortality was 50.2% in the control group and 41.6% in the dextran group (p = 0.046, absolute risk reduction 8.6% [-0.2 to 17.4]). Fraction of patients experiencing a severe bleeding in the first 10 days in the ICU did not differ between the control and dextran groups (14% versus 18%, p = 0.21). There is a paucity of high quality data regarding effects of dextran solutions on outcome in sepsis. In the present study, propensity score matching was used in an attempt to reduce bias. No
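    Propensity score matching of the kind used in this study pairs each treated patient with the control whose score (the estimated probability of receiving dextran) is closest. A minimal sketch of greedy 1:1 nearest-neighbour matching with a caliper follows; the scores, caliper and pairing rule are illustrative assumptions, not details taken from the study:

```python
def match_one_to_one(treated, controls, caliper=0.05):
    """Greedily pair each treated propensity score with the closest
    unused control score, discarding pairs further apart than the caliper."""
    pairs = []
    unused = sorted(controls)
    for t in sorted(treated):
        best = min(unused, key=lambda c: abs(c - t), default=None)
        if best is not None and abs(best - t) <= caliper:
            pairs.append((t, best))
            unused.remove(best)
    return pairs

treated = [0.62, 0.48, 0.91]         # hypothetical scores, dextran group
controls = [0.60, 0.50, 0.52, 0.30]  # hypothetical scores, candidate controls
print(match_one_to_one(treated, controls))  # → [(0.48, 0.5), (0.62, 0.6)]
```

    The patient with score 0.91 stays unmatched because no control falls within the caliper; real analyses typically estimate the scores with logistic regression and match on the logit scale.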

  16. [Evaluation of effectiveness of several repellents against mosquito bites available at the Polish market].

    Science.gov (United States)

    Mikulak, Ewa; Gliniewicz, Aleksandra; Królasik, Agnieszka; Sawicka, Bozena; Rabczenko, Daniel

    2012-01-01

    BACKGROUND. Mosquitoes are blood-sucking insects that are a nuisance to humans and animals. Their bites cause itching and allergic reactions. These insects are also vectors of several viruses, bacteria and parasites. Protection against mosquitoes is therefore justified and desirable. Such protection can be provided by repellents and by products for protecting small outdoor areas such as terraces and home gardens. OBJECTIVE. The aim of this study was to evaluate the effectiveness of eight selected products with different formulations used against mosquitoes, including 5 preparations for use on the body or clothing (repellents A, B, C, D, E) and 3 products for use in small outdoor spaces (I, J, K). Repellents were tested in laboratory trials, in which volunteers were exposed to laboratory-bred Aedes aegypti females. Products I, J, K were tested in field trials; volunteers were exposed to female mosquitoes of various ages from the environment (Aedes sp., Culex sp.). The results showed that all tested repellents were effective for 4 hrs. After this time their effectiveness decreased--quickly in the case of repellent B (10% DEET), and more slowly, but significantly, in the case of repellent C (15% DEET). Each of the three products for small-area protection gave 3-hour protection against mosquito bites. Product K (21.97% allethrin) was 100% effective (no bites at all). Both kinds of product can give effective protection against mosquito bites. Their use is a more effective, cheaper and environmentally safer method of protection against mosquitoes than chemical spraying of large areas.

  17. An evaluation of several methods for assessing the effects of occupational exposure to radiation

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1983-01-01

    Several methods for the analysis of occupational radiation exposure data, including procedures based on Cox's proportional hazards model, are presented and evaluated. Issues of interest include the contribution of an external control, the effective handling of highly skewed exposure data, and the potential for detecting effects in populations occupationally exposed to radiation. Expressions for evaluating the power of various procedures are derived and applied to data from the Hanford population in order to determine power curves for detecting leukemia effects, with both additive and multiplicative linear models being used. It is found that the introduction of an external control can increase power, although not when an overall adjustment factor must be estimated from the data or when death rates for the study population are substantially lower than those for the control population. It is also found that very little power is lost if exposures are grouped. Finally, the power calculations indicate, as expected, that in analyses of occupationally exposed populations, such as the Hanford workers, there is very little chance of detecting radiation effects at the levels of our current estimates. However, power is reasonably good for detecting effects that are 10 to 15 times larger.
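    The power expressions themselves are not reproduced in this record, but the normal-approximation form such calculations typically take can be sketched as follows; the critical value, expected excess and standard error below are illustrative numbers, not Hanford estimates:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_one_sided(expected_excess, se, z_alpha=1.645):
    """Power of a one-sided z-test: the probability the observed excess
    exceeds the 5% critical value when the true excess is expected_excess."""
    return 1.0 - norm_cdf(z_alpha - expected_excess / se)

# Hypothetical: an expected excess of 2 deaths against a standard error of 3
# gives power well below 50%, while a tenfold larger effect is almost
# certain to be detected - mirroring the pattern the abstract describes.
print(power_one_sided(2.0, 3.0) < 0.5)
print(power_one_sided(20.0, 3.0) > 0.99)
```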

  18. Effectiveness of prehospital trauma triage systems in selecting severely injured patients: Is comparative analysis possible?

    Science.gov (United States)

    van Rein, Eveline A J; van der Sluijs, Rogier; Houwert, R Marijn; Gunning, Amy C; Lichtveld, Rob A; Leenen, Luke P H; van Heijl, Mark

    2018-01-27

    In an optimal trauma system, prehospital trauma triage ensures transport of the right patient to the right hospital. Incorrect triage results in undertriage and overtriage. The aim of this systematic review is to evaluate and compare prehospital trauma triage system quality worldwide and determine effectiveness in terms of undertriage and overtriage for trauma patients. A systematic search of Pubmed/MEDLINE, Embase, and Cochrane Library databases was performed, using "trauma", "trauma center," or "trauma system", combined with "triage", "undertriage," or "overtriage", as search terms. All studies describing ground transport and actual destination hospital of patients with and without severe injuries, using prehospital triage, published before November 2017, were eligible for inclusion. To assess the quality of these studies, a critical appraisal tool was developed. A total of 33 articles were included. The percentage of undertriage ranged from 1% to 68%; overtriage from 5% to 99%. Older age and increased geographical distance were associated with undertriage. Mortality was lower for severely injured patients transferred to a higher-level trauma center. The majority of the included studies were of poor methodological quality. The studies of good quality showed poor performance of the triage protocol, but additional value of EMS provider judgment in the identification of severely injured patients. In most of the evaluated trauma systems, a substantial part of the severely injured patients is not transported to the appropriate level trauma center. Future research should come up with new innovative ways to improve the quality of prehospital triage in trauma patients. Copyright © 2018. Published by Elsevier Inc.

  19. Effect of the callipyge phenotype and cooking method on tenderness of several major lamb muscles.

    Science.gov (United States)

    Shackelford, S D; Wheeler, T L; Koohmaraie, M

    1997-08-01

    We conducted three experiments to determine the effects of the callipyge phenotype on the tenderness of several major lamb muscles and to determine the effect of method of cookery on the tenderness of callipyge lamb at 7 d postmortem. In Exp. 1, chops from normal (n = 23) and callipyge (n = 16) carcasses were open-hearth-broiled. Warner-Bratzler shear force values of longissimus, gluteus medius, semimembranosus, biceps femoris, semitendinosus, adductor, and quadriceps femoris were 123, 44, 28, 26, 19, 16, and 13% greater, respectively, for callipyge (P lamb carcasses (n = 60). Callipyge chops were less tender than normal chops (P cooking method, callipyge samples were less juicy than normal samples (P < .05). These data demonstrate that the callipyge phenotype will likely reduce consumer satisfaction due to reduced tenderness and juiciness; however, reduced tenderness in callipyge leg muscles could be prevented by oven roasting.

  20. Severe acute dehydration in a desert rodent elicits a transcriptional response that effectively prevents kidney injury.

    Science.gov (United States)

    MacManes, Matthew David

    2017-08-01

    Animals living in desert environments are forced to survive despite severe heat, intense solar radiation, and both acute and chronic dehydration. These animals have evolved phenotypes that effectively address these environmental stressors. To begin to understand the ways in which the desert-adapted rodent Peromyscus eremicus survives, reproductively mature adults were subjected to 72 h of water deprivation, during which they lost, on average, 23% of their body weight. The animals reacted via a series of changes in the kidney, which included modulating expression of genes responsible for reducing the rate of transcription and maintaining water and salt balance. Extracellular matrix turnover appeared to be decreased, and apoptosis was limited. In contrast to the canonical human response, serum creatinine and other biomarkers of kidney injury were not elevated, suggesting that changes in gene expression related to acute dehydration may effectively prohibit widespread kidney damage in the cactus mouse. Copyright © 2017 the American Physiological Society.

  1. Deterministic and Stochastic Semi-Empirical Transient Tire Models

    OpenAIRE

    Umsrithong, Anake

    2012-01-01

    The tire is one of the most important components of the vehicle. It has many functions, such as supporting the load of the vehicle, transmitting the forces which drive, brake and guide the vehicle, and acting as the secondary suspension to absorb the effect of road irregularities before transmitting the forces to the vehicle suspension. A tire is a complex reinforced rubber composite air container. The structure of the tire is very complex. It consists of several layers of synthetic polymer, ...

  2. Pressurized thermal shock in nuclear power plants: Good practices for assessment. Deterministic evaluation for the integrity of reactor pressure vessel

    International Nuclear Information System (INIS)

    2010-02-01

    Starting in the early 1970s, a series of coordinated research projects (CRPs) was sponsored by the IAEA focusing on the effects of neutron radiation on reactor pressure vessel (RPV) steels and RPV integrity. In conjunction with these CRPs, many consultants meetings, specialists meetings, and international conferences, dating back to the mid-1960s, were held. Individual studies on the basic phenomena of radiation hardening and embrittlement were also performed to better understand increases in tensile strength and shifts to higher temperatures for the integrity of the RPV. The overall objective of this CRP was to perform benchmark deterministic calculations of a typical pressurized thermal shock (PTS) regime, with the aim of comparing the effects of individual parameters on the final RPV integrity assessment, and then to recommend the best practices for their implementation in PTS procedures. At present, several different procedures and approaches are used for RPV integrity assessment for both WWER 440-230 reactors and pressurized water reactors (PWRs). These differences in procedures and approaches are based, in principle, on the different codes and rules used for design and manufacturing, and the different materials used for the various types of reactor, and the different levels of implementation of recent developments in fracture mechanics. Benchmark calculations were performed to improve user qualification and to reduce the user effect on the results of the analysis. This addressed generic PWR and WWER types of RPV, as well as sensitivity analyses. The complementary sensitivity analyses showed that the following factors significantly influenced the assessment: flaw size, shape, location and orientation, thermal hydraulic assumptions and material toughness. Applying national codes and procedures to the benchmark cases produced significantly different results in terms of allowable material toughness. This was mainly related to the safety factors used and the

  3. Microbiome and metabolome modifying effects of several cardiovascular disease interventions in apo-E-/- mice.

    Science.gov (United States)

    Ryan, Paul M; London, Lis E E; Bjorndahl, Trent C; Mandal, Rupasri; Murphy, Kiera; Fitzgerald, Gerald F; Shanahan, Fergus; Ross, R Paul; Wishart, David S; Caplice, Noel M; Stanton, Catherine

    2017-03-13

    There is strong evidence indicating that gut microbiota have the potential to modify, or be modified by the drugs and nutritional interventions that we rely upon. This study aims to characterize the compositional and functional effects of several nutritional, neutraceutical, and pharmaceutical cardiovascular disease interventions on the gut microbiome, through metagenomic and metabolomic approaches. Apolipoprotein-E-deficient mice were fed for 24 weeks either high-fat/cholesterol diet alone (control, HFC) or high-fat/cholesterol in conjunction with one of three dietary interventions, as follows: plant sterol ester (PSE), oat β-glucan (OBG) and bile salt hydrolase-active Lactobacillus reuteri APC 2587 (BSH), or the drug atorvastatin (STAT). The gut microbiome composition was then investigated, in addition to the host fecal and serum metabolome. We observed major shifts in the composition of the gut microbiome of PSE mice, while OBG and BSH mice displayed more modest fluctuations, and STAT showed relatively few alterations. Interestingly, these compositional effects imparted by PSE were coupled with an increase in acetate and reduction in isovalerate (p metabolome, including alterations in several acylcarnitines previously associated with a state of metabolic dysfunction (p < 0.05). We observed functional alterations in microbial and host-derived metabolites, which may have important implications for systemic metabolic health, suggesting that cardiovascular disease interventions may have a significant impact on the microbiome composition and functionality. This study indicates that the gut microbiome-modifying effects of novel therapeutics should be considered, in addition to the direct host effects.

  4. The clinical effectiveness of Movicol in children with severe constipation: an outcome audit.

    Science.gov (United States)

    Hanson, Sandra; Bansal, Nav

    2006-03-01

    This audit reviewed the clinical effectiveness of polyethylene glycol 3350 plus electrolytes (PEG+E, Movicol) in the management of severe paediatric constipation. A seven-day disimpaction regimen was initiated followed by a maintenance dose as appropriate. An information and support service was provided by the community children's nursing team (CCNT) at Darent Valley Hospital. Twenty-three parents completed questionnaires on their children's experiences with previous and current laxative treatments, bowel movement status, in-patient admissions or home visits required and the perceived value of the back up service. The mean age of children studied was 6.7 years. Prior to PEG+E treatment, 57 per cent of children were admitted to hospital and 26 per cent required home visits for constipation treatment. After treatment, no child needed either intervention. Thirty-nine percent of parents used the support service, of which 96 per cent rated the information it provided as adequate. When asked about their satisfaction with the control of their children's constipation, 96 per cent of parents were 'more than happy' after treatment with PEG+E. The treatment of severe paediatric constipation with PEG+E in conjunction with a support and advice service was both clinically and economically effective.

  5. Cost-effectiveness analysis of a state funded programme for control of severe asthma

    Directory of Open Access Journals (Sweden)

    Loureiro Sebastião

    2007-05-01

    Full Text Available Abstract Background Asthma is one of the most common chronic diseases and a major economic burden to families and health systems. Whereas the efficacy of current therapeutic options has been clearly established, cost-effectiveness analyses of public health interventions for asthma control are scarce. Methods 81 patients with severe asthma (12–75 years) joining a programme in a reference clinic providing free asthma medication were asked retrospectively about costs and events in the previous 12 months. During 12 months after joining the programme, information on direct and indirect costs, asthma control by lung function, symptoms and quality of life were collected. The information obtained was used to estimate the cost-effectiveness of the intervention as compared to usual public health asthma management. Sensitivity analysis was conducted. Results 64 patients concluded the study. During the 12-months follow-up within the programme, patients had 5 fewer days of hospitalization and 68 fewer visits to emergency/non scheduled medical visits per year, on average. Asthma control scores improved by 50% and quality of life by 74%. The annual saving in public resources was US$387 per patient. Family annual income increased by US$512, and family costs were reduced by US$733. Conclusion A programme for control of severe asthma in a developing country can reduce morbidity, improve quality of life and save resources for the health system and patients' families.

  6. Neutralizing effects of polyvalent antivenom on severe inflammatory response induced by Mesobuthus eupeus scorpion venom

    Directory of Open Access Journals (Sweden)

    Zayerzadeh1 E.

    2014-11-01

    Full Text Available This study evaluated the effects of Mesobuthus eupeus (Me) scorpion venom on the inflammatory response following injection. Additionally, the present study examined whether immunotherapy at specific time intervals would be effective against the inflammatory response after Me venom inoculation. Animals were divided randomly into four groups: the first group received the LD50 of venom; in the second and third groups, immunotherapy was performed at different time intervals; and the fourth group served as the control. Me venom inoculation caused respiratory perturbations such as respiratory distress, respiration with open mouth, crepitation and finally respiratory arrest. Me inoculation resulted in increased pro-inflammatory cytokines including TNF-α and IL-1. Venom injection also induced an inflammatory response, characterized by a significant increase in serum white blood cells and neutrophils at 30, 60 and 180 min following envenomation. Simultaneous administration of antivenom and venom entirely prevented the clinical signs, cytokine and hematological changes. Delayed immunotherapy gradually ameliorated the clinical features, cytokine changes and hematological abnormalities related to the envenomation. In conclusion, our observations indicate that injection of M. eupeus scorpion venom induces a severe inflammatory response, which can be one of the causes of clinical complications. Additionally, immunotherapy beyond 1 h after envenomation, with an appropriate dose and route, is beneficial in victims with a severe inflammatory response related to M. eupeus scorpion envenomation.

  7. Effect of carbon dioxide pneumoperitoneum on the severity of acute pancreatitis: an experimental study in rats.

    Science.gov (United States)

    Yol, S; Bostanci, E B; Ozogul, Y; Zengin, N I; Ozel, U; Bilgihan, A; Akoglu, M

    2004-12-01

    In the management of mild acute biliary pancreatitis, it is generally recommended to perform laparoscopic cholecystectomy after the subsidence of the attack during the same hospital admission. The effect of laparoscopy on abdominal organs has been widely investigated but not in acute pancreatitis. This study used an animal model of mild acute pancreatitis to examine the effects of CO(2) pneumoperitoneum on acute pancreatitis in rats. Mild acute pancreatitis was induced in 30 male Sprague-Dawley rats by surgical ligation of the biliopancreatic duct. After 2 days, animals were assigned to three groups: sham operation (animals were anesthetized for 30 min without undergoing laparotomy), CO(2) pneumoperitoneum (applied for 30 min at a pressure of 12 mmHg), and laparotomy (performed for 30 min, and then the abdomen was closed). Two hours after the surgical procedures, animals were killed and levels of lactate dehydrogenase, aspartate aminotransferase, glucose, urea, hematocrit, and leukocyte count among Ranson's criteria and levels of amylase, lipase, and total bilirubin were measured to determine the severity of acute pancreatitis. Histopathologic examination of the pancreas was done, and malondialdehyde and glutathione levels of the pancreas and lung were determined. The only significant differences between the groups were in lactate dehydrogenase and aspartate aminotransferase levels, which were significantly higher in the pneumoperitoneum group compared to the sham operation group. CO(2) pneumoperitoneum for 30 min at a pressure of 12 mmHg did not affect the severity of acute pancreatitis induced by ligation of the biliopancreatic duct in rats.

  8. The effects of an exercise program on several abilities associated with driving performance in older adults.

    Science.gov (United States)

    Marmeleira, José F; Godinho, Mário B; Fernandes, Orlando M

    2009-01-01

    The purpose of this study was to investigate the effects of participation in an exercise program on several abilities associated with driving performance in older adults. Thirty-two subjects were randomly assigned to either an exercise group (60-81 years, n=16) or a control group (60-82 years, n=16). The exercise program was planned to stress perceptive, cognitive, and physical abilities. It lasted 12 weeks with a periodicity of three sessions of 60 min per week. Assessments were conducted before and after the intervention on behavioral speed (in single- and dual-task conditions), visual attention, psychomotor performance, speed perception (time-to-contact), and executive functioning. Significant positive effects were found at 12-week follow-up resulting from participation in the exercise program. Behavioral speed improvements were found in reaction time, movement time, and response time (both in single- and dual-task conditions); visual attention improvements took place in speed processing and divided attention; psychomotor performance improvements occurred in lower limb mobility. These results showed that exercise is capable of enhancing several abilities relevant for driving performance and safety in older adults and, therefore, should be promoted.

  9. Review of the Monte Carlo and deterministic codes in radiation protection and dosimetry

    International Nuclear Information System (INIS)

    Tagziria, H.

    2000-02-01

    A disadvantage of the Monte Carlo technique is that the solutions are given at specific locations only, are statistically fluctuating and are arrived at with considerable computer effort. Sooner rather than later, however, one would expect that powerful variance reductions and ever-faster processor machines would balance these disadvantages out. This is especially true if one considers the rapid advances in computer technology and parallel computers, which can achieve a 300-fold faster convergence. In many fields and cases the user would, however, benefit greatly by considering, when possible, alternative methods to the Monte Carlo technique, such as deterministic methods, at least as a means of validation. It can be shown, in fact, that for less complex problems a deterministic approach can have many advantages. In its earlier manifestations, Monte Carlo simulation was primarily performed by experts who were intimately involved in the development of the computer code. Increasingly, however, codes are being supplied as relatively user-friendly packages for widespread use, which allows them to be used by those with less specialist knowledge. This enables them to be used as 'black boxes', which in turn provides scope for costly errors, especially in the choice of cross section data and acceleration techniques. The Monte Carlo method as employed with modern computers goes back several decades, and nowadays science and software libraries would be virtually empty if one excluded work that is either directly or indirectly related to this technique. This is specifically true in the fields of 'computational dosimetry', 'radiation protection' and radiation transport in general. Hundreds of codes have been written and applied with various degrees of success. Some of these have become trademarks, generally well supported and adopted by thousands of users.
    Other codes, which should be encouraged, are the so-called in-house codes, which still serve their developers and their groups well in their intended

  10. Combining deterministic and stochastic velocity fields in the analysis of deep crustal seismic data

    Science.gov (United States)

    Larkin, Steven Paul

    Standard crustal seismic modeling obtains deterministic velocity models which ignore the effects of wavelength-scale heterogeneity, known to exist within the Earth's crust. Stochastic velocity models are a means to include wavelength-scale heterogeneity in the modeling. These models are defined by statistical parameters obtained from geologic maps of exposed crystalline rock, and are thus tied to actual geologic structures. Combining both deterministic and stochastic velocity models into a single model allows a realistic full wavefield (2-D) to be computed. By comparing these simulations to recorded seismic data, the effects of wavelength-scale heterogeneity can be investigated. Combined deterministic and stochastic velocity models are created for two datasets, the 1992 RISC seismic experiment in southeastern California and the 1986 PASSCAL seismic experiment in northern Nevada. The RISC experiment was located in the transition zone between the Salton Trough and the southern Basin and Range province. A high-velocity body previously identified beneath the Salton Trough is constrained to pinch out beneath the Chocolate Mountains to the northeast. The lateral extent of this body is evidence for the ephemeral nature of rifting loci as a continent is initially rifted. Stochastic modeling of wavelength-scale structures above this body indicate that little more than 5% mafic intrusion into a more felsic continental crust is responsible for the observed reflectivity. Modeling of the wide-angle RISC data indicates that coda waves following PmP are initially dominated by diffusion of energy out of the near-surface basin as the wavefield reverberates within this low-velocity layer. At later times, this coda consists of scattered body waves and P to S conversions. Surface waves do not play a significant role in this coda. 
Modeling of the PASSCAL dataset indicates that a high-gradient crust-mantle transition zone or a rough Moho interface is necessary to reduce precritical Pm

  11. Probabilistic linkage to enhance deterministic algorithms and reduce data linkage errors in hospital administrative data.

    Science.gov (United States)

    Hagger-Johnson, Gareth; Harron, Katie; Goldstein, Harvey; Aldridge, Robert; Gilbert, Ruth

    2017-06-30

    BACKGROUND: The pseudonymisation algorithm used to link together episodes of care belonging to the same patients in England (HESID) has never undergone any formal evaluation to determine the extent of data linkage error. OBJECTIVE: To quantify improvements in linkage accuracy from adding probabilistic linkage to existing deterministic HESID algorithms. Inpatient admissions to NHS hospitals in England (Hospital Episode Statistics, HES) over 17 years (1998 to 2015) for a sample of patients (born 13/28th of months in 1992/1998/2005/2012). We compared the existing deterministic algorithm with one that included an additional probabilistic step, in relation to a reference standard created using enhanced probabilistic matching with additional clinical and demographic information. Missed and false matches were quantified and the impact on estimates of hospital readmission within one year was determined. HESID produced a high missed match rate, improving over time (8.6% in 1998 to 0.4% in 2015). Missed matches were more common for ethnic minorities, those living in areas of high socio-economic deprivation, foreign patients and those with 'no fixed abode'. Estimates of the readmission rate were biased for several patient groups owing to missed matches; this bias was reduced for nearly all groups by the probabilistic step. CONCLUSION: Probabilistic linkage of HES reduced missed matches and bias in estimated readmission rates, with clear implications for commissioning, service evaluation and performance monitoring of hospitals. The existing algorithm should be modified to address data linkage error, and a retrospective update of the existing data would address existing linkage errors and their implications.
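    The two-stage design evaluated here - a deterministic exact-match pass followed by a probabilistic scoring pass for the residual records - can be sketched as below. The field weights, threshold and toy records are illustrative assumptions; HESID's actual match keys and the weights used in the paper are not given in this record.

```python
WEIGHTS = {"surname": 4.0, "dob": 5.0, "postcode": 2.0}  # illustrative agreement weights
THRESHOLD = 8.0                                          # illustrative acceptance cut-off

def prob_score(a, b):
    """Sum agreement weights over partial identifiers (Fellegi-Sunter flavour)."""
    return sum(w for f, w in WEIGHTS.items() if a.get(f) and a.get(f) == b.get(f))

recs = [
    {"id": 1, "nhs_no": "A1", "surname": "SMITH", "dob": "1992-01-13", "postcode": "N1"},
    {"id": 2, "nhs_no": "A1", "surname": "SMITH", "dob": "1992-01-13", "postcode": "N1"},
    {"id": 3, "nhs_no": None, "surname": "SMITH", "dob": "1992-01-13", "postcode": "E2"},
]

links = set()
# Pass 1: deterministic - exact match on the unique identifier where present.
if recs[0]["nhs_no"] and recs[0]["nhs_no"] == recs[1]["nhs_no"]:
    links.add((1, 2))
# Pass 2: probabilistic - score records the deterministic pass could not link.
if prob_score(recs[0], recs[2]) >= THRESHOLD:
    links.add((1, 3))
print(sorted(links))  # → [(1, 2), (1, 3)]
```

    Record 3, which is missing its identifier, would be a missed match under the deterministic rule alone; the probabilistic step recovers it, which is exactly the error mode the study quantifies.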

  12. Hazardous waste transportation risk assessment: Benefits of a combined deterministic and probabilistic Monte Carlo approach in expressing risk uncertainty

    International Nuclear Information System (INIS)

    Policastro, A.J.; Lazaro, M.A.; Cowen, M.A.; Hartmann, H.M.; Dunn, W.E.; Brown, D.F.

    1995-01-01

    This paper presents a combined deterministic and probabilistic methodology for modeling hazardous waste transportation risk and expressing the uncertainty in that risk. Both the deterministic and probabilistic methodologies are aimed at providing tools useful in the evaluation of alternative management scenarios for US Department of Energy (DOE) hazardous waste treatment, storage, and disposal (TSD). The probabilistic methodology can be used to provide perspective on and quantify uncertainties in deterministic predictions. The methodology developed has been applied to 63 DOE shipments made in fiscal year 1992, which contained poison by inhalation chemicals that represent an inhalation risk to the public. Models have been applied to simulate shipment routes, truck accident rates, chemical spill probabilities, spill/release rates, dispersion, population exposure, and health consequences. The simulation presented in this paper is specific to trucks traveling from DOE sites to their commercial TSD facilities, but the methodology is more general. Health consequences are presented as the number of people with potentially life-threatening health effects. Probabilistic distributions were developed (based on actual item data) for accident release amounts, time of day and season of the accident, and meteorological conditions
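    A probabilistic layer of the sort described replaces single point estimates with sampled distributions and reports percentiles of the resulting risk. A toy sketch under invented distributions follows; none of the rates or ranges below come from the DOE study:

```python
import random

random.seed(1)  # reproducible toy run

def one_trial():
    """One Monte Carlo draw of expected public exposures for a single shipment."""
    accident_rate = random.lognormvariate(-13.0, 0.5)  # accidents per km (invented)
    release_prob = random.uniform(0.05, 0.20)          # spill/release given accident
    exposed_pop = random.triangular(100, 10000, 1000)  # people near the release point
    km = 2000.0                                        # shipment distance (invented)
    return accident_rate * km * release_prob * exposed_pop

trials = sorted(one_trial() for _ in range(10_000))
median, p95 = trials[5_000], trials[9_500]
# The spread between the median and the 95th percentile is the uncertainty
# that a single deterministic point estimate would hide.
print(p95 > median > 0.0)
```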

  13. [Cost-effectiveness of drotrecogin alpha [activated] in the treatment of severe sepsis in Spain].

    Science.gov (United States)

    Sacristán, José A; Prieto, Luis; Huete, Teresa; Artigas, Antonio; Badia, Xavier; Chinn, Christopher; Hudson, Peter

    2004-01-01

    The PROWESS clinical trial has shown that treatment with drotrecogin alpha (activated) in patients with severe sepsis is associated with a reduction in the absolute risk of death compared with standard treatment. The aim of the present study was to assess the cost-effectiveness of drotrecogin alpha (activated) versus that of standard care in the treatment of severe sepsis in Spain. A decision analysis model was drawn up to compare costs to hospital discharge and the long-term efficacy of drotrecogin alpha (activated) versus those of standard care in the treatment of severe sepsis in Spain from the perspective of the health care payer. Most of the information for creating the model was obtained from the PROWESS clinical trial. A two-fold baseline analysis was performed: a) for all patients included in the PROWESS clinical trial and b) for the patients with two or more organ failures. The major variables for clinical assessment were the reduction in mortality and years of life gained (YLG). Cost-effectiveness was expressed as cost per YLG. A sensitivity analysis was applied using 3% and 5% discount rates for YLG and by modifying the patterns of health care, intensive care unit costs, and life expectancy by initial co-morbidity and therapeutic efficacy of drotrecogin alpha (activated). Treatment with drotrecogin alpha (activated) was associated with a 6.0% drop in the absolute risk of death (p = 0.005) when all of the patients from the PROWESS trial were included and with a 7.3% reduction (p = 0.005) when the analysis was restricted to patients with two or more organ failures. The cost-effectiveness of drotrecogin alpha (activated) was 13,550 euros per YLG with respect to standard care after analysing all of the patients and 9,800 euros per YLG in the group of patients with two or more organ failures. In the sensitivity analysis, the results ranged from 7,322 to 16,493 euros per YLG.
The factors with the greatest impact on the results were the change in the efficacy of
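    The cost-effectiveness ratio used in this record (cost per year of life gained, discounted at 3% or 5%) can be sketched as below. The incremental cost and life-years are illustrative placeholders, not PROWESS data.

```python
def discounted_ylg(years_gained: int, rate: float) -> float:
    """Present value of one life-year received in each of `years_gained` years:
    a year received t years from now is worth 1 / (1 + rate)**t."""
    return sum(1.0 / (1.0 + rate) ** t for t in range(years_gained))

def cost_per_ylg(incremental_cost: float, years_gained: int, rate: float) -> float:
    return incremental_cost / discounted_ylg(years_gained, rate)

# Illustrative inputs only (hypothetical, not trial data): a 100,000-euro
# incremental cost buying 10 extra life-years, discounted at 3% and 5%.
for rate in (0.03, 0.05):
    print(f"{rate:.0%}: {cost_per_ylg(100_000, 10, rate):,.0f} euros/YLG")
```

A higher discount rate shrinks the present value of future life-years, so the cost per YLG rises, which is why the sensitivity analysis varies the rate.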

  14. Vaccination strategies for future influenza pandemics: a severity-based cost effectiveness analysis.

    Science.gov (United States)

    Kelso, Joel K; Halder, Nilimesh; Milne, George J

    2013-02-11

    A critical issue in planning pandemic influenza mitigation strategies is the delay between the arrival of the pandemic in a community and the availability of an effective vaccine. The likely scenario, borne out in the 2009 pandemic, is that a newly emerged influenza pandemic will have spread to most parts of the world before a vaccine matched to the pandemic strain is produced. For a severe pandemic, additional rapidly activated intervention measures will be required if high mortality rates are to be avoided. A simulation modelling study was conducted to examine the effectiveness and cost effectiveness of plausible combinations of social distancing, antiviral and vaccination interventions, assuming a delay of 6 months between arrival of an influenza pandemic and first availability of a vaccine. Three different pandemic scenarios were examined: mild, moderate and extreme, based on estimates of transmissibility and pathogenicity of the 2009, 1957 and 1918 influenza pandemics respectively. A range of different durations of social distancing were examined, and the sensitivity of the results to variation in the vaccination delay, ranging from 2 to 6 months, was analysed. Vaccination-only strategies were not cost effective for any pandemic scenario, saving few lives and incurring substantial vaccination costs. Vaccination coupled with long duration social distancing, antiviral treatment and antiviral prophylaxis was cost effective for moderate pandemics and extreme pandemics, where it saved lives while simultaneously reducing the total pandemic cost. Combined social distancing and antiviral interventions without vaccination were significantly less effective, since without vaccination a resurgence in case numbers occurred as soon as social distancing interventions were relaxed. When social distancing interventions were continued until at least the start of the vaccination campaign, attack rates and total costs were significantly lower, and increased rates of vaccination

  15. On Notions of Security for Deterministic Encryption, and Efficient Constructions Without Random Oracles

    NARCIS (Netherlands)

    S. Boldyreva; S. Fehr (Serge); A. O'Neill; D. Wagner

    2008-01-01

    The study of deterministic public-key encryption was initiated by Bellare et al. (CRYPTO ’07), who provided the “strongest possible” notion of security for this primitive (called PRIV) and constructions in the random oracle (RO) model. We focus on constructing efficient deterministic
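    The reason deterministic encryption needs a weaker notion like PRIV (restricted to unpredictable, high-min-entropy messages) is that determinism inherently leaks plaintext equality. A toy sketch, using HMAC-SHA256 as a stand-in for a deterministic ciphertext function (not the paper's construction):

```python
import hashlib
import hmac

KEY = b"illustrative-key"  # hypothetical key material for the sketch

def det_ct(msg: bytes) -> bytes:
    """Stand-in for deterministic encryption: the same key and message
    always yield the same 'ciphertext', so repeats are detectable."""
    return hmac.new(KEY, msg, hashlib.sha256).digest()

# An eavesdropper seeing two equal ciphertexts learns the plaintexts match,
# even without the key -- exactly the leakage PRIV must tolerate.
assert det_ct(b"attack at dawn") == det_ct(b"attack at dawn")
assert det_ct(b"attack at dawn") != det_ct(b"attack at dusk")
```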

  16. Effectiveness and safety of abatacept in moderate to severe rheumatoid arthritis.

    Science.gov (United States)

    Cortejoso-Fernández, Lucía; Romero-Jiménez, Maria Rosa; Pernía-López, María Sagrario; Montoro-Álvarez, María; Sanjurjo-Sáez, María

    2012-01-01

    Abatacept was approved in our hospital by the Pharmacy and Therapeutics Committee for treatment of moderate to severe rheumatoid arthritis (RA) in adult patients with inadequate response or intolerance to disease modifying antirheumatic drugs (DMARDs), including at least one anti-tumour necrosis factor (anti-TNF). The objectives of this study were to analyze compliance with our protocol and to evaluate effectiveness and safety of abatacept in our patients. We performed a descriptive longitudinal study of patients with RA treated with abatacept between August 2008 and May 2010 in our day care unit. We reviewed clinical records and recorded the following data: sex, age, weight, year of diagnosis, previous antirheumatic treatments and reasons for withdrawal of anti-TNFs, indication for abatacept, dose and date of administration, Disease Activity Score (DAS28) and adverse events. Effectiveness was evaluated using the European League Against Rheumatism (EULAR) criteria. We recruited 16 patients. Mean follow-up time was 10.4 (SD: 6.1) months. All patients had been previously treated with DMARDs, including at least one anti-TNF, and the mean dose of abatacept was 9.4 (SD: 1.4) mg/kg. During the first 6 months of treatment, 11/16 of patients experienced a decrease in their DAS28 value, but only 5/16 achieved a satisfactory response. Dyspnea was the most frequent adverse event (7/16), followed by fatigue and asthenia (6/16) and dry skin (5/16). The indication for abatacept in our hospital complied with the protocol approved by the Pharmacy and Therapeutics Committee. Only 5/16 of patients achieved a satisfactory response; however, it should be noted that these patients had moderate to severe RA that was refractory to other treatments. Adverse reactions were consistent with those described in the summary of product characteristics. Further studies with larger cohorts are needed to analyze the long-term safety and effectiveness profile in clinical practice.

  17. Pest persistence and eradication conditions in a deterministic model for sterile insect release.

    Science.gov (United States)

    Gordillo, Luis F

    2015-01-01

    The release of sterile insects is an environment-friendly pest control method used in integrated pest management programmes. Difference or differential equations based on Knipling's model often provide satisfactory qualitative descriptions of pest populations subject to sterile release at relatively high densities with large mating encounter rates, but fail otherwise. In this paper, I derive and numerically explore deterministic population models that include sterile release together with scarce mating encounters in the particular case of species with long lifespan and multiple matings. The differential equations account separately for the effects of mating failure due to sterile male release and the frequency of mating encounters. When the insects' spatial spread is incorporated through diffusion terms, computations reveal the possibility of steady pest persistence in finite size patches. In the presence of density dependence regulation, it is observed that sterile release might contribute to inducing sudden suppression of the pest population.
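    A minimal Knipling-style difference equation (a sketch in the spirit of, not a reproduction of, the paper's models) shows the threshold behaviour sterile release induces: a female mates fertilely with probability F/(F+S), giving an unstable equilibrium at F* = S/(r-1).

```python
def next_gen(F: float, S: float, r: float = 5.0) -> float:
    """Fertile females next generation: growth factor r, times the
    probability F/(F+S) that a mate is fertile rather than sterile."""
    return r * F * F / (F + S) if F + S > 0 else 0.0

def simulate(F0: float, S: float, r: float = 5.0, gens: int = 60) -> float:
    F = F0
    for _ in range(gens):
        F = next_gen(F, S, r)
    return F

# With r = 5 the unstable threshold sits at F* = S / (r - 1): releasing
# enough sterile males pushes the population below it and it collapses.
print(simulate(100.0, 1000.0))  # threshold 250 > F0 -> decline toward zero
print(simulate(100.0, 100.0))   # threshold 25 < F0  -> escapes control
```

There is no density dependence in this sketch, so the escaping case grows without bound; the paper's models add that regulation and the mating-encounter terms.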

  18. Exploring the use of a deterministic adjoint flux calculation in criticality Monte Carlo simulations

    International Nuclear Information System (INIS)

    Jinaphanh, A.; Miss, J.; Richet, Y.; Martin, N.; Hebert, A.

    2011-01-01

    The paper presents a preliminary study on the use of a deterministic adjoint flux calculation to improve source convergence issues by reducing the number of iterations needed to reach the converged distribution in criticality Monte Carlo calculations. Slow source convergence in Monte Carlo eigenvalue calculations may lead to underestimation of the effective multiplication factor or reaction rates. The convergence speed depends on the initial distribution and the dominance ratio. We propose using an adjoint flux estimation to modify the transition kernel according to the Importance Sampling technique. This adjoint flux is also used as the initial guess of the first generation distribution for the Monte Carlo simulation. The calculated variance of a local current estimator is also being checked. (author)

  19. Stochastic partial differential fluid equations as a diffusive limit of deterministic Lagrangian multi-time dynamics.

    Science.gov (United States)

    Cotter, C J; Gottwald, G A; Holm, D D

    2017-09-01

    In Holm (Holm 2015 Proc. R. Soc. A 471, 20140963. (doi:10.1098/rspa.2014.0963)), stochastic fluid equations were derived by employing a variational principle with an assumed stochastic Lagrangian particle dynamics. Here we show that the same stochastic Lagrangian dynamics naturally arises in a multi-scale decomposition of the deterministic Lagrangian flow map into a slow large-scale mean and a rapidly fluctuating small-scale map. We employ homogenization theory to derive effective slow stochastic particle dynamics for the resolved mean part, thereby obtaining stochastic fluid partial differential equations in the Eulerian formulation. To justify the application of rigorous homogenization theory, we assume mildly chaotic fast small-scale dynamics, as well as a centring condition. The latter requires that the mean of the fluctuating deviations is small, when pulled back to the mean flow.

  20. A deterministic combination of numerical and physical models for coastal waves

    DEFF Research Database (Denmark)

    Zhang, Haiwen

    2006-01-01

    Numerical and physical modelling are the two main tools available for predicting the influence of water waves on coastlines and structures placed in the near-shore environment. Numerical models can cover large areas at the correct scale, but are limited in their ability to capture strong nonlinearities, wave breaking, splash, mixing, and other such complicated physics. Physical models naturally include the real physics (at the model scale), but are limited by the physical size of the facility and must contend with the fact that different physical effects scale differently. An integrated use of numerical and physical modelling hence provides an attractive alternative to the use of either tool on its own. The goal of this project has been to develop a deterministically combined numerical/physical model where the physical wave tank is enclosed in a much larger computational domain, and the two...

  1. Sensitivity analysis of the titan hybrid deterministic transport code for SPECT simulation

    International Nuclear Information System (INIS)

    Royston, Katherine K.; Haghighat, Alireza

    2011-01-01

    Single photon emission computed tomography (SPECT) has been traditionally simulated using Monte Carlo methods. The TITAN code is a hybrid deterministic transport code that has recently been applied to the simulation of a SPECT myocardial perfusion study. For modeling SPECT, the TITAN code uses a discrete ordinates method in the phantom region and a combined simplified ray-tracing algorithm with a fictitious angular quadrature technique to simulate the collimator and generate projection images. In this paper, we compare the results of an experiment with a physical phantom with predictions from the MCNP5 and TITAN codes. While the results of the two codes are in good agreement, they differ from the experimental data by ∼ 21%. In order to understand these large differences, we conduct a sensitivity study by examining the effect of different parameters including heart size, collimator position, collimator simulation parameter, and number of energy groups. (author)

  2. Technique of ICP monitored stepwise intracranial decompression effectively reduces postoperative complications of severe bifrontal contusion

    Directory of Open Access Journals (Sweden)

    Guan Sun

    2016-04-01

    Background: Bifrontal contusion is a common clinical brain injury. In the early stage, it is often mild, but it progresses rapidly and frequently worsens suddenly. This condition can become life threatening and therefore requires surgery. Conventional decompression craniectomy is the commonly used treatment method. In this study, the effect of ICP-monitored stepwise intracranial decompression surgery on the prognosis of patients with acute severe bifrontal contusion was investigated. Method: A total of 136 patients with severe bifrontal contusion combined with deteriorated intracranial hypertension admitted from March 2001 to March 2014 in our hospital were selected and randomly divided into two groups, i.e., a conventional decompression group and an intracranial pressure (ICP) monitored stepwise intracranial decompression group (68 patients each), to conduct a retrospective study. The incidence rates of acute intraoperative encephalocele, delayed hematomas, and postoperative cerebral infarctions and the Glasgow outcome scores (GOSs) 6 months after the surgery were compared between the two groups. Results: (1) The incidence rates of acute encephalocele and contralateral delayed epidural hematoma in the stepwise decompression surgery group were significantly lower than those in the conventional decompression group; the differences were statistically significant (P < 0.05). (2) 6 months after the surgery, the incidence of vegetative state and mortality in the stepwise decompression group were significantly lower than those in the conventional decompression group (P < 0.05); the rate of favorable prognosis in the stepwise decompression group was also significantly higher than that in the conventional decompression group (P < 0.05). Conclusions: The ICP-monitored stepwise intracranial decompression technique reduced the perioperative complications of traumatic brain injury through the gradual release of intracranial pressure and was beneficial to the prognosis of

  3. Ubiquinol treatment for TBI in male rats: Effects on mitochondrial integrity, injury severity, and neurometabolism.

    Science.gov (United States)

    Pierce, Janet D; Gupte, Raeesa; Thimmesch, Amanda; Shen, Qiuhua; Hiebert, John B; Brooks, William M; Clancy, Richard L; Diaz, Francisco J; Harris, Janna L

    2018-06-01

    Following traumatic brain injury (TBI), there is significant secondary damage to cerebral tissue from increased free radicals and impaired mitochondrial function. This imbalance between reactive oxygen species (ROS) production and the effectiveness of cellular antioxidant defenses is termed oxidative stress. Often there are insufficient antioxidants to scavenge ROS, leading to alterations in cerebral structure and function. Attenuating oxidative stress following a TBI by administering an antioxidant may decrease secondary brain injury, and currently many drugs and supplements are being investigated. We explored an over-the-counter supplement called ubiquinol (reduced form of coenzyme Q10), a potent antioxidant naturally produced in brain mitochondria. We administered intra-arterial ubiquinol to rats to determine if it would reduce mitochondrial damage, apoptosis, and severity of a contusive TBI. Adult male F344 rats were randomly assigned to one of three groups: (1) Saline-TBI, (2) ubiquinol 30 minutes before TBI (UB-PreTBI), or (3) ubiquinol 30 minutes after TBI (UB-PostTBI). We found when ubiquinol was administered before or after TBI, rats had an acute reduction in brain mitochondrial damage, apoptosis, and two serum biomarkers of TBI severity, glial fibrillary acidic protein (GFAP) and ubiquitin C-terminal hydrolase-L1 (UCH-L1). However, in vivo neurometabolic assessment with proton magnetic resonance spectroscopy did not show attenuated injury-induced changes. These findings are the first to show that ubiquinol preserves mitochondria and reduces cellular injury severity after TBI, and support further study of ubiquinol as a promising adjunct therapy for TBI. © 2018 Wiley Periodicals, Inc.

  4. Spent Fuel Pool Dose Rate Calculations Using Point Kernel and Hybrid Deterministic-Stochastic Shielding Methods

    International Nuclear Information System (INIS)

    Matijevic, M.; Grgic, D.; Jecmenica, R.

    2016-01-01

    This paper presents a comparison of the Krsko Power Plant simplified Spent Fuel Pool (SFP) dose rates using different computational shielding methodologies. The analysis was performed to estimate limiting gamma dose rates on wall mounted level instrumentation in case of significant loss of cooling water. The SFP was represented with simple homogenized cylinders (point kernel and Monte Carlo (MC)) or cuboids (MC) using uranium, iron, water, and dry-air as bulk region materials. The pool is divided into an old and a new section, where the old one has three additional subsections representing fuel assemblies (FAs) with different burnup/cooling time (60 days, 1 year and 5 years). The new section represents the FAs with the cooling time of 10 years. The time dependent fuel assembly isotopic composition was calculated using the ORIGEN2 code applied to the depletion of one of the fuel assemblies present in the pool (AC-29). The source used in the Microshield calculation is based on imported isotopic activities. The time dependent photon spectra with total source intensity from Microshield multigroup point kernel calculations were then prepared for two hybrid deterministic-stochastic sequences. One is based on the SCALE/MAVRIC (Monaco and Denovo) methodology and the other uses the Monte Carlo code MCNP6.1.1b with the ADVANTG 3.0.1 code. Even though this model is a fairly simple one, the layers of shielding materials are thick enough to pose a significant shielding problem for the MC method without the use of an effective variance reduction (VR) technique. For that purpose the ADVANTG code was used to generate VR parameters (SB cards in SDEF and WWINP file) for the MCNP fixed-source calculation using continuous energy transport. ADVANTG employs a deterministic forward-adjoint transport solver, Denovo, which implements the CADIS/FW-CADIS methodology. Denovo implements a structured, Cartesian-grid SN solver based on the Koch-Baker-Alcouffe parallel transport sweep algorithm across x-y domain blocks. This was first

  5. Deterministic absorbed dose estimation in computed tomography using a discrete ordinates method

    International Nuclear Information System (INIS)

    Norris, Edward T.; Liu, Xin; Hsieh, Jiang

    2015-01-01

    Conclusions: The simulation results showed that the deterministic method can be effectively used to estimate the absorbed dose in a CTDI phantom. The accuracy of the discrete ordinates method was close to that of a Monte Carlo simulation, and the primary benefit of the discrete ordinates method lies in its rapid computation speed. It is expected that further optimization of this method in routine clinical CT dose estimation will improve its accuracy and speed

  6. 360° deterministic magnetization rotation in a three-ellipse magnetoelectric heterostructure

    Science.gov (United States)

    Kundu, Auni A.; Chavez, Andres C.; Keller, Scott M.; Carman, Gregory P.; Lynch, Christopher S.

    2018-03-01

    A magnetic dipole-coupled magnetoelectric heterostructure comprised of three closely spaced ellipse shapes was designed and shown to be capable of achieving deterministic in-plane magnetization rotation. The design approach used a combination of conventional micromagnetic simulations to obtain preliminary configurations followed by simulations using a fully strain-coupled, time domain micromagnetic code for a detailed assessment of performance. The conventional micromagnetic code has short run times and was used to refine the ellipse shape and orientation, but it does not accurately capture the effects of the strain gradients present in the piezoelectric and magnetostrictive layers that contribute to magnetization reorientation. The fully coupled code was used to assess the effects of strain and magnetic field gradients on precessional switching in the side ellipses and on the resulting dipole-field driven magnetization reorientation in the center ellipse. The work led to a geometry with a CoFeB ellipse (125 nm × 95 nm × 4 nm) positioned between two smaller CoFeB ellipses (75 nm × 50 nm × 4 nm) on a 500 nm PZT-5H film substrate clamped at its bottom surface. The smaller ellipses were oriented at 45° and positioned at 70° and 250° about the central ellipse due to the film deposition on a thick substrate. A 7.3 V pulse applied to the PZT for 0.22 ns produced 180° switching of the magnetization in the outer ellipses that then drove switching in the center ellipse through dipole-dipole coupling. Full 360° deterministic rotation was achieved with a second pulse. The temporal response of the resulting design is discussed.

  7. Effectiveness of an organized bowel management program in the management of severe chronic constipation in children.

    Science.gov (United States)

    Russell, Katie W; Barnhart, Douglas C; Zobell, Sarah; Scaife, Eric R; Rollins, Michael D

    2015-03-01

    Chronic constipation is a common problem in children. The cause of constipation is often idiopathic, when no anatomic or physiologic etiology can be identified. In severe cases, low dose laxatives, stool softeners and small volume enemas are ineffective. The purpose of this study was to assess the effectiveness of a structured bowel management program in these children. We retrospectively reviewed children with chronic constipation without a history of anorectal malformation, Hirschsprung's disease or other anatomical lesions seen in our pediatric colorectal center. Our bowel management program consists of an intensive week where treatment is assessed and tailored based on clinical response and daily radiographs. Once a successful treatment plan is established, children are followed longitudinally. The number of patients requiring hospital admission during the year prior to and year after initiation of bowel management was compared using Fisher's exact test. Forty-four children with refractory constipation have been followed in our colorectal center for greater than a year. Fifty percent had at least one hospitalization the year prior to treatment for obstructive symptoms. Children were treated with either high-dose laxatives starting at 2 mg/kg of senna or enemas starting at 20 ml/kg of normal saline. Treatment regimens were adjusted based on response to therapy. The admission rate one year after enrollment was 9%, including both adherent and nonadherent patients. This represents a statistically significant 82% reduction in hospital admissions in children with severe chronic constipation. Copyright © 2015 Elsevier Inc. All rights reserved.
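    The admissions comparison (22 of 44 children admitted in the year before treatment vs 4 of 44 after) can be checked with a one-sided Fisher's exact test computed directly from the hypergeometric distribution. A stdlib-only sketch (the paper reports the test result, not this code):

```python
from math import comb

def fisher_one_sided(a: int, b: int, c: int, d: int) -> float:
    """One-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    P(first cell <= a) under the hypergeometric null with fixed margins."""
    r1, c1, n = a + b, a + c, a + b + c + d
    denom = comb(n, c1)
    return sum(comb(r1, k) * comb(n - r1, c1 - k) for k in range(a + 1)) / denom

# Post-treatment row first: 4 admitted / 40 not, vs 22 / 22 pre-treatment.
p = fisher_one_sided(4, 40, 22, 22)
print(p < 0.001)
```

The observed 4 admissions against an expected 13 under the null lies far in the tail, consistent with the highly significant reduction the abstract describes.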

  8. Effects of Mild and Severe Vitamin B Deficiencies on the Meiotic Maturation of Mice Oocytes

    Directory of Open Access Journals (Sweden)

    Ai Tsuji

    2017-03-01

    We investigated the effects of vitamin B1 deficiency on the meiotic maturation of oocytes. Female Crl:CD1 (ICR) mice were fed a 20% casein diet (control group) or a vitamin B1-free diet (test group). The vitamin B1 concentration in the ovary was approximately 30% lower in the test group than in the control group. Oocyte meiosis was not affected by vitamin B1 deficiency when the deficiency was not accompanied by body weight loss. On the contrary, the frequency of abnormal oocytes was increased by vitamin B1 deficiency when the deficiency was accompanied by body weight loss (referred to as severe vitamin B1 deficiency; frequency of abnormal oocytes, 13.8% vs 43.7%, P = .0071). The frequency of abnormal oocytes was decreased by refeeding of a vitamin B1-containing diet (13.9% vs 22.9%, P = .503). These results suggest that severe vitamin B1 deficiency inhibited meiotic maturation of oocytes but did not damage immature oocytes.

  9. Effectiveness of Music Therapy as an aid to Neurorestoration of children with severe neurological disorders

    Directory of Open Access Journals (Sweden)

    Maria L Bringas

    2015-11-01

    This study was a two-armed parallel group design aimed at testing real world effectiveness of a music therapy (MT) intervention for children with severe neurological disorders. The control group received only the standard neurorestoration program and the experimental group received an additional MT Auditory Attention plus Communication (ACC) protocol just before the usual occupational and speech therapy. Multivariate Item Response Theory (MIRT) identified a neuropsychological status-latent variable manifested in all children and which exhibited highly significant changes only in the experimental group. Changes in brain plasticity also occurred in the experimental group, as evidenced using a Mismatch Event Related paradigm which revealed significant post-intervention positive responses in the latency range between 308 and 400 ms in frontal regions. LORETA EEG source analysis identified prefrontal and midcingulate regions as differentially activated by the MT in the experimental group. Taken together, our results showing improved attention and communication as well as changes in brain plasticity in children with severe neurological impairments confirm the importance of MT for the rehabilitation of patients across a wide range of dysfunctions.

  10. Effectiveness of music therapy as an aid to neurorestoration of children with severe neurological disorders.

    Science.gov (United States)

    Bringas, Maria L; Zaldivar, Marilyn; Rojas, Pedro A; Martinez-Montes, Karelia; Chongo, Dora M; Ortega, Maria A; Galvizu, Reynaldo; Perez, Alba E; Morales, Lilia M; Maragoto, Carlos; Vera, Hector; Galan, Lidice; Besson, Mireille; Valdes-Sosa, Pedro A

    2015-01-01

    This study was a two-armed parallel group design aimed at testing real world effectiveness of a music therapy (MT) intervention for children with severe neurological disorders. The control group received only the standard neurorestoration program and the experimental group received an additional MT "Auditory Attention plus Communication protocol" just before the usual occupational and speech therapy. Multivariate Item Response Theory (MIRT) identified a neuropsychological status-latent variable manifested in all children and which exhibited highly significant changes only in the experimental group. Changes in brain plasticity also occurred in the experimental group, as evidenced using a Mismatch Event Related paradigm which revealed significant post intervention positive responses in the latency range between 308 and 400 ms in frontal regions. LORETA EEG source analysis identified prefrontal and midcingulate regions as differentially activated by the MT in the experimental group. Taken together, our results showing improved attention and communication as well as changes in brain plasticity in children with severe neurological impairments, confirm the importance of MT for the rehabilitation of patients across a wide range of dysfunctions.

  11. Effect of simulated sulfuric acid rain on yield, growth and foliar injury of several crops

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J J; Neely, G E; Perrigan, S C; Grothaus, L C

    1981-01-01

    This study was designed to reveal patterns of response of major United States crops to sulfuric acid rain. Potted plants were grown in field chambers and exposed to simulated sulfuric acid rain (pH 3.0, 3.5 or 4.0) or to a control rain (pH 5.6). At harvest, the weights of the marketable portion, total aboveground portion and roots were determined for 28 crops. Of these, marketable yield production was inhibited for 5 crops (radish, beet, carrot, mustard greens, broccoli), stimulated for 6 crops (tomato, green pepper, strawberry, alfalfa, orchardgrass, timothy), and ambiguously affected for 1 crop (potato). In addition, stem and leaf production of sweet corn was stimulated. Visible injury of tomatoes might have decreased their marketability. No statistically significant effects on yield were observed for the other 15 crops. The results suggest that the likelihood of yield being affected by acid rain depends on the part of the plant utilized, as well as on species. Effects on the aboveground portion of crops and on roots are also presented. Plants were regularly examined for foliar injury associated with acid rain. Of the 35 cultivars examined, the foliage of 31 was injured at pH 3.0, 28 at pH 3.5, and 5 at pH 4.0. Foliar injury was not generally related to effects on yield. However, foliar injury of Swiss chard, mustard greens and spinach was severe enough to adversely affect marketability.

  12. Preventative and Curative Effects of Several Plant Derived Agents Against Powdery Mildew Disease of Okra

    Directory of Open Access Journals (Sweden)

    Moustafa Hemdan Ahmed MOHARAM

    2012-08-01

    The preventative and curative effects of some plant-derived agents based on plant extracts or essential oils were studied at different concentrations against Erysiphe cichoracearum DC. ex Merat, the causal pathogen of okra powdery mildew, by detached leaf-disk and potted-plant bioassays. In the detached leaf-disk assay, the highest mean preventative effect (97.74%) was recorded by neem seed oil, followed by jojoba oil (89.82%) and extract of Rynoutria sachalinensis (82.77%). Neem seed oil at 1% was the most effective agent, followed by jojoba oil and extract of R. sachalinensis at 1.5% and 2%, respectively, where they suppressed E. cichoracearum completely. The potted-plant assay revealed that neem seed oil, jojoba oil and extract of R. sachalinensis, as well as the fungicide (active ingredient dinocap), showed higher preventative efficacy at all leaf ages treated after 7 and 14 days of inoculation as compared with extracts of henna and garlic. Moreover, the preventative efficacy partly remained apparent after 14 days of inoculation at all leaf ages tested. In field trials during the 2010 and 2011 growing seasons, when the first symptoms of powdery mildew appeared naturally, 1.5% jojoba oil, 2% extract of R. sachalinensis and 1% neem seed oil were sprayed individually twice on grown plants to evaluate their efficacy in controlling powdery mildew, growth and yield of okra. Results showed that neem seed oil was the most effective agent: it greatly decreased disease severity to 29.92%, recorded the highest curative effect (68.15%) and also improved plant growth and pod yield.

  13. Analysis of effects of calandria tube uncovery under severe accident conditions in CANDU reactors

    International Nuclear Information System (INIS)

    Rogers, J.T.; Currie, T.C.; Atkinson, J.C.; Dick, R.

    1983-01-01

    A study is being undertaken for the Atomic Energy Control Board to assess the thermal and hydraulic behaviour of CANDU reactor cores under accident conditions more severe than those normally considered in the licensing process. In this paper, we consider the effects on a coolant channel of the uncovery of a calandria tube by moderator boil-off following a LOCA in a Bruce reactor unit in which emergency cooling is ineffective and the moderator heat sink is impaired by the failure of the moderator cooling system. Calandria tube uncovery and its immediate consequences, as described here, constitute only one part of the entire accident sequence. Other aspects of this sequence as well as results of the analysis of the other accident sequences studied will be described in the final report on the project and in later papers

  14. A review of studies of the effect of severe malnutrition on mental development.

    Science.gov (United States)

    Grantham-McGregor, S

    1995-08-01

    This is a review of studies on the relationship between mental development and severe malnutrition. School-age children who suffered from early childhood malnutrition have generally been found to have poorer IQ levels, cognitive function, school achievement and greater behavioral problems than matched controls and, to a lesser extent, siblings. The disadvantages last at least until adolescence. There is no consistent evidence of a specific cognitive deficit. The evidence of a causal relationship is strong but not unequivocal because of difficulties in interpreting retrospective case control studies. Marked improvements in development can occur after adoption or intervention. Therefore, the outcome depends to a large extent on the quality of the subsequent environment. It is likely that extremely deprived environments would exacerbate the effects. There is limited evidence that other nutritional deficiencies may interact with previous malnutrition in affecting cognition. The mechanism linking malnutrition to poor development is still not established.

  15. The effect of seasonality on burn incidence, severity and outcome in Central Malawi.

    Science.gov (United States)

    Tyson, Anna F; Gallaher, Jared; Mjuweni, Stephen; Cairns, Bruce A; Charles, Anthony G

    2017-08-01

    In much of the world, burns are more common in cold months. However, few studies have described the seasonality of burns in sub-Saharan Africa. This study examines the effect of seasonality on the incidence and outcome of burns in central Malawi. A retrospective analysis was performed at Kamuzu Central Hospital and included all patients admitted from May 2011 to August 2014. Demographic data, burn mechanism, total body surface area (%TBSA), and mortality were analyzed. Seasons were categorized as Rainy (December-February), Lush (March-May), Cold (June-August) and Hot (September-November). A negative binomial regression was used to assess the effect of seasonality on burn incidence. This was performed using both the raw and deseasonalized data in order to evaluate for trends not attributable to random fluctuation. A total of 905 patients were included. Flame (38%) and scald (59%) burns were the most common mechanisms. More burns occurred during the cold season (41% vs 19-20% in the other seasons). Overall mortality was 19%. Only the cold season had a statistically significant increase in burn incidence. The incidence rate ratios (IRR) for the hot, lush, and cold seasons were 0.94 (CI 0.6-1.32), 1.02 (CI 0.72-1.45) and 1.6 (CI 1.17-2.19), respectively, when compared to the rainy season. Burn severity and mortality did not differ between seasons. The results of this study demonstrate the year-round phenomenon of burns treated at our institution and highlight the slight predominance of burns during the cold season. These data can be used to guide prevention strategies, with special attention to the implications of the increased burn incidence during the cold season. Though burn severity and mortality remain relatively unchanged between seasons, recognizing the seasonal variability in incidence of burns is critical for resource allocation in this low-income setting. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.
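The incidence rate ratios above come from a negative binomial regression, but the basic quantity is simple to compute. As a minimal sketch (with hypothetical counts, not the study's data), an IRR and its Wald confidence interval on the log scale can be obtained as:

```python
import math

def irr_with_ci(cases_exposed, time_exposed, cases_ref, time_ref, z=1.96):
    """Incidence rate ratio with a Wald 95% CI computed on the log scale."""
    irr = (cases_exposed / time_exposed) / (cases_ref / time_ref)
    se_log = math.sqrt(1.0 / cases_exposed + 1.0 / cases_ref)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Hypothetical counts: 370 cold-season burns vs 180 rainy-season burns
# over equal observation time (illustrative only, not the study's data).
irr, lo, hi = irr_with_ci(370, 1.0, 180, 1.0)
```

A regression model additionally adjusts for covariates and overdispersion, which this bare ratio does not.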

  16. Drotrecogin alfa (activated) in severe sepsis: a systematic review and new cost-effectiveness analysis

    Directory of Open Access Journals (Sweden)

    Brophy James M

    2007-06-01

    Full Text Available Abstract Background Activated drotrecogin alfa (human activated protein C, rhAPC) is produced by recombinant DNA technology and purports to improve clinical outcomes by counteracting the inflammatory and thrombotic consequences of severe sepsis. Controversy exists around the clinical benefits of this drug, and an updated economic study that considers this variability is needed. Methods A systematic literature review was performed using the Medline, Embase and International Network of Agencies for Health Technology Assessment (INAHTA) databases to determine efficacy, safety and previous economic studies. Our economic model was populated with systematic estimates of these parameters and with population life tables for longer-term survival information. Monte Carlo simulations were used to estimate the incremental cost-effectiveness ratios (ICERs) and variance for the decision analytic models. Results Two randomized clinical trials (RCTs) of drotrecogin alfa in adults with severe sepsis and 8 previous economic studies were identified. Although associated with statistical heterogeneity, a pooled analysis of the RCTs did not show a statistically significant 28-day mortality benefit for drotrecogin alfa compared to placebo, either for all patients (RR: 0.93, 95% CI: 0.69, 1.26) or for those at highest risk as measured by APACHE II ≥ 25 (RR: 0.90, 95% CI: 0.54, 1.49). Our economic analysis, based on the totality of the available clinical evidence, suggests that the cost-effectiveness of drotrecogin alfa is uncertain. Conclusion The evidence supporting the clinical and economic attractiveness of drotrecogin alfa is not conclusive and further research appears to be indicated.

  17. Natural Disasters under the Form of Severe Storms in Europe: the Cause-Effect Analysis

    Directory of Open Access Journals (Sweden)

    Virginia Câmpeanu

    2009-07-01

    Full Text Available For more than 100 years, from 1900 to 2008, there were almost 400 natural disasters in the form of severe storms in Europe, 40% of which occurred in the 1990s. International prognoses for world weather suggest a tendency toward increasing frequency and intensity of severe storms as the climate warms. In these circumstances, for a researcher in the field of environmental economics a natural question arises: can people contribute to reducing the frequency and magnitude of severe storms that produce disastrous social and economic effects by acting on their causes? In researching an answer to support public policies in the field, a cause-effect analysis applied to Europe might make a contribution to the literature. This is especially so considering that the international literature on the factors influencing global warming contains certainties with regard to the natural factors of influence, but declared uncertainty or skepticism with regard to anthropogenic ones. Skepticism, and even tension, arose during the international negotiations in Copenhagen (December 2009) on the agreement for limiting global warming, with doubts raised about the methods used by the experts of the International Climate Experts Group (GIEC), and thus about the results obtained, which served as a basis for the negotiations. The criticism concerned the form and, at times, the content: what was contested during the negotiations was not the phenomenon of global warming itself but the methods of calculation. The methodology relies on qualitative (top-down) and quantitative (bottom-up correlation) cause-effect analysis of the storm disasters in Europe. Based on the instruments used, we propose a dynamic model, in three variants, associating the evolution of storm disasters in Europe with anthropogenic factors. Results: the cause-effect diagram (Ishikawa or fishbone diagram) and quantitative correlation of sub

  18. Effect of context on respiratory rate measurement in identifying non-severe pneumonia in African children.

    Science.gov (United States)

    Muro, Florida; Mtove, George; Mosha, Neema; Wangai, Hannah; Harrison, Nicole; Hildenwall, Helena; Schellenberg, David; Todd, Jim; Olomi, Raimos; Reyburn, Hugh

    2015-06-01

    Cough or difficult breathing with an increased respiratory rate for age is the commonest indication for outpatient antibiotic treatment in African children. We aimed to determine whether respiratory rate is likely to be transiently raised by contextual factors in a busy clinic, leading to inaccurate diagnosis. Respiratory rates were recorded in children aged 2-59 months presenting with cough or difficulty breathing to one of two busy outpatient clinics, and then repeated at 10-min intervals over 1 h in a quiet setting. One hundred and sixty-seven children were enrolled, with a mean age of 7.1 (SD ± 2.9) months among infants and 27.6 (SD ± 12.8) months among children aged 12-59 months. The mean respiratory rate declined from 42.3 and 33.6 breaths per minute (bpm) in the clinic to 39.1 and 32.6 bpm after 10 min in a quiet room, and to 39.2 and 30.7 bpm thereafter, a decline large enough to affect whether some children crossed the respiratory-rate threshold for non-severe pneumonia. In a random effects linear regression model, the variability in respiratory rate within children (42%) was almost as much as the variability between children (58%). Changing the respiratory rate cut-offs to higher thresholds resulted in a small reduction in the proportion of non-severe pneumonia misclassifications in infants. Noise and other contextual factors may cause a transient increase in respiratory rate and consequently misclassification of non-severe pneumonia. However, this effect is less pronounced in older children than in infants. Respiratory rate is a difficult sign to measure, as the variation is large both between and within children. More studies of the accuracy and utility of respiratory rate as a proxy for non-severe pneumonia diagnosis in a busy clinic are needed. © 2015 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  19. A deterministic seismic hazard map of India and adjacent areas

    International Nuclear Information System (INIS)

    Parvez, Imtiyaz A.; Vaccari, Franco; Panza, Giuliano

    2001-09-01

    A seismic hazard map of the territory of India and adjacent areas has been prepared using a deterministic approach based on the computation of synthetic seismograms complete with all the main phases. The input data set consists of structural models, seismogenic zones, focal mechanisms and an earthquake catalogue. The synthetic seismograms have been generated by the modal summation technique. The seismic hazard, expressed in terms of maximum displacement (DMAX), maximum velocity (VMAX), and design ground acceleration (DGA), has been extracted from the synthetic signals and mapped on a regular grid of 0.2 deg. x 0.2 deg. over the studied territory. The estimated values of the peak ground acceleration are compared with the observed data available for the Himalayan region and found to be in good agreement. Many parts of the Himalayan region have DGA values exceeding 0.6 g. The epicentral areas of the great Assam earthquakes of 1897 and 1950 represent the maximum hazard, with DGA values reaching 1.2-1.3 g. (author)

  20. Deterministic and fuzzy-based methods to evaluate community resilience

    Science.gov (United States)

    Kammouh, Omar; Noori, Ali Zamani; Taurino, Veronica; Mahin, Stephen A.; Cimellaro, Gian Paolo

    2018-04-01

    Community resilience is becoming a growing concern for authorities and decision makers. This paper introduces two indicator-based methods to evaluate the resilience of communities based on the PEOPLES framework. PEOPLES is a multi-layered framework that defines community resilience using seven dimensions. Each dimension is described through a set of resilience indicators collected from the literature, each linked to a measure that allows analytical computation of the indicator's performance. The first method proposed in this paper takes data on previous disasters as input and returns as output a performance function for each indicator and a performance function for the whole community. The second method exploits knowledge-based fuzzy modeling for its implementation. This method allows a quantitative evaluation of the PEOPLES indicators using descriptive knowledge rather than deterministic data, while accounting for the uncertainty involved in the analysis. The output of the fuzzy-based method is a resilience index for each indicator as well as a resilience index for the community. The paper also introduces an open-source online tool in which the first method is implemented. A case study illustrating the application of the first method and the usage of the tool is also provided in the paper.
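The abstract does not spell out the fuzzy machinery, so the following is only a minimal sketch of the general idea: descriptive indicator scores are mapped through triangular membership functions over linguistic terms ("low", "medium", "high") and defuzzified into an index. The term definitions, score range, and equal-weight aggregation are assumptions for illustration, not the PEOPLES implementation.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_resilience(score):
    """Map a descriptive indicator score in [0, 10] to a resilience index
    via three linguistic terms and a centroid-style defuzzification."""
    terms = {"low": tri(score, -1e-9, 0.0, 5.0),     # membership of each term
             "medium": tri(score, 0.0, 5.0, 10.0),
             "high": tri(score, 5.0, 10.0, 10.0 + 1e-9)}
    centers = {"low": 0.0, "medium": 0.5, "high": 1.0}  # term centroids
    total = sum(terms.values())
    return sum(terms[t] * centers[t] for t in terms) / total

# Hypothetical indicator scores; community index as an unweighted mean.
per_indicator = [fuzzy_resilience(s) for s in (3.0, 7.5, 9.0)]
index = sum(per_indicator) / len(per_indicator)
```

A real implementation would use expert-defined membership functions and rule bases per indicator; the shape above only shows how descriptive knowledge becomes a number.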

  1. Deterministic methods for multi-control fuel loading optimization

    Science.gov (United States)

    Rahman, Fariz B. Abdul

    We have developed a multi-control fuel loading optimization code for pressurized water reactors based on deterministic methods. The objective is to flatten the fuel burnup profile, which maximizes overall energy production. The optimal control problem is formulated using the method of Lagrange multipliers and the direct adjoining approach for treatment of the inequality power-peaking constraint. The optimality conditions are derived for a multi-dimensional, multi-group optimal control problem via the calculus of variations. Because the Hamiltonian is linear in the control, the optimal control problem is solved using the gradient method to minimize the Hamiltonian together with a Newton-step formulation to obtain the optimal control. We are able to satisfy the power-peaking constraint during depletion with the control at beginning of cycle (BOC) by building the proper burnup path forward in time and utilizing the adjoint burnup to propagate the information back to the BOC. Our test results show that we are able to achieve our objective and satisfy the power-peaking constraint during depletion using either the fissile enrichment or the burnable poison as the control. Our fuel loading designs show an increase of 7.8 equivalent full power days (EFPDs) in cycle length, compared with 517.4 EFPDs for the AP600 first cycle.
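The remark that the Hamiltonian is linear in the control can be illustrated with a toy projected-gradient iteration: when the gradient with respect to the control is constant and the control is box-constrained, the iterate is driven to one of the bounds (a bang-bang control). This is a generic sketch under those assumptions, not the code described in the thesis.

```python
def projected_gradient(grad, lo, hi, u0=0.0, step=0.1, iters=500):
    """Minimize a Hamiltonian over a box-constrained scalar control by
    projected gradient descent: take a gradient step, then clip to [lo, hi].
    For a Hamiltonian linear in the control the gradient is constant, so the
    iterate saturates at one of the bounds (bang-bang behavior)."""
    u = u0
    for _ in range(iters):
        u = min(hi, max(lo, u - step * grad(u)))
    return u

# Toy Hamiltonian H(u) = 3.0 * u (linear in the control), control in [0, 1]:
# the positive gradient drives the control to its lower bound.
u_min = projected_gradient(lambda u: 3.0, lo=0.0, hi=1.0)    # → 0.0
u_max = projected_gradient(lambda u: -3.0, lo=0.0, hi=1.0)   # → 1.0
```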

  2. Deterministic and Probabilistic Analysis against Anticipated Transient Without Scram

    International Nuclear Information System (INIS)

    Choi, Sun Mi; Kim, Ji Hwan; Seok, Ho

    2016-01-01

    An Anticipated Transient Without Scram (ATWS) is an Anticipated Operational Occurrence (AOO) accompanied by a failure of the reactor trip when required. Through a suitable combination of inherent characteristics and diverse systems, the reactor design needs to reduce the probability of an ATWS, limit any core damage, and prevent loss of integrity of the reactor coolant pressure boundary should one occur. This study focuses on the deterministic analysis of ATWS events with respect to Reactor Coolant System (RCS) over-pressure and fuel integrity for the EU-APR. Additionally, this report presents the Probabilistic Safety Assessment (PSA) reflecting those diverse systems. The analysis performed for the ATWS event indicates that the NSSS can be brought to a controlled, safe state through the addition of boron into the core via the EBS pump flow upon EBAS actuation by the DPS. Decay heat is removed through the MSADVs and the auxiliary feedwater. During the ATWS event, the RCS pressure boundary is maintained by the operation of the primary and secondary safety valves. Consequently, the acceptance criteria were satisfied by installing the DPS and EBS in addition to the inherent safety characteristics.

  3. Deterministic versus evidence-based attitude towards clinical diagnosis.

    Science.gov (United States)

    Soltani, Akbar; Moayyeri, Alireza

    2007-08-01

    Generally, two basic classes have been proposed for the scientific explanation of events. Deductive reasoning emphasizes reaching conclusions about a hypothesis based on verification of the universal laws pertinent to that hypothesis, while inductive or probabilistic reasoning explains an event by calculating the probability that the event is related to a given hypothesis. Although both types of reasoning are used in clinical practice, evidence-based medicine stresses the advantages of the second approach for most instances of medical decision making. While 'probabilistic or evidence-based' reasoning may seem to involve more mathematical formulas at first glance, this attitude is more dynamic and less imprisoned by the rigidity of mathematics than the 'deterministic or mathematical' attitude. In the field of medical diagnosis, appreciation of uncertainty in clinical encounters and use of the likelihood ratio as a measure of accuracy seem to be the most important characteristics of evidence-based doctors. Other characteristics include the use of a series of tests for refining probability, changing diagnostic thresholds in light of external evidence and the nature of the disease, and attention to confidence intervals to estimate the uncertainty of research-derived parameters.

  4. Method to deterministically study photonic nanostructures in different experimental instruments.

    Science.gov (United States)

    Husken, B H; Woldering, L A; Blum, C; Vos, W L

    2009-01-01

    We describe an experimental method to recover a single, deterministically fabricated nanostructure in various experimental instruments, without the use of artificially fabricated markers, with the aim of studying photonic structures. To this end, a detailed map of the spatial surroundings of the nanostructure is made during its fabrication. These maps are made using a series of micrographs with successively decreasing magnifications. The micrographs reveal intrinsic and characteristic geometric features that can subsequently be used as markers in different setups. As an illustration, we probe surface cavities with radii of 65 nm on a silica opal photonic crystal with various setups: a focused ion beam workstation, a scanning electron microscope (SEM), a wide-field optical microscope and a confocal microscope. We use cross-correlation techniques to recover a small area imaged with the SEM in a large area photographed with the optical microscope, which provides a possible avenue to automatic searching. We show how both structural and optical reflectivity data can be obtained from one and the same nanostructure. Since our approach does not use artificial grids or markers, it is of particular interest for samples whose structure is not known a priori, like samples created solely by self-assembly. In addition, our method is not restricted to conducting samples.

  5. Prospects in deterministic three dimensional whole-core transport calculations

    International Nuclear Information System (INIS)

    Sanchez, Richard

    2012-01-01

    The point made in this paper is that, although detailed and precise three-dimensional (3D) whole-core transport calculations may become feasible in the future on massively parallel computers, they would apply to only some of the problems of the nuclear industry, namely multiphysics, methodology validation, and nuclear safety calculations. On the other hand, typical reactor design cycle calculations, comprising many one-point core calculations, can have very strict computing-time constraints and will not directly benefit from advances in large-scale computing. Consequently, in this paper we review some deterministic 3D transport methods that may have potential for industrial applications in the very near future and that, even with low-order approximations such as a low resolution in energy, might offer an advantage over present industrial methodology, one of whose main approximations stems from power reconstruction. These methods comprise the response-matrix method and methods based on the two-dimensional (2D) method of characteristics, such as the fusion method.

  6. Conversion of dependability deterministic requirements into probabilistic requirements

    International Nuclear Information System (INIS)

    Bourgade, E.; Le, P.

    1993-02-01

    This report concerns the ongoing survey conducted jointly by the DAM/CCE and NRE/SR branches on the inclusion of dependability requirements in control and instrumentation projects. Its purpose is to enable a customer (the prime contractor) to convert deterministic dependability requirements, expressed in the form ''a maximum permissible number of failures, of maximum duration d, in a period t'', into probabilistic terms. The customer selects a confidence level for each previously defined undesirable event by assigning it a maximum probability of occurrence. Using the formulae we propose for two repair policies - constant rate or constant time - these probabilized requirements can then be transformed into equivalent failure rates. It is shown that the same formula can be used for both policies, provided certain realistic assumptions hold, and that for a constant-time repair policy the correct result can always be obtained. The equivalent failure rates thus determined can be included in the specifications supplied to the contractors, who will then be able to justify their designs against them in advance. (author), 8 refs., 3 annexes
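A minimal sketch of one way such a conversion can be done, assuming failures arrive as a homogeneous Poisson process (an assumption of this sketch; the report's own formulae for the two repair policies are not reproduced here): given a requirement of at most k failures in a period t with a maximum probability p of exceeding it, solve for the largest failure rate λ with P(N > k) ≤ p by bisection.

```python
import math

def prob_more_than_k(lam, t, k):
    """P(N > k) for a Poisson process with rate lam over a period t."""
    mu = lam * t
    cdf = sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))
    return 1.0 - cdf

def equivalent_failure_rate(t, k, p, hi=10.0, iters=200):
    """Largest rate lam such that P(more than k failures in t) <= p,
    found by bisection (P(N > k) is increasing in lam)."""
    lo = 0.0
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if prob_more_than_k(mid, t, k) <= p:
            lo = mid
        else:
            hi = mid
    return lo

# Example requirement: at most 2 failures per 8760-h year,
# maximum probability of occurrence p = 1e-3 (values hypothetical).
lam = equivalent_failure_rate(t=8760.0, k=2, p=1e-3)
```

The duration-d clause of the requirement would enter through the repair-policy formulae, which this Poisson-only sketch omits.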

  7. Deterministic network interdiction optimization via an evolutionary approach

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This paper introduces an evolutionary optimization approach that can be readily applied to solve deterministic network interdiction problems. The network interdiction problem solved considers the minimization of the maximum flow that can be transmitted between a source node and a sink node for a fixed network design, when there is a limited amount of resources available to interdict network links. Furthermore, the model assumes that the nominal capacity of each network link and the cost associated with its interdiction can change from link to link. For this problem, the solution approach developed is based on three steps: (1) Monte Carlo simulation, to generate potential network interdiction strategies; (2) the Ford-Fulkerson algorithm for maximum s-t flow, to analyze each strategy's maximum source-sink flow; and (3) an evolutionary optimization technique to define, in probabilistic terms, how likely a link is to appear in the final interdiction strategy. Examples for different network sizes and behaviors are used throughout the paper to illustrate the approach. In terms of computational effort, the results illustrate that solutions are obtained from a significantly restricted solution search space. Finally, the authors discuss the need for a reliability perspective on network interdiction, so that the solutions developed address more realistic scenarios of the problem.
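Step (2) of the approach evaluates each candidate interdiction strategy with a maximum s-t flow computation. Below is a minimal Edmonds-Karp (BFS-based Ford-Fulkerson) sketch on a toy network with hypothetical capacities, recomputing the flow after one link is interdicted; it is an illustration of the building block, not the paper's code.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp maximum s-t flow; cap is {u: {v: capacity}}."""
    # Build residual capacities, adding zero-capacity reverse edges.
    res = {u: dict(nbrs) for u, nbrs in cap.items()}
    for u in cap:
        for v in cap[u]:
            res.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:       # BFS for a shortest augmenting path
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:                # no augmenting path left
            return flow
        v, bottleneck = t, float("inf")    # find the path's bottleneck
        while parent[v] is not None:
            u = parent[v]
            bottleneck = min(bottleneck, res[u][v])
            v = u
        v = t                              # push flow along the path
        while parent[v] is not None:
            u = parent[v]
            res[u][v] -= bottleneck
            res[v][u] += bottleneck
            v = u
        flow += bottleneck

# Toy network (capacities are illustrative, not from the paper).
caps = {"s": {"a": 10, "b": 5}, "a": {"t": 8, "b": 4}, "b": {"t": 7}, "t": {}}
base = max_flow(caps, "s", "t")    # flow before interdiction
caps["a"]["t"] = 0                 # interdict link a->t
after = max_flow(caps, "s", "t")   # reduced flow after interdiction
```

An interdiction strategy is scored by exactly this difference: how far it pushes the residual maximum flow down within the resource budget.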

  8. Is there a sharp phase transition for deterministic cellular automata?

    International Nuclear Information System (INIS)

    Wootters, W.K.

    1990-01-01

    Previous work has suggested that there is a kind of phase transition between deterministic automata exhibiting periodic behavior and those exhibiting chaotic behavior. However, unlike the usual phase transitions of physics, this transition takes place over a range of values of the parameter rather than at a specific value. The present paper asks whether the transition can be made sharp, either by taking the limit of an infinitely large rule table, or by changing the parameter in terms of which the space of automata is explored. We find strong evidence that, for the class of automata we consider, the transition does become sharp in the limit of an infinite number of symbols, the size of the neighborhood being held fixed. Our work also suggests an alternative parameter in terms of which it is likely that the transition will become fairly sharp even if one does not increase the number of symbols. In the course of our analysis, we find that mean field theory, which is our main tool, gives surprisingly good predictions of the statistical properties of the class of automata we consider. 18 refs., 6 figs
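As a toy numerical illustration of the kind of parameter sweep discussed (not the authors' model or their mean-field analysis), one can build one-dimensional cellular automata whose rule tables map a fraction λ of neighborhoods to non-quiescent symbols, and measure long-run activity, the sort of statistic a mean-field theory approximates:

```python
import random

def random_rule(k, radius, lam, rng):
    """Rule table over k symbols: each neighborhood maps to a random
    non-quiescent symbol with probability lam, else to the quiescent 0."""
    size = k ** (2 * radius + 1)
    return [rng.randrange(1, k) if rng.random() < lam else 0
            for _ in range(size)]

def activity(rule, k, radius, width=200, steps=200, rng=None):
    """Fraction of non-quiescent cells after running the CA on a ring."""
    rng = rng or random.Random(0)
    cells = [rng.randrange(k) for _ in range(width)]
    for _ in range(steps):
        nxt = []
        for i in range(width):
            idx = 0                       # encode the neighborhood in base k
            for d in range(-radius, radius + 1):
                idx = idx * k + cells[(i + d) % width]
            nxt.append(rule[idx])
        cells = nxt
    return sum(c != 0 for c in cells) / width

# Sweep the rule-table parameter for binary, radius-1 automata.
rng = random.Random(42)
sweep = {lam: activity(random_rule(2, 1, lam, rng), 2, 1, rng=rng)
         for lam in (0.1, 0.3, 0.5, 0.7)}
```

In such sweeps low λ tends to give frozen (periodic) behavior and high λ sustained activity; whether the crossover sharpens as the number of symbols grows is exactly the question the paper studies.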

  9. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.

    Science.gov (United States)

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity, O(n), than a naive pairwise comparison of transitions, O(n²). Few states need to be refined by the hash table, because most states have already been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
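For contrast with the backward-depth method (which is not reproduced here), the classic partition-refinement baseline can be sketched in a few lines: start from the accepting/non-accepting split and repeatedly split blocks by the blocks their transitions lead to (Moore's algorithm). The 4-state DFA below is a hypothetical example, not from the paper.

```python
def minimize(states, alphabet, delta, accepting):
    """Moore-style partition refinement: return the equivalence blocks
    (frozensets of states) of the minimal DFA."""
    # Initial partition: accepting vs non-accepting states.
    partition = [b for b in (set(accepting), set(states) - set(accepting)) if b]
    while True:
        block_of = {q: i for i, b in enumerate(partition) for q in b}
        # Signature of a state: which block each input symbol leads to.
        sig = lambda q: tuple(block_of[delta[q, a]] for a in alphabet)
        refined = []
        for b in partition:
            groups = {}
            for q in b:
                groups.setdefault(sig(q), set()).add(q)
            refined.extend(groups.values())
        if len(refined) == len(partition):   # stable: no block was split
            return [frozenset(b) for b in refined]
        partition = refined

# Hypothetical 4-state DFA over {0, 1} accepting strings that end in 1;
# states q0/q2 and q1/q3 are equivalent, so the minimal DFA has 2 states.
delta = {("q0", "0"): "q2", ("q0", "1"): "q1",
         ("q1", "0"): "q2", ("q1", "1"): "q3",
         ("q2", "0"): "q0", ("q2", "1"): "q3",
         ("q3", "0"): "q0", ("q3", "1"): "q1"}
blocks = minimize({"q0", "q1", "q2", "q3"}, ["0", "1"], delta, {"q1", "q3"})
```

Moore's refinement is O(kn²) in the worst case; Hopcroft's algorithm and the paper's backward-depth approach both aim to avoid that repeated whole-partition rescanning.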

  10. Deterministic and Probabilistic Analysis against Anticipated Transient Without Scram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Sun Mi; Kim, Ji Hwan [KHNP Central Research Institute, Daejeon (Korea, Republic of); Seok, Ho [KEPCO Engineering and Construction, Daejeon (Korea, Republic of)

    2016-10-15

    An Anticipated Transient Without Scram (ATWS) is an Anticipated Operational Occurrence (AOO) accompanied by a failure of the reactor trip when required. Through a suitable combination of inherent characteristics and diverse systems, the reactor design needs to reduce the probability of an ATWS, limit any core damage, and prevent loss of integrity of the reactor coolant pressure boundary should one occur. This study focuses on the deterministic analysis of ATWS events with respect to Reactor Coolant System (RCS) over-pressure and fuel integrity for the EU-APR. Additionally, this report presents the Probabilistic Safety Assessment (PSA) reflecting those diverse systems. The analysis performed for the ATWS event indicates that the NSSS can be brought to a controlled, safe state through the addition of boron into the core via the EBS pump flow upon EBAS actuation by the DPS. Decay heat is removed through the MSADVs and the auxiliary feedwater. During the ATWS event, the RCS pressure boundary is maintained by the operation of the primary and secondary safety valves. Consequently, the acceptance criteria were satisfied by installing the DPS and EBS in addition to the inherent safety characteristics.

  11. Deterministic ripple-spreading model for complex networks.

    Science.gov (United States)

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Hines, Evor L; Di Paolo, Ezequiel

    2011-04-01

    This paper proposes a deterministic complex network model inspired by the natural ripple-spreading phenomenon. The motivations and main advantages of the model are the following: (i) The establishment of many real-world networks is a dynamic process, in which the influence of a few local events is often observed to spread out through nodes and then largely determine the final network topology; this dynamic process involves many spatial and temporal factors. By simulating the natural ripple-spreading process, this paper reports a very natural way to set up a spatial and temporal model for such complex networks. (ii) Existing relevant network models are all stochastic, i.e., given an input they cannot output a unique topology. In contrast, the proposed ripple-spreading model uniquely determines the final network topology, while the stochastic character of complex networks is captured by randomly initializing the ripple-spreading-related parameters. (iii) The proposed model can use an easily manageable number of ripple-spreading-related parameters to precisely describe a network topology, which is more memory-efficient than a traditional adjacency matrix or similar memory-expensive data structures. (iv) The ripple-spreading model has very good potential for both extensions and applications.
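A much-simplified sketch of the deterministic core idea (not the authors' exact model): each node emits a ripple whose energy decays with radius, and a link forms when a ripple reaches another node with energy above a threshold. With fixed coordinates and parameters the resulting topology is unique; stochasticity enters only if the parameters are randomly initialized. All values below are hypothetical.

```python
import math

def ripple_network(coords, energy, decay, threshold):
    """Deterministic ripple-spreading toy: node i links to node j when the
    stronger of their two ripples, with initial energy decaying linearly
    with radius, arrives at the other node above the threshold.
    Fixed inputs always produce the same edge set."""
    edges = set()
    for i, (xi, yi) in enumerate(coords):
        for j, (xj, yj) in enumerate(coords):
            if i < j:
                r = math.hypot(xi - xj, yi - yj)
                # Remaining energy of the stronger ripple on arrival.
                if max(energy[i], energy[j]) - decay * r >= threshold:
                    edges.add((i, j))
    return edges

# Hypothetical node layout and ripple parameters.
coords = [(0, 0), (1, 0), (4, 0), (4, 3)]
edges = ripple_network(coords, energy=[5, 3, 3, 3], decay=1.0, threshold=0.5)
```

Randomizing `energy` per node (then fixing it) is one way the model's "randomly initialized ripple-spreading parameters" could be mimicked while keeping the generation itself deterministic.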

  12. Combined effects of sivelestat and resveratrol on severe acute pancreatitis-associated lung injury in rats.

    Science.gov (United States)

    Wang, Houhong; Wang, Shuai; Tang, Amao; Gong, Huihui; Ma, Panpan; Chen, Li

    2014-08-01

    Despite extensive research and clinical efforts made in the management of acute pancreatitis during the past few decades, to date no effective cure is available and mortality from severe acute pancreatitis remains high. Given that lung injury is the primary cause of early death in acute pancreatitis patients, novel therapeutic approaches aiming to prevent lung injury have become a subject of intensive investigation. In a previous study, we demonstrated that sivelestat, a specific inhibitor of neutrophil elastase, is effective in protecting against lung failure in rats with taurocholate-induced acute pancreatitis. As part of the analyses extended from that study, the present study aimed to evaluate the role of sivelestat and/or resveratrol in protection against acute pancreatitis-associated lung injury. The extended analyses demonstrated the following: (1) sodium taurocholate induced apparent lung injury and dysfunction manifested by histological anomalies, including vacuolization and apoptosis of cells in the lung, as well as biochemical aberrations in the blood (an increase in amylase concentration and a decrease in partial arterial oxygen pressure) and increases in the activities of reactive oxygen species, interleukin 6, myeloperoxidase and neutrophil elastase, in lung edema, and in bronchoalveolar lavage protein concentration and cell infiltration in the lung; and (2) in lung tissues, either sivelestat or resveratrol treatment effectively attenuated the taurocholate-induced abnormalities in all parameters analyzed except for serum amylase concentration. In addition, combined treatment with both sivelestat and resveratrol demonstrated additive protective effects on pancreatitis-associated lung injury compared with either single treatment.

  13. Effect of rasagiline as adjunct therapy to levodopa on severity of OFF in Parkinson's disease.

    Science.gov (United States)

    Stocchi, F; Rabey, J M

    2011-12-01

    The LARGO study demonstrated that rasagiline 1 mg/day as an adjunct to levodopa significantly reduces OFF time, to the same magnitude as adjunct entacapone. This substudy of LARGO aimed to assess the effect of rasagiline and entacapone on the motor symptoms of PD during the practically defined OFF state. LARGO was a randomized, double-blind, multicenter trial that assessed the efficacy and safety of rasagiline (1 mg/day), entacapone (200 mg with each levodopa dose), and placebo in 687 levodopa-treated PD patients with motor fluctuations. A substudy of LARGO measured UPDRS motor scores in the practically defined OFF state in 32 rasagiline, 36 entacapone, and 37 placebo patients. Treatment with rasagiline produced a significant improvement over placebo of 5.64 units in UPDRS motor OFF score (P = 0.013 vs. placebo). By contrast, the effect of adjunct entacapone was not significant (P = 0.14 vs. placebo). Whereas rasagiline also showed a trend toward reducing the UPDRS-ADL OFF score (P = 0.058 vs. placebo), no such trend was noted for entacapone (P = 0.26 vs. placebo). Retrospective analysis of the UPDRS motor subdomains, using the Bonferroni correction, further revealed that rasagiline, but not entacapone, significantly improved bradykinesia in the OFF state. These findings indicate that rasagiline 1 mg/day is effective in reducing the severity of motor symptoms in the OFF state, suggesting a continuous effect throughout the day and night, consistent with its extended duration of therapeutic action. © 2011 The Author(s). European Journal of Neurology © 2011 EFNS.

  14. The effect of hospital volume on patient outcomes in severe acute pancreatitis

    Directory of Open Access Journals (Sweden)

    Shen Hsiu-Nien

    2012-08-01

    Full Text Available Abstract Background We investigated the relation between hospital volume and outcome in patients with severe acute pancreatitis (SAP). The determination is important because patient outcome may be improved through volume-based selective referral. Methods In this cohort study, we analyzed 22,551 SAP patients in 2,208 hospital-years (between 2000 and 2009) from Taiwan’s National Health Insurance Research Database. The primary outcome was hospital mortality. Secondary outcomes were hospital length of stay and charges. Hospital SAP volume was measured both as a categorical and as a continuous variable (per one-case increase per hospital-year). The effect was assessed using multivariable logistic regression models with generalized estimating equations accounting for the hospital clustering effect. Adjusted covariates included patient and hospital characteristics (model 1) and additional treatment variables (model 2). Results Irrespective of the measurements, increasing hospital volume was associated with a reduced risk of hospital mortality after adjusting for patient and hospital characteristics (adjusted odds ratio [OR] 0.995, 95% confidence interval [CI] 0.993-0.998 per one-case increase). Patients treated in the highest volume quartile (≥14 cases per hospital-year) had a 42% lower risk of hospital mortality than those in the lowest volume quartile (1 case per hospital-year) after adjusting for patient and hospital characteristics (adjusted OR 0.58, 95% CI 0.40-0.83). However, an inverse relation between volume and hospital stay or hospital charges was observed only when volume was analyzed as a categorical variable. After adjusting for the treatment covariates, the volume effect on hospital mortality disappeared regardless of the volume measure. Conclusions These findings support the use of volume-based selective referral for patients with SAP and suggest that differences in levels or processes of care among hospitals may have contributed to the volume effect.

  15. Effect of histochrome on the severity of delayed effects of prenatal exposure to lead nitrate in the rat brain.

    Science.gov (United States)

    Ryzhavsky, B Ya; Lebedko, O A; Belolubskaya, D S

    2008-08-01

    The effects of histochrome on the severity of delayed effects of prenatal exposure to lead nitrate were studied in the rat brain. Exposure of pregnant rats to lead nitrate during activation of free radical oxidation reduced activity of NADH- and NADPH-dehydrogenases in cortical neurons of their 40-day-old progeny, reduced the number of neurons in a visual field, increased the number of pathologically modified neurons, and stimulated rat motor activity in an elevated plus-maze. Two intraperitoneal injections of histochrome in a dose of 0.1 mg/kg before and after lead nitrate challenge attenuated the manifestations of oxidative stress and prevented the changes in some morphological and histochemical parameters of the brain, developing under the effect of lead exposure.

  16. Effect of percutaneous renal sympathetic nerve radiofrequency ablation in patients with severe heart failure.

    Science.gov (United States)

    Dai, Qiming; Lu, Jing; Wang, Benwen; Ma, Genshan

    2015-01-01

    This study aimed to investigate the clinical feasibility and effects of percutaneous renal sympathetic nerve radiofrequency ablation in patients with heart failure. A total of 20 patients with heart failure were enrolled, aged from 47 to 75 years (63±10 years). They were divided into a standard therapy group (n = 10) and a renal nerve radiofrequency ablation group (n = 10). There were 15 male and 5 female patients, including 8 with ischemic cardiomyopathy, 8 with dilated cardiomyopathy, and 8 with hypertensive cardiopathy. All of the patients met the criteria of New York Heart Association classes III-IV cardiac function. Patients with diabetes and renal failure were excluded. Percutaneous renal sympathetic nerve radiofrequency ablation was performed on the renal artery wall under X-ray guidance. Serum electrolytes, neurohormones, and 24 h urine volume were recorded 24 h before and after the operation. Echocardiograms were performed to obtain left ventricular ejection fraction at baseline and 6 months. Heart rate, blood pressure, and symptoms of dyspnea and edema were also monitored. After renal nerve ablation, 24 h urine volume was increased, while neurohormone levels were decreased compared with those of pre-operation and standard therapy. No obvious change in heart rate or blood pressure was recorded. Symptoms of heart failure were improved in patients after the operation. No complications were recorded in the study. Percutaneous renal sympathetic nerve radiofrequency ablation may be a feasible, safe, and effective treatment for patients with severe congestive heart failure.

  17. Effects of single low-temperature sauna bathing in patients with severe motor and intellectual disabilities.

    Science.gov (United States)

    Iiyama, Junichi; Matsushita, Kensuke; Tanaka, Nobuyuki; Kawahira, Kazumi

    2008-07-01

    We have previously reported that thermal vasodilation following warm-water bathing and low-temperature sauna bathing (LTSB) at 60 degrees C for 15 min improves the cardiac function in patients with congestive heart failure. In a comparative before-and-after study, we examined the hemodynamic and clinical effects of a single exposure to LTSB in cerebral palsy (CP) patients, who usually suffer from chilled extremities and low cardiac output. The study population comprised 16 patients aged between 19 and 53 years with severe motor and intellectual disabilities. Noninvasive methods were used to estimate the systemic and peripheral circulatory changes before and after LTSB. Using blood flow velocity analysis, the pulsatile and resistive indexes of the peripheral arteries of the patients' lower limbs were calculated. Following LTSB, the patients' deep body temperature increased significantly by 1 degree C. Their heart rates increased and blood pressure decreased slightly. The total peripheral resistance decreased by 11%, and the cardiac output increased by 14%. There was significant improvement in the parameters that are indicative of the peripheral circulatory status, including the skin blood flow, blood flow velocity, pulsatile index, and resistive index. Numbness and chronic myalgia of the extremities decreased. There were no adverse side effects. Thus, it can be concluded that LTSB improves the peripheral circulation in CP patients.

  18. Effects of Magnesium and Vitamin B6 on the Severity of Premenstrual Syndrome Symptoms

    Directory of Open Access Journals (Sweden)

    Elham Ebrahimi

    2012-11-01

    Full Text Available Introduction: The importance of resolving the problem of premenstrual syndrome for patients has been emphasized due to its direct and indirect economic effects on society. The aim of the current study was to evaluate the effects of magnesium and vitamin B6 on the severity of premenstrual syndrome in patients referring to health centers affiliated to Isfahan University of Medical Sciences, Iran, during 2009-10. Methods: This two-stage double-blind clinical trial was conducted on 126 women who were randomly allocated into 3 groups to receive magnesium, vitamin B6, or placebo. The study was performed in 10 selected health centers in Isfahan and lasted for 4 months. To confirm premenstrual syndrome, the participants were asked to complete a menstrual diary for 2 months at home. Drug interventions were continued for two cycles and the results before and after the intervention were compared. Results: The findings of this study showed that the mean scores of premenstrual syndrome significantly decreased after the intervention in all groups (p < 0.05). Conclusion: According to our findings, vitamin B6 was the most, and placebo the least, effective in improving the mean premenstrual syndrome score.

  19. Stability analysis of a deterministic dose calculation for MRI-guided radiotherapy

    Science.gov (United States)

    Zelyak, O.; Fallone, B. G.; St-Aubin, J.

    2018-01-01

    Modern effort in radiotherapy to address the challenges of tumor localization and motion has led to the development of MRI guided radiotherapy technologies. Accurate dose calculations must properly account for the effects of the MRI magnetic fields. Previous work has investigated the accuracy of a deterministic linear Boltzmann transport equation (LBTE) solver that includes magnetic field, but not the stability of the iterative solution method. In this work, we perform a stability analysis of this deterministic algorithm including an investigation of the convergence rate dependencies on the magnetic field, material density, energy, and anisotropy expansion. The iterative convergence rate of the continuous and discretized LBTE including magnetic fields is determined by analyzing the spectral radius using Fourier analysis for the stationary source iteration (SI) scheme. The spectral radius is calculated when the magnetic field is included (1) as a part of the iteration source, and (2) inside the streaming-collision operator. The non-stationary Krylov subspace solver GMRES is also investigated as a potential method to accelerate the iterative convergence, and an angular parallel computing methodology is investigated as a method to enhance the efficiency of the calculation. SI is found to be unstable when the magnetic field is part of the iteration source, but unconditionally stable when the magnetic field is included in the streaming-collision operator. The discretized LBTE with magnetic fields using a space-angle upwind stabilized discontinuous finite element method (DFEM) was also found to be unconditionally stable, but the spectral radius rapidly reaches unity for very low-density media and increasing magnetic field strengths indicating arbitrarily slow convergence rates. However, GMRES is shown to significantly accelerate the DFEM convergence rate showing only a weak dependence on the magnetic field. 
In addition, the use of an angular parallel computing strategy
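The convergence argument above hinges on the spectral radius of the iteration operator: a stationary scheme x_{k+1} = M x_k + b contracts the error by roughly ρ(M) per sweep, so ρ(M) approaching unity means arbitrarily slow convergence. A minimal sketch, using power iteration on a toy matrix rather than the paper's LBTE operator:

```python
import math

def spectral_radius(M, iters=500):
    """Estimate rho(M) by power iteration with max-norm normalization."""
    n = len(M)
    v = [1.0] * n
    rho = 0.0
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        rho = max(abs(x) for x in w)
        if rho == 0.0:
            return 0.0
        v = [x / rho for x in w]  # renormalize so max|v| = 1
    return rho

def iterations_to_tolerance(rho, tol=1e-6):
    """Rough sweep count for the error to shrink by `tol` when rho < 1."""
    return math.ceil(math.log(tol) / math.log(rho))

# A comfortable contraction vs. a near-unit spectral radius:
print(spectral_radius([[0.5, 0.0], [0.0, 0.25]]))            # -> 0.5
print(iterations_to_tolerance(0.5), iterations_to_tolerance(0.999))  # -> 20 13809
```

The second line mirrors the abstract's point: as ρ creeps toward 1 (low-density media, strong fields), the sweep count explodes, which is why a Krylov method such as GMRES is attractive.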

  20. Corrigendum to "Stability analysis of a deterministic dose calculation for MRI-guided radiotherapy".

    Science.gov (United States)

    Zelyak, Oleksandr; Fallone, B Gino; St-Aubin, Joel

    2018-03-12

    Modern effort in radiotherapy to address the challenges of tumor localization and motion has led to the development of MRI guided radiotherapy technologies. Accurate dose calculations must properly account for the effects of the MRI magnetic fields. Previous work has investigated the accuracy of a deterministic linear Boltzmann transport equation (LBTE) solver that includes magnetic field, but not the stability of the iterative solution method. In this work, we perform a stability analysis of this deterministic algorithm including an investigation of the convergence rate dependencies on the magnetic field, material density, energy, and anisotropy expansion. The iterative convergence rate of the continuous and discretized LBTE including magnetic fields is determined by analyzing the spectral radius using Fourier analysis for the stationary source iteration (SI) scheme. The spectral radius is calculated when the magnetic field is included (1) as a part of the iteration source, and (2) inside the streaming-collision operator. The non-stationary Krylov subspace solver GMRES is also investigated as a potential method to accelerate the iterative convergence, and an angular parallel computing methodology is investigated as a method to enhance the efficiency of the calculation. SI is found to be unstable when the magnetic field is part of the iteration source, but unconditionally stable when the magnetic field is included in the streaming-collision operator. The discretized LBTE with magnetic fields using a space-angle upwind stabilized discontinuous finite element method (DFEM) was also found to be unconditionally stable, but the spectral radius rapidly reaches unity for very low density media and increasing magnetic field strengths indicating arbitrarily slow convergence rates. However, GMRES is shown to significantly accelerate the DFEM convergence rate showing only a weak dependence on the magnetic field. 
In addition, the use of an angular parallel computing strategy

  1. Stability analysis of a deterministic dose calculation for MRI-guided radiotherapy.

    Science.gov (United States)

    Zelyak, O; Fallone, B G; St-Aubin, J

    2017-12-14

    Modern effort in radiotherapy to address the challenges of tumor localization and motion has led to the development of MRI guided radiotherapy technologies. Accurate dose calculations must properly account for the effects of the MRI magnetic fields. Previous work has investigated the accuracy of a deterministic linear Boltzmann transport equation (LBTE) solver that includes magnetic field, but not the stability of the iterative solution method. In this work, we perform a stability analysis of this deterministic algorithm including an investigation of the convergence rate dependencies on the magnetic field, material density, energy, and anisotropy expansion. The iterative convergence rate of the continuous and discretized LBTE including magnetic fields is determined by analyzing the spectral radius using Fourier analysis for the stationary source iteration (SI) scheme. The spectral radius is calculated when the magnetic field is included (1) as a part of the iteration source, and (2) inside the streaming-collision operator. The non-stationary Krylov subspace solver GMRES is also investigated as a potential method to accelerate the iterative convergence, and an angular parallel computing methodology is investigated as a method to enhance the efficiency of the calculation. SI is found to be unstable when the magnetic field is part of the iteration source, but unconditionally stable when the magnetic field is included in the streaming-collision operator. The discretized LBTE with magnetic fields using a space-angle upwind stabilized discontinuous finite element method (DFEM) was also found to be unconditionally stable, but the spectral radius rapidly reaches unity for very low-density media and increasing magnetic field strengths indicating arbitrarily slow convergence rates. However, GMRES is shown to significantly accelerate the DFEM convergence rate showing only a weak dependence on the magnetic field. 
In addition, the use of an angular parallel computing strategy

  2. [Clinical effect of different sequences of debridement-antibiotic therapy in treatment of severe chronic periodontitis].

    Science.gov (United States)

    Li, Yi; Xu, Li; Lu, Rui-fang; An, Yue-bang; Wang, Xian-e; Song, Wen-li; Meng, Huan-xi

    2015-02-18

    To evaluate the feasibility of full-mouth debridement (subgingival scaling and root planing, SRP) completed in 2 visits within 1 week and to compare the clinical effects of different sequences of debridement and antibiotic usage in patients with severe chronic periodontitis (CP). A double-blinded, placebo-controlled, randomized clinical trial was conducted in 30 severe CP patients (14 males and 16 females, 40.5 ± 8.4 years old on average, range 35 to 60) receiving 3 different sequences of debridement-antibiotic therapy: in group A, antibiotic usage (metronidazole, MTZ, 0.2 g, tid, 7 d; amoxicillin, AMX, 0.5 g, tid, 7 d) was started together with SRP (completed in 2 visits within 7 d); in group B, antibiotic usage (MTZ 0.2 g, tid, 7 d; AMX 0.5 g, tid, 7 d) was started 1 d after SRP (completed in 2 visits within 7 d); group C received SRP alone. Probing depth (PD), bleeding index (BI) and tooth mobility were examined. The average full-mouth probing depth, the average full-mouth proximal probing depth (pPD), the percentage of sites with PD > 5 mm (PD > 5 mm%), the percentage of sites with proximal PD > 5 mm (pPD > 5 mm%), the average bleeding index (BI) and the percentage of sites with bleeding on probing (BOP%) were calculated. Clinical examinations were performed at baseline and 2 months post therapy. Compared with baseline conditions, all the subjects showed clinical improvements in all the parameters evaluated 2 months post therapy, with greater improvement in patients starting antibiotic usage at the same time as SRP compared with patients using antibiotics after SRP or receiving SRP alone.

  3. Effect of Coronary Anatomy and Hydrostatic Pressure on Intracoronary Indices of Stenosis Severity.

    Science.gov (United States)

    Härle, Tobias; Luz, Mareike; Meyer, Sven; Kronberg, Kay; Nickau, Britta; Escaned, Javier; Davies, Justin; Elsässer, Albrecht

    2017-04-24

    The authors sought to analyze height differences within the coronary artery tree in patients in a supine position and to quantify the impact of hydrostatic pressure on intracoronary pressure measurements in vitro. Although pressure equalization of the pressure sensor and the systemic pressure at the catheter tip is mandatory in intracoronary pressure measurements, subsequent measurements may be influenced by hydrostatic pressure related to the coronary anatomy in the supine position. Outlining and quantifying this phenomenon is important to interpret routine and pullback pressure measurements within the coronary tree. Coronary anatomy was analyzed in computed tomography angiographies of 70 patients to calculate height differences between the catheter tip and different coronary segments in the supine position. Using a dynamic pressure simulator, the effect of the expected hydrostatic pressure resulting from such height differences on indices of stenosis severity was assessed. In all patients, the left anterior and right posterior descending arteries are the highest points of the coronary tree, with mean height differences of -4.9 ± 1.6 cm and -3.8 ± 1.0 cm, whereas the circumflex artery and right posterolateral branches are the lowest points, with mean height differences of 3.9 ± 0.9 cm and 2.6 ± 1.6 cm compared with the corresponding ostium. In vitro measurements demonstrated a correlation of the absolute pressure differences with height differences (r = 0.993) irrespective of the pressure level. Hydrostatic pressure variations resulting from normal coronary anatomy in a supine position influence intracoronary pressure measurements and may affect their interpretation during stenosis severity assessment. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
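The height differences reported above translate into pressure offsets via the hydrostatic relation Δp = ρgh. A small sketch, assuming a textbook blood density of about 1060 kg/m³ (a value not stated in the abstract):

```python
RHO_BLOOD = 1060.0   # kg/m^3, assumed typical blood density
G = 9.81             # m/s^2
PA_PER_MMHG = 133.322

def hydrostatic_offset_mmhg(height_cm: float) -> float:
    """Pressure offset (mmHg) for a sensor `height_cm` above (+ negative offset)
    or below (- positive offset is reversed) the equalization point: rho * g * h."""
    return RHO_BLOOD * G * (height_cm / 100.0) / PA_PER_MMHG

# Mean height differences from the abstract:
for label, h in [("LAD (highest point, -4.9 cm)", -4.9),
                 ("posterolateral branch (+2.6 cm)", 2.6)]:
    print(label, round(hydrostatic_offset_mmhg(h), 1), "mmHg")
```

A sensor sitting ~5 cm above the ostium thus reads roughly 4 mmHg low from hydrostatics alone, which is material when a distal/proximal pressure ratio is read against a fixed cut-off.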

  4. Biliopancreatic Diversion for Severe Obesity: Long-Term Effectiveness and Nutritional Complications.

    Science.gov (United States)

    Ballesteros-Pomar, María D; González de Francisco, Tomás; Urioste-Fondo, Ana; González-Herraez, Luis; Calleja-Fernández, Alicia; Vidal-Casariego, Alfonso; Simó-Fernández, Vicente; Cano-Rodríguez, Isidoro

    2016-01-01

    Bariatric surgery is currently the treatment of choice for those patients with severe obesity, but the procedure of choice is not clearly established. We describe weight loss and nutritional parameters in severely obese patients after biliopancreatic diversion over 10 years of follow-up. Patients were followed by the same multidisciplinary team, and data are shown for 10 years. Bariatric Analysis and Reporting Outcome System (BAROS) questionnaire results, data regarding the evolution of obesity-related diseases, and nutritional parameters are reported. Two hundred ninety-nine patients underwent biliopancreatic diversion, 76.1 % women, initial BMI 50.1 kg/m(2) (SD 7.2). Weight loss was maintained throughout 10 years, with EWL% around 65 % and EBMIL% around 70 %. More than 80 % of the patients showed EWL higher than 50 %. Blood pressure, glucose metabolism, and lipid profile clearly improved after surgery. Mean nutritional parameters remained within the normal range during follow-up. Protein malnutrition affected less than 4 % and anemia up to 16 %. Fat-soluble vitamin levels decreased over time, with vitamin D deficiency in 61.5 % of patients. No significant differences were found in either nutritional parameters or weight loss with regard to gastrectomy or gastric preservation, or a common limb length longer or shorter than 55 cm. Biliopancreatic diversion is an effective surgical procedure in terms of weight loss, quality of life, and evolution of obesity-related diseases. Nutritional deficiencies are less frequent than feared for a malabsorptive procedure, but must be taken into account, especially for fat-soluble vitamins.
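The EWL% and EBMIL% figures quoted above follow the standard bariatric outcome definitions; the sketch below assumes the conventional forms (excess BMI referenced to 25 kg/m²), which the abstract itself does not spell out:

```python
def ewl_percent(initial_kg: float, current_kg: float, ideal_kg: float) -> float:
    """% excess weight loss: (initial - current) / (initial - ideal) * 100."""
    return (initial_kg - current_kg) / (initial_kg - ideal_kg) * 100.0

def ebmil_percent(initial_bmi: float, current_bmi: float,
                  reference_bmi: float = 25.0) -> float:
    """% excess BMI loss, conventionally referenced to BMI 25 kg/m^2."""
    return (initial_bmi - current_bmi) / (initial_bmi - reference_bmi) * 100.0

# A patient starting near the cohort's mean BMI of ~50 who reaches BMI 32.5
# lands close to the reported ~70% EBMIL:
print(round(ebmil_percent(50.1, 32.5), 1))
```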

  5. Intestinal mucosal permeability of severely underweight and nonmalnourished Bangladeshi children and effects of nutritional rehabilitation.

    Science.gov (United States)

    Hossain, Md Iqbal; Nahar, Baitun; Hamadani, Jena D; Ahmed, Tahmeed; Roy, Anjan Kumar; Brown, Kenneth H

    2010-11-01

    Lactulose/mannitol (L/M) intestinal permeability tests were completed to compare the intestinal function of severely underweight children recovering from diarrhea and other illnesses and of nonmalnourished children from the same communities, and to evaluate the effects of food supplementation, with or without psychosocial stimulation, on the changes in intestinal function among the underweight children. Seventy-seven malnourished children completed intestinal permeability studies at baseline and 3 months after receiving 1 of the following randomly assigned treatment regimens: group-C--fortnightly follow-up at community-based follow-up units, including growth monitoring and promotion, health education, and micronutrient supplementation, n = 17; group-SF--same as group-C plus supplementary food (SF) to provide 150 to 300 kcal/day, n = 23; group-PS--same as group-C plus psychosocial stimulation (PS), n = 17; or group-SF + PS--same as group-C plus SF and PS, n = 20. Seventeen nonmalnourished children were included as comparison subjects. The malnourished children's mean ± SD initial age was 13.1 ± 4.0 months, their mean weight-for-age z score was -3.82 ± 0.61, and their median (interquartile range) urinary L/M recovery ratio was 0.16 (0.10-0.28). Eighty-four percent of the children had L/M ≥ 0.07, suggestive of impaired intestinal function. The median L/M of the malnourished children was significantly greater than that of 17 relatively well-nourished children (median 0.09; interquartile range 0.05-0.12; P = 0.001). There were no significant differences in baseline characteristics of the severely malnourished children by treatment group. Following treatment, the L/M ratio improved in all of the groups. Intestinal function, as measured by sugar permeability, is impaired among severely underweight children. Intestinal permeability improves in relation to weight gain, but intestinal mucosal recovery is not specifically related to the types or amount of food supplementation or PS provided in this trial.

  6. Enduring effects of severe developmental adversity, including nutritional deprivation, on cortisol metabolism in aging Holocaust survivors.

    Science.gov (United States)

    Yehuda, Rachel; Bierer, Linda M; Andrew, Ruth; Schmeidler, James; Seckl, Jonathan R

    2009-06-01

    In animal models, early life exposure to major environmental challenges such as malnutrition and stress results in persisting cardiometabolic, neuroendocrine and affective effects. While such effects have been associated with pathogenesis, the widespread occurrence of 'developmental programming' suggests it has adaptive function. Glucocorticoids may mediate 'programming' and their metabolism is known to be affected by early life events in rodents. To examine these relationships in humans, cortisol metabolism and cardiometabolic disease manifestations were examined in Holocaust survivors in relation to age at exposure and affective dysfunction, notably lifetime posttraumatic stress disorder (PTSD). Fifty-one Holocaust survivors and 22 controls without Axis I disorder collected 24-h urine samples and were evaluated for psychiatric disorders and cardiometabolic diagnoses. Corticosteroids and their metabolites were assayed by gas chromatography-mass spectroscopy (GC-MS); cortisol was also measured by radioimmunoassay (RIA). Holocaust survivors showed reduced cortisol by RIA, and decreased levels of 5alpha-tetrahydrocortisol (5alpha-THF) and total glucocorticoid production by GC-MS. The latter was associated with lower cortisol metabolism by 5alpha-reductase and 11beta-hydroxysteroid dehydrogenase (11beta-HSD) type-2. The greatest decrements were associated with earliest age of Holocaust exposure and less severe PTSD symptomatology. Cardiometabolic manifestations were associated with decreased 11beta-HSD-2 activity. In controls, 5alpha-reductase was positively associated with trauma-related symptoms (i.e., to traumatic exposures unrelated to the Holocaust). Extreme malnutrition and related stress during development is associated with long-lived alterations in specific pathways of glucocorticoid metabolism. These effects may be adaptive and link with lower risks of cardiometabolic and stress-related disorders in later life.

  7. The Relation between Deterministic Thinking and Mental Health among Substance Abusers Involved in a Rehabilitation Program

    Directory of Open Access Journals (Sweden)

    Seyed Jalal Younesi

    2015-06-01

    Full Text Available Objective: The current research investigates the relation between deterministic thinking and mental health among drug abusers, in which the role of cognitive distortions is considered and clarified by focusing on deterministic thinking. Methods: The present study is descriptive and correlational. All individuals with experience of drug abuse who had been referred to the Shafagh Rehabilitation center (Kahrizak) were considered as the statistical population. 110 individuals who were addicted to drugs (stimulants and methamphetamine) were selected from this population by purposeful sampling to answer questionnaires about deterministic thinking and general health. For data analysis, Pearson correlation coefficients and regression analysis were used. Results: The results showed that there is a positive and significant relationship between deterministic thinking and lack of mental health (r = 0.22, P < 0.05); among the factors of mental health, anxiety and depression showed the closest relation to deterministic thinking. It was found that the two factors of deterministic thinking which function as the strongest predictors of the lack of mental health are definitiveness in predicting tragic events and future anticipation. Discussion: It seems that drug abusers suffer from deterministic thinking when they are confronted with difficult situations, so they are more affected by depression and anxiety. This way of thinking may play a major role in impelling or restraining drug addiction.
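The reported association rests on a Pearson correlation. A minimal pure-Python version of the coefficient, run on invented toy scores rather than the study's data:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores: deterministic-thinking totals paired with symptom totals.
print(round(pearson_r([1, 2, 3, 4, 5], [2, 1, 4, 3, 5]), 2))  # -> 0.8
```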

  8. Deterministic one-way simulation of two-way, real-time cellular automata and its related problems

    Energy Technology Data Exchange (ETDEWEB)

    Umeo, H; Morita, K; Sugata, K

    1982-06-13

    The authors show that for any deterministic two-way, real-time cellular automaton, m, there exists a deterministic one-way cellular automaton which can simulate m in twice real-time. Moreover, the authors present a new type of deterministic one-way cellular automata, called circular cellular automata, which are computationally equivalent to deterministic two-way cellular automata. 7 references.
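A radius-1 cellular automaton update on a ring (the "circular" boundary alluded to above) can be sketched as follows; the Wolfram rule-number encoding used here is a standard convention for binary CAs, not a detail taken from the paper:

```python
def ca_step(state, rule):
    """One synchronous update of a radius-1 binary CA on a ring.
    `rule` is a Wolfram rule number: bit (4*left + 2*center + right) gives the next state."""
    n = len(state)
    table = [(rule >> i) & 1 for i in range(8)]
    return [table[4 * state[i - 1] + 2 * state[i] + state[(i + 1) % n]]
            for i in range(n)]

print(ca_step([0, 0, 1, 0, 0], 110))  # -> [0, 1, 1, 0, 0]
```

Each cell reads both neighbors, so this is the two-way case; a one-way automaton would restrict the lookup to (center, right), which is what makes the twice-real-time simulation result non-trivial.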

  9. Effect of hydroprocessing severity on characteristics of jet fuel from TOSCO 2 and Paraho distillates

    Science.gov (United States)

    Prok, G. M.; Flores, F. J.; Seng, G. T.

    1981-01-01

    Jet A boiling range fuels and broad-property research fuels were produced by hydroprocessing shale oil distillates, and their properties were measured to characterize the fuels. The distillates were the fraction of whole shale oil boiling below 343 C from TOSCO 2 and Paraho syncrudes. The TOSCO 2 was hydroprocessed at medium severity, and the Paraho was hydroprocessed at high, medium, and low severities. Fuels meeting Jet A requirements except for the freezing point were produced from the medium severity TOSCO 2 and the high severity Paraho. Target properties of a broad property research fuel were met by the medium severity TOSCO 2 and the high severity Paraho except for the freezing point and a high hydrogen content. Medium and low severity Paraho jet fuels did not meet thermal stability and freezing point requirements.

  10. Pharmacokinetics and clinical effect of phenobarbital in children with severe falciparum malaria and convulsions

    Science.gov (United States)

    Kokwaro, Gilbert O; Ogutu, Bernhards R; Muchohi, Simon N; Otieno, Godfrey O; Newton, Charles R J C

    2003-01-01

    Aims Phenobarbital is commonly used to treat status epilepticus in resource-poor countries. Although a dose of 20 mg kg−1 is recommended, this dose, administered intramuscularly (i.m.) for prophylaxis, is associated with an increase in mortality in children with cerebral malaria. We evaluated a 15-mg kg−1 intravenous (i.v.) dose of phenobarbital to determine its pharmacokinetics and clinical effects in children with severe falciparum malaria and status epilepticus. Methods Twelve children (M/F: 11/1), aged 7–62 months, received a loading dose of phenobarbital (15 mg kg−1) as an i.v. infusion over 20 min and maintenance dose of 5 mg kg−1 at 24 and 48 h later. The duration of convulsions and their recurrence were recorded. Vital signs were monitored. Plasma and cerebrospinal fluid (CSF) phenobarbital concentrations were measured with an Abbott TDx FLx® fluorescence polarisation immunoassay analyser (Abbott Laboratories, Diagnostic Division, Abbott Park, IL, USA). Simulations were performed to predict the optimum dosage regimen that would maintain plasma phenobarbital concentrations between 15 and 20 mg l−1 for 72 h. Results All the children achieved plasma concentrations above 15 mg l−1 by the end of the infusion. Mean (95% confidence interval or median and range for Cmax) pharmacokinetic parameters were: area under curve [AUC (0, ∞) ]: 4259 (3169, 5448) mg l−1.h, t½: 82.9 (62, 103) h, CL: 5.8 (4.4, 7.3) ml kg−1 h−1, Vss: 0.8 (0.7, 0.9) l kg −1, CSF: plasma phenobarbital concentration ratio: 0.7 (0.5, 0.8; n = 6) and Cmax: 19.9 (17.9–27.9) mg l−1. Eight of the children had their convulsions controlled and none of them had recurrence of convulsions. Simulations suggested that a loading dose of 15 mg kg−1 followed by two maintenance doses of 2.5 mg kg−1 at 24 h and 48 h would maintain plasma phenobarbital concentrations between 16.4 and 20 mg l−1 for 72 h. Conclusions Phenobarbital, given as an i.v. loading dose, 15 mg kg−1
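The AUC and t½ values reported above can be illustrated with the standard noncompartmental recipes: a trapezoidal AUC and a log-linear fit for the terminal half-life. A sketch on synthetic mono-exponential data, not the study's concentrations:

```python
import math

def auc_trapezoid(times, concs):
    """AUC(0, t_last) by the linear trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

def terminal_half_life(times, concs):
    """t1/2 from a least-squares line through ln(concentration) vs time."""
    lnc = [math.log(c) for c in concs]
    n = len(times)
    mt, ml = sum(times) / n, sum(lnc) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(times, lnc))
             / sum((t - mt) ** 2 for t in times))
    return math.log(2) / -slope

# Synthetic decay C(t) = 20 * exp(-0.01 t): k = 0.01 /h, so t1/2 = ln(2)/k ~ 69.3 h,
# in the same range as the 82.9 h reported for phenobarbital.
ts = [0, 12, 24, 48, 72]
cs = [20.0 * math.exp(-0.01 * t) for t in ts]
print(round(terminal_half_life(ts, cs), 1))  # -> 69.3
```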

  11. Opposing effects of fire severity on climate feedbacks in Siberian larch forests

    Science.gov (United States)

    Loranty, M. M.; Alexander, H. D.; Natali, S.; Kropp, H.; Mack, M. C.; Bunn, A. G.; Davydov, S. P.; Erb, A.; Kholodov, A. L.; Schaaf, C.; Wang, Z.; Zimov, N.; Zimov, S. A.

    2017-12-01

    Boreal larch forests in northeastern Siberia comprise nearly 25% of the continuous permafrost zone. Structural and functional changes in these ecosystems will have important climate feedbacks at regional and global scales. As in boreal ecosystems of North America, fire is an important determinant of landscape-scale forest distribution, and fire regimes are intensifying as climate warms. Siberian larch forests are dominated by a single tree species, and there is evidence that fire severity influences post-fire forest density via impacts on seedling establishment. The extent to which these effects occur, or persist, and the associated climate feedbacks are not well quantified. In this study we use forest stand inventories, in situ observations, and satellite remote sensing to examine: 1) variation in forest density within and between fire scars, and 2) changes in land surface albedo and active layer dynamics associated with forest density variation. At the landscape scale we observed declines in Landsat-derived albedo as forests recovered in the first several decades after fire, though canopy cover varied widely within and between individual fire scars. Within an individual mid-successional fire scar (~75 years) we observed canopy cover ranging from 15-90%, with correspondingly large ranges of albedo during periods of snow cover and relatively small differences in albedo during the growing season. We found an inverse relationship between canopy density and soil temperature within this fire scar; high-density, low-albedo stands had cooler soils and shallower active layers, while low-density stands had warmer soils and deeper active layers. Intensive energy balance measurements at a high- and a low-density site show that canopy cover alters the magnitude and timing of ground heat fluxes that affect active layer properties. Our results show that fire impacts on stand structure in Siberian larch forests affect land surface albedo and active layer dynamics in ways that

  12. Interventional radiology and undesirable effects

    International Nuclear Information System (INIS)

    Benderitter, M.

    2009-01-01

    As some procedures of interventional radiology are complex and long, doses received by patients can be high and cause undesired effects, notably on the skin or in underlying tissues (particularly in the brain as far as interventional neuroradiology is concerned, and in the lungs in the case of interventional cardiology). The author briefly discusses some deterministic effects in interventional radiology (influence of dose level, delay of appearance of effects, number of accidents). He briefly comments on the diagnosis and treatment of severe radiological burns

  13. Anti-deterministic behaviour of discrete systems that are less predictable than noise

    Science.gov (United States)

    Urbanowicz, Krzysztof; Kantz, Holger; Holyst, Janusz A.

    2005-05-01

    We present a new type of deterministic dynamical behaviour that is less predictable than white noise. We call it anti-deterministic (AD) because time series corresponding to the dynamics of such systems do not generate deterministic lines in recurrence plots for small thresholds. We show that although the dynamics is chaotic in the sense of exponential divergence of nearby initial conditions, and although some properties of AD data are similar to white noise, the AD dynamics is, in fact, less predictable than noise and hence is different from pseudo-random number generators.
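A recurrence plot marks pairs of time indices whose states fall within a threshold ε; deterministic dynamics shows up as diagonal line structure. A minimal, embedding-free sketch (the "determinism" measure below is a crude stand-in for the standard RQA statistic, not the authors' method):

```python
def recurrence_matrix(series, eps):
    """R[i][j] = 1 if |x_i - x_j| < eps (scalar series, no embedding)."""
    return [[1 if abs(a - b) < eps else 0 for b in series] for a in series]

def diagonal_fraction(R):
    """Fraction of recurrent points whose diagonal neighbor (i+1, j+1) also recurs --
    a crude determinism score; AD series would score low here by construction."""
    n = len(R)
    on_diag = sum(R[i][j] for i in range(n - 1) for j in range(n - 1)
                  if R[i][j] and R[i + 1][j + 1])
    total = sum(sum(row) for row in R)
    return on_diag / total if total else 0.0

# A strictly periodic series produces long diagonals, hence a high score:
periodic = [0, 1, 0, 1, 0, 1]
print(diagonal_fraction(recurrence_matrix(periodic, 0.5)))
```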

  14. Deterministic and heuristic models of forecasting spare parts demand

    Directory of Open Access Journals (Sweden)

    Ivan S. Milojević

    2012-04-01

    Full Text Available Knowing the demand for spare parts is the basis for successful spare parts inventory management. Inventory management has two aspects. The first is operational management: acting according to certain models and making decisions in specific situations which could not have been foreseen or were not encompassed by the models. The second is optimization of the model parameters by means of inventory management. Supply item demand (asset demand) is the expression of customers' needs in units in the desired time, and it is one of the most important parameters in inventory management. The basic task of the supply system is demand fulfillment. In practice, demand is expressed through requisitions or requests. Given the conditions in which inventory management is considered, demand can be: deterministic or stochastic; stationary or nonstationary; continuous or discrete; satisfied or unsatisfied. The applicable maintenance concept is determined by the technological level of development of the assets being maintained. For example, it is hard to imagine that the concept of self-maintenance can be applied to assets developed and put into use 50 or 60 years ago. Even less complex concepts cannot be applied to vehicles that only have indicators of engine temperature - those that react only when the engine is overheated. This means that the maintenance concepts that can be applied are traditional preventive maintenance and corrective maintenance. In order to be applied in a real system, modeling and simulation methods require a completely regulated system, which is not the case with this spare parts supply system; therefore this method, although it enables model development, cannot be applied here. Deterministic models of forecasting are almost exclusively related to the concept of preventive maintenance. Maintenance procedures are planned in advance, in accordance with exploitation and time resources. 
Since the timing
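
As an illustration of the deterministic case described above (with hypothetical numbers, not taken from the paper), a preventive-maintenance demand forecast reduces to simple arithmetic once usage rates and part lifetimes are fixed:

```python
from math import ceil

def deterministic_demand(fleet_size, usage_per_period, part_life):
    """Planned spare-part demand per period under preventive maintenance:
    each asset consumes one part every `part_life` units of usage, so the
    fleet's demand is fully determined once usage rates are fixed."""
    replacements_per_asset = usage_per_period / part_life
    return ceil(fleet_size * replacements_per_asset)

# Illustrative figures: 40 vehicles, each driven 2,000 km per quarter,
# with a filter replaced preventively every 10,000 km.
print(deterministic_demand(40, 2_000, 10_000))  # -> 8
```

Stochastic demand, by contrast, would replace the fixed usage rate with a distribution, which is why the abstract separates the two cases.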

  15. [Effects of recruitment maneuver in prone position on hemodynamics in patients with severe pulmonary infection].

    Science.gov (United States)

    Fan, Yuan-hua; Liu, Yuan-fei; Zhu, Hua-yong; Zhang, Min

    2012-02-01

    To evaluate the effects of recruitment maneuver in the prone position on hemodynamics in patients with severe pulmonary infection, under a protective pulmonary ventilation strategy. Ninety-seven patients with severe pulmonary infection admitted to the intensive care unit (ICU) of Ganzhou City People's Hospital and undergoing mechanical ventilation were enrolled. A volume-controlled ventilation mode with small tidal volume (8 ml/kg) and positive end-expiratory pressure (PEEP) of 6 cm H(2)O [1 cm H(2)O = 0.098 kPa] was used. Each patient underwent recruitment maneuver in the supine position and then in the prone position [PEEP 20 cm H(2)O + pressure control (PC) 20 cm H(2)O]. Heart rate (HR), mean arterial pressure (MAP), pulse oxygen saturation [SpO(2)] and blood gas analysis data were recorded before and after recruitment maneuver in either position. A double-lumen venous catheter was inserted into the internal jugular or subclavian vein, and a pulse index contour cardiac output (PiCCO) catheter was introduced into the femoral artery. Cardiac index (CI), stroke volume index (SVI), systemic vascular resistance index (SVRI), intra-thoracic blood volume index (ITBVI), extravascular lung water index (EVLWI), global end-diastolic volume index (GEDVI), global ejection fraction (GEF), stroke volume variation (SVV) and central venous pressure (CVP) were monitored. (1) Compared with data before recruitment maneuver, there were no significant differences in HR and MAP after recruitment maneuver in either the supine or the prone position, but significant differences in SpO(2) were found between before and after recruitment maneuver when the patients' position was changed (supine position: 0.954 ± 0.032 vs. 0.917 ± 0.025, P recruitment maneuver (P recruitment maneuver, CI [L×min(-1)×m(-2)], SVI (ml/m(2)), GEDVI (ml/m(2)) and GEF were decreased significantly during recruitment maneuver (supine position: CI 3.2 ± 0.4 vs. 3.8 ± 0.6, SVI 32.4 ± 5.6 vs. 38.8 ± 6.5, GEDVI 689 ± 44 vs. 766 ± 32, GEF 0.267 ± 0

  16. Effects of Growth Hormone Replacement on Peripheral Muscle and Exercise Capacity in Severe Growth Hormone Deficiency

    Directory of Open Access Journals (Sweden)

    Susana Gonzalez

    2018-02-01

    Full Text Available Objective: The aim of this study is to evaluate the effect of growth hormone therapy (rGH) on mitochondrial function in peripheral muscle and to correlate it with exercise capacity in subjects with severe adult growth hormone deficiency (GHD). Design: Six-month, double-blind, randomized, crossover, placebo-controlled trial of subcutaneous rGH in 17 patients with GHD. Measurements: Quadriceps muscle biopsies were obtained at baseline, 3 months, and 6 months to measure succinate dehydrogenase (SDH) to assess mitochondrial activity. Exercise capacity was measured with cardiopulmonary exercise testing. Lipids, glycemic parameters, and body fat levels were also measured. Results: Serum insulin-like growth factor 1 (IGF1) levels normalized with rGH in the active phase (p < 0.005), and fat mass was reduced by 3.2% (p < 0.05). Patients showed an increase in SDH (p < 0.01) from baseline that differed between the placebo and rGH treatment groups (p < 0.05): those treated with rGH followed by placebo showed a significant increase in SDH (p < 0.001) followed by a decrease, with a significant between-group difference at the end of 6 months (p < 0.05). No significant improvement in, or correlation with, exercise capacity was found. Conclusion: Short-term rGH for 3 months normalized IGF1 levels, reduced fat mass, and had a significant effect on mitochondrial function, but exercise capacity was unchanged. Clinical Trial Registration: ISRCTN94165486.

  17. Ionospheric effects during severe space weather events seen in ionospheric service data products

    Science.gov (United States)

    Jakowski, Norbert; Danielides, Michael; Mayer, Christoph; Borries, Claudia

    Space weather effects are closely related to complex perturbation processes in the magnetosphere-ionosphere-thermosphere system, initiated by enhanced solar energy input. To understand and model complex space weather processes, different views on the same subject are helpful. One of the key ionospheric parameters is the Total Electron Content (TEC), which provides a first-order approximation of the ionospheric range error in Global Navigation Satellite System (GNSS) applications. Additionally, horizontal gradients and the time rate of change of TEC are important for estimating the degree of perturbation of the ionosphere. TEC maps can effectively be generated using ground based GNSS measurements from global receiver networks. Whereas ground based GNSS measurements provide good horizontal resolution, space based radio occultation measurements can complete the view by providing information on the vertical plasma density distribution. The combination of ground based TEC and vertical sounding measurements provides essential information on the shape of the vertical electron density profile by computing the equivalent slab thickness at the ionosonde station site. Since radio beacon measurements at 150/400 MHz are well suited to trace the horizontal structure of Travelling Ionospheric Disturbances (TIDs), these data products essentially complete GNSS based TEC mapping results. Radio scintillation data products, characterising small scale irregularities in the ionosphere, are useful to estimate the continuity and availability of transionospheric radio signals. The different data products are addressed while discussing severe space weather events in the ionosphere, e.g. the events in October/November 2003. The complementary view of different near real time service data products is helpful to better understand the complex dynamics of ionospheric perturbation processes and to forecast the development of parameters customers are interested in.
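
The equivalent slab thickness mentioned above is the ratio of the column content to the peak electron density, tau = TEC / NmF2. A minimal sketch with illustrative mid-latitude values (the numbers are assumptions, not data from this abstract):

```python
def slab_thickness_km(tec_tecu, nmf2_per_m3):
    """Equivalent slab thickness tau = TEC / NmF2: the depth of a
    hypothetical uniform ionosphere with peak density NmF2 that holds
    the full column content TEC (1 TECU = 1e16 electrons/m^2)."""
    tec_per_m2 = tec_tecu * 1e16        # TECU -> electrons per m^2
    return tec_per_m2 / nmf2_per_m3 / 1e3   # metres -> kilometres

# Illustrative values: 20 TECU from GNSS, NmF2 of 1e12 electrons/m^3
# from an ionosonde (corresponding to foF2 of roughly 9 MHz).
print(slab_thickness_km(20, 1e12))  # -> 200.0 (km)
```

Combining GNSS-derived TEC with ionosonde NmF2 in this way is what lets the two data products jointly constrain the shape of the vertical electron density profile.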

  18. Effects of omalizumab in severe asthmatics across ages: A real life Italian experience.

    Science.gov (United States)

    Sposato, B; Scalese, M; Latorre, M; Scichilone, N; Matucci, A; Milanese, M; Masieri, S; Rolla, G; Steinhilber, G; Rosati, Y; Vultaggio, A; Folletti, I; Baglioni, S; Bargagli, E; Di Tomassi, M; Pio, R; Pio, A; Maccari, U; Maggiorelli, C; Migliorini, M G; Vignale, L; Pulerà, N; Carpagnano, G E; Foschino Barbaro, M P; Perrella, A; Paggiaro, P L

    2016-10-01

    This retrospective study aimed at evaluating the long-term effects of Omalizumab in elderly asthmatics in a real-life setting. 105 consecutive severe asthmatics (GINA step 4-5; mean FEV1% predicted: 66 ± 15.7) treated with Omalizumab for at least 1 year (mean treatment duration 35.1 ± 21.7 months) were divided into 3 groups according to their age at Omalizumab treatment onset: 18-39, 40-64 and ≥ 65 years. Comorbidities, overweight/obese subjects and patients with late-onset asthma were more frequent among older people. A similar reduction of inhaled corticosteroid dosage and SABA on-demand therapy was observed in all groups during Omalizumab treatment; a similar FEV1 increase was also observed. Asthma Control Test (ACT) scores improved significantly with Omalizumab, but the percentage of exacerbation-free patients was higher in younger people (76.9%) compared to middle-aged patients (49.2%) and the elderly (29%) (p = 0.049). After Omalizumab treatment, the risk of exacerbations was lower in subjects aged 40-64 (OR = 0.284 [CI95% = 0.098-0.826], p = 0.021) and 18-39 (OR = 0.133 [CI95% = 0.026-0.678], p = 0.015), compared to elderly asthmatics. A significantly smaller ACT improvement with each successive age class (β = -1.070; p = 0.046) was also observed. Omalizumab improves all asthma outcomes independently of age, although the magnitude of the effects observed in the elderly seems lower than in the other age groups. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Effect of doxycycline in patients of moderate to severe chronic obstructive pulmonary disease with stable symptoms

    Directory of Open Access Journals (Sweden)

    Prashant S Dalvi

    2011-01-01

    Full Text Available Background: The protease-antiprotease hypothesis proposes that inflammatory cells and oxidative stress in chronic obstructive pulmonary disease (COPD) produce increased levels of proteolytic enzymes (neutrophil elastase, matrix metalloproteinases [MMP]) which contribute to destruction of the parenchyma, resulting in a progressive decline in forced expiratory volume in one second. Doxycycline, a tetracycline analogue, possesses anti-inflammatory properties and inhibits MMP enzymes. Objectives: To assess the effect of 4 weeks of doxycycline in a dose of 100 mg once a day in patients with moderate to severe COPD with stable symptoms. Methods: In an interventional, randomized, observer-masked, parallel study design, the effect of doxycycline (100 mg once a day for 4 weeks) was assessed in patients with COPD having stable symptoms after a run-in period of 4 weeks. The study participants in the reference group did not receive doxycycline. The parameters were pulmonary functions, the systemic inflammation marker C-reactive protein (CRP), and the Medical Research Council (MRC) dyspnea scale. Use of systemic corticosteroids or antimicrobial agents was not allowed during the study period. Results: A total of 61 patients completed the study (31 in the doxycycline group and 30 in the reference group). At 4 weeks, pulmonary functions improved significantly in the doxycycline group, and the mean reduction in baseline serum CRP was significantly greater in the doxycycline group as compared with the reference group. There was no significant improvement in the MRC dyspnea scale in either group at 4 weeks. Conclusion: The anti-inflammatory and MMP-inhibiting properties of doxycycline might have contributed to the improvement of parameters in this study.

  20. Activity modes selection for project crashing through deterministic simulation

    Directory of Open Access Journals (Sweden)

    Ashok Mohanty

    2011-12-01

    Full Text Available Purpose: CPM-based analytical approaches to the time-cost trade-off problem assume unlimited resources and the existence of a continuous time-cost function. However, given the discrete nature of most resources, activities can often be crashed only stepwise. Activity crashing for a discrete time-cost function is also known as the activity modes selection problem in project management. This problem is known to be NP-hard. Sophisticated optimization techniques such as Dynamic Programming, Integer Programming, Genetic Algorithms and Ant Colony Optimization have been used to find efficient solutions to the activity modes selection problem. This paper presents a simple method that can provide efficient solutions to the activity modes selection problem for project crashing. Design/methodology/approach: A simulation-based method implemented on an electronic spreadsheet to determine activity modes for project crashing. The method is illustrated with the help of an example. Findings: The paper shows that a simple approach based on a simple heuristic and deterministic simulation can give good results comparable to sophisticated optimization techniques. Research limitations/implications: The simulation-based crashing method presented in this paper is developed to return satisfactory solutions, but not necessarily an optimal solution. Practical implications: The use of spreadsheets for solving Management Science and Operations Research problems makes the techniques more accessible to practitioners. Spreadsheets provide a natural interface for model building, are easy to use in terms of inputs, solutions and report generation, and allow users to perform what-if analysis. Originality/value: The paper presents the application of simulation implemented on a spreadsheet to determine efficient solutions to the discrete time-cost trade-off problem.
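
For very small instances, the activity modes selection problem described above can even be solved by brute-force enumeration. The sketch below (hypothetical activity data, activities in series for simplicity; not the paper's spreadsheet method) shows the discrete time-cost trade-off that the deterministic simulation exploits:

```python
from itertools import product

# Hypothetical serial activities; each mode is a (duration, cost) pair,
# with shorter durations bought at higher cost (crashing stepwise).
modes = {
    "A": [(4, 100), (3, 150), (2, 240)],
    "B": [(5, 200), (4, 280)],
    "C": [(3, 120), (2, 200)],
}

def cheapest_plan(modes, deadline):
    """Enumerate one mode per activity (the activity modes selection
    problem) and return the least-cost combination whose total duration
    meets the deadline, or None if no combination does. Fine for tiny
    instances; the problem is NP-hard, hence heuristics at scale."""
    best = None
    names = list(modes)
    for combo in product(*(modes[n] for n in names)):
        duration = sum(d for d, _ in combo)
        cost = sum(c for _, c in combo)
        if duration <= deadline and (best is None or cost < best[0]):
            best = (cost, dict(zip(names, combo)))
    return best

print(cheapest_plan(modes, deadline=10))  # least-cost plan meeting the deadline
```

A spreadsheet version of the same idea replaces the exhaustive loop with a heuristic ordering of crashing steps, which is what keeps the paper's method tractable for realistic project sizes.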